The Moon is the Limit

The first time I heard about the Lua programming language was back in 2003. My coworkers were working (quite hard) at porting FarCry to PlayStation 2; the game code and engine rely heavily on Lua. I am somewhat split about using a scripting language to define game and character behaviors. On the good side, an interpreted language offers several benefits: there is no need to recompile the whole engine (or a DLL) for even a small change, and the gentler learning curve means that even non-programmers can write a piece of the higher-level game code. Even if they cannot write it, at least they can get a clue of what the code is doing. Moreover, if the language is tailored for the domain (a Domain Specific Language), then coding in it can be easier than using a general purpose one.
On the other hand, my experiences have usually been bad. The language interpreter by itself is usually not enough to develop code: a debugger is needed as well, and creating a good debugger is not an easy task. Moreover, integrating the scripting language with the engine can be either boring or error prone (usually both). And the domain specific language tends to evolve into a general purpose one as soon as its users discover that they need exactly the effect of that instruction, only slightly different.
As soon as the barrier from Domain Specific to General Purpose is crossed, the nice and cute language turns into a wild beast ready to bite your back at the first oversight.
The lack of constraints typical of scripting languages enables quick prototyping and quick development of small programs, but leaves you alone when facing a mid-sized application. Coherence and consistency then have to be achieved through rules, conventions and strict discipline.
So… how does Lua stack up in these respects?
After reading about half of the Lua book, I'd say it stacks up pretty badly. According to the book, Lua is intended for solving problems with a few hundred lines of code, and many of the language design choices are biased in this direction. The first that comes to mind is that variables have global scope by default. This is a bad design decision, since the global namespace tends to get polluted rather quickly, and global variables are generally considered bad programming practice.
One of the most confusing parts is that lists and multiple values are quite freely mixed, with varying semantics. A function may return multiple values, so you can write something like:

a, b = f(x)

If f(x) returns two values, they are assigned to ‘a’ and ‘b’ respectively. If f(x) returns one value, ‘a’ gets it while ‘b’ is assigned nil. If f(x) returns three or more values, ‘a’ and ‘b’ get the first two and the others are silently discarded.
But this doesn’t hold true everywhere. In fact, if you write:

a, b = f(x), 1

then ‘1’ is assigned to ‘b’ and only the first value returned by f(x) is assigned to ‘a’. In other words, the extra return values from f(x) are discarded. If f(x) doesn’t return any value, then ‘a’ gets nil. Frightening, isn’t it?
What if the result of a function is passed as argument to another function, as in ‘f(g())’ ?
Somewhat of a nightmare. If the function call is the only argument, then all its return values are passed as arguments to the outer function; if other arguments are given, then only the first return value is taken.
Consider that a function can be invoked with more or fewer than the specified number of arguments without error, and you get the whole creepy plot.
This approach may do for a few hundred lines of code; beyond that boundary, strict rules and conventions have to be applied in order to keep things working without falling into chaos.
The book I’m reading seems quite oriented toward prototype-level applications; in fact, it shows in good detail how to load and save data as source code. This is fine for your home app, but it is exactly what you don’t want to do in a real application if you care even slightly about security.
I found rather elegant the use of the associative container (the table) as a means both for structuring data and for implementing the concept of a class (I haven’t yet touched the part that deals with inheritance). Although the (ab)use of syntactic sugar in the language is likely to lead to syntactic diabetes.
According to what I have read around, it takes about 10 years to fully master a programming language. I could agree, even if I think that within a year and a good amount of code you should be proficient. So my considerations are based on a first impression, and my understanding of the language is far from even a beginner’s. Nonetheless I dare say that Lua has some problems scaling up.

Selected power

It is always a good surprise when a task you thought rather hard turns out to be quite easy when you actually do it. Using select for I/O multiplexing and the like is one of those pleasant surprises. The man page can be quite intimidating, so I’ll start with an example. Suppose you are dealing with network communication (or any other form of interaction where an I/O operation could take arbitrarily long). You are likely to read (or write) a file descriptor (previously opened via socket and then bound in some way) AND to check for a timeout: if the operation is taking too long, you want to bail out of the read and perform the needed action. If you were stuck with standard read and timer operations, you would need to set up some signaling, check that the right thread catches it, and so on. But there is a better way.
Select accepts several arguments – a limit, three sets of file descriptors and a timeout – and returns as soon as one of the conditions defined by the arguments is met. The file descriptor sets are defined via the fd_set type (handled with the fellow macros FD_SET, FD_ISSET, FD_ZERO and FD_CLR); each set argument can either be NULL or point to an fd_set. The first set holds the file descriptors checked for non-blocking reads: if one of the file descriptors in this set becomes readable without blocking the caller, select returns. The next argument is for writable file descriptors, and the third one is for file descriptors that have to be checked for errors (exceptions).
The first argument is the highest file descriptor contained in the union of the three sets, plus one. It serves as a limit, avoiding a scan of the whole range of possible file descriptors.
The last argument is a timeout. It is a struct timeval (the same one filled in by gettimeofday) that can express timeouts with microsecond resolution. In practice the resolution is much less fine-grained than that and depends on the kernel and the architecture; for example, with Linux kernel 2.4 on ARM the resolution is 10 ms. Better check the smallest handled timeout before blindly relying on it.
Select returns -1 in case of error, 0 in case of timeout, or the number of file descriptors that changed status in the three sets.
In this example the return code is easily processed; more convoluted cases take more work.
Let’s take another example. Suppose you are reading audio packets from a stream and you want to decode and play them back. A first approach could be two threads with a coupling buffer: one thread reads packets and pushes them into the buffer, the other pops the packets out of the buffer and sends them to the audio driver. This is conceptually simple, but not straightforward to get right: when dealing with threads you always have to synchronize them, and it is likely that you will need a third thread to control the streamer and player threads.
If you employ select, the solution is very simple and natural. Just check the wall clock and compute a timeout for the next play, then wait with select either for a new incoming packet or for the time to play.
In this case there is just one thread, and the guarantee that if you are reading the buffer no one is writing into it. This allows you to simplify the buffer management.
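The timing half of this single-threaded loop boils down to one computation: how long select may sleep before the next packet is due. A sketch, with a helper name of my own invention:

```cpp
#include <sys/time.h>

// Given the current wall clock and the deadline of the next packet to
// play, compute the timeout to hand to select(), clamped at {0,0} if
// we are already late ("play immediately").
struct timeval time_until(struct timeval now, struct timeval deadline)
{
    struct timeval left = {0, 0};
    long sec  = deadline.tv_sec  - now.tv_sec;
    long usec = deadline.tv_usec - now.tv_usec;
    if (usec < 0) { usec += 1000000; sec -= 1; }  // borrow from seconds
    if (sec >= 0) { left.tv_sec = sec; left.tv_usec = usec; }
    return left;
}
```

The main loop then reads the clock with gettimeofday, passes the resulting timeval to select together with the stream descriptor, and interprets a return value of 0 as "time to play" and a positive one as "new packet arrived".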
If you are not so lucky as to work with Linux, and your daunting task is to earn a living with Windows, the good news is that a similar function is available on Microsoft platforms.

Humble programmer

This thing of the Humble Programmer really got me. I think that, in the unlikely case that someday a wannabe programmer should come to me asking for advice on how to learn programming and how to become really good at it, I would suggest that he needs just two things – humility and the will to understand. I have to be honest, I didn’t read the original paper by Dijkstra. I tried, but it is very long, and the introduction is nearly as long, and… ok, I know, I should have. Nonetheless I read books and magazines and grasped the idea of the “humble programmer” elsewhere.
So, what is so powerful in being “humble” at programming?
Well, there is so much to learn, because you question your approach and your solution rather than the world. How many times, when the code is malfunctioning, have you blamed Microsoft, the libraries, the kernel, the compiler?
It is easy to take for granted that the problem is someone else’s fault without really investigating the matter in search of objective proof. If humility is not for you, then you can consider it a sort of substantiated arrogance. In my experience, during such investigations I realize that a) it was my fault (more or less subtly, anyway), b) now I understand the system better, and c) not only will I avoid this problem in the future, I will also avoid an entire family of related problems.
It has been a long time since I last stumbled into a compiler bug. Really. I used to find some back in the Amiga days (Lattice C), and still a few at the beginning of (my) PC era (Watcom C/C++), but that was 10 years ago. And it makes sense: both Gcc and Microsoft Visual C are very mature products (ok, let’s forget for a while the problems with managed C++ and the like – those are young, recently introduced technologies). The chance of hitting a bug, even writing non-trivial code, is so remote that being hit by an asteroid while walking in the street is maybe more likely.
Anytime someone calls me invoking a compiler bug as the cause for a problem, my eyebrow automatically raises in a Spock-like manner.
The will to understand is the complementary force guiding you to great programming. Every time you are in a hurry, twiddling the code to make it work without really understanding what’s going on, you are in double trouble. First, if you don’t understand what you are doing (that’s research? ;-)), you are likely to fill the matter with subtle problems ready to strike at your back as soon as you get distracted. Second, you are wasting the chance to increase your comprehension of the system and to handle it better in the future; you are intentionally keeping your tool chest from growing new, powerful tools.
Ok, humble, but boring, I’m quitting here 🙂

90% of Research

According to a recent TV ad, 90% of research is funded by pharmaceutical corporations. Maybe this figure is a little overestimated; nonetheless it is still astonishing. In other words, research for health and cures is privately funded. I wonder to what extent this is good. I mean, the logic driving research for preserving and recovering health is the logic of Return on Investment, the dividend-and-share logic. The same logic that led one of the German pharmaceutical giants during WWII to produce (and sell to the government) the infamous Zyklon B gas used to kill millions of people in concentration camps.
If you google around for these terms you’ll find more than one reason to be concerned. Even trusting human nature, it is impossible not to doubt whether such research would ever find inexpensive treatments for a disease. What if something could simply be let pass, or just a spoonful of water and sugar were fine?
Leaving aside these gloomy thoughts, the link of the day is to Sibling Rivalry: C and C++, a PDF paper by B. Stroustrup detailing the sources of incompatibility between C99 and C++98, and the parallel evolution of these two languages that aimed to be one.

C++ To Gui or Not To Gui?

One of the most missed features of the C and C++ standards, at least for us everyday programmers, is the standardization of the GUI. What a blessing a single approach to GUI programming would be: tool companies could focus their efforts on providing wonderful tools, programmers could stop wasting time #ifdef-ing around, and so on. Anyway, looking at the Java language, maybe the ISO/ANSI committees were not so wrong in taking such a conservative approach. Only some 10 years after the language’s debut, Java, although strictly controlled by Sun, has 4 different (and incompatible) GUI toolkits (JWT, AWT, Swing, SWT) plus the J2ME GUI interface.

Humble programmer, this time for real

Sometimes even the most skilled programmers, like me, lose their humbleness and blame the compiler. Luckily for them, reality is ready to put them back in their place. This is what happened to me today. It was afternoon, and after some thick coding with containers and containers of containers I was rather tired, maybe in need of a break. I don’t believe in breaks, so I went on and wrote the destructor.
Pretty simple stuff: a for loop iterating through all the container elements and basically deleting them one by one.
So far so good. Then I wrote the test case (I do believe in test cases) and I got a rather surprising behavior on the destruction of an empty container. My code was supposed to skip the destruction loop and get out of the destructor cleanly; instead, what actually happened was that the loop body was executed once, causing exceptions and exception dialogs popping up all around.
Puzzled, I stared at the code without a clue. It was a for loop like thousands of others I had written.
So I composed a small source file to test just that behavior (I do believe in small test sources, too; they usually help a lot in understanding stuff). And it got it perfectly right: the code was doing what I expected. So I cut and pasted some types from the project into my test – maybe, after all, it had something to do with the complex types I used.
But once more the small test code ran perfectly.
I was quite astonished and tempted to blame the compiler. So I went to the assembly window, just to get confirmation that the code was actually different and that the project code compiled to execute the loop once.
My personal C++ GURU was away, so I had to handle it all by myself. At this point I did some more cut & paste to understand where the problem arose and… I got it… There, right after the closing parenthesis of the for statement and before the opening curly bracket… there was a SEMICOLON!
Feeling dumb would have been a giant leap upward with respect to what I felt. I knew that the compiler is very seldom to blame, so despite appearances I was the culprit. Also, I should believe some more in breaks – just a few minutes to get from ‘fused’ back to ‘bright’. And… yes, C++ (or C; this is a common pitfall in both) is a loaded gun ready to shoot you in the feet by default… but, what the heck! A little warning from the compiler would have saved me quite some time.
In fact, my personal programming style is to always use curly brackets, even when the block is empty. I find this more readable and less error prone. It would be nice to instruct the compiler to emit a warning when this rule is broken: if the programmer is so smart as to do everything in the for parentheses and doesn’t need a loop body, then she can spend part of the saved time writing a pair of curly brackets.
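Reduced to its bare bones, the bug looks like this (a reconstruction with made-up names, not the actual project code):

```cpp
// Buggy version: the semicolon right after the ')' IS the loop body,
// so the braced block below is NOT part of the loop and runs exactly
// once, even when n == 0.
int buggy_body_runs(int n)
{
    int runs = 0;
    for (int i = 0; i < n; ++i);  // <-- the fatal semicolon
    {
        ++runs;                   // executed once, unconditionally
    }
    return runs;
}

// Fixed version: the braces are the loop body, as intended.
int fixed_body_runs(int n)
{
    int runs = 0;
    for (int i = 0; i < n; ++i)
    {
        ++runs;
    }
    return runs;
}
```

(Modern compilers have since grown optional warnings for suspicious empty loop bodies – exactly the warning wished for above.)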

Don’t fear, it’s only C++

One thing I’m sure about C++ is that it has had a troubled history. It sprang to life in the mid 80s as C with Classes, a language aimed at bringing the then-new object oriented paradigm to C programmers. Standardization was far ahead, but when the language reached the first programmers outside AT&T it was quite stable: single and multiple inheritance, late binding, function and operator overloading.
But quite stable isn’t as stable as stable, so the first addenda to the language arrived by the beginning of the 90s: exceptions and templates.
Standardization arrived quite late (late 90s) in the history of C++ and, rather than codifying existing practice as had been the case for C, took the road of empowering the language. The committee added many features – one above all, the STL. The standardization of the C++ language was an impressive result, if for nothing else at least for being a result: under the heavy load of new features it would have been easy to get lost along the way, killed by committee overhead.
Today, by admission of the language creator himself, Bjarne Stroustrup, it is unlikely for a single programmer to have perfect knowledge of the whole language (meaning both the core language and the library).
With this troubled history, compiler vendors had a hard time keeping up, first with the improvements and then with the standard. The standard surely presented a hard challenge: being compliant can be too ambitious a goal when you actually have to develop a product and put it on the shelves.
Incompatibilities, non-compliances and misbehaviors were only part of the problem programmers had to deal with; the other part was the increased level of complexity: whole new concepts in the template field, tons of components and functions in the library. From the development battlefield new concerns arose as well. Exceptions, which had been considered an elegant way to deal with errors and anomalies, turned out to be, if not a false friend, at least a very difficult one. Unless carefully planned for, with the code actually written with exceptions in mind, the exception mechanism yields no benefits, and the problems caused by misuse are more or less the same you would have by ignoring anomalies without exceptions at all.
Another pretty impressive result from the battlefield was meta-programming. Templates were intended to describe a set of classes or functions performing the same algorithm on different types; in order to do this, some business has to be done at compile time. It was discovered, by a programmer, that this machinery could be used to perform non-trivial computations at compile time. Generating a set of prime numbers at compile time may not be very interesting, but the matter changes when the result of the generation is a parser or an optimal evaluation of complex expressions.
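To give a concrete taste of the trick (a classic toy example, with class names of my own choosing): the compiler itself can be made to test primality while it compiles, by recursively instantiating templates.

```cpp
// Compile-time primality test for N >= 2. The recursion tries every
// candidate divisor from N-1 down to 1; the specialization for D == 1
// stops it. All of this happens during compilation.
template <int N, int D>
struct IsPrimeHelper {
    // N is prime so far if D doesn't divide it and no smaller
    // candidate divisor does either.
    static const bool value = (N % D != 0) && IsPrimeHelper<N, D - 1>::value;
};

template <int N>
struct IsPrimeHelper<N, 1> {  // base case: only divisor 1 is left
    static const bool value = true;
};

template <int N>
struct IsPrime {
    static const bool value = IsPrimeHelper<N, N - 1>::value;
};
```

IsPrime<13>::value is a compile-time constant, so it can size an array or select a specialization; no primality computation survives into the executable.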
With this sort of history, it is pretty natural that many programmers are scared by the new features of C++ (if not by the whole language itself). Like any kind of fear, this one too deserves to be analyzed and addressed. There is no point in avoiding everything that hides some side effect, because hiding details of the lower levels is the main mechanism behind abstraction. Instead, the programmer should develop an approach that lets her avoid the pitfalls and undesired effects of the use and abuse of the language.
Also, nowadays most compilers are reasonably compliant with the standard – even Microsoft Visual C++ is now a pretty good compiler by standard… standards. So from this point of view too, programmers can stop worrying and start using the language to its full potential. And C++ has a huge potential, being nearly the only language that allows the programmer to choose the right amount of abstraction to employ.
For sure C++ isn’t for the faint-hearted. Stealing someone else’s words: C++ is like a gun – loaded, ready to shoot, and aimed straight at your feet by default. I’m sure that’s what you want when there are grizzlies around.

XS – eXtreme Snack

I’d like to share a couple of thoughts about one aspect of XP – snack food and something else. According to XP, in the development room there should always be plenty of snack food. Also, Kent Beck in his Extreme Programming Explained – Embrace Change (you can read my opinions about the book, and the summary I wrote in Italian) recommends that, to celebrate a milestone or some important achievement, some food should be fed to the team. It has sometimes happened that I offered food to the team the day after a milestone. I always did what I felt; still, I’m a bit dubious about using this as a technique for keeping morale high.
If the milestone was difficult to reach and the team did some really hard work, then I think it would be correct for the ‘gold owner’ (to use an XP term) to give the team an amount of money (maybe small, but not so small as to be insulting). So if you really want some food, you can buy it ;-).
I mean, using food as compensation quickly recalls laboratory mice: “you found the exit of the labyrinth, good boy, here is your piece of cheese”.
In the same way, I’m against using T-shirts, games and other gadgets as milestone bonuses. Not that I’m against receiving this stuff in general (it’s good for creating a corporate spirit); I’m against the idea that my hard work – doing something difficult, on time, the added value I bring as a software engineer – is rewarded with a gadget. It’s not professional. It would be like tipping a waiter for excellent service with food rather than money.
And in fact the professional side of our work (programming, and game development in general) is hardly recognized. I think that we who work in this field should be the first to promote professionalism, letting our enthusiasm cool down if needed.
One of the obstacles is that it is difficult for people to understand what it takes to become a good programmer. Even today, with many advanced development environments and tools, it is easy for many to write some automation and call themselves programmers. As I recently read in DDJ magazine, this would be like saying that anyone who can hold a surgical knife and knows something of anatomy could be called a surgeon.
To close it up I would say: Real Programmers don’t accept food as compensation 🙂 (and watch out for too-young surgeons).