The first time I heard about the Lua programming language was back in 2003. My coworkers were working (quite hard) at porting FarCry to PlayStation 2, and the game code and engine relied heavily on Lua. I am somewhat split about using a scripting language to define game and character behaviors. The good part is that an interpreted language offers several benefits: no need to recompile the whole engine (or a DLL) for even a small change, and a gentler learning curve, so that even non-programmers can write a piece of the higher-level game code. Even if they cannot write it, at least they can get a clue of what the code is doing. Moreover, if the language is tailored to the domain (a Domain Specific Language), then coding in it can be easier than using a general purpose one.
On the other hand, my experiences have usually been bad. The language interpreter itself is usually not enough to develop code: a debugger is needed as well, and creating a good debugger is not an easy task. Moreover, integrating the scripting language with the engine can be either tedious or error prone (usually both). And the domain specific language tends to evolve into a general purpose one as soon as its users discover that they need exactly the effect of that instruction, only slightly different.
As soon as the barrier from Domain Specific to General Purpose is crossed, the nice and cute language turns into a wild beast ready to bite your back at the first oversight.
The lack of constraints typical of scripting languages enables quick prototyping and quick development of small programs, but leaves you alone when facing a mid-sized application. Coherence and consistency have to be achieved through rules, conventions and strict discipline.
So… how does Lua stack up in those respects?
After reading about half of the Lua book, I have to say: pretty badly. According to the book, Lua is intended for solving problems with a few hundred lines of code, and many of the language design choices are biased in this direction. The first that comes to mind is that variables have global scope by default. This is simply a bad design decision, since the global namespace tends to get polluted rather quickly, and global variables are generally considered bad programming practice.
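A minimal sketch of what I mean (the function and names are mine, not from the book): forget a single 'local' and a temporary silently becomes a global.

function accumulate(t)
  sum = 0                       -- oops: no 'local', so 'sum' is created as a global
  for _, v in ipairs(t) do
    sum = sum + v
  end
  return sum
end

print(accumulate({1, 2, 3}))    -- 6
print(sum)                      -- 6 again: the "temporary" leaked into the global namespace
-- the fix is a single keyword: local sum = 0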
One of the most confusing parts is that lists and multiple values are mixed quite freely, with varying semantics. A function may return multiple values, so you can write something like:
a, b = f(x)
If f(x) returns two values then they are assigned to ‘a’ and ‘b’ respectively. If f(x) returns one value then ‘a’ gets it, while ‘b’ is assigned nil. If f(x) returns three or more values then ‘a’ and ‘b’ get the first two, and the others are silently discarded.
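A couple of hypothetical functions (mine, just for illustration) make the adjustment rules concrete:

function f2(x) return x, x + 1 end          -- two return values
function f3(x) return x, x + 1, x + 2 end   -- three return values

a, b = f2(10)       -- a = 10, b = 11
a, b, c = f2(10)    -- a = 10, b = 11, c = nil (missing values become nil)
a, b = f3(10)       -- a = 10, b = 11, the third value is silently discarded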
But this doesn’t hold true every time. In fact, if you write:
a, b = f(x), 1
then ‘1’ is assigned to ‘b’ and only the first value returned by f(x) is assigned to ‘a’. In other words, the extra return values from f(x) are discarded. If f(x) doesn’t return any value, then ‘a’ gets nil. Frightening, isn’t it?
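Again with a hypothetical f returning two values, and h returning none:

function f(x) return x, x + 1 end
function h() end                 -- returns nothing

a, b = f(10), 1     -- a = 10, b = 1: f is not the last expression, so it is cut down to one value
a, b, c = 1, f(10)  -- a = 1, b = 10, c = 11: f is last, so it expands to all its values
a, b = h(), 1       -- a = nil, b = 1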
What if the result of a function is passed as an argument to another function, as in ‘f(g())’?
Somewhat of a nightmare. If the inner call is the last (or only) argument, then all its return values are passed as arguments to the outer function; if other arguments follow it, just the first return value is taken.
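Playing with print (which accepts any number of arguments) shows the behavior; g here is a made-up function returning three values:

function g() return 1, 2, 3 end

print(g())          --> 1  2  3      (g is the only argument: all its values are passed)
print(g(), 10)      --> 1  10        (g is followed by another argument: truncated to one value)
print(10, g())      --> 10 1  2  3   (g is last again: expanded)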
Consider that a function can be invoked with more or fewer arguments than declared, without any error, and you have the whole creepy plot.
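The same adjustment applies to parameters, with another made-up function:

function f(a, b, c) print(a, b, c) end

f(1)             --> 1  nil  nil   (missing arguments become nil)
f(1, 2, 3, 4)    --> 1  2    3     (the extra argument is silently dropped)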
This approach may do for a few hundred lines of code; crossing that boundary requires strict rules and conventions to keep things working without falling into chaos.
The book I’m reading seems quite oriented toward prototype-level applications; in fact, it shows in good detail how to load and save data as source code. This is fine for your home app, but it is exactly what you don’t want to do in a real application if you care even slightly about security.
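Roughly the kind of thing the book demonstrates, reconstructed from memory as a minimal sketch of my own (names and details are mine): the data is written out as Lua source and loaded back by simply executing it.

-- saving: emit the table as executable Lua source (assumes string keys and values)
function save(name, t, out)
  out:write(name, " = {\n")
  for k, v in pairs(t) do
    out:write(string.format("  [%q] = %q,\n", k, v))
  end
  out:write("}\n")
end

local f = io.open("config.lua", "w")
save("config", { user = "john", theme = "dark" }, f)
f:close()

-- loading: just execute the file; here lies the security problem,
-- since a "data" file can contain arbitrary code and it will happily run
dofile("config.lua")
print(config.user, config.theme)    --> john  dark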
I found rather elegant the use of the associative container (the table) as a means both for structuring data and for implementing the concept of a class (I haven’t yet reached the part that deals with inheritance), although the (ab)use of syntactic sugar in the language is likely to lead to syntactic diabetes.
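A toy example of my own (no inheritance, since I haven’t got there yet): the same table mechanism works as a record and, with the colon sugar, as a class.

Account = { balance = 0 }

function Account:deposit(amount)   -- the colon hides an implicit 'self' parameter
  self.balance = self.balance + amount
end

local acc = { balance = 100, deposit = Account.deposit }
acc:deposit(10)                    -- sugar for acc.deposit(acc, 10)
print(acc.balance)                 -- 110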
According to what I have read around, it takes about ten years to fully master a programming language. I could agree, even if I think that within a year and a good amount of code you should be proficient. So my considerations are based on a first impression, and my understanding of the language is far from even a beginner’s. Nonetheless, I dare say that Lua has some problems scaling up.