Category: Blog

Back from Sciliar

I had great days on the Alpe di Siusi (Alp of Siusi), or Seiser Alm as those who live there call their home. It is a lovely, smooth plateau at around 1800m, braced by imposing Dolomite peaks. Sasso Piatto (“Flat Stone”, a sort of understatement) bounds the East side, while the Denti di Terra Rossa (“Red Soil Teeth”) bound the South side and end with the tooth-shaped Sciliar peak. We had sun for nearly seven days and, despite the warm winter, there was enough snow to ski.
Alas, in order to appreciate great things, we have to compare them with the grey, dull industrial landscape of Castellanza; that’s why (I guess) I’m back home and at work.
The first interesting surprise that hit me at work was that the milestone I was working towards has been moved up: our customer’s product has been selected for a design prize, so we are expected to deliver the working product earlier. Anyway, we’re working hard, against time and a hardware shortage, to hit the milestone nonetheless.
At home, Santa (in the person of my wife) gave me an Xbox 360 and I started playing a not-so-Xmas-spirit game: Gears of War. I’m at about the first boss and I must say that it’s great. From the technical viewpoint I think this is one of the first real next-gen games. It runs on the Unreal 3 engine and the look is as detailed as it is awesome. The gameplay is based on taking cover: as soon as enemies are encountered you’d better take cover or you get badly shot. This is somewhat different from the classical shooter, where the player drives a Rambo-like bullet-proof character (well, in Serious Sam this was intended). The first boss is a chase: run away from the monster, let him smash the doors for you and eventually take him out. Great.
While I was still fresh from the holidays and relaxed from Gears of War, I decided to update my notebook to the latest Linux available. I took a brief look at Sabayon Linux, only to discover that it behaves badly with the Toshiba touch pad and apparently has no support for my wireless adapter (I can’t believe that today’s distros still don’t support the Centrino wireless adapter, which is so widespread and at least two years old). So I turned to what I know quite well – Fedora Core 6.
I opted for the upgrade option instead of the install. Years ago I used to upgrade, only to find that the resulting system was neither completely new nor old, and often prone to glitches. A friend of mine suggested never upgrading, but rather backing up the /home directory, installing from scratch and restoring it. This time I felt so light from the holidays that I decided an upgrade would do.
Well, I was wrong.
Yes, I got a sort of FC6 tailored onto my previous FC4 installation, and, yes, the wireless adapter sort of worked. But I could only browse the Google website. No matter how I set the firewall/SELinux properties, there was no way to browse the rest of the Internet. But that is another story.

What if everything were just fake?

Sometimes reality defies my expectations, and my ego pretends that electronics, physics, computer science or whatever is not responding the way I expect works just by chance. No matter what you do, the result will always be random.

Consider a WiFi USB dongle. It behaves pretty badly indoors. Let’s face it: all manufacturers claim a 100m range but, in practice, in a real apartment it works fine within 10m, becomes unreliable at 15m and is stone dead at 20m. If you want to do real-time communication, in my experience, you’d better halve the ranges above.
Ok, it may seem stupid to try to reduce the range, but it is handy when you want to test what happens at the range limit without wearing out your shoes.
Now, well over a century of science tells you about Faraday cages; roughly put, a metal enclosure shields radio waves. So if I enclose my WiFi dongle in a pretty solid metal box, I would expect either no radio communication at all or a dramatic cut in range. Right?
Well, time to rethink.
I got the very same range as with the unboxed dongle.
With the same reliability.
It is as if the Laws of the Universe considered themselves above my request and disregarded my experiment, saving themselves for more worthy causes.
BTW, in the meantime I have read Stefano Benni’s “La compagnia dei Celestini”, Roberto Ierusalimschy’s “Programming in Lua”, Eoin Colfer’s “The Secret safe” and Alfredo Castelli’s “Rama’s Left Eye”… I suppose it’ll take quite some time to write a review for each of them :-).

Secrets and Lies

Some days ago I helped a coworker with an oddly behaving Makefile. I am a long-time user of this tool and I am no longer surprised at ‘make’ doing the unexpected in many subtle ways. This time the problem was that a bunch of source files in a recursively invoked Makefile were being compiled with the host C compiler rather than with the cross-compiler, as configured.

Make, in an attempt to ease the poor programmer’s life, pre-defines a set of dependencies with a corresponding set of re-make rules. One of these implicit rules states how to build an object file (.o) from a C source file (.c). The rule is somewhat like:

%.o: %.c
    $(CC) -c $(CPPFLAGS) $(CFLAGS) $<

And by default the CC variable is set to ‘cc’, i.e. the default C compiler on Unix systems. Bear in mind that this is a recursively invoked make, so it is hidden at least one level away from the programmer. On the other hand, the build configures the top-level make to use the cross-compiler arm-linux-gcc. The problem arises because ‘make’ variables have local scope, i.e. they are not exported by default to recursively invoked makefiles.
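Just to give the idea, here is a minimal sketch of the situation (the directory and target names are made up, not the real project layout): the top-level Makefile sets CC to the cross-compiler, but the variable never reaches the sub-make unless it is exported or passed explicitly on the sub-make command line.

# Top-level Makefile – a sketch, not the real one
CC = arm-linux-gcc

# Not exported: the sub-make falls back to the built-in default, plain 'cc'
broken:
    $(MAKE) -C driver

# Either pass the variable explicitly on the sub-make command line...
fixed:
    $(MAKE) -C driver CC=$(CC)

# ...or export it once, so every recursively invoked make inherits it:
# export CC
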
The hard part in spotting the problem is that everything seems to work as expected, i.e. the build completes without a glitch and you are left wondering why your shared library is not loaded on the target system.
Once you know it, the problem is easily fixed, but if you are an occasional Makefile user you may spend some bad hours wondering what the heck is going on.
Hiding isn’t always bad – you need to hide details to get abstraction, treating complex objects as black boxes to simplify their handling. One of the three pillars of OOP is “encapsulation”, which basically translates to data opaqueness: the user of an object is not allowed to peek inside it.
The question that arises is – how much “hiding” is good and how much is bad?
The C compiler hides the nuts and bolts of assembly programming from the programmer, so that he/she can think about the problem with a higher-level set of primitives (variables instead of registers, structs instead of raw memory, and so on).
If you want to go up in abstraction level you must accept two things:

  • you are losing control of details;
  • something will happen under the hood, beyond your (immediate) knowledge.

Going up another level we meet the C++ language, with a great deal more happening below the horizon. For example, constructors implicitly call parent-class constructors; destructors of objects instantiated as automatic variables (i.e. on the stack) are invoked when execution leaves the scope where the objects were instantiated.
If you are somewhat fluent in C++ these implicit rules are likely neither to surprise you nor to harm you. If you compare with a traditional programming language such as C, Pascal or even Basic (!), you will notice quite a difference: in traditional languages you cannot define code that gets executed without an explicit invocation. C++ (and Java, for that matter) is more powerful and expressive by hiding the explicit invocation.
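To make it concrete, here is a tiny made-up example (the class names mean nothing): nothing in main() explicitly calls the Base constructor or any destructor, yet all of them run.

#include <iostream>

struct Base {
    Base()  { std::cout << "Base constructor\n"; }
    ~Base() { std::cout << "Base destructor\n"; }
};

struct Derived : Base {
    Derived()  { std::cout << "Derived constructor\n"; }
    ~Derived() { std::cout << "Derived destructor\n"; }
};

int main()
{
    {
        Derived d;                        // Base constructor runs first, implicitly
        std::cout << "inside the scope\n";
    }                                     // leaving the scope: ~Derived, then ~Base
    std::cout << "outside the scope\n";
    return 0;
}
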
In many scripting languages (such as Python, Lua, Unix shell, PHP… I think the list could go on for quite a while) you don’t have to declare variables. Moreover, in several of these, if you use a variable that has not yet been assigned it gets a default value – usually an empty string, a null value or zero, depending on the language. This could be considered handy: the programmer saves a bunch of keystrokes and can concentrate on the core of the algorithm. I prefer to consider it harmful, because it can hide one or more potential errors. Take the following pseudo-code as an example:

# the array a[] is filled somewhere with numbers.
while( a[index] != 0 )
{
    total += a[index];
    index++;
}
print total;

If uninitialized variables can be converted to the number 0, then the script will correctly print the sum of the array contents. But what if, some days later, I add some code that uses a ‘total’ variable before that loop?
I will get a hard-to-spot error – hard because the effect I see can be very far from the cause.
Another possible error comes from mistyping. If the last line were written as:

print tota1;

(where the last character of “tota1” is a one instead of a lowercase L)

I would get no parsing error and no execution error, but the total would always be computed as zero (or, with some variations in the code, as the last non-zero element of the a[] array). That’s evil.
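In Lua, for example (one of the languages listed above), the misspelled name silently evaluates to nil instead of zero, but the failure is just as quiet – a minimal sketch:

total = 0
for _, v in ipairs({1, 2, 3}) do
    total = total + v
end
print(tota1)   -- misspelled: prints 'nil', with no parsing or runtime error
print(total)   -- the intended variable: prints 6
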
I think one of the worst implicit variable definitions is the one in Rexx: by default, Rexx variables are initialized to their own name in upper case. At least 0 or nil is a pretty recognizable default value.
Time to draw some conclusions. You can recognize a pattern – evil hiding aims at saving the programmer some coding time, but doesn’t scale; good hiding removes details that would prevent the program from scaling up.
As you may have noticed, the world is not black or white; there are many shades, and compromises are like the Force – they have both a light side and a dark side. E.g. C++ exceptions offer an error-handling abstraction, at the cost of defensive programming nearly everywhere in order to avoid resource leaks or worse.
Knowing your tools and adopting a set of well-defined idioms (e.g. explicitly initializing variables, or using constructors/destructors according to the OOP tenets) are your best friends.

Spaghetti Alla Disperata

I’m no cook, but I see this post has gathered quite a few views over the years. If you enjoy it, you are welcome to leave a like or a comment.

You are back home late and you have to have dinner. You open the fridge, wondering why it is so hard to open. The answer strikes you as soon as the door gives way – the fridge is full of vacuum and it… sucks. So you need to arrange something… what’s in there? A tomato desperately fighting against mold. A not-so-expired mozzarella… well, let’s have a look at what you can cook.

This is, more or less, what happened yesterday evening. Here is the recipe I recovered from a book, requiring exactly (lucky me, isn’t it?) what I had in the fridge.

The original name is Spaghetti alla Caprese; I renamed them Spaghetti alla Disperata (Desperation Spaghetti). For 4 people:

  • 400g spaghetti
  • 100g mozzarella cheese
  • 70g tuna
  • 2 anchovy fillets
  • ground pepper
  • 6 black olives
  • 3 spoons of olive oil
  • 350g of tomatoes

Peel the tomatoes, cut them into small cubes and remove the seeds. Put them in a saucepan where the olive oil is already hot. Cook the tomatoes for about 15 minutes, stirring occasionally.

Mix the anchovies and the tuna in a mortar (you can use a chopping board and a chopping knife instead). When the mixture is well blended, add the olives cut into small pieces.

Drain the pasta a bit early and add it to the tomatoes, then add the fish and olive mixture and some ground pepper. Finally, add the mozzarella cut into small cubes.

Enjoy, and then go shopping for something to eat tomorrow.

Frustrated programmers on Linux

Life is hard, especially during working hours – and the harder it is, the more likely you are doing your job on Linux, for Linux. If you think about it, that’s odd. Back in the day, Unix was the result of a young team looking for a programming (and hacking) environment. At the time, environments were very programmer-unfriendly, and Unix was very successful in this respect – text editors, interpreters, advanced shells, programming tools and the like flourished in Unix filesystems.
Today Unix is still like it was in those days… but in the meantime the rest of the world, most notably Microsoft, has caught up with and overtaken the Unix command line.
First, suppose you want a project-aware C/C++ editor. In the Windows world maybe you don’t have much choice, but the answer is easy: the latest Visual Studio is quite good, allowing you to do nearly everything in a comfortable way. Linux is lagging behind – there are vim, emacs and Eclipse. Eclipse is indeed good (considering the price), but its C/C++ editing capabilities are far inferior to basic Visual Studio. Maybe you can get it working the same way, but this requires a great effort from those who are not fluent in this development environment.
Suppose now that you want to interface your application with audio. If you use the basic operating system functionality (likely OSS) you can do it rather quickly, at the price of audio exclusivity: while your application is running, no one else can access the audio device.
This is a known problem and has a known solution – using a sound daemon that performs the mixing for multiple audio applications. This is reasonable.
What is unreasonable is that Linux sports a wide variety of such daemons – everyone has their own. What is even more unreasonable is that both RedHat/Fedora and Ubuntu use the eSound daemon, which has no documentation.
So there is no standard choice and, what is worse, the choice you are forced into has no documentation whatsoever.
Frustrating, isn’t it?

Searching (other people’s) code

You know, the better you know how to use Google, the better you do your work. Be it searching, discovering, copying or taking inspiration, Google is today as close as possible to the sixties idea of the omniscient computer: just ask, it knows the answer. As a programmer I’m used to googling quite often, but it’s only today that I discovered a Google tool aimed at easing the programmer’s life. Google Code Search is a specialization of the best search engine for searching through all the code available on the Internet. It is impressive by itself, allowing you to use regular expressions and to specify the language or the license! The advanced search shows a more usable form and lets you specify negation and case sensitivity.
Just as this is a great tool for programmers, it is a great tool for those looking for vulnerabilities and security issues (be they the Good or the Bad guys). Have a look here to get a glimpse of what you can dig up via Google Code Search.

Mysterious Hardware

After all, I am only a programmer. I know something about electronics, but it is nearly all theory. I did some soldering when I was young, but my resources were too constrained to allow me a real understanding of the matter through experience. So, today I’m doing my programming job on prototype hardware containing an ARM chip and some other almost invisible components. Working with prototype computers ranges from funny to entanglingly complicated, the reason being that it is very hard to tell where the software bug ends and where the hardware bug begins.
The funny part is usually when you realize it’s not your fault before having banged your head against the wall too hard.
This board has roughly the audio capabilities of an entry-level PC, i.e. a mic-in line and a speaker output. For testing purposes I hooked the audio input, via a voltage divider, to a PC continuously playing audio. The speaker out is always connected to a… well, a speaker.
Now, when no software is running, the speaker is mute no matter what comes in through the mic.
You can imagine my surprise when, on hooking the PC serial cable to the board, I heard the music through the speaker! Remove the serial cable and the speaker goes mute again. *Gosh*! That’s surely not my fault!
Another fault, as simple as it is astonishing, is a clicking noise in the sampled audio that appears only when the board is battery operated. The audio is clean when the board is attached to the AC power supply.
Nonetheless I am full of admiration that such an engineering jewel works (mostly) as expected. Kudos to my colleagues!

The Moon is the Limit

The first time I heard about the Lua programming language was back in 2003. My coworkers were working (quite hard) at porting FarCry to PlayStation 2, and the game code and engine rely heavily on the Lua language.

I am sort of split about using scripting languages to define game and character behaviors. The good part is that an interpreted language offers several benefits: no need to recompile the whole engine (or a DLL) for even a small change, and a gentler learning curve, so that even non-programmers can write a piece of the higher-level code of the game. Even if they cannot write it, at least they can get a clue of what the code is doing. Moreover, if the language is tailored to the domain (a Domain Specific Language), then coding in that language can be easier than using a general-purpose one.
On the other hand, my experiences have usually been bad – the language interpreter by itself is not enough to develop code, a debugger is needed as well, and creating a good debugger is not an easy task. Moreover, integrating the scripting language with the engine can be either boring or error prone (usually both). And the domain specific language tends to evolve into a general purpose one as soon as its users discover that they need exactly the effect of that instruction, only slightly different.
As soon as the barrier from Domain Specific to General Purpose is crossed, the nice and cute language turns into a wild beast ready to bite you in the back at the first oversight.
The lack of constraints typical of scripting languages enables quick prototyping and quick development of small programs, but leaves you alone when facing a mid-sized application. Coherence and consistency have to be achieved through rules, conventions and strict discipline.
So… how does Lua stack up in these respects?
After reading about half of the book, I’d say it stacks up pretty badly. According to the book, Lua is intended for solving problems with a few hundred lines of code, and many of the language design choices are biased in that direction. The first that comes to mind is that variables have global scope by default. This is a bad design decision, since the global namespace tends to get polluted rather quickly, and global variables are generally considered bad programming practice.
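A tiny sketch of what I mean (the function is made up): unless you remember the ‘local’ keyword, every assignment lands in the global namespace, even from inside a function.

function sum(t)
    total = 0                -- oops: no 'local', so this is a global
    for _, v in ipairs(t) do
        total = total + v
    end
    return total
end

print(sum({1, 2, 3}))   --> 6
print(total)            --> 6: the "temporary" variable leaked into the global scope

local count = 0         -- 'local' keeps a variable confined to its scope
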
One of the most confusing parts is that lists and multiple values are quite freely mixed, with varying semantics. A function may return multiple values, so that you can write something like:

a, b = f(x)

If f(x) returns two values, they are assigned to ‘a’ and ‘b’ respectively. If f(x) returns one value, ‘a’ gets it while ‘b’ is assigned ‘nil’. If f(x) returns three or more values, ‘a’ and ‘b’ get the first two and the others are silently discarded.
But this doesn’t hold true every time. In fact, if you write:

a, b = f(x), 1

then ‘1’ is assigned to ‘b’ and only the first value returned by f(x) is assigned to ‘a’. In other words, the extra return values from f(x) are discarded. If f(x) doesn’t return any value, then ‘a’ gets ‘nil’. Frightening, isn’t it?
What if the result of a function is passed as an argument to another function, as in ‘f(g())’?
Somewhat of a nightmare. If the inner call is the last (or only) argument, then all of its result values are passed as arguments to the outer function; if it is followed by other arguments, then just its first return value is taken.
Consider also that a function can be invoked with more or fewer arguments than the declared number without any error, and you get the whole creepy picture.
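Here is a small self-contained sketch of these rules (the functions are made up), in case you want to scare yourself in the comfort of your own interpreter:

function f()
    return 1, 2, 3
end

a, b = f()        -- a = 1, b = 2: the third value is silently discarded
a, b = f(), 10    -- a = 1, b = 10: f() is not last, so it is adjusted to one value
a, b = 10, f()    -- a = 10, b = 1: f() is last and expands, but only two variables wait

print(f())        --> 1  2  3   (the call is the last argument: all values are passed)
print(f(), 10)    --> 1  10     (the call is followed by another argument: one value)
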
This approach may do for a few hundred lines of code; beyond that boundary, strict rules and conventions are required to keep things working without falling into chaos.
The book I’m reading seems quite oriented towards prototype-level applications; in fact, it shows in good detail how to load and save data as source code. This is fine for your home app, but it is exactly what you don’t want to do in a real application if you care even slightly about security.
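For the record, this is roughly the kind of trick the book describes (the file name is made up): data is saved as Lua statements, and loading it back is just a matter of executing the file – which also means executing whatever else someone may have slipped into it.

-- saving: write the data out as Lua assignments
local out = io.open("config.lua", "w")
out:write(string.format("greeting = %q\n", "hello world"))
out:write("volume = 7\n")
out:close()

-- loading: running the file recreates the variables
dofile("config.lua")
print(greeting, volume)   --> hello world    7
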
I found rather elegant the use of the associative container (the table) as a means both for structuring data and for implementing the concept of a class (I haven’t yet reached the part that deals with inheritance), although the (ab)use of syntactic sugar in the language is likely to lead to syntactic diabetes.
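Something along these lines (a made-up toy, no inheritance involved): the same table holds the data, and the colon syntax attaches the methods that operate on it, passing ‘self’ implicitly.

account = { balance = 0 }          -- a plain table used as a record

function account:deposit(amount)   -- colon syntax: 'self' is the table itself
    self.balance = self.balance + amount
end

account:deposit(100)
account:deposit(50)
print(account.balance)   --> 150
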
According to what I have read around, it takes about 10 years to fully master a programming language. I could agree, even though I think that within a year and a good amount of code you should be proficient. So my considerations are based on first impressions, and my understanding of the language is below even that of a beginner. Nonetheless, I dare say that Lua has some problems in scaling up.