All software of sufficient complexity to do something useful (and even some that doesn’t do anything interesting at all) will contain bugs. This is a fact of life.
It’s the job of the programmer to find and remove these bugs. There are many ways of going about this but the most interesting, I think, is using a tool called a debugger.
If you’re a doctor, probably the best diagnostic tool you could imagine is to take a patient and cut them open and watch their innards move around, undisturbed, as they go about their daily lives. Imagine you could puncture a huge hole in someone’s chest to examine their heart and lungs while they sit there in the consulting room, chatting away amiably.
Unfortunately for the medical profession, people tend to cry out in great torment, flop on the floor and bleed everywhere if you do all that. (Though something very similar does come up in Use of Weapons by Iain M Banks, where it’s used as a party gimmick!) But programmers can get away with it.
A debugger is a tool that lets us run programs with their innards on display to the world. If something bad happens we can rewind to see what conditions caused it. We can see the last actions of the program before it did something stupid. Why did the user’s high score suddenly change to -2147483648? Why does it crash every time I click Save?
The debugger will even let you ‘run’ the program in discrete steps, like advancing a frame at a time. Or you can pause the program whenever you need — on every autosave, or every new wave of aliens, or just as the whim takes you.
If you see a problem you can tinker with the program as it runs. It would be like your doctor reaching in to ‘fix’ up your plaque-hardened arteries while you are in suspended animation, then releasing you to see how you feel. Pretty neat.
Debuggers are not much use outside the programming field. In fact, they’re one of these specialised tools that people can’t even imagine until they’ve seen one. But they’re also extremely cool, which is what counts.
It’s been a while since I spoke about bugs, so I think it’s time to get back into them. Today’s bug is called a memory leak. It is quite common but not generally fatal. It tends to show up in programs which run for a long time, and over time the performance will gradually reduce to nearly nothing. But if you restart the application it all works wonderfully again.
So what is this tricksy thing which slowly eats away at the speed of our applications? A memory leak happens when two parts of a computer system each think the other is responsible for releasing some memory. In reality neither releases it, so it can never be reused, and the pool of available memory shrinks.
[Behind the cut: a comparison of memory leaks to locker keys]
In general it’s impossible to tell whether a program is merely running for a very long time or will run forever (computer scientists call this the halting problem). If it’s been running for twenty minutes, it may be just about finished. Or it may need twenty years. It may need twenty thousand years.
Or it could just be spinning its wheels in the mud, working furiously on nothing in particular. This could be because of problems outside the program itself — a browser will appear to be doing nothing if you try to access the web when you’ve not got internet access. That is no fault of the program, or its programmer.
But there are infinite loops which a programmer can cause. A good number of times a programmer will start off a series of tasks but never check whether they have finished. This problem has been immortalised on film in Fantasia, with poor Mickey Mouse as the inept programmer.
For the lucky folk who don’t know the story of the sorcerer’s apprentice, it goes like this. Mickey is the apprentice to a powerful wizard, and like all apprentices (The Karate Kid, A Wizard of Earthsea) spends most of his time doing drudge work. He’s tasked with fetching water from the well and — while his master is away — attempts to use magic to save effort.
He programs a broom to come to life and carry in the pails of water while he has a snooze. Alas, Mickey makes a schoolboy error — he sets off an infinite loop. The animated broom has no concept of when to stop and continues to bring water into the house until the floor becomes flooded.
The lesson could well be learned from this but I’m not satisfied. We can take the analogy further (for no good reason). Mickey cannot stop the magic broom so smashes it to pieces with an axe. Each piece jumps into life, grows arms and legs and picks up a bucket for itself.
In computer terms the hapless apprentice spawned dozens of non-terminating programs from his original one. What a mistake! Luckily the system administrator sorcerer came back at the right moment to kill all the magic broomsticks.
If you were paying attention to the field of pseudoscience a couple of weeks ago you’ll remember the story of the maths teacher who revolutionised division by zero and without a by-your-leave was teaching it to secondary school pupils. Not only was he subverting the standard procedure of childhood education (by teaching his pet fantasies as the next great Truth) he was also talking a load of cobblers.
His point rests on the unfortunate fact that there is no answer for division by zero. In fact, it’s defined as not being answerable. In computer programming this has the unfortunate result of occasionally causing a problem or two. Division doesn’t get used in maths or programming as much as, for example, addition, but it does pop up. What was your average score for all the games you’ve played? Well, add up all the scores and divide by the number of games.
But what if you’ve only just installed the software and the number of games played is zero? What then is the average, since we cannot divide by zero? This problem is easy enough to predict and avoid: we only calculate the average if there’s been at least one game played. Some circumstances are less easy to predict, and sometimes the programmer just plain forgets to check.
The result of division by zero is an error. In floating-point arithmetic the erroneous result even has a name: Not a Number, or NaN for short. If the programmer commits such a faux pas then one of two things will happen. The program may follow Elvis and “leave the building”, which will often result in a little dialog box in Windows saying the program attempted a division by zero.
The other thing that happens — and this is, I think, more common in languages which aren’t fussy about types — is that “NaN” is rendered as a literal result. So the average score over the last zero games played is… NaN.
Just to illustrate the point, codeman38 kindly allowed me to use the weather widget shown. The person who wrote this obviously didn’t check for the presence of zeroes before dividing and the result is as you see — very silly.
This is my second in an exciting series of Bugs That Make Your Computer Go Bang. The first one was about SQL injection flaws in web applications. This time I’ll be talking about buffer overflows.
Buffer overflows are extremely important. I’d hazard a guess that they’re the most common cause of exploitable flaws in modern software. Certainly the big names in malware over the last few years — such as SQL Slammer and Sasser — were made possible by buffer overflows.
It’s also really easy to write code with a buffer overflow bug. That, of course, is why it’s so common. Anyone who writes a program could make this mistake.
We all know it: multitasking is wonderful. (I’m watching a video about Software Transactional Memory while I write this post. It turns out that Simon Peyton Jones looks like Rupert Everett…) You’re probably listening to music while reading this web page, and there’s probably at least an email package running at the same time as well. It’s all pretty great.
The strange thing is that the traditional computer can only do one thing at a time. Multitasking is all an illusion, using speed to gloss over the single-tasking reality.
The computer will have dozens or more tasks on the go at once and will be effortlessly swapping between them faster than you can blink, so you never notice.
[Behind the cut: how multitasking works and what problems it causes]