Thursday, 1 July 2010
Well, it gives me no pleasure to have been proved right in my earlier predictions - on Sunday the English football team were indeed humiliated and sent home by a clearly superior Germany. At least Germany had the grace to inflict a heavy defeat during normal time, without it having to go to penalties. Indeed, at 4-1 it was England's heaviest ever World Cup defeat.
Yes, Lampard's goal was disallowed, and maybe in some alternate universe it proved a turning point, inspiring England on to victory. But in this 'verse Germany weathered some pretty sustained pressure and then simply ran the length of the pitch and scored. Twice. The disallowed goal will give the disaffected fans something to whine about for the next few years (alright, for many, many years), but it finally makes up for England's disputed goal against Germany in the 1966 final.
Overall it was a lacklustre campaign. This wasn't helped by the fact that the British press are both vicious and fickle, and having whipped up national hopes (as always) to near-hysterical levels, will now turn on the team, looking for blood. The goalkeeping error against the USA was instantly (and rather brilliantly) dubbed 'The Hand of Clod', and Monday's headlines proclaimed 'Rout of Africa' and 'Fritz All Over Now'. The fate of Fabio Capello's stint as boss must now be in the balance.
So what's a nation to do? Why, turn to tennis, of course. Andy Murray's through to the Wimbledon semi-finals tomorrow. He's Scottish, but that's near enough for the English papers, who conveniently refer to him as British while he's winning. And the cricket's going rather well, although the prospect of a whitewash in the current one-day series has been spoiled by Australia rather inconveniently winning the fourth match to bring the score to 3-1.
Oh, one more thing. My wife has asked me to point out that, despite my previous snide comments, she does indeed understand the football offside rule. And she gave a very convincing demonstration of this using the kitchen table, assorted condiments for players, and a handy hard boiled egg for a ball. I most humbly apologise.
This is not an article about byte ordering.
Nor is it a discussion of the correct way to eat your eggs.
Rather, it is an exposition on the inability of the human brain to grasp the implications of statistics.
I'm writing this on a Saturday evening. On Friday night, the last thing I did before leaving the office was to start a stress test running on one of my chip designs. It's testing the chip communications. I'm testing that under no circumstances will the chip lock up, or fail to respond correctly to communications. I'm running the test because previous designs on other silicon have exhibited failure modes in which it was possible for the chip to stop responding to communications.
On Monday morning the first thing I will do on going into the office will be to see if the LED attached to the chip is still blinking, indicating that the test is still running and no failures have occurred. I've been running this same test every night and every weekend for about three weeks, and seen no failures yet. Does this mean that the communications in this chip are bulletproof? No, it does not.
The highly recommended book The Black Swan, by Nassim Nicholas Taleb, talks about the nature of randomness, and in particular what the author dubs Black Swans - events deemed to be of very low likelihood, that have a devastating impact, and that can be explained away after the fact by experts with a little judicious hindsight. (In this podcast, Taleb describes this as "a retrospective ability to weave a causal link", which is a rather wonderful turn of phrase.) These would be the selfsame experts who were completely blindsided by the events as they unfolded. Events such as the two World Wars and the various economic disasters of the last century or so are all Black Swans.
Taleb, in both this book and the earlier Fooled by Randomness, pours withering (and often entertaining) scorn upon such experts. Economists, stock traders, MBAs, and indeed anyone having the temerity to make predictions about the future based on past trends, are all mercilessly ridiculed. (He makes a few honourable exceptions - Karl Popper, George Soros, and Benoit Mandelbrot have all earned his respect.)
In one chapter of The Black Swan Taleb discusses the nature of human fallibility. In it he talks about the medical acronym NED (No Evidence of Disease). Apparently this is written on a patient's records after some tests have been run, and no sign of any malignancy or unusual activity has been found.
What, Taleb points out, doctors will never write is END - Evidence of No Disease. That is, they will happily say that they did not see any sign of a problem, but they will never say that there is no problem.
As engineers, it behoves us to adopt the same approach. If you have been working as an engineer for any length of time, you will have been caught out by the apparent absence of problems in just the same way that I have in the past. Just because we have run tests for a period of time and seen no bugs, this does not mean that there are no bugs - it just means that we have not seen any. But even though I'm now on the lookout for feathery portents of doom, I'm still really hoping that that LED will be blinking on Monday morning.
Thursday, 10 June 2010
[Disclaimer: this post is completely unrelated to software, firmware, hardware, engineering or managerial practices, and in truth doesn’t really belong on this site. Normal service will be resumed in due course.]
A quadrennial crop of Saint George's crosses is flowering across the nation. Obviously every pub, white van, and lager lout seems to be bedecked with the emblem, but so are a surprising percentage of ordinary houses, shops, and cars. The reason of course is the forthcoming football World Cup.
A quick note to any American passers-by: when I say 'football', I mean football - what you would call soccer. What you call football is in fact American Football, which I am assured by sports-minded people is completely different. I'm told this is similar to rugby, but they stop every time anything exciting looks like happening. Also, when I say 'World', I actually mean the entire World - calling something a 'World Series' when it's only you that plays in the competition is charmingly parochial, but displays a shocking failure to grasp the basics of cartography. Thank you for your time and attention ;)
Yes, the World Cup is coming, and even I have noticed it. I don't watch television, don't read newspapers, and assiduously avoid news web sites. But in this country the sport is so insidious, pervasive, and all-embracing, that even I know Beckham's not playing this time round, thereby dashing his hopes of playing in four consecutive World Cups, Rio Ferdinand is out with an injury, and Wayne Rooney was recently carded for swearing. It's some form of weird cultural/sporting osmosis, I think.
I generally actively dislike watching sports, preferring to take part myself. But there's something about the camaraderie and group experience of watching a high profile, high stakes match with complete strangers in a packed room that breaks down the normal barriers and leaves you high on the atmosphere of the situation.
I vividly remember seeing Beckham sent off, Gazza in tears, and the hand of God. (This last, in this country at least, shamefully overshadows the simply breath-taking other goal that Maradona scored in the same game - still one of the most awesome pieces of skill you will ever see on a football pitch. I recently heard an interview with Gary Lineker, who was playing that day, in which he said that he wanted to simply stand up and applaud this goal at the time.)
It was before my time, but Kenneth Wolstenholme's commentary on the closing moments of the 1966 World Cup ('they think it's all over…') is completely legendary here. I suspect that it's actually taught in schools. Although any discussion about this game with German colleagues has inevitably ended in a dissection of England's controversial third goal...
In any case, if you can't stand football, good luck for the next month. There's normally a few places around here advertising themselves as footy-free zones during World Cups and European Championships, but this time round they seem conspicuous by their absence.
English World Cup campaigns generally follow a well-worn path. They start off with boundless enthusiasm and wildly optimistic self-belief on the part of the entire nation, to be followed inevitably by gut-wrenching disappointment, ignominy, and defeat. Generally featuring Argentina or Germany, and probably a penalty shoot-out for good measure. Yet the fire of hope burns forever bright.
I for one am looking forward to it. Come 7.30 on Saturday evening when England kick off against the USA, I'll have found a pub or friend with a TV, will be clutching a pint, and will be loudly expressing my concerns with the manager's strategic choices, something about which I know absolutely nothing. I do know I will be trying to explain the offside rule to my wife YET AGAIN.
UPDATE: In the office sweepstake, I've managed to pick England as my team. I tried to explain to the girl running it that I'm such a dead cert to win that she might as well hand over my winnings now, but unfortunately she too has known the heartbreak of following the Three Lions, and politely told me to naff off.
Wednesday, 28 April 2010
In AVR chips, game-changing actions like changing the clock prescaler or setting the watchdog are so important that they are afforded a protection mechanism all of their own. Timed write sequences are used to prevent accidental or rogue writes from fundamentally messing with your day.
These require you to perform a sequence of write operations within a defined time period - typically you need to do two writes within four clock cycles.
Care must be taken when performing such a write sequence that no other operations could take place that would violate the required timing. Typical candidates for such mischievous behaviour are interrupt service routines, which have an uncanny knack of firing at the most inconvenient times.
The obvious way to ensure that your timed write sequence is interrupt-safe is to disable interrupts around it, like so:
void do_timed_write_operation( void )
{
    __disable_interrupt();
    // timed write sequence goes here
    __enable_interrupt();
}
Unfortunately this contains a nasty little gotcha: if you call this function from code in which interrupts are disabled, they will be enabled on return from the function. This is almost certainly not what is intended, and can lead to late night debugging sessions.
A safer approach is to save the interrupt state on function entry, disable interrupts, do your business, and then put the interrupts back the way you found them:
void do_timed_write_operation_2( void )
{
    uint8_t interrupt_state = __save_interrupt();
    // timed write sequence goes here
    __restore_interrupt( interrupt_state );
}
This approach is so common and idiomatic that the IAR compilers for AVR support it directly with the __monitor keyword. The following function is equivalent to the previous one:
__monitor void do_timed_write_operation_3( void )
{
    // timed write sequence goes here
}
Unfortunately declaring a monitor function involves a code overhead - after all, even though you can't see it explicitly in the C, a quick squint at the list files will show that the interrupts are still being saved, disabled, and restored.
A saving grace here is that most such Big Important Things tend to happen during system initialisation, when interrupts are disabled anyway. In such circumstances you can perform your timed writes au naturel, without any interrupt protection. However you really need to convince yourself that it is safe to do so. It would be a brave man indeed who called the first function above once the system was up and running without spending a very long time looking at possible consequences.
In general I would suggest that all timed write sequences be protected by default. Such protection can be removed if you're convinced that it's safe to do so - provided, of course, that you document this assumption in the comments.
Wednesday, 21 April 2010
This is an excellent podcast by David Heinemeier Hansson (DHH) on Legacy Software. It's a recording from a talk at RailsConf Europe, and the podcast (i.e., audio-only) format doesn't really work in the last third of the talk, in which he's refactoring some code on stage. However, the gist of his argument is completely sound - if you wince in embarrassment/pain/surprise when you look at code you wrote a few years ago, it's nothing to be ashamed of. The code hasn't got worse - you've got better. In the interim you've learned new techniques and idioms, and the fact that the earlier you wasn't favoured with your current clear-sighted and visionary approach to writing code shows that you've progressed.
[The corollary of this is that the future you will be embarrassed/pained/surprised by what the current clear-sighted and visionary you is doing today, but what the hey, just appreciate the step change from yesteryear.]
Also highly recommended is this appearance by DHH on Jason Calacanis's This Week In Startups podcast. Two successful and highly opinionated people with entirely different takes on business, money, and work slug it out. Who's right? They both are, of course. They've figured out what works for them, and good luck to them. You don't have to agree wholeheartedly with either to enjoy the debate.
Sunday, 28 March 2010
Warning: this article contains meta material. If you are easily offended by blogging about blogging, please stop reading now.
Still here? Good, let's crack on then.
I personally am not a fan of metablogs - I read blogs for interest, entertainment, and education, and am not particularly interested in reading the authors' pontifications on the state of their respective navels. One saving grace for blogs in this respect is that they are less susceptible to the curse of self-reference than tech podcasts. These seem without exception to descend into an analysis of the microphones, headphones, software, and patch cables being used. I suspect this is largely a by-product of putting geeks in close proximity to, you know, geeky stuff. Perhaps blogging is slightly less prone to this phenomenon as it features fewer tools - after all, you just need a text editor.
And so it was with some surprise that I found myself writing this. What prompted it was the recent announcement by Joel Spolsky that he's going to quit (actually, by the time this goes up it will be 'has quit') blogging. He made the announcement in an Inc Magazine article, and his blog entry on distributed version control looks like being his last.
In case your universe hasn't intersected that of Joel's, he runs Fog Creek Software in New York. They have several products in their portfolio, but I think it's fair to say that their FogBugz bug tracking application is their main calling card. Joel has also run the Joel on Software blog for some time, and is a well known pundit and commentator on all things software. More recently he co-founded the Stack Overflow website, a Q and A website for programmers that has very rapidly gained a huge chunk of mindshare.
In his valedictory Inc Magazine article Joel states amongst other things that blogging is just too narrowly focused to be effective advertising for his company. Which makes perfect sense if you're using it as a source of potential revenue.
In his penultimate Joel on Software article Joel lists some of the article subjects that have been written about several kajillion times, by legions of pundits exploiting the whole democratisation of opinion made possible by the explosion of the Internet. Which makes perfect sense if what you want to do is something truly original, especially in the narrowly-defined genre of geekerature.
All of which made me pause and consider - why am I doing this? If someone as well known as Joel has had enough, doesn't see the point, and is getting out, why should I bother? Why expend time and mental effort re-visiting subjects that other people have already addressed, more incisively and profoundly than I can? A few reasons spring to mind:
- I enjoy the act of writing. I take pleasure in jotting down fleeting ideas or comments, and letting them germinate until they spring forth as fully-fledged articles. It forces you to articulate and justify vaguely-held opinions and beliefs. Under the weight of such scrutiny some ideas have self-imploded and got thrown out as blog fodder, but that doesn't diminish the importance of the process one whit. In fact, if anything it increases it.
- Joel's been doing this a lot longer than I have. He's been there, done that, got the T-shirt, and it's time to move on. Actually, he must have a whole wardrobe full of T-shirts. For me the world of blogging is still new and shiny and pretty. I'm enjoying it, so why stop?
- There's a vague chance that some of my ramblings may be of some use to somebody, somewhere, some when. Goodness knows I've learnt plenty from other people's blogs, and in some sense it's a little bit of repayment of accumulated karmic debt.
- The articles act as a reference of techniques and ideas that I can easily find in the future. If I want to do some fixed point arithmetic, or generate some random numbers, I've got ready-to-go code snippets and links back to original source material. Of course I haven't actually written all that much yet, but I've already had reason to dip back into the archives a few times. I've got a stack of blog ideas and sketched-out articles, so this blog will hopefully turn into a useful resource for the future me.
- It is also of course partly self-aggrandising/self-serving. The Internet has zeroed the cost of vanity publishing, so I may as well put something out there. Something I can link to at some point in the future to flesh out my CV a bit. Something that shows that I haven't spent the last several years just fulfilling a mandate as a corporate drone, but that I also think about and care about what I do.
My apologies for the brief digression into self-analysis. Relatively normal service will be resumed in due course.