This week’s column starts with a mea culpa. The column about Microsoft’s meeting with the Ministry of Education raised some eyebrows, and both Ministry employees and individuals wrote in to point out inaccuracies in the reporting. They rightly observed that the author did not attend the meeting in question, and was therefore presenting hearsay evidence. While efforts were made to corroborate the details presented, it is an unfortunate truth that no public record was available. If any of the facts were incorrectly reported, the responsibility lies entirely with the author.
In the course of discussions about how to properly correct the record, two points kept recurring, both explicitly and implicitly. First, so-called ‘geeks’ often focus far too much on technology and not nearly enough on what it’s actually for. Second, there’s often a lot of talk – some might say too much talk – based on speculation. Making blithe assumptions can spell disaster for any project, but those with high tech as a principal ingredient are even more prone to failure because of their inherent complexity.
Technology doesn’t come cheap. It’s generally true that properly functioning technical systems are cheaper than the alternatives, but getting those systems to run properly is often what geeks call ‘a non-trivial task.’ A computer is one of the most complex human inventions ever created for everyday use. Making dozens or hundreds of them work together doesn’t just add to this complexity; it multiplies it.
In the typical ICT-related project, time is short, planning is not detailed enough, and although intentions are always good, there’s often not enough time to obsess over all the possible outcomes. The usual result is that these projects reach an ‘OOPS’ moment, in which people realise that the best-laid plans of mice and modems oft go awry. Sometimes it’s possible to recover from the error gracefully, sometimes it’s painful, and sometimes there’s just no escaping the need to step back and try again.
So the odds of getting things at least a little wrong are usually high, and the cost of failure can be steep indeed. Not a very auspicious starting point for something that’s often touted as the answer to all our communication needs, is it?
Now let’s consider the plight of the average IT staffer. They’re tasked with keeping everything running, keeping abreast of the latest advances, planning for tomorrow and, above all, trying to make it all comprehensible to people who don’t have all day to study technology. Any one of those tasks can be a full-time job.
It’s a truism that technology moves quickly. Every day brings new and often unpredictable occurrences, inventions and trends. Computer users’ needs change constantly; hardware capabilities and specifications never stay put; new software appears overnight, like mushrooms on… fertile ground.
It’s understandable, therefore, if there’s a natural inclination among IT people to limit the scope of their work to what they consider to be essential. It’s a skill that gets learned early on in the technology game, and becomes hard to unlearn. Things change so much, so fast, that talking through the implications of every course of action can seem almost pointless.
Unfortunately, it’s not optional.
Computers are nothing more than the medium through which people communicate. The extent to which they play nice with one another defines how easily we can exchange information of all kinds. Technology, in other words, deals primarily with how we communicate, but it’s up to us to determine the what and the why. If we don’t answer that first, technical question of ‘how’, though, the other, more important questions become moot.
Just as no man is an island, in this day and age no computer stands alone. While it may be frustrating and time-consuming to chew over every technical detail of every bit of technology, endlessly considering different scenarios and approaches, there really is no practical alternative. It was to this end, for example, that the VIGNET mailing list was created. There are many others like it, but this one is ours.
A community of practice in which people can share their insights, experience and opinions is a key asset in coping with the ocean of information that IT professionals swim in every day. It gives people the opportunity to learn from others’ mistakes, to benefit from alternative views and, ultimately, to ensure that the right hand knows what the left hand is doing.
Technology decisions are never taken in a vacuum. The technological choices that one person makes have both direct and indirect impacts on the choices others make. Some of these impacts are intentional, some are not. Putting one’s work out in the open and talking things through is one way to avoid the worst. Basically, it’s like pooling the spare processing cycles of numerous computers in order to solve a very complex problem. What might take hours on a single processor takes only minutes when the task is split into morsels and processed by a whole bunch of them.
Of course, humans are not computers, and that means a public forum is a decidedly imperfect place. There are sometimes misunderstandings, priorities don’t always align, and digressions and plain old silliness sometimes divert us from our work. Still, it’s the best process available at this point in time.
Everyone benefits from a free and open exchange about information, communications, and the technology we use to achieve both. Mistakes and misunderstandings may happen, but they can be resolved with a little patience and a little more talk.