Over the last 10-15 years public IT in Europe has not developed in line with public interests, nor does it guarantee the fundamental rights of citizens such as privacy and freedom of expression. Tremendous opportunities in the field of economic development and employment have also been missed. Europe effectively outsources much of its information processing (software & services) to foreign parties at a direct cost of hundreds of billions of Euros (typically around 1% of GNP). The opportunity-cost to local economic growth and employment is much greater than that. Even more costly than either of these is the de-facto handing over of control of the data of governments, businesses and individual citizens to foreign spies who use it for political manipulation, repression of citizens’ freedoms and industrial espionage. Although warnings about the negative consequences of current policies date back at least 15 years, these aspects have been documented in irrefutable detail over the last year by the revelations of Edward Snowden. Twelve months later there has not even been the beginning of a policy response.
It could all have been so different …
In the first 21 months of the 21st century, the dot-com bubble burst and then three skyscrapers in New York collapsed. Between these two events a largely forgotten report to the European Parliament appeared in the summer of 2001. This report described the scale and impact of electronic espionage in Europe by the U.S. and its ‘Echelon’ partners (Canada, UK, Australia and New Zealand). Besides a detailed problem analysis, the report also gave concrete examples of IT policies that governments could take to significantly limit foreign intelligence spying on Europe.
In the same period the U.S. government won one of the largest anti-trust cases in its history, against Microsoft, and the EU followed this victory by launching a similar case that it would also win, leading to the highest fine ever imposed on a company for economic crimes in the history of the EU.
It was against this background that thinking about strategic versus operational aspects of IT in the public sector changed. The report on Echelon made it clear that reducing IT to a merely operational exercise had disastrous consequences for the sovereignty of European states with respect to the United States in particular (and perhaps, in the near future, China, other technically capable countries or non-state organizations). The economic consequences of industrial espionage against many high-tech and R&D-intensive companies became a major concern for governments.
The IT policy of governments would from 2002 onwards be based first on the political principles of a democratic and sovereign state. This not only meant a very different policy in the field of technology selection and procurement, but also a different balance between outsourcing and in-house expertise, and it required an extreme degree of transparency from all suppliers. Open data standards for public information were required, and non-compliance resulted in severe penalties (although public ridicule from 2009 onward was generally the most effective). These new frameworks for public IT created a new market for service providers who based solutions on so-called ‘Free Software’ (previously better known as ‘open source’). The high degree of transparency, in both project implementation and the technology itself, made for a well-functioning market and made recycling of (parts of) systems the norm. Spending on software fell sharply and the freed-up budget was used for the recruitment of highly qualified IT workers under conditions that could compete with the offerings of the market.
The full transparency with respect to both the IT projects and the tech itself, combined with a depth of expertise within the government, changed the market for public software and IT services. Quality rose steadily while prices remained permanently under pressure. Since all service providers had full access to all software used in government (with only a few exceptions in defense, justice and home affairs), there was a very open playing field where all providers were expendable (and those who performed below par were replaced regularly).
In addition, computer and IT education from kindergarten to university was fundamentally revised. A basic understanding of the operation of computers and information networks became as normal as reading and writing. From 2006 every 14-year-old was taught in school how to encrypt email and what the disadvantages were of using software whose source code is not published. Through this awareness among young people in Europe, the adoption of social media occurred very differently than in the U.S. Young people not only had end-user skills but a real understanding of what was happening to their information when sending a message or uploading a photo to websites. Being careful with your private information was considered cool. The social media landscape was not dominated by a handful of U.S. companies; instead there was a landscape of federated services such as Diaspora which competed among themselves but were compatible, in the same way as is the case with email. These services were sometimes somewhat centralized but, just as often, completely decentralized and run on micro-servers in many people’s homes (such as the UK-invented 35 Euro Raspberry Pi).
Due to this high privacy and safety awareness, online crime did not get much of a grip on most European countries. Hardly anyone was naive enough to log on to strange domains or websites in response to a fake email that appeared to come from their bank. And the use of customized secure USB drives issued by various banks was accepted as obvious for any major online financial transaction. At the level of organisations, high levels of expertise and a high degree of diversity in technology implementations made for robust security that was only seldom breached. The large demand for experts in well-paid jobs also kept many would-be criminals from selling their skills for more destructive applications.
This is the IT that Europe could have had if other choices had been made over the last 12 years. All the knowledge and technology for these choices was available in the first months of this century. Because these choices were not made, Europe has spent hundreds of billions on software licenses and services from American companies, while there were cheaper (often free), more flexible and safer alternatives available that would not operate as a foreign espionage platform. All these hundreds of billions were not invested in European services, training, education and R&D. The economic impact may be a multiple of the roughly $1 trillion in foreign software licenses spent by Europe this century, while the social cost resulting from manipulated politicians during transatlantic negotiations on trade or environmental matters will probably never be known.
Europe still has everything it needs to develop and implement such policies. It is not too late to turn, no matter how regrettable the policy failures of the last decade and no matter how many wasted billions. Today could be the first day of such a new course. Concrete examples in the Netherlands, Germany, France, Spain, the UK and many other places show that this is not only possible, but almost immediately leads to huge savings, improved safety and independence from foreign parties in future IT choices.
It’s not often that regaining national sovereignty and the restoration of civil rights can spur national innovation and employment programs simultaneously. The only thing missing is the political will to stop rewarding businesses and governments that use their technological dominance to spy on the entire world. We have nothing to lose but our chains to the NSA.
Gartner, IT journalists and even former employees of Microsoft agree: Windows 8 will be a disaster. The Metro interface, designed for tablets (a market that virtually does not exist in relation to MS Windows), is unworkable on a desktop with a vertical non-touch screen, keyboard and mouse. Most office spaces still have this setup, and most run legacy applications with interfaces that rely on a Windows PC with a keyboard and mouse. It is precisely the ongoing purchase of desktop PCs with the combination of MS Windows and MS Office that has kept Microsoft financially afloat over the last 15 years.
The combination of legacy applications (mostly proprietary) and familiarity with MS Office led many IT organisations to automatically buy the new Windows platform, despite the high cost of licences and support. The inevitable result is a world of pain, with new interfaces, a lack of compatibility and the sudden cessation of support for critical components. IT policy is organised around coping with these problems instead of focusing on sustainable alternative solutions. And solving or mitigating these problems requires so much time and money that there is often little left over to plan further ahead. Thus, in many organisations the perfect vicious circle has existed for so long that many IT people cannot even see it.
An important point here is that Windows 8 is only a disaster for those who buy it and those who are unsuccessfully trying to sell it. For the rest of us, it is irrelevant. So if you use a Windows 7 PC, Mac or Linux machine, it is very easy to just let all this misery pass you by. After a disastrous version of Windows is released, another (slightly less) catastrophic version (think ME/XP or Vista/7) will follow, so those who still genuinely believe that they need a Microsoft operating system can simply hope that a half-decent version will come along in a few years.
Organisations that have (virtually) no platform-dependent applications left, because their applications provide a web interface, have no reason at all to even think about purchasing proprietary operating systems. Organisations that do use such applications are better off just sticking with earlier (already purchased) versions of MS Windows, so that all interfaces remain compatible and end users can continue working in their familiar environment. The IT department’s resulting spare time and money can be used to break the vendor lock-in between applications and platforms.
Most application vendors are now thinking about web interfaces, or APIs for tablet apps (even if it is just to keep company directors happily playing with their iPads). Application vendors who are not yet doing this should understand that in times of tough cuts IT euros can only be spent once, either with them or with Microsoft. Seems an easy choice, right? Fortunately, even company-specific applications do not last forever and when the time comes where there is something new to choose from it is useful to calculate the TCO of applications by including the underlying infrastructure costs (licences, management, security), and compare this to the TCO of applications that do not have such dependencies. Conversely, you can also say to your hoster: “I do not care what platform you run my applications on, but what would I have to pay you if it is an open source stack?”. A little negotiation is always possible in a stagnant market.
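The TCO comparison suggested above can be sketched in a few lines. This is only an illustrative model with invented figures – the cost categories (application licence, platform licence, management, security) follow the column, but every number is a hypothetical placeholder, not real pricing:

```python
# Illustrative TCO comparison: the same application costed with and without a
# proprietary platform dependency. All figures are hypothetical annual costs
# per seat, in euros.

def tco(app_licence, platform_licence, management, security, years):
    """Total cost of ownership: the sum of annual costs over a number of years."""
    return years * (app_licence + platform_licence + management + security)

# Same application, two underlying stacks, over a five-year horizon.
proprietary_stack = tco(app_licence=120, platform_licence=180, management=90, security=40, years=5)
open_stack        = tco(app_licence=120, platform_licence=0,   management=90, security=40, years=5)

print(proprietary_stack)  # 2150
print(open_stack)         # 1250
```

Even with such toy numbers the point stands: once the underlying infrastructure costs are included, two applications with identical licence prices can have very different TCOs.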
As with Vista, the main victims of Microsoft’s iPad-wannabe software are the basic PC consumers – those who buy a PC or laptop from a retailer and get a machine with a pre-installed disaster. In the coming years many IT professionals will have to deal with family, friends and acquaintances crying down the phone because they cannot find or use their favorite or essential PC applications. It will be Vista revisited. Do your friends a favour and downgrade them to Win7 if needed or upgrade them to Ubuntu if possible. The main reason why home users still want Windows is for gaming. Fortunately, people have worked hard on alternatives, including by previously mentioned former employees.
Although I dislike the iPad because of its extremely locked-down platform, tablets (with the first iPad) have presented to non-techies, for the first time in 20 years, a completely different platform to the Windows PC. So for the first time in aeons there is a widespread discussion about possible alternatives. Once we take that mental step, we open the way to discuss IT policy that really starts with the question of how best functionality is achieved at the lowest possible cost (which may also lead to discussing the underlying platform).
If Microsoft’s profit margins on the Windows/Office combo are cut back to 20% (it is currently 60-80%) the TCO figures will be more reasonable. Like IBM, over the years Microsoft will become an ordinary business providing rather boring-but-sometimes-necessary products at more normal profit margins. And that, except for the shareholders, is not a disaster.
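As a back-of-envelope illustration of what such a margin cut means for prices (the production-cost figure is invented): if the margin is expressed as a share of the sale price, then price = cost / (1 − margin), and cutting the margin from 65% to 20% would more than halve the licence price:

```python
# What could a licence cost if the vendor's profit margin fell from 65% to 20%?
# The cost figure (100 euros per licence) is purely illustrative.

def price_at_margin(cost, margin):
    """Sale price such that (price - cost) / price == margin."""
    return cost / (1.0 - margin)

cost = 100.0  # hypothetical cost of producing and supporting one licence
print(round(price_at_margin(cost, 0.65), 2))  # 285.71
print(round(price_at_margin(cost, 0.20), 2))  # 125.0
```

In other words, at the same underlying cost, a 20% margin implies a price of less than half the 65%-margin price – which is why the TCO figures would suddenly look so different.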
Update: in the week after publishing this column, a few dozen Dutch government organisations promptly made my point with the total loss of network functionality due to a nasty Windows virus. The infection is still going on, and the data loss and privacy implications of the breach are still being investigated. Many sysadmins have been working overtime to contain the problem. Of course there will be another one of these six months from now, and so on and so on. This has been going on for years.
Doublethink is a concept introduced by George Orwell in his famous novel ‘1984’. It is a mental mechanism that allows people to sincerely and simultaneously believe two completely opposing ideas without a problem.
In the ten years that I have been involved with open source and open standards in the Dutch public sector, I have encountered many double thinkers. So for years I have endured “experts” and insiders patiently explaining that the migration to open source desktops within that community would be impossible, because civil servants could not work with other platforms. Asking non-techies to use anything but the Windows + Office desktop they were taught at Dutch schools would lead to disaster. It Just Could Not Happen.
The certainty with which this (to this day) is mouthed as an aphorism everywhere has always amazed me. Previously, the Netherlands had migrated from WP5.2 in DOS to Windows Word 6, yet the Earth kept turning, children went to school and there was water from the tap.
Multiple migrations, mostly outside the Netherlands, have also demonstrated that ordinary users can do their work well with alternative platforms, provided they are given some training and support (something, indeed, that is perfectly normal when migrating to new releases of the usual proprietary systems).
The same people who for years have claimed with great certainty that “It Just Could Not Happen” have been busily rolling out iPads to the many managers and directors, who for many and varied reasons discover they need one. Apparently the adoption of an entirely different platform with a totally different interface is not as problematic as was asserted for all those years. Huh?
The classic “civil service desktop” tribe, led by IT heads of ministries and municipalities and supported by Microsoft, Pinkroccade and Centric, have had many happy years of “standardising” the Netherlands on proprietary tools, the management of which would then be done by the Dutch business partners of Microsoft. When asked why such a vulnerable and expensive monoculture was necessary, the standard reply is "working together!". For “working together”, according to these people, can only occur if everyone works with exactly the same stuff (never mind that millions of people on the internet are working together with very different tools). And that stuff should be consistent with what people already know, because learning something new is ultimately ‘not realistic’.
The Web 2.0 tribe wants everything on "the cloud" so that with iPads they can “work together” from Starbucks with colleagues and consumer-citizens-entrepreneurs. That this places control of state information in the hands of uncontrolled private and foreign parties is not part of the discussion. "We must work with the most modern tools!" When asked what they do in concrete terms, the answer is almost always shifty or there is some muttering about experiments and the importance of “working together”.
Both of the above tribes mix at “e-government” conferences and other such events and hear both perspectives, one after the other, with nobody apparently perceiving these contradictions. It is Doublethink in its ultimate form: simultaneously believing two contradictory ideas without experiencing a conflict: from 11:00 to 11:30 they can believe that a Microsoft monoculture is a necessary requirement for civil servants to “work together”, and then from 13:30 until 14:00 just as happily accept that all hip 2.0 workers, with their privately-bought iPads authorised via LinkedIn, must have access to the State-intranet so that they are finally able to “work together” with other officials. And nobody is pointing to the naked emperor and saying that at least ONE of these two stories has to be nonsense (and probably both).
Despite all this focus on collaboration, government organizations are regularly at odds, working against each other, re-inventing wheels 300 times, or pointing at each other when things go wrong. Even Caligula or G. W. Bush could still learn a thing or two from such levels of surrealism.
Proprietary vs. open source in government is just ONE of the examples where sly salesmen from dubious companies appear to be much more attractive than people with demonstrated expertise. Also in the cases of Electronic Health Records, voting computers, the public transport chip card and the security of its own systems, the government actively chose lying, cheating vendors and/or incompetent bureaucrats over its own citizens and academics with proven expertise.
After last year’s ‘Leaktober month’ and the Diginotar drama, it appeared that some light might finally break in, but now it is clear that problems are simply being treated as an immutable fact of reality. With the logic of “as it is now, so shall it remain”, the years-long impetus towards greater vendor independence and diversity of systems ground to a halt. Now the same logic is used as an excuse to defend failure everywhere. It’s a bit like claiming to achieve fire safety by shouting that not every building is on fire, and that anyway the fire engines can race to the scene at 130 km/h – "We react so quickly!". Prevention is seen as difficult and, moreover, "as it is now, so shall it remain – you will never be safe."
Despite this latest capitulation to foreign intelligence services and criminals, yet more megalomaniac IT projects are underway. Citizens continue to entrust the government with all their personal information, despite the fact that the government itself admits to being unable to protect them adequately. When working on such projects, you’d need to remain in a permanent state of Doublethink to avoid a serious moral dilemma.
Once the Netherlands had a government that built the Delta Works to keep the sea out and ensured that the country was ranked in the global top 2 or 3 in the fields of health, education, social security, security, democracy and transparency of governance. Only Sweden and Denmark sometimes did better.
Today it feels like the Dutch government is abolishing itself. It knows nothing, wants nothing, does nothing. Perhaps we the citizens should do the same. Give them nothing, ask for nothing, expect nothing. The Zen of the citizen-government relationship. Happiness is low expectations!
What is a document? It started as a flat piece of beaten clay, onto which characters were scratched with a stick. 8000 years later it was found and, after years of study, archaeologists concluded that it said: ‘You owe me three goats’.
Through papyrus and parchment scrolls we arrived at the mass production of paper and book printing in Europe in the 15th century. Our sense of the nature of a document is still derived from this previous revolution in information capture and distribution. When computers became commonplace as a tool to create documents, there was therefore a strong focus on applications to produce paper documents as quickly and as nicely as possible. Creation had become digital, but the final result was not fundamentally different from the first printed book in 1452.
Most word processors in use today cling to this concept. There are hundreds of functions for page numbering, footnotes and layout to achieve a legible final result – on paper. Many IT tools for the management of and access to documents are built around the concept of a digital document as a stack of paper, ready to print for ‘real’ use. Modern ways of working together, for various reasons, no longer fit a paper-oriented way of recording and distribution. Paper is static, local, and now much slower and more expensive to transport than bits. It is this combination of restrictions that has led to new ways of creating documents where both the creative process and the end result are digital. A famous example is Wikipedia, the world’s largest encyclopaedia, with millions of participants continually writing and rewriting about the latest insights in technology, science, history, culture or even the biography of Dutch folk singer Andre Hazes.
In this new form a document is a compilation of information at an agreed place online. The URL is the document.
Most editors show their age not only by focusing on paper, but also by focusing on the concept that documents provide a discrete all-in-one storage medium. Word processing began before computers could communicate naturally through networks, and that legacy continues to shape the concept of a digital document.
From the binary formats of WordStar (.ws), via WordPerfect (.wpd) and Microsoft Office (.doc), we have now arrived at XML-based formats such as ODF and OOXML. The original purpose of ODF was to break the stranglehold of the Microsoft binary .doc format, which was changed regularly and was therefore difficult to support on systems other than Microsoft’s own. Of course, that was exactly the intention. Once you acquire market dominance, why would you be interested in whether other systems are compatible with you, when this incompatibility gives you the competitive edge and profit margins of 65%?
To my amazement, yesterday I read this report of a workshop designed to make OpenOffice compatible with the proprietary version of Microsoft’s OOXML file format. The operational wish of individual OpenOffice users to be compatible with .docx is understandable, as they are a minority in a landscape totally dominated by Microsoft Office, which now saves documents as .docx. If you choose not to use MS Office (for whatever reason), it can be a daunting task to save and read such a document. Most users of word processors are unaware that, by using this format, they are making the lives of the minority difficult; they merrily continue to send out this digital asbestos.
For clarity, the .docx version of OOXML is not the same as the ISO version of OOXML – .docx is a proprietary file format, OOXML ISO is a standard. The certification of the ISO standard was itself nearly destroyed during the voting process by bribery and intimidation. The ISO standard has not been implemented by anyone yet, including Microsoft itself.
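A side note that makes the ‘it’s all XML now’ point concrete: both ODF (.odt) and OOXML (.docx) files are ordinary ZIP archives containing XML parts, which anyone can verify with nothing but a standard library. The sketch below builds a minimal mock container – the file names follow the real .docx layout, but the XML content is a placeholder, not a valid document:

```python
# Both .docx (OOXML) and .odt (ODF) files are plain ZIP archives full of XML.
# Build a minimal mock .docx-style container in memory and inspect it.
import io
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("[Content_Types].xml", "<Types/>")     # OOXML content-type map (placeholder)
    zf.writestr("word/document.xml", "<w:document/>")  # main document body (placeholder)

# Reading it back shows the familiar package structure.
with zipfile.ZipFile(buf) as zf:
    print(zf.namelist())  # ['[Content_Types].xml', 'word/document.xml']
```

Renaming a real .docx to .zip and unpacking it shows the same thing – which is why the interesting question is never ‘is it XML?’ but ‘whose schema, and who controls its changes?’.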
Solving the adoption problems of OpenOffice by pursuing the proprietary file formats of your opponent seems to me a disastrous path to go down. In the same way as .doc, the .docx format can be subtly changed with each version and service pack ‘upgrade’ to avoid 100% compatibility. After all, actively tinkering with proprietary software to block alternatives is not a new concept for Redmond.
Microsoft survives primarily on Windows and Office licences, even though it has doggedly been trying to conquer other markets such as mobile telephony. It would be rather naive to assume that such an organisation, with such a history, will sit back quietly while its cash cow is dismantled.
If the predictions about digital documents are true, we need new ways of working along with new tools. Page numbering and footnotes are irrelevant in hypertext, as far as the document standard is concerned. Since the majority of documents produced by most users in most organisations are no longer than 1-3 pages and usually use templates, a browser with plug-ins would be sufficient. This means that PCs are less important for end users, who increasingly work just as well on a tablet. Tablets are very different to PCs, but that is no barrier to rapid adoption. Contrary to popular claims, ‘different’ is not a problem if it is also sexy.
Aping your opponent is never a good idea. As a great strategist once said, long ago (in a galaxy far, far away): “It’s a trap!”
Computer viruses and palliatives against them are a growing threat to high-tech care. There is a classic solution for the old problem of a vulnerable mono-culture: diversity.
Last Monday alarm bells went off in many IT departments. What looked like a viral infection on Windows XP computers was in fact caused by an anti-virus update from McAfee. The update made part of the system appear to be a threat, and the file protection software then made the system unusable – a kind of auto-immune disease.
In hospitals and care institutions XP is still widely used, as specialised medical applications are often not ready for the new Windows version (and just as often purely because of under-investment). This time it was McAfee, but almost all anti-virus products cause such problems from time to time. Anti-virus updates are part of a real-time arms race, and sometimes in the rush things go wrong.
From agriculture and ecology we know that monocultures are efficient but also very vulnerable. It is no different in the pig pen of IT. The management of 4500 identical systems seems simpler than that of a more varied infrastructure – until a virus or an auto-immune disease breaks out. Then the overtime starts. The scale of many of these incidents shows that even large health care institutions do not have proper internal firewalling and compartmentalisation. Nevertheless, the situation is better than five years ago.
Security issues caused by monocultures are not a new story. In 2003 Dan Geer and Bruce Schneier wrote a report about the security implications of the dominant OS monopoly. Since that time neither the market nor the government has succeeded in effectively breaking this monopoly. In health care, many applications bundled with medical or laboratory equipment are Windows-only. Vendors often set additional conditions on the PCs – for example, no firewall – before guaranteeing the proper functioning of their own applications. Thus a computer virus (or an auto-immune disease) is not only annoying for the admin department, but can also make scanners unusable. The MRI scanner can still take images, but the PC is crucial to operating it and viewing the results. So a Philips or Siemens unit worth a cool million is effectively scrap metal and patients cannot be treated. Sooner or later this becomes a real-time problem, and then far more people than just the help desk are affected. In England, more than 1100 National Health Service computers were infected with a data-stealing worm. And there goes your medical confidentiality.
From the many conversations I have had in recent years with IT workers, I conclude that the difference between a product monoculture (a ‘standard’ desktop) and the application of standards to achieve interoperability is still not understood. Some years ago I spoke to a ministry official who enthusiastically told me that a ‘standard’ desktop was going to be implemented for the entire government. When I asked what standards would be applied, he launched into a list of products, "this version of an OS, this version of a word processor" and so on. The perception is prevalent amongst many IT managers that systems can only work and be properly managed if they are all from the same vendor and version. But this is much more a symptom of market failures and the immaturity of the IT industry. It is a problem to be solved, not a law of nature to which we have to adapt.
That there is another way to do things can be seen from the work over the past 10 years in the Antonius Hospital in Nieuwegein. There they have consistently, in small steps, consciously worked to minimize dependence on a particular vendor, platform or application. What most IT managers of health institutions describe as ‘impossible’ has been done in Nieuwegein. Fortunately this hospital is in the centre of the Netherlands so when a really big crash occurs all critical patients can be sent there. In 2010 we can avoid succumbing to the first virus or software-update-gone-wrong by using virtualisation, web-enabling and open standards environments to build greater diversity and interoperability.