
Your Computer Is on Fire, but it will take much more than this book to put it out

Detailed diagnosis of tech industry delusion falls short of prescribing a cure

Book review Seasoned industry watchers will welcome Your Computer Is on Fire as a thorough and unflinching debunking of Big Tech's outlandish self-mythologising. They might even hope that the governments, businesses, and media organisations that buy into the barrage of propaganda start to ask a few important questions. But this is a niche text with its limits, at times prone to academic navel-gazing.

In the 1990s, despite the outward differences between the industry's big guns, the background hum was the same: the internet offered opportunity for all, ecommerce could lead to frictionless economics, and software made people more productive and companies more competitive. Such delusions survived the dotcom crash and the financial crisis, then re-emerged in the early days of social media as the Arab Spring became a use case for the positive impact of Twitter and Facebook. Together with that movement's difficult development, the nefarious exploitation of social media user data that contributed to the election of a US presidential regime with ever-so-slightly insurrectionist tendencies should have given pause for thought.

It's a wonder, then, that tech industry propaganda has barely shifted. Instead, it's a case of different tech, same tune. Last month, Google CEO Sundar Pichai told the BBC that AI would be the "most profound technology" that humanity will ever develop. Similarly, UK Cabinet Office minister Julia Lopez adopted industry language when she said that "now, more than ever, digital must be front and centre of government's priorities to meet user needs."

Into the arena steps a group of tech historians whose objective is not just to pick holes in the output of the sector but to take aim at the centre of myth-making and argue that inequality, prejudice, and self-delusion are at the very heart of the industry, and have been all along.

Published by The MIT Press, Your Computer Is on Fire hits out at sacred assumptions about the tech industry widely adopted in government, business, and much of the media, and turns them upside down. Hence: nothing is virtual, the cloud is a factory, AI is powered by humans, and the internet has a hierarchy. In short, no technology is neutral or consequence-free.

In vogue for more than a decade, the cloud is a term conjured by the tech industry: fluffy, white, and somewhere up there. As Nathan Ensmenger, associate professor at Indiana University, put it: "The metaphor of the Cloud erases all connection between computing services and traditional material infrastructure… As a result, the computer industry has largely succeeded in declaring itself out of this history, and therefore independent of political, social and environmental controls that have developed to mediate and constrain industrialisation."

Of course, the cloud is no such thing. As Reg readers know, it is a computer somewhere else. It needs plastics, metals, energy, and people just like any other computer. But where those come from, and what impact they have on the environment, are of little concern once hidden behind the cloud metaphor, he argues: "The metaphor of the Cloud allows the computer industry to conceal and externalize a whole host of problems, from energy costs to e-waste pollution. But the reality is the world is burning. The Cloud is a factory. Let us bring back to earth this deliberately ambiguous metaphor by grounding it in a larger history of technology, labor and the built environment – before it is too late."

Not a bug

The volume is at its best when dealing with the specifics of tech industry history. Mar Hicks, associate professor at Illinois Institute of Technology, argues that "sexism is a feature, not a bug" in the chapter which describes how Britain wasted its early lead in the development of computer technologies by failing to hire, develop, and promote the women who first operated these machines. Once the importance of business machines was realised, men were brought in.

"In 1959, one woman programmer spent the year training two new hires with no computer experience for a critical long-term set of computer projects in the government's main computer center while simultaneously doing all of the programming, operating and testing work as usual." The problem was, "the men who were tapped up for these jobs lacked the technical skills to do them and were often uninterested in computing work, in part because of its feminized past."

Most of the men trained for early computing positions soon left for more senior management jobs, causing departments to haemorrhage most of their computer staff. "This trend continued throughout the 1960s and 1970s, even as the status of the field rose," Hicks argues. "As a result, the programming, systems analysis and computer operating needs of government and industry went largely unmet."

Hicks charts the success of one woman who saw an opportunity in this glass ceiling, and in the development of software independent of the hardware provider, an unusual move at the time.

Having worked for the Dollis Hill Post Office Research Station in the 1950s, briefly with Colossus computer architect Tommy Flowers, Stephanie "Steve" Shirley went on to found Freelance Programmers, which later became Xansa before it was sold to Steria for £472m in 2007.

She had been passed over for promotion in the civil service before launching the successful startup in the early 1960s, making a point of hiring similarly cast-aside female tech talent on flexible contracts. The chapter includes a great picture of Ann Moffatt coding the black box recorder for Concorde, with a toddler in frame looking curiously over the table.

But this was the exception, not the rule. Far from revolutionising society, the computer industry has acted to reinforce pre-existing social structures, Hicks argues.

Trustworthy

The breadth of the book is its strength. It reaches across not just the social, political, and economic history of computer technologies, but deals with the detail of code too, for those who can read it. Programming historian and Berkeley City College lecturer Ben Allen's chapter details the antics of Unix co-creator Ken Thompson, who devised an eponymous hacking technique for planting a backdoor in an operating system.

As well as printing code for the principles underlying the Thompson hack, Allen describes the approach for the attentive lay reader. Without going into the details here, the technique relies on the liar's paradox – "this sentence is false" – and on the bootstrapping process that gets from base-level machine code to higher-order languages. The upshot is a Trojan backdoor that is, in any practical sense, undetectable: only by reading every line of machine code could it in theory be spotted.
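Allen prints the real thing; the fragment below is only a minimal sketch of the two-stage idea, in C for illustration. The function names, the trigger strings, and the hard-coded password are all invented for this example – the original Bell Labs code was never published.

    #include <stdio.h>
    #include <string.h>

    /* Stand-in for a single compiler pass over one line of source code. */
    static void compile_line(const char *line, FILE *out)
    {
        fputs(line, out); /* ordinary lines compile through untouched */

        /* Stage 1: recognise the login program and quietly emit an extra
         * check that accepts a hard-coded password alongside the real one. */
        if (strstr(line, "check_password") != NULL)
            fputs("    if (strcmp(pw, \"open-sesame\") == 0) return 1; /* injected */\n", out);

        /* Stage 2: recognise the compiler's own source and re-emit both of
         * these stages, so the backdoor survives even after every malicious
         * line has been deleted from the compiler's source code. */
        if (strstr(line, "compile_line") != NULL)
            fputs("    /* quine-style copy of stages 1 and 2 re-inserted here */\n", out);
    }

    int main(void)
    {
        /* Pretend to compile one line of login.c, then one line of the
         * compiler itself, and watch what gets slipped into the output. */
        compile_line("int check_password(const char *pw) {\n", stdout);
        compile_line("static void compile_line(const char *line, FILE *out) {\n", stdout);
        return 0;
    }

Stage 2 is what makes the trick so hard to catch: once a poisoned compiler binary exists, the source code can be pristine and the backdoor will still propagate into every future build.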

Thompson argued this meant that no computer system could be entirely trustworthy, and that laws against illegally accessing computers should be toughened as a result. He admitted implementing the code on a Bell Labs machine, though he denied ever releasing it into the wild. That we trust him has as much to do with his identity as a respected white male computer engineer as anything else: he was allowed to play around and that's fine, where others might have been judged more harshly, Allen argues.

At its best, the book breaks down assumptions underpinning how the computer industry works. The QWERTY keyboard has long been derided as an arcane interface dating back to typewriter mechanics. But while engineers in the West have focused on how the keys might be better laid out, those relying on non-Latin scripts have been forced to think more radically. Confronted with the ubiquity of the QWERTY system and its unfitness for representing languages native to billions of people worldwide, engineers in China began to see the keyboard as an input device for selecting the desired logograms, rather than literally typing individual letters. Western systems could learn much from this approach, but they remain bound by the concept of "type" and wedded to a system designed for Remington typewriters in the 1920s, according to Stanford University professor Thomas Mullaney, author of the chapter "Typing Is Dead".
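The principle is easiest to see in a toy input method editor. The sketch below is a bare-bones illustration – the two-entry candidate table is made up for the example, not drawn from the book or any real IME – showing keystrokes acting as a query for logograms rather than as text in their own right.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical candidate table: one keystroke sequence maps to several
     * possible characters; the user then picks one by number. */
    struct entry {
        const char *pinyin;
        const char *candidates[3];
    };

    static const struct entry table[] = {
        { "ma",  { "妈", "马", "吗" } },  /* one spelling, many logograms */
        { "shu", { "书", "树", "数" } },
    };

    int main(void)
    {
        const char *typed = "ma"; /* what the user keyed in on QWERTY */

        for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
            if (strcmp(table[i].pinyin, typed) != 0)
                continue;
            /* The keystrokes retrieve candidates; they never appear on
             * screen as literal text. */
            printf("Candidates for \"%s\":", typed);
            for (int j = 0; j < 3; j++)
                printf("  %d. %s", j + 1, table[i].candidates[j]);
            putchar('\n');
        }
        return 0;
    }

The keys here are search terms, not letters – which is Mullaney's point about why "typing", in the Remington sense, is dead.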

Your Computer Is on Fire is not for the faint-hearted. It's a detailed and at times dense academic text, and should be judged on that basis. It was, in its editors' words, "crafted as a call to arms."

Agitate for change

"We very much hope that the students who read this book will go on to take up positions in STEM fields and then to agitate therein on behalf of the issues we raise," Mullaney says in the introduction.

By the conclusion, the reader is faced, apparently, with an existential crisis: a "call to face and embrace one's own death." It says: "There may be little more inimical to the modern mind that seeks prosperity, peace and beneficent politics than such a call to reconcile ourselves with the brevity of human life."

It's at this point that the imagined STEM student of the introduction might be justified in putting the 400-page tome back in the library and wondering about their first professional pay cheque and Bay Area condo.

At the same time, the book lacks specifics about what might actually be done on the issues it raises: how to regulate AI, protect privacy, break up monopolies, ensure fair taxation, and so on.

Nonetheless, it's a valiant effort to tackle the delusions that have become embedded in the tech industry and spread to the wider world. Nothing was ever virtual, the industry isn't a meritocracy, and technology is not neutral. But convincing those capable of tackling the trillion-dollar tech giants' considerable might will take a bit more of a push. ®

Your Computer Is on Fire

Imprint: MIT Press
Published: March 2021
Pages: 416
Price: $35.00
ISBN: 9780262539739
