How ‘insanely great’ became a minimum standard
In his book On Bullshit, philosopher Harry Frankfurt proposed that the bullshitter is worse than the common or garden liar because, unlike the liar, who is at least trying to hide the truth, the bullshitter doesn't care about the truth at all.
For some reason, the tech industry is drowning in this, er, absence of interest in the truth. Nowhere is it more obvious than in the habitual promise that every piece of hardware or software has near-mystical qualities: buy this product and all will be well, they seem to say. The latest minor upgrades are presented as revolutionary, while new products are passed off as utterly world-changing.
Sales culture is nothing new, of course, but it wasn't always like this in tech. In 1984, launching a new computer on the US market, tech executive Nigel Searle positioned the machine as simply good value at its price point. That computer was the Sinclair QL, and it was a complete failure that contributed to the final demise of Sinclair Research. Still, Searle's launch speech is well worth watching (and, happily, available on YouTube).
Asked about the CPU, the Motorola 68k series, Searle noted that Apple, which also used it, claimed the chip was 32-bit but that most people called it 16-bit, and that the version used by the QL, the 68008, was 32-bit internally but had an 8-bit data bus. Asked what development environments were available, he said: at the moment, none. The machine was more suited to classroom than business use, he said. On and on he goes, admitting compromises and costs, all the while displaying a self-deprecating, dry wit that provokes that rarest of things in a tech event audience: ripples of sincere laughter.
Compare this to the late Steve Jobs, famed for the phrase 'insanely great'. OK, in fairness, Apple products, other perhaps than the ill-fated Apple III, were pretty great compared to Sinclair's habitually half-baked offerings, and the QL was launched at the same time as the Macintosh, which genuinely was revolutionary in that it popularised the graphical user interface.
So who was right? Clearly, Jobs’s machine was better. In fact, the Macintosh was great, insanely or otherwise, and ushered in major changes in computing that echo down to the present day.
The point, then, isn't that tech companies make uniformly bad products. They don't. Apple made at least three shockingly innovative products – the Apple II, the Macintosh and the LaserWriter – and plenty of others, including the iPod, the iPhone and, believe it or not, type-handling software, that pushed both the industry and wider society along in important ways. Jobs's next company, NeXT Computer, was far less successful than Apple, but it too was highly innovative, and many of you reading this are doing so using software first developed at NeXT.
It is undeniable, however, that the industry, from Apple on down, is given to spoofing. Worse still, few tech executives have the carnival-barking skills of the late Jobs: you think the antics of former Microsoft boss Steve Ballmer were cringeworthy? Imagine a cult built around a CRM product or an online travel agent.
Brian Honan of security specialists BH Consulting said cybersecurity vendors were particularly given to making exaggerated claims.
“They exaggerate the threat, and the threat is real, but they also exaggerate the solution,” he said.
In other words, vendors claim they could have stopped an attack before even knowing how it functioned.
“Whenever there is a major attack, before we even know the details, you can be guaranteed I will have received a host of e-mails saying ‘if only they had used our product, this would not have happened’”.
Honan said the root of the issue was the need for marketing to penetrate people’s consciousness.
“It’s a cut-throat market out there,” he said.
True enough, but the law of diminishing returns applies, and I cannot be alone in feeling that tech companies’ claims sound increasingly like the lyrics to Step Right Up by Tom Waits, a famous riff on disreputable marketing. Artificial intelligence, the metaverse, Industry 4.0, Web 3.0 and all the rest of it: to me it just sounds like so much bullshit.
Personally, I’m tempted to claim that there is a touch of Trotskyism in the claims routinely made by many tech companies, with every tedious product upgrade or, worse, unwanted product being the next step in a permanent revolution. That would be bullshit, though.