We're often asked: why don't you work on AGI or brain-computer interfaces (BCI) instead of tools for thought? Aren't those more important and more exciting? And for AGI, in particular, many of the skills required seem related.
They certainly are important and exciting subjects. What's more, at present AGI and BCI are far more fashionable (and better funded). As a reader, you may be rolling your eyes, supposing our thinking here is pre-determined: we wouldn't be writing this essay if we didn't favor work on tools for thought. But these are questions we've wrestled hard with in deciding how to spend our own lives. One of us wrote a book about artificial intelligence before deciding to focus primarily on tools for thought; it was not a decision made lightly, and it's one he revisits from time to time. Indeed, given the ongoing excitement about AGI and BCI, it would be surprising if people working on tools for thought didn't regularly have a little voice inside their head saying “hey, shouldn't you be over there instead?” Fashion is seductive.
One striking difference is that AGI and BCI are based on relatively specific, well-defined goals. By contrast, work on tools for thought is much less clearly defined. For the most part we can't point to well-defined, long-range goals; rather, we have long-range visions and aspirations, almost evocations. The work is really about exploration of an open-ended question: how can we develop tools that change and expand the range of thoughts human beings can think?
Culturally, tech is dominated by an engineering, goal-driven mindset. It's much easier to set KPIs, evaluate OKRs, and manage deliverables when you have a very specific end goal in mind. And so it's perhaps not surprising that tech culture is much more sympathetic to AGI and BCI as overall programs of work.
But historically it's not the case that humanity's biggest breakthroughs have come about in this goal-driven way. The creation of language – the ur-tool for thought – is perhaps the most important event in humanity's existence. And although the origin of language is hotly debated and uncertain, it seems extremely unlikely to have been the result of a goal-driven process. It's amusing to try imagining some prehistoric quarterly OKRs leading to the development of language. What sort of goals could one possibly set? Perhaps a quota of new irregular verbs? It's inconceivable!
Similarly, the inventions of other tools for thought – writing, the printing press, and so on – are among our greatest breakthroughs. And, as far as we know, all emerged out of open-ended exploration, not in a primarily goal-driven way.
Even the computer itself came out of an exploration that would be regarded as ridiculously speculative and poorly defined in tech today. Someone didn't sit down and think “I need to invent the computer”; that's not a thought they had any frame of reference for. Rather, pioneers such as Alan Turing and Alonzo Church were exploring extremely basic and fundamental (and seemingly esoteric) questions about logic, mathematics, and the nature of what is provable. Out of those explorations the idea of a computer emerged, after many years; it was a discovered concept, not a goal. Fundamental, open-ended questions seem to be at least as good a source of breakthroughs as goals, no matter how ambitious. This is difficult to imagine, or to convince others of, in Silicon Valley's goal-driven culture. Indeed, we ourselves feel the attraction of a goal-driven culture. But empirically, open-ended exploration can be just as successful, or more so.