For over two decades, internet search (largely synonymous with Google for most of that time) has been the main way to augment the human brain with digital information outside of it. First virtual assistants and now LLMs are challenging and changing the way people access, process, and use data. It is becoming more like true capability augmentation: still clunky and unreliable, but improving at a dizzying rate. In particular, it is becoming dramatically easier to create, rather than just consume, though consumption is greatly facilitated, too.

Assuming we are still a ways away from the prophesied AI doom, what might be a better name for the way people interact with information?


Related: seems like some search engines are already integrating LLMs:
- One approach is directly providing links; see https://metaphor.systems, brought up yesterday @ https://www.lesswrong.com/posts/rZwy6CeYAWXgGcxgC/metaphor-systems
- Another is LLM summarization of search-engine-provided links; see https://you.com/search?q=what+was+the+recent+breakthrough+in+fusion+research%3F as an example

For many queries, Google has for some time been offering an answer rather than a link: calculations, graphs, facts, etc. It is becoming more of an answer engine than a search engine, but rather slowly. I assume Google is now working furiously to catch up with other LLM UIs, and it is in a good position to do so, if it lets go of the Search mentality.

I think I read a thread somewhere which said that Google has a lot of tooling built and many teams already dedicated to integrating LLMs into their products. But the economics don't make sense at the moment, apparently. The cost of using these models would need to come down by 1-2 OOM before they'd deploy things. And that seems plausible? Like, I haven't done a detailed analysis, but Davinci is at around $0.1/1000 words, which sounds way too high to use to augment search. 

On the other hand, I expect that few people will need Gopher-like models. The mythical average person probably wants to hear what's new about celebrity X, or get a link to a youtuber's channel, or so on. When they need a link to a wikipedia page, or an answer to a pub quiz question, I suspect GOFAI is enough. So maybe cost is slightly less of an issue if only 1/10-1/100 of queries need these models.
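The back-of-the-envelope economics in the two comments above can be sketched out. Only the $0.1/1000-words figure, the 1-2 OOM price drop, and the 1/100 routing fraction come from the comments; the ~100-word answer length is an illustrative assumption:

```python
# Rough cost sketch for LLM-augmented search.
# The $0.10 per 1000 words price, the 2-OOM price drop, and the 1-in-100
# routing fraction are from the comments above; the 100-word answer
# length is an illustrative guess.

PRICE_PER_WORD = 0.1 / 1000  # Davinci-era price quoted above: $0.1 / 1000 words

def cost_per_query(answer_words=100, price_per_word=PRICE_PER_WORD):
    """Cost of generating one LLM answer (assumed ~100-word answer)."""
    return answer_words * price_per_word

def blended_cost(llm_fraction=0.01, answer_words=100, price_per_word=PRICE_PER_WORD):
    """Average cost per search if only a fraction of queries hit the LLM."""
    return llm_fraction * cost_per_query(answer_words, price_per_word)

base = cost_per_query()                                        # $0.01 per LLM answer
cheaper = cost_per_query(price_per_word=PRICE_PER_WORD / 100)  # after a 2-OOM price drop
routed = blended_cost(llm_fraction=1 / 100)                    # only 1 in 100 queries routed

print(f"per-answer cost now:    ${base:.4f}")
print(f"after 2 OOM price drop: ${cheaper:.6f}")
print(f"blended (1/100 routed): ${routed:.4f}")
```

Under these assumptions, routing only 1 in 100 queries to the model has the same average effect on cost as a 2-OOM price drop, which is why the two levers in the comments above are roughly interchangeable.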

"GPT" may become a verb just like "google" did. Although it has one or two syllables two many, so would probably get shortened to something like "jeept".

Or "prompting" ? Seems short and memorable, not used in many other contexts so its meaning would become clear, and it fits in with other technical terms that people are currently using in news articles, e.g. "prompt engineering". (Admittedly though, it might be a bit premature to guess what language people will use!)

Maybe, though prompting refers more generally to giving prompts in order to get the right kind of response/behavior from the LLM, not necessarily to using it as a smarter version of a search engine.

"Let me see what Chatty thinks," (or whatever humanesque name becomes popular).

I assume people will treat it just like talking to a very knowledgeable friend. Just ask a question, get a response, clarify what you meant or ask a followup question, and so on. Conversation in natural language already comes naturally to humans, so probably a lot more people will become a lot more adept at accessing knowledge.

And in future iterations, this "friend" will be able to create art, weave stories, design elucidating infographics, make entertaining music videos, teach academic subjects, try to sell you stuff (hmm), spread conspiracy theories (oops), etc., based on the gist of what it thinks you're looking for (and based on what it knows about you personally from your history of "friendship" with it). It would be nice if we could make it truthful and cooperative in a way that doesn't amplify the echo chamber effect of existing social media and search engines, but unfortunately, I don't see that as being very profitable for those deploying it.

Chat? That is how most people will use it, I imagine.

EDIT: It is still early days though, and the shape of things is unclear. What will be the most popular use cases, the ones that stand out in people's minds as what you use LLMs for? I don't know yet, so any naming seems premature.

Ramble your question into a mic and get a good, coherent answer. I will hate using audio. Newer generations will not even notice.