tldr: Some minor ranting and some initial ideas toward imagining a better way to do search
 

How would we make search engines less dogshit if we were trying to do that?

There is so much information. Like, hella stuff dude.

And plenty of "normal people" like me are regularly trying to find stuff. It seems like it would be cool if we had better tools for some of that sort of thing that weren't so inherently crowded and zero-sum.

So, the use cases I am most interested in improving are stuff like:

  • web search
    • answer query
    • find/ explore websites
    • locate a specific document / data source / resource
  • product search / product hunt
    • Amazon / eBay / Craigslist / Facebook Marketplace
    • media search / find me this ISBN or movie or whatever

To me, that whole ecosystem seems really shit and under-optimized.

Even with ad-block, much of the experience is bought and paid for. Some sites will just show you sponsored garbage, not label it as sponsored, and give that to you as a search result (Amazon). That sort of thing quickly enshittifies the search function.

But even if that isn't going on, it seems like SEO is a total race to the bottom. Like, cool, I get to type one thing in a box and then just find whichever page did all of the stupid spammy bullshit correctly.

Product hunting is perhaps the most adversarial version of this, because there are literally people whose job it is to compete with one another to try to get their bullshit ahead of someone else's bullshit in a bullshit list. This is zero-sum and often works against the interests of the user.

How do we do this more cleverly?

I think that pcpartpicker.com is a really good example of cool product picking because it just takes the "filter by attributes" thing to the nth level and includes a lot of useful community support and expert advice to boot.

I want more advanced filtering options on normal search engines. I want to be able to do "x AND y NOT z" type stuff more robustly instead of just as a nerd feature. 

Or to perform a second search on the results of the first one.

Or to be able to favorite sites or whitelist them or something. I know the site:$site-identifier thing or whatever works on some search engines. I like that sort of thing.
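Putting those three wishes together, here's a toy sketch of the kind of thing I mean. All of the names are made up and search() is just a stub standing in for whatever engine actually answers the query:

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    site: str
    title: str
    text: str

def search(query: str) -> list[Result]:
    # Stand-in for whatever engine or API actually answers the query.
    return [
        Result("https://example.com/a", "example.com",
               "Keyboard roundup", "our sponsored picks for every budget"),
        Result("https://pcpartpicker.com/b", "pcpartpicker.com",
               "Keyboard guide", "hot-swappable switches compared by price"),
    ]

def matches(result: Result, must: list[str], must_not: list[str]) -> bool:
    # "x AND y NOT z"-style filtering over the result's text.
    blob = (result.title + " " + result.text).lower()
    return all(t in blob for t in must) and not any(t in blob for t in must_not)

# First search, then a second search over those results,
# restricted to a personal whitelist of favorited sites.
whitelist = {"pcpartpicker.com", "en.wikipedia.org"}
first = search("mechanical keyboard")
second = [r for r in first
          if matches(r, must=["hot-swappable"], must_not=["sponsored"])
          and r.site in whitelist]
print([r.url for r in second])  # -> ['https://pcpartpicker.com/b']
```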

I want to have some of the benefits of the engine "learning me", but without just turning to a fully black box ML solution. 
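I don't have a real design for that, but the non-black-box version I'm picturing is roughly "preferences as plain data I can read and edit", something like this (all invented for illustration):

```python
# Personalization as a plain, user-editable config instead of an opaque model.
# All of this is hypothetical; the point is that I could open it, see exactly
# why results get boosted or buried, and change it myself.
preferences = {
    "boost_sites": {"pcpartpicker.com": 2.0, "en.wikipedia.org": 1.5},
    "bury_terms": ["sponsored", "top 10 best"],
}

def rerank(result: dict, base_score: float) -> float:
    score = base_score * preferences["boost_sites"].get(result["site"], 1.0)
    if any(term in result["title"].lower() for term in preferences["bury_terms"]):
        score *= 0.1  # bury, don't hide: I can still scroll down and find it
    return score

print(rerank({"site": "pcpartpicker.com", "title": "Keyboard guide"}, 1.0))      # 2.0
print(rerank({"site": "blogspam.example", "title": "Top 10 Best Keyboards"}, 1.0))  # 0.1
```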

I don't want my search function to be developed adversarially for the sake of advertising to me, nor do I want it to turn the internet into a race to the bottom of millions of iterations of SEO fighting one another.

I think it would be cool if my search function were more standardized between sites / between whatever db I happen to be searching. Right now it feels like the wheel gets re-invented every time someone wants to make any somewhat messy cluster of data searchable.
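Hand-waving at what "standardized" could mean: some shared query interface that any site or database could expose, so one front-end could run the same query and filters everywhere. Everything below is invented for illustration:

```python
from typing import Iterable, Protocol, Sequence

class Searchable(Protocol):
    """Imagined shared interface that any site or database could expose."""
    def query(self, terms: Sequence[str], exclude: Sequence[str] = ()) -> Iterable[dict]:
        ...

class ToyCatalog:
    """Wrap a messy cluster of data once and it speaks the same interface."""
    def __init__(self, rows: list[dict]):
        self.rows = rows

    def query(self, terms: Sequence[str], exclude: Sequence[str] = ()) -> Iterable[dict]:
        for row in self.rows:
            blob = " ".join(str(v) for v in row.values()).lower()
            if all(t in blob for t in terms) and not any(t in blob for t in exclude):
                yield row

def federated_search(sources: Sequence[Searchable],
                     terms: Sequence[str],
                     exclude: Sequence[str] = ()) -> list[dict]:
    # The same query and the same filters, run against every backend.
    hits: list[dict] = []
    for source in sources:
        hits.extend(source.query(terms, exclude))
    return hits

books = ToyCatalog([{"isbn": "978-0262033848", "title": "Introduction to Algorithms"}])
listings = ToyCatalog([{"site": "craigslist", "title": "used algorithms textbook, like new"}])
print(federated_search([books, listings], terms=["algorithms"]))
```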

What is the positive sum version of all of this?
