Nikita Sokolsky


Here’s the corrected link: https://pastebin.com/B824Hk8J

Are you running this from an EC2 instance or some other cloud provider? They might just have a blocklist on IPs belonging to data centers.

Sorry for not being clear. My question was whether LW really likes the nanobot story because we think it might happen within our own lifetimes. If we knew for a fact that human-destroying nanobots would take another 100 years to develop, would discussing them still be just as interesting?

Side note: I don't think the "sci-fi bias" concept is super coherent in my head. I wrote this post as best I could, but I fully acknowledge that it's not fully fleshed out.

Hm, are you sure they're actually that protective against scrapers? I ran a quick script and was able to extract all 548 unique pages just fine: https://pastebin.com/B824Hk8J The final output was:

Status codes encountered:
200: 548
404: 20

I reran it two more times and it still worked. I'm using a regular residential IP address, no fancy proxies. Maybe you're just missing the code to refresh the cookies (it's included in my script)? I'm probably missing something, of course; I'm just curious why the scraping seems easy enough from my machine.
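For anyone who doesn't want to open the pastebin: the core idea is just to keep cookies in a jar, refresh them when the server rejects a request, and tally the status codes at the end. Here's a rough stdlib-only sketch of that pattern (not the actual script; the URLs and the `refresh_url` endpoint are placeholders):

```python
import urllib.request
from collections import Counter
from http.cookiejar import CookieJar
from urllib.error import HTTPError

# A CookieJar-backed opener stores any Set-Cookie headers the site sends
# and replays them on later requests, so refreshed cookies "just work".
cookie_jar = CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cookie_jar))

def fetch_status(url: str, refresh_url: str) -> int:
    """Fetch one page; on a 403 (stale cookies), hit a refresh endpoint
    once to pick up fresh cookies, then retry the original URL."""
    try:
        return opener.open(url, timeout=10).status
    except HTTPError as err:
        if err.code != 403:
            return err.code
        try:
            opener.open(refresh_url, timeout=10)  # re-issues cookies
        except HTTPError:
            pass
        try:
            return opener.open(url, timeout=10).status
        except HTTPError as retry_err:
            return retry_err.code

def summarize(status_codes) -> str:
    """Tally results in the same format as the output above."""
    counts = Counter(status_codes)
    lines = ["Status codes encountered:"]
    lines += [f"{code}: {count}" for code, count in sorted(counts.items())]
    return "\n".join(lines)
```

With `requests` the retry logic is a few lines shorter, but the shape is the same: one shared session/jar, one refresh-and-retry on 403, one `Counter` at the end.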

They could, but if you're managing your own firewall it's easier to apply a blanket rule than to divide things up by subdomain, unless you have a good reason to do otherwise. I wouldn't assume malicious intent.

They do have a good reason to be wary of scrapers, as they provide a free version of ChatGPT; I'm guessing they just configured it over their entire domain rather than restricting it to the chat subdomain.

Nanobots destroying all humans at once do indeed make for poor sci-fi. But how much of this story's popularity hinges on it happening within our lifetimes?

It's essential to my ability to enjoy life

This assumes we'll never have the technology to change our brain's wiring to our liking. If we live in a post-scarcity utopia, why wouldn't you be able to just change who you are as a person so that you fully enjoy the new world?

I've watched the debate and read your analysis. The YouTube channel is great, doubly so given that you're just starting out, and it will only get better from here.

Do you imagine there could be someone out there who could persuade you to lower your P(doom)? In other words, do you think there could be a collection of arguments so convincing and powerful, taken together, that you'd significantly change your mind about the risks of AGI, at least when it comes to this century?

I found a new box of Dexcom G7 on eBay for just $90 and ordered it to try out.
