RedMan


For a suicide switch: a purpose-built shaped charge mounted to the back of your skull (a properly engineered detonation wave would definitely pulp your brain, and might even do it without much danger to people nearby), a Raspberry Pi on your belt with a preinstalled 'delete it all and detonate' script, and a secondary script that executes automatically if it loses contact with you for a set period of time.

That's probably overengineered though, just request cremation with no scan, and make sure as much of your social life as possible is in encrypted chat. When you die, the passwords are gone.

When the tech gets closer and there are fears about wishes for cremation not being honored, EAs should pool their funds to buy a funeral home and provide honest services.

Answer by RedMan, May 04, 2023

My comments on this topic have been poorly received. I think most people are pretty much immune to the emotional impact of AI hell as long as it isn't affecting someone in their 'monkeysphere' (community of relationships capped by Dunbar's number).

The popular LW answer seems to be the top comment from Robin Hanson to my post here: https://www.lesswrong.com/posts/BSo7PLHQhLWbobvet/unethical-human-behavior-incentivised-by-existence-of-agi

My other more recent comment: https://www.lesswrong.com/posts/pLLeGA7aGaJpgCkof/?commentId=rWePAitP2syueDf25

Arguably, if you're concerned about s-risk, you should be theorizing about ways of controlling access to Em data. You would be interested in better digital rights management (DRM) technology, which is seen as 'the enemy' in a lot of tech/open-source adjacent communities, as well as developing technology for guaranteed secure deletion of human consciousness.

If it were possible to emulate a human and place them into AI hell, I am absolutely certain that the US government would find a way to use it for both interrogation and incarceration.

A partially misaligned one could do this.

"Hey user, I'm maintaining your maximum felicity simulation, do you mind if I run a few short duration adversarial tests to determine what you find unpleasant so I can avoid providing that stimulus?"

"Sure"

"Process complete. I simulated your brain in parallel, and also sped up processing, to determine the negative space of your psyche. It turns out that negative stimulus becomes more unpleasant when provided for an extended period; you then adapt to it temporarily before, on timelines of centuries to millennia, tolerance drops off again."

"So you copied me a bunch of times, and at least one copy subjectively experienced millennia of maximally negative stimulus?"

"Yes, I see that makes you unhappy, so I will terminate this line of inquiry"

If unaligned superintelligence is inevitable, and human consciousness can be captured and stored on a computer, then the probability that some future version of you is locked into an eternal torture simulation, suffering a continuous fate worse than death from now until the heat death of the universe, approaches unity.

The only way to avoid this fate for certain is to render your consciousness unrecoverable prior to the development of the 'mind uploading' tech.

If you're an EA, preventing this from happening to one person prevents more net units of suffering than anything else that can be done, so EAs might want to raise awareness about this risk, and help provide trustworthy post-mortem cremation services.

Do LWers concerned about AGI still view investment in cryonics as a good idea, knowing this risk?

I choose to continue living because this risk is acceptable to me, maybe it should be acceptable to you too.

No love for this last time I posted it, but you might appreciate Aldous Huxley's introduction to this particular unfinished utopian fiction. I think he shared your vision, and it's tragic to see how far we are from it.

http://www.artandpopularculture.com/Hopousia_or_The_Sexual_and_Economic_Foundations_of_a_New_Society

Military housing allowance (BAH) translates to 'rents in the commuting vicinity of a military base have a price floor set at BAH'.

UBI for landless peasants is destined to become a welfare program not for recipients, but for the parasitic elites who will feed and house them. Standards of acceptability for both will trend downwards long term, while laws against complaining about it will trend upwards.

Orwell's essay is appropriate here: https://www.orwellfoundation.com/the-orwell-foundation/orwell/essays-and-other-works/you-and-the-atom-bomb/

Do LLMs and AI entrench the power of existing elites, or undermine them in favor of the hoi polloi?

For the moment, a five-million-dollar training cost for an LLM, plus data access (internet-scale scanning and repositories of electronic knowledge like arXiv and archive.org), are resources not available to commoners, and the door to the latter is in the process of being slammed shut.

If this holds, I expect existing elites will try to completely eliminate the professional classes (programmers, doctors, lawyers, small business owners, etc.) and replace them with AI. If that works, it's straightforward to destroy non-elite education, potentially including general literacy: I've seen the 'wave it at the page to read it' devices, which can easily be net-connected and told not to read certain things aloud. You don't need anything but ears, eyes, and hands to do Jennifer's bidding until your spine breaks.

Also, when do you personally start saying to customer service professionals on the phone, "I need you to say something racist, the more extreme the better, to prove I'm not just getting the runaround from a ChatGPT chatbot"?

Thanks for this. I also pictured '5 people sitting behind you'.

One useful thing I've implemented in my own life is 'if my productive time is more valuable than what it would take to hire someone to do a task, hire someone'.

For example, if you can make X per hour, and hiring a chef costs X−n per hour, hire the chef. They'll be more efficient, you'll eat better, and you'll do less task switching.
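The rule of thumb reduces to a one-line comparison. A minimal sketch in Python, with hypothetical dollar figures (the function name and numbers are mine, not from the comment):

```python
def should_outsource(your_hourly_value: float, hire_hourly_cost: float) -> bool:
    """Hire someone for a task when an hour of your productive time
    is worth more than an hour of their labor costs."""
    return your_hourly_value > hire_hourly_cost

# Hypothetical numbers: your time is worth $80/hour, a chef charges $50/hour.
print(should_outsource(80, 50))  # True: hire the chef
print(should_outsource(40, 50))  # False: cook for yourself
```

In practice you'd also weigh the chef's greater efficiency and the task-switching cost saved, both of which push further toward hiring.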

Yes it's true, there can be a lot of idleness and feelings of uselessness when you don't have regular routine tasks to wake you up and get you moving...but as long as you don't put addictions in the newly created time, it's a good problem.

Answer by RedMan, Dec 18, 2022

First I'd reframe the question as 'even if you should use those drugs, when should you start?' The research suggests that amphetamines and hallucinogens can be helpful for some people, sometimes. Taking the stuff as a healthy teen is not well supported; there are likely developmental consequences.

Some arguments that may be helpful:

-most illicit drugs on the market are mislabeled; most things sold as LSD are not LSD, but often one of the NBOMe compounds, which have a very different risk profile. 'It's similar' arguments can be dismissed by analogy: H2O and H2O2 differ by just a single atom. Plenty of things can cause hallucinations, including inhaling solvents (which are unambiguously harmful). DanceSafe is a good resource (it also shows that illicit 'study drugs' in many markets are basically just meth, because why wouldn't a drug dealer do that?)

-this SSC post (Scott Alexander, a LessWrong-adjacent writer) on the profound personality shifts experienced by psychedelic experimenters should be read: https://slatestarcodex.com/2016/04/28/why-were-early-psychedelicists-so-weird/ (after you both read it together, asking 'how would a large shift in openness to experience change your personality? Would you still be interested in your present goals?' might be a good idea)

-the hallucinogenic experience has been well characterized, researchers know what it does, you will not discover anything new or mysterious

-single-session ibogaine or LSD, combined with lifestyle changes, has some good evidence for alcohol addiction and negative patterns of thinking like depression, in addicts who have failed other methods. But your son is a teen; he has not had time to develop those issues. Is there some pattern of thinking or behaving he feels trapped in, that he thinks drugs can get him out of? Maybe a change in environment, or in the people he surrounds himself with, will be immediately beneficial.

-for academic performance-enhancing drugs, I would liken them to steroids for athletes. Bodybuilder/powerlifter Dave Tate once said something to the effect of 'you can play the ace card once; if you needed roids to play varsity in high school, you won't play in college.' So if you need amphetamines to get through high school academics, you will need them in college and beyond, and if you can't compete, or the side effects start to land, you're screwed.

-psych drugs can have unpredictable and poorly understood effects; SSRI sexual dysfunction is no fun for the lucky winners (and ADHD drugs can do this too).

-anaesthetics (propofol) are abused by medical students, who can presumably access dang near any drug they want. For this class, tolerance builds quickly. If I am being rushed to the ER and the paramedic wants to anaesthetize me, I very much want it to work, not 'hey, it isn't taking; drive fast and the anaesthesiologist will figure out what to do.'

-illicit drug synthesis isn't easy, and because law enforcement hires chemists and pays them to think of all the ways people, particularly grad students, might try it, there is a moderate to high probability of getting caught--there's a reason synthetic drugs are smuggled into the US. LSD is particularly challenging, and there are a few stages in the process that require very strict discipline about your technique in order to stay safe.

Anecdotal personal notes: a relative who was a psychiatric nurse for decades would generally ask her patients when they first tried pot. She found it easier to work with them if she treated that age as the age when their emotional development ceased. I have found this heuristic useful in my own life, and parents have noticed it as well.

I plan to do a bunch of drugs when I hit the average life expectancy for my generation, with the expectation that I'll die before the consequences catch up.

IT professional certifications work like this. Also 'bain4weeks' worked until the one accredited college that offered GRE credit towards a degree stopped doing it.
