SmokeAndMirrors

Nurse who likes to read philosophy and rationalist literature


Comments

This bet would've paid major dividends in hindsight. Is there a way to bet on OpenAI, Anthropic, or other AI-safety-focused labs, both to give them more access to capital and to make a profit? Nvidia stock has already ballooned quite a bit and seems to be mostly dual-use. Also, I'm not confident about the safety credibility of many other AI companies. Although scoring each major foundation-model-building company for safety would be a useful project to do... (pondering if I should do this).

I'm asking this mostly to see if anyone else has already done their homework on this question.

Answer by SmokeAndMirrors · May 27, 2023

What did you end up choosing to study with data science? I'm in a bootcamp choosing a topic and have been brainstorming ideas like crazy recently; I'd be happy to discuss this more with you if you're interested.

I see a hilarious and inspiring similarity between your story and mine. 

In high school, I realized that I enjoyed reflecting on topics to achieve coherence and discussing the mechanisms behind superficial phenomena, and that I wanted everyone to be happy on a deep level. So I created a religion, because, of course, I wanted to save the world. I thought other religions were failed attempts to incorporate modern positive psychology learnings (which had "solved happiness") into moral theories, but I wanted to use the meme potential of social phenomena like religion and music. Then it turned out I wasn't good at making a religion 'religious', I didn't like playing instruments much, and I was too arrogant to think old philosophers had anything interesting to say that I hadn't thought of. So I settled for helping people as a nurse. I got stuck.

Ten years later, I heard of and finally read some philosophy, realized what I had been grasping at wasn't religion but something more like logical positivism, and then found rationality and EA. It is strange how much this community feels like everything I was trying to articulate as a youngling, but I hadn't realized that economics, math, philosophy, and computer science were where my interests really lay, not religion and music. So now I'm in the rapid learning stage.

I think the only interesting insight I have here is that I'm now very wary of getting trapped in another local optimum ("rationality/EA is the best, so I won't look at other intellectual movements") and continue to be open to seeing if there is anything better. However, it sounds like you implicitly understand this groupthink wariness, so enjoy and go save the world.

Answer by SmokeAndMirrors · Dec 09, 2021

So I was brainstorming recently with a friend about this very topic: how to convince someone who doesn't enjoy engaging in rational reasoning to support a goal of rationality (existential risk reduction, AI study). Like, I'd love to be able to persuade my random gen-pop friend to be vegetarian, or to think about the real disparities in the world, or about the implications of our actions if they were scaled up.

Two possible strategies I came up with, both of which border on rhetoric, persuasion, and I guess the Dark Arts in general, are:

Inspiration: "Inspire" them to value the rational process, i.e., philosophical reasoning and evidence-based reasoning. Inspiration involves realizing that something is socially virtuous, aesthetically pleasing, or has good instrumental results.

  1. I think immersing them in a rational community and leveraging their social reasoning values would work for many people, who then have to use the tools of rational argument to get status or to see themselves as socially virtuous (like at LessWrong meetups). Maybe at some point they would realize how powerful the tools are for preventing deception or cheating, and come to value the tools of rationality apart from their use in status games.
  2. Art in the form of books or movies that encourages deep deductive reasoning from evidence and rewards the heroes for it might be even better for this, seeing as HPMOR is seen as a big draw for the gen pop to LessWrong. Examples are HPMOR, Bill Nye the Science Guy, Cosmos, and The Skeptics' Guide to the Universe. Anything that inspires an aesthetic love of science, reasoning, or philosophy. I've yet to see The Scout Mindset inspire a wave of converts (although I'd be curious to analyze LessWrong/EA Forum usage after the book was published), and my friend didn't "convert" after listening to it. It does have useful tools for the already converted, though.
  3. Finally, conflating success at status, money, or relationships with rational argumentation, or with coherence- and correspondence-based epistemological values. This would plant the seed of curiosity about how these tools are so effective; people would eventually generalize these lessons, and then boom: they're unknowingly rationalists and will possibly join the community itself once they find it.

Approach from the Side: Identities are powerful. You can emphasize the virtuousness of one part of people's identities as being consistent with the virtuousness of another part of their identity. Eventually you can weave these syllogisms together to support an "actually good value". For example, I find that every liberal and conservative pays lip service to the ideas of a 'post-truth world' or 'fake news', and this is great because one can use this piece of their identity to make them more curious about what is true. When I did this with my parents, it generally went something like this:

  1. I like A (hating on people who tell lies)
  2. A is similar to B (liking people who tell the truth)
  3. B is similar to C (liking scientists who tell more truth, better than people who tell less truth)
  4. C is similar to D (I like more truth, better than less truth)
  5. D is similar to E (I like the process that creates more truth, better than the process that creates less truth)
  6. E is similar to .... Z (I like consistent principles that build up an accurate predictive model of the world within my mind and are very generally useful) 

My approach-from-the-side technique worked briefly with my parents: for maybe 2-4 hours they were curious about the evidence for their beliefs and even some epistemology. Their belief statements went back to equilibrium (total confidence, signaling, etc.) after that, though maybe they have shifted slightly in some indiscernible way. Anyways, the brief flicker of light I saw in them still feels worth it, but it might not for you.

Both these Means or Strategies feel like dirty and manipulative ways to use the Dark Arts, but the people I talked to are immersed in a whirlpool of these without even noticing it. Also the End Goal of being able to bring my friends and family closer to philosophy or rationality at all is of major utility to me and my posterity.  If splashing a drowning man might save him, I would do it. 

Answer by SmokeAndMirrors · Nov 11, 2021

>!  

  1. My car's parking spot in my apartment complex. Lots of fits and jumps here and there, but it always ends up settled at home with only short trips elsewhere. I wonder if it would be considered some type of split equilibrium because of how long it often stays at work (36 hrs/week).
  2. Humidity in my apartment. We have a dehumidifier set to 45% all year round (we live in Oregon and got mold last year).
  3. My need for social activity acts a bit like an equilibrium: I take actions to increase social activity if I'm feeling anti-social, or decrease it if I'm feeling over-social. (A toy sketch of this kind of set-point feedback follows the list.)
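
To make the set-point idea concrete, here's a minimal sketch of negative feedback toward a target, loosely modeled on the dehumidifier above. The specific numbers (a 45% target, a made-up hourly drift, and a made-up correction rate) are illustrative assumptions, not measurements:

```python
# Toy model of set-point (negative feedback) dynamics, loosely based on the
# dehumidifier example above. The 45% target, +2%/hour drift from damp Oregon
# air, and 5%/hour correction are all made-up illustrative numbers.

TARGET = 45.0      # humidity set point the dehumidifier is configured to hold (%)
DRIFT = 2.0        # ambient moisture pushing humidity up each hour (%)
CORRECTION = 5.0   # humidity the unit can remove per hour while running (%)

def step(humidity: float) -> float:
    """One hour: humidity drifts up, and the unit runs only above the set point."""
    humidity += DRIFT
    if humidity > TARGET:
        humidity -= min(CORRECTION, humidity - TARGET)
    return humidity

humidity = 60.0  # start well above the set point
for hour in range(12):
    humidity = step(humidity)
    print(f"hour {hour + 1:2d}: {humidity:.1f}%")
# Whatever the starting value, the trajectory settles near 45% and then just
# jitters around it -- "fits and jumps" that keep returning to equilibrium.
```

The same shape (a disturbance plus a correction that only kicks in past a threshold) seems to fit the parking-spot and social-activity examples too.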

 

The Pruned

Epistemically Uncertain: Gender norms. I thought about this one for a while, and it was interesting, but I just don't see too many equilibria forming around "stable" femininity or masculinity throughout history. Oh well. I also removed the lipostat.

Boring Repetitions: Temperature of the human body, other biologically important homeostasis levels, or psychologically important set-points. !<

 

I can't seem to put spoiler tags on this?..