Case study: A simple algorithm for fixing motivation
So here I was, trying to read through an online course to learn about cloud computing, but I wasn't really absorbing any of it. No motivation.
Motives are a chain, ending in a terminal goal. Lack of motivation meant that my System 1 did not believe what I was doing would lead to achieving any terminal goal. The chain was broken.
So I traversed the chain to see which link was broken.
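That traversal can be sketched as a tiny procedure. A minimal sketch, where the chain contents and the `believes` predicate are invented for illustration:

```python
# A hypothetical motive chain: each link says "doing X serves goal Y".
# The last goal is the terminal one.
chain = [
    ("read cloud-computing course", "get certified"),
    ("get certified", "land a better job"),
    ("land a better job", "financial security"),
]

def find_broken_link(chain, believes):
    """Walk the chain link by link; return the first link that
    System 1 doesn't actually believe, or None if the chain holds."""
    for step, goal in chain:
        if not believes(step, goal):
            return (step, goal)
    return None

# Example: System 1 doubts that certification leads to a better job.
doubts = lambda step, goal: goal != "land a better job"
find_broken_link(chain, doubts)  # -> ("get certified", "land a better job")
```

Once the broken link is found, the fix is local: either repair that belief or reroute the chain, rather than forcing the whole activity.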
Ibogaine seems to reset opiate withdrawal. There are many stories of people with 20-year-old heroin addictions who are cured within one session.
If this is true, and there are no drawbacks, then we basically have access to wireheading. A happiness silver bullet. It would be the hack of the century. Distributing ibogaine + opiates would be the best mental-health intervention known, by orders of magnitude.
Of course, that's only if there are no unforeseen caveats. Still, why isn't everybody talking about this?
Today I had some insight into what social justice really seems to be trying to do. I'll use neurodiversity as an example because it's less likely to lead to bad-faith arguments.
Let's say you're in the (archetypical) position of a king. You're programming the rules that a group of people will live by, optimizing for the well-being of the group itself.
You're going to shape environments for people. For example you might be running a supermarket and deciding what music it's going to play. Let's imagine that you're trying to create the optimal environment for people.
The problem is, since there is more than one person that is affected by your decision, and these people are not exactly the same, you will not be able to make the decision that is optimal for each one of them. If only two of your customers have different favourite songs, you will not be able to play both of them. In some sense, making a decision over multiple people is inherently "aggressive".
But what you can do is reduce the amount of damage. My understanding is that this is usually done by splitting up the people as finely as possible. You might split up your audience i... (read more)
Here's a faulty psychological pattern that I recently resolved for myself. It's a big one.
I want to grow. So I seek out novelty. Try new things. For example I might buy high-lumen light bulbs to increase my mood. So I buy them, feel somewhat better, celebrate the win and move on.
Problem is, I've bought high-lumen bulbs three times in my life now, yet I sit here without any. So this pattern might well repeat: I feel like upgrading my life, get the nice idea of buying light bulbs, buy them, celebrate my win and move on.
So that's four life-upgrades, but did I grow four times? Obviously I only grew once: from not having high-lumen light bulbs to having them.
My instinct towards growth seems to think this:
growth = novelty
But in reality, it seems to be more like this:
growth = novelty - decay
which I define as equal to
growth = novelty + preservation
The tap I installed that puts this preservation mindset into practice seems to be very helpful. It's as follows: if I wonder what to do, instead of starting over ("what seems like the best upgrade to add to my life?") I first check whether I'm on track with the implementation of past good ideas... (read more)
You may have heard of the poverty trap, where you have so little money that you're not able to spend any money on the things you need to make more. Being broke is an attractor state.
You may have heard of the loneliness trap. You haven't had much social interaction lately, which makes you feel bad and anxious. This anxiety makes it harder to engage in social interaction. Being lonely is an attractor state.
I think the latter is a close cousin of something that I'd like to call the irrelevance trap: you feel like what you do doesn't matter, which drains your motivation to take on responsibility, and carrying no responsibility keeps you feeling irrelevant. Feeling irrelevant is an attractor state.
I speculate that some forms of depression (the dopaminergic type) are best understood as irrelevance traps. I'm pretty sure that that was the case for me.
How do you escape such a trap? Well you escape a loneliness trap by going against your intuition and showing up at a party. You escape an irrelevance trap by going against your intuition and taking on more responsibility than you feel you can handle.
I have gripes with EAs who try to argue about which animals have consciousness. They assume far too readily that consciousness and valence can be inferred from behavior at all.
It seems quite obvious to me that these people equate their ability to empathize with an animal with that animal's capacity for consciousness, and it seems quite obvious to me that this is a case of the mind projection fallacy. Empathy is just a simulation. You can't actually see another mind.
If you're going to make guesses about whether a species is conscious, you sho... (read more)
I did all the epistemic virtue. I rid myself of my ingroup bias. I ventured out on my own. I generated independent answers to everything. I went and understood the outgroup. I immersed myself in lots of cultures that win at something, and I've found useful extracts everywhere.
And now I'm alone. I don't fully relate to anyone in how I see the world, and it feels like the inferential distance between me and everyone else is ever increasing. I've lost motivation for deep friendships, it just doesn't seem compatible with learning new t... (read more)
Here's an idea: we hold an Ideological Turing Test (ITT) world championship. Candidates compete to pass as broad a range of views as possible.
Points awarded for passing a test are commensurate with the number of people who subscribe to the view. You can subscribe to several views at once.
The awarding of failures and passes is done anonymously. Points can be awarded partially, according to what % of judges give a pass.
The winner is made president (or something).
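The scoring rule above can be sketched in a few lines. All views, subscriber counts, and judge fractions here are made up for illustration:

```python
def itt_score(passes, subscribers):
    """Total score: for each view attempted, the fraction of judges
    who gave a pass times the number of people holding that view."""
    return sum(frac * subscribers[view] for view, frac in passes.items())

# Hypothetical championship round.
subscribers = {"view_a": 1_000_000, "view_b": 50_000}
candidate = {"view_a": 0.8, "view_b": 1.0}  # fraction of judges passing
itt_score(candidate, subscribers)  # -> 850000.0
```

Weighting by subscriber count means passing one mainstream view can outweigh passing many fringe ones, which matches the intent that candidates model the views people actually hold.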
As someone who never came across religion before adulthood, I've been trying to figure it out. Some of its claims seem pretty damn nonsensical, and yet some of its adherents seem pretty damn well-adjusted and happy. The latter means there's gotta be some value in there.
The most important takeaway so far is that religious claims make much more sense if you interpret them as phenomenological claims. Claims about the mind. When Buddhists talk about the 6 worlds, they talk about 6 states of mood. When Christians talk about a coven... (read more)
So here's two extremes. One is that human beings are a complete lookup table. The other one is that human beings are perfect agents with just one goal. Most likely both are somewhat true. We have subagents that are more like the latter, and subsystems more like the former.
But the emphasis on "we're just a bunch of hardcoded heuristics" is making us stop looking for agency where there is in fact agency. Take for example romantic feelings. People tend to regard them as completely unpredictable, but it is actually possible to predict to some extent whether y
I've been trying to figure out what finance really is.
It's not resource allocation between different people, because the intention is that these resources are paid back at some point.
It's rather resource re-allocation between different moments in one person's life.
Finance takes money from a time-slice of you that has it, and gives it to a time-slice of you that can best spend it.
Optimal finance means optimal allocation of money across your life, regardless of when you earn it.
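A toy way to make that concrete: with diminishing returns on spending (log utility, say), total utility across life is maximized by smoothing consumption between time slices, whatever the income timing. All numbers here are invented, and interest rates are ignored:

```python
import math

def utility(consumption):
    """Log utility: each extra unit of spending is worth less."""
    return sum(math.log(c) for c in consumption)

income = [10, 100, 40]  # earnings per life period
smoothed = [sum(income) / len(income)] * len(income)  # = [50, 50, 50]

# Spending income exactly as it arrives scores lower than
# borrowing/saving to spend the same total evenly.
utility(income)    # ~10.6
utility(smoothed)  # ~11.7
```

Loans move money from a later, richer time-slice to an earlier, poorer one; savings do the reverse. Both are just this smoothing operation.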
Question for the Kegan levels folks: I've noticed that I tend to regress to level 3 if I enter new environments that I don't fully understand yet, and that this tends to cause mental issues because I don't always have the affirmative social environment that level 3 needs. Do you relate? How do you deal with this?