G Gordon Worley III

Director of Research at PAISRI



Anatomy of a Gear

Since no one took you up on your question, I'll take it up.

Temperature gives us a gear for talking about how hot or cold something is by degrees. The more degrees something has, the hotter we say it is; the fewer, the colder, down to the limit of absolute zero.

Yet temperature is actually one number summarizing a lot of complex interactions and exchanges of energy between particles (for now bottoming out at that particular gear). If we look at those individual particles we'll find a more complex story that's noisy (and gets noisier as things get hotter), but we don't actually need to know it to answer questions like "will this thing melt?" or "will this thing freeze?", which generally just require knowing the average energy of the thing to predict how it will behave.
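The idea above can be sketched in a few lines of code. This is a hypothetical illustration, not a physics simulation: I use an exponential distribution as a rough stand-in for the spread of per-particle energies, and the function names are made up for the example.

```python
import random

random.seed(0)

def particle_energies(mean_energy, n=10_000):
    """Sample noisy per-particle energies. The exponential distribution
    here is only a rough stand-in for a real energy distribution."""
    return [random.expovariate(1.0 / mean_energy) for _ in range(n)]

def effective_temperature(energies):
    """Collapse the noisy per-particle detail into a single gear: the mean."""
    return sum(energies) / len(energies)

hot = particle_energies(mean_energy=300.0)
cold = particle_energies(mean_energy=100.0)

# Individual particles in `hot` can have less energy than individual
# particles in `cold`, but the one-number summary still cleanly orders
# the two systems, which is all we need for coarse questions.
assert effective_temperature(hot) > effective_temperature(cold)
```

The point of the sketch is that the per-particle story is noisy and high-dimensional, but the single averaged number is the right level of abstraction for the questions we usually ask.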

Retrospective: November 10-day virtual meditation retreat

I'm curious, did you notice you became more settled after the third day? It's a somewhat common experience during sesshin to have something of a breakthrough or surrender on the third day and give oneself over to the sesshin, so to speak. I realize this is a different tradition, so I'm curious if you saw evidence of a similar phenomenon.

Using false but instrumentally rational beliefs for your career?

But perhaps believing the truth is against my interest. If I believe that my work is important to outcomes, I will work harder. If I believe my work has little importance, I may become lazy or seek side-hustles. Should I convince myself that my publication count and thesis quality are more important than is true?

So I think the answer is somewhat complicated, and requires unpacking a few things. The big thing is that it is possible to commit without belief. Many people find this hard to do without expending willpower, and so find it instrumentally useful to lie to themselves about what they believe, only possibly to discover that their beliefs are not very malleable and they can't successfully lie to themselves to achieve this end.

The best situation would be to believe the truth and do it anyway. This requires a level of non-identification with the belief, though, such that you can successfully invest in an uncertain outcome and be happy with the expected returns rather than the actual returns.

If that's not possible, next best would be setting up incentives such that you don't have to change your belief and can maintain beliefs you believe to be true but are nonetheless incentivized to do what you want yourself to do. This is painful for a lot of people because they feel themselves fighting the incentives they themselves set up, but it's an option.

Epistemically the worst option is to lie to yourself, but also probably the least painful if the first option is not available. It will work so long as you can maintain the lie, but you might not like having to do all the work to maintain it, and it'll inevitably poison other beliefs by the need to maintain the network of dependent beliefs that prop up the falsehood you're maintaining. Not recommended, and you'll create a lot of harm for yourself to unravel later.

I'm limiting my comments to the question at hand, but if you asked for my general opinion, it would be to have alternatives that let you out of the frame of the situation you've created, so you don't have to do this.

Zen and Rationality: Skillful Means

I don't. That kind of knowledge is part of the tradition passed down from one teacher to the next.

There's also a tradition within vajrayana schools that involves a more direct kind of thing where teachers pick practices for their students to work with, but I believe that's also knowledge that is not transmitted in writing.

Zen and Rationality: Skillful Means

It's not impossible to figure out what is worth working on or what techniques to use as a student independent of a teacher's recommendation. There's a meta skill of doing this, a kind of way of both observing hints about yourself and what you need, and experimenting. Even with that you might be better served by a teacher who can short circuit experiments to rediscover knowledge, but it's not irreplaceably essential in all cases.

Zen and Rationality: Skillful Means

I agree and think this is a weakness within the rationality community's approach to training. The challenge is that it's hard to be a student, and rationality disproportionately attracts folks who are bad at being students (too many folks who are overly independent, avoidant, recalcitrant, or otherwise generally defiant of allowing themselves to be dominated by or subservient to others). Further, I'm not sure there are many good teachers within rationality in the sense that you'd be willing to give them the kind of trust a Zen teacher asks of a student. Thus rationality has to take a different approach, offering up a whole bunch of stuff and some guidance about how to use it, but also largely leaving people to their own devices, because of a combination of lacking a stronger culture of a particular kind and having a culture that prefers going it alone.

What are some good examples of fake beliefs?

I think an important issue to keep in mind about fake beliefs is that they may be locally useful but not globally useful, i.e. they might help you for a while but you'll eventually have to unlearn them to get out of local maxima.

What are some good examples of fake beliefs?

This seems to point to a general category of beliefs necessary to prop up deontological ethics as possibly useful fake beliefs in that they help motivate you to follow norms (which are hopefully good norms).

Early Thoughts on Ontology/Grounding Problems

Interesting. I can't recall if I commented on the alignment as translation post about this, but I think this is in fact the key thing standing in the way of addressing alignment, and put together a formal model that identified this as the problem, i.e. how do you ensure that two minds agree about preference ordering, or really even the statements being ordered.

Writing to think

I have basically the same attitude towards writing, and found it really helpful over the years to write with a mind to publish, even if all I was doing was rambling and posting on Facebook to a somewhat limited audience. In fact, for a while my Facebook feed was full of long, rambling posts of me just working out ideas in writing and trying to find ways to express ideas.

If you have this impulse, I think lowering the bar to what you will write and publish is key. Doing that on LW can be a little hard because of downvotes and critical comments, so depending on how you respond to those, first finding a space where you feel you can just write what you want without consequence can be really freeing and get things going. For me that was Facebook, but I could imagine it being LW short form, Twitter, or something else.
