

I agree about the cooperation thing. One addendum I'd add to my post is that shared reality seems like a common precursor to doing/thinking together.

If I want to achieve something or figure something out, I can often do better if I have a few more people working/thinking with me, and often the first step is to 'get everyone on the same page'. I think lots of times this first step is just trying to shove everyone into shared reality. Partially because that's a common pattern of behavior, and partially because if it did work, it would be super effective.

But because of the bad news that people actually have different experiences, cracks often form in the foundation of this coordinated effort. I think if the team has common knowledge about the nature of shared reality and the non-terrible/coercive/violent way of achieving it (sharing understanding), this can lead to better cooperation (happier team members, less reality-masking, better map-sharing).

I'm also not sure what you mean about the trust problem; maybe you mean the polls claiming that trust in government and other institutions has been on the decline?

Yeah, let's do in-person sometime. I also tried drafting long responses and they were terrible.

Sure! I love talking about this concept-cluster.

I have a hunch that in practice the use of the term 'shared reality' doesn't actually ruin one's ability to refer to territory-reality. In the instances when I've used the term in conversation I haven't noticed this (and I like to refer to the territory a lot). But maybe with more widespread usage and misinterpretation it could start to be a problem?

I think to get a better sense of your concern it might be useful to dive into specific conversations/dynamics where this might go wrong. 


I can imagine a world where I want to be able to point out that someone is making the psychological mistake of confusing their desire to connect with their map-making. And I want the term I use to do that work, so I can just say "you want to share your subjective experience with me, but I'm disagreeing with you about reality, not subjective experience."

Does that kind of resonate with your concern?

Hmm, I want a term that refers to all those many dimensions together, since for any given 'shared reality' experience it might be like 30% concepts, 30% visual & auditory, 30% emotion/values, etc.

I'm down to factor them out and refer to shared emotions/facts/etc., but I still want something that gestures at the larger thing. 'Shared experience' could do the trick, but it feels a bit too subjective, since the experience often involves interpretations of the world that feel like 'true facts' to the observer.


Wherein I write more, because I'm excited about all this: 

The first time I heard the term 'shared reality' was in this podcast with Bruce Ecker, the guy who co-wrote Unlocking the Emotional Brain. He was giving an example of how a desire for 'shared reality' can make it hard to come to terms with e.g. emotional trauma.

"by believing the parent's negative messages to you (either verbal or behavioral), you're staying in shared reality: and that's a big aspect of attachment. ... especially shared reality about yourself: 'they think I'm a piece of crap, and I do too. So I feel seen and known by them even if the content is negative'."

In this case, the parent thinks the kid is a 'piece of crap', which I expect doesn't feel like an emotion to the parent; it feels like a fact about the world. If they were more intellectually mature they might notice that this was an evaluation, but it's actually super hard to disentangle evaluations and facts.

I guess I think it's maybe impossible to disentangle them in many cases? Like... I think typically 'facts' are not a discrete thing that we can successfully point at; they are typically tied up with intentions/values/feelings/frames/functions. I think Dreyfus made this critique of early attempts at AI, and I think he ended up being right (or at least my charitable interpretation of his point did): that it's only within an optimization process / working toward something that knowledge (knowing what to do given XYZ) gets created.

Maybe this is an is/ought thing. I certainly think there's an external world/territory and it's important to distinguish between that and our interpretations of it. And we can check our interpretations against the world to see how 'factual' they are.  And there are models of that world like physics that aren't tied up in some specific intention. But I think the 'ought' frame slips into things as soon as we take any action, because we're inherently prioritizing our attention/efforts/etc. So even a sharing of 'facts' involves plenty of ought/values in the frame (like the value of truth-seeking). 

Sure! The main reason I use the term is because it already exists in the literature. That said, I seem to be coming at the concept from a slightly different angle than the 'shared reality' academics. I'm certainly not attached to the term, I'd love to hear more attempts to point at this thing. 

I think the 'reality' is referring to the subjective reality, not the world beyond ourselves. When I experience the world, it's a big mashup of concepts, maps, visuals, words, emotions, wants, etc.

Any given one of those dimensions can be more or less 'shared', so some people could get their yummies from sharing concepts unrelated to their emotions. In your example, I think if my parents had something closer to my beliefs, I'd have more of the nice shared reality feeling (but would probably quickly get used to it and want more).

Some side notes, because apparently I can't help myself:

  • I think people often only share a few dimensions when they 'share reality', but sharing more dimensions feels nicer. I think as relationships/conversations get 'deeper' they are increasing the dimensions of reality they are attempting to share. 
    • (I think often people are hoping that someone will be sharing ALL dimensions of their reality, and can feel super let down / disconnected / annoyed when it turns out their partner doesn't share dimension number X with them).
  • Having dimensions that you don't share with anyone can be lonely, so sometimes people try to ignore that part of their experience (or desperately find similar folks on the internet). 
  • My examples seem to have been mostly about joy, but I don't think there is any valence preference; people love sharing shitty experiences. 
    • That said, probably the stronger / more prominent the experience the more you want to share (and the worse it feels to not share). 

OK, I've added a disclaimer to the main text. I agree it's important. It seems worth having this kind of disclaimer all over the place, including in most relationship books. Heck, it seems like Marshall Rosenberg in Nonviolent Communication is only successfully communicating like 40% of the critical tech he's using.

Do you understand how e.g. Rari's USDC pool makes 20% APY?

  • Lending would require someone borrowing at rates higher than 20%, but why do that when you can borrow USDC elsewhere at much lower rates? Or maybe the last marginal borrower really is willing to pay that rate? But then why does Aave offer such low rates?
  • Providing liquidity would require an enormous volume of trades that I don't expect to be happening, but maybe I'm wrong.

The only thing that my limited imagination can come up with is 'pyramid scheme', where you also get paid a small fraction of the money that other people are putting into the pool. So as long as the pool keeps growing, you get great returns. But the last half of the pool gets small (or negative) returns.
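To make the 'pyramid scheme' hypothesis concrete, here is a toy simulation: a sketch with invented numbers and mechanics, not a claim about how Rari's pool actually works, where the pool's 'yield' is paid out of fresh deposits.

```python
# Toy model of a pool whose yield is funded by new deposits rather than
# real revenue. All numbers and mechanics here are invented for illustration.

def simulate_ponzi_pool(inflows, payout_rate=0.05):
    """Per-period realized yield for existing depositors.

    inflows: new deposits arriving each period (made-up figures).
    payout_rate: the per-period yield the pool promises.
    """
    pool = 0.0
    yields = []
    for new_money in inflows:
        paid = 0.0
        if pool > 0:
            promised = pool * payout_rate
            # Payouts can only come out of fresh inflow in this model.
            paid = min(promised, new_money)
            yields.append(paid / pool)
        pool += new_money - paid
    return yields

# While deposits keep arriving, everyone sees the promised ~5% per period;
# once inflows stall, realized yield collapses to zero.
growing = simulate_ponzi_pool([100, 100, 100, 100])
stalled = simulate_ponzi_pool([100, 100, 0, 0])
print(growing, stalled)
```

In this sketch the headline rate is only sustainable while the pool keeps attracting new money, which matches the intuition that the last half of the pool gets small (or negative) returns.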

I'd love to get a better sense of this, maybe you could point me to your favorite writeup?

Yeah, I think that mosquito map is showing the Zika-carrying species, but there are 40 other species in Washington. Mosquitoes in New England (certainly Maine, where I grew up) can be pretty brutal, especially during the weeks when the black flies and midges are also biting.


I've been playing around with this concept I call 'faith', which might also be called 'motivation' or 'confidence'. Warning: this is still a naive concept and might only be positive EV when used in conjunction with other tools which I won't mention here.

My current go-to example is exercising to build muscle: if I haven't successfully built muscle before, I'm probably uncertain about whether it's worth the effort to try. I don't have 'faith' that this whole project is worth it, and this can cause parts of me to (reasonably!) suggest that I don't put in the effort. On the other hand, if I've successfully built muscle many times (like Batman), I have faith that my effort will pay off. It's more like a known purchase (put in the effort, you'll get the gains), instead of an uncertain bet (put in the effort, maybe get nothing).

Worth noting: it's not as clear-cut as a known purchase. The world is more uncertain than that, and the faith I'm referring to is more robust to uncertainty. I expect that every time Christian Bale re-built muscle, it was a different process: some routines didn't work as well, and some new routines were tried. Faith is the confidence/motivation that, even in the face of uncertainty and slow feedback loops, your effort will be worth it.

A LessWrong-style framing of this concept might be something like 'a fully integrated sense of positive expected value'.
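As a toy illustration of that EV framing (all numbers invented), the same effort can be a bad bet or a good one depending on your credence that it pays off:

```python
# Hypothetical numbers: payoff and cost in arbitrary "utility" units.

def expected_value(p_success, payoff, cost):
    # EV of attempting: probability-weighted payoff minus the cost of effort.
    return p_success * payoff - cost

# Low credence that the effort pays off -> negative EV, so parts of you
# (reasonably!) suggest not bothering.
newbie = expected_value(p_success=0.3, payoff=10, cost=5)

# Batman has built muscle many times; high credence makes the same
# effort clearly worth it.
batman = expected_value(p_success=0.9, payoff=10, cost=5)

print(newbie, batman)  # negative for the newbie, positive for Batman
```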

Holding this concept in mind as something that might be going on (having/lacking/building/losing faith) has been useful lately. I might keep editing this as I better flesh out what's going on.
