Can someone help me dissolve this, and give insight into how to proceed with someone who says this?

 

What are they saying, exactly? That the set of beliefs in their head that they use to make decisions is not the same set of beliefs that you use to make decisions?

 

Could I say something like "Yes, that's so, but how do you know that your truth matches what is in the real world? Is there some way to know that your truth isn't only true for you, and not actually true for everybody?"

 

I'm trying to get a feel for what they mean by "true" in this case, since it's obviously not "matching reality."


Found my favourite version:

I'd guess it's something between "I'm pretty sure I'm right but I don't want to argue about it with you right now" and "fuck off brah".

Depends on the context, obviously, but my first interpretation would be "My values are not your values". In popular usage "truth" means more than empirically proven facts about the objective reality -- e.g. people routinely call "truth" what they believe not only in the descriptive but also in the normative sense.

I would recommend making clear two separations: between descriptive ("US economic growth has been slow recently") and normative ("We need to accelerate the US economic growth"); and between facts ("The US GDP grew by 2.4% in 2015"), preferences ("Fighting inequality is more important than gross economic growth"), and forecasts, often conditional ("We can accelerate the economic growth by cutting taxes").

While it is safe to assume most people will understand that statement in their own way (which makes me doubt whether I should write a response at all), I think I can say a few words on the subject.

The statement is the speaker's way of affirming that they are aware that what they have understood of the world may differ from what you have understood of the world. How strongly it is meant depends on what they were thinking at the time; it could rest on logical grounds or on grounds rooted in physical reality.

The desired effect is either to open a conversation about that idea, or it is a simple recycled move to limit where the conversation is going or where it can go.

They might simply mean "your apparent interest in truth is not an interest in matching reality, but in matching your feelings or something like that." They may or may not be right depending on who you are.

I expect that "truth" in this context just means "my opinion about what the truth is", which can be neither easily verified nor refuted, and despite that both debaters are pretty confident in their opinions. In that case a response could be something like: "I would prefer the word 'opinion', but I hope we understand each other ;-)"

For a more accurate response I would need some examples. I don't recall coming across this phrase in a discussion.

The steelman of this is "we have different priors and different experiences, so our beliefs and values diverge". Most (but not all) of the time I've heard people use this phrase, it also indicates that the speaker doesn't expect fruitful or enjoyable further discussion of those differences.

"your map is not my map (and I don't expect to have my map improved by looking at your map)"

well, that's probably too charitable. more like:

"your map is not my map, and we both know that updating one's map means losing status, so neither of us is going to do that"

I've found it best to avoid the word "truth" whenever possible. The concept of "truth" implies an objective reality exists and that you know about it. Since we may be in a simulation, in the imagination of a god, or just hallucinating, we can never really be sure about "truth" and I find it boring to play semantic games in order to better hedge the word.

I find it much better to just focus on predictions and beliefs with explicit levels of confidence.

If you're talking about whether the sun rises tomorrow, and you say you predict that it will rise with high confidence, and your interlocutor responds, "That's not my truth," then you can just ask them to break that down into a prediction. Are they saying the sun won't rise? If so, okay, you can test that.

If the disagreement is over something that can't practically be tested, you can still interrogate their concrete predictions and see where they disagree with yours.

Religious people love talking about Truth because it is so confusing. I can't nail you down and show where you're wrong if you refuse to be concrete, so if you don't want to be shown to be wrong, just talk about abstract Truth.

Since we may be in a simulation, in the imagination of a god

Even then, what the simulation is actually simulating or what the god is actually imagining would still be a matter of fact on which people could disagree but at least one of them would be wrong.

Not necessarily true.

Why couldn't god create a reality that is illogical, chaotic, contradictory and random?

That's true, but I was trying to emphasize the angle that if you are in a universe where the "ground truth" is being hidden from you by an adversarial simulator/god, then you very likely won't ever be able to know the objective truth. And since we can't ever know with 100% certainty that we aren't in such a situation, it's pointless to make claims with certitude about the "true" nature of reality. Much better to stick to what we expect to happen in our own domain.


My job is selling books. On the one hand, I should sell as many and/or as expensive books as possible; on the other, since customers keep coming back, I should (if they explicitly ask my opinion, or if I see them hesitate) offer them deals that would be okay for them (I don't aim higher - there are time constraints, after all). Taken together, this means I have to make assumptions about what they want and how much they want it (it's actually interesting - listening to imprecise words, reading faces, trying to remember if we have something without looking it up, etc.). Recently, a lady asked what I would consider a good present for an 11 y.o. child. 'How about Tom Sawyer?' 'Oh no, I mean, for a girl.' Now, I had originally offered 'Tom Sawyer' exactly because I did not know the child's gender, but her answer made it look like I had assumed that she was talking about a boy. I could have corrected her, but instead I pointed out 'Anne of Green Gables' and we parted (more-or-less) satisfied. (Obviously I'm still smarting.)

The point is, we could have argued about both our assumptions and our conclusions, but it was not worth it, and it did not matter much. Which of us was right?

Looking for a single answer that applies to all cases seems like a clear mistake. That said, your interlocutor probably does not think in terms of expected value, so they may have an epistemology that attempts to do the same job.

In my experience, people appear to have informal, relatively fluid and vague concepts for things that also have formal, precise and rigorous definitions/expressions in systematic thought, shall we say. Truth appears to be one such thing, as I am sure others have noted here. When someone speaks of "my truth", there could be a few things implied or confounded within that declaration:

  1. What "feels" true to me right here, right now
  2. I can see the arguments, but the chain leads to a core belief of mine that simply cannot be wrong given my worldview, identity etc, therefore it may seem true to you, given what you know, but it is not true to me given what I know to be also true
  3. As others have suggested, this may be a conversation-stopper because the argument is emotionally upsetting, or the other side suspects your motives or reasoning but does not want to engage, or they simply do not care for the emotional and mental energy involved in adjudication of such matters as truth (as defined as beliefs matching some presumed objective reality).
  4. People's reasoning often appears motivated, utility-driven (be it status, or actual rewards or whatever), ergo they may be redefining truth in terms of what would maximise their expected utility in the context in question...so, for example if you, a sceptic/atheist are debating a religious person in the company of their fellow believers, they have more utility in winning or appearing to reach a noble impasse with you than re-negotiate their beliefs in public. Given how a lot of people do not like to admit to being wrong (and this includes everyone, at some point, and requires good metacognitive discipline to overcome), even if it is a two-person conversation, if the topic is one that requires "truth" to be brought into the picture (well, "Pikachu is ridiculous"...could be the topic, for all I know...in which case, there are very few truthful propositions to examine...), the motivation to not lose face, or not yield may be strong.

The rational response to such a comment is to issue oneself a firm "nolle prosequi" and exit the conversation politely*. I have no idea what the conversation was about, so I cannot know the truth of things ;)

*if the topic happened to be one with sufficient fogginess in the real world, re-examining one's own beliefs would be a necessary step - heck, I'd re-evaluate anyway, just as a sanity check.

Quite possibly. But the first is more dangerous and so there is more reason to pay attention to it.

Can someone help me dissolve this, and give insight into how to proceed with someone who says this?

You don't; they just don't want to talk about it. Some people sadly cannot be saved.