Hi, could anyone help me obtain

"Limits of Scientific Inquiry" by G. Holton, R. S. Morison ( 1978 )

and

"What is Your Dangerous Idea?: Today's Leading Thinkers on the Unthinkable." Brockman, John (2007)

Thanks in advance

So yes, you'd likely lose the fun of normal dreaming: experiencing weird stuff, letting the insane flow of your dreams carry you like a leaf on a mad wind without even feeling confused by it, instead feeling that it was plain normal and made total sense, having lots of warm fuzzy feelings and half-formed thoughts about your experiences in that dream.

Yet you might, on the other hand, gain the fun of being able, for instance, to capitalize on your dreaming time to learn and do some thinking. Not to mention the pleasure and sense of security derived from knowing your rational mind can work even under (some) adverse conditions.

From the popularity of the "Strangest thing an AI could tell you" post, and anosognosia tidbits in general, this topic seems to fascinate many people here. I for one would find it freakishly interesting to discover that I had such an impairment. In other words, I'd have motivation to at least genuinely investigate the idea, and even accept it.

How I'd come to accept it would probably involve a method other than just "knowing it intuitively", the way I intuitively know the face of a relative to be that of a relative, or the way I know with utter, gut-level certainty that I have three arms. Considering that we are, well, rationalists, couldn't we be expected to be able to use methods other than our senses and intuitions to discover truth? Even if the truth is about ourselves, and contradicts our personal feeling?

After all, it's not as though people in the early 20th century had observed tiny pictures of atoms; they deduced their existence from relatively nonintuitive clues glued together into a sound theoretical framework. Observing nature and deducing its laws has often been akin to being blind, yet still managing to find your way around by indirect means.

If I had to guess, I'd still not be certain that being a rationalist, using scientific methods and all those tools that help straighten chains of inference, and finding anosognosia more of a treat than a pain to be rationalized away, would make it a sure bet that I didn't still retain a blind spot.

Maybe the prospect of some missing things would be too horrid to behold, no matter how abstractly; perhaps beholding them would require me to think in a way that's just too complicated, abstract, and alien for me to ever notice them as something salient, let alone comprehensible.

Still, that's really not what my intuition would lead me to believe, what with truth being entangled and so forth. And such a feeling, such an intuition, may be exactly part of the problem of why and how I'd fail to pay attention to such an impairment. Perhaps I just don't want to know the truth, and willingly look away each time I could see it. Then again, if we're talking rationalization and lying to oneself, that has a particular feeling, and that is something one could be able to notice.

This applies more generally than to anosognosia alone, and was very illuminating. Thank you!

So, provided that as we grow, some parts of our brain, and hence of our mind, change, this upsets the balance of our mind as a whole.

Let's say someone relied on his intuition for years, and consistently observed that it correlated well with reality. That person would have had a very good reason to rely more and more on that intuition, and to use its output unquestioningly, automatically, to fuel other parts of his mind.

In such a person's mind, one of the central gears would be that intuition. The whole machine would eventually depend upon it, and to remove the intuition would mean, at best, that years of training and fine-tuning of that rational machine would be lost; a new way of thinking would have to be found and trained up again. Most people wouldn't even realize that, let alone be bold enough to admit it and start over from scratch.

And so, some years later, the black-boxed process of intuition starts to deviate from correctly predicting reality for that person. And the whole rational machine carries on using it, because that gear has become too well established, and the whole machine has lost its fluidity as it specialized in exploiting that easily available mental resource.

Substitute emotions or drives for intuition, and the same may happen. And so, from being a well-calibrated rationalist, you start deviating, slowly losing your mind, getting it wrong more and more often when you get an idea, or try to predict an action, or decide what would be to your best advantage, never realizing that one of the once-dependable gears in your mind has slowly been worn away.

There's no such thing as an absolute denial macro. And I sure hope this triggers yours.

Yes, I would. Why the acute interest?

Is it because, by admitting to being able to believe that, one would admit to having no strong enough internal experience of morality?

Experience of morality, that is, in a way that would make him say: "No, that's so totally wrong, and I know because I have experienced both genuine guilt and shame, AND also the embarrassment of being caught falsely signaling, AND I know how they are different things." I have a tendency to always dig deep enough to find how it was selfish of me to do or feel something in particular. And yet I can't always help feeling guilt or shame whose deep roots exist apart from my conscious rationalizations of how what I do benefits me. Oh, and sometimes it also benefits other people too.

Now we have a lot higher GDP

Yes indeed. Do you expect that to remain true after a nuclear war too? More basically, I suppose I could summarize my idea as follows: you can poke a hole in a country's infrastructure or economy, and the hole will heal with time because the rest is still healthy enough to help with that, just as a hole poked into a life form can heal, provided that the hole isn't big enough to kill the thing, or to send it into a downward spiral of degeneration.

But yes, society isn't quite an organism in the same sense. There you could probably have full-scale cataplasia and still see something survive someplace, and perhaps even, from there, start again from scratch (or better, or worse, than scratch).

Agranarian is the new vegetarian.

Well, kidding aside, your argument, taken from Pearl, seems elegant. I'll have to read the book, however, before I feel entitled to an opinion on that one, as I haven't grokked the idea, only a faint impression of it and of how healthy it sounds.

So at this point, I only have some of my own ideas and intuitions about the problem, and haven't searched for the answers yet.

Some considerations, though:

Our idea of causality is based upon a human intuition. Could it be just as wrong as vitalism, time, little billiard balls bumping around, or the still-confused problem of consciousness? That's what would bug me if I had no good technical explanation, one provably unbiased by my prior intuitive belief about causality (otherwise there's always the risk that I've just been rationalizing my intuition).

Every time we observe "causality", we really only observe correlations, and then deduce that there is something more behind them. But is that a simple explanation? Could we devise a simpler consistent explanation to account for our observation of correlations? As in, totally doing away with causality? Or at the very least, redefining causality as something that doesn't quite correspond to our folk definition of it?
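One way to make that redefinition precise, sketched here in Pearl's do-notation (his framework being the one mentioned above), is to distinguish observing A from intervening to set A:

\[
P(B \mid A) \quad \text{vs.} \quad P(B \mid \mathrm{do}(A))
\]

The first quantity is plain correlation, the only thing we actually get to observe; the second asks what happens to B when A is forced from outside the system. On Pearl's account, a causal claim is a statement about the second quantity, which can differ from the first whenever, say, a common cause drives both A and B.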

Roughly, my intuition, when I hear the word causality, is something along these lines:

" Take event A and event B, where those events are very small, such that they aren't made of interconnected parts themselves - they are the parts, building blocks that can be used in bigger, complex systems. Place event A anywhere within the universe and time, then provided the rules of physics are the same each time we do that, and nothing interferes in, event B will always occur, with probability 1, independantly of my observing it or not." Ok, so could (and should ?) we say that causality is when a prior event implies a probability of one for a certain posterior event to occur ? Or else, is it then not probability 1, just an arbitrarily very high probability ?

In the latter case, with probability less than 1, that really violates my folk notion of causality; I don't really see what's causal about a thing that can capriciously choose to happen or not, even when the conditions are the same.

In the former case, I can see how that would be a very new thing. I mean, probability 1 that given one event, another will occur? What better, firmer foundation to build a universe upon? It feels really very comfortable and convenient; all too comfortable, in fact.

Basically, neither of those possibilities strikes me as obviously right. For those reasons and then some, the idea I have of causality is confused at best. And yet I'd say it is not too unsophisticated or unconsidered as it stands. Which makes me wonder how people who have put less thought into it (probably a lot of people) can deservedly feel any more comfortable saying it exists, without a second thought (almost everyone), even though they don't have any good explanation for it (which is a rare thing), such as perhaps the one given by Pearl.
