In using the term Antifragilism, I am simply referring to the broad pursuit of Antifragility (to avoid conflating this with the property of something itself being Antifragile). This seemed unambiguous to me, but perhaps I should edit the post to say so explicitly if it is causing confusion. I have resisted the temptation to define Antifragility myself, as Taleb does a better job.

"Plenty of old traditions aren't antifragile, they are just robust because they stand the test of time."

This is basically one of the things I'm getting at - it seems to be a common failure mode among people trying to pursue Antifragility that they instead settle on a merely robust tradition. Taleb rails against conflating Antifragility with robustness in his book, but it still seems easy for people to do.

Interesting links - thanks for the Wikipedia rabbit-hole :)

I initially interpreted your comment as attributing to Pragmatism the kind of "this belief is useful to me, so I will continue to behave as though it is true" attitude that is used to defend religious beliefs that make people happy. I would have disagreed with that interpretation, but after reading what you linked, I see that your point was much more subtle.

Looking at the articles on the Deflationary, Pragmatic, and Correspondence Theories of Truth, I must admit that some of the nuances are lost on me, but I do think there is enough overlap between these theories that there isn't anything too irrational about any of them. The Pragmatic Theory of Truth article states that Peirce's approach was at least superficially based on the Correspondence Theory, and the Deflationary Theory article uses Tarski's work as an example, even though Tarski himself did not consider his approach Deflationary. I would probably need to spend a long time reading up on this to give a more intelligible response.

I would say that, from the perspective of Newcomb-like problems, Pragmatism does an unusually good job of suggesting that you should one-box. When deciding whether to take one box or both, the "true" contents of the boxes matter less than the payoff you will actually receive. I'm not sure what this implies about which theory of truth is the most meaningful, but it seems relevant.
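
To make that payoff comparison concrete, here is a minimal sketch of the expected-value calculation a pragmatist might run. The dollar amounts and the 99% predictor accuracy are the usual illustrative numbers for Newcomb's problem, not anything stated above:

```python
# Sketch of the standard Newcomb payoff structure. The $1,000,000 / $1,000
# amounts and the 0.99 predictor accuracy are illustrative assumptions.

def expected_payoff(one_box: bool, predictor_accuracy: float = 0.99) -> float:
    """Expected payoff of a strategy against a mostly-accurate predictor."""
    opaque, transparent = 1_000_000, 1_000
    if one_box:
        # The predictor foresaw one-boxing with probability
        # `predictor_accuracy`, in which case the opaque box was filled.
        return predictor_accuracy * opaque
    # Two-boxing: you always get the transparent box; the opaque box is
    # filled only when the predictor erred.
    return transparent + (1 - predictor_accuracy) * opaque

print(expected_payoff(one_box=True))   # 990000.0
print(expected_payoff(one_box=False))  # 11000.0
```

Under these numbers, one-boxing comes out ahead whenever the predictor is right more than about 50.05% of the time, which is the pragmatist's point: the strategy that actually pays is the one worth acting on.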