Nick Hay

11y3

Fascinating, I thought Tennenbaum's theorem implied non-standard models were rather impossible to visualize. The non-standard model of Peano arithmetic illustrated in the diagram only gives the successor relation; there's no definition of addition and multiplication. Tennenbaum's theorem implies there's no computable way to do this, but is there a proof that they can be defined at all for this particular model?

The chapter on Chomsky contrasts the generative grammar approach, which Lakoff used to work within, with the cognitive-science-inspired cognitive linguistics approach, which Lakoff has been working in for the last few decades. Cognitive linguistics includes cognitive semantics, which is rather different from generative semantics.

12y1

I largely agree with your critique, but more as a description of a different book that could have been written in this book's place. For example, a book on philosophy applying the results of this book's methodology, for which chapter 25 is a poor substitute. Or books drilling into one particular area in more detail, with careful connections to the literature. This book serves better as an inspiring manifesto.

While these chapters are enlightening, they depend too heavily on the earlier account of metaphor, rarely draw upon other findings in cognitive science that are likely relevant, are sparse in scientific citations, and (as I've said) rarely cite actual philosophers claiming the things they say that philosophers claim.

Why is the dependence on the earlier theory of metaphor a problem?

Do you think the authors misrepresent what philosophers claim, in those chapters addressing philosophy (15-24) rather than (informal) philosophical ideas (9-14)?

If the goal of exercise is to lose weight, have you tried replacing carbohydrates with fat in your diet? Forcing yourself to exercise will work up an appetite and make you hungry, but it won't make you lose weight. There is a correlation between exercising and being thin, but the causality is generally perceived the wrong way around. There is also a correlation between exercising and (temporarily) losing weight, but that is confounded by diet changes, which typically involve reducing carbohydrate intake.

I've heard you mention Gary Taubes's work, but not that you've read it. If you haven't read his book, he has a new shorter one which is well worth reading, linked here: http://www.garytaubes.com/2010/12/inanity-of-overeating/ The appendix has specific diet recommendations. Also good are these notes: http://higher-thought.net/complete-notes-to-good-calories-bad-calories/

Idealized Bayesians don't have to be logically omniscient -- they can have a prior which assigns probability to logically impossible worlds.

14y0

I would be there, but I'm not back in NZ until 16th December! Everyone else should definitely go.

The von Neumann–Morgenstern axioms talk just about preferences over lotteries, which are simply probability distributions over outcomes. That is, you have an unstructured set O of outcomes, and you have a total preordering over Dist(O), the set of probability distributions over O. They do not talk about a utility function. This is quite elegant, because to make decisions you must have preferences over distributions over outcomes, but you don't need to assume that O has a certain structure, e.g. that of the reals.

The expected utility theorem says that preferences which satisfy the first four axioms are exactly those which can be represented by:

A <= B iff E[U;A] <= E[U;B]

for some utility function U: O -> R, where

E[U;A] = \sum_{o} A(o) U(o)

However, U is only defined up to positive affine transformation, i.e. aU+b will work equally well for any a>0. In particular, you can amplify the standard deviation as much as you like by redefining U.

Your axioms require you to pick a particular representation of U for them to make sense. How do you choose this U? Even with a mechanism for choosing U, e.g. assuming bounded nontrivial preferences and picking the unique U such that \sup_{x} U(x) = 1 and \inf_{x} U(x) = 0, this is still less elegant than talking directly about lotteries.

Can you redefine your axioms to talk only about lotteries over outcomes?
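To make the representation and affine-invariance points concrete, here's a minimal Python sketch. The outcome set, utility values, and lotteries are all made up for illustration; lotteries are just dicts mapping outcomes to probabilities, matching the definition E[U;A] = \sum_{o} A(o) U(o) above.

```python
# Lotteries as dicts mapping outcomes to probabilities;
# U as a dict mapping outcomes to real utilities (hypothetical values).

def expected_utility(lottery, U):
    """E[U;A] = sum over outcomes o of A(o) * U(o)."""
    return sum(p * U[o] for o, p in lottery.items())

U = {"apple": 0.0, "banana": 1.0, "cherry": 3.0}

A = {"apple": 0.5, "cherry": 0.5}   # 50/50 between apple and cherry
B = {"banana": 1.0}                 # banana for sure

# Preference A <= B iff E[U;A] <= E[U;B].
print(expected_utility(A, U))  # 1.5
print(expected_utility(B, U))  # 1.0

# Any positive affine transform aU + b with a > 0 represents the
# same preference ordering over lotteries:
a, b = 2.0, 7.0
V = {o: a * u + b for o, u in U.items()}
assert (expected_utility(A, U) <= expected_utility(B, U)) == \
       (expected_utility(A, V) <= expected_utility(B, V))
```

Note that while the ordering of lotteries is invariant under aU+b, the standard deviation of U under a lottery scales with a, which is why axioms phrased in terms of a particular U's spread don't pin anything down.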

To be concrete, suppose you want to maximise the average utility people have, but you also care about fairness so, all things equal, you prefer the utility to be clustered about its average. Then maybe your real utility function is not

U = (U[1] + .... + U[n])/n

but

U' = U - ((U[1]-U)^2 + .... + (U[n]-U)^2)/n

which is in some sense a mean minus a variance.
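A quick sketch of this fairness-adjusted social utility in Python (the individual utility values are made up): the mean-minus-variance formula penalizes spread, so a population with the same average but more unequal utilities scores lower.

```python
# Social utility as mean minus variance of individual utilities.

def social_utility(utilities):
    n = len(utilities)
    mean = sum(utilities) / n
    variance = sum((u - mean) ** 2 for u in utilities) / n
    return mean - variance

equal   = [2.0, 2.0, 2.0]   # same average utility...
unequal = [0.0, 2.0, 4.0]   # ...but spread out

print(social_utility(equal))    # 2.0 (variance is zero)
print(social_utility(unequal))  # 2.0 - 8/3, about -0.67
```

With equal average utility, the clustered population is preferred, as intended.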

Very nice. These notes say that every countable nonstandard model of Peano arithmetic is isomorphic, as an ordered set, to the natural numbers followed by lexicographically ordered pairs (r, z) for r a positive rational and z an integer. If I remember rightly, the ordering can be defined in terms of addition: x <= y iff exists z. x + z = y. So if we want to have a countable nonstandard model of Peano arithmetic with successor function and addition we need all these nonstandard numbers.
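The order type described above can be sketched directly: represent elements either as standard naturals or as pairs (r, z), with every standard number below every nonstandard one and the nonstandard block ordered lexicographically. (The encoding via tagged tuples is my own, purely illustrative.)

```python
from fractions import Fraction

def leq(x, y):
    """Order on the countable nonstandard model's order type.

    Elements are ('std', n) for a standard natural n, or
    ('nonstd', r, z) for r a positive rational, z an integer.
    """
    if x[0] == 'std' and y[0] == 'std':
        return x[1] <= y[1]
    if x[0] == 'std':
        return True            # every standard number precedes every nonstandard one
    if y[0] == 'std':
        return False
    return (x[1], x[2]) <= (y[1], y[2])   # lexicographic on (r, z)

# Even a huge standard natural sits below any nonstandard element:
assert leq(('std', 10**100), ('nonstd', Fraction(1, 2), -5))
# Within one copy of the integers, the z-coordinate orders elements:
assert leq(('nonstd', Fraction(1, 2), 3), ('nonstd', Fraction(1, 2), 4))
# Copies indexed by smaller rationals come entirely first:
assert leq(('nonstd', Fraction(1, 2), 99), ('nonstd', Fraction(2, 3), -99))
```

This makes visible why the nonstandard part is a dense collection of integer-copies: between any two rational indices there is another.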

It seems that if we only care about Peano arithmetic with the successor function, then the naturals plus a single copy of the integers is a model. If I were trying to prove this, I'd start from the observation that, with only the successor function in the language, an element of the copy of the integers is indistinguishable from a very large standard natural number by any first-order predicate, by standard first-order locality results.