Very interesting list, thanks Louie!
I just randomly clicked on a few of the online course links, and there is at least one issue: the "Probability and Computing" part points to the "Analytic Combinatorics, Part I" Coursera course, which is not about probability at all. The MIT and CMU links for this part seem wrong too. Someone should carefully go through all the links and fix them.
The funny thing is that the rationalist Clippy would endorse this article. (He would probably put more emphasis on clippyflurphsness rather than this unclipperiffic notion of "justness", though. :))
You just say: 'For every relation R that works exactly like addition, the following statement S is true about that relation.' It would look like, '∀ relations R: ((∀x∀y∀z: R(x, 0, x) ∧ (R(x, y, z)→R(x, Sy, Sz))) → S)', where S says whatever you meant to say about +, using the token R.
I would change the statement to be something other than 'S', say 'Q', as 'S' is already used for the successor function.
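Just to illustrate, the quantification over relations can be sketched in Lean (names are mine, and I picked R 1 1 2, i.e. "1 + 1 = 2", as a concrete stand-in for Q):

```lean
-- Any relation R that satisfies the two addition-like axioms
-- (base case and successor step) must relate 1, 1, and 2.
theorem addlike_example
    (R : Nat → Nat → Nat → Prop)
    (base : ∀ x, R x 0 x)                              -- "x + 0 = x"
    (step : ∀ x y z, R x y z → R x (y + 1) (z + 1)) :  -- successor step
    R 1 1 2 :=
  step 1 0 1 (base 1)  -- from R 1 0 1, one step gives R 1 1 2
```

This is exactly the shape of the quoted formula, with the statement about + expressed using the token R.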
In Hungary this (model theory and co.) is part of the standard curriculum for the Mathematics BSc. Or at least it was in my time.
(Audiatur et altera pars is the impressive Latin name of the principle that you should clearly state your premises.)
That's not what I thought it meant. My understanding was that it's something like "all parties should be heard", and it's more of a legal thing...
I'm really itching to try this out! ;)
(Consider this a word of encouragement. I'll think about my predictions and will post them here if I come up with anything useful. But in the meantime I wanted to say at least this much.)
Who is the intended audience for this?
If someone has a good grasp of Bayes, it's not that informative. (Though I liked the original idea and the story. :)) But if one doesn't already understand the math behind this, then it's just a bunch of magic numbers, I'm afraid. The second half of it, for sure.
The link to "Hamlet" is broken. Not that it's hard to find, but you might still want to fix it.
Wow, this is amazing! Both the idea and your presentation of it.
Very insightful and thought-provoking. And my mind was completely blown by the fact that you have converted. It so doesn't fit into my models that I am quite confused. I would be very curious about what's behind it and how you would answer your own questions (before and after). But I guess you wrote about it a lot, so I'll just go and read it.
And yes, this definitely deserves a discussion post!