I maintain a reading list on Goodreads.
I am also on GitHub.
I think, rather than "category theory is about paths in graphs", it would be more accurate to say that category theory is about paths in graphs up to equivalence, and in particular about properties of paths which depend on their relations to other paths (more than on their relationship to the vertices).* If your problem is most usefully conceptualized as a question about the paths themselves (finding the shortest path between two vertices, counting paths, or something in that genre), you should definitely look to the graph theory literature instead.
* I realize this is totally incomprehensible, and doesn't make the case that there are any interesting problems like this. I'm not trying to argue that category theory is useful, just clarifying that your intuition that it's not useful for problems that look like these examples is right.
As an algebraic abstractologist, let me just say this is an absolutely great post. My comments:
Category theorists don't distinguish between a category with two objects and an edge between them, and a category with two objects and two identified edges between them (the latter object doesn't really even make sense in the usual account). In general, the extra equivalence relation that you have to carry around makes certain things more complicated in this version.
I do tend to agree with you that thinking of categories as objects, edges and an equivalence relation on paths is a more intuitive perspective, but let me defend the traditional presentation. By far the most essential/prototypical examples are the categories of sets and functions, or types and functions. Here, it's more natural to speak of functions from x to y, than to speak of "composable sequences of functions beginning at x and ending at y, up to the equivalence relation which identifies two sequences if they have the same composite".
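To make the prototypical example concrete, here is a minimal Python sketch (names and helpers are my own, purely illustrative): "paths" are composable sequences of functions, and two paths are identified exactly when they have the same composite.

```python
# Illustrative sketch: in the category of sets/types and functions,
# a "path" is a composable sequence of functions, and composition of
# morphisms is ordinary function composition.

def compose(*fs):
    """Compose functions left-to-right: compose(f, g)(x) == g(f(x))."""
    def composite(x):
        for f in fs:
            x = f(x)
        return x
    return composite

inc = lambda n: n + 1
double = lambda n: 2 * n

# Two different "paths"...
path_a = compose(inc, double)       # two-step path: x -> 2 * (x + 1)
path_b = compose(lambda n: 2 * n + 2)  # one-step path with the same composite

# ...are identified precisely when their composites are equal as functions
# (checked here on sample inputs):
assert all(path_a(x) == path_b(x) for x in range(100))
```

This is the sense in which "functions from x to y" already packages up the equivalence classes of composable sequences.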
Again, I absolutely love this post. I am frankly a bit shocked that nobody seems to have written an introduction using this language - I think everyone is too enamored with sets as an example.
This is a reasonable way to resolve the paradox, but note that you're required to fix the max number of people ahead of time - and it can't change as you receive evidence (it must be a maximum across all possible worlds, and evidence just restricts the set of possible worlds). This essentially resolves Pascal's mugging by fixing some large number X and assigning probability 0 to claims about more than X people.
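A toy numerical sketch (my own construction, with made-up numbers) of how fixing X ahead of time blocks the mugging:

```python
# Toy sketch: fix a maximum number of people X across all possible worlds,
# and assign probability 0 to any claim about more than X people.

X = 10**12  # the assumed fixed maximum, chosen ahead of time

def credence(claimed_people):
    # Claims exceeding the fixed bound get probability 0.
    return 0.0 if claimed_people > X else 1e-9  # toy prior for a mugger's claim

def expected_utility(claimed_people):
    return credence(claimed_people) * min(claimed_people, X)

# Inflating the claim past X no longer increases expected utility:
assert expected_utility(10**15) == 0.0
# Claims within the bound have bounded expected utility:
assert expected_utility(10**6) <= 1e-9 * X
```

The key feature, as noted above, is that X cannot be updated in response to evidence; evidence only narrows the set of possible worlds, all of which respect the bound.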
Just to sketch out the contradiction between unbounded utilities and gambles involving infinitely many outcomes a bit more explicitly.
If your utility function is unbounded, we can consider the following wager:
You win 2 utils with probability 1/2, 4 utils with probability 1/4, and so on.
The expected utility of this wager is infinite.
(If there are no outcomes with utility exactly 2, 4, etc, we can award more - this is possible because utility is unbounded).
Now consider these wagers on a (fair) coinflip:

A: if heads, you receive the wager above; if tails, you receive 0 utils.
B: if heads, you receive the wager above; if tails, you receive 100000 utils.

(0 and 100000 can be any two non-equal numbers.)
Both of these wagers have infinite expected utility, so we must be indifferent between them.

But since they agree on heads, and B is strictly preferred to A on tails, we must strictly prefer B (since tails occurs with positive probability) - a contradiction.
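The structure of the argument can be checked numerically. Here's a toy Python sketch (my own construction, not from the post) that truncates the wager at n outcomes: both truncated expected utilities grow without bound, yet B is strictly better at every truncation.

```python
# Truncated version of the wager: win 2^k utils with probability 1/2^k
# for k = 1..n on heads; a fixed payoff on tails.

def truncated_ev(n, tails_payoff):
    # Expected utility of the truncated wager, received on heads (prob 1/2).
    heads_ev = sum((1 / 2**k) * 2**k for k in range(1, n + 1))  # = n
    # Tails (prob 1/2): the fixed payoff.
    return 0.5 * heads_ev + 0.5 * tails_payoff

for n in (10, 100, 1000):
    ev_a = truncated_ev(n, tails_payoff=0)       # wager A
    ev_b = truncated_ev(n, tails_payoff=100000)  # wager B
    # Both diverge as n grows, but B strictly dominates at every n:
    assert ev_b > ev_a
```

In the limit both expectations are infinite, which is exactly why "compare expected utilities" forces indifference while dominance forces strict preference.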
Information about people behaving erratically/violently is better at grabbing your brain's "important" sensor? (Noting that I had exactly the same instinctual reaction). This seems to be roughly what you'd expect from naive evopsych (which doesn't mean it's a good explanation, of course)
CFAR must have a lot of information about the efficacy of various rationality techniques and training methods (compared to any other org, at least). Is this information, or recommendations based on it, available somewhere? Say, as a list of techniques currently taught at CFAR - which are presumably the best ones in this sense. Or does one have to attend a workshop to find out?
There's some recent work in the statistics literature exploring similar ideas. I don't know if you're aware of this, or if it's really relevant to what you're doing (I haven't thought a lot about the comparisons yet), but here are some papers.
A thought about productivity systems/workflow optimization:
One principle of good design is "make the thing you want people to do, the easy thing to do". However, this idea is susceptible to the following form of Goodhart: often a lot of the value in some desirable action comes from the things that make it difficult.
For instance, sometimes I decide to migrate some notes from one note-taking system to another.
This is usually extremely useful, because it forces me to review the notes and think about how they relate to each other and to the new system. If I make this easier for myself by writing a script to do the work (as I have sometimes done), this important value is lost.
Or think about spaced repetition cards: You can save a ton of time by reusing cards made by other people covering the same material - but the mental work of breaking the material down into chunks that can go into the spaced-repetition system, which is usually very important, is lost.
This is a great list.
The main criticism I have is that this list overlaps way too much with my own internal list of high-quality sites, making it not very useful to me personally.
The example of associativity seems a little strange; I'm not sure what's going on there.
What are the three functions that are being composed?