There's a more elaborate walkthrough of the last argument at https://web.stanford.edu/~peastman/statmech/thermodynamics.html#the-second-law-of-thermodynamics
It's part of a statistical mechanics textbook, so a couple of words of jargon may not make sense, but this section is highly readable even without those definitions. To me it's been the most satisfying resolution to this question.
Nice video reviewing this paper at https://youtu.be/-buULmf7dec
In my experience it's reasonably easy to listen to such videos while doing chores etc.
https://youtu.be/QMqPAM_knrE is a video by one of the authors presenting on this research
The problem definition talks about clusters in the space of books, but to me it's cleaner to think of clusters as regions of token-space, and of token-sequences as trajectories through that space.
GPT is a generative model, so it can provide a probability distribution over the next token given some previous tokens. I assume that the basic model of a cluster can also provide a probability distribution over the next token.
With these two distribution generators in hand, you could generate books by multiplying the two distributions (and renormalizing) each time you generate a new token. This will bias the story towards the desired cluster while still letting GPT guide the overall dynamics. Some hyperparameter tuning to weight the two contributions will be necessary.
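The combination step could be sketched like this. This is a minimal illustration under my own assumptions: both models are assumed to expose their next-token distributions as arrays over a shared vocabulary, and the function name and the weighting scheme (a product-of-experts mixture with a cluster-weight exponent `alpha`) are hypothetical choices, not anything specified in the problem.

```python
import numpy as np

def combined_next_token_dist(p_lm, p_cluster, alpha=0.5):
    """Combine the LM's and the cluster model's next-token distributions.

    Weighted product p_lm * p_cluster**alpha, renormalized; alpha is the
    hyperparameter controlling how strongly the cluster biases generation.
    """
    combined = p_lm * np.power(p_cluster, alpha)
    total = combined.sum()
    if total == 0:
        # Supports don't overlap at all; fall back to the LM distribution.
        return p_lm
    return combined / total

# Toy example with a 4-token vocabulary: the cluster model pulls probability
# mass towards tokens 2 and 3, away from the LM's preferred token 0.
p_lm = np.array([0.4, 0.3, 0.2, 0.1])
p_cluster = np.array([0.1, 0.1, 0.4, 0.4])
p = combined_next_token_dist(p_lm, p_cluster, alpha=1.0)
```

You would then sample the next token from `p` instead of the raw LM distribution, repeating at every step of generation.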
You could then fine-tune GPT using the generated books to break the dependency on the original model.
Seems like a fun project to try with GPT-3, though even GPT-2 would probably give some interesting results.
Ok, I misread one of gwern's replies. My original intent was to extract money from the fact that gwern gave (from my vantage point) too high a probability of this being a scam.
Under my original version of the terms ($100 from gwern against my $1000), if his P(scam) was .1, his expected value would be .1 × $1000 − .9 × $100 = +$10.
Under my original version of the terms, if his P(scam) was .05, his expected value would be .05 × $1000 − .95 × $100 = −$45.
In the second case, he would of course not want to take that bet. I'd thus like to amend my suggested conditions to have gwern only put $52 at stake against my $1000. For any P(scam) > .05 this is a positive expected value, so I would expect it to have been satisfactory to gwern [19 August 2012 01:53:58AM].
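The expected-value arithmetic behind the amendment can be checked in a few lines. This is just a sketch using the stakes mentioned in the thread ($100 and $52 against my $1000); `ev` is a hypothetical helper name, computed from gwern's side of the bet (he wins $1000 if it is a scam, loses his stake otherwise).

```python
def ev(p_scam, gwern_stake, my_stake=1000):
    """gwern's expected value: win my_stake with probability p_scam,
    lose gwern_stake with probability 1 - p_scam."""
    return p_scam * my_stake - (1 - p_scam) * gwern_stake

# Original terms, $100 at stake:
ev(0.10, 100)  # +10: positive, worth taking at P(scam) = .1
ev(0.05, 100)  # -45: negative, he'd decline at P(scam) = .05

# Amended terms, $52 at stake: break-even at P(scam) = 52/1052, about 0.0494,
# so any P(scam) > .05 gives him a positive expected value.
ev(0.05, 52)   # +0.6
```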
Well I still accept, since now it's a much better deal for me!
Done. $100 from you vs $1000 from me. If you lose, you donate it to her fund. If I lose, I can send you the money or do with it what you wish.
There are a lot of things I'd like to say, but you have put forth a prediction
It's probably a scam
I would like to take up a bet with you on this ending up being a scam. This can be arbitrated by some prominent member of CI, Alcor, or Rudi Hoffman. I would win if an arbiter decides that the person who posted on Reddit was in fact diagnosed with cancer essentially as stated in her Reddit posts, and is in fact gathering money for her own cryonics arrangements. If none of the proposed arbiters can vouch for the above within one month (through September 18), then you will win the bet.
What odds would you like on this, and what's the maximum amount of money you'd put on the line?
I have donated $1000, and I really do believe that our community can get her fully funded. I understand how CI has to be cautious about these sorts of things, but I've seen enough evidence to be more than convinced.
I understand getting enough sleep, but what for example is your version of "eating right"?