I think part of your point, translated into local language, is "GPTs are Tool AIs, and Tool AI doesn't necessarily become agentic"
IMO those issues are all very minor, even when summed.
Is that relevant? Imagine that we were discussing the replacement of a ramp with stairs. This has a very minor effect on my experience -- is that enough to conclude the change was benign?
This is an example where the true distribution of future prices is bimodal (with the average between the modes). If all you can do is buy or sell stock, then you actually have to disagree with the market about the distribution to make money.
Without having information about the probability of default, there might still be something to do based on the vol curve.
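A toy calculation (the numbers are illustrative, not from the comment) makes the bimodal point concrete: if the two modes straddle the current price and the market's mean already sits at that price, a plain long or short stock position has zero expected P&L, while an instrument that pays on the spread between the modes (e.g. a straddle) does not.

```python
# Illustrative setup: spot price 100, true distribution bimodal --
# the price ends at 50 or 150 with equal probability, so the mean (100)
# agrees with the market price.
outcomes = [50, 150]
probs = [0.5, 0.5]
spot = 100

# Expected P&L of buying one share: expected final price minus spot.
# Agreeing with the market about the mean => zero edge from stock alone.
stock_pnl = sum(p * s for p, s in zip(probs, outcomes)) - spot  # -> 0.0

# A straddle struck at 100 pays |S - 100|; it captures the distance
# between the modes even though the mean matches the spot.
straddle_payoff = sum(p * abs(s - spot) for p, s in zip(probs, outcomes))  # -> 50.0

print(stock_pnl, straddle_payoff)
```

So knowing the true (bimodal) shape is only monetizable through instruments whose payoff depends on more than the mean, which is why the vol curve is where the remaining opportunity would show up.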
it would be 3 lines
~~all of the information is in lines 2 and 3, so you'd get all of the info on the first screen if you nix line 1.~~
edit: not sure what I was thinking -- thanks, Slider
For those who care, it's open source, and you can host your own server from a Docker image (in addition to the normal "just click buttons on our website and pay us some money to host a server for you" option).
I think that to get the type of the agent, you need to apply a fixpoint operator. This also happens inside the proof of Löb for constructing a certain self-referential sentence.
(As a breadcrumb, I've heard that this is related to the Y combinator.)
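A minimal sketch of that breadcrumb: the Z combinator (the call-by-value variant of the Y combinator, needed in an eagerly evaluated language like Python) builds a self-referential function with no explicit recursion, which mirrors how the proof of Löb's theorem constructs its self-referential sentence via a fixpoint. The factorial example is mine, purely for illustration.

```python
# Z combinator: a strict fixed-point operator. Given f, it returns a
# function g with g = f(g) -- self-reference manufactured from the
# outside, with no function naming itself.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Factorial defined through the fixpoint rather than by recursion:
# `rec` is handed to the body by Z, playing the role of the function itself.
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))

print(fact(5))  # -> 120
```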