These are very impressive! It looks like it gets the concepts, but lacks global coherence.
Could anyone comment on how far we are from results of quality comparable to the training set? Can we expect better results just by scaling up the generator or CLIP?
Using CLIP is a pretty weird way to go. It's like using a CNN classifier to generate images: it can be done, but as with a dog walking on its hind legs, we're surprised to see it work at all.
If you think about how a contrastive loss works, it's perhaps less surprising why CLIP-guided images look the way they do and do things like repeat an object many times: if you have a prompt like "Mickey Mouse", what could be even more Mickey-Mouse-y than Mickey Mouse tiled a dozen times? That surely maximizes its embedding's encoding of 'Mickey Mouse' and its distance from non-Di...
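The failure mode falls out of the objective: guided generation just climbs the cosine similarity between the image embedding and the text embedding, and nothing in that score rewards global coherence. A toy sketch of the objective, with random vectors standing in for real CLIP embeddings (the `encode_*` references in the comments are hypothetical, not actual CLIP calls):

```python
import numpy as np

# Toy sketch of the CLIP-guidance objective. The embeddings here are
# random stand-ins for illustration only, not real CLIP outputs.
rng = np.random.default_rng(0)

def cosine(a, b):
    """Cosine similarity, the score CLIP guidance maximizes."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

text_emb = rng.normal(size=512)   # pretend: encode_text("Mickey Mouse")
image_emb = rng.normal(size=512)  # pretend: encode_image(current generation)

initial_score = cosine(image_emb, text_emb)

# Gradient ascent on the image embedding itself (in a real system the
# gradient flows back into the generator's latents). The update direction
# that raises the score is simply the text-embedding direction, so the
# score is pushed toward 1.0 with no penalty for incoherent images --
# "more Mickey Mouse everywhere" is exactly what this objective wants.
step = 0.5 * text_emb / np.linalg.norm(text_emb)
for _ in range(200):
    image_emb = image_emb + step

final_score = cosine(image_emb, text_emb)
print(initial_score, final_score)
```

Running this, the similarity climbs from roughly zero toward 1.0, which is the whole optimization target: tiling the subject a dozen times is a perfectly good solution as far as this score is concerned.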
There is a paper describing the architecture: https://arxiv.org/abs/1812.08989
It looks like the system comprises many independent skills plus an algorithm that picks which skill to use at each stage of the conversation. Some of the skills use neural nets (a CNN for parsing images, an RNN for completing sentences), but the models look relatively small.
Could you elaborate on step 4? How can you buy 10 shares of each bucket if you only have $10? Isn't the total cost 14.80 * 10 = $148?
The best way to get leverage
The second link broke; I believe it moved here: https://www.optionseducation.org/referencelibrary/white-papers/page-assets/listed-options-box-spread-strategies-for-borrowing-or-lending-cash.aspx
There is a very extensive discussion of a UPRO/TMF strategy here. One thing to note: taxes severely reduce the returns of strategies that require frequent rebalancing.
Have you rechecked the data recently?
I see, thanks for clarifying!
Why do you think "lesser" AI being transformative is more worrying than AGI? This scenario seems similar to past technological progress.
Here is the reverse: https://beta.openai.com/?app=content-consumption&example=5_2_0