All of hatta_afiq's Comments + Replies

text-davinci-002, updated with a link to GitHub

janus (1y):
text-davinci-002 is often extremely confident about its "predictions" for no apparent good reason (e.g. when generating "open-ended" text it is ~99% confident about the exact phrasing). This is almost certainly due to the RLHF "Instruct" tuning text-davinci-002 has been subjected to. To whatever extent the probabilities output by models trained with pure SSL can be assigned an epistemic interpretation (the model's credence for the next token in a hypothetical training sample), that interpretation no longer holds for models modified by RLHF.
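One way to check the behaviour described above is to look at the per-token log probabilities directly. A minimal sketch, assuming the legacy (pre-1.0) openai Python client, its old Completion API, and access to text-davinci-002 (since deprecated); the prompt is just a placeholder:

```python
# Minimal sketch (assumes the legacy openai-python < 1.0 Completion API and
# access to text-davinci-002, which has since been deprecated).
import math
import openai

resp = openai.Completion.create(
    model="text-davinci-002",
    prompt="Once upon a time",   # placeholder "open-ended" prompt
    max_tokens=20,
    temperature=0,
    logprobs=5,                  # also return top-5 alternatives per step
)

lp = resp["choices"][0]["logprobs"]
for token, logprob in zip(lp["tokens"], lp["token_logprobs"]):
    if logprob is None:          # the API can omit a logprob for the first token
        continue
    # Probability the model assigned to the token it actually emitted;
    # on Instruct-tuned models this is often ~0.99 even for stylistic
    # choices where a base model would spread probability more widely.
    print(f"{token!r}: {math.exp(logprob):.3f}")
```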

Sorry, I might be missing something here, but:

  • Isn't the price of energy typically measured in kilowatt-hours? Energy = Power × Time.
  • If a space solar system can output more energy because it stays on for longer, wouldn't the cost per watt-hour naturally decrease? The price of a watt-hour, I imagine, is total cost divided by total energy, so if our launch cost is a fixed cost, delivering more energy means the cost per watt-hour falls. (A rough numerical sketch follows below.)
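A minimal sketch of that arithmetic, assuming a fixed total cost and purely made-up placeholder numbers:

```python
# Sketch of the point above (all numbers are illustrative placeholders):
# with a fixed total cost, more delivered energy means a lower cost per kWh.

def cost_per_kwh(total_cost: float, avg_power_kw: float, hours: float) -> float:
    """Cost divided by energy delivered, where Energy = Power x Time."""
    return total_cost / (avg_power_kw * hours)

fixed_cost = 1.0  # same (fixed) launch + hardware cost in both cases
ground = cost_per_kwh(fixed_cost, avg_power_kw=1.0, hours=1_000)  # intermittent sunlight
orbit = cost_per_kwh(fixed_cost, avg_power_kw=1.0, hours=4_000)   # lit almost continuously

print(orbit < ground)  # True: 4x the energy at the same cost -> 1/4 the cost per kWh
```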
Caridorc Tergilti (1y):
  • Very good point: I think the website I linked to refers to peak power, so the kilowatt-hours would be lower. (Not sure on this, sorry.)
  • If the panels in orbit last twice as long and produce double the power, that is only a factor of 4 in total energy, while the system is about 300 times more expensive. (But again, there are transmission losses that I did not consider; a rough sketch of the resulting ratio follows.)
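Putting those rough factors together (using only the ~4x energy and ~300x cost figures quoted above; transmission losses still ignored):

```python
# Rough comparison using only the factors quoted in the reply above.
energy_factor = 4    # ~2x lifetime * ~2x power for the orbital panels
cost_factor = 300    # rough cost multiplier for the orbital system

relative_cost_per_kwh = cost_factor / energy_factor
print(f"orbital cost per kWh is ~{relative_cost_per_kwh:.0f}x the ground system")
# ~75x more expensive per kWh, before counting transmission losses.
```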

Should the role of a distiller include spotting mistakes? I assume that you'd only want distillers to get to work once you have some confidence that the original claims are correct. 

Thanks, Derek. I'm writing a blog post on results from small samples - may I cite your answer?

Derek M. Jones (1y):
I'm always happy to be cited :-) Sample size is one major issue; the other is who/what gets to be in the sample. Psychology has its issues with using WEIRD subjects. Software engineering has issues with the use of student subjects, because most of them have relatively little experience. It all revolves around convenience sampling.