Crossposting from your Substack:
Re: “Llama is this so easy”: this repo, https://github.com/ZrrSkywalker/LLaMA-Adapter, suggests that you may in fact need *very* few additional parameters, and only a small amount of fine-tuning of a lightweight network bolted on top of an LLM, to get pretty good instruction following. I haven't experimented with it myself, though.
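For illustration only (I haven't verified this matches the repo's actual architecture): the general "bolt-on adapter" idea can be sketched as a small bottleneck module added to a frozen model's hidden states, with a zero-initialized gate so training starts exactly from the base model's behavior. All names here are hypothetical:

```python
import torch
import torch.nn as nn

class ZeroGatedAdapter(nn.Module):
    """A tiny bottleneck adapter added to a frozen model's hidden states.

    The scalar gate is initialized to zero, so at the start of fine-tuning
    the adapter contributes nothing and the frozen model's behavior is
    preserved; only these few parameters would be trained.
    """

    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)   # project down
        self.up = nn.Linear(bottleneck, dim)     # project back up
        self.gate = nn.Parameter(torch.zeros(1)) # zero-init gating scalar

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        return hidden + self.gate * self.up(torch.relu(self.down(hidden)))

adapter = ZeroGatedAdapter(dim=64)
x = torch.randn(2, 10, 64)  # (batch, sequence, hidden dim)

# At initialization the gate is zero, so the adapter is an identity map.
assert torch.allclose(adapter(x), x)

# Only the adapter's own parameters would be trainable.
trainable = sum(p.numel() for p in adapter.parameters())
print(trainable)
```

The zero-gate trick is what makes this kind of bolt-on fine-tuning stable: the model starts from the pretrained behavior rather than from random noise injected into its activations.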