I'm afraid that linking something I've been working on for a while as my first post might not be well received, but I think you might find it interesting nonetheless.
I'm making a text interface to chat with local and remote models. It's 100% Python, built on tkinter/tcl, which should be bundled with a normal Python installation.
I made it because I couldn't find an interface that felt right to me (though I haven't tried them all). I like adding "power user" features whenever I think of one.
Repo: https://github.com/Merkoba/Meltdown
Some features:
- Load llama.cpp models (only gguf tested for now).
- Use your OpenAI API key with a specific OpenAI model.
- Model configuration.
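To give a feel for the local-model side: GGUF files start with a fixed 4-byte magic, so an interface can sanity-check a file before handing it to llama.cpp. This is just an illustrative sketch, not Meltdown's actual code; the function name is made up for the example.

```python
GGUF_MAGIC = b"GGUF"  # every GGUF file begins with these four bytes

def looks_like_gguf(path: str) -> bool:
    """Return True if the file starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == GGUF_MAGIC
```

A check like this lets the UI reject a mislabeled file with a friendly error instead of letting the loader crash on it.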
As mostly an LLM user, the most I can do is back up a few big models in case they become unavailable in some doomsday scenario. Being the guy who has "the AI" in some zombie apocalypse could be useful.