* Add a minimal implementation of `LlamafileProvider`, a new `ChatModelProvider` for llamafiles. It extends `BaseOpenAIProvider` and overrides only the methods needed to get the system working at a basic level (see the sketch below).
* Add support for `mistral-7b-instruct-v0.2`. This is currently the only model supported by `LlamafileProvider`, because it is the only model that has been tested.
* Add instructions for using AutoGPT with llamafile to the docs at `autogpt/setup/index.md`.
* Add a helper script at `scripts/llamafile/serve.py` to get it running quickly.

---------

Co-authored-by: Reinier van der Leer <pwuts@agpt.co>
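Because a llamafile serves an OpenAI-compatible HTTP API, a provider built on `BaseOpenAIProvider` can mostly reuse the existing OpenAI plumbing and simply point the client at the local server. The sketch below is illustrative only and is not the PR's implementation: it uses the standard `openai` Python client rather than AutoGPT's internal provider classes, and the base URL, port, and model identifier are assumptions (llamafile's built-in server commonly listens on port 8080).

```python
# Minimal sketch (not the PR's code): talk to a locally running llamafile
# through its OpenAI-compatible endpoint using the standard `openai` client.
# Assumptions: the llamafile server is already running on localhost:8080
# (a common default) and is serving mistral-7b-instruct-v0.2.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed local llamafile endpoint
    api_key="sk-no-key-required",         # llamafile ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="mistral-7b-instruct-v0.2",     # assumed model identifier
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

In this setup, only the pieces that differ from the hosted OpenAI API (endpoint discovery, model metadata, token counting, and similar details) would need to be overridden in the subclass, which matches the "override only what is necessary" approach described above.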