* Split Docker build into separate CI job
* Update docker.yml
* Add build as part of CI process, bump Node to v20.x
* Use slim variant for Docker, CI fixes
* Config must be added after installation
* Use Python 3.9, updates to CI
* Change min required version of Python
* integrate langchain
* get rid of mongodb
* use llama-cpp-python bindings
* fixed most chat endpoints except posting questions
* Working post endpoint!
* everything works except streaming
* current state
* streaming as is
* got rid of langchain wrapper for calling the LLM, went back to using bindings directly
* working streaming
* sort chats by time
* cleaned up styling and added back loading indicator
* Add persistence support to Redis
* fixed tooltips
* fixed default prompts
* added link to API docs (closes "How to use the api" #155)
* add script to migrate weights and update dockerfile to use the latest version
* bump version of llama.cpp to latest
* fixed conversion bug when downloading models
* begin work on dev environment
* more work on dev image
* working dev + prod images with SPA front-end
* reworked dockerfile
* make CI point to the right action
* Improvements to github actions (#79)
* Change username to repo owner username
* Add fix for logging into ghcr (#81)
* Update bug_report.yml
* added dev instructions to readme
* reduced number of steps in dockerfile
---------
Co-authored-by: Juan Calderon-Perez <835733+gaby@users.noreply.github.com>
* initial work on linting & templates
* moved everything into a nice dockerfile
* move everything into a single dockerfile
* update sample .env file
* got rid of .env file
* rename db volume to avoid confusion and conflicts with previous version
* added bug report template