* integrate langchain, get rid of mongodb, use llama-cpp-python bindings
* fixed most chat endpoints except posting questions
* working POST endpoint
* everything works except streaming
* current state
* streaming as is
* got rid of the langchain wrapper for calling the LLM, went back to using the bindings directly (see the sketch below)
* working streaming
* sort chats by time
* cleaned up styling and added back the loading indicator
* add persistence support to Redis
* fixed tooltips
* fixed default prompts
* added link to the API docs (closes #155: "How to use the API")
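For reference, calling the llama-cpp-python bindings directly and streaming tokens back looks roughly like the following. This is a minimal sketch, not the project's actual endpoint code: the model path, prompt, and generation parameters are placeholders.

from llama_cpp import Llama

# Load the model once; the path is a placeholder, not the real weights location.
llm = Llama(model_path="/usr/src/app/weights/ggml-model.bin")

# stream=True yields completion chunks as tokens are generated,
# instead of blocking until the full answer is ready.
for chunk in llm.create_completion("### Instruction: Hello\n### Response:", max_tokens=256, stream=True):
    print(chunk["choices"][0]["text"], end="", flush=True)

In the API itself, the same iterator would typically be wrapped in a streaming response (e.g. FastAPI's StreamingResponse) rather than printed.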
#!/bin/bash

# Install the API package in editable mode
pip install -e ./api

# Install the llama.cpp Python bindings used to run the model
pip install llama-cpp-python

# Start Redis in the background
redis-server /etc/redis/redis.conf &

# Start the web server
cd web && npm run dev -- --host 0.0.0.0 --port 8008 &

# Start the API
cd api && uvicorn src.serge.main:api_app --reload --host 0.0.0.0 --port 9124 --root-path /api/ &

# Wait for any process to exit
wait -n

# Exit with status of process that exited first
exit $?
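The redis-server line starts Redis from /etc/redis/redis.conf, which is also where on-disk persistence would be enabled so chat history survives restarts. As an illustration of the "sort chats by time" and Redis persistence changes listed above, here is a minimal redis-py sketch; the key names and schema are assumptions, not the project's actual layout.

import json
import time
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def save_chat(chat_id, messages):
    # Hypothetical schema: chat body stored as JSON, plus a sorted set
    # indexing chat ids by creation time.
    r.set(f"chat:{chat_id}", json.dumps(messages))
    r.zadd("chats:by_time", {chat_id: time.time()})

def list_chats():
    # Newest first, which is what "sort chats by time" implies for the UI.
    return r.zrevrange("chats:by_time", 0, -1)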