A simple standalone chat bot for the Nextcloud Talk app, powered by LLM services and AI models.
```bash
git clone https://github.com/CrazyShipOne/nextcloud_talk_pybot.git
cd nextcloud_talk_pybot
cp .env.example .env
pip install -r requirements.txt
./start.sh
```
Python >= 3.11 is required. It is easy to manage the environment with miniconda, venv, or any other favorite tool.
Make sure to edit `.env` with your credentials and tokens; refer to *.env / Docker Environment Variable* below.
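A minimal `.env` sketch, assuming only the variables documented in *.env / Docker Environment Variable* below; every value here is a placeholder to replace with your own:

```bash
# Nextcloud connection (required)
NC_BASE_URL=https://www.mynextcloud.com:8080
NC_USERNAME=talk_bot
NC_PASSWORD=app-password-here
# API key for whichever LLM plugin you plan to use, e.g. OpenAI
OPENAI_API_KEY=sk-placeholder
```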
```bash
git clone https://github.com/CrazyShipOne/nextcloud_talk_pybot.git
cd nextcloud_talk_pybot
docker build -t crazyshipone/nextcloud_talk_pybot .
# Add environment variables with -e ENV_NAME=ENV_VALUE,
# e.g. -e NC_BASE_URL=https://www.mynextcloud.com:8080
docker run -d \
  --name nextcloud_talk_pybot \
  --restart=always \
  crazyshipone/nextcloud_talk_pybot
```
Make sure to add the environment variables; refer to *.env / Docker Environment Variable* below.
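For example, a fuller `docker run` sketch with the required Nextcloud variables filled in (all values are placeholders):

```bash
docker run -d \
  --name nextcloud_talk_pybot \
  --restart=always \
  -e NC_BASE_URL=https://www.mynextcloud.com:8080 \
  -e NC_USERNAME=talk_bot \
  -e NC_PASSWORD=app-password-here \
  -e OPENAI_API_KEY=sk-placeholder \
  crazyshipone/nextcloud_talk_pybot
```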
- `NC_BASE_URL`: Required. Base URL of the Nextcloud endpoint, for example `https://www.mynextcloud.com:8080`.
- `NC_USERNAME`: Required. Username of the created bot account.
- `NC_PASSWORD`: Required. Either the user password or an app password of the created bot account.
- `POLL_INTERVAL`: Optional. Message polling interval in seconds. Default: 5.
- `ONLY_NEW`: Optional. Set to `True` to poll only messages sent after the bot is started, or `False` to poll all unread messages. Default: `True`.
- `MAX_MESSAGE`: Optional. Maximum number of unread messages polled from one chat. Default: 10.
- `LOG_LEVEL`: Optional. Output logging level. Default: `Info`.
- `MAX_CHAT_HISTORY`: Optional. Maximum number of chat history messages stored; set to 0 to store none. Caution: setting this to a large number will cost a lot of tokens! Default: 0.
- `HISTORY_STORAGE`: Optional. Storage backend used to save chat history. Default: `memory`.
  - `memory`: save in memory
  - `redis`: save in Redis
- `REDIS_HOST`: Required if `HISTORY_STORAGE` is `redis`. Redis host.
- `REDIS_PORT`: Optional. Redis port. Default: 6379.
- `REDIS_PASS`: Optional. Redis password.
- `REDIS_DB`: Optional. Redis DB number. Default: 0.
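For example, to keep a short conversation history in Redis, a hedged `.env` sketch (host and password are placeholders):

```bash
MAX_CHAT_HISTORY=5
HISTORY_STORAGE=redis
REDIS_HOST=redis.example.internal
REDIS_PORT=6379
REDIS_PASS=change-me
REDIS_DB=0
```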
Set these variables if the corresponding plugin will be used.
- `OPENAI_API_KEY`: OpenAI API key.
- `GOOGLE_API_KEY`: Google project's API key.
- `AZURE_OPENAI_API_KEY`: Azure OpenAI API key.
- `AZURE_OPENAI_ENDPOINT`: Endpoint of the Azure project.
- `OPENAI_API_VERSION`: API version of the Azure OpenAI service.
- `AZURE_OPENAI_CHAT_DEPLOYMENT_NAME`: Deployment name of the model in the Azure project.
- `HFACE_TOEKN`: HuggingFace token.
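For instance, to enable the Azure OpenAI plugin, a hedged configuration sketch (key, endpoint, API version, and deployment name are placeholders for your own Azure resource):

```bash
AZURE_OPENAI_API_KEY=your-azure-openai-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
OPENAI_API_VERSION=2024-02-01
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME=your-gpt-deployment
```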
Set the environment variables the same way as when using the boto3 library. The minimum set is:
- `AWS_ACCESS_KEY_ID`: Access key.
- `AWS_SECRET_ACCESS_KEY`: Secret key.
- `AWS_DEFAULT_REGION`: Region of the model.
Also use `!bedrock:set_model_id` in the chat if you are not using the default Claude 3 Haiku model.
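A hedged sketch for the Bedrock plugin; the credentials are placeholders, and the model-switch command assumes `!bedrock:set_model_id` accepts a full Bedrock model ID (an assumption, not confirmed here):

```bash
AWS_ACCESS_KEY_ID=AKIA-placeholder
AWS_SECRET_ACCESS_KEY=placeholder-secret
AWS_DEFAULT_REGION=us-east-1
```

Then, in the Talk chat, something like `!bedrock:set_model_id anthropic.claude-3-sonnet-20240229-v1:0` to switch away from the default Claude 3 Haiku.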
- Register a new account for the chat bot in Nextcloud
- Start the agent with the new account's credentials
- Open a new chat and invite the chat bot
- Type `![plugin]:[function] message` in the chat, for example `!openai:chat Who are you?`. Any message that does not start with `!` will be answered with a help message.
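An illustrative exchange (the bot's replies are paraphrased, not an exact transcript):

```text
You: !openai:chat Who are you?
Bot: I'm a chat bot running inside Nextcloud Talk...
You: hello
Bot: (help message listing the available plugins and functions)
```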
- Add LLM plugins:
  - OpenAI ✅
  - Azure OpenAI ✅
  - Google Gemini ✅
  - Claude ✅ (via AWS Bedrock)
  - Models on HuggingFace ✅
  - Anthropic ✅
  - Models running on a local machine
- Add Tool plugins:
  - Search results from Google
  - Weather
- Add memory for conversation ✅