**Description**

**LocalAI version:**
3.1.1

**Environment, CPU architecture, OS, and Version:**
Docker

**Describe the bug**
`ERR failed to install model "bark-small_weights-f16.bin" from gallery error="no model found with name \"bark-small_weights-f16.bin\""` even though the model is already installed.

I was chatting with deepseek when it suddenly stopped responding. The log shows:

`ERR failed to install model "deepseek-r1-distill-llama-8b-Q4_K_M.gguf" from gallery error="no model found with name \"deepseek-r1-distill-llama-8b-Q4_K_M.gguf\""`

Again, this model is already installed.
**To Reproduce**
Not sure how to reproduce; it occurred in the middle of a normal chat session.

**Expected behavior**
The model answers the question instead of the request failing.
**Logs**

```
9:36AM DBG Stream request received
9:36AM INF Success ip= latency=393.902938ms method=POST status=200 url=/v1/chat/completions
9:36AM DBG Sending chunk: {"created":1751189799,"object":"chat.completion.chunk","id":"3aff22668","model":"deepseek-r1-distill-llama-8b","choices":[{"index":0,"finish_reason":"","delta":{"role":"assistant","content":""}}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}
9:36AM ERR failed to install model "deepseek-r1-distill-llama-8b-Q4_K_M.gguf" from gallery error="no model found with name \"deepseek-r1-distill-llama-8b-Q4_K_M.gguf\""
9:43AM ERR failed to install model "bark-small_weights-f16.bin" from gallery error="no model found with name \"bark-small_weights-f16.bin\""
9:43AM DBG Stopping all backends except 'bark-cpp-small'
9:43AM DBG Deleting process deepseek-r1-distill-llama-8b
9:43AM INF BackendLoader starting backend=bark-cpp modelID=bark-cpp-small o.model=bark-small_weights-f16.bin
9:43AM DBG Loading model in memory from file: /models/bark-small_weights-f16.bin
9:43AM DBG Loading Model bark-cpp-small with gRPC (file: /models/bark-small_weights-f16.bin) (backend: bark-cpp): {backendString:bark-cpp model:bark-small_weights-f16.bin modelID:bark-cpp-small assetDir:/tmp/localai/backend_data context:{emptyCtx:{}} gRPCOptions:0xc0be85e588 externalBackends:map[bark:/build/backend/python/bark/run.sh coqui:/build/backend/python/coqui/run.sh diffusers:/build/backend/python/diffusers/run.sh exllama2:/build/backend/python/exllama2/run.sh faster-whisper:/build/backend/python/faster-whisper/run.sh kokoro:/build/backend/python/kokoro/run.sh rerankers:/build/backend/python/rerankers/run.sh transformers:/build/backend/python/transformers/run.sh vllm:/build/backend/python/vllm/run.sh] grpcAttempts:20 grpcAttemptsDelay:2 parallelRequests:false}
9:43AM DBG Loading external backend: /backends/bark-cpp/run.sh
9:43AM DBG external backend is file: &{name:run.sh size:191 mode:448 modTime:{wall:389370631 ext:63885975675 loc:0x468dc20} sys:{Dev:66306 Ino:5126488 Nlink:1 Mode:33216 Uid:0 Gid:0 X__pad0:0 Rdev:0 Size:191 Blksize:4096 Blocks:8 Atim:{Sec:1751189621 Nsec:120067606} Mtim:{Sec:1750378875 Nsec:389370631} Ctim:{Sec:1751189621 Nsec:118067467} X__unused:[0 0 0]}}
9:43AM DBG Loading GRPC Process: /backends/bark-cpp/run.sh
9:43AM DBG GRPC Service for bark-cpp-small will be running at: '127.0.0.1:35967'
9:43AM DBG GRPC Service state dir: /tmp/go-processmanager2135792008
9:43AM DBG GRPC Service Started
9:43AM DBG Wait for the service to start up
9:43AM DBG Options: ContextSize:1024 Seed:1117498056 NBatch:512 MMap:true NGPULayers:9999999 Threads:10
9:43AM DBG GRPC(bark-cpp-small-127.0.0.1:35967): stdout Initializing libbackend for bark-cpp
9:43AM DBG GRPC(bark-cpp-small-127.0.0.1:35967): stdout virtualenv activated
9:43AM DBG GRPC(bark-cpp-small-127.0.0.1:35967): stdout activated virtualenv has been ensured
9:43AM DBG GRPC(bark-cpp-small-127.0.0.1:35967): stderr Traceback (most recent call last):
9:43AM DBG GRPC(bark-cpp-small-127.0.0.1:35967): stderr File "/backends/bark-cpp/backend.py", line 11, in <module>
9:43AM DBG GRPC(bark-cpp-small-127.0.0.1:35967): stderr from scipy.io.wavfile import write as write_wav
9:43AM DBG GRPC(bark-cpp-small-127.0.0.1:35967): stderr ModuleNotFoundError: No module named 'scipy'
```
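The final stderr lines point at a second, separate problem: the bark-cpp backend's virtualenv is missing the `scipy` module, so the backend would fail to start even without the switching issue. As a rough workaround sketch (the container name and venv path are assumptions, not taken from the log; check where your bark-cpp backend keeps its virtualenv):

```shell
# Assumption: the container is named "local-ai" and the bark-cpp backend's
# virtualenv lives under /backends/bark-cpp/venv (the log only confirms
# /backends/bark-cpp/run.sh and backend.py).
# Install the missing scipy module directly into that venv:
docker exec -it local-ai /backends/bark-cpp/venv/bin/pip install scipy
```

If the venv is elsewhere, `run.sh` (191 bytes per the log) should reveal which interpreter it activates.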
**Additional context**
It looks like LocalAI is trying to start another backend (bark-cpp) and, in doing so, stopping the one that is in use (the deepseek process is deleted at 9:43AM). Restarting the container fixes the issue, but that is only a temporary workaround. The single-active-backend option may be the relevant setting here.
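To make the one-backend-at-a-time behaviour explicit rather than implicit, something like the following might help. This is a sketch, not a verified fix: `LOCALAI_SINGLE_ACTIVE_BACKEND` is my reading of LocalAI's single-active-backend option, so the exact variable name and image tag should be checked against the docs for 3.1.1.

```shell
# Sketch: run LocalAI with the single-active-backend option enabled, so only
# one backend is loaded at a time and switches are deliberate.
# LOCALAI_SINGLE_ACTIVE_BACKEND and the image tag are assumptions to verify.
docker run -d --name local-ai \
  -p 8080:8080 \
  -v "$PWD/models:/models" \
  -e LOCALAI_SINGLE_ACTIVE_BACKEND=true \
  -e DEBUG=true \
  localai/localai:latest
```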