
Missing libomp.so in docker container #5602

Open
@FirelightFlagboy

Description


LocalAI version:

Docker container: quay.io/go-skynet/local-ai:master-aio-gpu-hipblas@sha256:42c0725321d9e18c59316203cb17b3ed37e23854a6707226f1fb9ebbbb56019d

Environment, CPU architecture, OS, and Version:

OS: [email protected]
Docker compose: 2.36
CPU: AMD Ryzen 7 5800X 8-Core Processor
GPU: AMD Radeon 6900 XT

Describe the bug

When I try to generate an image with LocalAI using the model minicpm-v-2_6-mmproj-f16.gguf, the generation fails.

To Reproduce

I use the following Docker Compose file:

version: '3.6'

services:
  api:
    image: quay.io/go-skynet/local-ai:master-aio-gpu-hipblas
    ports:
      - 8080:8080
    environment:
      - MODELS_PATH=/models
      - DEBUG=true
      - GPU_TARGETS=gfx1030
      - LOCALAI_THREADS=16
      - LOCALAI_UPLOAD_LIMIT=500
    devices:
      - /dev/dri
      - /dev/kfd
    volumes:
      - localai-models:/models:cached
      - ./images/:/tmp/generated/images/

volumes:
  localai-models:

Once the container is ready, I go to text2image, select the model minicpm-v-2_6-mmproj-f16, enter a prompt, and start the generation.

Expected behavior

I would expect the image to be generated, or at least an error other than a missing shared library.

Logs

api-1  | 10:04AM INF BackendLoader starting backend=stablediffusion-ggml modelID=minicpm-v-2_6-mmproj-f16.gguf o.model=minicpm-v-2_6-mmproj-f16.gguf
api-1  | 10:04AM DBG Loading model in memory from file: /models/minicpm-v-2_6-mmproj-f16.gguf
api-1  | 10:04AM DBG Loading Model minicpm-v-2_6-mmproj-f16.gguf with gRPC (file: /models/minicpm-v-2_6-mmproj-f16.gguf) (backend: stablediffusion-ggml): {backendString:stablediffusion-ggml model:minicpm-v-2_6-mmproj-f16.gguf modelID:minicpm-v-2_6-mmproj-f16.gguf assetDir:/tmp/localai/backend_data context:{emptyCtx:{}} gRPCOptions:0xc000404dc8 externalBackends:map[bark:/build/backend/python/bark/run.sh chatterbox:/build/backend/python/chatterbox/run.sh coqui:/build/backend/python/coqui/run.sh diffusers:/build/backend/python/diffusers/run.sh exllama2:/build/backend/python/exllama2/run.sh faster-whisper:/build/backend/python/faster-whisper/run.sh kokoro:/build/backend/python/kokoro/run.sh rerankers:/build/backend/python/rerankers/run.sh transformers:/build/backend/python/transformers/run.sh vllm:/build/backend/python/vllm/run.sh] grpcAttempts:20 grpcAttemptsDelay:2 parallelRequests:false}
api-1  | 10:04AM DBG Loading GRPC Process: /tmp/localai/backend_data/backend-assets/grpc/stablediffusion-ggml
api-1  | 10:04AM DBG GRPC Service for minicpm-v-2_6-mmproj-f16.gguf will be running at: '127.0.0.1:46321'
api-1  | 10:04AM DBG GRPC Service state dir: /tmp/go-processmanager1058842882
api-1  | 10:04AM DBG GRPC Service Started
api-1  | 10:04AM DBG Wait for the service to start up
api-1  | 10:04AM DBG Options: ContextSize:1024  Seed:1236533126  NBatch:512  MMap:true  NGPULayers:99999999  Threads:16  Options:"gpu"
api-1  | 10:04AM DBG GRPC(minicpm-v-2_6-mmproj-f16.gguf-127.0.0.1:46321): stderr stablediffusion-ggml: error while loading shared libraries: libomp.so: cannot open shared object file: No such file or directory
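For anyone reproducing this, a quick sanity check from inside the container (e.g. via `docker compose exec api python3`) is to ask the dynamic loader whether it can resolve the library at all. This is a generic sketch, not part of LocalAI, and the helper name `can_load` is made up here:

```python
import ctypes
import ctypes.util

def can_load(name):
    """Return True if the dynamic loader can resolve and open lib<name>.so."""
    path = ctypes.util.find_library(name)  # e.g. "libomp.so.5", or None if not found
    if path is None:
        return False
    try:
        ctypes.CDLL(path)  # attempt to actually dlopen the library
        return True
    except OSError:
        return False

# Inside the failing container this should report False for "omp",
# matching the "cannot open shared object file" error in the log.
print("omp:", can_load("omp"))
```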

Additional context
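A possible workaround until the image is fixed, assuming the hipblas image is Debian/Ubuntu-based (ROCm base images typically are): extend it and install the OpenMP runtime yourself. The package name `libomp-dev` is an assumption and may need adjusting for the distribution release used in the image.

```Dockerfile
# Hypothetical workaround, not verified against this image:
# extend the LocalAI hipblas image and install the OpenMP runtime.
FROM quay.io/go-skynet/local-ai:master-aio-gpu-hipblas
RUN apt-get update \
 && apt-get install -y --no-install-recommends libomp-dev \
 && rm -rf /var/lib/apt/lists/*
```

With this, the compose service would use `build:` pointing at the Dockerfile instead of the upstream `image:`.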
