Eval bug: SYCL branch produces mul_mat bug when trying to run. #13674

Open
@HumerousGorgon

Description

Name and Version

llama-server, latest version.

Operating systems

Linux

GGML backends

SYCL

Hardware

3 x Arc A770 w/ 11600KF CPU.

Models

Qwen3-30B-A3B-UD-Q4_K_XL.gguf

Problem description & steps to reproduce

Built the latest version of llama.cpp; since the recent changes were introduced, mul_mat errors occur when SYCL_CACHE_PERSISTENT=1 is set as an environment variable. Running llama-server produces the error in the log output below.
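A minimal sketch of the reproduction setup, based on the description above. The model path and server flags are assumptions for illustration, not taken from the report; SYCL_CACHE_PERSISTENT=1 is the oneAPI persistent kernel-cache switch that the report says triggers the failure.

```shell
# Set the environment variable the report identifies as the trigger.
# SYCL_CACHE_PERSISTENT=1 enables the oneAPI persistent kernel cache.
export SYCL_CACHE_PERSISTENT=1
echo "SYCL_CACHE_PERSISTENT=$SYCL_CACHE_PERSISTENT"

# Then start the server as usual (path and flags are assumptions):
# ./build/bin/llama-server -m Qwen3-30B-A3B-UD-Q4_K_XL.gguf
```

Unsetting the variable (`unset SYCL_CACHE_PERSISTENT`) before launching would be one way to confirm whether the persistent cache is the trigger.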

First Bad Commit

Relevant log output

Exception caught at file:/home/llm/devtools/llama.cpp/ggml/src/ggml-sycl/ggml-sycl.cpp, line:2545, func:operator()
SYCL error: CHECK_TRY_ERROR(op(ctx, src0, src1, dst, src0_dd_i, src1_ddf_i, src1_ddq_i, dst_dd_i, dev[i].row_low, dev[i].row_high, src1_ncols, src1_padded_col_size, stream)): Exception caught in this line of code.
  in function ggml_sycl_op_mul_mat at /home/llm/devtools/llama.cpp/ggml/src/ggml-sycl/ggml-sycl.cpp:2545
/home/llm/devtools/llama.cpp/ggml/src/ggml-sycl/../ggml-sycl/common.hpp:115: SYCL error
/home/llm/devtools/llama.cpp/build/bin/libggml-base.so(+0x257b8)[0x73ced3d307b8]
/home/llm/devtools/llama.cpp/build/bin/libggml-base.so(+0x11478)[0x73ced3d1c478]
/home/llm/devtools/llama.cpp/build/bin/libggml-base.so(ggml_abort+0xd0)[0x73ced3d1b580]
/home/llm/devtools/llama.cpp/build/bin/libggml-sycl.so(+0x36128)[0x73ced3e36128]
/home/llm/devtools/llama.cpp/build/bin/libggml-sycl.so(+0x514f5)[0x73ced3e514f5]
/home/llm/devtools/llama.cpp/build/bin/libggml-sycl.so(+0x472f6)[0x73ced3e472f6]
/home/llm/devtools/llama.cpp/build/bin/libggml-sycl.so(+0x4786c)[0x73ced3e4786c]
/home/llm/devtools/llama.cpp/build/bin/libggml-sycl.so(+0x457a1)[0x73ced3e457a1]
/home/llm/devtools/llama.cpp/build/bin/libggml-sycl.so(+0x43efe)[0x73ced3e43efe]
/home/llm/devtools/llama.cpp/build/bin/libggml-base.so(ggml_backend_sched_graph_compute_async+0x5e8)[0x73ced3d392f8]
/home/llm/devtools/llama.cpp/build/bin/libllama.so(_ZN13llama_context13graph_computeEP11ggml_cgraphb+0x91)[0x73ced43623e1]
/home/llm/devtools/llama.cpp/build/bin/libllama.so(_ZN13llama_context6decodeER11llama_batch+0x6cd)[0x73ced4362add]
/home/llm/devtools/llama.cpp/build/bin/libllama.so(llama_decode+0x14)[0x73ced4366fe4]
./llama-server[0x5d044c]
./llama-server[0x44aa77]
./llama-server[0x4204c5]
/lib/x86_64-linux-gnu/libc.so.6(+0x29d90)[0x73ced3629d90]
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0x80)[0x73ced3629e40]
./llama-server[0x41cf35]
