llama.cpp: update to b5415 #28474


Closed
wants to merge 1 commit

Conversation

oytech
Contributor

@oytech oytech commented May 18, 2025

Description

Type(s)
  • bugfix
  • enhancement
  • security fix
Tested on

macOS 14.7.5 23H527 arm64
Command Line Tools 16.2.0.0.1.1733547573

Verification

Have you

  • followed our Commit Message Guidelines?
  • squashed and minimized your commits?
  • checked that there aren't other open pull requests for the same change?
  • referenced existing tickets on Trac with full URL in commit message?
  • checked your Portfile with port lint?
  • tried existing tests with sudo port test?
  • tried a full install with sudo port -vs install?
  • tested basic functionality of all binary files? (llama-bench, llama-cli, llama-server, llama-simple, llama-simple-chat)
  • checked that the Portfile's most important variants haven't been broken? (+blas+openmp)
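The checklist above corresponds roughly to the following command sequence. This is a sketch of a typical MacPorts PR verification workflow, not the project's official script; the `--version` smoke test on the installed binaries assumes those llama.cpp tools accept that flag.

```shell
# Lint the Portfile (add --nitpick for stricter checks).
port lint llama.cpp

# Run the port's test suite, if defined.
sudo port test llama.cpp

# Full verbose build-from-source install.
sudo port -vs install llama.cpp

# Re-check the key variants named in the checklist.
sudo port -vs install llama.cpp +blas+openmp

# Basic smoke test of the installed binaries
# (assumes each tool supports --version).
for bin in llama-bench llama-cli llama-server llama-simple llama-simple-chat; do
    "$bin" --version || echo "$bin failed"
done
```

Running `port lint` before pushing catches most Portfile style issues that would otherwise come up in review.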

@macportsbot

Notifying maintainers:
@i0ntempest for port llama.cpp.

@i0ntempest
Member

I'll be updating with a new variant after I sort everything out.

@i0ntempest i0ntempest closed this May 19, 2025
@oytech
Contributor Author

oytech commented May 19, 2025

@i0ntempest The current llama.cpp version in the port (b4534) is four months old, and llama.cpp is a fast-paced project that adds features every week. Newer versions include, for example, multimodal support (ggml-org/llama.cpp#12898). Why not just update the version in the meantime?

Labels
maintainer: open (affects an openmaintainer port), type: update