llama.cpp
#1
by blankreg - opened
Is there a PR to add support to the official llama.cpp, so we can avoid using your forked version?
Hello @blankreg, thank you for your interest.
We will open a PR soon to integrate the EXAONE 4.5 architecture into llama.cpp, so please stay tuned!
Before opening the PR, we will also address the issue reported here: https://huggingface.co/LGAI-EXAONE/EXAONE-4.5-33B-GGUF/discussions/2
Thanks
blankreg changed discussion status to closed