- goal: install llama-cpp-python and llama-cpp-python[server] from pip WITH Metal GPU support, i.e. WITH the ggml-metal.metal file in the Python executable's directory
- problem: llama-cpp-python[server] FAILS
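The "ggml-metal.metal next to the Python executable" step above can be scripted; a minimal sketch, assuming the shader file simply needs to be copied from a source directory next to the interpreter (`stage_metal_shader` is a hypothetical helper, not part of llama-cpp-python):

```python
import shutil
import sys
from pathlib import Path

def stage_metal_shader(src_dir, dest_dir=None):
    """Copy ggml-metal.metal from src_dir next to the Python executable
    (or into dest_dir when given) so llama.cpp's Metal backend can find
    it at runtime. Placement convention is an assumption, see note above."""
    dest_dir = Path(dest_dir) if dest_dir is not None else Path(sys.executable).parent
    src = Path(src_dir) / "ggml-metal.metal"
    dest = dest_dir / src.name
    shutil.copy(src, dest)
    return dest
```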
##################################
# remove previous pip package
pip uninstall llama-cpp-python -y
pip list | grep llama
> [nothing found]
##################################
# fresh pip install - force reinstall
pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir
pip install 'llama-cpp-python[server]'
pip list | grep llama
> llama-cpp-python 0.1.59
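The `pip list | grep llama` check can also be done from Python with the stdlib (a sketch; `importlib.metadata` is available on Python 3.8+):

```python
from importlib.metadata import PackageNotFoundError, version

def installed_llama_cpp_version():
    """Return the installed llama-cpp-python version string,
    or None when the package is absent (as after the uninstall step)."""
    try:
        return version("llama-cpp-python")
    except PackageNotFoundError:
        return None
```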
##################################
# TEST ($MODEL = path to a local model file)
python3 -m llama_cpp.server --model $MODEL
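Once the server is up it exposes an OpenAI-compatible HTTP API; a hedged sketch of querying it with only the stdlib (the default port 8000 and the /v1/completions route are assumptions about this server version):

```python
import json
import urllib.request

def completion_request(prompt, base_url="http://localhost:8000", max_tokens=32):
    """Build a POST request for the server's OpenAI-style completions route."""
    payload = json.dumps({"prompt": prompt, "max_tokens": max_tokens}).encode()
    return urllib.request.Request(
        f"{base_url}/v1/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Against a running server:
# with urllib.request.urlopen(completion_request("Hello")) as resp:
#     print(json.load(resp)["choices"][0]["text"])
```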