Again, the releases don't have the libraries. #11091
Comments
Please test the binaries on Colab :D if it works there, it works everywhere.
Versions 40xx work.. I tried a few. Don't know when it "EFFED UP" :D
I set up a script to automatically step back through versions until I found one that works. Although, yes... this is currently an issue with the latest versions. To be used with: https://gist.github.com/dagbs/1258a52addeb61cbf26e24f75b2ad7d9

```bash
#!/bin/bash

# Starting version and minimum version
start_version=4416
min_version=4000
current_version=$start_version

while [ $current_version -ge $min_version ]; do
    echo "Attempting to build and test version b$current_version"

    # Download the specified version
    ./download_latest.sh "b$current_version"

    # Build and check if llama-cli works without errors
    if ./build/bin/llama-cli --help 2>&1 | grep -q 'libllama.so'; then
        echo "Error: libllama.so not found in version b$current_version."
    else
        echo "Version b$current_version built successfully with no libllama.so error."
        break # Exit the loop once a working build is found
    fi

    # Decrease the version number and wait 5 seconds before the next attempt
    current_version=$((current_version - 1))
    sleep 5
done

if [ $current_version -lt $min_version ]; then
    echo "No suitable version found. All versions from b$start_version to b$min_version had issues."
fi
```
@0wwafa Do you mean that when you download llama.cpp it does not include the shared libraries? If so, where are you downloading from? Or are you building, installing, and then running the application, and the shared libraries are missing at that point? If this has happened before, it is helpful to include a link to a previous issue or PR that resolved the problem you are seeing. Folks use llama.cpp in various ways, so that extra context is necessary for us to help resolve the issue accordingly!
I ran into the same issue, and I remember it also happening on other platforms such as x86_64. Here is the relevant output from my last compile attempt on the arm64 platform (the same happens on x86_64):
```
./build/bin/llama-quantize: error while loading shared libraries: libllama.so: cannot open shared object file: No such file or directory
```
It already happened in the past...
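A quick way to confirm which shared libraries a binary fails to resolve is `ldd`. A minimal diagnostic sketch; pass it the path to the affected binary (e.g. `./build/bin/llama-quantize`), the system binary below is only used as a demonstration:

```shell
#!/bin/sh
# Diagnostic sketch: print any shared libraries the given binary
# cannot resolve. Unresolved entries show up as "... => not found"
# in ldd output; this prints just the library names.
missing_libs() {
    ldd "$1" 2>/dev/null | awk '/not found/ { print $1 }'
}

# A fully resolvable system binary prints nothing:
missing_libs /bin/ls
```

If `libllama.so` appears in the output, the loader search path is the problem rather than the build itself.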