gradio==6.0.2
transformers==4.57.2
spaces==0.42.1
pandas>=2.3.3
boto3>=1.42.1
pyarrow>=21.0.0
openpyxl>=3.1.5
markdown>=3.7
tabulate>=0.9.0
lxml>=5.3.0
google-genai>=1.52.0
openai>=2.8.1
html5lib>=1.1
beautifulsoup4>=4.12.3
rapidfuzz>=3.13.0
python-dotenv>=1.1.0
torch<=2.9.1 --extra-index-url https://download.pytorch.org/whl/cpu
llama-cpp-python==0.3.16 -C cmake.args="-DGGML_BLAS=ON;-DGGML_BLAS_VENDOR=OpenBLAS"
# Direct wheel links if the above doesn't work
# I have built CPU wheels for Linux, Python 3.11:
# https://github.com/seanpedrick-case/llama-cpp-python-whl-builder/releases/download/v0.1.0/llama_cpp_python-0.3.16-cp311-cp311-linux_x86_64.whl
# CPU wheels for Windows, Python 3.11 (OpenBLAS) are also available:
# https://github.com/seanpedrick-case/llama-cpp-python-whl-builder/releases/download/v0.1.0/llama_cpp_python-0.3.16-cp311-cp311-win_amd64_cpu_openblas.whl
# If the above doesn't work on Windows, see 'windows_install_llama-cpp-python.txt' for instructions on how to build from source
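# Example (my sketch, not part of the original pins): on Linux with Python 3.11, the prebuilt
# CPU wheel above can be pinned directly with a PEP 508 direct reference, so pip downloads
# the wheel instead of compiling llama-cpp-python from source. Replace the build line above
# with the line below; for a Windows environment, substitute the win_amd64 wheel URL instead.
# llama-cpp-python @ https://github.com/seanpedrick-case/llama-cpp-python-whl-builder/releases/download/v0.1.0/llama_cpp_python-0.3.16-cp311-cp311-linux_x86_64.whl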