# firstAI/requirements.txt
gradio>=5.41.0
huggingface_hub>=0.34.0
transformers>=4.36.0
torch>=2.0.0
Pillow>=10.0.0
accelerate>=0.24.0
requests>=2.31.0
# NOTE: GGUF models such as 'gemma-3n-E4B-it-GGUF' cannot be pip-installed; they must be downloaded manually or pulled from the Hugging Face Hub at runtime (see the example sketch at the end of this file).
fastapi>=0.100.0
uvicorn[standard]>=0.23.0
pydantic>=2.0.0
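#
# Example (hedged sketch, not part of the pinned dependencies): with
# huggingface_hub installed, a GGUF file can be fetched at runtime rather than
# pip-installed. The repo_id and filename below are assumptions for
# illustration only, not the project's actual model source:
#
#   from huggingface_hub import hf_hub_download
#
#   model_path = hf_hub_download(
#       repo_id="unsloth/gemma-3n-E4B-it-GGUF",   # hypothetical repo id
#       filename="gemma-3n-E4B-it-Q4_K_M.gguf",   # hypothetical quant filename
#   )
#   # model_path now points to the locally cached .gguf file for loading.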