clarity-backend / requirements.txt
perf: switch to transformers library and native pytorch model for optimized inference
fastapi
uvicorn
transformers
torch
accelerate
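
For context, a minimal sketch of how these dependencies are commonly wired together (not taken from this repository): FastAPI defines the HTTP API, uvicorn serves it, and transformers loads a native PyTorch model for inference, with accelerate handling device placement. The endpoint path, task, and model name below are illustrative assumptions, not the project's actual code.

```python
# Illustrative sketch only -- not part of clarity-backend.
# Shows how the listed dependencies typically fit together:
# FastAPI + uvicorn serve the API; transformers/torch run the model
# natively in PyTorch; accelerate handles device placement.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# device_map="auto" lets accelerate place the model on the available device(s).
# The task and model name are placeholders, not the ones used by this backend.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # placeholder model
    device_map="auto",
)

class PredictRequest(BaseModel):
    text: str

@app.post("/predict")  # hypothetical endpoint
def predict(req: PredictRequest):
    # Native PyTorch inference via the transformers pipeline
    return classifier(req.text)[0]

# Run locally with: uvicorn main:app --host 0.0.0.0 --port 8000
```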