---
title: Final Assignment - Agent Course by Hugging Face
emoji: 🕵🏻‍♂️
colorFrom: indigo
colorTo: indigo
sdk: docker
app_port: 7860
pinned: false
hf_oauth: true
hf_oauth_expiration_minutes: 480
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# Agent Course Final Assignment by Hugging Face
This project contains a Gradio application for evaluating a LangChain agent on the GAIA (General AI Assistants) benchmark. The agent answers questions using a variety of tools, and its performance is scored by an external API.
## Features
- Gradio Interface: An easy-to-use web interface for running the evaluation and viewing the results.
- LangChain Agent: A sophisticated agent built with LangChain, capable of using tools to answer questions.
- Multi-Tool Integration: The agent can interact with multiple tools, such as a browser (via Playwright) and a YouTube transcript fetcher.
- Docker Support: The entire application can be built and run using Docker and Docker Compose, ensuring a consistent environment.
- Observability: Integrated with Langfuse for tracing and monitoring the agent's behavior.
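The tool-using agent described above can be pictured as a dispatch loop: the agent names a tool, the tool runs, and the observation is returned. The sketch below is illustrative only and is not the project's actual agent code; the `run_agent` helper and the tool bodies are hypothetical stand-ins for the LangChain- and Playwright-backed implementations.

```python
# Minimal sketch of tool dispatch: an "agent" answers a step by routing
# it to one of several named tools. Illustrative only; the real project
# builds this with LangChain agents, a Playwright browser tool, and a
# YouTube transcript fetcher.
from typing import Callable, Dict

def browse(url: str) -> str:
    # Placeholder for a Playwright-backed browser tool.
    return f"page content of {url}"

def youtube_transcript(video_id: str) -> str:
    # Placeholder for a YouTube transcript fetcher.
    return f"transcript of {video_id}"

TOOLS: Dict[str, Callable[[str], str]] = {
    "browser": browse,
    "youtube_transcript": youtube_transcript,
}

def run_agent(tool_name: str, argument: str) -> str:
    """Dispatch a single tool call and return its observation."""
    if tool_name not in TOOLS:
        raise ValueError(f"unknown tool: {tool_name}")
    return TOOLS[tool_name](argument)
```

For example, `run_agent("browser", "https://example.com")` returns the browser tool's observation; in the real agent, an LLM decides which tool to call and with what argument.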
## Installation
1. Clone the repository:

   ```bash
   git clone https://huggingface.co/spaces/hf-agent-course/final-assignment-template
   cd final-assignment-template
   ```

2. Create a virtual environment and install dependencies:

   ```bash
   python -m venv .venv
   source .venv/bin/activate  # On Windows, use `.venv\Scripts\activate`
   pip install -r requirements.txt
   ```

3. Install Playwright browsers:

   ```bash
   npx playwright install
   ```

4. Set up environment variables:

   Create a `.env` file in the root of the project and add the following variables:

   ```
   HF_TOKEN=<your-hugging-face-token>
   GOOGLE_API_KEY=<your-google-api-key>
   LANGFUSE_PUBLIC_KEY=<your-langfuse-public-key>
   LANGFUSE_SECRET_KEY=<your-langfuse-secret-key>
   ```
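Before launching the app, it can help to confirm that the variables from `.env` are actually set in your environment. The stdlib-only check below is not part of the project; the variable names come from the list above, and the script itself is an assumption about how you might sanity-check your setup.

```python
# check_env.py -- verify that the required environment variables are
# present and non-empty. Stdlib only; run after loading your .env file.
import os
import sys

REQUIRED = [
    "HF_TOKEN",
    "GOOGLE_API_KEY",
    "LANGFUSE_PUBLIC_KEY",
    "LANGFUSE_SECRET_KEY",
]

def missing_vars(env=os.environ) -> list:
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        print("Missing environment variables:", ", ".join(missing))
        sys.exit(1)
    print("All required environment variables are set.")
```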
## Usage
To run the Gradio application locally, use the following command:
```bash
python app.py
```
This will start a local web server, and you can access the application in your browser at http://127.0.0.1:7860.
## Docker
This project includes a `Dockerfile` and a `docker-compose.yml` for running the application in a containerized environment.
### Build and Run with Docker Compose
To build and run the application using Docker Compose, use the following command:
```bash
docker-compose up --build
```
This will build the Docker image and start the application. You can access the Gradio interface at http://localhost:7860.
### Development Environment
A `Dockerfile.dev` is also provided for development purposes. To build and run the development environment, use the following command:
```bash
docker-compose -f docker-compose.yml -f docker-compose.dev.yml up --build
```
This will mount the local code into the container, allowing for live reloading of changes.
## Contributing
Contributions are welcome! Please feel free to submit a pull request or open an issue.