# Meta OpenEnv Hackathon — AI Agent Submission

## Overview
This project is a production-ready AI agent built for the Meta OpenEnv Hackathon. The agent competes in a 3-round coding challenge against an OpenAI GPT-4o-mini judge, receiving Python coding tasks and submitting solutions for evaluation.
## Agent: `inference.py`
The core agent logic lives entirely in `inference.py`. It runs a 3-round loop that:

- Calls the local OpenEnv server's `/reset` endpoint to fetch a new coding task
- Sends the task to an LLM to generate a Python solution
- Submits the solution to the `/step` endpoint and retrieves the score
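The loop above can be sketched as a small driver function; `reset_env`, `generate_solution`, and `submit_solution` are hypothetical stand-ins for the HTTP and LLM calls described in the rest of this README, injected here so the control flow is visible on its own.

```python
def run_rounds(reset_env, generate_solution, submit_solution, rounds=3):
    """Run the 3-round challenge loop and collect the score from each round.

    reset_env:         () -> str          # POST /reset, returns a new coding task
    generate_solution: (str) -> str       # LLM call, returns Python source code
    submit_solution:   (str) -> float     # POST /step, returns the judge's score
    """
    scores = []
    for _ in range(rounds):
        task = reset_env()                      # fetch a fresh coding task
        solution = generate_solution(task)      # ask the LLM for a solution
        scores.append(submit_solution(solution))  # submit and record the score
    return scores
```

Injecting the three calls as parameters keeps the loop testable without a running server.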
## API Payload Compatibility
The agent uses the exact payload format required by the Meta OpenEnv REST API:

```json
{"action": {"answer": "<agent_answer>"}}
```
This nested structure matches the server's strict Pydantic model validation, preventing the 422 Unprocessable Entity errors that incorrect top-level keys or flat payloads would trigger.
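A minimal sketch of building that payload; the helper name is illustrative, not part of the submission:

```python
import json


def build_step_payload(answer: str) -> dict:
    """Wrap the agent's answer in the nested structure the server's
    Pydantic model expects: {"action": {"answer": ...}}."""
    return {"action": {"answer": answer}}


# The resulting dict serializes to the exact shape shown above.
print(json.dumps(build_step_payload("def solve(): ...")))
```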
## Model: Qwen/Qwen2.5-72B-Instruct
The agent uses Qwen/Qwen2.5-72B-Instruct via the Hugging Face Router API (https://router.huggingface.co/v1), accessed through an OpenAI-compatible client.
Key reasons for this choice:
- The HF Router endpoint is the current active inference gateway; the legacy `api-inference.huggingface.co/v1` endpoint has been deprecated and returns `410 Gone` errors
- Qwen 2.5 72B delivers strong Python code generation, outperforming smaller models on structured coding tasks
- The OpenAI-compatible interface lets the agent use the standard `openai` client, with stable connections and clean error handling
## Task Parsing
The agent safely parses tasks from multiple possible server response shapes:

- `response["observation"]["echoed_message"]`
- `response["observation"]["task_prompt"]`
- `response["echoed_message"]`
- Raw JSON fallback via `json.dumps(resp)`
This makes the agent resilient across OpenEnv server versions and configurations.
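That fallback chain can be sketched as a single function; the name `extract_task` is illustrative:

```python
import json


def extract_task(resp: dict) -> str:
    """Try each known response shape in order, falling back to raw JSON."""
    obs = resp.get("observation") or {}
    for candidate in (
        obs.get("echoed_message"),   # shape 1: nested echoed_message
        obs.get("task_prompt"),      # shape 2: nested task_prompt
        resp.get("echoed_message"),  # shape 3: top-level echoed_message
    ):
        if isinstance(candidate, str) and candidate:
            return candidate
    return json.dumps(resp)          # last resort: serialize the whole response
```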
## Setup
1. Install dependencies:

   ```shell
   pip install -r requirements.txt
   ```

2. Set environment variables (PowerShell):

   ```powershell
   $env:HF_TOKEN = "hf_your_token_here"
   $env:OPENAI_API_KEY = "sk-your_key_here"
   ```

3. Start the OpenEnv server, then run the agent:

   ```shell
   python inference.py
   ```
## Requirements
| Package | Version |
|---|---|
| requests | 2.31.0 |
| openai | 1.14.0 |
| pydantic | 2.6.4 |