BitonicAI Service Quick Start

This guide will help you set up and run the BitonicAI FastAPI service.

Prerequisites

  • Python 3.12+
  • PostgreSQL database
  • S3-compatible storage (Garage) or AWS S3
  • Logto instance for authentication (or configure your own OAuth2 provider)

Installation

  1. Clone the repository (if not already done)

  2. Install dependencies (from workspace root):

uv sync --package bitonicai
  3. Set up environment variables:

Create a .env file:

# Database
DATABASE_URL=postgresql+psycopg://user:password@localhost:5432/bitonicai

# Logto OAuth2
LOGTO_URI=https://your-logto-instance.com
LOGTO_CLIENT_ID=your_client_id
LOGTO_CLIENT_SECRET=your_client_secret

# S3/Garage Storage
GARAGE_ENDPOINT=https://garage.example.com
GARAGE_ACCESS_KEY=your_access_key
GARAGE_SECRET_KEY=your_secret_key
GARAGE_BUCKET_NAME=resumes
GARAGE_REGION=us-east-1

# reCAPTCHA (for signup)
CAPTCHA_SECRET_KEY=your_recaptcha_secret

# OpenAI (for agents)
OPENAI_API_KEY=your_openai_api_key
  4. Set up the database:

# Run migrations
uv run alembic upgrade head

# Or create tables manually (not recommended for production)
# python -c "from bitonicai.database import init_engine, create_all_tables; import asyncio; init_engine(); asyncio.run(create_all_tables())"
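
With the .env file in place, you can sanity-check that every variable the service expects is actually set before starting it. This is a minimal stdlib sketch; the variable names are taken from the .env template above, and `missing_vars` is a hypothetical helper for illustration, not part of the service.

```python
# Variables the service expects, per the .env template above.
REQUIRED_VARS = [
    "DATABASE_URL",
    "LOGTO_URI", "LOGTO_CLIENT_ID", "LOGTO_CLIENT_SECRET",
    "GARAGE_ENDPOINT", "GARAGE_ACCESS_KEY", "GARAGE_SECRET_KEY",
    "GARAGE_BUCKET_NAME", "GARAGE_REGION",
    "CAPTCHA_SECRET_KEY", "OPENAI_API_KEY",
]

def missing_vars(env):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Pass os.environ in practice; an explicit dict is shown here.
example_env = {name: "set" for name in REQUIRED_VARS}
print(missing_vars(example_env))  # [] when everything is set
```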

Running the Service

Development Mode

uv run --project apps/bitonicai uvicorn bitonicai.main:app --reload

The API will be available at http://localhost:8000 (interactive docs at http://localhost:8000/docs, FastAPI's default).

Production Mode

uv run --project apps/bitonicai gunicorn bitonicai.main:app -w 4 -k uvicorn.workers.UvicornWorker

Using Docker

docker build -t bitonicai .
docker run --env-file .env -p 8000:8000 bitonicai

API Usage

Authentication

All protected routes require an OAuth2 access token:

# Get access token from Logto (or your OAuth2 provider)
TOKEN="your_access_token"

# Make authenticated request
curl -H "Authorization: Bearer $TOKEN" \
     http://localhost:8000/api/vb1/resume/list
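
The same request can be built from Python with only the standard library; `resume_list_request` is a hypothetical helper for illustration, and the token is a placeholder.

```python
import urllib.request

def resume_list_request(token, base_url="http://localhost:8000"):
    """Build an authenticated GET request for the resume list endpoint."""
    return urllib.request.Request(
        f"{base_url}/api/vb1/resume/list",
        headers={"Authorization": f"Bearer {token}"},
    )

req = resume_list_request("your_access_token")
# Sending it (requires the service to be running):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```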

Upload Resume

curl -X POST \
     -H "Authorization: Bearer $TOKEN" \
     -F "file=@resume.pdf" \
     http://localhost:8000/api/vb1/resume/upload

Response:

{
  "executive_summary": "...",
  "skills": ["Python", "FastAPI", ...],
  "work_experience": [...],
  "education": [...],
  ...
}
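
The response is plain JSON, so consuming it is straightforward. A sketch with placeholder values shaped like the example above:

```python
import json

# Placeholder payload shaped like the upload response above.
raw = """
{
  "executive_summary": "Backend engineer focused on Python services.",
  "skills": ["Python", "FastAPI"],
  "work_experience": [],
  "education": []
}
"""

parsed = json.loads(raw)
skills = parsed["skills"]
print(skills)  # ['Python', 'FastAPI']
```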

List Resumes

curl -H "Authorization: Bearer $TOKEN" \
     http://localhost:8000/api/vb1/resume/list

Response:

{
  "resumes": [
    "resumes/user123/resume1.pdf",
    "resumes/user123/resume2.pdf"
  ]
}
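
Note that the endpoint returns full object keys, not bare filenames; the last path segment recovers the uploaded filename:

```python
# Keys as returned by the list endpoint above.
keys = [
    "resumes/user123/resume1.pdf",
    "resumes/user123/resume2.pdf",
]

# The last "/"-separated segment is the original filename.
filenames = [key.rsplit("/", 1)[-1] for key in keys]
print(filenames)  # ['resume1.pdf', 'resume2.pdf']
```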

Delete All Resumes

curl -X DELETE \
     -H "Authorization: Bearer $TOKEN" \
     http://localhost:8000/api/vb1/resume/all

OAuth2 Scopes

The service uses scope-based authorization:

  • upload:resume - Required for uploading resumes
  • read:resume - Required for listing resumes
  • delete:resume - Required for deleting resumes

Configure these scopes in your Logto instance (or OAuth2 provider).
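
When requesting a token, include the scopes your client needs as a space-separated scope value. A sketch of a client-credentials request body (the token endpoint path and grant type depend on how your Logto/OAuth2 provider is configured; `token_request_body` is illustrative, not part of the service):

```python
from urllib.parse import urlencode

# Scopes defined by the service (see the list above).
SCOPES = ["upload:resume", "read:resume", "delete:resume"]

def token_request_body(client_id, client_secret, scopes):
    """Encode an OAuth2 client-credentials token request body."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": " ".join(scopes),  # scopes are space-separated per OAuth2
    })

body = token_request_body("your_client_id", "your_client_secret", SCOPES)
```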

Database Schema

The service uses SQLModel entities:

  • Resume - Resume metadata
  • Session - Agent conversation sessions
  • Message - Agent conversation messages
  • Skill - Skills extracted from resumes
  • SkillOwnership - Links skills to resumes
  • BetaSignUp - Beta signup records

See migrations in migrations/versions/ for schema details.

Troubleshooting

Database Connection Issues

Ensure PostgreSQL is running and DATABASE_URL is correct:

psql -h localhost -U postgres -d bitonicai
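
If psql connects but the service cannot, check that DATABASE_URL breaks down into the host, port, and database you expect; the stdlib urlsplit handles the SQLAlchemy-style scheme:

```python
from urllib.parse import urlsplit

# The example DATABASE_URL from the .env template above.
parts = urlsplit("postgresql+psycopg://user:password@localhost:5432/bitonicai")

print(parts.hostname)          # localhost
print(parts.port)              # 5432
print(parts.path.lstrip("/"))  # bitonicai (database name)
```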

S3 Storage Issues

Verify the S3/Garage credentials and confirm the bucket exists:

from bitonicai.internal.tools.s3_tool import S3Tool

tool = S3Tool()
# Exercise a lightweight call (e.g. a bucket listing) to confirm the
# credentials and bucket; the exact method names depend on S3Tool's API.

Authentication Issues

Check Logto configuration and ensure:

  • LOGTO_URI points to your Logto instance
  • LOGTO_CLIENT_ID and LOGTO_CLIENT_SECRET are correct
  • OAuth2 scopes are configured in Logto

Next Steps