Athina Self-Hosting Guide

Prerequisites

Before you begin, make sure the tools used in the steps below are installed and configured:

  • git
  • Docker and docker-compose
  • AWS CLI, configured with credentials
  • Serverless Framework (for the workers deployment)
  • Node.js and npm (for the dashboard)

Core Services Setup

  • Clone the athina-deploy repository (ensure all Athina project folders are at the same directory level):
    git clone https://github.com/athina-ai/athina-deploy
    
  • Copy .example.env to .env.
  • Populate the required values. Defaults are provided for some keys and can be changed if needed; the database name can be left as-is.
  • Create a Google OAuth client for Google login (see the Google Sign-In documentation). Make sure the redirect URI points at the backend, http://<IP/DOMAIN>:9000/auth/google/redirect, and that the authorized domain is set accordingly.
  • Create a bucket for data import:
    • Use the bucket name specified in .env. Edit the CORS settings of this bucket so the frontend can upload files, replacing http://domain:3000 with your dashboard origin (one way to apply the rules via the CLI is sketched after the JSON):
      [
        {
          "AllowedHeaders": ["*"],
          "AllowedMethods": ["PUT"],
          "AllowedOrigins": ["http://domain:3000"],
          "ExposeHeaders": []
        }
      ]
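
One way to apply these CORS rules is with the AWS CLI. This is a sketch, assuming the rules above are saved to cors.json wrapped in a CORSRules key as s3api expects; the bucket name is a placeholder:

  # cors.json must look like: {"CORSRules": [ ...the rules above... ]}
  aws s3api put-bucket-cors --bucket <your-import-bucket> --cors-configuration file://cors.json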
      

Transactional Emails (Optional)

  • We use Loops for transactional emails such as user invitations, OTPs, and exported-data download links.
  • Create a free account and obtain the API key from the Loops dashboard (a test call is sketched after this list).
  • Create three transactional email templates:
    • User invitation email:
      • Data variables: inviterName and invitedEmail
    • OTP email:
      • Data variable: otp
    • Data export email:
      • Data variable: s3_link
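
Once the templates exist, you can sanity-check the setup by sending yourself a test email. This sketch uses Loops' transactional API; the transactionalId comes from the template you created, and the endpoint and payload shape should be checked against the current Loops docs:

  curl -X POST https://app.loops.so/api/v1/transactional \
    -H "Authorization: Bearer $LOOPS_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"transactionalId": "<otp-template-id>", "email": "you@example.com", "dataVariables": {"otp": "123456"}}'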

Workers Setup (For Automated Eval)

  • Clone the athina-workers repository:
    git clone https://github.com/athina-ai/athina-workers
    
  • Copy .example.env to .env.
  • Use AWS keys with full admin access (the serverless deploy needs to create several AWS resources).
  • Populate the respective values (refer to the .env file from previous steps).
  • Run the following command (this will take some time, ~5 minutes):
    serverless deploy --stage prod --config serverless-aws.yml --aws-profile default --verbose
    
  • Copy the endpoint and secret/API key from the output of the previous command.
  • Create a bucket for data export:
    • Use the export bucket name specified in .env. Do not block public access for this bucket.
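
A minimal CLI sketch for creating the export bucket and disabling the public-access block; bucket name and region are placeholders:

  aws s3api create-bucket --bucket <your-export-bucket> --region eu-central-1 \
    --create-bucket-configuration LocationConstraint=eu-central-1
  aws s3api put-public-access-block --bucket <your-export-bucket> \
    --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false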

Services Creation

  • Log in to ECR using your AWS credentials:
    aws ecr get-login-password --region eu-central-1 | docker login --username AWS --password-stdin 867387325299.dkr.ecr.eu-central-1.amazonaws.com
  • Pull the required images:
    docker-compose --profile core pull
  • Update the .env file in athina-deploy with the endpoint and API key obtained in the previous step.
  • Run the following command:
    docker-compose --profile core up -d
  • Wait for all services to start.
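
To check that everything came up, list the running services; each should show an Up status:

  docker-compose --profile core ps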

Dashboard

  • Clone the athina-dashboard repository:
    git clone https://github.com/athina-ai/athina-dashboard
  • Copy the following to .env and set them to the correct values:
    # API URL, e.g. http://<IP/DOMAIN>:9000 (assuming the backend port from the OAuth redirect URI)
    NEXT_PUBLIC_API_URL=
    # Frontend URL, e.g. http://<IP/DOMAIN>:3000
    NEXT_PUBLIC_ATHINA_BASE_URL=
  • Run the following commands:
    npm install --legacy-peer-deps
    npm run build
    bash script.sh start
  • Navigate to http://<IP/DOMAIN>:3000/login.

Updates

You can use script.sh to start, stop, pull, update, and restart the services:

bash script.sh start # Start the services
bash script.sh stop # Stop the services
bash script.sh pull # Pull the latest images
bash script.sh update # Update the services
bash script.sh restart # Restart the services
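
For example, a typical upgrade (assuming update applies the freshly pulled images) would be:

bash script.sh pull
bash script.sh update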

Post Installation

You can edit a few configuration values in the database to customize the platform.
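
To run the statements below, open a SQL shell in the Postgres container. The service, user, and database names here are assumptions; check your docker-compose.yml and .env for the actual values:

docker-compose exec db psql -U postgres -d athina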

-- Find your org_id; use it in place of 'your_org_id' in the statements below
SELECT id FROM org;

-- Set your domain so that users who sign up with it later are added to your org
UPDATE org SET domain = 'your_domain' WHERE id = 'your_org_id';

-- Set max_evals_per_month to 1000. 
UPDATE org SET max_evals_per_month = 1000 WHERE id = 'your_org_id';

-- Set the number of allowed team members in the paywall org config
UPDATE paywall_org_config SET allowed_team_members = 25 WHERE org_id = 'your_org_id';
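
To confirm the changes:

SELECT domain, max_evals_per_month FROM org WHERE id = 'your_org_id';
SELECT allowed_team_members FROM paywall_org_config WHERE org_id = 'your_org_id';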