RedAI Operator Guide

Course teams, API keys, and service operations for instructors & admins.

Overview

LiteLLM fronts two local Qwen2.5 models (text + vision) and exposes an OpenAI-compatible API at http://femianjc-gpu.csi.miamioh.edu:9000/v1. Operators use the admin UI (/ui) or the REST endpoints below to provision course teams, invite students, and monitor usage.
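Before provisioning anything, it is worth confirming the proxy is reachable. A minimal check, assuming your key can see at least one model (the master key is used here only as an example), is to list the models the OpenAI-compatible API exposes:

curl -s http://femianjc-gpu.csi.miamioh.edu:9000/v1/models \
  -H 'Authorization: Bearer sk-classroom-master'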

1. Create a Course Team

Teams scope budgets and model access. Create one through the UI or via the API:

curl -X POST http://femianjc-gpu.csi.miamioh.edu:9000/team/new \
  -H 'Authorization: Bearer sk-classroom-master' \
  -H 'Content-Type: application/json' \
  -d '{
        "team_alias": "cse434-fall25",
        "models": ["classroom-text", "classroom-vision"],
        "max_budget": 25,
        "tpm_limit": 6000,
        "rpm_limit": 30
      }'

After creation, configure spend caps and allowed models from the LiteLLM dashboard.
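The same settings can also be adjusted over the API. A sketch using the /team/update admin route (an assumption about this LiteLLM version; the team_id value is a placeholder for the id returned by /team/new):

curl -X POST http://femianjc-gpu.csi.miamioh.edu:9000/team/update \
  -H 'Authorization: Bearer sk-classroom-master' \
  -H 'Content-Type: application/json' \
  -d '{
        "team_id": "TEAM_ID_FROM_TEAM_NEW",
        "max_budget": 50
      }'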

2. Add Internal Users

Create one user per student or group. LiteLLM can auto-generate keys:

curl -X POST http://femianjc-gpu.csi.miamioh.edu:9000/user/new \
  -H 'Authorization: Bearer sk-classroom-master' \
  -H 'Content-Type: application/json' \
  -d '{
        "user_id": "student-a12",
        "user_alias": "Alice Example",
        "user_email": "a12@example.edu",
        "teams": ["cse434-fall25"],
        "models": ["classroom-text", "classroom-vision"],
        "auto_create_key": true
      }'

The response includes the generated key. Issue a replacement key with /key/generate, or remove a student with /user/delete.
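For reference, a sketch of both operations; the request bodies follow the field names used above, and the assumption is that /key/generate accepts a user_id and /user/delete accepts a user_ids list on this LiteLLM version:

# Issue a fresh key for an existing student
curl -X POST http://femianjc-gpu.csi.miamioh.edu:9000/key/generate \
  -H 'Authorization: Bearer sk-classroom-master' \
  -H 'Content-Type: application/json' \
  -d '{"user_id": "student-a12", "models": ["classroom-text", "classroom-vision"]}'

# Remove a student entirely
curl -X POST http://femianjc-gpu.csi.miamioh.edu:9000/user/delete \
  -H 'Authorization: Bearer sk-classroom-master' \
  -H 'Content-Type: application/json' \
  -d '{"user_ids": ["student-a12"]}'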

3. Student Testing

Share the base URL + student key; all OpenAI SDKs work. Example Colab snippet:

from openai import OpenAI
client = OpenAI(
    api_key="sk-STUDENT",
    base_url="http://femianjc-gpu.csi.miamioh.edu:9000/v1",
)
resp = client.chat.completions.create(
    model="classroom-text",
    messages=[{"role": "user", "content": "Summarize Newton's laws."}],
    max_tokens=200,
)
print(resp.choices[0].message.content)
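The vision model is reached through the same chat endpoint. A sketch of an image prompt, assuming classroom-vision accepts the standard OpenAI image_url content format (the image URL is a placeholder):

curl -s http://femianjc-gpu.csi.miamioh.edu:9000/v1/chat/completions \
  -H 'Authorization: Bearer sk-STUDENT' \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "classroom-vision",
        "messages": [{
          "role": "user",
          "content": [
            {"type": "text", "text": "Describe this diagram."},
            {"type": "image_url", "image_url": {"url": "https://example.edu/diagram.png"}}
          ]
        }],
        "max_tokens": 200
      }'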

4. API Reference

The admin endpoints used in this guide are /team/new, /user/new, /key/generate, and /user/delete; student traffic uses the OpenAI-compatible routes under /v1.

5. Operations & Services

All services run as user-level systemd units in ~/Projects/classroom-llm.

systemctl --user restart llama-text@classroom-text
systemctl --user restart llama-vision
systemctl --user restart litellm

Logs live in ~/Projects/classroom-llm/logs/. Use journalctl --user -u <service> -f to tail.
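After a restart, a quick way to confirm the stack responds is the proxy's health route; this is a sketch assuming the /health endpoint is available on this LiteLLM version and requires an admin key:

# Probe the proxy (and its configured model backends) after a restart
curl -s http://femianjc-gpu.csi.miamioh.edu:9000/health \
  -H 'Authorization: Bearer sk-classroom-master'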

6. Invite Emails

Auto-email is disabled until we have an approved campus SMTP relay. LiteLLM still generates invite links; copy them and send them to students manually.
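A sketch of creating an invite link over the API, assuming the /invitation/new admin route is enabled on this LiteLLM build; the response contains the invitation record whose link you paste into a manual email:

curl -X POST http://femianjc-gpu.csi.miamioh.edu:9000/invitation/new \
  -H 'Authorization: Bearer sk-classroom-master' \
  -H 'Content-Type: application/json' \
  -d '{"user_id": "student-a12"}'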

7. Aliases & CSV Template