Tier 2 — always-on, multi-tenant, in Azure.
One ARM template provisions a Function App, a Storage Account, an Application Insights workspace, and wires everything to your Azure OpenAI resource.
Agents live in the `agents/` directory. Configuration needs `AZURE_OPENAI_ENDPOINT` and the deployment name; identity-based auth is supported. Every `perform()` call emits a trace, so slosh chains are traceable end-to-end. RAPP Swarm implements the same five endpoints as the brainstem, with identical request and response shapes: a client written for Tier 1 talks to Tier 2 by changing one URL.
| Method | Path | Purpose |
|---|---|---|
| POST | /chat | user_input + history → assistant response |
| GET | /health | status, model, loaded agents, token state |
| GET | /models | available LLM ids |
| GET | /repos | connected sources (agent repos) |
| POST | /login | Tier-1-only OAuth bootstrap (returns 404 in Tier 2) |
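The one-URL claim can be sketched as a minimal client. The paths and the `user_input`/`history` payload fields come from the table above; the base URLs are hypothetical placeholders, and the real request body may carry additional fields.

```python
import json
from urllib.request import Request

# Hypothetical base URLs; only this value differs between tiers.
TIER1_BASE = "http://localhost:7071"
TIER2_BASE = "https://<name>.azurewebsites.net"

def chat_request(base_url: str, user_input: str, history: list[dict]) -> Request:
    """Build a POST /chat request; identical for every tier."""
    body = json.dumps({"user_input": user_input, "history": history}).encode()
    return Request(
        f"{base_url}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Same client code, different tier: only the base URL changes.
local = chat_request(TIER1_BASE, "hello", [])
cloud = chat_request(TIER2_BASE, "hello", [])
```

Sending the request (e.g. with `urllib.request.urlopen`) is the same call in both cases, which is the point of the shared contract.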
This is the one promise we never break: an agent file you wrote on your laptop today,
saved as agents/weather_poet_agent.py, runs unmodified in Tier 2 and Tier 3.
No build step. No translation layer. No "cloud variant." If Tier 2 ever rejects a file Tier 1 accepts,
Tier 2 is broken — not the file.
```python
# identical contract on all three tiers
class WeatherPoetAgent(BasicAgent):
    def perform(self, **kwargs):
        ...
```
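A fuller sketch of what such an `agents/weather_poet_agent.py` might contain. `BasicAgent`'s real interface isn't shown in this document, so a minimal stand-in is defined here to keep the example self-contained; the body of `perform()` is illustrative.

```python
class BasicAgent:
    """Stand-in base class; the framework provides the real one."""

class WeatherPoetAgent(BasicAgent):
    def perform(self, **kwargs):
        # perform() receives keyword arguments and returns the agent's
        # output; in Tier 2 this same call also emits a trace.
        city = kwargs.get("city", "somewhere")
        return f"Over {city}, the clouds drift by..."

agent = WeatherPoetAgent()
print(agent.perform(city="Oslo"))  # → Over Oslo, the clouds drift by...
```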
```
AZURE_OPENAI_ENDPOINT   = https://<name>.openai.azure.com
AZURE_OPENAI_DEPLOYMENT = gpt-4o
STORAGE_CONNECTION      = <set automatically by ARM template>
SOUL_PATH               = ./soul.md
AGENTS_PATH             = ./agents
```
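One way the app might read these settings at startup. The variable names match the config above; the loader function and its defaults are illustrative, not part of the documented API.

```python
import os

def load_settings() -> dict:
    """Hypothetical settings loader; names match the config block above."""
    return {
        "endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],  # required
        "deployment": os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4o"),
        "storage": os.environ.get("STORAGE_CONNECTION", ""),
        "soul_path": os.environ.get("SOUL_PATH", "./soul.md"),
        "agents_path": os.environ.get("AGENTS_PATH", "./agents"),
    }

# Placeholder value so the demo runs without a real deployment.
os.environ.setdefault("AZURE_OPENAI_ENDPOINT", "https://example.openai.azure.com")
settings = load_settings()
```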