AXON
MODEL FAMILY
Enterprise-grade AI models built for coding agents, tool calling, and advanced reasoning. Available via API and integrated directly in Orbital.
Axon 2.5 Pro High
MOST CAPABLE: Highest intelligence for the most demanding long-running agent tasks, complex reasoning, and advanced tool calling.
axon-2-5-pro-high

Axon 2.5 Pro
LATEST: High-intelligence model for long-running agent tasks, tool calling, coding, and general-purpose use.
axon-2-5-pro

Axon 2.5 Mini
FASTEST: General-purpose coding model for low-effort day-to-day tasks.
axon-2-5-mini

Axon 2.1 Pro
High-intelligence model for long-running agent tasks, tool calling, coding, and general-purpose use.
axon-2-pro

Axon 2.1
High-intelligence model for long-running agent tasks, tool calling, coding, and general-purpose use.
axon-2

PROMPT CACHING.
Cached input tokens are automatically discounted at 50% off the standard input price. Caching kicks in automatically for repeated prompt prefixes.
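Because cached prefix tokens bill at half the standard input rate, the effective input cost depends on how much of the prompt is a repeated prefix. A minimal sketch of the arithmetic (the price and token counts below are hypothetical, not published rates):

```python
def effective_input_cost(total_tokens: int, cached_tokens: int, price_per_mtok: float) -> float:
    """Dollar cost of one request's input tokens with the 50% cache discount.

    Cached prefix tokens are billed at 50% of the standard input price;
    the remaining tokens bill at the full rate.
    """
    uncached = total_tokens - cached_tokens
    full_price = uncached * price_per_mtok
    discounted = cached_tokens * price_per_mtok * 0.5
    return (full_price + discounted) / 1_000_000

# Hypothetical example: a 100k-token prompt where a 60k-token system/context
# prefix is repeated across calls and gets cached, at $2.00 per million input tokens.
cost = effective_input_cost(100_000, 60_000, 2.0)
print(cost)  # 0.14 — versus 0.20 with no caching
```

In practice this means keeping stable content (system prompt, tool definitions, shared context) at the front of the message list, so repeated requests share the longest possible prefix.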
USE WITH ANY SDK.
Fully OpenAI-compatible. Drop in your existing OpenAI client with just a base URL change.
from openai import OpenAI
client = OpenAI(
api_key="YOUR_MATTERAI_KEY",
base_url="https://api.matterai.so/v1",
)
response = client.chat.completions.create(
model="axon-2-pro",
messages=[{"role": "user", "content": "Hello"}],
)

import OpenAI from "openai";
const client = new OpenAI({
apiKey: "YOUR_MATTERAI_KEY",
baseURL: "https://api.matterai.so/v1",
});
const response = await client.chat.completions.create({
model: "axon-2-pro",
messages: [{ role: "user", content: "Hello" }],
});