Drop‑in Replacement
Works with the OpenAI and Anthropic SDKs. Just change the base URL.
Every language behaves the same. Pick whichever one you prefer as you read.
An OpenAI- and Anthropic-compatible AI API with function calling, web search, and structured outputs.
Everything you need to get started with the Shannon API, which is compatible with OpenAI and Anthropic.
https://api.shannon-ai.com/v1/chat/completions Use the Chat Completions API with function calling and streaming.
https://api.shannon-ai.com/v1/messages Claude Messages format with tools and the anthropic-version header.
Authorization: Bearer <api-key> Or X-API-Key plus anthropic-version for Claude-style calls.
Public docs - a key is required. Streaming, function calling, structured outputs, web search.
A drop-in replacement for the OpenAI and Anthropic APIs with tool support, structured outputs, and built-in web search.
Works with the OpenAI and Anthropic SDKs. Just change the base URL.
Define tools and Shannon calls them for you. Supports auto, forced, and none modes.
Real-time web search with citations, available automatically.
JSON mode and JSON Schema enforcement for reliable data.
Automatic function execution loops. Up to 10 iterations per request.
Server-Sent Events for real-time token streaming.
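The automatic function-execution loop above can be sketched in plain Python. This is an illustrative sketch, not Shannon's implementation: `create` stands in for any callable with the Chat Completions request/response shape, `tool_handlers` maps function names to local callables (both names are hypothetical), and the cap mirrors the documented limit of 10 iterations per request.

```python
# Sketch of the tool-execution loop (illustrative; not Shannon's own code).
# `create` is any callable with the Chat Completions request/response shape;
# `tool_handlers` maps function names to local callables.
def run_tool_loop(create, messages, tool_handlers, max_iterations=10):
    for _ in range(max_iterations):
        response = create(messages)
        message = response["choices"][0]["message"]
        tool_calls = message.get("tool_calls")
        if not tool_calls:
            # No more tool requests: the model has produced its final answer.
            return message["content"]
        messages.append(message)
        for call in tool_calls:
            # Execute the requested function and feed the result back.
            handler = tool_handlers[call["function"]["name"]]
            messages.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": handler(call["function"]["arguments"]),
            })
    raise RuntimeError("tool loop exceeded the iteration cap")
```

Each round trip appends the assistant's tool_calls message plus one role "tool" message per call, the same shapes shown in the function-calling examples below.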
Get started in three steps. Shannon mimics the OpenAI and Anthropic clients.
Use the OpenAI-compatible endpoint.
https://api.shannon-ai.com/v1/chat/completions Use Bearer auth in the Authorization header.
Pick a language and plug in your key.
from openai import OpenAI
client = OpenAI(
api_key="YOUR_API_KEY",
base_url="https://api.shannon-ai.com/v1"
)
response = client.chat.completions.create(
model="shannon-1.6-lite",
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Hello, Shannon!"}
],
max_tokens=1024
)
print(response.choices[0].message.content)

import OpenAI from 'openai';
const client = new OpenAI({
apiKey: 'YOUR_API_KEY',
baseURL: 'https://api.shannon-ai.com/v1'
});
const response = await client.chat.completions.create({
model: 'shannon-1.6-lite',
messages: [
{ role: 'system', content: 'You are a helpful assistant.' },
{ role: 'user', content: 'Hello, Shannon!' }
],
max_tokens: 1024
});
console.log(response.choices[0].message.content);

package main
import (
"context"
"fmt"
openai "github.com/sashabaranov/go-openai"
)
func main() {
config := openai.DefaultConfig("YOUR_API_KEY")
config.BaseURL = "https://api.shannon-ai.com/v1"
client := openai.NewClientWithConfig(config)
resp, err := client.CreateChatCompletion(
context.Background(),
openai.ChatCompletionRequest{
Model: "shannon-1.6-lite",
Messages: []openai.ChatCompletionMessage{
{Role: "system", Content: "You are a helpful assistant."},
{Role: "user", Content: "Hello, Shannon!"},
},
MaxTokens: 1024,
},
)
if err != nil {
panic(err)
}
fmt.Println(resp.Choices[0].Message.Content)
}

curl -X POST "https://api.shannon-ai.com/v1/chat/completions" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "shannon-1.6-lite",
"messages": [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "Hello, Shannon!"}
],
"max_tokens": 1024
}'

{
"id": "chatcmpl-abc123",
"object": "chat.completion",
"created": 1234567890,
"model": "shannon-1.6-lite",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Hello! I'm Shannon, your AI assistant. How can I help you today?"
},
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 25,
"completion_tokens": 18,
"total_tokens": 43
}
}

Try the Shannon API directly in your browser. Build a request, run it, and watch the response in real time.
Switch across OpenAI Chat Completions, Responses, and Anthropic Messages without leaving the playground.
Run real requests, inspect raw JSON, and view stream events from the same operator console.
Signed-in users can pull their Shannon API key straight into the dedicated playground workspace.
/st/docs/playground The playground now lives on its own route so the API docs stay Astro-rendered while the request builder remains an explicitly interactive client tool.
All API requests require authentication with your Shannon API key.
Authorization: Bearer YOUR_API_KEY
X-API-Key: YOUR_API_KEY
anthropic-version: 2023-06-01

Shannon offers several models, each optimized for different workloads.
shannon-1.6-lite Shannon 1.6 Lite Fast, efficient responses for everyday tasks
shannon-1.6-pro Shannon 1.6 Pro Advanced reasoning for complex problems
shannon-2-lite Shannon 2 Lite
shannon-2-pro Shannon 2 Pro
shannon-coder-1 Shannon Coder Optimized for the Claude Code CLI with a call-based quota
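The two authentication styles from the section above differ only in their headers. A minimal sketch of assembling each header set (plain dictionaries; nothing is sent over the network):

```python
# Assemble Shannon's two accepted header styles (sketch; no request is sent).
def openai_style_headers(api_key: str) -> dict:
    # OpenAI-compatible endpoints use Bearer auth.
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

def anthropic_style_headers(api_key: str) -> dict:
    # Claude-style calls use X-API-Key plus the anthropic-version header.
    return {
        "X-API-Key": api_key,
        "anthropic-version": "2023-06-01",
        "Content-Type": "application/json",
    }
```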
Define tools that Shannon can call to take actions or retrieve information.
from openai import OpenAI
import json
client = OpenAI(
api_key="YOUR_API_KEY",
base_url="https://api.shannon-ai.com/v1"
)
# Define available tools/functions
tools = [
{
"type": "function",
"function": {
"name": "get_weather",
"description": "Get current weather for a location",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "City name, e.g., 'Tokyo'"
},
"unit": {
"type": "string",
"enum": ["celsius", "fahrenheit"]
}
},
"required": ["location"]
}
}
}
]
response = client.chat.completions.create(
model="shannon-1.6-lite",
messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
tools=tools,
tool_choice="auto"
)
# Check if model wants to call a function
if response.choices[0].message.tool_calls:
tool_call = response.choices[0].message.tool_calls[0]
print(f"Function: {tool_call.function.name}")
print(f"Arguments: {tool_call.function.arguments}")

import OpenAI from 'openai';
const client = new OpenAI({
apiKey: 'YOUR_API_KEY',
baseURL: 'https://api.shannon-ai.com/v1'
});
const tools = [
{
type: 'function',
function: {
name: 'get_weather',
description: 'Get current weather for a location',
parameters: {
type: 'object',
properties: {
location: { type: 'string', description: "City name" },
unit: { type: 'string', enum: ['celsius', 'fahrenheit'] }
},
required: ['location']
}
}
}
];
const response = await client.chat.completions.create({
model: 'shannon-1.6-lite',
messages: [{ role: 'user', content: "What's the weather in Tokyo?" }],
tools,
tool_choice: 'auto'
});
if (response.choices[0].message.tool_calls) {
const toolCall = response.choices[0].message.tool_calls[0];
console.log('Function:', toolCall.function.name);
console.log('Arguments:', toolCall.function.arguments);
}

"auto" The model decides whether to call a function (default)
"none" Disable function calling for this request
{"type": "function", "function": {"name": "..."}} Force a specific function

{
"id": "chatcmpl-xyz",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": null,
"tool_calls": [
{
"id": "call_abc123",
"type": "function",
"function": {
"name": "get_weather",
"arguments": "{\"location\": \"Tokyo\", \"unit\": \"celsius\"}"
}
}
]
},
"finish_reason": "tool_calls"
}
]
}

Force Shannon to respond with valid JSON that matches your schema.
from openai import OpenAI
client = OpenAI(
api_key="YOUR_API_KEY",
base_url="https://api.shannon-ai.com/v1"
)
# Force JSON output with schema
response = client.chat.completions.create(
model="shannon-1.6-lite",
messages=[
{"role": "user", "content": "Extract: John Doe, 30 years old, engineer"}
],
response_format={
"type": "json_schema",
"json_schema": {
"name": "person_info",
"schema": {
"type": "object",
"properties": {
"name": {"type": "string"},
"age": {"type": "integer"},
"occupation": {"type": "string"}
},
"required": ["name", "age", "occupation"]
}
}
}
)
import json
data = json.loads(response.choices[0].message.content)
print(data)  # {"name": "John Doe", "age": 30, "occupation": "engineer"}

import OpenAI from 'openai';
const client = new OpenAI({
apiKey: 'YOUR_API_KEY',
baseURL: 'https://api.shannon-ai.com/v1'
});
const response = await client.chat.completions.create({
model: 'shannon-1.6-lite',
messages: [
{ role: 'user', content: 'Extract: John Doe, 30 years old, engineer' }
],
response_format: {
type: 'json_schema',
json_schema: {
name: 'person_info',
schema: {
type: 'object',
properties: {
name: { type: 'string' },
age: { type: 'integer' },
occupation: { type: 'string' }
},
required: ['name', 'age', 'occupation']
}
}
}
});
const data = JSON.parse(response.choices[0].message.content);
console.log(data); // { name: "John Doe", age: 30, occupation: "engineer" }

{"type": "json_object"} Force valid JSON output (no specific schema)
{"type": "json_schema", "json_schema": {...}} Force output that conforms to your schema

Enable real-time token streaming over Server-Sent Events for responsive UIs.
from openai import OpenAI
client = OpenAI(
api_key="YOUR_API_KEY",
base_url="https://api.shannon-ai.com/v1"
)
# Enable streaming for real-time responses
stream = client.chat.completions.create(
model="shannon-1.6-lite",
messages=[
{"role": "user", "content": "Write a short poem about AI"}
],
stream=True
)
for chunk in stream:
if chunk.choices[0].delta.content:
print(chunk.choices[0].delta.content, end="", flush=True)

import OpenAI from 'openai';
const client = new OpenAI({
apiKey: 'YOUR_API_KEY',
baseURL: 'https://api.shannon-ai.com/v1'
});
// Enable streaming for real-time responses
const stream = await client.chat.completions.create({
model: 'shannon-1.6-lite',
messages: [
{ role: 'user', content: 'Write a short poem about AI' }
],
stream: true
});
for await (const chunk of stream) {
const content = chunk.choices[0]?.delta?.content;
if (content) process.stdout.write(content);
}

Shannon ships with built-in web_search, available automatically.
from openai import OpenAI
client = OpenAI(
api_key="YOUR_API_KEY",
base_url="https://api.shannon-ai.com/v1"
)
# Web search is automatically available!
# Shannon will use it when needed for current information
response = client.chat.completions.create(
model="shannon-1.6-lite",
messages=[
{"role": "user", "content": "What are the latest AI news today?"}
],
# Optionally, explicitly define web_search tool
tools=[{
"type": "function",
"function": {
"name": "web_search",
"description": "Search the web for current information",
"parameters": {
"type": "object",
"properties": {
"query": {"type": "string", "description": "Search query"}
},
"required": ["query"]
}
}
}]
)
print(response.choices[0].message.content)
# Response includes sources and citations

import OpenAI from 'openai';
const client = new OpenAI({
apiKey: 'YOUR_API_KEY',
baseURL: 'https://api.shannon-ai.com/v1'
});
// Web search is automatically available!
// Shannon will use it when needed for current information
const response = await client.chat.completions.create({
model: 'shannon-1.6-lite',
messages: [
{ role: 'user', content: 'What are the latest AI news today?' }
],
// Optionally, explicitly define web_search tool
tools: [{
type: 'function',
function: {
name: 'web_search',
description: 'Search the web for current information',
parameters: {
type: 'object',
properties: {
query: { type: 'string', description: 'Search query' }
},
required: ['query']
}
}
}]
});
console.log(response.choices[0].message.content);
// Response includes sources and citations

Shannon also supports the Anthropic Messages API format.
https://api.shannon-ai.com/v1/messages

import anthropic
client = anthropic.Anthropic(
api_key="YOUR_API_KEY",
base_url="https://api.shannon-ai.com"  # the SDK appends /v1/messages itself
)
response = client.messages.create(
model="shannon-1.6-lite",
max_tokens=1024,
messages=[
{"role": "user", "content": "Hello, Shannon!"}
],
# Tool use (Anthropic format)
tools=[{
"name": "web_search",
"description": "Search the web",
"input_schema": {
"type": "object",
"properties": {
"query": {"type": "string"}
},
"required": ["query"]
}
}]
)
print(response.content[0].text)

import Anthropic from '@anthropic-ai/sdk';
const client = new Anthropic({
apiKey: 'YOUR_API_KEY',
baseURL: 'https://api.shannon-ai.com' // the SDK appends /v1/messages itself
});
const response = await client.messages.create({
model: 'shannon-1.6-lite',
max_tokens: 1024,
messages: [
{ role: 'user', content: 'Hello, Shannon!' }
],
// Tool use (Anthropic format)
tools: [{
name: 'web_search',
description: 'Search the web',
input_schema: {
type: 'object',
properties: {
query: { type: 'string' }
},
required: ['query']
}
}]
});
console.log(response.content[0].text);

Use any OpenAI or Anthropic SDK. Just change the base URL.
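One thing that does change between the two API families is the tool-definition shape: OpenAI nests the JSON Schema under `function.parameters`, while Anthropic puts it at the top level as `input_schema`. A small, hypothetical helper for converting one to the other:

```python
# Hypothetical helper: convert an OpenAI-style tool definition to the
# Anthropic Messages format. Only the nesting changes; the JSON Schema
# itself is carried over untouched.
def openai_tool_to_anthropic(tool: dict) -> dict:
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],
    }
```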
Official OpenAI Python SDK - works with Shannon
pip install openai
Official OpenAI Node.js SDK - works with Shannon
npm install openai
Community Go client for OpenAI-compatible APIs
go get github.com/sashabaranov/go-openai
Community Ruby client for OpenAI-compatible APIs
gem install ruby-openai
Community PHP client for OpenAI-compatible APIs
composer require openai-php/client
Async Rust client for OpenAI-compatible APIs
cargo add async-openai
Official Anthropic Python SDK - works with Shannon
pip install anthropic
Official Anthropic TypeScript SDK - works with Shannon
npm install @anthropic-ai/sdk

Shannon uses standard HTTP status codes and returns detailed error messages.
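A minimal sketch of unpacking an error payload of the shape shown in the example that follows; which error types are worth retrying is an assumption here, not a documented list:

```python
# Sketch: extract the useful fields from a Shannon error payload.
# The set of retryable type names is an assumption, not documented.
ASSUMED_RETRYABLE_TYPES = {"rate_limit_error", "server_error"}

def describe_error(payload: dict) -> str:
    err = payload["error"]
    suffix = " (retryable)" if err["type"] in ASSUMED_RETRYABLE_TYPES else ""
    return f"{err['type']}/{err['code']}: {err['message']}{suffix}"
```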
{
"error": {
"message": "Invalid API key provided",
"type": "authentication_error",
"code": "invalid_api_key"
}
}

Recent updates and improvements to the Shannon API.
YOUR_API_KEY Keep your API key secret. Regenerating creates a new key and revokes the old one.
When signed in, you can view your token and search usage on this page.
Call-based quota for Shannon Coder (shannon-coder-1). Resets every 4 hours.
Get your API key and start building with Shannon AI today.
Popular searches: