- **Unified API**: One API for all models
- **Streaming**: Real-time response streaming
- **Function Calling**: Let AI call your functions
- **Vision**: Analyze images with AI
## React Hook (Recommended)
The simplest way to add AI chat to your app:
```tsx
import { useChat } from '@sylphx/platform-sdk/react'

function ChatBot() {
  const { messages, send, isLoading, error } = useChat({
    model: 'anthropic/claude-3.5-sonnet',
    systemMessage: 'You are a helpful assistant.',
  })

  return (
    <div>
      {messages.map((m, i) => (
        <div key={i} className={m.role === 'user' ? 'text-right' : 'text-left'}>
          {m.content}
        </div>
      ))}
      <input
        placeholder="Type a message..."
        onKeyDown={(e) => {
          if (e.key === 'Enter' && !isLoading) {
            send(e.currentTarget.value)
            e.currentTarget.value = ''
          }
        }}
        disabled={isLoading}
      />
      {error && <p className="text-red-500">{error.message}</p>}
    </div>
  )
}
```

## Hook Options
```tsx
const { messages, send, isLoading, error, clear, retry, stop, append } = useChat({
  // Model to use (defaults to app config)
  model: 'anthropic/claude-3.5-sonnet',

  // System prompt
  systemMessage: 'You are a helpful assistant.',

  // Generation parameters
  temperature: 0.7, // 0-2, higher = more creative
  maxTokens: 1000, // Max response length

  // Tools/functions the model can call
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get current weather',
        parameters: { type: 'object', properties: { city: { type: 'string' } } },
      },
    },
  ],

  // Initial messages
  initialMessages: [
    { role: 'assistant', content: 'Hello! How can I help you today?' },
  ],

  // Callbacks
  onMessage: (message) => console.log('Received:', message),
  onError: (error) => console.error('Error:', error),
})
```

| Property | Type | Description |
|---|---|---|
| `send(content)` | function | Send a message and get a response |
| `clear()` | function | Clear all messages |
| `retry()` | function | Retry the last message |
| `stop()` | function | Stop the current streaming response |
| `append(message)` | function | Add a message without sending |
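For example, `stop`, `retry`, and `clear` can be wired to buttons next to the chat UI. A minimal sketch (the layout and prompt text here are illustrative, not part of the SDK):

```tsx
import { useChat } from '@sylphx/platform-sdk/react'

function ChatWithControls() {
  const { messages, send, isLoading, clear, retry, stop } = useChat({
    model: 'anthropic/claude-3.5-sonnet',
  })

  return (
    <div>
      {messages.map((m, i) => (
        <p key={i}>{m.role}: {m.content}</p>
      ))}
      <button onClick={() => send('Tell me a joke')} disabled={isLoading}>Send</button>
      {/* Abort the in-flight streaming response */}
      <button onClick={() => stop()} disabled={!isLoading}>Stop</button>
      {/* Re-send the last message */}
      <button onClick={() => retry()} disabled={isLoading}>Retry</button>
      {/* Reset the conversation */}
      <button onClick={() => clear()} disabled={isLoading}>Clear</button>
    </div>
  )
}
```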
## Low-Level API

For more control, use the `useAI` hook:
```tsx
import { useAI } from '@sylphx/platform-sdk/react'

function MyComponent() {
  const { chat, isLoading, error } = useAI()

  const handleClick = async () => {
    const response = await chat(
      [
        { role: 'user', content: 'What is the capital of France?' },
      ],
      {
        model: 'anthropic/claude-3.5-sonnet',
        temperature: 0.7,
        maxTokens: 500,
      }
    )

    console.log(response.choices[0].message.content)
    // "The capital of France is Paris."
  }

  return <button onClick={handleClick} disabled={isLoading}>Ask</button>
}
```

## Server-Side Usage
For API routes or server actions, use the OpenAI-compatible REST API:
```ts
// app/api/chat/route.ts
export async function POST(req: Request) {
  const { messages } = await req.json()

  const response = await fetch('https://sylphx.com/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${process.env.SYLPHX_APP_SECRET}`,
      'X-App-Id': process.env.SYLPHX_APP_ID!,
    },
    body: JSON.stringify({
      model: 'anthropic/claude-3.5-sonnet',
      messages,
      stream: true,
    }),
  })

  return new Response(response.body, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
    },
  })
}
```
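On the client, the returned stream can be consumed directly. A simplified sketch, assuming the upstream response uses OpenAI-style SSE chunks (`data: {...}` lines ending with `data: [DONE]`); buffering of lines split across chunks is omitted for brevity:

```ts
async function streamChat(messages: { role: string; content: string }[]) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages }),
  })

  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let text = ''

  while (true) {
    const { done, value } = await reader.read()
    if (done) break

    // Each SSE line looks like: data: {"choices":[{"delta":{"content":"..."}}]}
    for (const line of decoder.decode(value, { stream: true }).split('\n')) {
      if (!line.startsWith('data: ') || line.includes('[DONE]')) continue
      const chunk = JSON.parse(line.slice('data: '.length))
      text += chunk.choices[0]?.delta?.content ?? ''
    }
  }

  return text
}
```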
## OpenAI Compatible

The API is OpenAI-compatible. Use any OpenAI SDK or library by changing the base URL.
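For example, with the official `openai` Node SDK. The base URL and headers below are inferred from the REST example above, so treat them as assumptions and verify them against your app settings:

```ts
import OpenAI from 'openai'

// Point the OpenAI SDK at the Sylphx endpoint instead of api.openai.com.
// ASSUMPTION: base URL and X-App-Id header mirror the fetch() example above.
const client = new OpenAI({
  apiKey: process.env.SYLPHX_APP_SECRET,
  baseURL: 'https://sylphx.com/api/v1',
  defaultHeaders: { 'X-App-Id': process.env.SYLPHX_APP_ID! },
})

const completion = await client.chat.completions.create({
  model: 'anthropic/claude-3.5-sonnet',
  messages: [{ role: 'user', content: 'Hello!' }],
})

console.log(completion.choices[0].message.content)
```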
## Popular Models

| Model | ID | Context | Best For |
|---|---|---|---|
| Claude 3.5 Sonnet | `anthropic/claude-3.5-sonnet` | 200K | Best overall |
| Claude 3 Haiku | `anthropic/claude-3-haiku` | 200K | Fast & cheap |
| Gemini 2.0 Flash | `google/gemini-2.0-flash` | 1M | Huge context |
| Claude 3 Opus | `anthropic/claude-3-opus` | 200K | Complex tasks |
| DeepSeek R1 | `deepseek/deepseek-r1` | 64K | Reasoning |
| Llama 3.3 70B | `meta-llama/llama-3.3-70b` | 128K | Open weight |

```tsx
// Get full list of available models
import { useModels } from '@sylphx/platform-sdk/react'

const { models, isLoading, setSearch, setCapability } = useModels({
  capability: 'chat', // Filter: 'chat' | 'vision' | 'embedding' | 'tool'
  fetchOnMount: true,
})

// models is an array of:
// { id, name, contextWindow, inputCostPer1M, outputCostPer1M, ... }
```
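The cost fields make it easy to pick a model programmatically. A rough sketch, assuming `inputCostPer1M` is the price per million input tokens:

```ts
// Illustrative: pick the cheapest chat model by input price
// (field names taken from the shape shown above).
const cheapest = [...models].sort(
  (a, b) => a.inputCostPer1M - b.inputCostPer1M
)[0]

console.log('Cheapest chat model:', cheapest?.id)
```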
## Vision (Image Analysis)

Analyze images with vision-capable models:
```tsx
import { useAI } from '@sylphx/platform-sdk/react'

function ImageAnalyzer() {
  const { vision, isLoading } = useAI()

  const analyzeImage = async (imageUrl: string) => {
    const description = await vision(
      imageUrl,
      'Describe what you see in this image.',
      { model: 'anthropic/claude-3.5-sonnet' }
    )
    console.log(description)
  }

  return (
    <input
      type="file"
      accept="image/*"
      onChange={async (e) => {
        const file = e.target.files?.[0]
        if (file) {
          const url = URL.createObjectURL(file)
          await analyzeImage(url)
        }
      }}
      disabled={isLoading}
    />
  )
}
```

## Function Calling
Let the model call functions:
```tsx
const { messages, send } = useChat({
  model: 'anthropic/claude-3.5-sonnet',
  tools: [
    {
      type: 'function',
      function: {
        name: 'get_weather',
        description: 'Get the current weather for a location',
        parameters: {
          type: 'object',
          properties: {
            city: { type: 'string', description: 'City name' },
          },
          required: ['city'],
        },
      },
    },
  ],
  onMessage: async (message) => {
    // Handle function calls in response
    if (message.toolCalls) {
      for (const call of message.toolCalls) {
        if (call.function.name === 'get_weather') {
          const args = JSON.parse(call.function.arguments)
          const weather = await fetchWeather(args.city)
          // Send function result back to model
          // ...
        }
      }
    }
  },
})
```

## Cost Optimization
Use `anthropic/claude-3-haiku` for simple tasks to reduce costs. Reserve Sonnet/Opus for complex reasoning. Check Usage & Quotas for pricing.
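For example, you can route requests to a cheaper model when the task is simple. A sketch using the `chat` call shown earlier; the length-based heuristic here is illustrative only:

```ts
import { useAI } from '@sylphx/platform-sdk/react'

function useSmartChat() {
  const { chat } = useAI()

  // Illustrative heuristic: short prompts go to Haiku, everything else to Sonnet.
  return (prompt: string) =>
    chat(
      [{ role: 'user', content: prompt }],
      {
        model:
          prompt.length < 200
            ? 'anthropic/claude-3-haiku' // cheap and fast
            : 'anthropic/claude-3.5-sonnet', // stronger reasoning
      }
    )
}
```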