The Chat API enables AI-powered conversations with support for MCP tools. Send messages and receive intelligent responses from Google Gemini models.
Base URL
https://www.girardai.com/api/chat
Authentication
All requests require authentication via Bearer token:
Authorization: Bearer sk_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Send Message
Send a message and receive an AI response.
Endpoint: POST /api/chat
Request Body
| Field | Type | Required | Description |
|---|---|---|---|
| messages | array | Yes | Array of message objects |
| model | string | No | Model ID (default: "gemini-2.0-flash") |
| mcpServers | array | No | Array of MCP server IDs to enable |
Message Object
| Field | Type | Required | Description |
|---|---|---|---|
| role | string | Yes | "user" or "assistant" |
| content | string | Yes | Message content |
Available Models
| Model ID | Description |
|---|---|
| gemini-2.0-flash | Fast, capable model |
| gemini-1.5-pro | Advanced reasoning |
Available MCP Servers
| Server ID | Description |
|---|---|
| postgres | PostgreSQL database |
| sqlite | SQLite database |
| filesystem | File system access |
| github | GitHub API |
| slack | Slack integration |
| notion | Notion workspace |
| web-search | Web search |
| memory | Persistent memory |
| puppeteer | Browser automation |
| docker | Docker management |
| aws | AWS services |
| google-drive | Google Drive |
Basic Example
Send a simple message without tools:
curl -X POST https://www.girardai.com/api/chat \
  -H "Authorization: Bearer sk_live_xxx" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "Explain quantum computing in simple terms"
      }
    ]
  }'
Response:
{
  "success": true,
  "data": {
    "content": "Quantum computing is a type of computing that uses quantum mechanics...",
    "model": "gemini-2.0-flash"
  }
}
Multi-turn Conversation
Include conversation history for context:
curl -X POST https://www.girardai.com/api/chat \
  -H "Authorization: Bearer sk_live_xxx" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "What is machine learning?"
      },
      {
        "role": "assistant",
        "content": "Machine learning is a subset of artificial intelligence..."
      },
      {
        "role": "user",
        "content": "How is it different from deep learning?"
      }
    ],
    "model": "gemini-1.5-pro"
  }'
Response:
{
  "success": true,
  "data": {
    "content": "Deep learning is actually a specialized subset of machine learning that uses neural networks with multiple layers...",
    "model": "gemini-1.5-pro"
  }
}
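In application code, multi-turn context is usually handled by keeping a running history list and appending each assistant reply before the next user turn. A minimal sketch of that pattern; `send` here is a hypothetical stand-in for whatever callable performs the real `POST /api/chat` request and returns the parsed response body:

```python
def append_turn(history, user_content, send):
    """Append the user message, call the API, and record the assistant reply.

    `send` is any callable that takes the messages list and returns the
    parsed response body ({"success": ..., "data": {...}}).
    """
    history.append({'role': 'user', 'content': user_content})
    reply = send(history)['data']['content']
    history.append({'role': 'assistant', 'content': reply})
    return reply

# Usage with a stub in place of a real HTTP call
def fake_send(messages):
    return {'success': True,
            'data': {'content': f'echo: {messages[-1]["content"]}',
                     'model': 'gemini-2.0-flash'}}

history = []
append_turn(history, 'What is machine learning?', fake_send)
append_turn(history, 'How is it different from deep learning?', fake_send)
# history now holds all four turns, ready to send on the next request
```

Because the API is stateless, the full `history` list is what you pass as `messages` on each call.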
With MCP Tools
Enable MCP servers for extended capabilities:
curl -X POST https://www.girardai.com/api/chat \
  -H "Authorization: Bearer sk_live_xxx" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "Search the web for the latest news about AI regulations"
      }
    ],
    "mcpServers": ["web-search"]
  }'
Response:
{
  "success": true,
  "data": {
    "content": "Based on my search, here are the latest developments in AI regulations:\n\n1. The EU AI Act...\n2. US Executive Order on AI...",
    "model": "gemini-2.0-flash",
    "toolCalls": [
      {
        "name": "web-search_search_web",
        "args": {
          "query": "AI regulations latest news 2025"
        },
        "result": "{\"results\": [{\"title\": \"EU AI Act Implementation\"...}]}"
      }
    ]
  }
}
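Note that each tool call's `result` field is a JSON-encoded string, not a nested object, so it needs a second decode step before you can work with it. A small sketch using a response body shaped like the one above (the sample values are illustrative):

```python
import json

response_body = {
    "success": True,
    "data": {
        "content": "Based on my search, here are the latest developments...",
        "model": "gemini-2.0-flash",
        "toolCalls": [
            {
                "name": "web-search_search_web",
                "args": {"query": "AI regulations latest news 2025"},
                "result": "{\"results\": [{\"title\": \"EU AI Act Implementation\"}]}"
            }
        ]
    }
}

titles = []
for call in response_body["data"].get("toolCalls", []):
    decoded = json.loads(call["result"])  # second decode: result is a string
    titles.append(decoded["results"][0]["title"])

print(titles)
```

Using `.get("toolCalls", [])` keeps the loop safe for responses where no tools were invoked, since `toolCalls` is only present when tools ran.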
Database Query Example
Query a database using MCP tools:
curl -X POST https://www.girardai.com/api/chat \
  -H "Authorization: Bearer sk_live_xxx" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "Show me the top 5 customers by order count"
      }
    ],
    "mcpServers": ["postgres"]
  }'
Response:
{
  "success": true,
  "data": {
    "content": "Here are your top 5 customers by order count:\n\n| Customer | Orders |\n|----------|--------|\n| Acme Corp | 156 |\n| Tech Inc | 142 |\n...",
    "model": "gemini-2.0-flash",
    "toolCalls": [
      {
        "name": "postgres_query_database",
        "args": {
          "query": "SELECT customer_name, COUNT(*) as order_count FROM orders GROUP BY customer_name ORDER BY order_count DESC LIMIT 5"
        },
        "result": "{\"rows\": [{\"customer_name\": \"Acme Corp\", \"order_count\": 156}...]}"
      }
    ]
  }
}
Response Format
Success Response
{
  "success": true,
  "data": {
    "content": "AI response text...",
    "model": "gemini-2.0-flash",
    "toolCalls": [
      {
        "name": "tool_name",
        "args": {},
        "result": "tool result"
      }
    ]
  }
}
| Field | Type | Description |
|---|---|---|
| content | string | AI response text |
| model | string | Model used for the response |
| toolCalls | array | Tool calls made (if any) |
Tool Call Object
| Field | Type | Description |
|---|---|---|
| name | string | Tool name, formatted as `server_function` (e.g. `postgres_query_database`) |
| args | object | Arguments passed to the tool |
| result | string | Tool execution result, as a JSON-encoded string |
Code Examples
JavaScript/TypeScript
interface Message {
  role: 'user' | 'assistant';
  content: string;
}

interface ChatResponse {
  success: boolean;
  data: {
    content: string;
    model: string;
    toolCalls?: Array<{
      name: string;
      args: Record<string, unknown>;
      result: string;
    }>;
  };
}

async function chat(
  messages: Message[],
  options?: {
    model?: string;
    mcpServers?: string[];
  }
): Promise<ChatResponse> {
  const response = await fetch('https://www.girardai.com/api/chat', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.GIRARDAI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      messages,
      model: options?.model || 'gemini-2.0-flash',
      mcpServers: options?.mcpServers || [],
    }),
  });
  if (!response.ok) {
    throw new Error(`Chat request failed with status ${response.status}`);
  }
  return response.json();
}

// Usage
const result = await chat([
  { role: 'user', content: 'Hello, how are you?' }
]);
console.log(result.data.content);
Python
import os

import requests

def chat(messages, model='gemini-2.0-flash', mcp_servers=None):
    """Send a chat message to the Girard Chat API."""
    response = requests.post(
        'https://www.girardai.com/api/chat',
        headers={
            'Authorization': f'Bearer {os.environ["GIRARDAI_API_KEY"]}',
            'Content-Type': 'application/json',
        },
        json={
            'messages': messages,
            'model': model,
            'mcpServers': mcp_servers or [],
        },
    )
    response.raise_for_status()
    return response.json()

# Usage
result = chat([
    {'role': 'user', 'content': 'What is the capital of France?'}
])
print(result['data']['content'])
Python (Async)
import asyncio
import os

import httpx

async def chat_async(messages, model='gemini-2.0-flash', mcp_servers=None):
    """Send a chat message to the Girard Chat API asynchronously."""
    async with httpx.AsyncClient() as client:
        response = await client.post(
            'https://www.girardai.com/api/chat',
            headers={
                'Authorization': f'Bearer {os.environ["GIRARDAI_API_KEY"]}',
                'Content-Type': 'application/json',
            },
            json={
                'messages': messages,
                'model': model,
                'mcpServers': mcp_servers or [],
            },
        )
        response.raise_for_status()
        return response.json()

# Usage
async def main():
    result = await chat_async([
        {'role': 'user', 'content': 'Tell me a joke'}
    ])
    print(result['data']['content'])

asyncio.run(main())
Error Responses
Error Format
{
  "success": false,
  "error": "Error message"
}
Common Errors
| Status | Error | Description |
|---|---|---|
| 400 | "Messages are required" | Missing messages array |
| 401 | "Unauthorized" | Invalid API key |
| 400 | "No organization found" | User has no organization |
| 503 | "Chat service not configured" | Server configuration issue |
| 500 | Error message | Internal server error |
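Since every error body shares the `{"success": false, "error": "..."}` shape, a thin wrapper can normalize failures into exceptions. A sketch under that assumption; the `ChatError` class is illustrative, not part of the API:

```python
class ChatError(Exception):
    """Raised when the Chat API reports success: false."""
    def __init__(self, status, message):
        super().__init__(f'{status}: {message}')
        self.status = status
        self.message = message

def check_response(status_code, body):
    """Return the data payload, or raise ChatError on failure."""
    if not body.get('success'):
        raise ChatError(status_code, body.get('error', 'Unknown error'))
    return body['data']

# Usage
ok = check_response(200, {'success': True,
                          'data': {'content': 'hi', 'model': 'gemini-2.0-flash'}})
try:
    check_response(401, {'success': False, 'error': 'Unauthorized'})
except ChatError as e:
    print(e)
```

Raising a typed exception lets calling code branch on `e.status`, e.g. retrying 500s but surfacing 401s immediately.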
Best Practices
Conversation Management
- **Include relevant history**: Pass previous messages for context
- **Trim old messages**: Keep conversation length manageable
- **Summarize long conversations**: Prevent context overflow
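One simple trimming strategy keeps only the most recent turns; the cap of 20 messages below is an arbitrary illustration, not an API limit:

```python
def trim_history(messages, max_messages=20):
    """Keep the most recent messages, dropping a leading assistant
    message so the trimmed history never starts mid-exchange."""
    trimmed = messages[-max_messages:]
    while trimmed and trimmed[0]['role'] == 'assistant':
        trimmed = trimmed[1:]
    return trimmed

# Usage: a 30-message history alternating user/assistant turns
history = [{'role': 'user' if i % 2 == 0 else 'assistant',
            'content': f'turn {i}'} for i in range(30)]
recent = trim_history(history)  # only the latest turns survive
```

For very long sessions, trimming can be combined with the summarization tip above: replace the dropped turns with a single synthetic message summarizing them.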
Tool Usage
- **Enable only needed tools**: Reduces complexity
- **Handle tool failures**: Tools may occasionally fail
- **Review tool results**: Verify data accuracy
Performance
- **Use the appropriate model**: Flash for speed, Pro for complex tasks
- **Batch similar requests**: Reduce API calls
- **Cache responses**: For repeated queries
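Because the API itself defines no caching, repeated identical queries can be memoized client-side, keyed by the serialized request payload. A sketch with an in-memory dict; `send` is a hypothetical callable standing in for the real API request:

```python
import json

_cache = {}

def cached_chat(messages, model='gemini-2.0-flash', send=None):
    """Return a cached response when the same payload was sent before."""
    key = json.dumps({'messages': messages, 'model': model}, sort_keys=True)
    if key not in _cache:
        _cache[key] = send(messages, model)
    return _cache[key]

# Usage with a stub that counts how many real calls happen
calls = {'n': 0}
def fake_send(messages, model):
    calls['n'] += 1
    return {'success': True, 'data': {'content': 'Paris', 'model': model}}

msg = [{'role': 'user', 'content': 'Capital of France?'}]
cached_chat(msg, send=fake_send)
cached_chat(msg, send=fake_send)  # served from cache; fake_send runs once
```

Caching is only appropriate for deterministic lookups; skip it when requests enable MCP tools whose results change over time (e.g. `web-search`).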
Credit Usage
| Action | Credits |
|---|---|
| Chat message | 1 |
| With tool calls | 1 + tool overhead |
Previous: Studio API | Next: Agents API