
Chat API

API reference for AI chat conversations

The Chat API enables AI-powered conversations with support for MCP tools. Send messages and receive intelligent responses from Google Gemini models.

Base URL

https://www.girardai.com/api/chat

Authentication

All requests require authentication via a Bearer token in the Authorization header:

Authorization: Bearer sk_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

Send Message

Send a message and receive an AI response.

Endpoint: POST /api/chat

Request Body

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| messages | array | Yes | Array of message objects |
| model | string | No | Model ID (default: "gemini-2.0-flash") |
| mcpServers | array | No | Array of MCP server IDs to enable |

Message Object

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| role | string | Yes | "user" or "assistant" |
| content | string | Yes | Message content |

Available Models

| Model ID | Description |
|----------|-------------|
| gemini-2.0-flash | Fast, capable model |
| gemini-1.5-pro | Advanced reasoning |

Available MCP Servers

| Server ID | Description |
|-----------|-------------|
| postgres | PostgreSQL database |
| sqlite | SQLite database |
| filesystem | File system access |
| github | GitHub API |
| slack | Slack integration |
| notion | Notion workspace |
| web-search | Web search |
| memory | Persistent memory |
| puppeteer | Browser automation |
| docker | Docker management |
| aws | AWS services |
| google-drive | Google Drive |

Basic Example

Send a simple message without tools:

curl -X POST https://www.girardai.com/api/chat \
  -H "Authorization: Bearer sk_live_xxx" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "Explain quantum computing in simple terms"
      }
    ]
  }'

Response:

{
  "success": true,
  "data": {
    "content": "Quantum computing is a type of computing that uses quantum mechanics...",
    "model": "gemini-2.0-flash"
  }
}

Multi-turn Conversation

Include conversation history for context:

curl -X POST https://www.girardai.com/api/chat \
  -H "Authorization: Bearer sk_live_xxx" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "What is machine learning?"
      },
      {
        "role": "assistant",
        "content": "Machine learning is a subset of artificial intelligence..."
      },
      {
        "role": "user",
        "content": "How is it different from deep learning?"
      }
    ],
    "model": "gemini-1.5-pro"
  }'

Response:

{
  "success": true,
  "data": {
    "content": "Deep learning is actually a specialized subset of machine learning that uses neural networks with multiple layers...",
    "model": "gemini-1.5-pro"
  }
}
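A small client-side helper can keep the history bookkeeping in one place. This is a sketch, not part of the API: `build_payload` is a hypothetical name, and the defaults mirror the request body table above.

```python
def build_payload(history, user_message, model='gemini-2.0-flash', mcp_servers=None):
    """Build the request body for the next turn by appending the new
    user message to the existing conversation history."""
    messages = history + [{'role': 'user', 'content': user_message}]
    return {
        'messages': messages,
        'model': model,
        'mcpServers': mcp_servers or [],
    }

history = [
    {'role': 'user', 'content': 'What is machine learning?'},
    {'role': 'assistant', 'content': 'Machine learning is a subset of AI...'},
]
payload = build_payload(history, 'How is it different from deep learning?')
```

After a successful call, append `{'role': 'assistant', 'content': data['content']}` to the history before building the next turn.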

With MCP Tools

Enable MCP servers for extended capabilities:

curl -X POST https://www.girardai.com/api/chat \
  -H "Authorization: Bearer sk_live_xxx" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "Search the web for the latest news about AI regulations"
      }
    ],
    "mcpServers": ["web-search"]
  }'

Response:

{
  "success": true,
  "data": {
    "content": "Based on my search, here are the latest developments in AI regulations:\n\n1. The EU AI Act...\n2. US Executive Order on AI...",
    "model": "gemini-2.0-flash",
    "toolCalls": [
      {
        "name": "web-search_search_web",
        "args": {
          "query": "AI regulations latest news 2025"
        },
        "result": "{\"results\": [{\"title\": \"EU AI Act Implementation\"...}]}"
      }
    ]
  }
}

Database Query Example

Query a database using MCP tools:

curl -X POST https://www.girardai.com/api/chat \
  -H "Authorization: Bearer sk_live_xxx" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "Show me the top 5 customers by order count"
      }
    ],
    "mcpServers": ["postgres"]
  }'

Response:

{
  "success": true,
  "data": {
    "content": "Here are your top 5 customers by order count:\n\n| Customer | Orders |\n|----------|--------|\n| Acme Corp | 156 |\n| Tech Inc | 142 |\n...",
    "model": "gemini-2.0-flash",
    "toolCalls": [
      {
        "name": "postgres_query_database",
        "args": {
          "query": "SELECT customer_name, COUNT(*) as order_count FROM orders GROUP BY customer_name ORDER BY order_count DESC LIMIT 5"
        },
        "result": "{\"rows\": [{\"customer_name\": \"Acme Corp\", \"order_count\": 156}...]}"
      }
    ]
  }
}

Response Format

Success Response

{
  "success": true,
  "data": {
    "content": "AI response text...",
    "model": "gemini-2.0-flash",
    "toolCalls": [
      {
        "name": "tool_name",
        "args": {},
        "result": "tool result"
      }
    ]
  }
}
| Field | Type | Description |
|-------|------|-------------|
| content | string | AI response text |
| model | string | Model used for response |
| toolCalls | array | Tool calls made (if any) |

Tool Call Object

| Field | Type | Description |
|-------|------|-------------|
| name | string | Tool name (server_function) |
| args | object | Arguments passed to tool |
| result | string | Tool execution result |
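In the examples above, result arrives as a JSON-encoded string rather than a nested object, so callers typically decode it before use. A minimal sketch (`parse_tool_calls` is a hypothetical helper, and it assumes some results may not be valid JSON):

```python
import json

def parse_tool_calls(data):
    """Decode each tool call's JSON-encoded result string into a Python
    object, falling back to the raw string when it is not valid JSON."""
    parsed = []
    for call in data.get('toolCalls') or []:
        try:
            result = json.loads(call['result'])
        except (json.JSONDecodeError, TypeError):
            result = call['result']
        parsed.append({'name': call['name'], 'args': call['args'], 'result': result})
    return parsed

calls = parse_tool_calls({
    'content': '...',
    'model': 'gemini-2.0-flash',
    'toolCalls': [{
        'name': 'postgres_query_database',
        'args': {'query': 'SELECT 1'},
        'result': '{"rows": [{"n": 1}]}',
    }],
})
```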

Code Examples

JavaScript/TypeScript

interface Message {
  role: 'user' | 'assistant';
  content: string;
}

interface ChatResponse {
  success: boolean;
  data: {
    content: string;
    model: string;
    toolCalls?: Array<{
      name: string;
      args: Record<string, unknown>;
      result: string;
    }>;
  };
}

async function chat(
  messages: Message[],
  options?: {
    model?: string;
    mcpServers?: string[];
  }
): Promise<ChatResponse> {
  const response = await fetch('https://www.girardai.com/api/chat', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.GIRARDAI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      messages,
      model: options?.model || 'gemini-2.0-flash',
      mcpServers: options?.mcpServers || [],
    }),
  });

  return response.json();
}

// Usage
const result = await chat([
  { role: 'user', content: 'Hello, how are you?' }
]);

console.log(result.data.content);

Python

import requests
import os

def chat(messages, model='gemini-2.0-flash', mcp_servers=None):
    """Send a chat message to Girard."""
    response = requests.post(
        'https://www.girardai.com/api/chat',
        headers={
            'Authorization': f'Bearer {os.environ["GIRARDAI_API_KEY"]}',
            'Content-Type': 'application/json',
        },
        json={
            'messages': messages,
            'model': model,
            'mcpServers': mcp_servers or [],
        }
    )
    return response.json()

# Usage
result = chat([
    {'role': 'user', 'content': 'What is the capital of France?'}
])

print(result['data']['content'])

Python (Async)

import httpx
import os

async def chat_async(messages, model='gemini-2.0-flash', mcp_servers=None):
    """Send a chat message to Girard asynchronously."""
    async with httpx.AsyncClient() as client:
        response = await client.post(
            'https://www.girardai.com/api/chat',
            headers={
                'Authorization': f'Bearer {os.environ["GIRARDAI_API_KEY"]}',
                'Content-Type': 'application/json',
            },
            json={
                'messages': messages,
                'model': model,
                'mcpServers': mcp_servers or [],
            }
        )
        return response.json()

# Usage
import asyncio

async def main():
    result = await chat_async([
        {'role': 'user', 'content': 'Tell me a joke'}
    ])
    print(result['data']['content'])

asyncio.run(main())

Error Responses

Error Format

{
  "success": false,
  "error": "Error message"
}

Common Errors

| Status | Error | Description |
|--------|-------|-------------|
| 400 | "Messages are required" | Missing messages array |
| 401 | "Unauthorized" | Invalid API key |
| 400 | "No organization found" | User has no organization |
| 503 | "Chat service not configured" | Server configuration issue |
| 500 | Error message | Internal server error |
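The error format above lends itself to a small wrapper that raises instead of silently returning failures. This is a sketch; `ChatError` and `check_response` are hypothetical names, not part of any official SDK:

```python
class ChatError(Exception):
    """Raised when the Chat API returns a non-success response."""
    def __init__(self, status, message):
        super().__init__(f'{status}: {message}')
        self.status = status
        self.message = message

def check_response(status_code, body):
    """Return the data payload on success; raise ChatError otherwise."""
    if status_code == 200 and body.get('success'):
        return body['data']
    raise ChatError(status_code, body.get('error', 'Unknown error'))
```

With the requests-based example above, this would be used as `data = check_response(response.status_code, response.json())`.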

Best Practices

Conversation Management

  1. Include relevant history - Pass previous messages for context
  2. Trim old messages - Keep conversation length manageable
  3. Summarize long conversations - Prevent context overflow
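A simple way to implement point 2 is a sliding window over the history (a sketch; `trim_history` is a hypothetical helper, and the cutoff of 20 messages is an arbitrary assumption):

```python
def trim_history(messages, max_messages=20):
    """Keep only the most recent messages, then drop any leading
    assistant messages so the window starts with a user turn."""
    if len(messages) <= max_messages:
        return messages
    trimmed = messages[-max_messages:]
    while trimmed and trimmed[0]['role'] != 'user':
        trimmed = trimmed[1:]
    return trimmed
```

For very long conversations (point 3), a further step is to replace the dropped prefix with a single summary message rather than discarding it outright.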

Tool Usage

  1. Enable only needed tools - Reduces complexity
  2. Handle tool failures - Tools may occasionally fail
  3. Review tool results - Verify data accuracy

Performance

  1. Use appropriate model - Flash for speed, Pro for complex tasks
  2. Batch similar requests - Reduce API calls
  3. Cache responses - For repeated queries
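Caching (point 3) is best limited to tool-free requests, since responses that used MCP tools can change between identical calls. One sketch is to derive a deterministic cache key from the request payload; `cache_key` is a hypothetical helper:

```python
import hashlib
import json

def cache_key(messages, model, mcp_servers):
    """Derive a deterministic key from the full request payload by
    hashing its canonical (sorted-keys) JSON serialization."""
    payload = json.dumps(
        {'messages': messages, 'model': model, 'mcpServers': sorted(mcp_servers)},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode('utf-8')).hexdigest()
```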

Credit Usage

| Action | Credits |
|--------|---------|
| Chat message | 1 |
| With tool calls | 1 + tool overhead |

Previous: Studio API | Next: Agents API
