
OpenClaw with GPT-5: Setup and Best Practices

OpenClaw + GPT-5: configure DenchClaw to use OpenAI's GPT-5 model. Complete setup guide, best practices, and tips for getting the most from your AI CRM.

Mark Rachapoom · 8 min read

OpenClaw connects to GPT-5 through OpenAI's standard API, making it one of the easiest model configurations to get working with DenchClaw. If you want maximum reasoning capability for complex CRM tasks — pipeline analysis, lead scoring, research synthesis — GPT-5 is a strong default. Here's how to set it up properly.

What GPT-5 Brings to DenchClaw#

GPT-5 is OpenAI's most capable general-purpose model. For DenchClaw specifically, it excels at:

  • Complex multi-step reasoning — planning sequences of CRM operations, synthesizing research from multiple sources
  • Structured output generation — reliable JSON, formatted reports, and precise data transformations
  • Code generation — writing custom Skills, query logic, and automation scripts
  • Nuanced writing — outreach emails, proposals, and summaries that actually read well
  • Tool use — GPT-5's function calling is mature and reliable, which DenchClaw Skills depend on heavily

If you're running heavy agent workloads or building custom Skills, GPT-5 is the model most likely to handle edge cases correctly.

Step 1: Get Your OpenAI API Key#

  1. Go to platform.openai.com
  2. Navigate to API Keys in your account settings
  3. Click Create new secret key
  4. Copy and store it securely — OpenAI only shows it once

Make sure your OpenAI account has GPT-5 access. At launch, GPT-5 may be gated to certain tiers. Check your account's model access in Settings → Limits.

Step 2: Add the API Key to OpenClaw#

openclaw config set apiKeys.openai YOUR_OPENAI_API_KEY

Or directly in ~/.openclaw/config.json:

{
  "apiKeys": {
    "openai": "sk-..."
  }
}

Step 3: Configure GPT-5 as Your Model#

openclaw config set model openai/gpt-5

Or in the config file:

{
  "model": {
    "provider": "openai",
    "model": "gpt-5"
  }
}

Restart OpenClaw:

openclaw restart

Step 4: Test the Connection#

openclaw chat "Summarize the current state of my CRM pipeline"

A well-configured GPT-5 setup will return a structured, thoughtful response. If you get an auth error, verify your API key. If you get a model access error, check your OpenAI account tier.

Understanding OpenAI Model Variants#

OpenAI typically offers multiple variants of each generation. For GPT-5, expect something like:

| Model | Best For | Speed | Cost |
|---|---|---|---|
| gpt-5 | Balanced default | Moderate | Medium |
| gpt-5-turbo | Fast, lower cost | Fast | Lower |
| gpt-5-mini | Simple tasks, high volume | Very fast | Lowest |
| gpt-5-preview | Latest features | Varies | High |

Start with gpt-5 for general use. Switch to gpt-5-turbo for high-frequency operations where you're watching costs, or gpt-5-mini for simple queries that don't need the full model's power.
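Switching variants uses the same config shape as Step 3. For example, pointing DenchClaw at gpt-5-mini (assuming that variant name matches what your OpenAI account exposes):

```json
{
  "model": {
    "provider": "openai",
    "model": "gpt-5-mini"
  }
}
```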

Best Practices for GPT-5 with DenchClaw#

1. Use System Prompts for Consistent Behavior#

GPT-5 is highly steerable. DenchClaw passes system prompts to every model call, and Skills use them to define agent behavior. If you're customizing Skills, be specific:

You are a CRM analyst for a B2B SaaS company. 
When analyzing pipeline data:
- Always include deal values in USD
- Flag deals with no activity in 14+ days as "at risk"
- Prioritize deals closing within 30 days

Specificity improves output quality significantly more than it does with smaller models.

2. Leverage Structured Output Mode#

GPT-5 supports structured output (guaranteed valid JSON matching a schema). For Skills that produce data to be stored back in DuckDB, this eliminates parsing failures:

{
  "model": "gpt-5",
  "response_format": { "type": "json_schema", "schema": {...} }
}

OpenClaw handles this automatically for built-in Skills, but if you're building custom ones, use structured output mode.
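If you're writing a custom Skill's API call yourself, the request body looks roughly like the sketch below. The `lead_score` schema and field names are hypothetical, purely for illustration; the `response_format` shape follows OpenAI's Chat Completions structured-output convention:

```python
import json

# Hypothetical schema for a lead-scoring Skill — the field names are
# illustrative, not part of DenchClaw's built-in Skills.
LEAD_SCORE_SCHEMA = {
    "type": "object",
    "properties": {
        "lead_id": {"type": "string"},
        "score": {"type": "integer"},
        "reason": {"type": "string"},
    },
    "required": ["lead_id", "score", "reason"],
    "additionalProperties": False,
}

def build_request(prompt: str) -> dict:
    """Assemble a Chat Completions request body with structured output enabled."""
    return {
        "model": "gpt-5",
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {
            "type": "json_schema",
            "json_schema": {
                "name": "lead_score",
                "strict": True,
                "schema": LEAD_SCORE_SCHEMA,
            },
        },
    }

def parse_response(content: str) -> dict:
    """With structured output enabled, the returned content is guaranteed
    to be valid JSON matching the schema, so parsing cannot fail."""
    return json.loads(content)
```

The payoff is the `parse_response` step: no regex extraction, no retry-on-bad-JSON logic.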

3. Configure Temperature for Your Use Case#

For CRM data tasks (extracting information, categorizing leads, generating structured reports), use lower temperature:

{
  "model": {
    "provider": "openai",
    "model": "gpt-5",
    "temperature": 0.2
  }
}

For creative tasks (writing outreach emails, generating content), increase it:

{ "temperature": 0.7 }

4. Set Appropriate Max Tokens#

GPT-5 can generate long responses. For most CRM tasks, cap max tokens to avoid unnecessarily verbose output:

{
  "model": {
    "provider": "openai", 
    "model": "gpt-5",
    "maxTokens": 2048
  }
}

Override this for tasks that legitimately need long output, like generating detailed reports or long-form content.

5. Use Streaming for Long Tasks#

For agent tasks that take a while, enable streaming to see output as it generates:

openclaw chat --stream "Analyze all 200 leads and create a prioritized outreach plan"

This gives you immediate feedback that the agent is working, rather than waiting for the full response.

Managing API Costs#

GPT-5 is more expensive than smaller models. Here's how to keep costs manageable:

Route tasks by complexity: Use GPT-5 for tasks that actually need it, and a cheaper model (or local model) for simple queries.

# Simple query — use mini or local model
openclaw chat --model openai/gpt-5-mini "How many contacts in my CRM?"
 
# Complex analysis — use full GPT-5
openclaw chat --model openai/gpt-5 "Identify which leads are most likely to convert based on engagement patterns"

Set a usage cap in your OpenAI account settings under Billing → Usage Limits. This prevents runaway costs from automation gone wrong.

Cache common queries: If you're running the same analysis repeatedly, consider saving the output to a DenchClaw document rather than re-running the API call.
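If you're calling the API from a custom script rather than through OpenClaw, a minimal file-based cache keyed on model + prompt avoids paying twice for identical queries. This is a sketch, and the cache location is a hypothetical choice, not an OpenClaw convention:

```python
import hashlib
import json
from pathlib import Path

CACHE_DIR = Path.home() / ".openclaw" / "query-cache"  # hypothetical location

def cache_key(model: str, prompt: str) -> str:
    """Stable key: the same model + prompt always hashes to the same file name."""
    return hashlib.sha256(f"{model}\n{prompt}".encode()).hexdigest()

def cached_call(model: str, prompt: str, call_api, cache_dir: Path = CACHE_DIR) -> str:
    """Return a cached response if one exists; otherwise call the API and store it."""
    cache_dir.mkdir(parents=True, exist_ok=True)
    path = cache_dir / f"{cache_key(model, prompt)}.json"
    if path.exists():
        return json.loads(path.read_text())["response"]
    response = call_api(model, prompt)  # your actual OpenAI SDK call goes here
    path.write_text(json.dumps({"response": response}))
    return response
```

Note this only makes sense for deterministic-ish analysis queries; don't cache anything whose underlying CRM data changes between runs.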

GPT-5 and DenchClaw's Skills System#

The Skills system is where GPT-5 really earns its keep. Complex Skills like the CRM Analyst use multi-step reasoning:

  1. Query DuckDB for relevant data
  2. Analyze patterns and relationships
  3. Generate recommendations
  4. Format the output as a report or update records

Each step requires reliable instruction following and structured output. GPT-5's mature tool-calling and reasoning make multi-step Skills execute more reliably than smaller models do.
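The four steps above can be sketched as a plain pipeline. Everything here is illustrative stub data, not DenchClaw internals; in the real Skill, the model drives each step via tool calls:

```python
def query_pipeline(rows):
    """Step 1: pull relevant deals (stubbed here instead of a DuckDB query)."""
    return [r for r in rows if r["stage"] != "closed"]

def analyze(deals):
    """Step 2: find patterns — here, deals idle for 14+ days."""
    return [d for d in deals if d["days_idle"] >= 14]

def recommend(at_risk):
    """Step 3: turn each finding into an action."""
    return [f"Re-engage {d['name']} (idle {d['days_idle']} days)" for d in at_risk]

def report(actions):
    """Step 4: format the output as a short report."""
    return "At-risk deals:\n" + "\n".join(f"- {a}" for a in actions)

rows = [
    {"name": "Acme", "stage": "open", "days_idle": 21},
    {"name": "Globex", "stage": "open", "days_idle": 3},
    {"name": "Initech", "stage": "closed", "days_idle": 40},
]
print(report(recommend(analyze(query_pipeline(rows)))))
```

The fragility of real Skills lives between these steps: each hand-off depends on the model emitting output the next step can consume, which is exactly where structured output and strong tool calling matter.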

When you install a new Skill via:

openclaw skills install crm-analyst

it uses your configured default model. You can pin specific Skills to specific models in the Skill's configuration if needed.

Combining GPT-5 with Local Data#

Your CRM data in DuckDB never leaves your machine — only the query context you explicitly send to OpenAI goes over the wire. This is an important distinction:

  • DuckDB data: stays on your machine, always
  • Query context: sent to OpenAI's API for inference
  • Response: returned to your machine and potentially stored in DuckDB

If you're concerned about sending specific data to OpenAI, use the local model setup with Ollama or LM Studio for sensitive queries and route only non-sensitive operations to GPT-5.
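One way to enforce that split in a custom script is a simple field-based router. The sensitive-field list and the local model identifier below are hypothetical placeholders; substitute whatever your Ollama or LM Studio setup actually serves:

```python
# Hypothetical policy: any query touching these CRM fields stays local.
SENSITIVE_FIELDS = {"email", "phone", "notes"}

def choose_model(query_fields):
    """Route queries touching sensitive fields to a local model; the rest to GPT-5."""
    if SENSITIVE_FIELDS & set(query_fields):
        return "ollama/llama3"   # local — data never leaves the machine
    return "openai/gpt-5"        # remote — context is sent to OpenAI

# Example: a pipeline-stage rollup can go remote, a contact export cannot.
choose_model(["stage", "deal_value"])   # routes to GPT-5
choose_model(["email", "stage"])        # routes to the local model
```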

Troubleshooting#

"Model not found" or access errors

GPT-5 access may be gated. Check your OpenAI account's model access list at platform.openai.com/account/limits. You may need to upgrade your plan or join a waitlist.

"Insufficient quota" errors

Your OpenAI usage has exceeded your billing limit. Either add credits to your account or raise the usage cap in billing settings.

Slow responses

GPT-5 is more compute-intensive than GPT-4o. For latency-sensitive workflows, consider using gpt-5-turbo or gpt-5-mini for most operations. You can also enable streaming to get partial output sooner.

Function calling fails or returns unexpected format

Make sure you're using the correct model identifier — some older model strings behave differently. The most reliable string is typically the full versioned name (e.g., gpt-5-2026-03-01). Check OpenAI's model reference docs for the current canonical name.

FAQ#

Is GPT-5 available on the free OpenAI tier?

Likely not at launch — new flagship models typically require a paid plan. Check your account at platform.openai.com for current access requirements.

Should I use GPT-5 or GPT-4o for DenchClaw?

If GPT-5 is available to you, it's better for complex reasoning tasks. For simpler, high-frequency operations, GPT-4o-mini or GPT-5-mini offer a much better cost-to-capability ratio. Many DenchClaw users keep GPT-4o-class models as their daily driver and reserve GPT-5 for specific complex tasks.

How does OpenClaw handle OpenAI rate limits?

OpenClaw doesn't automatically retry rate-limited requests, but you can configure delays between agent steps or use multiple API keys (with different accounts) for high-volume workloads. OpenAI Tier 4-5 accounts have significantly higher rate limits.
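Since OpenClaw won't retry for you, custom scripts should handle 429s themselves. A standard pattern is exponential backoff with jitter; this sketch uses `RuntimeError` as a stand-in so it runs standalone, where a real script would catch the OpenAI SDK's `RateLimitError`:

```python
import random
import time

def call_with_backoff(make_request, max_retries=5, base_delay=1.0):
    """Retry a rate-limited call, doubling the wait each attempt plus jitter."""
    for attempt in range(max_retries):
        try:
            return make_request()
        except RuntimeError:  # stand-in for the SDK's rate-limit exception
            if attempt == max_retries - 1:
                raise  # out of retries — surface the error to the caller
            # 2**attempt doubles the wait; random jitter de-synchronizes
            # concurrent workers hammering the same rate limit.
            time.sleep(base_delay * (2 ** attempt + random.random()))
```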

Can I use OpenAI's batch API for bulk CRM enrichment?

Not directly through OpenClaw's standard interface, but you can write a custom script using the OpenAI SDK alongside DenchClaw's DuckDB database. This is a good approach for one-time large-scale enrichment jobs where latency doesn't matter.
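The Batch API takes a JSONL file with one request per line, each tagged with a `custom_id` you can join back to your CRM rows. A sketch of building that input — in practice the contacts would come from DenchClaw's DuckDB file, but here they're a plain list, and the enrichment prompt is illustrative:

```python
import json

def batch_lines(contacts, model="gpt-5-mini"):
    """Build Batch API input: one JSON object per line, joined back via custom_id."""
    lines = []
    for c in contacts:
        lines.append(json.dumps({
            "custom_id": f"contact-{c['id']}",
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": model,
                "messages": [{
                    "role": "user",
                    "content": f"Classify the industry of: {c['company']}",
                }],
            },
        }))
    return "\n".join(lines)

# Write the JSONL, upload it via the OpenAI SDK, then merge results into DuckDB.
jsonl = batch_lines([{"id": 1, "company": "Acme"}, {"id": 2, "company": "Globex"}])
```

Batch jobs complete asynchronously at a discount, which is why this fits one-time enrichment where latency doesn't matter.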

Does OpenClaw support OpenAI's Assistants API?

OpenClaw currently uses the completions/chat API directly. The Assistants API's thread management would conflict with OpenClaw's own context management, so the direct API is the right integration point.

Ready to try DenchClaw? Install in one command: npx denchclaw. Full setup guide →

Written by Mark Rachapoom, building the future of AI CRM software.
© 2026 DenchHQ · San Francisco, CA