# How to Track Product Usage Data in Your CRM
Learn how to track product usage data in DenchClaw CRM to understand customer behavior, identify churn risk, and drive expansion revenue.
Product usage data is the most honest signal you have about customer health. Unlike NPS scores or check-in calls, usage data doesn't lie — it shows you exactly who's getting value and who's about to churn. Connecting that data to your CRM closes the loop between what customers do and what your sales and success teams know.
Here's how to set up product usage tracking in DenchClaw so that every account in your CRM reflects real behavioral data — not just what your customers say in meetings.
## Why Product Usage Belongs in Your CRM
Most teams keep product analytics in one tool (Mixpanel, Amplitude, PostHog) and customer data in another (HubSpot, Salesforce). The problem: your success manager is flying blind when they don't know that a key account hasn't logged in for 21 days, or that a user in a trial hit the export feature four times this week — a classic buying signal.
DenchClaw is a local-first CRM that stores everything in DuckDB on your machine. Because the database is directly queryable, you can import usage events from any analytics source and query them alongside contact and deal data instantly.
The workflow:
- Define usage fields on your customer or company object
- Import usage data (via API, webhook, or CSV)
- Build views and alerts around usage thresholds
- Let your AI agent surface at-risk accounts proactively
## Step 1: Add Usage Fields to Your Company Object
First, add the relevant usage fields to your company or customers object in DenchClaw.
Ask the agent:
"Add these fields to the company object: last_login_date (date), monthly_active_users (number), feature_adoption_score (number 0-100), sessions_last_30d (number), events_last_7d (number)"
Or do it manually in the `.object.yaml`:
```yaml
fields:
  - name: Last Login Date
    type: date
  - name: Monthly Active Users
    type: number
  - name: Feature Adoption Score
    type: number
  - name: Sessions Last 30d
    type: number
  - name: Events Last 7d
    type: number
  - name: Usage Tier
    type: enum
    options: [Power User, Active, At Risk, Churned]
```

Keep fields focused on the signals your team actually acts on. Drowning your CRM in 40 usage metrics makes the data invisible — pick the 5-6 that predict outcomes.
## Step 2: Import Usage Data into DuckDB

### Option A: Webhook from Your Analytics Tool
Most analytics platforms (PostHog, Amplitude, Segment) support webhooks. When a key event fires — daily active user sync, weekly summary, feature usage — POST the data to a DenchClaw webhook endpoint.
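Whatever the source, the payload needs mapping onto the field names from Step 1 before anything is written. A sketch of that translation step, where the payload keys are hypothetical and depend on your analytics tool:

```python
# Hypothetical daily-summary payload; the key names depend on your analytics tool.
payload = {
    "domain": "acme.com",
    "last_login": "2026-03-25",
    "sessions_30d": 45,
    "events_7d": 120,
}

# Map payload keys onto the DenchClaw field names defined in Step 1.
FIELD_MAP = {
    "last_login": "Last Login Date",
    "sessions_30d": "Sessions Last 30d",
    "events_7d": "Events Last 7d",
}

def payload_to_updates(payload):
    """Return (field_name, value) pairs for the keys we know how to store."""
    return [(FIELD_MAP[k], str(v)) for k, v in payload.items() if k in FIELD_MAP]
```

Unknown keys are dropped rather than written blindly, which keeps one-off analytics events from creating stray fields.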
Set up a webhook listener skill or use DenchClaw's built-in webhook handling:
```
# Your analytics tool POSTs to:
https://your-denchclaw-instance/webhooks/usage-sync
```

The agent receives the payload and runs an update query:
```sql
UPDATE entry_fields
SET value = '2026-03-25'  -- value taken from the webhook payload
WHERE entry_id = (
  SELECT e.id FROM entries e
  JOIN entry_fields ef ON e.id = ef.entry_id
  JOIN fields f ON ef.field_id = f.id
  WHERE f.name = 'Domain' AND ef.value = 'acme.com'
)
AND field_id = (SELECT id FROM fields WHERE name = 'Last Login Date');
```

### Option B: CSV Import via Action Field
If you export weekly usage CSVs from your analytics tool, create an Action field on the company object called "Sync Usage Data." The action script:
- Reads the CSV from a known path
- Matches rows by domain or account ID
- Updates the usage fields via DuckDB
```python
#!/usr/bin/env python3
import csv
import os

import duckdb

entry_id = os.environ.get("ENTRY_ID")
domain = os.environ.get("DOMAIN")
conn = duckdb.connect("/path/to/workspace.duckdb")

# Find the matching row in your usage CSV
with open("/path/to/usage-export.csv") as f:
    reader = csv.DictReader(f)
    for row in reader:
        if row["domain"] == domain:
            # Update fields
            print(f'{{"type":"status","message":"Found usage data for {domain}"}}')
            # ... update logic
```

### Option C: Direct SQL Insert via the Agent
For smaller teams, ask the agent to do manual syncs:
"Update Acme Corp's usage data: last login yesterday, 12 monthly active users, adoption score 67, 45 sessions last 30 days"
The agent writes the SQL directly to DuckDB. Simple, no infrastructure required.
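If you ever script this yourself, the write is just a parameterized version of the Option A query. A sketch (the helper name is mine; the entries/entry_fields/fields schema is as shown above):

```python
def build_usage_update(field_name, value, domain):
    """Build a parameterized UPDATE against the entries/entry_fields/fields schema."""
    sql = """
        UPDATE entry_fields
        SET value = ?
        WHERE entry_id = (
            SELECT e.id FROM entries e
            JOIN entry_fields ef ON e.id = ef.entry_id
            JOIN fields f ON ef.field_id = f.id
            WHERE f.name = 'Domain' AND ef.value = ?
        )
        AND field_id = (SELECT id FROM fields WHERE name = ?)
    """
    # Parameter order matches the ? placeholders above.
    return sql, [value, domain, field_name]
```

Passing values as parameters rather than interpolating them into the string keeps company data like `O'Brien & Co` from breaking the query.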
## Step 3: Build Usage-Based Views
With usage data in your CRM, the power comes from filtering. Create views that surface what matters:
### At-Risk Accounts View
```yaml
- name: At Risk Accounts
  filters:
    - field: Sessions Last 30d
      operator: less_than
      value: 5
    - field: Usage Tier
      operator: not_equals
      value: Churned
  sort:
    - field: Last Login Date
      direction: asc
```

### Power Users (Expansion Targets)
```yaml
- name: Expansion Targets
  filters:
    - field: Monthly Active Users
      operator: greater_than
      value: 10
    - field: Feature Adoption Score
      operator: greater_than
      value: 75
  sort:
    - field: Monthly Active Users
      direction: desc
```

Ask the agent: "Create a view showing all companies with sessions last 30 days under 5 and no churned status" — it writes the YAML automatically.
## Step 4: Automate Usage Tier Classification
Manually updating "Usage Tier" is tedious. Instead, create a scheduled agent task that recalculates tiers based on your usage field values.
Tell the agent:
"Every morning at 8am, update the Usage Tier field for all companies: 20+ sessions_last_30d = Power User, 10-19 = Active, 1-9 = At Risk, 0 = Churned"
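The thresholds form a simple cascade. For reference, here is the same classification in plain Python, using the `>=` boundaries that the generated SQL uses (the function name is illustrative):

```python
def usage_tier(sessions_last_30d):
    """Classify a company from its 30-day session count."""
    if sessions_last_30d >= 20:
        return "Power User"
    if sessions_last_30d >= 10:
        return "Active"
    if sessions_last_30d >= 1:
        return "At Risk"
    return "Churned"
```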
The agent creates a cron job that runs this SQL:
```sql
UPDATE entry_fields ef
SET value = CASE
    WHEN sessions.v >= 20 THEN 'Power User'
    WHEN sessions.v >= 10 THEN 'Active'
    WHEN sessions.v >= 1 THEN 'At Risk'
    ELSE 'Churned'
  END
FROM (
  SELECT entry_id, value::int AS v
  FROM entry_fields ef2
  JOIN fields f ON ef2.field_id = f.id
  WHERE f.name = 'Sessions Last 30d'
) sessions
WHERE ef.entry_id = sessions.entry_id
  AND ef.field_id = (SELECT id FROM fields WHERE name = 'Usage Tier');
```

## Step 5: Surface Usage Insights via Telegram
The best part about DenchClaw is that your usage data is always one message away. From Telegram or WhatsApp:
- "Show me all at-risk accounts with ARR over $10k"
- "Which enterprise customers haven't logged in this week?"
- "Who are my top 5 accounts by monthly active users?"
- "Alert me if any account with >$5k ARR drops below 3 sessions"
Because everything lives in local DuckDB, these queries run instantly — no API roundtrip to a third-party analytics platform.
## Building a Health Score Dashboard
Once usage data is flowing, build a Dench App dashboard that visualizes it:
```javascript
const atRisk = await dench.db.query(`
  SELECT
    ef_name.value AS company,
    ef_sessions.value::int AS sessions,
    ef_score.value::int AS score
  FROM entries e
  JOIN entry_fields ef_name ON e.id = ef_name.entry_id
  JOIN fields f_name ON ef_name.field_id = f_name.id AND f_name.name = 'Company Name'
  JOIN entry_fields ef_sessions ON e.id = ef_sessions.entry_id
  JOIN fields f_sessions ON ef_sessions.field_id = f_sessions.id AND f_sessions.name = 'Sessions Last 30d'
  JOIN entry_fields ef_score ON e.id = ef_score.entry_id
  JOIN fields f_score ON ef_score.field_id = f_score.id AND f_score.name = 'Feature Adoption Score'
  WHERE ef_sessions.value::int < 5
  ORDER BY ef_sessions.value::int ASC
  LIMIT 10
`);
```

This gives you a live "at-risk accounts" widget right in your DenchClaw sidebar.
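If you'd rather show one roll-up number per account than raw metrics, you can derive a composite score from the same fields. A sketch of a weighted 0-100 health score (the weights and function are illustrative, not a DenchClaw built-in):

```python
def health_score(sessions_30d, adoption_score, days_since_login):
    """Blend three usage signals into a 0-100 score. Weights are illustrative."""
    session_part = min(sessions_30d, 30) / 30 * 40                   # up to 40 points
    adoption_part = min(adoption_score, 100) / 100 * 40              # up to 40 points
    recency_part = max(0, 14 - min(days_since_login, 14)) / 14 * 20  # up to 20 points
    return round(session_part + adoption_part + recency_part)
```

Capping each input before weighting keeps one outlier metric (say, a bot hammering sessions) from masking weak adoption elsewhere.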
## Frequently Asked Questions

### How do I connect Mixpanel or Amplitude to DenchClaw?
The simplest path is a CSV export + import workflow. Export weekly usage summaries from Mixpanel/Amplitude, then use a DenchClaw action field or agent command to sync those values into your company records. For real-time sync, set up a webhook that POSTs to a DenchClaw endpoint when key events occur.
### What usage metrics should I track in my CRM?
Focus on the signals that predict outcomes for your business. Common choices: last login date (churn predictor), monthly active users (expansion indicator), key feature adoption (depth of engagement), session frequency (habit formation). More than 6-8 metrics becomes noise.
### Can I track individual user behavior, not just company-level data?
Yes. Create a users object linked to companies via a relation field. Add usage fields at the user level, then roll them up to the company via DuckDB aggregation queries. This lets you track which specific users are champions versus which are disengaged within a single account.
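In DuckDB that roll-up is a GROUP BY over the user-level rows; here is the equivalent aggregation in plain Python over exported data (the rows and field names are illustrative):

```python
from collections import defaultdict

# Illustrative user-level rows, e.g. exported from the users object.
user_rows = [
    {"company": "Acme", "user": "ana@acme.com", "sessions_30d": 22},
    {"company": "Acme", "user": "bo@acme.com", "sessions_30d": 3},
    {"company": "Initech", "user": "sam@initech.com", "sessions_30d": 7},
]

def rollup_sessions(rows):
    """Sum each company's user sessions, like SELECT company, SUM(sessions_30d) ... GROUP BY company."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["company"]] += row["sessions_30d"]
    return dict(totals)
```

The per-user rows also show you the spread: Acme's 25 sessions come almost entirely from one champion, which is itself a risk signal.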
### How often should I sync usage data?
For most teams, daily is sufficient. Real-time sync adds complexity; unless you're running automated playbooks that trigger within hours of a usage event, daily batch imports give you 95% of the value at 10% of the infrastructure cost.
### Does DenchClaw have native analytics integrations?
DenchClaw doesn't have point-and-click integrations with Mixpanel or PostHog yet — the integration happens via DuckDB SQL and the browser agent. Because DenchClaw uses your Chrome profile, the browser agent can log into any analytics tool you're already authenticated with and pull data automatically.
Ready to try DenchClaw? Install in one command: `npx denchclaw`. Full setup guide →
