Automated LLM Monitoring is the easiest and most powerful way to gain comprehensive insights into your AI agents’ interactions. Instead of manually tracking each event, you can simply wrap your existing LLM client with one of our specialized monitors. This approach automatically captures a rich set of data with minimal code changes, allowing you to focus on building your application while ensuring complete governance.

Why Use Automated Monitoring?

  • Effortless Integration: Add comprehensive monitoring to your application in just a few lines of code.
  • Complete Data Capture: Automatically track requests, responses, latency, token usage, and costs without manual effort.
  • Real-time Compliance: Seamlessly integrate with the Compliance Engine to scan all LLM responses for potential violations.
  • Production Ready: Designed for high performance with efficient batching and asynchronous processing, ensuring minimal impact on your application’s latency.

Supported Integrations

We provide dedicated monitors for the most popular LLM providers in the financial services industry, including Anthropic (AnthropicAgentMonitor) and OpenAI (OpenAIAgentMonitor).

How It Works

The automated monitoring process is straightforward:

1. Choose the Right Monitor

Instead of the base AgentMonitor, you instantiate a provider-specific monitor, such as AnthropicAgentMonitor or OpenAIAgentMonitor.

2. Register Your Agent

Just as with manual tracking, you register your agent’s profile, including its model and compliance settings.

3. Wrap Your LLM Client

Use the provided wrapper method (e.g., wrapAnthropic or wrapOpenAI) on your existing LLM client instance.

4. Use as Normal

Use the newly created “monitored” client exactly as you would the original. The wrapper intercepts the API calls, captures the data, and then passes the request to the provider.
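The intercept–capture–delegate flow described in step 4 can be sketched generically. This is a minimal illustration of the wrapper pattern, not the SDK's real implementation; FakeClient, wrapClient, and CapturedCall are hypothetical names invented for the example:

```typescript
// Illustrative sketch of the wrapper pattern: intercept the call,
// record timing and usage, then pass the request through unchanged.
interface Completion { text: string; tokens: number }

// Hypothetical stand-in for a real provider SDK client.
class FakeClient {
  async create(prompt: string): Promise<Completion> {
    return { text: `echo: ${prompt}`, tokens: prompt.length };
  }
}

interface CapturedCall { prompt: string; tokens: number; latencyMs: number }

function wrapClient(client: FakeClient, captured: CapturedCall[]): FakeClient {
  const wrapped = new FakeClient();
  wrapped.create = async (prompt: string) => {
    const start = Date.now();
    const response = await client.create(prompt); // delegate to the provider
    captured.push({ prompt, tokens: response.tokens, latencyMs: Date.now() - start });
    return response; // caller sees exactly the original result
  };
  return wrapped;
}
```

Because the wrapped client returns the provider's response untouched, calling code needs no changes — which is why step 4 is simply "use as normal."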

Example: Wrapping the Anthropic Client

This example shows how simple it is to add monitoring to an existing application using the Anthropic SDK.
import { AnthropicAgentMonitor } from '@agent-governance/node';
import Anthropic from '@anthropic-ai/sdk';

// 1. Initialize the specialized monitor
const monitor = new AnthropicAgentMonitor({
  apiKey: process.env.AGENT_GOVERNANCE_API_KEY,
  organizationId: 'your-org-id',
});

// 2. Register the agent
await monitor.registerAgent({
  id: 'claude-banking-agent',
  name: 'Claude Banking Assistant',
  llmProvider: 'anthropic',
  model: 'claude-3-5-sonnet-20241022',
  // ... other agent details
});

// 3. Wrap the existing Anthropic client
const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
const monitoredAnthropic = monitor.wrapAnthropic(anthropic, 'claude-banking-agent');

// 4. Use the monitored client as you normally would
const response = await monitoredAnthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  messages: [{ role: 'user', content: 'What is my account balance?' }],
  sessionId: 'customer-session-123'
});

All tracking, including latency, tokens, cost, and compliance, is handled automatically!

What’s Tracked Automatically?

By using an integration wrapper, you automatically capture the following for each LLM call:

  • The full request and response
  • Latency
  • Token usage
  • Cost
  • Compliance scan results

While automated monitoring is powerful, you can still use manual tracking methods like monitor.trackToolCall() or monitor.trackError() alongside the wrappers to add even more context to your workflows.
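As a rough sketch of how manual and automatic events can share one timeline (the EventLog class below and its method names are hypothetical, loosely modeled on the trackToolCall and trackError methods mentioned above):

```typescript
// Illustrative only: manual events (tool calls, errors) interleave with
// automatically captured LLM calls in a single ordered timeline.
type AgentEvent =
  | { kind: 'llm_call'; model: string }   // captured by the wrapper
  | { kind: 'tool_call'; tool: string }   // added manually
  | { kind: 'error'; message: string };   // added manually

class EventLog {
  readonly events: AgentEvent[] = [];
  trackLlmCall(model: string) { this.events.push({ kind: 'llm_call', model }); }
  trackToolCall(tool: string) { this.events.push({ kind: 'tool_call', tool }); }
  trackError(message: string) { this.events.push({ kind: 'error', message }); }
}

const log = new EventLog();
log.trackLlmCall('claude-3-5-sonnet-20241022'); // recorded by the wrapper
log.trackToolCall('account_balance_lookup');    // recorded manually by your code
```

Mixing the two gives a fuller picture of a workflow than LLM calls alone, since tool invocations and errors appear in sequence next to the calls that triggered them.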

Next Steps