LLM Observability

Monitor your OpenAI LLM spend with cost insights from Datadog

Learn how to use Cloud Cost Management and LLM Observability to optimize your OpenAI costs.

Optimize LLM application performance with Datadog's vLLM integration

Learn how to use our vLLM integration to monitor the performance and resource usage of your LLM workloads.

Best practices for monitoring LLM prompt injection attacks to protect sensitive data

Learn how to monitor prompt injection attacks and prevent your LLM applications from leaking sensitive data.

This Month in Datadog - October 2024

Get up to speed on LLM Observability's native integration with Google Gemini, unified Error Tracking, Security ...

Troubleshooting RAG-based LLM applications

Learn how to mitigate some of the common challenges of building RAG-based LLM applications.

Monitor your Azure OpenAI applications with Datadog LLM Observability

Learn how to use Datadog LLM Observability to monitor your Azure OpenAI applications.

Monitor your Anthropic applications with Datadog LLM Observability

Learn how Datadog LLM Observability helps you ensure the performance and safety of your Anthropic-powered LLM ...

Get granular LLM observability by instrumenting your LLM chains

Learn how to trace your LLM chains to closely monitor the performance and accuracy of your LLM applications ...

Recapping Microsoft Build 2024

A recap of the biggest new features from Microsoft Build 2024 from an observability perspective.

Monitor, troubleshoot, improve, and secure your LLM applications with Datadog LLM Observability

Learn how Datadog LLM Observability helps you troubleshoot issues in your LLM applications, improve their ...
