Capability · Framework — observability

Comet LLM / Opik-Comet

Comet is a mature ML experiment tracking platform that expanded into LLMOps first with CometLLM (prompt and response logging) and then with Opik, its dedicated open-source LLM tracing and evaluation product. For teams already using Comet for model training runs, this gives a single pane of glass across classic ML and LLM workflows — dashboards, artifacts, compare views — without buying a second SaaS.

Framework facts

Category: observability
Language: Python / TypeScript
License: Apache-2.0 (Opik OSS) / proprietary SaaS (Comet)

Install

pip install opik
# or, for legacy pipelines:
pip install comet-llm

Quickstart

import openai
import opik
from opik.integrations.openai import track_openai

# One-time setup: point the SDK at your Comet/Opik workspace.
opik.configure(api_key="COMET_KEY")

# Wrap the OpenAI client so every completion call is logged as a trace.
client = track_openai(openai.OpenAI())

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "hi"}],
)
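Conceptually, track_openai is a wrapper: it intercepts each call and records the inputs, output, and latency as a trace. A minimal stdlib-only sketch of that wrap-and-log pattern (the `track` decorator and `traces` list here are illustrative stand-ins, not Opik's API):

```python
import functools
import time

traces = []  # illustrative in-memory trace store, not Opik's backend

def track(fn):
    """Record inputs, output, and latency of each call — the same
    wrap-and-log pattern that track_openai applies to the client."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        traces.append({
            "name": fn.__name__,
            "input": {"args": args, "kwargs": kwargs},
            "output": result,
            "latency_s": time.perf_counter() - start,
        })
        return result
    return wrapper

@track
def fake_completion(model, messages):
    # Stand-in for client.chat.completions.create
    return {"model": model, "content": "hi there"}

fake_completion("gpt-4o", [{"role": "user", "content": "hi"}])
```

The real integration additionally captures token usage and nests traces into spans, but the wrapping mechanism is the same.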

Alternatives

  • Weights & Biases Weave
  • MLflow (LLM tracing)
  • Langfuse (OSS)

Frequently asked questions

CometLLM or Opik?

Opik is the current direction: open source, richer tracing, better agent support. CometLLM remains available for legacy pipelines, but new projects should start with Opik.

Can I use Opik without Comet?

Yes. Opik is Apache-2.0 and self-hostable; Comet's SaaS just hosts the same server with added team features and integrations with Comet experiments.
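For a self-hosted server, the SDK is pointed away from Comet's SaaS at configure time. A hedged configuration sketch — the `use_local` flag follows Opik's documented configure options, but verify the exact flag and default server URL against your installed SDK version:

```python
import opik

# Point the SDK at a locally running Opik server instead of Comet's SaaS.
# `use_local=True` is taken from Opik's configure options; confirm it
# against your SDK version before relying on it.
opik.configure(use_local=True)
```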

Sources

  1. Opik docs — accessed 2026-04-20
  2. Comet LLMOps — accessed 2026-04-20