Capability · Framework — observability

Arize Phoenix

Phoenix is Arize AI's open-source complement to their enterprise platform. It captures OpenInference traces from LangChain, LlamaIndex, DSPy, and the OpenAI SDK, stores them in a local Python app you can `pip install`, and layers evaluations, datasets, and annotation workflows on top. It has become a de-facto default observability layer for AI engineers.

Framework facts

Category
observability
Language
Python / TypeScript
License
Elastic License v2
Repository
https://github.com/Arize-ai/phoenix

Install

pip install arize-phoenix openinference-instrumentation-openai
# start the UI
python -m phoenix.server.main serve

Quickstart

from phoenix.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor
from openai import OpenAI

# register() returns a TracerProvider; passing it to the instrumentor
# routes spans to the Phoenix collector explicitly
tracer_provider = register(project_name='my-app', endpoint='http://localhost:6006/v1/traces')
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

OpenAI().chat.completions.create(
    model='gpt-4o-mini',
    messages=[{'role': 'user', 'content': 'hello'}]
)
# traces now appear at http://localhost:6006

Alternatives

  • Langfuse — open-source tracing alternative
  • LangSmith — LangChain's hosted observability
  • Weights & Biases Weave — W&B's LLM product
  • Helicone — proxy-based observability

Frequently asked questions

Phoenix vs LangSmith?

Phoenix is open-source and framework-agnostic (it ingests OpenInference traces from any instrumented SDK). LangSmith is a hosted service that integrates most tightly with LangChain. Many teams run Phoenix locally for development and LangSmith or Arize Cloud in production.

Is Phoenix free for commercial use?

Yes. Under Elastic License v2 it is free to use in production, as long as you don't offer it as a multi-tenant managed service that competes with Arize.

Sources

  1. Phoenix — docs — accessed 2026-04-20
  2. Phoenix GitHub — accessed 2026-04-20