Marvin
Marvin takes a 'Python-native' approach to LLM development: decorate a function with `@marvin.fn` and let an LLM implement it, or pass a Pydantic model to `marvin.extract` to pull typed data from text. It's designed to feel like adding AI to existing Python code rather than building a new application stack, which makes it a favourite for data pipelines and ETL-style tasks.
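The decorator pattern described above can be sketched without an API key by stubbing out the model call. Note this is an illustrative stand-in, not Marvin's actual implementation: `fake_fn` is a hypothetical name, and in real Marvin the `@marvin.fn` decorator sends the function's signature, type hints, and docstring to an LLM, which supplies the return value.

```python
from functools import wraps

def fake_fn(func):
    """Hypothetical stand-in for marvin.fn: a real implementation would
    build a prompt from func.__doc__ and func.__annotations__, call an
    LLM, and parse the reply into the annotated return type."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        # Stubbed "model": echo what would be delegated to the LLM.
        return f"[LLM would implement {func.__name__}{args}]"
    return wrapper

@fake_fn
def sentiment(text: str) -> str:
    """Classify the sentiment of `text` as 'positive' or 'negative'."""

print(sentiment("I love this"))
```

The decorated function body stays empty; only the signature and docstring carry meaning, which is what makes the style feel like ordinary Python rather than prompt engineering.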
Framework facts
- Category: orchestration
- Language: Python
- License: Apache 2.0
- Repository: https://github.com/prefecthq/marvin
Install
pip install marvin
Quickstart
import marvin
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int

people = marvin.extract(
    'Alice is 30 and Bob is 25',
    target=Person
)
# [Person(name='Alice', age=30), Person(name='Bob', age=25)]
Alternatives
- Instructor — more popular Pydantic-based extraction
- BAML — schema-first DSL alternative
- Outlines — constrained generation with regex
- Mirascope — decorator-based LLM library
Frequently asked questions
Marvin vs Instructor?
Instructor is more OpenAI-SDK-native and focused purely on structured output. Marvin is more opinionated and includes higher-level helpers (classify, cast, generate). Pick Instructor for minimal overhead, Marvin for more batteries-included data tasks.
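Whichever library you pick, the typed guarantee ultimately rests on Pydantic validating the model's raw output against your schema. A minimal sketch of that validation step, independent of either library (the `raw` dicts simulate parsed LLM output and are made up for illustration):

```python
from pydantic import BaseModel, ValidationError

class Person(BaseModel):
    name: str
    age: int

# Simulated raw LLM output: dicts that must conform to the schema.
raw = [{"name": "Alice", "age": "30"}, {"name": "Bob", "age": 25}]

# Pydantic's lax mode coerces the string "30" to the int 30.
people = [Person.model_validate(r) for r in raw]
print(people)

try:
    Person.model_validate({"name": "Carol"})  # missing required field
except ValidationError as exc:
    print("rejected:", exc.error_count(), "error(s)")
```

This is why both tools pair so naturally with data pipelines: malformed model output fails loudly at the schema boundary instead of propagating downstream.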
Is Marvin still active?
Yes — it's maintained by Prefect and has seen steady 3.x releases with support for Anthropic, OpenAI, and custom providers.
Sources
- Marvin — docs — accessed 2026-04-20
- Marvin on GitHub — accessed 2026-04-20