Logfire
Uncomplicated observability for Python and beyond, from the team behind Pydantic
Alternative To
- Datadog
- New Relic
- Sentry
Difficulty Level
For experienced users. Complex setup and configuration required.
Overview
Logfire is an uncomplicated observability platform built by the team behind Pydantic, designed with the same philosophy that powerful tools can be easy to use. It provides exceptional visibility into Python applications with rich display of Python objects, event-loop telemetry, and profiling capabilities. Built on OpenTelemetry, Logfire offers SQL-based querying, special Pydantic integration, and support for virtually any programming language.
System Requirements
- CPU: 2+ cores
- RAM: 4GB+
- GPU: Not required
- Storage: 1GB+
- Python: 3.8+ (for Python SDK)
Installation Guide
Prerequisites
- Python 3.8+ installed
- pip (Python package manager)
- Basic knowledge of Python programming
Installation
The simplest way to install Logfire is via pip:
```shell
pip install logfire
```
Setup
After installation, you’ll need to configure Logfire to connect to the cloud service:
Create a Logfire account at logfire.pydantic.dev
Initialize Logfire in your project:
```python
import logfire

# Configure Logfire (this will prompt for authentication if needed)
logfire.configure()

# Test the installation
logfire.info("Hello, Logfire!")
```

Authenticate via the CLI (alternative method):

```shell
logfire auth
```
You can store your credentials in a configuration file or environment variables to avoid authenticating each time.
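For non-interactive environments such as CI or containers, a sketch of the environment-variable approach (assuming the `LOGFIRE_TOKEN` variable the SDK conventionally reads; check the docs for your SDK version):

```shell
# Supply a write token via the environment so logfire.configure()
# does not prompt interactively.
export LOGFIRE_TOKEN="your-write-token"
python your_app.py
```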
Using with Frameworks
Logfire integrates with many popular Python frameworks and libraries. Here’s an example with FastAPI:
```shell
pip install 'logfire[fastapi]'
```

```python
import logfire
from fastapi import FastAPI

# Initialize Logfire
logfire.configure()

# Create and instrument a FastAPI app
app = FastAPI()
logfire.instrument_fastapi(app)

@app.get("/")
def read_root():
    return {"Hello": "World"}
```
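To generate some traces locally, run the app with Uvicorn (FastAPI's usual ASGI server) and send a request; this assumes the example is saved as `main.py`:

```shell
# Start the instrumented app, then hit the root endpoint
uvicorn main:app --reload
curl http://localhost:8000/
```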
Note: For detailed installation instructions specific to your operating system and environment, please refer to the official documentation.
Practical Exercise: Getting Started with Logfire
Let’s create a simple Python application with Logfire integration to monitor its behavior.
Step 1: Basic Logging
Create a new Python file called basic_logging.py:
```python
import logfire

# Configure Logfire
logfire.configure()

# Simple logging
logfire.info('Hello, {name}!', name='world')

# Using structured context
user = {
    'id': 12345,
    'name': 'Jane Doe',
    'email': 'jane@example.com',
    'roles': ['admin', 'user'],
}
logfire.info('User logged in: {user}', user=user)

# Logging errors with context
try:
    result = 10 / 0
except Exception as e:
    logfire.error('Division failed: {error}', error=e)
```
Run the script and check the Logfire dashboard to see your logs.
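As an aside, the `{name}` placeholders above are ordinary `str.format`-style templates: Logfire interpolates them into the displayed message while also keeping the keyword arguments as structured attributes. A toy sketch of just the formatting half (not the Logfire API):

```python
def format_log(template: str, **attributes) -> str:
    """Toy stand-in for Logfire's message templating: interpolate
    str.format-style placeholders from keyword arguments."""
    return template.format(**attributes)

print(format_log('Hello, {name}!', name='world'))  # Hello, world!
user = {'id': 12345, 'name': 'Jane Doe'}
print(format_log('User logged in: {user}', user=user))
```

The real SDK stores the raw attribute values alongside the rendered message, which is what makes them queryable later.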
Step 2: Using Spans for Tracing
Create a file called tracing_example.py:
```python
import logfire
import time
import random

# Configure Logfire
logfire.configure()

# Function with tracing using the decorator
@logfire.instrument("Process user data")
def process_user_data(user_id, data):
    # Simulate some processing time
    time.sleep(random.random())
    return {"processed": True, "user_id": user_id, "data_size": len(data)}

# Function with tracing using context managers
def analyze_data(dataset):
    with logfire.span("Analyzing dataset with {count} items", count=len(dataset)):
        # Add some nested spans
        with logfire.span("Data validation"):
            time.sleep(random.random() * 0.5)
            logfire.info("Data validation complete")
        with logfire.span("Data processing"):
            time.sleep(random.random())
            logfire.info("Data processing complete")
        with logfire.span("Result generation"):
            time.sleep(random.random() * 0.8)
            result = {"status": "success", "metrics": {"accuracy": 0.95}}
            logfire.info("Generated results: {result}", result=result)
        return result

# Run the functions
if __name__ == "__main__":
    # Process multiple users
    for i in range(5):
        user_data = {"field1": f"value{i}", "field2": i * 100}
        result = process_user_data(i, user_data)
        logfire.info("Processing result: {result}", result=result)

    # Analyze a dataset
    dataset = [{"id": i, "value": random.random()} for i in range(10)]
    analysis_result = analyze_data(dataset)
    logfire.info("Analysis complete: {result}", result=analysis_result)
```
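Conceptually, a span is just a timed context: it records when it starts and ends, and child spans always fall inside their parent's time window because the inner context exits first. A toy stdlib sketch of that idea (not the Logfire API):

```python
import time
from contextlib import contextmanager

@contextmanager
def toy_span(name, records):
    """Record (name, duration) on exit. Nested spans close
    innermost-first, so a child's duration is always within
    (and no longer than) its parent's."""
    start = time.perf_counter()
    try:
        yield
    finally:
        records.append((name, time.perf_counter() - start))

records = []
with toy_span("analyze", records):
    with toy_span("validate", records):
        time.sleep(0.01)

print(records)  # the "validate" entry appears first, since it closes first
```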
Step 3: Pydantic Integration
Create a file called pydantic_integration.py:
```python
import logfire
from pydantic import BaseModel, Field, ValidationError
from typing import List, Optional
from datetime import date

# Configure Logfire
logfire.configure()

# Instrument Pydantic to log validations
logfire.instrument_pydantic()

class User(BaseModel):
    id: int
    name: str
    email: str
    age: int = Field(gt=0)
    tags: List[str] = []
    created_at: Optional[date] = None

# Try with valid data
try:
    user = User(
        id=1,
        name="John Doe",
        email="john@example.com",
        age=30,
        tags=["customer", "premium"],
        created_at="2023-01-15",
    )
    logfire.info("Created user: {user}", user=user)
except ValidationError as e:
    logfire.error("Validation error: {error}", error=e)

# Try with invalid data to see validation errors
try:
    invalid_user = User(
        id="not-an-integer",    # Type error
        name="",                # Note: an empty string is still a valid str
        email="invalid-email",  # Also passes: the field is a plain str, not EmailStr
        age=-5,                 # Fails the gt=0 constraint
        tags=123,               # Not a list
    )
except ValidationError as e:
    logfire.error("Validation error: {error}", error=e)
```
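When validation fails, Pydantic raises a single `ValidationError` bundling one record per failing field (available via `e.errors()`), and that per-field structure is what the instrumented logs surface. A stdlib toy illustrating the shape of those records (not Pydantic itself; the messages only mimic its style):

```python
def validate_user(data):
    """Toy validator producing Pydantic-style error records:
    a list of dicts naming the failing field ('loc') and a message."""
    errors = []
    if not isinstance(data.get("id"), int):
        errors.append({"loc": ("id",), "msg": "Input should be a valid integer"})
    age = data.get("age", 0)
    if not isinstance(age, int) or age <= 0:
        errors.append({"loc": ("age",), "msg": "Input should be greater than 0"})
    if not isinstance(data.get("tags", []), list):
        errors.append({"loc": ("tags",), "msg": "Input should be a valid list"})
    return errors

errs = validate_user({"id": "not-an-integer", "age": -5, "tags": 123})
print([e["loc"] for e in errs])  # [('id',), ('age',), ('tags',)]
```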
Step 4: Advanced Features to Explore
Once you’re comfortable with the basics, try exploring these advanced features:
- Use SQL to query your logs and traces
- Implement custom OpenTelemetry instrumentation
- Set up monitoring and alerts based on your data
- View live data streaming in the dashboard
- Integrate with other frameworks like FastAPI, Django, SQLAlchemy
- Use profiling to identify performance bottlenecks
Resources
Official Documentation
The official documentation provides comprehensive guides, API references, and examples.
GitHub Repository
The GitHub repository contains the source code, issues, and contribution guidelines.
Pydantic Integration
Learn more about the special integration with Pydantic.
Learning Resources
Expand your knowledge with these resources:
- Why Pydantic is Building an Observability Platform - Background and philosophy
- Uncomplicated Observability for Python Applications - Tutorial and overview
- Why Hyperlint Chose Pydantic Logfire - Case study
Community Support
Get help and connect with the Pydantic team.