Guide me, in layman's terms and an explanatory tone: "how to set up my project for observability using OTel/Arize Phoenix"
Setting up observability for your AI project is like installing a "flight recorder" (OpenTelemetry) and a "mission control dashboard" (Arize Phoenix). Since you are already building SPAs and working with story datasets, this setup will help you see exactly how your logic flows.
Here is a layman’s guide to getting it running in your project.
The "Big Picture" Flow
The Application: Your code (React, Python, etc.).
The Instrument (OTel): A small piece of code that "listens" to what your app is doing.
The Collector: The pipe that carries that info.
The Viewer (Phoenix): The dashboard where you watch the traces.
Step 1: Start Your Dashboard (Arize Phoenix)
Before your app can send data anywhere, you need a place for it to land. The easiest way is to run Phoenix locally on your machine.
In your terminal, run:
pip install arize-phoenix
phoenix serve
What just happened? You just started a local server. You can now open http://localhost:6006 in your browser to see your (currently empty) dashboard.
Step 2: Install the "Flight Recorder"
You need to install the OpenTelemetry (OTel) libraries that specifically understand AI and LLMs. Since you're likely using Python for your story-filtering logic, you’ll want the arize-phoenix-otel helper.
In your project folder, run:
pip install arize-phoenix-otel openinference-instrumentation-openai
(Note: If you use LangChain or LlamaIndex instead of raw OpenAI, there are specific "instrumentors" for those too.)
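For example, if your pipeline is built on LangChain or LlamaIndex, you would swap in the matching OpenInference instrumentor package instead (package names below are the ones published on PyPI under the OpenInference project):

```shell
# Install the instrumentor that matches your framework:
pip install openinference-instrumentation-langchain    # for LangChain
pip install openinference-instrumentation-llama-index  # for LlamaIndex
```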
Step 3: Turn on the "Auto-Recorder"
Now, you need to tell your code: "Hey, every time I talk to an LLM, record it and send it to Phoenix."
You can do this with just two lines of code at the very top of your main script:
from phoenix.otel import register
# This sets up the plumbing and points it to your local Phoenix server
tracer_provider = register(project_name="my-story-app", auto_instrument=True)
Why this is cool:
By setting auto_instrument=True, you don't have to manually write "Start Recording" and "Stop Recording" around every function. The OTel library "wraps" your LLM calls automatically.
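To demystify what "wrapping" means: auto-instrumentation is essentially a decorator around your LLM calls that records what went in, what came out, and how long it took. Here is a toy, library-free sketch of the idea (the `spans` list and `fake_llm_call` are illustrative stand-ins, not Phoenix APIs):

```python
import functools
import time

spans = []  # stand-in for the data your dashboard would receive

def traced(fn):
    """Wrap a function so every call is timed and recorded."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        spans.append({
            "name": fn.__name__,
            "input": args,
            "output": result,
            "latency_s": time.perf_counter() - start,
        })
        return result
    return wrapper

@traced
def fake_llm_call(prompt):
    # Stand-in for a real model call.
    return f"echo: {prompt}"

fake_llm_call("Classify this story")
print(spans[0]["name"])  # fake_llm_call
```

The real OTel instrumentors do the same thing to the OpenAI client's methods at import time, which is why you never have to write "start recording" yourself.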
Step 4: Run Your Code and Watch
Now, run your script (e.g., the one filtering your TinyStories dataset).
Perform an LLM action (like classifying a story).
Go back to http://localhost:6006. Magic: you will see a "Trace" appear.
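Each trace you see is roughly this shape of data, collected automatically for every LLM call (the field names below are illustrative, not Phoenix's exact schema):

```python
# An illustrative trace record -- field names are examples,
# not Phoenix's exact schema.
trace = {
    "project": "my-story-app",
    "span_name": "ChatCompletion",
    "input": "Is this story appropriate for a 5-year-old? ...",
    "output": "Yes",
    "latency_ms": 812,
    "tokens": {"prompt": 143, "completion": 2},
}
print(trace["span_name"])
```

Clicking a trace in the dashboard expands it so you can read the exact prompt and response, which is where most debugging happens.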
How to use this for your specific projects:
Pro Tip: If you want to move from your laptop to a "real" setup later, you just point the collector endpoint in your code at a hosted version of Phoenix. The code itself stays exactly the same!
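A minimal sketch of that switch, assuming the standard `PHOENIX_COLLECTOR_ENDPOINT` environment variable (the URL below is a placeholder; substitute your own deployment's address):

```python
import os

# Point the same code at a hosted Phoenix instance instead of localhost.
# The URL is a placeholder -- substitute your own deployment's address.
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://phoenix.example.com"

# Everything else -- register(), your LLM calls -- stays unchanged.
print(os.environ["PHOENIX_COLLECTOR_ENDPOINT"])
```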