Commit b55cfe1

Add OpenAI example
Signed-off-by: Adrian Cole <[email protected]>
1 parent 23f67eb commit b55cfe1

File tree: 8 files changed, +120 -2 lines changed

.pylintrc

+1-1
@@ -7,7 +7,7 @@ extension-pkg-whitelist=cassandra
 
 # Add list of files or directories to be excluded. They should be base names, not
 # paths.
-ignore=CVS,gen,Dockerfile,docker-compose.yml,README.md,requirements.txt,docs
+ignore=CVS,gen,Dockerfile,docker-compose.yml,README.md,requirements.txt,docs,.venv
 
 # Add files or directories matching the regex patterns to be excluded. The
 # regex matches against base names, not paths.

CHANGELOG.md

+3-1
@@ -13,9 +13,11 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ### Added
 
+- Add example to `opentelemetry-instrumentation-openai-v2`
+  ([#3006](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3006))
 - `opentelemetry-instrumentation-sqlalchemy` Update unit tests to run with SQLALchemy 2
   ([#2976](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2976))
-- Add `opentelemetry-instrumentation-openai-v2` to `opentelemetry-bootstrap`
+- Add `opentelemetry-instrumentation-openai-v2` to `opentelemetry-bootstrap`
   ([#2996](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/2996))
 
 ### Fixed
.env (new file)

@@ -0,0 +1,18 @@
+# Update this with your real OpenAI API key
+OPENAI_API_KEY=sk-YOUR_API_KEY
+
+# Uncomment to use Ollama instead of OpenAI
+# OPENAI_BASE_URL=http://localhost:11434/v1
+# OPENAI_API_KEY=unused
+# CHAT_MODEL=qwen2.5:0.5b
+
+OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
+OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf
+OTEL_SERVICE_NAME=opentelemetry-python-openai
+
+# Change to 'false' to disable logging
+OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true
+# Change to 'console' if your OTLP endpoint doesn't support logs
+OTEL_LOGS_EXPORTER=otlp_proto_http
+# Change to 'false' to hide prompt and completion content
+OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
Dockerfile (new file)

@@ -0,0 +1,10 @@
+# Use an alpine image to make the runtime smaller
+FROM docker.io/python:3.12.7-alpine3.20
+RUN python -m pip install --upgrade pip
+
+COPY /requirements.txt /tmp/requirements.txt
+RUN pip install -r /tmp/requirements.txt
+
+COPY main.py /
+
+CMD [ "opentelemetry-instrument", "python", "main.py" ]
README.rst (new file)

@@ -0,0 +1,52 @@
+OpenTelemetry OpenAI Instrumentation Example
+============================================
+
+This is an example of how to instrument OpenAI calls with zero code changes,
+using `opentelemetry-instrument`.
+
+When `main.py <main.py>`_ is run, it exports traces and logs to an OTLP
+compatible endpoint. Traces include details such as the model used and the
+duration of the chat request. Logs capture the chat request and the generated
+response, providing a comprehensive view of the performance and behavior of
+your OpenAI requests.
+
+Setup
+-----
+
+Minimally, update the `.env <.env>`_ file with your "OPENAI_API_KEY". An
+OTLP compatible endpoint should be listening for traces and logs on
+http://localhost:4318. If not, update "OTEL_EXPORTER_OTLP_ENDPOINT" as well.
+
+Run with Docker
+---------------
+
+If you have Docker installed, you can run the example in one step:
+
+::
+
+   docker-compose run --build --rm python-opentelemetry-openai
+
+You should see a poem generated by OpenAI while traces and logs export to your
+configured observability tool.
+
+Run with Python
+---------------
+
+If you prefer to run the example with Python, set up a virtual environment for
+the example like this:
+
+::
+
+   python3 -m venv .venv
+   source .venv/bin/activate
+   pip install "python-dotenv[cli]"
+   pip install -r requirements.txt
+
+Now, run the example like this:
+
+::
+
+   dotenv run -- opentelemetry-instrument python main.py
+
+You should see a poem generated by OpenAI while traces and logs export to your
+configured observability tool.
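For comparison with the zero-code approach above, roughly the same wiring can be done programmatically. This is a minimal sketch only, assuming the `OpenAIInstrumentor` entry point shipped by `opentelemetry-instrumentation-openai-v2` plus a standard SDK trace pipeline; it is not part of this commit, and the packages from requirements.txt must be installed for it to run.

```python
# Sketch: programmatic alternative to running under `opentelemetry-instrument`.
# Assumes opentelemetry-sdk, opentelemetry-exporter-otlp-proto-http, and
# opentelemetry-instrumentation-openai-v2 are installed (see requirements.txt).
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

# Export spans to the endpoint from OTEL_EXPORTER_OTLP_ENDPOINT
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)

# Patch the openai client so chat completion calls emit spans
OpenAIInstrumentor().instrument()
```

With this in place, a plain `python main.py` (without `opentelemetry-instrument`) would emit the same kind of chat spans, though the zero-code launcher also wires up logs and resource detection for you.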
docker-compose.yml (new file)

@@ -0,0 +1,9 @@
+services:
+  opentelemetry-python-openai:
+    container_name: opentelemetry-python-openai
+    build:
+      context: .
+    env_file:
+      - .env
+    environment:
+      OTEL_EXPORTER_OTLP_ENDPOINT: "http://host.docker.internal:4318"
main.py (new file)

@@ -0,0 +1,21 @@
+import os
+
+from openai import OpenAI
+
+
+def main():
+    client = OpenAI()
+    chat_completion = client.chat.completions.create(
+        model=os.getenv("CHAT_MODEL", "gpt-4o-mini"),
+        messages=[
+            {
+                "role": "user",
+                "content": "Write a short poem on OpenTelemetry.",
+            },
+        ],
+    )
+    print(chat_completion.choices[0].message.content)
+
+
+if __name__ == "__main__":
+    main()
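The model lookup in main.py is what lets the same code target Ollama via the commented-out lines in `.env`: `CHAT_MODEL` wins when set, otherwise the OpenAI default is used. A stdlib-only sketch of that selection logic (the `pick_model` helper is illustrative, not part of the commit):

```python
import os


def pick_model(env=None):
    # Same lookup main.py performs: CHAT_MODEL if set, else gpt-4o-mini
    env = os.environ if env is None else env
    return env.get("CHAT_MODEL", "gpt-4o-mini")


print(pick_model({}))                               # gpt-4o-mini
print(pick_model({"CHAT_MODEL": "qwen2.5:0.5b"}))   # qwen2.5:0.5b
```

Because the default lives in code rather than in `.env`, deleting `CHAT_MODEL` from the environment always falls back to a working OpenAI model.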
requirements.txt (new file)

@@ -0,0 +1,6 @@
+openai~=1.54.4
+
+opentelemetry-sdk~=1.28.2
+opentelemetry-exporter-otlp-proto-http~=1.28.2
+opentelemetry-distro~=0.49b2
+opentelemetry-instrumentation-openai-v2~=2.0b0
