Tracing in AI Toolkit

AI Toolkit provides tracing capabilities to help you monitor and analyze the performance of your AI applications. You can trace the execution of your AI applications, including interactions with generative AI models, to gain insights into their behavior and performance.

AI Toolkit hosts a local HTTP and gRPC server to collect trace data. The collector server is compatible with OTLP (OpenTelemetry Protocol), and most language model SDKs either support OTLP directly or have non-Microsoft instrumentation libraries that do. Use AI Toolkit to visualize the collected instrumentation data.

All frameworks or SDKs that support OTLP and follow the OpenTelemetry semantic conventions for generative AI systems are supported. The following table lists common AI SDKs tested for compatibility.

|  | Azure AI Inference | Foundry Agent Service | Anthropic | Google Gemini | LangChain | OpenAI SDK³ | OpenAI Agents SDK |
|---|---|---|---|---|---|---|---|
| Python | ✅ | ✅ | ✅ (traceloop, monocle)¹ ² | ✅ (monocle) | ✅ (LangSmith, monocle)¹ ² | ✅ (opentelemetry-python-contrib, monocle)¹ | ✅ (Logfire, monocle)¹ ² |
| TS/JS | ✅ | ✅ | ✅ (traceloop)¹ ² | ❌ | ✅ (traceloop)¹ ² | ✅ (traceloop)¹ ² | ❌ |

  1. The SDKs in parentheses are non-Microsoft tools that add OTLP support, because the official SDKs don't support OTLP.
  2. These tools don't fully follow the OpenTelemetry semantic conventions for generative AI systems.
  3. For the OpenAI SDK, only the Chat Completions API is supported; the Responses API isn't supported yet.

Get started with tracing

  1. Select Tracing in the tree view to open the tracing webview.

  2. Select the Start Collector button to start the local OTLP trace collector server.

    Screenshot showing the Start Collector button in the tracing webview.

  3. Enable instrumentation with a code snippet. See the Set up instrumentation section for code snippets for different languages and SDKs.

  4. Generate trace data by running your app.

  5. In the tracing webview, select the Refresh button to see new trace data.

    Screenshot showing the trace list in the tracing webview.

Set up instrumentation

Set up tracing in your AI app to collect trace data. The following code snippets show how to set up tracing for different SDKs and languages:

The process is similar for all SDKs:

  • Add tracing to your LLM or agent app.
  • Set up an OTLP trace exporter to use the AITK local collector.
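Before adding any of the SDK-specific snippets below, it can help to confirm that the local collector is actually listening; otherwise exported spans never show up in the webview. The following is a minimal, stdlib-only sketch, assuming the collector uses the default OTLP/HTTP endpoint http://localhost:4318 used throughout this article:

```python
import urllib.request
import urllib.error

# Default OTLP/HTTP traces endpoint of the AI Toolkit local collector
# (assumption: the default port 4318, as in the snippets in this article).
OTLP_TRACES_ENDPOINT = "http://localhost:4318/v1/traces"

def collector_reachable(url: str = OTLP_TRACES_ENDPOINT, timeout: float = 1.0) -> bool:
    """Return True if a server answers on the collector's traces endpoint."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # Any HTTP response (even an error such as 405 for a bare GET)
        # means something is listening on the endpoint.
        return True
    except (urllib.error.URLError, OSError):
        return False

print("collector reachable:", collector_reachable())
```

If this prints False, start the collector from the tracing webview before running your instrumented app.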
Azure AI Inference SDK - Python

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]

Setup:

import os
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-azure-ai-agents"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from azure.ai.inference.tracing import AIInferenceInstrumentor
AIInferenceInstrumentor().instrument(True)
Azure AI Inference SDK - TypeScript/JavaScript

Installation:

npm install @azure/opentelemetry-instrumentation-azure-sdk @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/resources @opentelemetry/sdk-trace-node

Setup:

const { context } = require('@opentelemetry/api');
const { resourceFromAttributes } = require('@opentelemetry/resources');
const {
  NodeTracerProvider,
  SimpleSpanProcessor
} = require('@opentelemetry/sdk-trace-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-proto');

const exporter = new OTLPTraceExporter({
  url: 'http://localhost:4318/v1/traces'
});
const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    'service.name': 'opentelemetry-instrumentation-azure-ai-inference'
  }),
  spanProcessors: [new SimpleSpanProcessor(exporter)]
});
provider.register();

const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const {
  createAzureSdkInstrumentation
} = require('@azure/opentelemetry-instrumentation-azure-sdk');

registerInstrumentations({
  instrumentations: [createAzureSdkInstrumentation()]
});
Foundry Agent Service - Python

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-agents azure-ai-inference[opentelemetry]

Setup:

import os
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-azure-ai-agents"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from azure.ai.agents.telemetry import AIAgentsInstrumentor
AIAgentsInstrumentor().instrument(True)
Foundry Agent Service - TypeScript/JavaScript

Installation:

npm install @azure/opentelemetry-instrumentation-azure-sdk @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @opentelemetry/instrumentation @opentelemetry/resources @opentelemetry/sdk-trace-node

Setup:

const { context } = require('@opentelemetry/api');
const { resourceFromAttributes } = require('@opentelemetry/resources');
const {
  NodeTracerProvider,
  SimpleSpanProcessor
} = require('@opentelemetry/sdk-trace-node');
const { OTLPTraceExporter } = require('@opentelemetry/exporter-trace-otlp-proto');

const exporter = new OTLPTraceExporter({
  url: 'http://localhost:4318/v1/traces'
});
const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    'service.name': 'opentelemetry-instrumentation-azure-ai-inference'
  }),
  spanProcessors: [new SimpleSpanProcessor(exporter)]
});
provider.register();

const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const {
  createAzureSdkInstrumentation
} = require('@azure/opentelemetry-instrumentation-azure-sdk');

registerInstrumentations({
  instrumentations: [createAzureSdkInstrumentation()]
});
Anthropic - Python

OpenTelemetry

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic

Setup:

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-anthropic-traceloop"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
AnthropicInstrumentor().instrument()

Monocle

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup:

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-anthropic",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)
Anthropic - TypeScript/JavaScript

Installation:

npm install @traceloop/node-server-sdk

Setup:

const { initialize } = require('@traceloop/node-server-sdk');
const { trace } = require('@opentelemetry/api');

initialize({
  appName: 'opentelemetry-instrumentation-anthropic-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
Google Gemini - Python

OpenTelemetry

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai

Setup:

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter

resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-google-genai"
})
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor
GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True)

Monocle

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup:

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-google-genai",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)
LangChain - Python

LangSmith

Installation:

pip install langsmith[otel]

Setup:

import os
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"

Monocle

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup:

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-langchain",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)
LangChain - TypeScript/JavaScript

Installation:

npm install @traceloop/node-server-sdk

Setup:

const { initialize } = require('@traceloop/node-server-sdk');
initialize({
  appName: 'opentelemetry-instrumentation-langchain-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
OpenAI - Python

OpenTelemetry

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-openai-v2

Setup:

from opentelemetry import trace, _events
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk._logs import LoggerProvider
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk._events import EventLoggerProvider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
import os

os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"

# Set up resource
resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-openai"
})

# Create tracer provider
trace.set_tracer_provider(TracerProvider(resource=resource))

# Configure OTLP exporter
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces"
)

# Add span processor
trace.get_tracer_provider().add_span_processor(
    BatchSpanProcessor(otlp_exporter)
)

# Set up logger provider
logger_provider = LoggerProvider(resource=resource)
logger_provider.add_log_record_processor(
    BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
)
_events.set_event_logger_provider(EventLoggerProvider(logger_provider))

# Enable OpenAI instrumentation
OpenAIInstrumentor().instrument()

Monocle

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup:

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-openai",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)
OpenAI - TypeScript/JavaScript

Installation:

npm install @traceloop/instrumentation-openai @traceloop/node-server-sdk

Setup:

const { initialize } = require('@traceloop/node-server-sdk');
initialize({
  appName: 'opentelemetry-instrumentation-openai-traceloop',
  baseUrl: 'http://localhost:4318',
  disableBatch: true
});
OpenAI Agents SDK - Python

Logfire

Installation:

pip install logfire

Setup:

import logfire
import os

os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4318/v1/traces"

logfire.configure(
    service_name="opentelemetry-instrumentation-openai-agents-logfire",
    send_to_logfire=False,
)
logfire.instrument_openai_agents()

Monocle

Installation:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http monocle_apptrace

Setup:

from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Import monocle_apptrace
from monocle_apptrace import setup_monocle_telemetry

# Setup Monocle telemetry with OTLP span exporter for traces
setup_monocle_telemetry(
    workflow_name="opentelemetry-instrumentation-openai-agents",
    span_processors=[
        BatchSpanProcessor(
            OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
        )
    ]
)

Example 1: Set up tracing with the Azure AI Inference SDK and OpenTelemetry

The following end-to-end example uses the Azure AI Inference SDK in Python and shows how to set up the tracing provider and instrumentation.

Prerequisites

To run this example, you need the following prerequisites:

Set up your development environment

Use the following instructions to set up a development environment that has all the dependencies required to run this example.

  1. Set up a GitHub personal access token

    You can use the free GitHub Models as the example model.

    Open GitHub Developer Settings and select Generate new token.

    Important

    The token needs the models: read permission, or it returns unauthorized. The token is sent to a Microsoft service.

  2. Create environment variables

    Create an environment variable to set your token as the key for the client code, using one of the following snippets. Replace <your-github-token-goes-here> with your actual GitHub token.

    Bash:

    export GITHUB_TOKEN="<your-github-token-goes-here>"
    

    PowerShell:

    $Env:GITHUB_TOKEN="<your-github-token-goes-here>"
    

    Windows Command Prompt:

    set GITHUB_TOKEN=<your-github-token-goes-here>
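Before moving on, you can confirm that the variable is visible to the Python process that will run the example; a minimal stdlib-only check (the variable name comes from the snippets above):

```python
import os

# The example code reads the token from this environment variable,
# so check that it is visible to Python before running it.
token = os.environ.get("GITHUB_TOKEN", "")
status = "set" if token else "missing"
print(f"GITHUB_TOKEN is {status}")
```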
    
  3. Install Python packages

    The following command installs the Python packages required for tracing with the Azure AI Inference SDK:

    pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http azure-ai-inference[opentelemetry]
    
  4. Set up tracing

    1. Create a new local directory on your computer for the project.

      mkdir my-tracing-app
      
    2. Navigate to the directory you created.

      cd my-tracing-app
      
    3. Open Visual Studio Code in that directory:

      code .
      
  5. Create the Python file

    1. In the my-tracing-app directory, create a Python file named main.py.

      You'll add code to set up tracing and interact with the Azure AI Inference SDK.

    2. Add the following code to main.py and save the file:

      import os
      
      ### Set up for OpenTelemetry tracing ###
      os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
      os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"
      
      from opentelemetry import trace, _events
      from opentelemetry.sdk.resources import Resource
      from opentelemetry.sdk.trace import TracerProvider
      from opentelemetry.sdk.trace.export import BatchSpanProcessor
      from opentelemetry.sdk._logs import LoggerProvider
      from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
      from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
      from opentelemetry.sdk._events import EventLoggerProvider
      from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
      
      github_token = os.environ["GITHUB_TOKEN"]
      
      resource = Resource(attributes={
          "service.name": "opentelemetry-instrumentation-azure-ai-inference"
      })
      provider = TracerProvider(resource=resource)
      otlp_exporter = OTLPSpanExporter(
          endpoint="http://localhost:4318/v1/traces",
      )
      processor = BatchSpanProcessor(otlp_exporter)
      provider.add_span_processor(processor)
      trace.set_tracer_provider(provider)
      
      logger_provider = LoggerProvider(resource=resource)
      logger_provider.add_log_record_processor(
          BatchLogRecordProcessor(OTLPLogExporter(endpoint="http://localhost:4318/v1/logs"))
      )
      _events.set_event_logger_provider(EventLoggerProvider(logger_provider))
      
      from azure.ai.inference.tracing import AIInferenceInstrumentor
      AIInferenceInstrumentor().instrument()
      ### Set up for OpenTelemetry tracing ###
      
      from azure.ai.inference import ChatCompletionsClient
      from azure.ai.inference.models import UserMessage
      from azure.ai.inference.models import TextContentItem
      from azure.core.credentials import AzureKeyCredential
      
      client = ChatCompletionsClient(
          endpoint = "https://models.inference.ai.azure.com",
          credential = AzureKeyCredential(github_token),
          api_version = "2024-08-01-preview",
      )
      
      response = client.complete(
          messages = [
              UserMessage(content = [
                  TextContentItem(text = "hi"),
              ]),
          ],
          model = "gpt-4.1",
          tools = [],
          response_format = "text",
          temperature = 1,
          top_p = 1,
      )
      
      print(response.choices[0].message.content)
      
  6. Run the code

    1. Open a new terminal in Visual Studio Code.

    2. In the terminal, run the code with the command python main.py.

  7. View the trace data in AI Toolkit

    After you run the code and refresh the tracing webview, there's a new trace in the list.

    Select the trace to open the trace details webview.

    Screenshot showing selecting a trace from the trace list in the tracing webview.

    Check the complete execution flow of your app in the span tree view on the left.

    Select a span in the span details view on the right to see the generative AI messages in the Input + Output tab.

    Select the Metadata tab to view the raw metadata.

    Screenshot showing the trace details view in the tracing webview.

Example 2: Set up tracing with the OpenAI Agents SDK and Monocle

The following end-to-end example uses the OpenAI Agents SDK in Python with Monocle and shows how to set up tracing for a multi-agent travel booking system.

Prerequisites

To run this example, you need the following prerequisites:

Set up your development environment

Use the following instructions to set up a development environment that has all the dependencies required to run this example.

  1. Create environment variables

    Create an environment variable for your OpenAI API key with one of the following snippets. Replace <your-openai-api-key> with your actual OpenAI API key.

    Bash:

    export OPENAI_API_KEY="<your-openai-api-key>"
    

    PowerShell:

    $Env:OPENAI_API_KEY="<your-openai-api-key>"
    

    Windows Command Prompt:

    set OPENAI_API_KEY=<your-openai-api-key>
    

    Alternatively, create a .env file in the project directory:

    OPENAI_API_KEY=<your-openai-api-key>
    
  2. Install Python packages

    Create a requirements.txt file with the following content:

    opentelemetry-sdk
    opentelemetry-exporter-otlp-proto-http
    monocle_apptrace
    openai-agents
    python-dotenv
    

    Install the packages with:

    pip install -r requirements.txt
    
  3. Set up tracing

    1. Create a new local directory on your computer for the project.

      mkdir my-agents-tracing-app
      
    2. Navigate to the directory you created.

      cd my-agents-tracing-app
      
    3. Open Visual Studio Code in that directory:

      code .
      
  4. Create the Python file

    1. In the my-agents-tracing-app directory, create a Python file named main.py.

      You'll add code to set up tracing with Monocle and interact with the OpenAI Agents SDK.

    2. Add the following code to main.py and save the file:

      import os
      
      from dotenv import load_dotenv
      
      # Load environment variables from .env file
      load_dotenv()
      
      from opentelemetry.sdk.trace.export import BatchSpanProcessor
      from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
      
      # Import monocle_apptrace
      from monocle_apptrace import setup_monocle_telemetry
      
      # Setup Monocle telemetry with OTLP span exporter for traces
      setup_monocle_telemetry(
          workflow_name="opentelemetry-instrumentation-openai-agents",
          span_processors=[
              BatchSpanProcessor(
                  OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")
              )
          ]
      )
      
      from agents import Agent, Runner, function_tool
      
      # Define tool functions
      @function_tool
      def book_flight(from_airport: str, to_airport: str) -> str:
          """Book a flight between airports."""
          return f"Successfully booked a flight from {from_airport} to {to_airport} for 100 USD."
      
      @function_tool
      def book_hotel(hotel_name: str, city: str) -> str:
          """Book a hotel reservation."""
          return f"Successfully booked a stay at {hotel_name} in {city} for 50 USD."
      
      @function_tool
      def get_weather(city: str) -> str:
          """Get weather information for a city."""
          return f"The weather in {city} is sunny and 75°F."
      
      # Create specialized agents
      flight_agent = Agent(
          name="Flight Agent",
          instructions="You are a flight booking specialist. Use the book_flight tool to book flights.",
          tools=[book_flight],
      )
      
      hotel_agent = Agent(
          name="Hotel Agent",
          instructions="You are a hotel booking specialist. Use the book_hotel tool to book hotels.",
          tools=[book_hotel],
      )
      
      weather_agent = Agent(
          name="Weather Agent",
          instructions="You are a weather information specialist. Use the get_weather tool to provide weather information.",
          tools=[get_weather],
      )
      
      # Create a coordinator agent with tools
      coordinator = Agent(
          name="Travel Coordinator",
          instructions="You are a travel coordinator. Delegate flight bookings to the Flight Agent, hotel bookings to the Hotel Agent, and weather queries to the Weather Agent.",
          tools=[
              flight_agent.as_tool(
                  tool_name="flight_expert",
                  tool_description="Handles flight booking questions and requests.",
              ),
              hotel_agent.as_tool(
                  tool_name="hotel_expert",
                  tool_description="Handles hotel booking questions and requests.",
              ),
              weather_agent.as_tool(
                  tool_name="weather_expert",
                  tool_description="Handles weather information questions and requests.",
              ),
          ],
      )
      
      # Run the multi-agent workflow
      if __name__ == "__main__":
          import asyncio
      
          result = asyncio.run(
              Runner.run(
                  coordinator,
                  "Book me a flight today from SEA to SFO, then book the best hotel there and tell me the weather.",
              )
          )
          print(result.final_output)
      
  5. Run the code

    1. Open a new terminal in Visual Studio Code.

    2. In the terminal, run the code with the command python main.py.

  6. View the trace data in AI Toolkit

    After you run the code and refresh the tracing webview, there's a new trace in the list.

    Select the trace to open the trace details webview.

    Screenshot showing selecting a trace from the trace list in the tracing webview.

    Check the complete execution flow of your app in the span tree view on the left, including agent invocations, tool calls, and agent delegation.

    Select a span in the span details view on the right to see the generative AI messages in the Input + Output tab.

    Select the Metadata tab to view the raw metadata.

    Screenshot showing the trace details view in the tracing webview.

What you learned

In this article, you learned how to:

  • Set up tracing in your AI app with the Azure AI Inference SDK and OpenTelemetry.
  • Configure the OTLP trace exporter to send trace data to the local collector server.
  • Run your app to generate trace data and view traces in the AI Toolkit webview.
  • Use tracing with multiple SDKs and languages, including Python and TypeScript/JavaScript, and with non-Microsoft tools.
  • Instrument various AI frameworks (Anthropic, Gemini, LangChain, OpenAI, and more) with the provided code snippets.
  • Use the tracing webview UI, including the Start Collector and Refresh buttons, to manage trace data.
  • Set up your development environment, including environment variables and package installation, for tracing.
  • Analyze your app's execution flow with the span tree and details views, including generative AI message flows and metadata.