
MCP Server - OpenAI Responses

The OpenAI Responses API natively supports MCP servers through the tools array. Store your Xweather credential as XWEATHER_API_KEY and reuse the same configuration across environments.

```python
import os

from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4.1-mini",
    input=[
        {
            "role": "user",
            "content": [
                {"type": "input_text", "text": "Will it rain in Chicago tonight?"}
            ],
        }
    ],
    tools=[
        {
            "type": "mcp",
            "server_url": "https://mcp.api.xweather.com/mcp",
            "server_label": "xweather",
            "authorization": os.environ["XWEATHER_API_KEY"],
            "allowed_tools": [
                "xweather_get_current_weather",
                "xweather_get_weather_forecast",
            ],
        }
    ],
)

print(response.output_text)
```

Handling responses

Responses include the assistant's text output plus any MCP tool-call items, each carrying the tool name, its arguments, and the raw JSON payload returned by the server. Persist the payload (e.g., in a database) if you need auditable weather evidence alongside generated narratives.
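One way to capture that evidence is to filter the tool-call items out of the response's output list before storing them. The sketch below assumes the output has been converted to plain dicts (e.g., via `response.model_dump()["output"]`) and that MCP calls appear as items of type `"mcp_call"` with `name`, `arguments`, and `output` fields; verify those field names against the SDK version you run.

```python
import json


def extract_tool_calls(output_items):
    """Collect MCP tool calls from a Responses API output list for audit
    logging. Items are assumed to be plain dicts; the "mcp_call" item
    shape is an assumption to check against your SDK version."""
    calls = []
    for item in output_items:
        if item.get("type") == "mcp_call":
            calls.append({
                "tool": item.get("name"),
                # Tool arguments arrive as a JSON-encoded string.
                "arguments": json.loads(item.get("arguments") or "{}"),
                # Raw payload returned by the MCP server.
                "result": item.get("output"),
            })
    return calls


# Stubbed output list standing in for response.model_dump()["output"]:
sample = [
    {"type": "mcp_call", "name": "xweather_get_weather_forecast",
     "arguments": '{"location": "Chicago, IL"}',
     "output": '{"precip": "80%"}'},
    {"type": "message",
     "content": [{"type": "output_text", "text": "Rain likely tonight."}]},
]
print(extract_tool_calls(sample))
```

Storing the structured `calls` list rather than the free-text narrative keeps the original tool evidence queryable later.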

Error handling

  • 401/403: Check for a missing or malformed authorization value, or plan scope issues (see Troubleshooting).
  • 429: Add exponential backoff, or queue work if you expect bursts.
  • Malformed payloads: Validate tool arguments before making downstream decisions.
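For the 429 case, a retry wrapper with exponential backoff and jitter is usually enough. This is a minimal sketch: `RateLimitError` below is a local placeholder, so in real code catch `openai.RateLimitError` (or inspect the response status) instead.

```python
import random
import time


class RateLimitError(Exception):
    """Placeholder for openai.RateLimitError in this self-contained sketch."""


def with_backoff(call, max_attempts=5, base_delay=1.0, max_delay=30.0):
    """Retry `call` on rate-limit errors, doubling the delay each attempt
    and adding a little jitter so concurrent workers don't retry in sync."""
    for attempt in range(max_attempts):
        try:
            return call()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise  # Out of attempts: surface the error to the caller.
            delay = min(max_delay, base_delay * 2 ** attempt)
            time.sleep(delay + random.uniform(0, delay * 0.25))


# Simulated flaky call: fails twice with 429, then succeeds.
attempts = {"n": 0}

def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RateLimitError("simulated 429")
    return "ok"

print(with_backoff(flaky, base_delay=0.01))
```

In production, wrap the `client.responses.create(...)` call in a closure and pass it to `with_backoff`; tune `base_delay` and `max_attempts` to your plan's rate limits.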