
MCP Server - LangChain

LangChain supports MCP servers through the `langchain_community.tools.mcp` package. Use it to blend Xweather data with other tools, retrievers, or chains in a single agent.

```python
import os

from langchain_community.chat_models import ChatAnthropic
from langchain_community.tools.mcp import JsonMCPTool
from langchain_core.prompts import ChatPromptTemplate
from langchain.agents import AgentExecutor, create_openai_functions_agent

# Expose the Xweather MCP server as a single LangChain tool, restricted to
# the MCP functions this chain actually needs.
weather_tool = JsonMCPTool(
    name="xweather",
    server_url="https://mcp.api.xweather.com/mcp",
    authorization_header=f"Bearer {os.environ['XWEATHER_API_KEY']}",
    allowed_tools=[
        "xweather_get_current_weather",
        "xweather_get_weather_forecast",
        "xweather_get_weather_impacts",
    ],
)

llm = ChatAnthropic(model="claude-3-5-sonnet-20241022")

# create_openai_functions_agent expects a ChatPromptTemplate with an
# agent_scratchpad placeholder, not a bare string.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a weather analyst who uses Xweather tools when asked."),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

agent = create_openai_functions_agent(llm=llm, tools=[weather_tool], prompt=prompt)
executor = AgentExecutor(agent=agent, tools=[weather_tool], verbose=True)

response = executor.invoke({"input": "Draft an operations brief for ORD tonight."})
print(response["output"])
```

Tips

  • Combine the MCP tool with a vector retriever for playbooks or escalation policies.
  • Set allowed_tools to the exact functions needed for each chain to avoid unnecessary tool calls.
  • Use callbacks or event tracing to capture tool parameters for observability.
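For the last tip, a minimal sketch of a callback handler that captures tool parameters. In a real chain this class would subclass `langchain_core.callbacks.BaseCallbackHandler` and be passed via `AgentExecutor(..., callbacks=[ToolCallLogger()])`; here it is shown without the langchain dependency, and the simulated payload shape is an assumption based on what LangChain passes to the `on_tool_start` hook.

```python
import json
from typing import Any


class ToolCallLogger:
    """Records the name and input of every tool invocation it observes."""

    def __init__(self) -> None:
        self.calls: list[dict[str, Any]] = []

    def on_tool_start(self, serialized: dict, input_str: str, **kwargs: Any) -> None:
        # LangChain fires this hook just before a tool runs; `serialized`
        # describes the tool and `input_str` carries its parameters.
        record = {"tool": serialized.get("name", "unknown"), "input": input_str}
        self.calls.append(record)
        print(json.dumps(record))


logger = ToolCallLogger()
# Simulated invocation with the hypothetical payload an agent run would produce.
logger.on_tool_start({"name": "xweather_get_current_weather"}, '{"location": "ORD"}')
```

Shipping these records to your logging pipeline instead of `print` gives you a per-run audit trail of which Xweather functions the agent called and with what arguments.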
© 2026 Xweather