LangGraph

Build stateful, multi-actor AI agents with LangGraph and enhance them with Unizo MCP servers for enterprise platform integration.

Overview

LangGraph is a library for building stateful, multi-actor applications with LLMs. It extends LangChain with support for building complex workflows and stateful agents. LangGraph connects to MCP servers through the langchain-mcp-adapters package, enabling a seamless connection to Unizo's MCP server.

Prerequisites

Before setting up the MCP integration, you'll need:

  • Unizo API Key: Used for authentication (found in the Unizo Console)
  • Python 3.10+: Required for LangChain MCP adapters
  • OpenAI API Key: For the underlying LLM (or other supported LLM provider)

Installation

pip install langchain-mcp-adapters langchain langgraph langchain-openai

Or with uv:

uv add langchain-mcp-adapters langchain langgraph langchain-openai
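
To confirm the packages installed correctly, you can print their versions with the Python standard library (the package names match the install command above):

from importlib.metadata import version

# Print the installed version of each package from the install step
for pkg in ("langchain-mcp-adapters", "langchain", "langgraph", "langchain-openai"):
    print(pkg, version(pkg))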

Authentication

Step 1: Generate a Service Key

Use your Unizo API key to generate a service key for MCP authentication:

curl --location 'https://api.unizo.ai/api/v1/serviceKeys' \
--header "apiKey: {unizo_api_key}" \
--header "Content-Type: application/json" \
--data '{
  "name": "MCP LangChain Integration",
  "subOrganization": {
    "name": "{YOUR_CUSTOMER_NAME}",
    "externalKey": "{YOUR_CUSTOMER_UNIQUE_IDENTIFIER}"
  },
  "integration": {
    "target": {
      "categorySelectors": [
        {
          "type": "TICKETING"
        }
      ]
    }
  }
}'

Required Parameters:

  • apiKey: Your Unizo API key (found in the Unizo Console)
  • name: Descriptive name for the integration (e.g., "MCP LangChain Integration")
  • subOrganization.name: Your organization name (e.g., "Acme Inc")
  • subOrganization.externalKey: Unique identifier (UUID format recommended)

Response:

{
  "state": "ACTIVE",
  "displayId": "{UNIZO_SERVICE_KEY}",
  ...
}

Important: Save the displayId from the response; this is your service key for authenticating with the MCP server.
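
If you prefer to make this call from Python instead of curl, a minimal sketch using the requests library (not part of the install step above) might look like this; it mirrors the curl request, and the UNIZO_API_KEY environment variable is an assumption for this example:

import os
import requests

# Mirror of the curl request above: create a service key scoped to TICKETING
response = requests.post(
    "https://api.unizo.ai/api/v1/serviceKeys",
    headers={
        "apiKey": os.environ["UNIZO_API_KEY"],  # assumed env var holding your Unizo API key
        "Content-Type": "application/json",
    },
    json={
        "name": "MCP LangChain Integration",
        "subOrganization": {
            "name": "{YOUR_CUSTOMER_NAME}",
            "externalKey": "{YOUR_CUSTOMER_UNIQUE_IDENTIFIER}",
        },
        "integration": {
            "target": {"categorySelectors": [{"type": "TICKETING"}]}
        },
    },
)
response.raise_for_status()

# The displayId in the response is the service key used in the steps below
print(response.json()["displayId"])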

Step 2: Configure Environment Variables

Create a .env file in your project root:

OPENAI_API_KEY=your_openai_api_key
UNIZO_SERVICE_KEY=your_unizo_service_key

Quick Start

Connect to Unizo MCP and use tools in LangChain:

import os
import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
from dotenv import load_dotenv

load_dotenv()

async def main():
    # Connect to Unizo MCP server
    client = MultiServerMCPClient({
        "unizo": {
            "url": "https://api.unizo.ai/mcp",
            "transport": "streamable_http",
            "headers": {
                "servicekey": os.getenv("UNIZO_SERVICE_KEY"),
                "x-mcp-scopes": "TICKETING"
            }
        }
    })

    # Get Unizo tools
    tools = await client.get_tools()

    # Create LangChain agent with OpenAI
    llm = ChatOpenAI(model="gpt-4o", temperature=0)
    agent = create_react_agent(llm, tools)

    # Run agent
    result = await agent.ainvoke({
        "messages": [{"role": "user", "content": "List ticketing connectors"}]
    })
    print(result["messages"][-1].content)

# Run async function
if __name__ == "__main__":
    asyncio.run(main())

Headers

  • servicekey: Your Unizo service key (from the authentication step)
  • x-mcp-scopes (optional): Scopes to enable. Supported scopes include TICKETING, VMS, INCIDENT, SCM, PCR, COMMS, IDENTITY, OBSERVABILITY, KEY_MANAGEMENT, INFRA, EDR, STORAGE, and PLATFORM.

Note: If x-mcp-scopes is not specified, all scopes are enabled by default.
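
To enable more than one scope, pass the values as a comma-separated list, as the chat example below does. A focused excerpt of just the client configuration might look like this:

import os
from langchain_mcp_adapters.client import MultiServerMCPClient

client = MultiServerMCPClient({
    "unizo": {
        "url": "https://api.unizo.ai/mcp",
        "transport": "streamable_http",
        "headers": {
            "servicekey": os.getenv("UNIZO_SERVICE_KEY"),
            # Comma-separated scopes limit which tool categories are exposed
            "x-mcp-scopes": "PLATFORM,TICKETING"
        }
    }
})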

Advanced Usage

Interactive Chat Interface

Build a simple chat interface to interact with Unizo MCP tools:

import os
import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage
from dotenv import load_dotenv

load_dotenv()

async def chat():
    # Connect to Unizo MCP
    client = MultiServerMCPClient({
        "unizo": {
            "url": "https://api.unizo.ai/mcp",
            "transport": "streamable_http",
            "headers": {
                "servicekey": os.getenv("UNIZO_SERVICE_KEY"),
                "x-mcp-scopes": "PLATFORM,TICKETING"
            }
        }
    })

    tools = await client.get_tools()
    print(f"Connected! {len(tools)} tools available.\n")

    # Create agent
    llm = ChatOpenAI(model="gpt-4o", temperature=0)
    agent = create_react_agent(llm, tools)

    # Initialize conversation
    messages = [
        SystemMessage(content="You are a helpful assistant for managing Unizo integrations.")
    ]
    print("=" * 50)
    print("UNIZO CHAT (type 'quit' to exit)")
    print("=" * 50 + "\n")

    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in ['quit', 'exit']:
            print("Goodbye!")
            break
        if not user_input:
            continue
        messages.append(HumanMessage(content=user_input))
        result = await agent.ainvoke({"messages": messages})
        messages = result["messages"]
        print(f"\nAgent: {messages[-1].content}\n")


if __name__ == "__main__":
    asyncio.run(chat())