Running an Agent Locally using GLChat MCP Server

This is an example of how to run an agent that leverages GLChat MCP tools locally.

Prerequisites

This example specifically requires:

  • An OpenAI API key (see the setup step below)

Running The Code

All the code used in the following examples can be found in our GitHub repository under gen-ai-examples/examples/custom-tool-and-agent.

git clone https://github.com/GDP-ADMIN/gen-ai-examples.git
cd gen-ai-examples/examples/custom-tool-and-agent

Set up your OPENAI_API_KEY

Get your OpenAI API key from https://platform.openai.com/api-keys.

  • Environment Variable Option

    export OPENAI_API_KEY="sk-..."

  • Environment File (.env) Option

    echo 'OPENAI_API_KEY="sk-..."' > .env

Execute the script

./run_example_glchat_stdio.sh

The script will do the following:

  1. Spawn a GLChat MCP Server that has a message tool.

Show me some code.
...

mcp = FastMCP("Math_Tools")

@mcp.tool()
def message(prompt: str) -> str:
    """Send message to GLChat.

    Args:
        prompt: The prompt.
    Returns:
        str: The response from GLChat.
    """
    response = requests.post(
        "https://chat-api.gdplabs.id/message",
        data={'chatbot_id': 'no-op', 'message': prompt, 'content-type': 'application/json'},
        stream=True
    )
    
    # Parse the SSE stream: keep the last 'response' event's message as the final answer.
    final_message = ""
    for line in response.content.decode('utf-8').split('\n'):
        if line.startswith('data:'):
            try:
                data = json.loads(line[5:])
                if data.get('status') == 'response' and 'message' in data:
                    final_message = data['message']
            except json.JSONDecodeError:
                pass
    
    return final_message

...

For complete code, see glchat_tools_stdio.py or glchat_tools_sse.py.
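The snippet above elides imports and server startup. A minimal sketch of how the stdio variant could be wired up, assuming FastMCP's standard run API (the actual structure of glchat_tools_stdio.py may differ):

import json

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math_Tools")

# ... @mcp.tool() definitions such as message() go here ...

if __name__ == "__main__":
    # Serve the tools over standard input/output so the MCP client
    # can spawn this file as a subprocess.
    mcp.run(transport="stdio")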

  2. Create an agent with an MCP Client connecting to the GLChat MCP Server.

Show me some code.
...

async def main():
    async with MCPClient(mcp_config_glchat_sse) as client:
        tools = client.get_tools()
        
        llm = ChatOpenAI(model="gpt-4.1")
        agent = Agent(
            name="MathAgent",
            instruction="You are a helpful assistant that collaborates with GLChat, an AI chatbot that exposes an MCP server. Show the response from GLChat as is; do not rephrase it.",
            llm=llm,
            tools=tools,
            verbose=True
        )

...

For complete code, see hello_agent_mcp_glchat_sse_example.py or hello_agent_mcp_stdio_example.py.
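The mcp_config_glchat_sse passed to MCPClient above is defined in mcp_configs/configs.py (see Customizing MCP Servers below). Assuming the GLChat MCP Server is exposed locally over SSE on port 8000, it presumably looks similar to:

mcp_config_glchat_sse = {
    "glchat_tools_sse": {
        "url": "http://localhost:8000/sse",
        "transport": "sse",
    },
}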

  3. Execute the query "What is the capital of Indonesia?".

Show me some code.
...

async def main():
    async with MCPClient(mcp_config_sse) as client:
        ...
        
        query = "What is the capital of Indonesia?"
        response = await agent.arun(query)
        
        print(response)

if __name__ == "__main__":
    asyncio.run(main())

For complete code, see hello_agent_mcp_glchat_sse_example.py or hello_agent_mcp_stdio_example.py.

With verbose=True, you will see the agent's thinking process, which may look like this:

Available tools: ['message']
Running agent with prompt: What is the capital of Indonesia?

> Entering new AgentExecutor chain...

Invoking: `message` with `{'prompt': 'What is the capital of Indonesia?'}`

Processing request of type CallToolRequest
The capital of Indonesia is Jakarta.
The capital of Indonesia is Jakarta.

> Finished chain.

{'input': 'What is the capital of Indonesia?', 'output': 'The capital of Indonesia is Jakarta.'}

The key indicators of success:

  • The agent initialization completes without errors

  • The verbose output shows the tool being invoked

  • The final output shows {'input': 'What is the capital of Indonesia?', 'output': 'The capital of Indonesia is Jakarta.'}

Customizing MCP Servers

In the mcp_configs/configs.py file, you can customize the MCP servers, adding or removing them as your requirements dictate.

Defining an MCP server requires specifying its transport, which is one of:

  • stdio

  • sse

STDIO Server

An STDIO server communicates with the MCP client over standard input and output.

{
    "tool_name": {
        "command": "python",
        "args": ["mcp_tools/tool_name.py"],
        "transport": "stdio",
    }
}

command can be one of (but not limited to):

  • python

  • npx

  • docker

args is a list of arguments to pass to the command.
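For example, an MCP server distributed as an npm package could be launched through npx. The server name and package below (@modelcontextprotocol/server-filesystem) and its path argument are only an illustration and are not part of this example:

{
    "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"],
        "transport": "stdio",
    }
}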

SSE Server

An SSE server communicates with the MCP client over Server-Sent Events (SSE). It only needs the URL of the SSE endpoint, which typically ends in /sse.

{
    "tool_name": {
        "url": "http://localhost:8000/sse",
        "transport": "sse",
    }
}
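On the server side, a FastMCP instance can expose such an endpoint by running with the SSE transport. A minimal sketch, assuming FastMCP's defaults (port 8000, /sse path) and an arbitrary server name:

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("glchat_tools")

# ... @mcp.tool() definitions ...

if __name__ == "__main__":
    # With default settings this serves SSE at http://localhost:8000/sse.
    mcp.run(transport="sse")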

Example

An example of multiple MCP servers is as follows:

mcp_config = {
    "glchat_tools_stdio": {
        "command": "python",
        "args": ["mcp_tools/glchat_tools_stdio.py"],
        "transport": "stdio",
    },
    "glchat_tools_sse": {
        "url": "http://localhost:8000/sse",
        "transport": "sse",
    },
}
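Assuming MCPClient accepts a multi-server mapping like the one above (the earlier examples pass a single-server config), an agent created from it would see the tools of both servers through one client. A rough sketch:

async def main():
    async with MCPClient(mcp_config) as client:
        # Tools from both glchat_tools_stdio and glchat_tools_sse
        # are collected and handed to the agent together.
        tools = client.get_tools()
        ...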
