Implementing an LLM Agent with Tool Access Using MCP-Use


MCP-Use is an open-source library that lets you connect any LLM to any MCP server, giving your agents tool access such as web browsing, file operations, and more, all without relying on closed-source clients. In this tutorial, we'll use langchain-groq and MCP-Use's built-in conversation memory to build a simple chatbot that can interact with tools via MCP.

Installing the uv package manager

We'll first set up our environment, starting with installing the uv package manager. For Mac or Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

For Windows (PowerShell):

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Creating a new directory and activating a virtual environment

We'll then create a new project directory and initialize it with uv:

uv init mcp-use-demo
cd mcp-use-demo

We can now create and activate a virtual environment. For Mac or Linux:

uv venv
source .venv/bin/activate

For Windows:

uv venv
.venv\Scripts\activate

Installing Python dependencies

We'll now install the required dependencies:

uv add mcp-use langchain-groq python-dotenv

Groq API Key

To use Groq's LLMs:

  1. Go to the Groq Console and generate an API key.
  2. Create a .env file in your project directory and add the following line:

GROQ_API_KEY=<YOUR_GROQ_API_KEY>

Replace <YOUR_GROQ_API_KEY> with the key you just generated.
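For context, load_dotenv (which we'll call later in app.py) simply reads KEY=VALUE pairs from this file into the process environment. A simplified stdlib-only sketch of that behavior (the real python-dotenv also handles quoting, comments, and interpolation):

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Minimal .env loader: copy KEY=VALUE lines into os.environ.

    Illustrative only; use python-dotenv's load_dotenv in real code.
    """
    try:
        with open(path) as f:
            for raw in f:
                line = raw.strip()
                # Skip blanks, comments, and lines without an '='
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                # setdefault: don't overwrite variables already set
                os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # a missing .env file is not an error
```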

Brave Search API Key

This tutorial uses the Brave Search MCP Server.

  1. Get your Brave Search API key from: Brave Search API
  2. Create a file named mcp.json in the project root with the following content:
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-brave-search"
      ],
      "env": {
        "BRAVE_API_KEY": ""
      }
    }
  }
}

Replace the empty string with your actual Brave API key.

Node.js

Some MCP servers (including Brave Search) require npx, which ships with Node.js.

  • Download the latest version of Node.js from nodejs.org
  • Run the installer.
  • Leave all settings at their defaults and complete the installation.
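Before moving on, it's worth confirming that npx is actually on your PATH; a quick convenience check (not part of the tutorial's app code) can be run from Python:

```python
import shutil

def has_npx() -> bool:
    """Return True if the npx executable is available on the PATH."""
    return shutil.which("npx") is not None

if __name__ == "__main__":
    print("npx found" if has_npx() else "npx missing: install Node.js first")
```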

Using other servers

If you'd like to use a different MCP server, simply replace the contents of mcp.json with the configuration for that server.
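For example, a configuration for the Model Context Protocol reference filesystem server would look like the following (the directory path is a placeholder you'd replace with a folder you want to expose):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/directory"
      ]
    }
  }
}
```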

Create an app.py file in the directory and add the following content:

Importing the libraries

from dotenv import load_dotenv
from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient
import os
import sys
import warnings

warnings.filterwarnings("ignore", category=ResourceWarning)

This section imports the required modules for LangChain, MCP-Use, and Groq, and suppresses ResourceWarning messages for cleaner output.

Setting up the chatbot

async def run_chatbot():
    """Run a chat session using MCPAgent's built-in conversation memory."""
    # Load the Groq API key from .env into the environment
    load_dotenv()
    os.environ["GROQ_API_KEY"] = os.getenv("GROQ_API_KEY")

    config_file = "mcp.json"
    print("Starting chatbot...")

    # Create the MCP client and the LLM instance
    client = MCPClient.from_config_file(config_file)
    llm = ChatGroq(model="llama-3.1-8b-instant")

    # Create an agent with conversation memory enabled
    agent = MCPAgent(
        llm=llm,
        client=client,
        max_steps=15,
        memory_enabled=True,
        verbose=False
    )

This section loads the Groq API key from the .env file and initializes the MCP client using the configuration provided in mcp.json. It then sets up the LangChain Groq LLM and creates a memory-enabled agent to handle conversations.

Implementing the chatbot

# Add this inside the run_chatbot function
    print("\n-----Interactive MCP Chat----")
    print("Type 'exit' or 'quit' to end the conversation")
    print("Type 'clear' to clear conversation history")

    try:
        while True:
            user_input = input("\nYou: ")

            if user_input.lower() in ["exit", "quit"]:
                print("Ending conversation....")
                break

            if user_input.lower() == "clear":
                agent.clear_conversation_history()
                print("Conversation history cleared....")
                continue

            print("\nAssistant: ", end="", flush=True)

            try:
                response = await agent.run(user_input)
                print(response)

            except Exception as e:
                print(f"\nError: {e}")

    finally:
        # Close all MCP sessions cleanly on exit
        if client and client.sessions:
            await client.close_all_sessions()

This section enables interactive chatting, allowing the user to enter queries and receive responses from the assistant. It also supports clearing the chat history on request. The assistant's responses are displayed in real time, and the code ensures that all MCP sessions are closed cleanly when the conversation ends or is interrupted.

Running the app

if __name__ == "__main__":
    import asyncio
    try:
        asyncio.run(run_chatbot())
    except KeyboardInterrupt:
        print("Session interrupted. Goodbye!")

    finally:
        # Silence late warnings from subprocess pipes closing at shutdown
        sys.stderr = open(os.devnull, "w")

This section runs the asynchronous chatbot loop, managing continuous interaction with the user. It also handles keyboard interrupts gracefully, ensuring the program exits without errors when the user terminates the session.

You can find the entire code here.

To run the app, run the following command:

uv run app.py

This will start the app, and you can interact with the chatbot and use the server for the session.


I'm a Civil Engineering graduate (2022) from Jamia Millia Islamia, New Delhi, with a keen interest in Data Science, especially neural networks and their application in various areas.
