In this hands-on tutorial, we bring the core ideas of the Model Context Protocol (MCP) to life by implementing a lightweight, context-aware AI assistant using LangChain, LangGraph, and Google’s Gemini language model. While full MCP integration typically involves dedicated servers and communication protocols, this simplified version demonstrates how the same ideas (context retrieval, tool invocation, and dynamic interaction) can be recreated in a single notebook using a modular agent architecture. The assistant can respond to natural language queries and selectively route them to external tools (like a custom knowledge base), mimicking how MCP clients interact with context providers in real-world setups.
!pip install langchain langchain-google-genai langgraph python-dotenv
!pip install google-generativeai
First, we install the essential libraries. The first command installs LangChain, LangGraph, the Google Generative AI LangChain wrapper, and environment-variable support via python-dotenv. The second command installs Google’s official generative AI client, which enables interaction with Gemini models.
import os
os.environ["GEMINI_API_KEY"] = "Your API Key"
Here, we set your Gemini API key as an environment variable so the model can securely access it without hardcoding it into your codebase. Replace “Your API Key” with your actual key from Google AI Studio.
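As a small sketch of a safer pattern (the helper name get_gemini_key is ours, not from the tutorial), a fail-fast wrapper around os.getenv avoids silently passing None to the model client when the key was never set:

```python
import os

def get_gemini_key() -> str:
    # Read the key from the environment; fail fast with a clear
    # message instead of letting the model client error out later.
    key = os.getenv("GEMINI_API_KEY")
    if not key:
        raise RuntimeError("GEMINI_API_KEY is not set.")
    return key

# Illustration only: in practice, set the key outside the notebook.
os.environ["GEMINI_API_KEY"] = "dummy-key"
print(get_gemini_key())
```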
from langchain.tools import BaseTool
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.messages import HumanMessage, AIMessage
from langgraph.prebuilt import create_react_agent
import os

model = ChatGoogleGenerativeAI(
    model="gemini-2.0-flash-lite",
    temperature=0.7,
    google_api_key=os.getenv("GEMINI_API_KEY")
)
class SimpleKnowledgeBaseTool(BaseTool):
    name: str = "simple_knowledge_base"
    description: str = "Retrieves basic information about AI concepts."

    def _run(self, query: str):
        knowledge = {
            "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic designed to connect AI assistants with external data sources, enabling real-time, context-rich interactions.",
            "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by dynamically retrieving relevant external documents."
        }
        return knowledge.get(query, "I don't have information on that topic.")

    async def _arun(self, query: str):
        return self._run(query)

kb_tool = SimpleKnowledgeBaseTool()
tools = [kb_tool]
graph = create_react_agent(model, tools)
In this block, we initialize the Gemini language model (gemini-2.0-flash-lite) using LangChain’s ChatGoogleGenerativeAI, with the API key securely loaded from environment variables. We then define a custom tool named SimpleKnowledgeBaseTool that simulates an external knowledge source by returning predefined answers to queries about AI concepts like “MCP” and “RAG.” This tool acts as a basic context provider, similar to how an MCP server would operate. Finally, we use LangGraph’s create_react_agent to build a ReAct-style agent that can reason through prompts and dynamically decide when to call tools, mimicking MCP’s principle of tool-aware, context-rich interactions.
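Because the tool’s _run method is plain Python, its lookup-and-fallback behavior can be checked without LangChain installed or an API key configured. The sketch below mirrors that logic in a standalone function (kb_lookup and KNOWLEDGE are our illustrative names):

```python
# Standalone mirror of SimpleKnowledgeBaseTool._run: a dictionary
# lookup with a fixed fallback string for unknown topics.
KNOWLEDGE = {
    "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic "
           "designed to connect AI assistants with external data sources.",
    "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses "
           "by dynamically retrieving relevant external documents.",
}

def kb_lookup(query: str) -> str:
    return KNOWLEDGE.get(query, "I don't have information on that topic.")

print(kb_lookup("MCP"))
print(kb_lookup("transformers"))  # unknown topic -> fallback message
```

Testing this logic in isolation first makes it easier to trust that a wrong answer later comes from the agent’s routing, not the tool itself.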
import nest_asyncio
import asyncio

nest_asyncio.apply()

async def chat_with_agent():
    inputs = {"messages": []}
    print("🤖 MCP-Like Assistant ready! Type 'exit' to quit.")
    while True:
        user_input = input("\nYou: ")
        if user_input.lower() == "exit":
            print("👋 Ending chat.")
            break
        inputs["messages"].append(HumanMessage(content=user_input))
        async for state in graph.astream(inputs, stream_mode="values"):
            last_message = state["messages"][-1]
            if isinstance(last_message, AIMessage):
                print("\nAgent:", last_message.content)
        inputs["messages"] = state["messages"]

await chat_with_agent()
Finally, we set up an asynchronous chat loop to interact with the MCP-inspired assistant. Using nest_asyncio, we enable support for running asynchronous code inside the notebook’s existing event loop. The chat_with_agent() function captures user input, feeds it to the ReAct agent, and streams the model’s responses in real time. On each turn, the assistant uses tool-aware reasoning to decide whether to answer directly or invoke the custom knowledge base tool, emulating how an MCP client interacts with context providers to deliver dynamic, context-rich responses.
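The loop’s bookkeeping (append the new user message, get a reply, then carry the full transcript into the next turn) can be sketched without the agent stack at all. Msg and run_turn below are illustrative stand-ins we made up for this sketch, not LangChain classes:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Msg:
    role: str      # "user" or "assistant"
    content: str

def run_turn(history: List[Msg], user_text: str,
             respond: Callable[[List[Msg]], str]) -> List[Msg]:
    # Append the user message, produce a reply from the full history,
    # and return the extended transcript so the next turn sees it all.
    history = history + [Msg("user", user_text)]
    reply = respond(history)
    return history + [Msg("assistant", reply)]

# Toy responder that just echoes the latest user message.
history: List[Msg] = []
history = run_turn(history, "What is MCP?",
                   lambda h: "echo: " + h[-1].content)
print(len(history), history[-1].content)
```

This accumulation is what lets the real agent see prior turns: inputs["messages"] is reassigned from the streamed state so context grows with every exchange.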
In conclusion, this tutorial provides a practical foundation for building context-aware AI agents inspired by the MCP standard. We’ve created a functional prototype demonstrating on-demand tool use and external knowledge retrieval by combining LangChain’s tool interface, LangGraph’s agent framework, and Gemini’s powerful language generation. Although the setup is simplified, it captures the essence of MCP’s architecture: modularity, interoperability, and intelligent context injection. From here, you can extend the assistant to integrate real APIs, local documents, or dynamic search tools, evolving it into a production-ready AI system aligned with the principles of the Model Context Protocol.
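As one possible extension along the “local documents” direction (the function name, file layout, and matching strategy here are all our assumptions, not part of the tutorial), the same BaseTool pattern could wrap a naive keyword search over text files instead of a hardcoded dictionary:

```python
import pathlib
import tempfile

def local_docs_lookup(query: str, docs_dir: str) -> str:
    # Naive keyword search: return the first .txt file whose contents
    # mention the query (case-insensitive), else a fallback message.
    for path in sorted(pathlib.Path(docs_dir).glob("*.txt")):
        text = path.read_text(encoding="utf-8")
        if query.lower() in text.lower():
            return text.strip()
    return "No local document mentions that topic."

# Quick demonstration against a throwaway docs directory.
with tempfile.TemporaryDirectory() as d:
    (pathlib.Path(d) / "mcp.txt").write_text("MCP connects agents to context.")
    print(local_docs_lookup("mcp", d))
```

A production version would swap the substring scan for embedding-based retrieval, but the tool interface seen earlier stays the same.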
Here is the Colab Notebook.

Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.