In this Colab-ready tutorial, we demonstrate how to integrate Google's Gemini 2.0 generative AI with an in-process Model Context Protocol (MCP) server using FastMCP. We start with an interactive getpass prompt to capture your GEMINI_API_KEY securely, then install and configure the necessary dependencies: the google-genai Python client for calling the Gemini API, fastmcp for defining and hosting our MCP tools in-process, httpx for querying the Open-Meteo weather API, and nest_asyncio to patch Colab's already-running asyncio event loop. The workflow then spins up a minimal FastMCP "weather" server with two tools, get_weather(latitude, longitude) for a three-day forecast and get_alerts(state) for state-level weather alerts, and creates a FastMCPTransport to connect an MCP client to that server. Finally, using Gemini's function-calling feature, we send a natural-language prompt, let the model emit a function call based on our explicit JSON schemas, and execute that call via the MCP client, returning structured weather data into the notebook.
from getpass import getpass
import os
api_key = getpass("Enter your GEMINI_API_KEY: ")
os.environ["GEMINI_API_KEY"] = api_key
We securely prompt you to enter your Gemini API key (without displaying it on the screen) and then store it in the GEMINI_API_KEY environment variable, allowing the rest of your notebook to authenticate with Google’s API.
!pip install -q google-genai mcp fastmcp httpx nest_asyncio
We install all the core dependencies needed for our Colab notebook in one go—google‑genai for interacting with the Gemini API, mcp and fastmcp for building and hosting our Model Context Protocol server and client, httpx for making HTTP requests to external APIs, and nest_asyncio to patch the event loop so our async code runs smoothly.
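The patch itself is a two-line call to nest_asyncio.apply(); the guard below is an addition for portability, so the snippet also runs outside a notebook where nest_asyncio may not be installed:

```python
import asyncio

try:
    import nest_asyncio
    nest_asyncio.apply()  # patch the already-running loop so run_until_complete works in Colab
except ImportError:
    pass  # outside notebooks there is no running loop to patch
```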
We apply the nest_asyncio patch to the notebook’s existing event loop, allowing us to run asyncio coroutines (like our MCP client interactions) without encountering “event loop already running” errors.
from fastmcp import FastMCP
import httpx

mcp_server = FastMCP("weather")

@mcp_server.tool()
def get_weather(latitude: float, longitude: float) -> str:
    """3-day min/max temperature forecast via Open-Meteo."""
    url = (
        "https://api.open-meteo.com/v1/forecast"
        f"?latitude={latitude}&longitude={longitude}"
        "&daily=temperature_2m_min,temperature_2m_max&timezone=UTC"
    )
    resp = httpx.get(url, timeout=10)
    daily = resp.json()["daily"]
    return "\n".join(
        f"{date}: low {mn}°C, high {mx}°C"
        for date, mn, mx in zip(
            daily["time"],
            daily["temperature_2m_min"],
            daily["temperature_2m_max"],
        )
    )

@mcp_server.tool()
def get_alerts(state: str) -> str:
    """Dummy US-state alerts."""
    return f"No active weather alerts for {state.upper()}."
We create an in‑process FastMCP server named “weather” and register two tools: get_weather(latitude, longitude), which fetches and formats a 3‑day temperature forecast from the Open‑Meteo API using httpx, and get_alerts(state), which returns a placeholder message for U.S. state weather alerts.
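To see what get_weather will return without touching the network or FastMCP, the formatting logic can be exercised against a stubbed Open-Meteo-style payload (the dates and temperatures below are illustrative, not real API output):

```python
# Stubbed "daily" block mimicking the shape of Open-Meteo's JSON response:
# parallel arrays of dates, daily minima, and daily maxima.
daily = {
    "time": ["2025-01-01", "2025-01-02", "2025-01-03"],
    "temperature_2m_min": [8.1, 7.4, 9.0],
    "temperature_2m_max": [14.2, 13.8, 15.5],
}

# Same join/zip formatting used inside get_weather.
report = "\n".join(
    f"{date}: low {mn}°C, high {mx}°C"
    for date, mn, mx in zip(
        daily["time"],
        daily["temperature_2m_min"],
        daily["temperature_2m_max"],
    )
)
print(report)
```

Each line of the report pairs one date with its minimum and maximum temperature, which is exactly the string the MCP tool hands back to the client.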
import asyncio
from google import genai
from google.genai import types
from fastmcp import Client as MCPClient
from fastmcp.client.transports import FastMCPTransport
We import the core libraries for our MCP‑Gemini integration: asyncio to run asynchronous code, google‑genai and its types module for calling Gemini and defining function‑calling schemas, and FastMCP’s Client (aliased as MCPClient) with its FastMCPTransport to connect our in‑process weather server to the MCP client.
client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
MODEL = "gemini-2.0-flash"
transport = FastMCPTransport(mcp_server)
We initialize the Google Gemini client using the GEMINI_API_KEY from your environment, specify the gemini-2.0-flash model for function‑calling, and set up a FastMCPTransport that connects the in‑process mcp_server to the MCP client.
function_declarations = [
    {
        "name": "get_weather",
        "description": "Return a 3-day min/max temperature forecast for given coordinates.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {
                    "type": "number",
                    "description": "Latitude of target location."
                },
                "longitude": {
                    "type": "number",
                    "description": "Longitude of target location."
                }
            },
            "required": ["latitude", "longitude"]
        }
    },
    {
        "name": "get_alerts",
        "description": "Return any active weather alerts for a given U.S. state.",
        "parameters": {
            "type": "object",
            "properties": {
                "state": {
                    "type": "string",
                    "description": "Two-letter U.S. state code, e.g. 'CA'."
                }
            },
            "required": ["state"]
        }
    }
]

tool_defs = types.Tool(function_declarations=function_declarations)
We manually define the JSON schema specifications for our two MCP tools: get_weather, which accepts latitude and longitude as numeric inputs, and get_alerts, which accepts a U.S. state code as a string, including names, descriptions, required properties, and data types. We then wrap these declarations in a types.Tool object (tool_defs), which tells Gemini how to generate and validate the corresponding function calls.
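To make the role of the "required" list concrete, here is a small plain-Python check (validate_call and get_weather_decl are hypothetical helpers for illustration, not part of google-genai) that reports which required parameters are missing from a candidate call's arguments:

```python
def validate_call(decl: dict, args: dict) -> list[str]:
    """Return the names of required parameters missing from args."""
    params = decl["parameters"]
    return [p for p in params.get("required", []) if p not in args]

# Minimal copy of the get_weather declaration defined above.
get_weather_decl = {
    "name": "get_weather",
    "parameters": {
        "type": "object",
        "properties": {
            "latitude": {"type": "number"},
            "longitude": {"type": "number"},
        },
        "required": ["latitude", "longitude"],
    },
}

print(validate_call(get_weather_decl, {"latitude": 37.77}))  # → ['longitude']
```

Gemini performs this kind of validation internally when it builds a function call against the schema, which is why complete, accurate declarations matter.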
async def run_gemini(lat: float, lon: float):
    async with MCPClient(transport) as mcp_client:
        prompt = f"Give me a 3-day weather forecast for latitude={lat}, longitude={lon}."
        response = client.models.generate_content(
            model=MODEL,
            contents=[prompt],
            config=types.GenerateContentConfig(
                temperature=0,
                tools=[tool_defs],
            ),
        )
        call = response.candidates[0].content.parts[0].function_call
        if not call:
            print("No function call; Gemini said:", response.text)
            return
        print("🔧 Gemini wants:", call.name, call.args)
        result = await mcp_client.call_tool(call.name, call.args)
        print("\n📋 Tool result:\n", result)

asyncio.get_event_loop().run_until_complete(run_gemini(37.7749, -122.4194))
Finally, this async function run_gemini opens an MCP client session over our in‑process transport, sends a natural‑language prompt to Gemini asking for a 3‑day forecast at the given coordinates, captures the resulting function call (if any), invokes the corresponding MCP tool, and prints out the structured weather data, all of which is kicked off by running it in the notebook’s event loop with run_until_complete.
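The dispatch step at the heart of run_gemini can be mimicked without Gemini or MCP at all: look the returned function name up in a registry and apply its arguments. The registry and the hard-coded call_name/call_args below are stand-ins for what the model would actually emit (in the real flow, dispatch happens inside mcp_client.call_tool):

```python
# Local stand-in for the get_alerts tool registered on the server.
def get_alerts(state: str) -> str:
    return f"No active weather alerts for {state.upper()}."

registry = {"get_alerts": get_alerts}

# What a parsed function_call might look like (illustrative values).
call_name, call_args = "get_alerts", {"state": "ca"}

result = registry[call_name](**call_args)
print(result)  # → No active weather alerts for CA.
```

This name-to-callable mapping is the same contract the MCP server enforces: the model only names a tool, and the hosting side decides how to execute it.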
In conclusion, we have a fully contained pipeline that showcases how to define custom MCP tools in Python, expose them via FastMCP, and seamlessly integrate them with Google’s Gemini 2.0 model using the google‑genai client. The key frameworks, FastMCP for MCP hosting, FastMCPTransport and MCPClient for transport and invocation, httpx for external API access, and nest_asyncio for Colab compatibility, work together to enable real‑time function calling without external processes or stdio pipes. This pattern simplifies local development and testing of MCP integrations in Colab and provides a template for building more advanced agentic applications that combine LLM reasoning with specialized domain tools.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence Media Platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.