FastAPI-MCP Tutorial for Beginners and Experts

Have you ever come across a situation where you wished your chatbot could use a tool and then respond? Sounds complicated, right? But now, MCP (Model Context Protocol) gives you a way to integrate your LLM with external tools easily, and the LLM will be able to use those tools effectively. In this tutorial, we will dive into the process of turning a simple web app built with FastAPI into an MCP server using FastAPI-MCP.

FastAPI with MCP

FastAPI is a very simple tool built in Python which lets you build web applications using APIs. It is designed to be easy to use and fast at the same time. Think of FastAPI as a smart waiter who takes your order (HTTP requests), goes to the kitchen (database/server), then brings back your order (output) and presents it to you. It is a great tool for building web backends, services for mobile apps, and so on.

MCP is an open standard protocol by Anthropic that provides a way for LLMs to communicate with external data sources and tools. Think of MCP as a toolkit that provides the right tool for the given task. We will be using MCP to create a server.

Now, what if these capabilities are given to your LLM? It will make your life much easier! That's why FastAPI-to-MCP integration helps a lot. FastAPI takes care of the services from different sources, and MCP takes care of the context of your LLM. By using FastAPI with an MCP server, we can access tools deployed over the web, use them as LLM tools, and make LLMs do our work more efficiently.

In the above image, we can see that there is an MCP server connected to an API endpoint. This API endpoint can be a FastAPI endpoint or any other third-party API service available on the internet.

What’s FastAPI-MCP?

FastAPI-MCP is a tool that lets you convert any FastAPI application into something that LLMs like ChatGPT or Claude can understand and use easily. By using FastAPI-MCP, you can wrap your FastAPI endpoints in such a way that they become plug-and-play tools in an AI ecosystem built on LLMs.

If you want to know how to work with MCP, read this article on How to Use MCP.

What APIs Can Be Converted into MCP Using FastAPI-MCP?

With FastAPI-MCP, any FastAPI endpoint can be converted into an MCP tool for LLMs. These endpoints include:

  • GET endpoints: converted into MCP resources.
  • POST, PUT, DELETE endpoints: converted into MCP tools.
  • Custom utility functions: can be added as additional MCP tools.

FastAPI-MCP is a very easy-to-use library that automatically discovers these endpoints and converts them into MCP. It also preserves the schemas as well as the documentation of these APIs.

Hands-on with FastAPI-MCP

Let's look at a simple example of how to convert a FastAPI endpoint into an MCP server. First, we will create a FastAPI endpoint, and then move on to converting it into an MCP server using fastapi-mcp.

Configuring FastAPI

1. Install the dependencies

Prepare your system by installing the required dependencies.

pip install fastapi fastapi_mcp uvicorn mcp-proxy

2. Import the required dependencies

Make a new file with the name 'main.py', then import the following dependencies in it.

from fastapi import FastAPI, HTTPException, Query
import httpx
from fastapi_mcp import FastApiMCP

3. Define the FastAPI app

Let's define a FastAPI app with the name "Weather Updates API".

app = FastAPI(title="Weather Updates API")

4. Defining the routes and functions

Now, we will define the routes for our app, which denote which endpoint will execute which function. Here, we are making a weather update app using the weather.gov API (free), which does not require any API key. We just have to hit https://api.weather.gov/points/{lat},{lon} with the right values of latitude and longitude.

We define a get_weather function which takes a state code and a city name as arguments, finds the corresponding coordinates in the CITY_COORDINATES dictionary, and then hits the base URL with those coordinates.
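Before wiring this into the endpoint, the two-step weather.gov lookup can be sketched on its own. The field names below match the real NWS API responses; the URL and values are made-up illustrations, not live data:

```python
# Step-by-step sketch of the NWS two-request flow, using abridged
# sample responses (illustrative values, real field names).
points_response = {
    "properties": {
        "forecast": "https://api.weather.gov/gridpoints/SGX/54,20/forecast"
    }
}
forecast_response = {
    "properties": {
        "periods": [
            {
                "startTime": "2025-01-01T06:00:00-08:00",
                "temperature": 66,
                "temperatureUnit": "F",
                "detailedForecast": "Sunny, with a high near 66.",
            }
        ]
    }
}

# Step 1: /points/{lat},{lon} tells us where this location's forecast lives.
forecast_url = points_response["properties"]["forecast"]
print(forecast_url)

# Step 2: the first entry in "periods" is today's forecast.
today = forecast_response["properties"]["periods"][0]
print(today["temperature"], today["temperatureUnit"])
```

This is exactly the traversal the endpoint below performs, just without the HTTP calls.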

# Predefined latitude and longitude for major cities (for simplicity)
# In a production app, you could use a geocoding service like Nominatim or Google Geocoding API
CITY_COORDINATES = {
    "Los Angeles": {"lat": 34.0522, "lon": -118.2437},
    "San Francisco": {"lat": 37.7749, "lon": -122.4194},
    "San Diego": {"lat": 32.7157, "lon": -117.1611},
    "New York": {"lat": 40.7128, "lon": -74.0060},
    "Chicago": {"lat": 41.8781, "lon": -87.6298},
    # Add more cities as needed
}


@app.get("/weather")
async def get_weather(
    stateCode: str = Query(..., description="State code (e.g., 'CA' for California)"),
    city: str = Query(..., description="City name (e.g., 'Los Angeles')")
):
    """
    Retrieve today's weather from the National Weather Service API based on city and state
    """
    # Get coordinates (latitude, longitude) for the given city
    if city not in CITY_COORDINATES:
        raise HTTPException(
            status_code=404,
            detail=f"City '{city}' not found in predefined list. Please use another city."
        )

    coordinates = CITY_COORDINATES[city]
    lat, lon = coordinates["lat"], coordinates["lon"]

    # URL for the NWS API points endpoint
    base_url = f"https://api.weather.gov/points/{lat},{lon}"

    try:
        async with httpx.AsyncClient() as client:
            # First, get the gridpoint information for the given location
            gridpoint_response = await client.get(base_url)
            gridpoint_response.raise_for_status()
            gridpoint_data = gridpoint_response.json()

            # Retrieve the forecast data using the gridpoint information
            forecast_url = gridpoint_data["properties"]["forecast"]
            forecast_response = await client.get(forecast_url)
            forecast_response.raise_for_status()
            forecast_data = forecast_response.json()

            # Returning today's forecast
            today_weather = forecast_data["properties"]["periods"][0]
            return {
                "city": city,
                "state": stateCode,
                "date": today_weather["startTime"],
                "temperature": today_weather["temperature"],
                "temperatureUnit": today_weather["temperatureUnit"],
                "forecast": today_weather["detailedForecast"],
            }

    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"NWS API error: {e.response.text}"
        )
    except Exception as e:
        raise HTTPException(
            status_code=500,
            detail=f"Internal server error: {str(e)}"
        )
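For a request like GET /weather?stateCode=CA&city=San Diego, the response body mirrors the dictionary returned by get_weather. The keys come from that return statement; the values here are illustrative, not a real forecast:

```python
# Illustrative response for /weather?stateCode=CA&city=San Diego.
sample_response = {
    "city": "San Diego",
    "state": "CA",
    "date": "2025-01-01T06:00:00-08:00",
    "temperature": 66,
    "temperatureUnit": "F",
    "forecast": "Sunny, with a high near 66.",
}
print(sample_response["city"], sample_response["temperature"])
```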

5. Set up the MCP server

Let's convert this FastAPI app into MCP using the fastapi-mcp library. This process is very simple: we just need to add a few lines of code, and fastapi-mcp automatically converts the endpoints into MCP tools, detecting their schemas and documentation for us.

mcp = FastApiMCP(
    app,
    name="Weather Updates API",
    description="API for retrieving today's weather from weather.gov",
)
mcp.mount()

6. Starting the app

Now, add the following at the end of your Python file.

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)

Then go to the terminal and run the main.py file.

python main.py

Now your FastAPI app should start successfully on localhost.
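With the server running, the endpoint can be exercised from any HTTP client. One small gotcha is that the city name contains a space, so the query string must be URL-encoded. A quick stdlib sketch of building the request URL (assuming the default host and port used above):

```python
from urllib.parse import urlencode

# Build the request URL for the locally running server.
# urlencode handles the space in "Los Angeles" (encoded as "+").
params = {"stateCode": "CA", "city": "Los Angeles"}
url = "http://127.0.0.1:8000/weather?" + urlencode(params)
print(url)  # http://127.0.0.1:8000/weather?stateCode=CA&city=Los+Angeles
```

Opening that URL in a browser (or with curl) should return the JSON weather forecast.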

Configuring Cursor

Let's configure the Cursor IDE for testing our MCP server.

  1. Download Cursor from here: https://www.cursor.com/downloads.
  2. Install it, sign up, and get to the home screen.
Cursor Home Screen
  3. Now go to File in the header toolbar, then click on Preferences and then on Cursor Settings.
Cursor Settings
  4. From the Cursor settings, click on MCP.
Configuring Cursor
  5. On the MCP tab, click on Add new global MCP Server.
    It will open an mcp.json file. Paste the following code into it and save the file.
{
   "mcpServers": {
     "National Park Service": {
         "command": "mcp-proxy",
         "args": ["http://127.0.0.1:8000/mcp"]
     }
   }
}
  6. Back in the Cursor Settings, you should see the following:
Connected MCP Server

If you are seeing this on your screen, it means your server is running successfully and connected to the Cursor IDE. If it is showing some errors, try using the restart button in the right corner.

We have successfully set up the MCP server in the Cursor IDE. Now, let's test the server.

Testing the MCP Server 

Our MCP server can retrieve weather updates. We just have to ask the Cursor IDE for the weather update for any location, and it will fetch it for us using the MCP server.

Query: Please tell me what's today's weather in San Diego

Prompt Response 1

Query: New York weather?

Prompt Response 2

We can see from the outputs that our MCP server is working well. We just have to ask for the weather details, and it decides on its own whether to use the MCP server or not. In the second output, where we asked vaguely "New York weather?", it was able to infer the context of the query from our previous prompt and used the appropriate MCP tools to answer.

Conclusion

MCP allows LLMs to extend their answering capabilities by giving them access to external tools, and FastAPI offers an easy way to do this. In this comprehensive guide, we combined both technologies using the fastapi-mcp library. Using this library, we can convert any API into an MCP server, which will help LLMs and AI agents get the latest information from the APIs. There is no need to define a custom tool for every new task; MCP with FastAPI takes care of everything automatically. The introduction of MCP brought a revolution to LLMs, and now FastAPI paired with MCP is changing the way LLMs access these tools.

Harsh Mishra is an AI/ML Engineer who spends more time talking to Large Language Models than actual humans. Passionate about GenAI, NLP, and making machines smarter (so they don't replace him just yet). When not optimizing models, he's probably optimizing his coffee consumption. 🚀☕
