
How to Integrate Zomato MCP in a Next.js Application

As AI-powered applications become more contextual and action-oriented, Model Context Protocol (MCP) is emerging as a powerful way to connect large language models with real-world tools and services.

In this blog, we’ll walk through how to integrate Zomato MCP, what problems it solves, and how you can use it to build smarter food discovery and ordering experiences using Zomato data, before diving into a full Next.js implementation.



What Is MCP (Model Context Protocol)?

MCP is a standardized way to expose tools, APIs, and services so that AI models (like LLMs) can:

  • Discover available capabilities
  • Understand inputs and outputs
  • Safely call real-world APIs
  • Act based on structured context instead of raw prompts

In simple terms:

MCP lets AI models use APIs intelligently, not just talk about them.


Why Integrate Zomato with MCP?

Zomato provides rich data such as:

  • Restaurants
  • Menus
  • Ratings
  • Locations
  • Cuisine types

By integrating Zomato with MCP, you can enable AI experiences like:

  • “Find the best biryani near me under ₹300”
  • “Suggest healthy lunch options open now”
  • “Plan a food crawl for 4 people in Delhi”

Instead of hardcoding logic, the AI understands when and how to call Zomato APIs.


High-Level Architecture

A typical Zomato MCP setup looks like this:

  1. AI Client (LLM / Agent)
  2. MCP Server
  3. Zomato API
  4. Your App (Web / Mobile / Backend)

Flow:

  • User asks a question
  • LLM decides which tool to use
  • MCP server exposes Zomato capabilities
  • Zomato API returns structured data
  • AI responds with a contextual answer

Step 1: Get Access to Zomato APIs

Before MCP integration, you need:

  • Zomato API access
  • API key / token
  • Knowledge of endpoints (search, restaurant details, menus)

Make sure your API responses are structured and predictable, as MCP relies heavily on schemas.


Step 2: Define Zomato as an MCP Tool

In MCP, each integration is defined as a tool with:

  • Name
  • Description
  • Input schema
  • Output schema

Example (conceptual):

{
  "name": "search_restaurants",
  "description": "Search nearby restaurants using Zomato",
  "input_schema": {
    "type": "object",
    "properties": {
      "location": { "type": "string" },
      "cuisine": { "type": "string" },
      "budget": { "type": "number" }
    },
    "required": ["location"]
  }
}

This tells the AI:

  • What the tool does
  • When it should be used
  • What inputs are required

Step 3: Implement the MCP Server

Your MCP server acts as a bridge between:

  • AI models
  • Zomato APIs

Responsibilities (a short sketch follows below):

  • Validate inputs
  • Call Zomato endpoints
  • Normalize responses
  • Return clean, structured output

Tech stack options:

  • Node.js (Express / Fastify)
  • Python (FastAPI)
  • Cloud Functions

Security tip:

  • Never expose Zomato API keys directly to the client
  • Handle rate limits and caching at the MCP layer
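
Putting the responsibilities above into code, here’s a minimal sketch of a tool handler in TypeScript. The callZomatoSearch helper is hypothetical and stands in for the real Zomato call:

type SearchInput = { location: string; cuisine?: string; budget?: number };

// Hypothetical helper that performs the real Zomato HTTP call (implemented elsewhere)
declare function callZomatoSearch(input: SearchInput): Promise<{ restaurants: any[] }>;

export async function handleSearchRestaurants(input: SearchInput) {
  // 1. Validate inputs before touching the Zomato API
  if (!input.location) {
    throw new Error("location is required");
  }

  // 2. Call Zomato (API keys stay on the server, never in the client)
  const raw = await callZomatoSearch(input);

  // 3. Normalize: return only clean, structured fields the AI needs
  return raw.restaurants.map((r: any) => ({
    name: r.name,
    rating: r.rating,
    cuisines: r.cuisines,
    costForTwo: r.costForTwo,
  }));
}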

Step 4: Connect MCP to the AI Model

Once the MCP server is live:

  • Register your MCP tools with the LLM
  • Enable tool calling / function calling
  • Let the model decide when to invoke Zomato

Now the AI can:

  • Interpret user intent
  • Call the correct Zomato tool
  • Combine results with natural language reasoning

Step 5: Build Real Use Cases

Here are some high-impact use cases:

🍽 Smart Restaurant Discovery

“Show top-rated Italian restaurants near Connaught Place open after 10 PM”

🧠 AI Food Assistant

“What should I order if I want high-protein food under ₹400?”

📍 Travel + Food Planning

“Plan dinner options near my hotel for 3 days”

🧪 Recommendation Systems

Personalized food suggestions based on past preferences


Best Practices

  • Keep MCP schemas simple and descriptive
  • Cache frequent Zomato queries
  • Add fallback responses when APIs fail
  • Log tool usage for observability
  • Start with read-only use cases before actions

Challenges to Watch Out For

  • API rate limits
  • Location accuracy
  • Data freshness
  • Ambiguous user queries
  • Cost control for AI + API calls

Solving these at the MCP layer keeps your AI experience reliable.


Final Thoughts

Integrating Zomato MCP allows you to move from AI that talks to AI that acts.

By combining:

  • Zomato’s real-world data
  • MCP’s structured tooling
  • LLM reasoning

You can build intelligent, scalable, and delightful food-tech experiences.


Modern web apps are no longer just UI-driven—they’re AI-assisted, context-aware, and action-oriented. With Model Context Protocol (MCP), you can connect AI models directly to real-world services like Zomato in a structured and secure way.

In this part, we’ll walk through how to integrate Zomato MCP in a Next.js app, using best practices for scalability, security, and performance.



Why Use MCP with Next.js?

Next.js is a great fit for MCP-based AI apps because it provides:

  • API Routes / Route Handlers for MCP servers
  • Server Components for secure AI + API calls
  • Edge & Server runtimes for performance
  • Easy integration with AI SDKs

By combining Next.js + MCP + Zomato, you can build:

  • AI-powered food discovery apps
  • Smart travel & food planners
  • Conversational restaurant recommendation tools

High-Level Architecture (Next.js + MCP)

Here’s how everything fits together:

  1. Next.js App (UI)
  2. AI Layer (LLM / Agent)
  3. MCP Server (Next.js API route)
  4. Zomato API

Flow:

  • User asks a question in the UI
  • LLM decides to use a Zomato MCP tool
  • Next.js MCP route calls Zomato API
  • Structured data flows back to the AI
  • AI responds with contextual results

Step 1: Create an MCP Server Using Next.js API Routes

In a Next.js (App Router) project, your MCP server can live inside:

app/api/mcp/zomato/route.ts

This route will:

  • Expose Zomato capabilities as MCP tools
  • Validate inputs
  • Call Zomato APIs securely
  • Return structured JSON responses

Why this works well:

  • API keys stay on the server
  • Easy deployment on Vercel or Node runtimes
  • No separate backend required

Step 2: Define Zomato MCP Tools

Each Zomato capability is exposed as an MCP tool, for example:

  • Search restaurants
  • Filter by cuisine, rating, budget
  • Fetch restaurant details

Each tool includes:

  • Tool name
  • Clear description (for the AI)
  • Input schema
  • Output schema

This helps the AI decide when to call Zomato automatically, instead of relying on brittle prompt logic.


Step 3: Securely Call Zomato APIs

Inside your Next.js MCP route:

  • Store Zomato API keys in process.env
  • Normalize Zomato responses
  • Remove unnecessary fields
  • Handle errors and rate limits

Best practices:

  • Cache frequent queries by location and cuisine (see the sketch below)
  • Add request timeouts
  • Log tool usage for observability
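
For example, here’s a minimal sketch of caching at the MCP layer. It uses a simple in-memory Map (which resets per serverless instance; a shared store like Redis is more robust in production) and assumes the searchRestaurants wrapper shown later in this post:

import { searchRestaurants } from "@/lib/zomato";

const cache = new Map<string, { data: unknown; expires: number }>();
const TTL_MS = 5 * 60 * 1000; // keep results for 5 minutes

export async function cachedSearchRestaurants(args: {
  city: string;
  cuisine?: string;
  budget?: number;
}) {
  const key = `${args.city}:${args.cuisine ?? ""}:${args.budget ?? ""}`;

  // Serve from cache if the entry is still fresh
  const hit = cache.get(key);
  if (hit && hit.expires > Date.now()) return hit.data;

  // Otherwise hit Zomato and cache the normalized result
  const data = await searchRestaurants(args);
  cache.set(key, { data, expires: Date.now() + TTL_MS });
  return data;
}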

Step 4: Connect MCP to the AI Layer

Once your MCP endpoint is live:

  • Register it as a tool with your LLM
  • Enable tool/function calling
  • Let the AI choose when to invoke Zomato MCP

Now your Next.js app can handle queries like:

“Find the best South Indian food near me under ₹250”

The AI:

  1. Understands intent
  2. Calls the Zomato MCP tool
  3. Combines results with reasoning
  4. Returns a clean, user-friendly answer

Step 5: Build the Next.js UI

On the frontend, you can:

  • Use Server Actions for AI requests (see the sketch at the end of this step)
  • Stream responses for better UX
  • Show restaurant cards, maps, or filters

Common UI patterns:

  • Chat-style food assistant
  • Location-based discovery
  • AI-powered search bar
  • Recommendation panels

Because MCP runs server-side, your UI stays:

  • Fast
  • Secure
  • SEO-friendly
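
If you prefer Server Actions over a client-side fetch, one minimal sketch is to proxy the request to the /api/chat orchestrator shown later in this post (askFoodAssistant is a name chosen for illustration):

// app/actions.ts
"use server";

export async function askFoodAssistant(message: string) {
  // Runs only on the server, so API keys and MCP calls never reach the browser
  const res = await fetch(`${process.env.NEXT_PUBLIC_BASE_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message }),
    cache: "no-store",
  });

  if (!res.ok) {
    return { role: "assistant", content: "Something went wrong, please try again." };
  }
  return res.json();
}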

Example Use Cases in Next.js

🍽 AI Restaurant Finder

“Show me top-rated cafes near Connaught Place open now”

🧠 Smart Meal Suggestions

“What should I eat if I want something healthy and spicy?”

✈️ Travel + Food Planner

“Plan lunch and dinner near my hotel for 2 days”

🧪 Personalization

Food recommendations based on past searches or preferences


Best Practices for Next.js + MCP

  • Use Server Components for AI calls
  • Keep MCP schemas simple and descriptive
  • Add fallback responses when APIs fail
  • Cache aggressively at the MCP layer
  • Start with read-only actions before expanding

Common Challenges

  • Zomato API rate limits
  • Location accuracy
  • Ambiguous user queries
  • Cost control for AI + API usage

Solving these at the MCP level keeps your Next.js app clean and maintainable.


This setup allows you to build production-ready AI apps that can reason, act, and scale—all within a single Next.js codebase.


Below is a complete, end-to-end Next.js (App Router) implementation showing how to integrate Zomato via MCP (Model Context Protocol) in a real, runnable way.

This includes:

  • MCP server (API route)
  • Zomato API integration
  • Tool definitions
  • AI tool calling
  • Frontend UI (chat-style)
  • Secure environment setup

I’m assuming:

  • Next.js 14+ (App Router)
  • Node runtime
  • OpenAI / compatible LLM with tool calling
  • Zomato API access

1️⃣ Project Structure

/app
 ├─ /api
 │   ├─ /mcp
 │   │   └─ /zomato
 │   │       └─ route.ts        # MCP Server
 │   └─ /chat
 │       └─ route.ts            # AI Orchestrator
 ├─ page.tsx                    # UI
 └─ layout.tsx
/lib
 ├─ zomato.ts                   # Zomato API wrapper
 ├─ mcp-tools.ts                # MCP tool definitions
 └─ openai.ts                   # AI client
.env.local

2️⃣ Environment Variables (.env.local)

OPENAI_API_KEY=sk-xxxx
ZOMATO_API_KEY=your_zomato_api_key
# Base URL of your app, used by the chat route to call the MCP route
NEXT_PUBLIC_BASE_URL=http://localhost:3000

3️⃣ Zomato API Wrapper (lib/zomato.ts)

// Thin wrapper around the Zomato search endpoint
export async function searchRestaurants({
  city,
  cuisine,
  budget,
}: {
  city: string;
  cuisine?: string;
  budget?: number;
}) {
  const res = await fetch(
    `https://developers.zomato.com/api/v2.1/search?q=${encodeURIComponent(
      city
    )}&cuisines=${encodeURIComponent(cuisine || "")}`,
    {
      headers: {
        "user-key": process.env.ZOMATO_API_KEY!,
      },
    }
  );

  if (!res.ok) {
    throw new Error("Zomato API error");
  }

  const data = await res.json();

  // Normalize the response: keep only the fields the AI needs
  const results = data.restaurants.map((r: any) => ({
    name: r.restaurant.name,
    rating: r.restaurant.user_rating.aggregate_rating,
    address: r.restaurant.location.address,
    cuisines: r.restaurant.cuisines,
    costForTwo: r.restaurant.average_cost_for_two,
  }));

  // Apply the optional budget filter (cost for two), then cap the list
  return results
    .filter((r: any) => (budget ? r.costForTwo <= budget : true))
    .slice(0, 5);
}

4️⃣ MCP Tool Definition (lib/mcp-tools.ts)

export const zomatoTools = [
  {
    type: "function",
    function: {
      name: "search_restaurants",
      description:
        "Find restaurants using Zomato based on city, cuisine, and budget",
      parameters: {
        type: "object",
        properties: {
          city: { type: "string" },
          cuisine: { type: "string" },
          budget: { type: "number" },
        },
        required: ["city"],
      },
    },
  },
];

5️⃣ MCP Server (Next.js API Route)

📍 app/api/mcp/zomato/route.ts

import { NextResponse } from "next/server";
import { searchRestaurants } from "@/lib/zomato";

export async function POST(req: Request) {
  const body = await req.json();
  const { city, cuisine, budget } = body;

  // Validate inputs before calling Zomato, as described above
  if (!city || typeof city !== "string") {
    return NextResponse.json(
      { success: false, error: "city is required" },
      { status: 400 }
    );
  }

  try {
    const data = await searchRestaurants({ city, cuisine, budget });

    return NextResponse.json({
      success: true,
      data,
    });
  } catch (error) {
    return NextResponse.json(
      { success: false, error: "Failed to fetch restaurants" },
      { status: 500 }
    );
  }
}

✔ This is your MCP Tool Execution Layer
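
You can sanity-check this route before wiring up the AI layer. A quick sketch with hypothetical values, assuming the dev server runs on http://localhost:3000:

// Quick manual test of the MCP route (e.g. from a Node 18+ script or the browser console)
const res = await fetch("http://localhost:3000/api/mcp/zomato", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ city: "Delhi", cuisine: "Biryani", budget: 300 }),
});
console.log(await res.json()); // { success: true, data: [...] }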


6️⃣ OpenAI Client (lib/openai.ts)

import OpenAI from "openai";

export const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
});

7️⃣ AI Orchestrator (Tool Calling)

📍 app/api/chat/route.ts

import { NextResponse } from "next/server";
import { openai } from "@/lib/openai";
import { zomatoTools } from "@/lib/mcp-tools";

export async function POST(req: Request) {
  const { message } = await req.json();

  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: message }],
    tools: zomatoTools,
    tool_choice: "auto",
  });

  const msg = completion.choices[0].message;

  // If AI decides to call Zomato
  if (msg.tool_calls) {
    const toolCall = msg.tool_calls[0];
    const args = JSON.parse(toolCall.function.arguments);

    // Call our own MCP route so all Zomato logic stays in one place.
    // (A fuller loop would send the tool result back to the model as a
    // role: "tool" message and ask for a second completion; here we format it directly.)
    const zomatoRes = await fetch(
      `${process.env.NEXT_PUBLIC_BASE_URL}/api/mcp/zomato`,
      {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(args),
      }
    );

    const data = await zomatoRes.json();

    // Fallback response if the Zomato call failed
    if (!data.success) {
      return NextResponse.json({
        role: "assistant",
        content: "Sorry, I couldn't fetch restaurant data right now. Please try again.",
      });
    }

    return NextResponse.json({
      role: "assistant",
      content: `Here are some great options:\n${data.data
        .map(
          (r: any) =>
            `• ${r.name} (${r.rating}⭐) – ${r.cuisines}, ₹${r.costForTwo}`
        )
        .join("\n")}`,
    });
  }

  return NextResponse.json(msg);
}

✔ This is pure MCP-style tool orchestration


8️⃣ Frontend UI (app/page.tsx)

"use client";

import { useState } from "react";

export default function Home() {
  const [input, setInput] = useState("");
  const [messages, setMessages] = useState<string[]>([]);

  async function sendMessage() {
    const res = await fetch("/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message: input }),
    });

    const data = await res.json();
    setMessages((prev) => [...prev, input, data.content]);
    setInput("");
  }

  return (
    <main className="p-6 max-w-xl mx-auto">
      <h1 className="text-2xl font-bold mb-4">
        🍽 AI Food Assistant (Zomato MCP)
      </h1>

      <div className="border p-4 rounded mb-4 min-h-[200px]">
        {messages.map((m, i) => (
          <p key={i} className="mb-2">
            {m}
          </p>
        ))}
      </div>

      <input
        value={input}
        onChange={(e) => setInput(e.target.value)}
        placeholder="Find biryani near Delhi under 300"
        className="border p-2 w-full mb-2"
      />

      <button
        onClick={sendMessage}
        className="bg-black text-white px-4 py-2 rounded"
      >
        Ask
      </button>
    </main>
  );
}

9️⃣ Example Prompts That Work

“Find the best biryani in Delhi under 300”
“Suggest Italian food near Bangalore”
“Good cafes in Mumbai with rating above 4”


🔐 Security & Production Notes

  • Keep Zomato API keys server-only
  • Add caching in MCP route
  • Add rate limits
  • Log tool usage
  • Add fallbacks when APIs fail

🚀 Final Result

You now have:
✔ MCP-compliant tool architecture
✔ AI-driven decision making
✔ Next.js-native backend
✔ Clean UI
✔ Production-ready pattern


Bonus: Using a Mock Server via mcp_config.json

You can also create a mock server for the Zomato MCP and register it directly in mcp_config.json:

  • Open Agent, click the triple-dot menu, and select MCP Server.
  • To add a custom MCP, click View raw config.
  • Then edit mcp_config.json (a sample entry is shown below).
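
The exact shape of mcp_config.json depends on the client you’re using, but a typical stdio-based entry looks something like this (the server name, command, and path are placeholders for your mock server):

{
  "mcpServers": {
    "zomato-mock": {
      "command": "node",
      "args": ["./mock-zomato-mcp/server.js"],
      "env": {
        "ZOMATO_API_KEY": "test-key"
      }
    }
  }
}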
