JavaScript has no equivalent of pytrends: no unofficial Google Trends library for Node.js has earned sustained community trust. Trends MCP fills that gap directly: it is accessible via standard fetch from any JavaScript environment (Node.js, browser, edge runtime, serverless function) and returns structured JSON with normalized values and absolute volume estimates from 15+ sources.
Free API access
100 free requests per month. No credit card, no setup fee.
Replaced my manual Google Trends scraper in an afternoon. The data is clean and the latency is surprisingly low for a free tier.
We use it for keyword trend reports. The free monthly quota keeps us batching queries for weekly digests. Upgrading is there when we need more headroom.
Hooked it into my MCP server in like 20 minutes. The JSON response is well-structured and the docs are solid. Exactly what I needed.
We pipe weekly series into BigQuery for a few brand cohorts. Compared to maintaining our old Selenium job, this is boring in the best way. Uptime has been solid.
Great for slide-ready trend screenshots when leadership asks why we are prioritizing a feature. I wish the dashboard had saved views, but the API side is great.
Running it from Cursor with the MCP config took one try. I am not a trends person, but my side project now emails me when a niche keyword spikes hard week over week.
Using the growth endpoints to sanity-check retail names before I write up notes. Occasionally the normalization differs from what I see in the raw Google UI, but it is consistent run to run.
Pulling multi-source ranked lists into a notebook is straightforward. Error payloads are actually readable when I fat-finger a parameter, which matters more than people admit.
Does what it says. I knocked a star because onboarding assumed I already knew MCP wiring; a copy-paste block for Claude Desktop would have saved me 15 minutes.
We track TikTok hashtag momentum against paid spend in a Looker sheet. Not glamorous work, but it is the first tool my team did not argue about during rollout.
Retries are predictable and I have not seen weird HTML in responses (looking at you, scrapers). Would pay for a team key rotation flow, but for now we rotate manually.
Quick checks on retail buzz before we dig into filings. Not a silver bullet, but it is faster than opening twelve browser tabs and reconciling by hand.
Helpful for spotting whether a topic is a one-day meme or sticking around. I still cross-check with Search Console, but this gets me 80% of the signal in one call.
I demo this in workshops when people ask how to ground LLM answers in something fresher than training data. The MCP angle lands well with engineers who hate glue code.
Solid for client reporting. Billing is clear enough that finance stopped asking me what line item this is. Minor nit: peak hours can feel a touch slower, still acceptable.
I wired this behind a small CLI for contributors who want trend context in issues. Keeping the surface area tiny matters for OSS, and the schema has not churned on me yet.
Daily pulls for a 30-day window go straight into our internal scoreboard. Stakeholders finally stopped debating whose screenshot of Trends was newer.
We are pre-revenue, so free tier discipline matters. I hit the cap once during a brainstorm where everyone wanted to try random keywords. Learned to batch smarter.
Security review passed without drama: HTTPS, scoped keys, no bizarre third-party redirects in the chain we could find. That is rarer than vendors think.
I do not need this daily, but when App Store rank shifts look weird, having Reddit and news context in one place saves me from context switching across six apps.
I use it to see if a story is genuinely blowing up or just loud on one platform. It is not a replacement for reporting, but it keeps my ledes honest.
We moved off a brittle Playwright script that broke every time Google shuffled markup. Same data shape every week now, which is all I wanted from life.
Seasonal demand spikes line up with what we see in Amazon search interest here. Merch team stopped sending me screenshots from random tools that never matched.
Solid for client decks. I docked one star only because I still export to Sheets manually; a direct connector would be nice someday.
Steam concurrents plus Reddit chatter in one workflow beats our old spreadsheet ritual before milestone reviews.
Quick pulse on whether a feature name is confusing people in search before we ship copy. Cheap sanity check compared to a full survey.
Monitored from Grafana via a thin wrapper. p95 stayed under our SLO budget last month. One noisy day during a holiday but nothing alarming.
Narrative fights in meetings got shorter once we could point at the same trend line everyone agreed on. Sounds silly until you have lived through it.
Using normalized series as a weak prior in a forecasting experiment. Citation-friendly timestamps in the payload made reproducing runs less painful.
Approved for our pilot group after a quick vendor review. Would love SAML, not a blocker for our size.
YouTube search interest plus TikTok hashtags in one place helps me explain why a sponsor should care about a vertical without hand-waving.
Cron job hits the API before standup; Slack gets a compact summary. Took an afternoon to wire, has been stable for two quarters.
Useful for public-interest topics where search interest is a rough proxy for attention. I still triangulate with primary sources; this is one signal among several.
Runs in a VPC egress-only subnet with allowlisted domains. Fewer exceptions to explain to auditors than our last vendor.
Spotting when a topic is about to flood Discord saves my team from reactive moderation fires. Not perfect, but directionally right often enough.
For lean teams the ROI story writes itself. I would not build an in-house scraper for this anymore unless compliance forced it.
Examples in the docs match what the MCP actually returns. You would be surprised how rare that is in this category.
Pager stayed quiet. When something upstream flaked once, the error string told me which parameter to fix without opening logs first.
Students use it for coursework demos. Budget is tight so free tier matters; we coach them to cache aggressively.
Helps prep talking points when retail interest in our name swings after earnings. Not material disclosure, just context for Q&A prep.
Response sizes stay small enough for mobile hotspots. I hate APIs that dump megabytes for a sparkline.
Python developers have pytrends. JavaScript developers have a gap. The few npm packages that attempt Google Trends scraping have poor maintenance records and no community consensus around reliability. The standard approach for JavaScript projects that need trend data has been either to run a Python microservice for data collection or to fall back on Google Trends' manual interface and forgo automation.
Trends MCP is a managed HTTP API that works from any JavaScript environment via fetch. The response is structured JSON with a consistent schema, which makes it straightforward to use in web apps, API routes, serverless functions, or AI agent tools.
The API accepts standard HTTPS POST requests with JSON-RPC bodies. Node.js v18+ includes native fetch, so no additional HTTP library is required:
const API_KEY = process.env.TRENDS_MCP_API_KEY;
const ENDPOINT = "https://api.trendsmcp.ai/mcp";

async function callTrendsMCP(toolName, params) {
  const response = await fetch(ENDPOINT, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      jsonrpc: "2.0",
      method: "tools/call",
      params: {
        name: toolName,
        arguments: params,
      },
      id: 1,
    }),
  });
  if (!response.ok) {
    throw new Error(`Trends MCP request failed: HTTP ${response.status}`);
  }
  const data = await response.json();
  // The trend data is in the MCP response content
  return JSON.parse(data.result.content[0].text);
}

// Get weekly Google Search trend for any keyword
const trendData = await callTrendsMCP("get_trends", {
  keyword: "electric vehicles",
  source: "google",
  data_mode: "weekly",
});
The returned trendData object contains the time series array with date, normalized_value, absolute_volume_estimate, and data_quality_score per data point. Pipe it directly into a chart library.
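A minimal sketch of consuming that shape. The sample payload below is illustrative (hand-written to match the field names above, not real API output), and the two helpers are hypothetical conveniences, not part of the API:

```javascript
// Illustrative sample shaped like the documented per-point fields.
const series = [
  { date: "2024-05-06", normalized_value: 48, absolute_volume_estimate: 120000, data_quality_score: 0.9 },
  { date: "2024-05-13", normalized_value: 60, absolute_volume_estimate: 150000, data_quality_score: 0.9 },
];

// Map to the {x, y} pairs most chart libraries accept.
function toChartPoints(points) {
  return points.map((p) => ({ x: p.date, y: p.normalized_value }));
}

// Week-over-week change of the normalized value, as a percentage.
function lastDeltaPercent(points) {
  if (points.length < 2) return 0;
  const prev = points[points.length - 2].normalized_value;
  const last = points[points.length - 1].normalized_value;
  return ((last - prev) / prev) * 100;
}
```

The same two helpers cover both common uses: `toChartPoints` feeds a sparkline, and `lastDeltaPercent` drives a "spiking this week" badge.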
For Next.js applications, the cleanest pattern is a route handler that proxies requests to Trends MCP, keeping the API key server-side:
// app/api/trends/route.js
export async function POST(request) {
  const { keyword, source } = await request.json();
  const trendData = await callTrendsMCP("get_growth", {
    keyword,
    source: source || "google, tiktok, reddit",
    percent_growth: ["1M", "3M", "1Y"],
  });
  return Response.json(trendData);
}
Your frontend components then call /api/trends and never touch the Trends MCP API key directly.
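On the client side, a thin helper keeps call sites tidy. A sketch, assuming the proxy route above; both functions here are hypothetical (`formatGrowth` assumes growth arrives as a fraction, which is an assumption about the payload, not documented behavior):

```javascript
// Hypothetical client helper for the /api/trends proxy route.
async function fetchGrowth(keyword, source = "google") {
  const res = await fetch("/api/trends", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ keyword, source }),
  });
  if (!res.ok) throw new Error(`Trends proxy failed: HTTP ${res.status}`);
  return res.json();
}

// Format a growth fraction for display, e.g. 0.42 -> "+42.0%".
function formatGrowth(fraction) {
  const pct = (fraction * 100).toFixed(1);
  return `${fraction >= 0 ? "+" : ""}${pct}%`;
}
```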
For apps that surface what is trending right now, get_top_trends requires no keyword and returns a live list of trending topics on any platform:
const trendingNow = await callTrendsMCP("get_top_trends", {
  type: "TikTok Trending Hashtags",
  limit: 20,
});

// Map the results directly to UI components
const feed = trendingNow.trends.map((item) => ({
  name: item.name,
  value: item.normalized_value,
  growth: item.growth_rate,
}));
This works in serverless functions that run on a cron schedule to populate a cached trending feed, or in real-time API routes that serve a live feed component.
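For the cached-feed variant, the core is a small TTL cache. A minimal in-memory sketch (fine for a single long-lived process; serverless instances do not share memory, so substitute KV, Redis, or your platform's cache for real deployments):

```javascript
// Minimal single-value TTL cache for a cron-refreshed trending feed.
function createTtlCache(ttlMs) {
  let value = null;
  let expiresAt = 0;
  return {
    // Returns the cached value, or null once the TTL has lapsed.
    get(now = Date.now()) {
      return now < expiresAt ? value : null;
    },
    // Stores a value and stamps its expiry.
    set(next, now = Date.now()) {
      value = next;
      expiresAt = now + ttlMs;
    },
  };
}
```

The request handler then returns `cache.get()` when it is fresh and only calls get_top_trends on a miss, which keeps a popular feed page within the free request quota.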
JavaScript AI frameworks can use Trends MCP as a tool definition. For Vercel AI SDK:
import { tool } from "ai";
import { z } from "zod";

const getTrends = tool({
  description: "Get trend data for a keyword across multiple platforms",
  parameters: z.object({
    keyword: z.string(),
    source: z.string().optional().default("google"),
    data_mode: z.enum(["weekly", "daily"]).optional().default("weekly"),
  }),
  execute: async ({ keyword, source, data_mode }) => {
    return await callTrendsMCP("get_trends", { keyword, source, data_mode });
  },
});
The tool is then passed to the AI model alongside other tools, and the model calls it automatically when a user asks about trends.
For direct MCP protocol support in Node.js, the MCP TypeScript SDK (@modelcontextprotocol/sdk) can connect to Trends MCP via HTTP transport at https://api.trendsmcp.ai/mcp. This is the most direct integration path for custom MCP clients built in TypeScript.
Trends MCP works in edge runtimes (Cloudflare Workers, Vercel Edge Functions) with no modifications. The API uses standard HTTPS and JSON - no Node.js-specific APIs, no streaming that requires special handling. Call it with the standard fetch API and parse the response.
One edge consideration: API key management. Environment variables on Cloudflare Workers and Vercel Edge Functions work the same way as in Node.js. Store the key as a secret, reference it in the handler.
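A sketch of the portable pattern. `buildMcpRequest` is a hypothetical helper (it returns plain data, so it runs unchanged in Node, Workers, and edge functions); the Worker shape is the standard module format, and the secret name `TRENDS_MCP_API_KEY` matches the Node example above:

```javascript
// Build the JSON-RPC request as plain data; no runtime-specific APIs.
function buildMcpRequest(toolName, args, apiKey) {
  return {
    url: "https://api.trendsmcp.ai/mcp",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Authorization": `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        jsonrpc: "2.0",
        method: "tools/call",
        params: { name: toolName, arguments: args },
        id: 1,
      }),
    },
  };
}

// Cloudflare Worker sketch: the key arrives on env after
// `wrangler secret put TRENDS_MCP_API_KEY`.
const worker = {
  async fetch(request, env) {
    const { keyword } = await request.json();
    const { url, init } = buildMcpRequest(
      "get_trends",
      { keyword, source: "google" },
      env.TRENDS_MCP_API_KEY,
    );
    const upstream = await fetch(url, init);
    return new Response(await upstream.text(), {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```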
Connect
An API key is required to connect. Get your free key above, then copy the pre-filled config for your client.
Cursor
Cursor Settings → Tools & MCP → Add a Custom MCP Server
"trends-mcp": { "url": "https://api.trendsmcp.ai/mcp", "transport": "http", "headers": { "Authorization": "Bearer YOUR_API_KEY" } }
Or paste into the config file directly: Mac / Linux — ~/.cursor/mcp.json
Windows — %USERPROFILE%\.cursor\mcp.json
↑ Get your free key above first — the config won't work without it.
Claude Desktop
User → Settings → Developer → Edit Config — add inside mcpServers
"trends-mcp": { "command": "npx", "args": [ "-y", "mcp-remote", "https://api.trendsmcp.ai/mcp", "--header", "Authorization:${AUTH_HEADER}" ], "env": { "AUTH_HEADER": "Bearer YOUR_API_KEY" } }
Mac — ~/Library/Application Support/Claude/claude_desktop_config.json
Windows — %APPDATA%\Claude\claude_desktop_config.json
Fully quit and restart Claude Desktop after saving.
Claude Code (CLI)
claude mcp add --transport http trends-mcp https://api.trendsmcp.ai/mcp \
  --header "Authorization: Bearer YOUR_API_KEY"
Windsurf
Settings → Advanced Settings → Cascade → Add custom server +
"trends-mcp": { "url": "https://api.trendsmcp.ai/mcp", "transport": "http", "headers": { "Authorization": "Bearer YOUR_API_KEY" } }
Mac / Linux — ~/.codeium/windsurf/mcp_config.json
Windows — %USERPROFILE%\.codeium\windsurf\mcp_config.json
Or: Command Palette → Windsurf: Configure MCP Servers
VS Code
Extensions sidebar → search @mcp trends-mcp → Install — or paste manually into .vscode/mcp.json inside servers
"trends-mcp": { "type": "http", "url": "https://api.trendsmcp.ai/mcp", "headers": { "Authorization": "Bearer YOUR_API_KEY" } }
Paste into .vscode/mcp.json, or:
Command Palette (⇧⌘P / Ctrl+Shift+P) → MCP: Add Server
Data Sources
All data is normalized to a 0-100 scale for consistent cross-platform comparison.
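That 0-100 scale is the familiar min-max rescale. As an illustration of the concept only (not necessarily the provider's exact formula):

```javascript
// Min-max rescale of a raw series onto 0-100 for cross-platform comparison.
function normalizeTo100(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  if (max === min) return values.map(() => 0); // flat series: no spread to scale
  return values.map((v) => ((v - min) / (max - min)) * 100);
}
```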
Tools
Four tools, organized by how you start. With a keyword, track history and growth. Without one, use discovery to see ranked movers or what is live right now.
You already have a keyword.
Chart how it moves over time and compare growth across sources.
No keyword required.
Ranked lists on one source with a growth sort you choose, or a live snapshot of what is trending across platforms.
Outputs
FAQ