Price intelligence is one of the most valuable data sources in ecommerce, travel, SaaS, and B2B. If you know when a competitor drops their price, launches a promo, or quietly raises rates — you can respond.
The traditional path: write a custom scraper. But scrapers break. Every site update, every anti-bot upgrade, every DOM change requires maintenance. And if you need to monitor 50 sites, you need 50 scrapers.
There’s a simpler model.
What price monitoring actually needs
A price monitoring system has four jobs:
- Extract the current price from a source URL
- Store the result with a timestamp
- Compare against the previous value
- Alert when something exceeds a threshold
Jobs 2–4 are standard backend logic. Job 1 is where people get stuck — and where workers help.
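Jobs 2 and 3 need nothing exotic: a keyed store of timestamped observations and a comparison against the last one. A minimal in-memory sketch (the `recordPrice` helper and its shape are illustrative, not part of any API):

```javascript
// In-memory price history: one array of timestamped observations per product name.
const history = new Map();

// Record a new observation and report how it compares to the previous one.
function recordPrice(name, price, now = new Date()) {
  const entries = history.get(name) ?? [];
  const prev = entries.at(-1);
  entries.push({ price, observedAt: now.toISOString() });
  history.set(name, entries);
  return {
    previous: prev ? prev.price : null,
    current: price,
    changed: prev !== undefined && prev.price !== price,
  };
}
```

Swap the `Map` for a database table once you care about history surviving restarts.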
Using workers for price extraction
Workers are pre-built, maintained extractors. Instead of writing and hosting a scraper for Amazon, a worker for Amazon product pages already exists. You call it with a URL, get structured data back.
```bash
curl -X POST https://api.seek-api.com/v1/workers/amazon-product/jobs \
  -H "X-Api-Key: YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://www.amazon.com/dp/B0CHWCC249"}'
```
Result:
```json
{
  "title": "Apple AirPods Pro (2nd Generation)",
  "price": 189.99,
  "currency": "USD",
  "availability": "In Stock",
  "rating": 4.4,
  "reviewCount": 82341,
  "seller": "Amazon.com",
  "prime": true,
  "originalPrice": 249.00,
  "discount": "24%"
}
```
No scraper to write. No proxy pool to manage. No Playwright session to debug.
The full pipeline
Here’s a complete price monitoring system in ~50 lines of Node.js:
```javascript
import Cron from 'node-cron';
import fetch from 'node-fetch';

const TARGETS = [
  { name: "AirPods Pro", worker: "amazon-product", url: "https://amazon.com/dp/B0CHWCC249" },
  { name: "Competitor SaaS", worker: "website-price-extractor", url: "https://competitor.com/pricing" },
];

const HISTORY = {};

async function checkPrice(target) {
  // Submit the extraction job
  const { job_uuid } = await fetch(`https://api.seek-api.com/v1/workers/${target.worker}/jobs`, {
    method: 'POST',
    headers: { 'X-Api-Key': process.env.SEEKAPI_KEY, 'Content-Type': 'application/json' },
    body: JSON.stringify({ url: target.url }),
  }).then(r => r.json());

  // Poll until the job completes
  let result;
  while (true) {
    const status = await fetch(`https://api.seek-api.com/v1/jobs/${job_uuid}`, {
      headers: { 'X-Api-Key': process.env.SEEKAPI_KEY },
    }).then(r => r.json());
    if (status.status === 'completed') { result = status.result; break; }
    await new Promise(r => setTimeout(r, 3000));
  }

  // Compare against the last observation and alert on any change
  // (prev !== undefined, rather than a truthiness check, so a price of 0 still compares)
  const prev = HISTORY[target.name];
  const curr = result.price;
  if (prev !== undefined && prev !== curr) {
    await sendAlert(target.name, prev, curr);
  }
  HISTORY[target.name] = curr;
}

async function sendAlert(name, prev, curr) {
  const direction = curr < prev ? '⬇️ dropped' : '⬆️ raised';
  await fetch(process.env.SLACK_WEBHOOK, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      text: `*${name}* price ${direction}: $${prev} → $${curr}`,
    }),
  });
}

// Run every 6 hours; log failures instead of crashing the scheduler
Cron.schedule('0 */6 * * *', () => {
  TARGETS.forEach(t => checkPrice(t).catch(console.error));
});
```
Deploy this to any server or cloud function. You now have a live price alert system with Slack notifications.
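One caveat: the in-memory `HISTORY` object is lost on every restart, which matters especially on serverless platforms. A minimal file-backed sketch (the path and JSON shape are illustrative; on a cloud function you would swap the file for S3, Redis, or a database):

```javascript
import { readFileSync, writeFileSync, existsSync } from 'node:fs';

// Load saved prices from disk, or start fresh if no file exists yet.
function loadHistory(path = './price-history.json') {
  return existsSync(path) ? JSON.parse(readFileSync(path, 'utf8')) : {};
}

// Persist the full history object after each run.
function saveHistory(history, path = './price-history.json') {
  writeFileSync(path, JSON.stringify(history, null, 2));
}
```

Load once at startup, call `saveHistory(HISTORY)` at the end of each cron run, and price comparisons survive redeploys.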
Scaling to hundreds of products
For large catalogs, batch-submit all jobs, then poll in parallel:

```javascript
// submitJob and pollUntilComplete wrap the same POST and GET calls used above
const jobs = await Promise.all(
  PRODUCTS.map(product =>
    submitJob(product.worker, { url: product.url })
      .then(j => ({ ...j, product }))
  )
);

// Poll all job statuses concurrently
const results = await Promise.all(jobs.map(j => pollUntilComplete(j.job_uuid)));
```

A hundred product checks complete in seconds rather than minutes because the jobs run in parallel.
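The `submitJob` and `pollUntilComplete` helpers are thin wrappers around the two endpoints shown earlier. One possible sketch, assuming Node 18+ global `fetch` (or the `node-fetch` import from above); the injectable `getStatus` parameter and the timeout behavior are design choices of this sketch, not part of the API:

```javascript
const API = 'https://api.seek-api.com/v1';

// Submit a job to a worker and return the API's response, e.g. { job_uuid }.
async function submitJob(worker, payload) {
  const res = await fetch(`${API}/workers/${worker}/jobs`, {
    method: 'POST',
    headers: { 'X-Api-Key': process.env.SEEKAPI_KEY, 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  return res.json();
}

// Poll a job until it completes, with a bounded number of attempts so a stuck
// job cannot hang the batch forever. getStatus is injectable for testing.
async function pollUntilComplete(jobUuid, { getStatus, intervalMs = 3000, maxAttempts = 40 } = {}) {
  const fetchStatus = getStatus ?? (async (id) => {
    const res = await fetch(`${API}/jobs/${id}`, {
      headers: { 'X-Api-Key': process.env.SEEKAPI_KEY },
    });
    return res.json();
  });
  for (let i = 0; i < maxAttempts; i++) {
    const status = await fetchStatus(jobUuid);
    if (status.status === 'completed') return status.result;
    await new Promise(r => setTimeout(r, intervalMs));
  }
  throw new Error(`Job ${jobUuid} did not complete in time`);
}
```

For very large batches, consider capping concurrency (e.g. polling in chunks) rather than firing hundreds of simultaneous requests.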
Use case: SaaS competitor pricing changes
Many SaaS companies update their pricing pages without announcement. A weekly automated check against /pricing pages using a generic text-extractor worker can surface changes to tiers, limits, and prices before they get picked up by industry newsletters.
```
POST /v1/workers/webpage-extractor/jobs
{ "url": "https://competitor.com/pricing", "fields": ["price", "plan_name", "limits"] }
```
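Detecting a tier or limit change then comes down to diffing the current extraction result against the previous one. A small, generic sketch (the field names are just examples of what an extractor might return):

```javascript
// Compare two extraction results and return the fields that changed.
function diffFields(prev, curr) {
  const keys = new Set([...Object.keys(prev), ...Object.keys(curr)]);
  const changes = [];
  for (const key of keys) {
    // JSON.stringify handles nested values like arrays of plan limits
    if (JSON.stringify(prev[key]) !== JSON.stringify(curr[key])) {
      changes.push({ field: key, from: prev[key] ?? null, to: curr[key] ?? null });
    }
  }
  return changes;
}
```

An empty array means nothing moved; anything else is worth a Slack message.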
Use case: Flight price drops
```
POST /v1/workers/flight-prices/jobs
{ "origin": "CDG", "destination": "JFK", "date": "2026-06-15" }
```
Run daily, alert when the cheapest fare drops more than 15% vs. last observation.
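The 15% rule is one line of arithmetic against the last observed fare. A sketch (the helper name and default threshold are illustrative):

```javascript
// True when the new fare is at least `thresholdPct` percent below the last one.
function isSignificantDrop(lastFare, newFare, thresholdPct = 15) {
  if (lastFare == null || lastFare <= 0) return false; // nothing to compare against yet
  return ((lastFare - newFare) / lastFare) * 100 >= thresholdPct;
}
```

Plug this into the alert step in place of the plain inequality check: a $400 fare falling to $330 is a 17.5% drop and fires; falling to $360 is only 10% and stays quiet.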
What you don’t need to build
With this approach, you skip:
- Proxy management
- Anti-bot bypass research
- Playwright / Puppeteer browser sessions
- CSS selector maintenance
- Site-specific scraper logic
Workers are maintained by people who specialize in extracting from those sources. When a site changes, the worker gets updated — not your code.