scrape-badger/scrapebadger-node

ScrapeBadger Node.js SDK


The official Node.js/TypeScript client library for the ScrapeBadger API.

Features

  • Full TypeScript Support - Complete type definitions for all API endpoints
  • Modern ESM & CommonJS - Works with both module systems
  • Async Iterators - Automatic pagination with for await...of syntax
  • Smart Rate Limiting - Reads API headers and throttles pagination automatically
  • Resilient Retries - Exponential backoff with colored console warnings
  • Typed Exceptions - Distinct error classes for every failure scenario
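The async-iterator pagination mentioned above can be sketched as a small self-contained example. This is an illustrative model of how a cursor-based `for await...of` helper works, not the SDK's actual internals; the names `paginate`, `Page`, and `fetchPage` are hypothetical:

```typescript
// Hypothetical sketch of cursor-based pagination exposed as an async iterator.
// Page shape and function names are illustrative, not the SDK's real API.
interface Page<T> {
  items: T[];
  nextCursor: string | null; // null means there are no more pages
}

async function* paginate<T>(
  fetchPage: (cursor: string | null) => Promise<Page<T>>
): AsyncGenerator<T> {
  let cursor: string | null = null;
  do {
    const page = await fetchPage(cursor);
    yield* page.items;        // hand items to the consumer one at a time
    cursor = page.nextCursor; // advance to the next page
  } while (cursor !== null);
}

// Usage with a stubbed two-page source:
const pages: Record<string, Page<number>> = {
  start: { items: [1, 2], nextCursor: "p2" },
  p2: { items: [3], nextCursor: null },
};

async function main() {
  const results: number[] = [];
  for await (const item of paginate((c) => Promise.resolve(pages[c ?? "start"]))) {
    results.push(item);
  }
  console.log(results); // [ 1, 2, 3 ]
}
main();
```

The consumer never sees cursors or page boundaries; it just iterates items, which is what makes `for await...of` a good fit for paginated APIs.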

Installation

npm install scrapebadger
yarn add scrapebadger
pnpm add scrapebadger

Quick Start

import { ScrapeBadger } from "scrapebadger";

const client = new ScrapeBadger({ apiKey: "your-api-key" });

// Get a tweet
const tweet = await client.twitter.tweets.getById("1234567890");
console.log(`@${tweet.username}: ${tweet.text}`);

// Scrape a website
const result = await client.web.scrape("https://scrapebadger.com", { format: "markdown" });
console.log(result.content);

// Get a user profile
const user = await client.twitter.users.getByUsername("elonmusk");
console.log(`${user.name} has ${user.followers_count.toLocaleString()} followers`);

Authentication

// Pass API key directly
const client = new ScrapeBadger({ apiKey: "sb_live_xxxxxxxxxxxxx" });

// Or use the SCRAPEBADGER_API_KEY environment variable
const client = new ScrapeBadger();

Available APIs

| API | Description | Documentation |
| --- | --- | --- |
| Web Scraping | Scrape any website with JS rendering, anti-bot bypass, and AI extraction | Web Scraping Guide |
| Twitter | 37+ endpoints for tweets, users, lists, communities, trends, and real-time streams | Twitter Guide |
| Vinted | Search items, get details, user profiles, and reference data across all Vinted markets | Vinted Guide |

Error Handling

import {
  ScrapeBadger,
  AuthenticationError,
  RateLimitError,
  NotFoundError,
  InsufficientCreditsError,
} from "scrapebadger";

const client = new ScrapeBadger({ apiKey: "your-api-key" });

try {
  const tweet = await client.twitter.tweets.getById("1234567890");
} catch (error) {
  if (error instanceof AuthenticationError) {
    console.error("Invalid API key");
  } else if (error instanceof RateLimitError) {
    console.error(`Rate limited. Retry after: ${error.retryAfter}`);
  } else if (error instanceof NotFoundError) {
    console.error("Tweet not found");
  } else if (error instanceof InsufficientCreditsError) {
    console.error("Out of credits");
  } else {
    throw error;
  }
}

Configuration

const client = new ScrapeBadger({
  // Required: Your API key (or use SCRAPEBADGER_API_KEY env var)
  apiKey: "your-api-key",

  // Optional: Custom base URL (default: https://scrapebadger.com)
  baseUrl: "https://scrapebadger.com",

  // Optional: Request timeout in milliseconds (default: 30000)
  timeout: 30000,

  // Optional: Maximum retry attempts (default: 10)
  maxRetries: 10,

  // Optional: Initial retry delay in milliseconds (default: 1000)
  retryDelay: 1000,
});

Retry Behavior

The SDK automatically retries requests that fail with server errors (5xx) or rate limits (429) using exponential backoff (1s, 2s, 4s, 8s, ...). Each retry prints a colored warning:

⚠ ScrapeBadger: 503 Service Unavailable — retrying in 4s (attempt 3/10)
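The backoff schedule described above (doubling from the configured `retryDelay`) can be expressed as a one-line formula. This is an illustrative sketch of the schedule, not the SDK's internal code; `backoffDelayMs` is a hypothetical name:

```typescript
// Illustrative: exponential backoff delay, doubling each attempt
// starting from the configured initial retryDelay (default 1000ms).
function backoffDelayMs(attempt: number, retryDelay = 1000): number {
  // attempt is 1-based: 1 -> 1000ms, 2 -> 2000ms, 3 -> 4000ms, ...
  return retryDelay * 2 ** (attempt - 1);
}

console.log([1, 2, 3, 4].map((a) => backoffDelayMs(a))); // [ 1000, 2000, 4000, 8000 ]
```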

Rate Limit Aware Pagination

When using *All pagination methods (e.g. searchAll, getFollowersAll), the SDK reads X-RateLimit-Remaining and X-RateLimit-Reset headers from each response. When remaining requests drop below 20% of your tier's limit, pagination automatically slows down to spread requests across the remaining window — preventing 429 errors:

⚠ ScrapeBadger: Rate limit: 25/300 remaining (resets in 42s), throttling pagination

This works transparently with all tier levels (Free: 60/min, Basic: 300/min, Pro: 1000/min, Enterprise: 5000/min).
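The throttling rule above can be modeled as a small function: below 20% of the tier limit, spread the remaining requests evenly across the time left in the window. This is a sketch of the described behavior under those assumptions, not the SDK's actual implementation; `throttleDelayMs` is a hypothetical name:

```typescript
// Illustrative: delay to insert between paginated requests, derived from
// the X-RateLimit-Remaining and X-RateLimit-Reset header values.
function throttleDelayMs(remaining: number, limit: number, resetInSec: number): number {
  if (remaining > limit * 0.2) return 0;        // plenty of quota left: no delay
  if (remaining <= 0) return resetInSec * 1000; // quota exhausted: wait for the reset
  return (resetInSec * 1000) / remaining;       // spread requests over the window
}

// Matches the warning above: 25/300 remaining, window resets in 42s.
console.log(throttleDelayMs(25, 300, 42));  // 1680 (ms between pages)
console.log(throttleDelayMs(200, 300, 42)); // 0
```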

Exceptions

  • ScrapeBadgerError - Base exception class
  • AuthenticationError - Invalid or missing API key
  • RateLimitError - Rate limit exceeded
  • NotFoundError - Resource not found
  • ValidationError - Invalid request
  • ServerError - Server error
  • TimeoutError - Request timeout
  • InsufficientCreditsError - Out of credits
  • AccountRestrictedError - Account restricted
  • WebSocketStreamError - WebSocket stream failure (auth, limit, or network)

Requirements

  • Node.js 18+ (for native fetch support)
  • TypeScript 5.0+ (for best type inference)

License

MIT License - see LICENSE for details.
