The official Node.js/TypeScript client library for the ScrapeBadger API.
- Full TypeScript Support - Complete type definitions for all API endpoints
- Modern ESM & CommonJS - Works with both module systems
- Async Iterators - Automatic pagination with `for await...of` syntax
- Smart Rate Limiting - Reads API headers and throttles pagination automatically
- Resilient Retries - Exponential backoff with colored console warnings
- Typed Exceptions - Distinct error classes for every failure scenario
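The async-iterator pagination the SDK exposes through its `*All` methods can be sketched as a plain async generator. This is a self-contained illustration of the pattern, not the SDK's internals; `paginate`, `Page`, and the in-memory pages are all hypothetical:

```typescript
// Sketch of cursor-based pagination behind an AsyncIterable, in the same
// spirit as the SDK's *All methods. fetchPage stands in for an API call.
type Page<T> = { items: T[]; nextCursor?: string };

async function* paginate<T>(
  fetchPage: (cursor?: string) => Promise<Page<T>>
): AsyncGenerator<T> {
  let cursor: string | undefined;
  do {
    const page = await fetchPage(cursor);
    yield* page.items; // hand items to the consumer one at a time
    cursor = page.nextCursor; // undefined cursor means last page
  } while (cursor);
}

// Demo against an in-memory "API" that serves two pages.
const pages: Record<string, Page<number>> = {
  start: { items: [1, 2], nextCursor: "p2" },
  p2: { items: [3] },
};

const results: number[] = [];
for await (const n of paginate<number>((c) => Promise.resolve(pages[c ?? "start"]))) {
  results.push(n);
}
console.log(results); // logs 1, 2, 3 in page order
```

The consumer never sees cursors or page boundaries, which is what lets a single `for await...of` loop walk an arbitrarily long result set.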
```sh
npm install scrapebadger
# or
yarn add scrapebadger
# or
pnpm add scrapebadger
```

```ts
import { ScrapeBadger } from "scrapebadger";

const client = new ScrapeBadger({ apiKey: "your-api-key" });

// Get a tweet
const tweet = await client.twitter.tweets.getById("1234567890");
console.log(`@${tweet.username}: ${tweet.text}`);

// Scrape a website
const result = await client.web.scrape("https://scrapebadger.com", { format: "markdown" });
console.log(result.content);

// Get a user profile
const user = await client.twitter.users.getByUsername("elonmusk");
console.log(`${user.name} has ${user.followers_count.toLocaleString()} followers`);
```

```ts
// Pass API key directly
const client = new ScrapeBadger({ apiKey: "sb_live_xxxxxxxxxxxxx" });

// Or use the SCRAPEBADGER_API_KEY environment variable
const client = new ScrapeBadger();
```

| API | Description | Documentation |
|---|---|---|
| Web Scraping | Scrape any website with JS rendering, anti-bot bypass, and AI extraction | Web Scraping Guide |
| Twitter | 37+ endpoints for tweets, users, lists, communities, trends, and real-time streams | Twitter Guide |
| Vinted | Search items, get details, user profiles, and reference data across all Vinted markets | Vinted Guide |
```ts
import {
  ScrapeBadger,
  AuthenticationError,
  RateLimitError,
  NotFoundError,
  InsufficientCreditsError,
} from "scrapebadger";

const client = new ScrapeBadger({ apiKey: "your-api-key" });

try {
  const tweet = await client.twitter.tweets.getById("1234567890");
} catch (error) {
  if (error instanceof AuthenticationError) {
    console.error("Invalid API key");
  } else if (error instanceof RateLimitError) {
    console.error(`Rate limited. Retry after: ${error.retryAfter}`);
  } else if (error instanceof NotFoundError) {
    console.error("Tweet not found");
  } else if (error instanceof InsufficientCreditsError) {
    console.error("Out of credits");
  } else {
    throw error;
  }
}
```

```ts
const client = new ScrapeBadger({
  // Required: your API key (or use the SCRAPEBADGER_API_KEY env var)
  apiKey: "your-api-key",

  // Optional: custom base URL (default: https://scrapebadger.com)
  baseUrl: "https://scrapebadger.com",

  // Optional: request timeout in milliseconds (default: 30000)
  timeout: 30000,

  // Optional: maximum retry attempts (default: 10)
  maxRetries: 10,

  // Optional: initial retry delay in milliseconds (default: 1000)
  retryDelay: 1000,
});
```

The SDK automatically retries requests that fail with server errors (5xx) or rate limits (429), using exponential backoff (1s, 2s, 4s, 8s, ...). Each retry prints a colored warning:
```
⚠ ScrapeBadger: 503 Service Unavailable — retrying in 4s (attempt 3/10)
```
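The retry schedule above (delay doubling from `retryDelay`, giving up after `maxRetries` attempts) can be sketched independently of the SDK. `withRetries` is a hypothetical helper for illustration; only the `retryDelay`/`maxRetries` names and the doubling progression come from the docs:

```typescript
// Illustrative retry wrapper: delay = retryDelay * 2^attempt, matching the
// 1s, 2s, 4s, 8s, ... progression described above.
async function withRetries<T>(
  fn: () => Promise<T>,
  { maxRetries = 10, retryDelay = 1000 }: { maxRetries?: number; retryDelay?: number } = {}
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err; // out of attempts: surface the error
      const delay = retryDelay * 2 ** attempt;
      console.warn(`retrying in ${delay / 1000}s (attempt ${attempt + 1}/${maxRetries})`);
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// The raw schedule for the defaults: 1000, 2000, 4000, 8000, ... ms.
function backoffDelays(retryDelay: number, maxRetries: number): number[] {
  return Array.from({ length: maxRetries }, (_, attempt) => retryDelay * 2 ** attempt);
}

console.log(backoffDelays(1000, 5)); // logs 1000, 2000, 4000, 8000, 16000
```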
When using the `*All` pagination methods (e.g. `searchAll`, `getFollowersAll`), the SDK
reads the `X-RateLimit-Remaining` and `X-RateLimit-Reset` headers from each response.
When remaining requests drop below 20% of your tier's limit, pagination automatically
slows down to spread requests across the remaining window, preventing 429 errors:

```
⚠ ScrapeBadger: Rate limit: 25/300 remaining (resets in 42s), throttling pagination
```
This works transparently with all tier levels (Free: 60/min, Basic: 300/min, Pro: 1000/min, Enterprise: 5000/min).
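The throttling decision reduces to simple arithmetic over those two headers. The 20% threshold and header semantics come from the description above; the even-spacing formula is an illustrative assumption about how the delay could be computed:

```typescript
// Given the parsed X-RateLimit-Remaining / X-RateLimit-Reset values and the
// tier limit, return how long to wait before the next page request (ms).
function throttleDelayMs(remaining: number, limit: number, resetInSec: number): number {
  if (remaining >= limit * 0.2) return 0; // plenty of headroom: full speed
  if (remaining <= 0) return resetInSec * 1000; // exhausted: wait out the window
  // Below 20%: spread the remaining requests evenly across the window.
  return (resetInSec / remaining) * 1000;
}

console.log(throttleDelayMs(100, 300, 42)); // 0    (100 >= 60, no throttling)
console.log(throttleDelayMs(25, 300, 42));  // 1680 (42s spread over 25 requests)
```

With the Basic tier's 300/min limit, the warning shown above (25/300 remaining, 42s to reset) would correspond to roughly one request every 1.7s until the window resets.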
- `ScrapeBadgerError` - Base exception class
- `AuthenticationError` - Invalid or missing API key
- `RateLimitError` - Rate limit exceeded
- `NotFoundError` - Resource not found
- `ValidationError` - Invalid request
- `ServerError` - Server error
- `TimeoutError` - Request timeout
- `InsufficientCreditsError` - Out of credits
- `AccountRestrictedError` - Account restricted
- `WebSocketStreamError` - WebSocket stream failure (auth, limit, or network)
- Node.js 18+ (for native `fetch` support)
- TypeScript 5.0+ (for best type inference)
MIT License - see LICENSE for details.
