Rate Limiter

The Rate Limiter component protects your application from abuse, denial-of-service (DoS) attacks, and brute-force attempts by limiting the frequency of incoming requests from a single IP address.

Built on top of express-rate-limit, it provides a standardized way to apply rate limits globally or to specific sensitive routes.

Installation Guide

This component depends on the following ServerCN components.

👉 Note: You do not need to install these dependencies manually; installing the Rate Limiter component installs them automatically. The commands below are only needed if you prefer to manage dependencies yourself.

  • HTTP Status Codes:
npx servercn add http-status-codes

Documentation: HTTP Status Codes

  • API Error Handler:
npx servercn add error-handler

Documentation: API Error Handler

  • Global Error Handler:
npx servercn add global-error-handler

Documentation: Global Error Handler

To install the Rate Limiter component, run:

npx servercn add rate-limiter

Configuration Options

The rate limiter can be customized to fit your application's needs:

  • windowMs: The timeframe for which requests are checked (in milliseconds).
  • max: The maximum number of requests to allow from each IP during the windowMs window before a 429 error is returned.
  • standardHeaders: Enables the RateLimit-* headers in the response.
  • handler: A custom function to execute when the limit is reached. In ServerCN, we use it to forward the error to our global error handler.

Basic Implementation

MVC: src/middlewares/rate-limiter.ts

Modular: src/shared/middlewares/rate-limiter.ts

import { rateLimit } from "express-rate-limit";
// ApiError and STATUS_CODES are provided by the error-handler and http-status-codes
// components; adjust these import paths to wherever they live in your project.
import { ApiError } from "./error-handler";
import { STATUS_CODES } from "./http-status-codes";
 
/**
 * Standard rate limiter middleware
 * Limits each IP to 100 requests per 15-minute window
 */
export const rateLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // Limit each IP to 100 requests per window
  standardHeaders: true, // Return rate limit info in the `RateLimit-*` headers
  legacyHeaders: false, // Disable the `X-RateLimit-*` headers
  message: {
    success: false,
    message:
      "Too many requests from this IP, please try again after 15 minutes",
    status: 429
  },
  handler: (req, res, next, options) => {
    next(new ApiError(STATUS_CODES.TOO_MANY_REQUESTS, options.message.message));
  }
});
 
/**
 * Stricter rate limiter for sensitive routes (e.g., auth, login)
 */
export const authRateLimiter = rateLimit({
  windowMs: 60 * 60 * 1000, // 1 hour
  max: 5, // Limit each IP to 5 attempts per hour
  handler: (req, res, next, options) => {
    next(
      ApiError.tooManyRequests(
        "Too many login attempts, please try again after an hour"
      )
    );
  }
});
 
/**
 * Rate limiter for login route
 */
export const signinRateLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 5,
  message: {
    success: false,
    message: "Too many login attempts, please try again later.",
    statusCode: 429
  },
  standardHeaders: true,
  legacyHeaders: false
});
 
/**
 * Rate limiter for registration route
 */
export const signupRateLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 5,
  message: {
    success: false,
    message: "Too many registration attempts, please try again later.",
    statusCode: 429
  },
  standardHeaders: true,
  legacyHeaders: false
});

Usage Guide

Apply the rate limiter globally in your main app.ts file to protect all routes by default.

src/app.ts
import express from "express";
import { rateLimiter } from "./middlewares/rate-limiter";
 
const app = express();
 
// Apply the rate limiting middleware to all requests
app.use(rateLimiter);
 
// ... rest of your app configuration

For sensitive routes like login, password reset, or expensive search operations, you should apply stricter limits.

src/routes/auth.routes.ts
import { Router } from "express";
import { authRateLimiter } from "../middlewares/rate-limiter";
 
const router = Router();
 
// Only apply to the login route
router.post("/login", authRateLimiter, (req, res) => {
  // Login logic...
});
 
export default router;
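
The stricter signinRateLimiter and signupRateLimiter exported above are applied the same way. A minimal sketch, assuming /signin and /signup route paths on the same router (adjust the paths to match your own routes):

// src/routes/auth.routes.ts (continued)
import {
  signinRateLimiter,
  signupRateLimiter
} from "../middlewares/rate-limiter";
 
// Throttle repeated sign-in attempts from a single IP
router.post("/signin", signinRateLimiter, (req, res) => {
  // Sign-in logic...
});
 
// Throttle repeated registrations from a single IP
router.post("/signup", signupRateLimiter, (req, res) => {
  // Sign-up logic...
});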

Integrating with Redis (Optional)

For distributed applications (multiple server instances), you should use a centralized store like Redis to sync rate limits across all instances.

npm install rate-limit-redis redis
import { rateLimit } from "express-rate-limit";
import RedisStore from "rate-limit-redis";
import { createClient } from "redis";
 
const redisClient = createClient({
  /* redis config */
});
// connect() returns a promise; ensure the client is connected (and handle errors) before serving traffic
redisClient.connect();
 
export const redisRateLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100,
  store: new RedisStore({
    sendCommand: (...args: string[]) => redisClient.sendCommand(args)
  })
});
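
If you back more than one limiter with Redis (for example the global limiter plus an auth limiter), reuse the same client but give each limiter its own RedisStore instance. A sketch of a second, stricter limiter; the prefix value is only illustrative and is used here to keep its counters separate from the global limiter's:

// One Redis client can back several limiters, but each limiter gets its own store instance.
export const redisAuthRateLimiter = rateLimit({
  windowMs: 60 * 60 * 1000, // 1 hour
  max: 5,
  store: new RedisStore({
    prefix: "rl:auth:", // illustrative key prefix for this limiter's counters
    sendCommand: (...args: string[]) => redisClient.sendCommand(args)
  })
});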

Best Practices

  1. Trust Proxy: If your app is behind a reverse proxy (like Nginx, Heroku, Cloudflare), you must enable trust proxy in Express to get the real IP address.
    app.set("trust proxy", 1);
  2. Informative Errors: Always use the handler option to return a structured error message using ApiError.
  3. Exemptions: If you have internal services or webhooks that should never be throttled, use the skip option (see the combined sketch after this list):
    skip: (req) => req.ip === "127.0.0.1",
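
Putting these together, a hardened setup might look like the sketch below. The ApiError import path, the internal allow-list, and the proxy hop count of 1 are assumptions to adapt to your own project and infrastructure.

// src/app.ts
import express from "express";
import { rateLimit } from "express-rate-limit";
// ApiError comes from the error-handler component; adjust the path to your structure
import { ApiError } from "./middlewares/error-handler";
 
const app = express();
 
// 1. Behind Nginx/Heroku/Cloudflare, trust the first proxy hop so req.ip is the client's real IP
app.set("trust proxy", 1);
 
// Assumed allow-list of internal services/webhooks that should never be throttled
const INTERNAL_IPS = ["127.0.0.1"];
 
const hardenedRateLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100,
  standardHeaders: true,
  legacyHeaders: false,
  // 3. Exempt internal callers from rate limiting
  skip: (req) => INTERNAL_IPS.includes(req.ip ?? ""),
  // 2. Forward a structured ApiError to the global error handler
  handler: (req, res, next) => {
    next(
      ApiError.tooManyRequests(
        "Too many requests from this IP, please try again later"
      )
    );
  }
});
 
app.use(hardenedRateLimiter);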

