
Merge #3: v2.0.0
Endpoint-specific limits, and many more refactors
HeySreelal authored Sep 15, 2024
2 parents d608564 + d656c44 commit 4090fd7
Showing 17 changed files with 727 additions and 321 deletions.
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,8 @@
# 2.0.0

- Added `shelfLimiterByEndpoint` for implementing different rate limits for different endpoints.
- Breaking: Updated `shelfLimiter` definition.

# 1.0.2

- Added more documentation
95 changes: 77 additions & 18 deletions README.md
@@ -12,24 +12,27 @@

</div>

---

`shelf_limiter` is a powerful and highly customizable middleware package for the [shelf](https://pub.dev/packages/shelf) library in Dart that enables efficient rate limiting. Protect your API from abuse and ensure fair usage with ease.
`shelf_limiter` is a powerful and highly customizable middleware package for the [Shelf](https://pub.dev/packages/shelf) library in Dart that enables efficient rate limiting. Protect your API from abuse and ensure fair usage with ease.

## 🌟 Features

- **🔧 Customizable Rate Limits**: Effortlessly set the maximum number of requests and time window to suit your needs.
- **📜 Custom Headers**: Add and manage custom headers in your responses to enhance control and transparency.
- **🚀 Custom Responses**: Craft personalized responses when the rate limit is exceeded, improving user feedback.
- **🔗 Easy Integration**: Integrate seamlessly into your existing Shelf pipeline with minimal setup, so you can focus on building features.
| **Feature** | **Description** |
|---------------------------------|------------------|
| **🔧 Customizable Rate Limits** | Effortlessly set the maximum number of requests and time window to suit your needs. Define global limits or different limits for specific endpoints to control how often clients can access your API. |
| **📜 Custom Headers** | Add and manage custom headers in your responses to enhance control and transparency. |
| **🚀 Custom Responses** | Looking for more control? You’ve got it! Customize and send your own response when the API limit is exceeded. |
| **🔗 Easy Integration** | Integrate seamlessly into your existing Shelf pipeline with minimal setup. Quickly apply rate limiting and focus on building the features that matter most without worrying about complex configurations. |
| **🌐 Endpoint-Specific Limits** | Set different rate limits for different endpoints. Use wildcard patterns (e.g., `/api/v1/*`) to apply rate limits to multiple routes, allowing you to protect high-traffic routes with stricter limits while allowing more leniency on less critical parts of your API. |

## Installation

Add `shelf_limiter` to your `pubspec.yaml` file:

```yaml
dependencies:
shelf_limiter: ^1.0.0
shelf_limiter: <latest>
```
Then run:
@@ -48,7 +51,7 @@ dart pub add shelf_limiter

### 🔧 Basic Usage

Implement rate limiting in your Shelf application quickly and effectively. Here’s a straightforward example:
Implement rate limiting in your Shelf application quickly and effectively. Here’s a straightforward example using the `shelfLimiter` middleware:

```dart
import 'package:shelf/shelf.dart';
@@ -57,8 +60,10 @@ import 'package:shelf_limiter/shelf_limiter.dart';
void main() async {
final limiter = shelfLimiter(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
RateLimiterOptions(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
),
);
final handler =
@@ -71,9 +76,11 @@ void main() async {
Response _echoRequest(Request request) => Response.ok('Request received');
```

In this example, any client is limited to 5 requests per minute. If a client exceeds this limit, they will receive a `429 Too Many Requests` response.

### 🛠️ Enhance Your API with Custom Headers

Add extra details to your responses with custom headers. Here’s how:
Add extra details to your responses with custom headers using `shelfLimiter`:

```dart
import 'dart:convert';
@@ -89,9 +96,11 @@ void main() async {
);
final limiter = shelfLimiter(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
options: options,
RateLimiterOptions(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
headers: options.headers,
),
);
final handler =
@@ -124,7 +133,7 @@ void main() async {
429,
body: jsonEncode({
'status': false,
'message': "Uh, hm! Wait a minute, that's a lot of request.",
'message': "Uh, hm! Wait a minute, that's a lot of requests.",
}),
headers: {
'Content-Type': 'application/json',
@@ -134,9 +143,12 @@ void main() async {
);
final limiter = shelfLimiter(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
options: options,
RateLimiterOptions(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
headers: options.headers,
onRateLimitExceeded: options.onRateLimitExceeded,
),
);
final handler =
@@ -149,6 +161,53 @@ void main() async {
Response _echoRequest(Request request) => Response.ok('Request received');
```

## 📌 Advanced Usage with Endpoint-Specific Limits

When you want to fine-tune your rate limiting strategy and avoid a one-size-fits-all approach, `shelfLimiterByEndpoint` is your best friend. This middleware allows you to set unique rate limits for different endpoints, giving you the power to tailor restrictions based on the needs of each route. Think of it as customizing speed limits for different roads in your neighborhood—some streets are just busier than others!

### Example - Custom Limits for Different Routes:

Here's how you can make your API as efficient as a well-oiled machine with `shelfLimiterByEndpoint`:

```dart
import 'package:shelf/shelf.dart';
import 'package:shelf/shelf_io.dart' as io;
import 'package:shelf_limiter/shelf_limiter.dart';
void main() async {
final limiter = shelfLimiterByEndpoint(
endpointLimits: {
'/auth': RateLimiterOptions(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
),
'/data': RateLimiterOptions(
maxRequests: 20,
windowSize: const Duration(minutes: 1),
),
'/api/v1/*': RateLimiterOptions( // Wildcard path matching
maxRequests: 15,
windowSize: const Duration(minutes: 2),
),
},
defaultOptions: RateLimiterOptions(
maxRequests: 100,
windowSize: const Duration(minutes: 1),
),
);
final handler =
const Pipeline().addMiddleware(limiter).addHandler(_echoRequest);
var server = await io.serve(handler, 'localhost', 8080);
print('Server listening on port ${server.port}');
}
Response _echoRequest(Request request) => Response.ok('Request received');
```

In this advanced example, the `/auth` endpoint is limited to 5 requests per minute, the `/data` endpoint allows up to 20 requests per minute, and routes matching the `/api/v1/*` pattern are limited to 15 requests per 2 minutes. Any endpoint not explicitly listed falls back to the default limit of 100 requests per minute.
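The pattern resolution in the example above can be sketched as "first matching pattern wins, otherwise fall back to the default". The Python below is an illustrative model under assumed semantics (each `*` spans one or more characters, including nested segments); it is not necessarily `shelf_limiter`'s exact matching rule, and the helper names are invented:

```python
import re

def pattern_matches(pattern: str, path: str) -> bool:
    """Match a request path against a limiter pattern like '/api/v1/*'."""
    # Assumption: '*' matches one or more characters, so '/api/v1/*'
    # also covers nested routes such as '/api/v1/users/42'.
    regex = "^" + re.escape(pattern).replace(r"\*", ".+") + "$"
    return re.match(regex, path) is not None

def resolve_limit(endpoint_limits: dict, default, path: str):
    """Return the options of the first matching pattern, else the default."""
    for pattern, options in endpoint_limits.items():
        if pattern_matches(pattern, path):
            return options
    return default

limits = {"/auth": "5/min", "/data": "20/min", "/api/v1/*": "15/2min"}
print(resolve_limit(limits, "100/min", "/api/v1/users"))  # -> 15/2min
print(resolve_limit(limits, "100/min", "/health"))        # -> 100/min
```

Once the options are resolved, each matched group of routes gets its own counter, so a burst against `/auth` does not consume the budget of `/data`.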

## ⚙️ Configuration

### Rate Limiter Options
74 changes: 74 additions & 0 deletions example/endpoint_wildcard_example.dart
@@ -0,0 +1,74 @@
import 'dart:convert';

import 'package:shelf/shelf.dart';
import 'package:shelf/shelf_io.dart' as io;
import 'package:shelf_limiter/shelf_limiter.dart';

void main() async {
// Define rate limiter options for specific endpoints with wildcard support
final endpointLimits = {
'/auth/*': RateLimiterOptions(
maxRequests: 3,
windowSize: const Duration(minutes: 1),
headers: {'X-Custom-Header': 'Auth Rate Limited'},
onRateLimitExceeded: (request) async {
return Response(
429,
body: jsonEncode({
'status': false,
'message': 'Rate limit exceeded for authentication endpoint',
}),
headers: {'Content-Type': 'application/json'},
);
},
),
'/data/*/items': RateLimiterOptions(
maxRequests: 10,
windowSize: const Duration(minutes: 1),
headers: {'X-Custom-Header': 'Data Rate Limited'},
onRateLimitExceeded: (request) async {
return Response(
429,
body: jsonEncode({
'status': false,
'message': 'Rate limit exceeded for data endpoint',
}),
headers: {'Content-Type': 'application/json'},
);
},
),
'/': RateLimiterOptions(
maxRequests: 20,
windowSize: const Duration(minutes: 1),
headers: {'X-Custom-Header': 'Global Rate Limited'},
onRateLimitExceeded: (request) async {
return Response(
429,
body: jsonEncode({
'status': false,
'message': 'Rate limit exceeded for global endpoint',
}),
headers: {'Content-Type': 'application/json'},
);
},
),
};

// Create the rate limiter middleware with endpoint-specific options
final limiter = shelfLimiterByEndpoint(
endpointLimits: endpointLimits,
);

// Add the rate limiter to the pipeline and define a handler for incoming requests
final handler =
const Pipeline().addMiddleware(limiter).addHandler(_echoRequest);

// Start the server on localhost and listen for incoming requests on port 8080
var server = await io.serve(handler, 'localhost', 8080);
print('Server listening on port ${server.port}');
}

// Basic request handler that responds with 'Request received'
Response _echoRequest(Request request) {
return Response.ok('Request received for ${request.url.path}');
}
42 changes: 42 additions & 0 deletions example/shelf_limiter_by_endpoint_example.dart
@@ -0,0 +1,42 @@
import 'dart:convert';

import 'package:shelf/shelf.dart';
import 'package:shelf/shelf_io.dart' as io;
import 'package:shelf_limiter/shelf_limiter.dart';

void main() async {
// Define rate limiter options for specific endpoints
final endpointLimits = {
'/limited': RateLimiterOptions(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
headers: {'X-Custom-Header': 'Rate limited'},
onRateLimitExceeded: (request) async {
return Response(
429,
body: jsonEncode({
'status': false,
'message': 'Rate limit exceeded for /limited',
}),
headers: {'Content-Type': 'application/json'},
);
},
),
};

// Create the rate limiter middleware with endpoint-specific options
final limiter = shelfLimiterByEndpoint(
endpointLimits: endpointLimits,
);

// Add the rate limiter to the pipeline and define a handler for incoming requests
final handler =
const Pipeline().addMiddleware(limiter).addHandler(_echoRequest);

// Start the server on localhost and listen for incoming requests on port 8080
var server = await io.serve(handler, 'localhost', 8080);
print('Server listening on port ${server.port}');
}

// Basic request handler that responds with 'Request received'
Response _echoRequest(Request request) => Response.ok('Request received');
19 changes: 8 additions & 11 deletions example/shelf_limiter_example.dart
@@ -5,34 +5,31 @@ import 'package:shelf/shelf_io.dart' as io;
import 'package:shelf_limiter/shelf_limiter.dart';

void main() async {
// Additional customization (optional)
// Here we define custom headers and a custom response message for when the rate limit is exceeded
// Define custom rate limiter options
final options = RateLimiterOptions(
maxRequests: 5, // Maximum number of requests allowed
windowSize: const Duration(minutes: 1), // Duration of the rate limit window
headers: {
'X-Custom-Header': 'Rate limited', // Custom header to add to responses
},
onRateLimitExceeded: (request) async {
// Custom message to return when the client exceeds the rate limit
// Customize it as much as you want :)
// Custom response when the rate limit is exceeded
return Response(
429,
body: jsonEncode({
'status': false,
'message': "Uh, hm! Wait a minute, that's a lot of request.",
'message': "Uh, hm! Wait a minute, that's a lot of requests.",
}),
headers: {
'Content-Type': 'application/json',
'X-Custom-Response-Header': 'CustomValue', // Additional custom header
},
);
},
);

// Create the rate limiter middleware with a max of 5 requests per 1 minute window
final limiter = shelfLimiter(
maxRequests: 5,
windowSize: const Duration(minutes: 1),
options: options, // Apply custom options
);
// Create the rate limiter middleware with the custom options
final limiter = shelfLimiter(options);

// Add the rate limiter to the pipeline and define a handler for incoming requests
final handler =