Commit 322b476

Use correct language indicators

1 parent 4022ad5 commit 322b476

1 file changed: 3 additions, 9 deletions


docs/hypernode-platform/nginx/how-to-resolve-rate-limited-requests-429-too-many-requests.md

@@ -35,11 +35,10 @@ You can quickly determine which method of Rate Limiting was the cause of the req
 
 To look for rate limiting messages in the error log, you can run the following command:
 
-```bash
+```console
 $ grep limiting.requests /var/log/nginx/error.log
 2020/06/07 13:33:37 [error] 7492#7492: *1590769 limiting requests, excess: 0.072 by zone "bots", client: 203.0.113.104, server: example.hypernode.io, request: "GET /api/ HTTP/2.0", host: "example.hypernode.io"
 2020/06/07 13:33:37 [error] 7492#7492: *1590770 limiting connections by zone "zoneperip", client: 198.51.100.69, server: example.hypernode.io, request: "POST /admin/ HTTP/2.0", host: "example.hypernode.io"
-
 ```
 
 A log entry where rate limit is applied to user-agents and requests per second (based on the `bots` zone):
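As a side note on the grep output in the hunk above: when more than one zone shows up in the error log, counting events per zone shows which limiter is actually firing. A minimal sketch reusing the two sample lines from the output above (on a real Hypernode you would grep `/var/log/nginx/error.log` directly instead of the inline sample):

```shell
# Count rate-limiting events per zone. The here-string reuses the two
# error-log lines shown above; swap it for the real log file in practice.
sample='2020/06/07 13:33:37 [error] 7492#7492: *1590769 limiting requests, excess: 0.072 by zone "bots", client: 203.0.113.104
2020/06/07 13:33:37 [error] 7492#7492: *1590770 limiting connections by zone "zoneperip", client: 198.51.100.69'

# -o prints only the matching part, so each hit becomes its own line,
# which sort | uniq -c can then tally per zone.
printf '%s\n' "$sample" | grep -o 'zone "[^"]*"' | sort | uniq -c
```

This prints one count per zone name, e.g. one hit each for `zone "bots"` and `zone "zoneperip"` with the sample above.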
@@ -89,15 +88,15 @@ The keywords are separated by `|` characters since it is a regular expression.
 
 To extend the allowlist, first determine what user agent you wish to add. Use the access log files to see what bots get blocked and which user agent identification it uses. To find the user agent, you can use the following command:
 
-```bash
+```console
 $ pnl --today --fields time,status,remote_addr,request,user_agent --filter status=429
 2020-06-07T13:33:37+00:00 429 203.0.113.104 GET /api/ HTTP/2.0 SpecialSnowflakeCrawler 3.1.4
 2020-06-07T13:35:37+00:00 429 203.0.113.104 GET /api/ HTTP/2.0 SpecialSnowflakeCrawler 3.1.4
 ```
 
 In the example above you can see that a bot with the User Agent `SpecialSnowflakeCrawler 3.1.4` triggered the ratelimiter. As it contains the word ‘crawler’, it matches the second regular expression and is labeled as a bot. Since the allowlist line overrules the denylist line, the best way to allow this bot is to add their user agent to the allowlist instead of removing ‘crawler’ from the blacklist:
 
-```
+```nginx
 map $http_user_agent $limit_bots {
 default '';
 ~*(specialsnowflakecrawler|google|bing|heartbeat|uptimerobot|shoppimon|facebookexternal|monitis.com|Zend_Http_Client|magereport.com|SendCloud/|Adyen|ForusP|contentkingapp|node-fetch|Hipex) '';
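For context: the hunk above cuts off after the allowlist line, so the denylist line of the map is not visible here. Schematically, the map in `/data/web/nginx/http.ratelimit` pairs an allowlist entry (empty value, never limited) with a denylist entry below it; the denylist pattern and bucket value in this sketch are illustrative placeholders, not the actual contents of the file:

```nginx
map $http_user_agent $limit_bots {
    default '';
    # allowlist: agents matching here get an empty value and are not limited
    ~*(specialsnowflakecrawler|google|bing|heartbeat|uptimerobot) '';
    # denylist (illustrative pattern and value): bot-like agents all map to
    # one shared key, so together they count against a single limit bucket
    ~*(crawler|spider|bot) 'bot';
}
```

Because nginx evaluates `map` entries top to bottom and stops at the first match, an agent listed in the allowlist never reaches the denylist pattern, which is why the text above says the allowlist line overrules the denylist line.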
@@ -135,7 +134,6 @@ geo $conn_limit_map {
 default $remote_addr;
 198.51.100.69 '';
 }
-
 ```
 
 In this example, we have excluded the IP address **198.51.100.69** by setting an empty value in the form of `''`.
@@ -147,7 +145,6 @@ geo $conn_limit_map {
 default $remote_addr;
 198.51.100.0/24 '';
 }
-
 ```
 
 ### Disable per IP Rate Limiting
@@ -160,7 +157,6 @@ For debugging purposes, however, it could be helpful to disable the per-IP conne
 geo $conn_limit_map {
 default '';
 }
-
 ```
 
 **Warning: Only use this setting for debugging purposed! Using this setting on production Hypernodes is highly discouraged, as your shop can be easily taken offline by a single IP using slow and/or flood attacks.**
@@ -178,7 +174,6 @@ if ($request_uri ~ ^\/(.*)\/rest\/V1\/example-call\/(.*) ) {
 if ($request_uri ~ ^\/elasticsearch.php$ ) {
 set $ratelimit_request_url '';
 }
-
 ```
 
 In the example above, the URLs `*/rest/V1/example-call/*` and `/elasticsearch.php` are the ones that have to be excluded. You now have to use the `$ratelimit_request` variable as a default value in the file `/data/web/nginx/http.ratelimit` (see below) to exclude these URLs from the rate limiter and make sure that bots and crawlers will still be rate limited based on their User Agent.
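The paragraph above says to use the variable as a default value inside `/data/web/nginx/http.ratelimit`, but the commit does not show that file, so the following is only a hypothetical sketch of the mechanism (zone name, size, and rate are invented; how the variable is initialised for non-excluded requests is also not shown in this hunk). It relies on documented nginx behaviour: requests whose `limit_req_zone` key evaluates to an empty value are not accounted at all:

```nginx
# Hypothetical sketch, not the actual http.ratelimit shipped by Hypernode.
# The snippet above sets $ratelimit_request_url to '' for excluded URLs;
# an empty key means nginx skips rate limiting for that request entirely.
limit_req_zone $ratelimit_request_url zone=zoneperurl:10m rate=10r/s;
```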
@@ -201,7 +196,6 @@ location = /ratelimited.html {
 root /data/web/public;
 internal;
 }
-
 ```
 
 This snippet will serve a custom static file called `ratelimited.html` to IP addresses that are using too many PHP workers.
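To connect the location block above to the 429 status: in stock nginx, an `internal` location like this is typically wired up via an `error_page` directive, along these lines (the `error_page` line is illustrative and not part of this commit):

```nginx
# Illustrative wiring: serve the static page whenever nginx returns 429.
error_page 429 /ratelimited.html;

location = /ratelimited.html {
    root /data/web/public;   # file lives at /data/web/public/ratelimited.html
    internal;                # not directly requestable by clients
}
```

The `internal` directive ensures clients cannot fetch `/ratelimited.html` directly; it is only reachable through an internal redirect such as the one `error_page` performs.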
