This repository was archived by the owner on Sep 28, 2022. It is now read-only.

Commit afbf32e

Added README with instructions to run the Elasticsearch example
1 parent d04f4a8 commit afbf32e

File tree

3 files changed

+67
-0
lines changed


Dockerfile

Lines changed: 1 addition & 0 deletions

```diff
@@ -1,2 +1,3 @@
 FROM blazemeter/taurus
 ADD examples/http /bzt-configs/
+# ADD examples/elasticsearch /bzt-configs/
```

docs/elasticsearch.png

647 KB

examples/elasticsearch/README.md

Lines changed: 66 additions & 0 deletions
@@ -0,0 +1,66 @@
### Load Testing an Amazon Elasticsearch Cluster

This example assumes that your Elasticsearch cluster sits behind a VPC and is not exposed publicly to the internet. Either setup can be tested, however; a publicly accessible cluster is even easier to reach.

![Arch](../../docs/elasticsearch.png)

To start, follow all the instructions in the main [README](https://github.com/aws-samples/distributed-load-testing-using-aws-fargate/blob/master/README.md), except for **Steps 2 and 4**, where you need to make the following changes instead.

#### Instead of Step 2

Open the [Dockerfile](https://github.com/aws-samples/distributed-load-testing-using-aws-fargate/blob/master/Dockerfile) in the root directory of the project, uncomment the elasticsearch instruction, and comment out the http one. It should look like this:

```Dockerfile
# ADD examples/http /bzt-configs/
ADD examples/elasticsearch /bzt-configs/
```

Then edit the [runner.py](https://github.com/aws-samples/distributed-load-testing-using-aws-fargate/blob/master/bin/runner.py) script and specify your Elasticsearch cluster URL in the environment variable:

```python
ENDPOINT_UNDER_TEST = 'https://vpc-123456789.aws.us-west-2.amazon.com'
```

Finally, write your test scenarios. I have written a `consumer.test.js` and a `producer.test.js` scenario for demonstration purposes, but you can edit them to fit your needs. In this example, the producer creates random JSON documents to be indexed in the cluster, while the consumer issues *search* requests with randomly generated words.
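
The two demo scripts are JavaScript, but the payloads they produce are easy to picture. Here is a hedged Python sketch of the same idea; the function names and field counts are illustrative, not taken from the actual scripts:

```python
import json
import random
import string

def random_word(length=8):
    """A random lowercase word, like the consumer's search terms."""
    return ''.join(random.choice(string.ascii_lowercase) for _ in range(length))

def random_document(num_fields=3):
    """A random JSON document, like the ones the producer indexes."""
    return json.dumps({random_word(5): random_word() for _ in range(num_fields)})
```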
If you look at the `taurus.yml` file, you can see both scenarios specified, which means that Taurus will execute both scripts in parallel.
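
A Taurus config with two parallel scenarios generally has the following shape; this is a hedged sketch, not the repository's exact `taurus.yml`:

```yaml
# Sketch of a two-scenario Taurus config; the actual taurus.yml in the
# example may differ in executor choice and extra settings.
execution:
- scenario: producer
- scenario: consumer

scenarios:
  producer:
    script: producer.test.js
  consumer:
    script: consumer.test.js
```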

#### Instead of Step 4

If your Elasticsearch cluster runs behind a VPC, the easiest option is to run the Fargate Docker tasks in the same VPC. Follow **Step 4** just as described in the main [README](https://github.com/aws-samples/distributed-load-testing-using-aws-fargate/blob/master/README.md), but use the CloudFormation template located in `cloudformation/main-with-existing-vpc.yml`. This template lets you choose the VPC and subnets in which to place the Fargate cluster.

For the same reason, run the CloudFormation template in one region only, and edit the `bin/runner.py` script to point at that region, which should be the same region where Elasticsearch is running:

```python
regions = [
    {
        'name': 'us-east-1',
        'stackName': 'dlt-fargate',
        'taskCount': 3
    },
    # {
    #     'name': 'us-east-2',
    #     'stackName': 'dlt-fargate',
    #     'taskCount': 3
    # },
    # {
    #     'name': 'us-west-2',
    #     'stackName': 'dlt-fargate',
    #     'taskCount': 3
    # }
]
```
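
Because the VPC-bound setup only works from a single region, a small sanity check in the runner can prevent accidentally launching tasks elsewhere. This helper is a hedged sketch of such a check, not part of the repository's `runner.py`:

```python
def check_single_region(regions, expected):
    """Fail fast if the load test targets anything other than the one
    region hosting the Elasticsearch cluster. Illustrative helper only."""
    names = [r['name'] for r in regions]
    if names != [expected]:
        raise ValueError(f"expected a single region {expected!r}, got {names}")
    return True

# Matches the single uncommented entry in the regions list above.
regions = [{'name': 'us-east-1', 'stackName': 'dlt-fargate', 'taskCount': 3}]
check_single_region(regions, 'us-east-1')
```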

Remember, if your Elasticsearch cluster is publicly accessible, meaning its HTTP endpoint is reachable from anywhere in the world, you can ignore the previous step and deploy the Fargate cluster in multiple regions, each in its own VPC.
