<h1 align="center">Flight Controller</h1>
<p align="center">This repository contains the code necessary for Flight Controller</p>
## What is Flight Controller?
The intent is to make it trivial to add new measures, allowing teams to be data-driven.
The approach to scaling a landing zone on AWS is [elaborated here](https://aws.amazon.com/blogs/mt/flight-controller-by-contino-a-solution-built-on-aws-control-tower/).
- `make local` sets up the local environment and installs dependencies.
- `make docs-run` installs, builds, and runs a dev version of the docs.
Optional:

- `make destroy` destroys both the infra & Grafana with TF CDK.
Testing is split into the following commands:
- `make unittest` runs all the unit tests (i.e. tests that are not [marked as integration](https://docs.pytest.org/en/7.1.x/example/markers.html)).
- `make integration-test` runs all the integration tests.
- `make test` runs all the tests and reports on coverage.
- `make e2e` runs the end-to-end BDD tests using [behave](https://github.com/behave/behave).
- `make watch` runs all the unit tests on file change, allowing you to test the code while making live changes.
### Testing from console
After you deploy your infrastructure, you can interact with EventBridge on AWS by sending different messages. Run the following from the `infra_aws_cdk` folder:
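The exact helper is not reproduced in this snippet. As an illustrative sketch only (the source, detail-type, and payload shape below are assumptions, not the project's actual event contract), constructing and sending a test message with `boto3` might look like:

```python
import json

# Illustrative only: "flight-controller.test" and the payload fields are
# assumptions, not the project's actual event contract.
def build_test_entry(aggregate_id: str) -> dict:
    """Build an EventBridge entry for a hypothetical 'project requested' event."""
    return {
        "Source": "flight-controller.test",
        "DetailType": "project_requested",
        "Detail": json.dumps({
            "aggregate_id": aggregate_id,
            "event_type": "project_requested",
            "event_version": 1,
        }),
    }

entry = build_test_entry("proj-123")
# To actually send it (requires AWS credentials for the target account):
#   import boto3
#   boto3.client("events").put_events(Entries=[entry])
```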
### Code Structure
The code is structured in the [Clean Architecture](https://blog.cleancoder.com/uncle-bob/2012/08/13/the-clean-architecture.html) pattern.
3. `Adapters`, which convert events and data into known types
4. `Drivers`, which interact with data storage
5. `Entrypoints`, which handle the event from AWS, retrieve and store data through drivers, and call adapters to perform the needed business logic
The core rule of Clean Architecture is that a layer can only depend on the layers that have come before it. E.g. code in the `usecases` layer may depend on `entities`, but cannot depend on `adapters` or `drivers`.
When developing, it is simplest to start at the first layer and work down ending up with the entrypoint. This forces you to focus on the domain objects first before considering external services.
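As a minimal sketch of the dependency rule (the names below are illustrative and collapsed into one file, not the repository's actual modules), a usecase depends only on an entity, while drivers and adapters would be supplied from the outer layers:

```python
from dataclasses import dataclass
from datetime import datetime

# Layer 1: entities -- pure domain objects with no outward dependencies.
@dataclass
class Event:
    aggregate_id: str
    event_type: str
    time: datetime

# Layer 2: usecases -- may depend on entities, never on adapters or drivers.
def lead_time_seconds(events: list) -> float:
    """Seconds between the first and last event of one process instance."""
    times = sorted(e.time for e in events)
    return (times[-1] - times[0]).total_seconds()
```

Because the usecase never imports a storage or AWS concern, it can be unit-tested with plain in-memory objects.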
### Deploying a change
- Assume the correct AWS account / refresh credentials
- Synth, plan and deploy both infra & Grafana Dashboard:
  - `make build`
  - `make plan`
  - `make deploy`
### Managing Grafana API Keys
At the time of writing, Grafana API keys are valid for a maximum of 30 days only.
Hopefully, Amazon will address this limitation in the future - but in the meantime, this simple pattern can be used to automatically rotate an API key every 29 days and store it for use in AWS Secrets Manager.
The solution is made up of two components:
1. An AWS Secret is created with a rotation lifecycle policy that triggers a Lambda function every 29 days
2. An AWS Lambda function that creates a new API key in Amazon Managed Grafana and updates the AWS Secret with the new key
Python code handling the rotation is stored at `src/entrypoint/grafana_lambda.py`. The code expects two input variables to retrieve secret values from AWS Secrets Manager, viz. `grafana_api_key_name` and `grafana_workspace_id`. As the names suggest, these values are secret names used as filters.
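The core of the rotation flow can be sketched as follows. This is not the repository's actual code: the clients are injected so the logic can be read (and tested) in isolation, and the boto3 call names (`create_workspace_api_key`, `put_secret_value`) are the Managed Grafana and Secrets Manager APIs as currently documented.

```python
# Sketch of the 29-day rotation flow. `grafana` and `secrets` stand in for
# boto3 clients ("grafana" and "secretsmanager"), injected here for clarity.
def rotate_grafana_key(grafana, secrets, workspace_id: str, secret_id: str) -> str:
    # Create a fresh key in Amazon Managed Grafana (keys live at most 30 days).
    resp = grafana.create_workspace_api_key(
        keyName="rotated-key",        # illustrative name
        keyRole="ADMIN",
        secondsToLive=29 * 24 * 3600, # rotate just before the 30-day limit
        workspaceId=workspace_id,
    )
    # Store the new key so consumers read it from Secrets Manager.
    secrets.put_secret_value(SecretId=secret_id, SecretString=resp["key"])
    return resp["key"]
```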
### Merging changes
At the current time there are no branch protections. However, as the build process creates a commit for every build, to keep the git history clean, please `rebase/squash` your commits before pushing. You can do this by running `git fetch origin main && git rebase -i origin/main`, `edit`ing the first commit, and applying `fixup` to all following commits.
# How to create a new metric?
1. First of all, you need to define an SLO. In this example we will work with two SLOs, both related to the creation of a "project" at a client:

   - "time taken between requesting a project and assigning the request to be executed"
   - "time taken between requesting a project and finally creating the project"
2. Next you need to understand the "process" for "creating a project". This includes the actors, systems, and automation that contribute to the creation of a "project" from start to end. From the process diagram we need to identify the events associated with each component of the "process". An "event" in the process should have at minimum the following fields:

   - `aggregate_id`: the id shared by all events that are part of one instantiation of a process. For example, when one requests, assigns, and creates a project, all the events will have the same correlation id.
   - `event_type`: the type of the event; this can be any string representing what happened.
   - `event_version`: the version of the event.
   - `time`: the time when the event happened.
   - (Optional) `payload`: any other detail one needs to add to the event for additional needs.
3. Map the events to entities by creating a python file for entities under code/src/entities. See for example projects.py. Entities come in two forms:

   - Event Entities: corresponding to the events of the domain
   - Metric Entities: corresponding to the SLOs you have defined in step 1
4. The calculation of metrics is handled in the usecases. Insert a new use case under code/src/usecases that calculates the metric you are after. You can see an example of how the project creation metrics are calculated in projects.py.
5. The controller is where you convert an external message to an "entity". It delegates the metric handling to the "usecases" above. You can find the controller in the code/src/adapters folder.
6. Once you have finished the entities, usecases, and adapters, it is time to build the external layers. The first is the driver layer, where events are stored. Drivers are implemented based on the choice of cloud environment; you can see two examples, one using BigQuery and the other DynamoDB + Timestream. These drivers interact with the actual database that stores the entities and metrics. This layer also contains the UI where the metrics are shown.
7. One last piece of code is needed to provide an entrypoint to the system. This is where the platform message comes in and every part of the system is orchestrated.
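The steps above can be sketched end to end (all names here are illustrative, not the repository's actual entities or usecases): an event entity carrying the minimum fields from step 2, and a usecase computing the first SLO.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Event entity (step 3): the minimum fields identified in step 2.
@dataclass
class ProjectEvent:
    aggregate_id: str
    event_type: str
    event_version: int
    time: datetime
    payload: dict = field(default_factory=dict)

# Usecase (step 4): "time taken between requesting a project and
# assigning the request to be executed", for one process instance.
def request_to_assign_seconds(events, aggregate_id: str) -> float:
    mine = {e.event_type: e.time for e in events if e.aggregate_id == aggregate_id}
    return (mine["project_assigned"] - mine["project_requested"]).total_seconds()
```

An adapter (step 5) would parse the raw platform message into `ProjectEvent` objects before handing them to this usecase; a driver (step 6) would load and persist them.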
### Manage Grafana Dashboards
TF CDK is used to build and modify the Grafana dashboard and the panels that go inside the dashboard. The dashboard configuration is managed via `dashboard.json`, located within the `infrastructure` folder. Export the new configuration from Grafana after making your changes, then update `dashboard.json` with the updated configuration.
# Roadmap
- [ ] Update integration and automation tests
- [ ] Create Dockerized lambda code for product creation metric
- [ ] How can I use Timestream with the hosted Grafana?
## Event Catalog Documentation

Making changes to the Event Catalog documentation is pretty straightforward. You will need to add and update the relevant files within the 3 folders listed above. See others for examples.
More info in the official Event Catalog [documentation](https://www.eventcatalog.dev/docs/introduction).
### Running Locally
Run `make docs_run` to serve a local version of the docs. This can be useful to confirm your changes.
### Make Commands
- `make docs_install` (npm install) will install all NPM requirements.
- `make docs_build` (npm install & npm run build) will build the Event Catalog and output to the folder `docs/out`. This is used to confirm your changes are valid.
- `make docs_run` (npm install & npm run build & npm run dev) will run a local dev instance of the site, so you can see your changes.