1 change: 1 addition & 0 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -1,4 +1,5 @@
<!-- markdownlint-disable-next-line first-line-heading -->

## Description

<!-- Describe your changes in detail. -->
17 changes: 12 additions & 5 deletions .github/instructions/tests.instructions.md
@@ -97,7 +97,7 @@ class wraps one backend endpoint group (e.g. `OrderApiResource`, `HIVResultsApiR
### Writing API Tests

```typescript
import { test, expect } from "../../fixtures/IntegrationFixture";
import { expect, test } from "../../fixtures/IntegrationFixture";
import { OrderTestData } from "../../test-data/OrderTestData";
import { headersOrder } from "../../utils/ApiRequestHelper";

@@ -145,6 +145,7 @@ All page interactions go through Page Object classes in `page-objects/`, each ex

```typescript
import { Locator, Page } from "@playwright/test";

import { ConfigFactory } from "../configuration/EnvironmentConfiguration";
import { BasePage } from "./BasePage";

@@ -200,9 +201,10 @@ page.locator(".nhsuk-button");
### Writing Frontend Tests

```typescript
import { test } from "../../fixtures/CombinedTestFixture";
import { expect } from "@playwright/test";

import { test } from "../../fixtures/CombinedTestFixture";

test.describe("Order journey — delivery address", { tag: "@ui" }, () => {
test.beforeEach(async ({ homeTestStartPage }) => {
await homeTestStartPage.navigate();
@@ -237,12 +239,14 @@ directly in the test file and connected via `beforeAll`.
### Writing Integration Tests

```typescript
import { randomUUID } from "crypto";

import { expect } from "@playwright/test";
import { test } from "../../fixtures/CombinedTestFixture";

import { TestOrderDbClient } from "../../db/TestOrderDbClient";
import { test } from "../../fixtures/CombinedTestFixture";
import { OrderBuilder } from "../../test-data/OrderBuilder";
import { headersTestResults } from "../../utils/ApiRequestHelper";
import { randomUUID } from "crypto";

const dbClient = new TestOrderDbClient();

@@ -264,7 +268,10 @@ test.describe("Results flow — order status update", { tag: "@integration" }, (
hivResultsApi,
}) => {
const correlationId = randomUUID();
const response = await hivResultsApi.submitTestResults(testData, headersTestResults(correlationId));
const response = await hivResultsApi.submitTestResults(
testData,
headersTestResults(correlationId),
);
expect(response.status()).toBe(201);
expect(await dbClient.getLatestOrderStatusByOrderUid(orderId)).toEqual("COMPLETE");
});
13 changes: 7 additions & 6 deletions .github/instructions/ui.instructions.md
@@ -51,7 +51,7 @@ error summaries, back links, etc.). Do not create custom implementations of comp
exist in the NHS component library.

```typescript
import { Button, Input, ErrorSummary, BackLink } from "nhsuk-react-components";
import { BackLink, Button, ErrorSummary, Input } from "nhsuk-react-components";
```

For custom layouts or spacing not covered by NHS components, use **Tailwind CSS utility
@@ -62,12 +62,12 @@ classes**. Never use inline `style={{...}}` props.
The application uses React Context for shared state. Providers are composed in layout
components. The existing providers are:

| Provider | File | Purpose |
|---|---|---|
| Provider | File | Purpose |
| --------------------------- | -------- | ----------------------------------- |
| `JourneyNavigationProvider` | `state/` | Multi-step journey navigation state |
| `CreateOrderProvider` | `state/` | Order creation form state |
| `PostcodeLookupProvider` | `state/` | Postcode lookup state |
| `AuthProvider` | `state/` | NHS Login authentication state |
| `CreateOrderProvider` | `state/` | Order creation form state |
| `PostcodeLookupProvider` | `state/` | Postcode lookup state |
| `AuthProvider` | `state/` | NHS Login authentication state |

New providers should follow the same pattern: a context object, a typed interface, and a
`use<Name>` hook that asserts the context is not null.
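As an illustrative sketch of the described pattern (the names and the `OrderState` shape here are assumptions, not taken from the codebase, and the React `createContext`/`useContext` wiring is omitted), the null assertion a `use<Name>` hook performs can be reduced to a small helper:

```typescript
// Hypothetical sketch: a typed interface plus the non-null assertion that a
// use<Name> hook applies to the value it reads from its context object.
interface OrderState {
  orderId: string | null;
}

function assertContext<T>(value: T | null, hookName: string): T {
  if (value === null) {
    // Fail fast when the hook is used outside its provider.
    throw new Error(`${hookName} must be used within its provider`);
  }
  return value;
}

// A use<Name> hook would call this with the value read via useContext:
const state = assertContext<OrderState>({ orderId: "abc-123" }, "useCreateOrder");
```

In a real provider the argument to `assertContext` would come from `useContext`, so a component rendered outside the provider tree gets a descriptive error instead of a silent `null`.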
@@ -131,6 +131,7 @@ When wrapping a service call in a React component, use **TanStack React Query**

```typescript
import { useQuery } from "@tanstack/react-query";

import orderDetailsService from "@/lib/services/order-details-service";

const { data, isLoading, error } = useQuery({
21 changes: 10 additions & 11 deletions .markdownlint.yaml
@@ -4,33 +4,32 @@
default: true

# https://github.com/DavidAnson/markdownlint/blob/main/doc/md010.md
MD010: # no-hard-tabs
ignore_code_languages:
- make
- console
MD010: # no-hard-tabs
ignore_code_languages:
- make
- console

# MD013 - Line length
MD013:
line_length: 1000
heading_line_length: 80
code_block_line_length: 1200
tables: false
line_length: 1000
heading_line_length: 80
code_block_line_length: 1200
tables: false

# MD024 - Multiple headings with the same content
MD024:
siblings_only: true
siblings_only: true

# MD033 - Inline HTML
# https://github.com/DavidAnson/markdownlint/blob/main/doc/md033.md
MD033: false


# MD041 - First line should be a top-level heading
MD041: false

# MD046 - Code block style
MD046:
style: fenced
style: fenced

# MD059 - Link text should be descriptive
# https://github.com/DavidAnson/markdownlint/blob/main/doc/md059.md
24 changes: 12 additions & 12 deletions docs/developer-guides/Scripting_Docker.md
@@ -68,9 +68,9 @@ Here are some key features built into this repository's Docker module:

### Quick start

The Repository Template assumes that you will want to build more than one docker image as part of your project. As such, we do not use a `Dockerfile` at the root of the project. Instead, each docker image that you create should go in its own folder under `infrastructure/images`. So, if your application has a docker image called `my-shiny-app`, you should create the file `infrastructure/images/my-shiny-app/Dockerfile`. Let's do that.

First, we need an application to package. Let's do the simplest possible thing, and create a file called `main.py` in the root of the template with a familiar command in it:

```python
print("hello world")
@@ -92,9 +92,9 @@ COPY ./main.py .
CMD ["python", "main.py"]
```

Note the paths in the `COPY` command. The `Dockerfile` is stored in a subdirectory, but when `docker` runs it is executed in the root of the repository so that's where all paths are relative to. This is because you can't `COPY` from parent directories. `COPY ../../main.py .` wouldn't work.

The name of the folder is also significant. It should match the name of the docker image that you want to create. With that name, you can run the following `make` task to run `hadolint` over your `Dockerfile` to check for common anti-patterns:

```shell
$ DOCKER_IMAGE=my-shiny-app make docker-lint
@@ -105,15 +105,15 @@ make: *** [scripts/docker/docker.mk:20: docker-lint] Error 2

All the provided docker `make` tasks take the `DOCKER_IMAGE` parameter.

`hadolint` found a problem, so let's fix that. It's complaining that we've not specified which version of the `python` docker container we want. Change the first line of the `Dockerfile` to:

```dockerfile
FROM python:3.12-slim-bookworm
```

Run `DOCKER_IMAGE=my-shiny-app make docker-lint` again, and you will see that it is silent.

Now let's actually build the image. Run the following:

```shell
DOCKER_IMAGE=my-shiny-app make docker-build
@@ -136,7 +136,7 @@ docker.io/library/python 3.12-slim-bookworm d9f1825e4d49 5 weeks ago 13
localhost/hadolint/hadolint 2.12.0-alpine 19b38dcec411 16 months ago 8.3 MB
```

Your process might want to add specific tag formats so you can identify docker images by date-stamps, or git hashes. The Repository Template supports that with a `VERSION` file. Create a new file called `infrastructure/images/my-shiny-app/VERSION`, and put the following into it:

```text
${yyyy}${mm}${dd}-${hash}
@@ -148,23 +148,23 @@ Now, run the `docker-build` command again, and towards the end of the output you
Successfully tagged localhost/my-shiny-app:20240314-07ee679
```

Obviously the specific values will be different for you. See the Versioning section below for more on this.

It is usually the case that there is a specific image that you will most often want to build, run, and deploy. You should edit the root-level `Makefile` to document this and to provide shortcuts. Edit `Makefile`, and change the `build` task to look like this:

```make
build: # Build the project artefact @Pipeline
	DOCKER_IMAGE=my-shiny-app make docker-build
```

Now when you run `make build`, it will do the right thing. Keeping this convention consistent across projects means that new starters can be on-boarded quickly, without needing to learn a new set of conventions each time.

### Your image implementation

Always follow [Docker best practices](https://docs.docker.com/develop/develop-images/dockerfile_best-practices/) while developing images.

Here is a step-by-step guide for an image which packages a third-party tool. It is mostly similar to the example above, but demonstrates the `.tool-versions` mechanism.

1. Create `infrastructure/images/cypress/Dockerfile`

@@ -274,7 +274,7 @@ For cross-platform image support, the `--platform linux/amd64` flag is used to b

### `Dockerignore` file

If you need to exclude files from a `COPY` command, put a [`Dockerfile.dockerignore`](https://docs.docker.com/build/building/context/#filename-and-location) file next to the relevant `Dockerfile`. They do not live in the root directory. Any paths within `Dockerfile.dockerignore` must be relative to the repository root.
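As a hypothetical sketch for the `my-shiny-app` example above (the excluded paths are illustrative assumptions, not requirements), `infrastructure/images/my-shiny-app/Dockerfile.dockerignore` might look like:

```text
# Paths are relative to the repository root, not to this file's directory.
.git
docs/
**/__pycache__
```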

## FAQ

26 changes: 13 additions & 13 deletions docs/developer-guides/Scripting_Terraform.md
@@ -62,7 +62,7 @@ Here are some key features built into this repository's Terraform module:

### Quick start

The Repository Template assumes that you will be constructing the bulk of your infrastructure in `infrastructure/modules` as generic deployment configuration, which you will then compose into environment-specific modules, each stored in their own directory under `infrastructure/environments`. Let's create a simple deployable thing, and configure an S3 bucket. We'll make the name of the bucket a variable, so that each environment can have its own.

Open the file `infrastructure/modules/private_s3_bucket/main.tf`, and put this in it:

@@ -82,17 +82,17 @@ resource "aws_s3_bucket" "my_bucket" {
}
```

Note that the variable has been given no value. This is intentional, and allows us to pass the bucket name in as a parameter from the environment.

Now, we're going to define two deployment environments: `dev`, and `test`. Run this:

```bash
mkdir -p infrastructure/environments/{dev,test}
```

It is important that the directory names match your environment names.

Now, let's create the environment definition files. Open `infrastructure/environments/dev/main.tf` and copy in:

```terraform
module "dev_environment" {
@@ -103,11 +103,11 @@ module "dev_environment" {

Some things to note:

- The `source` path is relative to the directory that the `main.tf` file is in. When `terraform` runs, it will `chdir` to that directory first, before doing anything else.
- The `module` name, `"dev_environment"` here, can be anything. Module names are only scoped to the file they're in, so you don't need to follow any particular convention here.
- The `bucket_name` is going to end up as the bucket name in AWS. It wants to be meaningful to you, and you need to pick your own. The framework doesn't constrain your choice, but remember that AWS needs them to be globally unique and if you steal `"nhse-ee-my-fancy-bucket"` then I can't test these docs and then I will be sad.

Let's create our `test` environment now. Open `infrastructure/environments/test/main.tf` and copy in:

```terraform
module "test_environment" {
@@ -116,20 +116,20 @@ }
}
```

We have changed the bucket name here. In this example, I am making no assumptions as to how your AWS accounts are set up. If you intend for your development and test infrastructure to be in the same AWS account (perhaps by necessity, for organisational reasons) and you need to separate them by a naming convention, the framework can support that.

Now we have our modules and our environments configured, we need to initialise each of them. Run these two commands:

```bash
TF_ENV=dev make terraform-init
TF_ENV=test make terraform-init
```

Each invocation will download the `terraform` dependencies we need. The `TF_ENV` name we give to each invocation is the name of the environment, and must match the directory name we chose under `infrastructure/environments` so that `make` gives the right parameters to `terraform`.

We are now ready to try deploying to AWS, from our local environment.

I am going to assume that you have an `~/.aws/credentials` file set up with a separate profile for each environment that you want to use, called `my-test-environment` and `my-dev-environment`. They might have the same credential values in them, in which case `terraform` will create the resources in the same account; or you might have them set up to deploy to different accounts. Either would work.
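As an illustrative sketch (the key IDs and secrets are placeholders, not real values), such a credentials file might look like:

```text
# ~/.aws/credentials: one profile per target environment
[my-dev-environment]
aws_access_key_id = AKIAXXXXXXXXXXXXXDEV
aws_secret_access_key = <dev-secret-key>

[my-test-environment]
aws_access_key_id = AKIAXXXXXXXXXXXXTEST
aws_secret_access_key = <test-secret-key>
```

Pointing both profiles at the same account, or at two different accounts, changes nothing in the commands below; only the `AWS_PROFILE` value selects which credentials `terraform` uses.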

Run the following:

@@ -221,7 +221,7 @@ Apply complete! Resources: 1 added, 0 changed, 0 destroyed.

```

You will notice here that I needed to confirm the action to `terraform` manually. If you don't want to do that, you can pass the `-auto-approve` option to `terraform` like this:

```shell
TF_ENV=dev AWS_PROFILE=my-dev-environment make terraform-apply opts="-auto-approve"
1 change: 0 additions & 1 deletion docs/user-guides/Scan_dependencies.md
@@ -65,7 +65,6 @@ cat vulnerabilities-repository-reportc.json | jq
3. _Is it feasible to consolidate this functionality into a custom GitHub Action?_

Although consolidating this functionality into a custom GitHub Action seems like an optimal approach, this functionality also needs to run as a Git hook. Hence, shell scripting is a more suitable method, as it makes fewer assumptions about local environment configuration and does not rely on third-party runners, providing quicker feedback. Additionally, incorporating this functionality directly into the repository has several advantages, including:

- Improved transparency and visibility of the implementation
- Easier investigation of CVEs found in the repository, eliminating dependence on a third party like GitHub
- Enhanced portability and flexibility, allowing the scans to run in diverse environments