The Housing Finance API will be used to serve data for the interim housing finance solution.
- .NET Core as a web framework.
- xUnit as a test framework.
- Docker
- Docker-Compose (often installed automatically with Docker)
- A recent version of AWS CLI V2
- An AWS CLI profile for an environment the Finance DB is deployed in
- A recent version of the Session Manager Plugin for the AWS CLI
On Windows:
- You will need to use Git Bash or Windows Subsystem for Linux to run the Make commands
- To run with Docker you will need Windows Subsystem for Linux v2 with Docker Engine installed on it (NOT Docker Desktop on base Windows - this does not support the host networking driver)
On macOS (Monterey - possibly others):
- You will need to turn off AirPlay receiver from your sharing settings due to a port 5000 conflict
- See the Serverless configuration under `environment` for the environment variables to set from Parameter Store
- Copy `.env.sample` into the same directory and rename it to `.env`, then set the values
- You can use the Makefile from the port forwarding step below to help with generating the `CONNECTION_STRING` env variable
- Pay particular attention to the `GOOGLE_API_KEY` variable, which must be JSON wrapped in single quotes and on a single line
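For illustration, a `.env` shaped like the sketch below satisfies the single-line, single-quoted `GOOGLE_API_KEY` requirement. Every value here is a placeholder, not a real credential; the real values come from Parameter Store:

```sh
# Hypothetical .env sketch - all values are placeholders
CONNECTION_STRING=Server=127.0.0.1,1433;Database=<db-name>;User Id=<user>;Password=<password>
GOOGLE_API_KEY='{"type": "service_account", "project_id": "example-project"}'
```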
This is currently required in order to connect the API to a functional DB locally
- Open the `finance_database.mk` Makefile with the port forwarding commands
- There is a helper method for generating the connection string to the port-forwarded DB for a given AWS profile. Run it, then copy the output into your `.env` file:

  ```sh
  make -f finance_database.mk local_connection_string_to_env
  ```
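As a rough sketch of what the helper produces (the exact keys and names are assumptions; the Make target generates the real value), the entry is an ADO.NET-style connection string pointing at the forwarded local port:

```sh
# Sketch only: compose an ADO.NET-style connection string for the
# port-forwarded database; every value below is a placeholder.
DB_HOST="127.0.0.1,1433"   # the forwarded local endpoint
DB_NAME="example-db"       # placeholder database name
DB_USER="example-user"     # placeholder credentials
DB_PASS="example-pass"
CONNECTION_STRING="Server=${DB_HOST};Database=${DB_NAME};User Id=${DB_USER};Password=${DB_PASS}"
echo "CONNECTION_STRING=${CONNECTION_STRING}"
```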
- Ensure you have a corresponding AWS CLI profile (default `housing-{stage}`) that matches the `PROFILE` variable, i.e. the same AWS profile the credentials were sourced from
- Edit the file to use the correct stage (development / staging / production)
- If you're using an SSO profile:
  - you can run `make -f finance_database.mk sso_login` to refresh your login and verify your profile
- Start the port forwarding with:

  ```sh
  make -f finance_database.mk port_forwarding_to_hfs_db
  ```
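Under the hood, this Make target presumably wraps an SSM Session Manager port forwarding session (hence the Session Manager Plugin requirement above). A hypothetical sketch of such a command, where the profile, instance id, and DB endpoint are all placeholders and the real values live in the Makefile:

```sh
# Assumption: port forwarding goes through SSM Session Manager.
# All values below are placeholders.
PROFILE="housing-development"
TARGET="i-0123456789abcdef0"   # placeholder jump-box instance id
PARAMS='{"host":["<db-endpoint>"],"portNumber":["1433"],"localPortNumber":["1433"]}'
# echo the command rather than running it, since it needs live AWS access
echo aws ssm start-session \
  --profile "$PROFILE" \
  --target "$TARGET" \
  --document-name AWS-StartPortForwardingSessionToRemoteHost \
  --parameters "$PARAMS"
```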
If you want to connect to the database through a graphical or other local client:
- Connect to localhost or 127.0.0.1 at port 1433
- Enter the username and password printed to the console after the port forwarding
Docker Compose will read the .env file in the root directory (same directory as the .env.sample file), and will connect to the port-forwarded database on localhost:1433
On Windows you'll need Docker Desktop running with WSL2 integration enabled
Replace the AWS CLI profile name and placeholders, then run this command to log in to AWS Elastic Container Registry and allow fetching the base Docker container:

```sh
aws ecr get-login-password --profile {profile} | docker login --username AWS --password-stdin {aws-account-id}.dkr.ecr.{region}.amazonaws.com
```
Run these Make commands from the root directory to trigger the docker compose build and up steps:

```sh
make build && make serve
```
The application will load an .env file in the root directory (same directory as the .env.sample file) and will connect to the port-forwarded database on localhost:1433
Run this command from the root directory to start the application:

```sh
dotnet run --project HousingFinanceInterimApi/HousingFinanceInterimApi.csproj
```

You can also configure your IDE of choice to run the `HousingFinanceInterimApi` project.
$ make test
To run database tests locally without Docker (e.g. via Visual Studio), the `CONNECTION_STRING` environment variable will need to be populated with:

```
Host=localhost;Database=testdb;Username=postgres;Password=mypassword
```
Note: The Host name needs to be the name of the stub database docker-compose service, in order to run tests via Docker.
You will need to have the stub database running in order to run the tests outside of Docker
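For example, when running the tests outside Docker the variable can be exported directly. The docker-compose service name mentioned in the comment below is an assumption; check `docker-compose.yml` for the real one:

```sh
# Outside Docker: the stub database is reached via localhost
export CONNECTION_STRING="Host=localhost;Database=testdb;Username=postgres;Password=mypassword"
echo "$CONNECTION_STRING"
# Via Docker: swap Host for the stub database's docker-compose service name,
# e.g. Host=test-database;...  ("test-database" is a placeholder)
```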
If changes to the database schema are made then the docker image for the database will have to be removed and recreated. The `restart-db` make command will do this for you.
- Use xUnit, FluentAssertions and Moq
- Always follow a TDD approach
- Tests should be independent of each other
- Gateway tests should interact with a real test instance of the database
- Test coverage should never go down
- All use cases should be covered by E2E tests
- Optimise when test run speed starts to hinder development
- Unit tests and E2E tests should run in CI
- Test database schemas should match up with production database schema
- Have integration tests which test from the PostgreSQL database to API Gateway
We use a pull request workflow, where changes are made on a branch and approved by one or more other maintainers before the developer can merge into the master branch.
Then we have an automated six-step deployment process, which runs in CircleCI.
- Automated tests (xUnit) are run to ensure the release is of good quality.
- The application is deployed to development automatically, where we check our latest changes work well.
- We manually confirm a staging deployment in the CircleCI workflow once we're happy with our changes in development.
- The application is deployed to staging.
- We manually confirm a production deployment in the CircleCI workflow once we're happy with our changes in staging.
- The application is deployed to production.
Our staging and production environments are hosted by AWS. We deploy to production for each feature/config change merged into the master branch.
To help with making changes to code easier to understand when being reviewed, we've added a PR template.
When a new PR is created on a repo that uses this API template, the PR template will automatically populate the "Open a pull request" description textbox.
The PR author can edit and change the PR description using the template as a guide.
Using FxCop Analysers
FxCop runs code analysis when the Solution is built.
Both the API and Test projects have been set up to treat all warnings from the code analysis as errors, and will therefore fail the build.
However, we can select which errors to suppress by setting the severity of the responsible rule to `none`, e.g. `dotnet_analyzer_diagnostic.<Category-or-RuleId>.severity = none`, within the `.editorconfig` file.
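For instance (the rule id `CA1062` below is just an illustrative choice, not one the project necessarily suppresses), an `.editorconfig` entry looks like:

```ini
[*.cs]
# Suppress one rule by id (illustrative)
dotnet_diagnostic.CA1062.severity = none
# Or silence a whole analyzer category
dotnet_analyzer_diagnostic.category-Design.severity = none
```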
Documentation on how to do this can be found here.
- Record failure logs
- Automated
- Reliable
- As close to real time as possible
- Observable monitoring in place
- Should not affect any existing databases