Eventsystem - Enables applications and microservices to pass events between them reliably, thereby enabling the construction of decoupled domains and systems.
It provides support for two major consumer implementation types:
- Exactly-once, ordered processing for worker role consumers
- At-least-once delivery through webhooks (a built-in capability) for web role consumers

This enables developers to build event-driven systems (Event Notification, Event-Carried State Transfer, Event Sourcing).
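For illustration, an Event-Carried State Transfer event embeds the changed state in the payload so consumers do not need to call back to the producer. The field names below are hypothetical and are not this project's schema:

```json
{
  "eventType": "CandidateUpdated",
  "eventId": "5f4e2a1c-0d9b-4c3e-8a7f-1b2c3d4e5f60",
  "occurredAt": "2018-03-01T10:15:00Z",
  "payload": {
    "candidateId": "12345",
    "status": "Interviewing"
  }
}
```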
This is built on AWS technology and therefore must be deployed on AWS. However, producers and consumers are not limited to the AWS platform.
The project is under development. Some of the capabilities mentioned here are not yet mature and may remain a work in progress until we release version 1.0.0. It is worth starting a conversation through issues to register interest.
PageUp has a few other alternative implementations; this repository is not necessarily the primary event bus in use within PageUp.
- Quick Start
- Configuring the Subscription Engine (Webhook capability)
- A Worker role sample
- A Web role sample
- Architecture
- How to encrypt sensitive deployment configurations
Start by cloning this repo.
- Install the Serverless Framework:

```
npm install serverless -g
```
- Install .NET Core.
- Open `serverless.yml` and fill in the appropriate values under the `custom:` section.
For example:

```yaml
custom:
  stream: Glofish
  s3BucketName: pageup-integration
  prefix: Integration
  vars: ${file(./serverless-environment-variables.yml)}
```

| Property | Description |
| --- | --- |
| stream | The stream name; used to tag the resources created by this serverless project. |
| s3BucketName | The S3 bucket where the subscriber file will be created and accessed from. |
| prefix | A unique identifier used to prefix the service name of the serverless project. |
| vars | The name of the environment variable file. |
- Deploy:

```
serverless deploy
```
This is needed only if you plan to use the Subscription Engine capability to write consumers that are web roles. To learn more, see the web role sample.
The subscription system supports:
- Webhook consumers
- Authenticated webhook consumers (OAuth client credentials grant flow)
- AWS Lambda functions as consumers (invoked by ARN; requires IAM policies to invoke)
Webhook subscriptions are managed as a JSON file, the location of which is configured through the `S3_BUCKET_NAME` environment variable. A sample configuration file, `subscriptions.json`, is included at the root of the repository.
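As an illustration only (the authoritative schema is the `subscriptions.json` shipped at the root of the repository), a subscriber list covering the three consumer types might look like this; all field names here are hypothetical:

```json
[
  {
    "type": "webhook",
    "businessEvent": "*",
    "endpoint": "https://consumer.example.com/events"
  },
  {
    "type": "authenticatedWebhook",
    "businessEvent": "*",
    "endpoint": "https://secure-consumer.example.com/events"
  },
  {
    "type": "lambda",
    "businessEvent": "*",
    "lambdaArn": "arn:aws:lambda:ap-southeast-2:123456789012:function:my-consumer"
  }
]
```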
The current implementation of the Subscription Engine supports only one auth server; the credentials for it are configured in `serverless-environment-variables.yml`.
In the current setup, there are two potential places where sensitive information may end up.
Encrypt this file using the following command:

```
openssl aes-256-cbc -e -in serverless-environment-variables.yml -out serverless-environment-variables.yml.enc -k {$ENCRYPTION_KEY}
```

Replace `{$ENCRYPTION_KEY}` with a strong password, and add the key-value pair to the Travis build environment variables.
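The CI build must reverse this at deploy time; decryption is the same command with `-d` instead of `-e`. A minimal round trip you can run locally to sanity-check your key (file names here are placeholders, not the project's real files):

```shell
# Create a throwaway file standing in for serverless-environment-variables.yml
echo "EXAMPLE_VAR: example-value" > vars.yml

# Encrypt (-e) with a symmetric key, as in the command above
openssl aes-256-cbc -e -in vars.yml -out vars.yml.enc -k "a-strong-password"

# Decrypt (-d) with the same key; this is what the CI step must do
openssl aes-256-cbc -d -in vars.yml.enc -out vars.decrypted.yml -k "a-strong-password"

# The decrypted file should match the original
diff vars.yml vars.decrypted.yml && echo "round trip ok"
```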
Ensure any deployment credentials are encrypted:

```
travis encrypt AWS_ACCESS_KEY_ID_STAGING="secretvalue"
travis encrypt AWS_SECRET_ACCESS_KEY_STAGING="secretvalue"
travis encrypt AWS_ACCESS_KEY_ID_PRODUCTION="secretvalue"
travis encrypt AWS_SECRET_ACCESS_KEY_PRODUCTION="secretvalue"
```
Run the following command in PowerShell to build the project:

```
build.ps1
```

Use the following command for bash:

```
./build.sh
```
To run the unit tests, bring the Docker containers up. This starts the local authentication server:

```
docker-compose build
docker-compose up -d
```

Now run the tests:

```
dotnet test .\src\BusinessEvents.SubscriptionEngine.Tests\BusinessEvents.SubscriptionEngine.Tests.csproj
```
```
serverless deploy --stage v1 --region ap-southeast-2 --data-center staging -v
```
If you are deploying BusinessEvents-SubscriptionEngine for the first time, you need to comment out the following lambda function definitions in `serverless.yml`:

```
process-dynamodb-stream:
process-kinesis-stream:
```

You may also have to run `serverless deploy ...` twice to successfully deploy the lambda functions.

These functions require the Kinesis and DynamoDB streams to be created before they can attach to them. Once the streams are created, you do not have to perform this step again.
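As a sketch of the first-deploy workflow (the handler and event wiring are omitted here; see the actual definitions in `serverless.yml`), the temporarily disabled functions would look like this, then be uncommented and redeployed once the streams exist:

```yaml
functions:
  # First deploy: keep these commented out, since the Kinesis and
  # DynamoDB streams they attach to do not exist yet.
  # process-dynamodb-stream:
  #   ...
  # process-kinesis-stream:
  #   ...
```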