Contact | Synchronous communication | Asynchronous communication |
---|---|---|
Human-to-human (physical) | Speaking to the other person directly. | Sending a letter. |
Human-to-human (digital) | Making a phone call. | Sending an email, SMS, or document. |
Cooperation | Multiple people sitting in a meeting. | Creating and reviewing Merge/Pull requests. |
Machine-to-machine | Method call, HTTP request. | Messaging via queues. |
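The machine-to-machine row can be sketched in a few lines of Python: a synchronous call blocks the caller until the result is ready, while asynchronous messaging only enqueues the message and lets a consumer process it independently. This is a minimal illustration, not production messaging code; the background thread stands in for a separate consumer service.

```python
import queue
import threading

# Synchronous: the caller blocks until the callee returns a result.
def process(order):
    return f"processed {order}"

result = process("order-1")  # the caller waits here for the answer

# Asynchronous: the caller only enqueues the message and moves on;
# a background thread stands in for a separate service reading
# from a message queue.
work_queue = queue.Queue()
results = []

def consumer():
    while True:
        order = work_queue.get()
        if order is None:  # sentinel: stop consuming
            break
        results.append(process(order))

t = threading.Thread(target=consumer)
t.start()
work_queue.put("order-2")  # returns immediately, no waiting for processing
work_queue.put(None)
t.join()
```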
Pros | Cons |
---|---|
Improves scalability. | More difficult to implement correctly. |
Improves reliability. | Requires external infrastructure (in most cases). |
Simple load balancing. | Latency might suffer. |
Might help reduce architectural complexity via decoupling. | |
Azure Service Bus is a traditional message broker. It supports:
- Queues (point-to-point, competing consumers).
- Topics (publish/subscribe).
- Subscriptions (competing consumers).
- Time-to-live (TTL) configurable per message.
- Filtering.
- Auto-Forwarding.
- Dead-lettering.
- Sessions.
- Transactions.
- Auto-delete.
Azure Docs: https://learn.microsoft.com/en-us/azure/service-bus-messaging/
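Three of the listed features — queues with competing consumers, topics with per-subscription copies, and TTL with dead-lettering — can be simulated in plain Python to make the delivery semantics concrete. This is a conceptual sketch, not the Azure SDK; all names are illustrative.

```python
import time
from collections import deque

# Queue (point-to-point): each message is delivered to exactly ONE of
# the competing consumers.
q = deque(["m1", "m2", "m3", "m4"])
consumer_a, consumer_b = [], []
while q:
    consumer_a.append(q.popleft())
    if q:
        consumer_b.append(q.popleft())

# Topic (publish/subscribe): every subscription receives its OWN copy
# of each published message.
subscriptions = {"billing": deque(), "audit": deque()}

def publish(body, ttl_seconds=60.0):
    expires_at = time.monotonic() + ttl_seconds
    for sub in subscriptions.values():
        sub.append((body, expires_at))

# Dead-lettering: an expired message is moved to a dead-letter queue
# instead of being handed to the consumer.
dead_letter = []

def receive(sub_name):
    while subscriptions[sub_name]:
        body, expires_at = subscriptions[sub_name].popleft()
        if time.monotonic() > expires_at:
            dead_letter.append(body)
            continue
        return body
    return None

publish("event-1")
publish("stale", ttl_seconds=-1.0)  # already expired on arrival
```

Note how the queue splits messages between consumers while the topic duplicates them per subscription — that is the essential difference between point-to-point and publish/subscribe.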
Azure Event Hubs is a log-based event stream. Compared to Service Bus:
- Does not provide built-in support for checkpointing (checkpoints are typically stored in Blobs or Tables).
- Time-to-live is configurable on the level of the Event Hub, not individual events.
- No routing features similar to Service Bus.
- Only batch inserts are transactional.
Azure Docs: https://learn.microsoft.com/en-us/azure/event-hubs/
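The log-based model and externally stored checkpoints can be sketched as follows: a partition is an append-only log that reads never consume, and each consumer group advances its own offset. This is a simplified illustration; the consumer-group names match this lesson's scenario but the code is not the Azure SDK.

```python
# One Event Hub partition modeled as an append-only log: reading never
# removes events, and each consumer group keeps its own position.
log = []

def append(event):
    log.append(event)

# Checkpoints are stored OUTSIDE the hub (in Azure typically in Blobs or
# Tables); a plain dict stands in for that external store here.
checkpoints = {"storage-consumer": 0, "anomaly-detection": 0}

def read_batch(group, max_events=2):
    offset = checkpoints[group]
    batch = log[offset:offset + max_events]
    checkpoints[group] += len(batch)  # checkpoint after processing the batch
    return batch

for e in ["e1", "e2", "e3"]:
    append(e)
```

Because each group tracks its own offset, two independent consumers (the storing consumer and the anomaly detector) can read the same events without interfering with each other.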
Functions are invoked with a batch of messages from a single Event Hub partition. At most one batch from a single partition is processed at a time.
Target-based scaling: https://learn.microsoft.com/en-us/azure/azure-functions/functions-target-based-scaling?tabs=v5%2Ccsharp
Azure Docs: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-hubs
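Target-based scaling can be approximated with a one-line formula: the desired instance count is the unprocessed event count divided by the target executions per instance, and for Event Hubs it is effectively capped by the partition count, since within one consumer group at most one instance reads a given partition. This is a simplified sketch of the documented behavior, not the actual scaler implementation.

```python
import math

def desired_instances(unprocessed_events, target_per_instance, partition_count):
    """Simplified target-based scaling estimate for an Event Hub trigger."""
    wanted = math.ceil(unprocessed_events / target_per_instance)
    # At most one instance per partition within a consumer group,
    # and never scale below one instance.
    return max(1, min(wanted, partition_count))
```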
The client wants to detect problems with the sorting robots and asked us to deploy a real-time anomaly detection algorithm on the data sent from the devices.
The client is also unhappy that the daily-statistics endpoint takes so long to respond. We agreed to implement a caching mechanism so that only the initial query for a given day is slow.
Our task:
- Modify the architecture so that the anomaly detection algorithm can process the data alongside our event consumer (the one which stores the data).
- Implement a caching mechanism for the daily statistics query.
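The caching task maps to a cache-aside pattern: check the cache first and run the expensive query only on a miss, so just the first request for a given day pays the full cost. A minimal sketch follows; the function names and the returned values are illustrative, not the actual solution code.

```python
# Cache-aside for the daily-statistics endpoint.
stats_cache = {}  # stands in for the Stats Cache table
query_log = []    # records how often the slow query actually runs

def compute_daily_statistics(date):
    query_log.append(date)  # in reality: an expensive scan over the Storage table
    return {"date": date, "parcels": 42}  # illustrative result

def get_daily_statistics(date):
    if date not in stats_cache:
        stats_cache[date] = compute_daily_statistics(date)
    return stats_cache[date]
```

One design concern to keep in mind: statistics for the current day may still be changing as events arrive, so a real implementation needs an invalidation or expiry strategy for not-yet-final days.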
- Event Consumer, Azure Function with Event Hub trigger
- Stats Reporter, Azure Function with HTTP trigger
- Backend for Frontend, Azure App Service
- Storage, Azure Tables
- Stats Cache, Azure Tables
- Partition Key: DeviceId
- Ensures in-order processing of events per device
- Ensures that the data from one device will be processed by the same instance of the Anomaly Detection algorithm
- Consumer Groups
- One for the Event Consumer which stores the data
- One for the Anomaly Detection algorithm
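The partition-key guarantee above rests on hash-based partition assignment: events with the same key (DeviceId) always hash to the same partition, which yields per-device ordering and routes one device's data to a single Anomaly Detection instance. Event Hubs uses its own internal hash; the `crc32` below is only an illustration of the idea, and the partition count is an assumed example value.

```python
from zlib import crc32

PARTITION_COUNT = 4  # illustrative; fixed when the Event Hub is created

def partition_for(device_id):
    # Deterministic hash of the partition key -> stable partition choice.
    return crc32(device_id.encode()) % PARTITION_COUNT
```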
Prerequisites: Azure CLI
Clone the repository
git clone https://github.com/datamole-ai/mff-cloud-app-development.git
Navigate to the lesson-3/arm directory and see the ARM template. It contains:
- Function App
- Storage Account with a precreated table
- Event Hub
- App Insights
- App Service
First, create a new resource group (or reuse the one from the previous lesson). In the following command, replace the <resource-group> with your own name (e.g. "mff-iot-{name}-{surname}").
az group create --location 'WestEurope' -n <resource-group>
Then, deploy the infrastructure defined in lesson-3/arm/resources.azrm.json
with the following command. Define the suffix parameter, e.g. your surname. The suffix is used in the names of the resources, so they are unique.
cd lesson-3/arm
az deployment group create `
--name "deploy-mff-task-components" `
--resource-group "<resource-group>" `
--template-file "resources.azrm.json" `
--parameters suffix="<suffix>"
You'll need to store the Event Hub connection string so that you can send events later. It should appear in the deployment output as follows:
"outputs": {
"functionsHostKey": {
"type": "String",
"value": "<functionHostKey>"
},
"functionsHostUrl": {
"type": "String",
"value": "<functionHostUrl>"
},
"senderEventHubConnectionString": {
"type": "String",
"value": "<eventhubSenderConnectionString>"
},
"storageAccountConnectionString": {
"type": "String",
"value": "<storageAccountConnectionString>"
}
}
Deploy the new versions of the functions. You'll find the name of the function app in the output of the deployment command (it is defined as mff-iot-fa-<suffix>).
cd lesson-3/sln/AzureFunctions
func azure functionapp publish "<name-of-the-functionapp>" --show-keys
Go to lesson-3/sln/WebAppBackend
It is possible to deploy the app directly from your IDE:
- Visual Studio, Rider (with Azure plugin) - Right-click on the project -> Publish
Publish the project:
dotnet publish
Create a zip archive from the publish artifacts:
# Powershell
Compress-Archive -Path bin\Release\net8.0\publish\* -DestinationPath deploy.zip
Upload the zip file to Azure via az cli:
az webapp deploy --src-path deploy.zip -n <web-app-name> --resource-group <resource-group-name>
The EventsGenerator project generates and sends events for the past few days.
To generate the events, run the project:
cd lesson-3/sln/EventsGenerator
dotnet run -- "<event-hub-sender-connection-string>"
PowerShell
Invoke-WebRequest -Uri "<webapp-uri>/transports?date=2024-04-05&facilityId=prague&parcelId=123"
cURL
curl "<webapp-uri>/transports?date=2024-04-05&facilityId=prague&parcelId=123"
PowerShell
Invoke-WebRequest -Uri "<webapp-uri>/daily-statistics?date=2024-04-05"
cURL
curl "<webapp-uri>/daily-statistics?date=2024-04-05"
Open the Azure Portal in your browser.
Find the Storage Account resource.
Go to the "Storage browser" in the left panel.
Click on Tables and view the "transports" table.