Azure Data Explorer Microhack (Preview)

The Kusto Product Group and the Microsoft Global Black Belt team are pleased to present this challenge-based, collaboration-driven, discover-by-doing learning experience. The Microhack is divided into three parts to give participants enough time to understand the key concepts of Azure Data Explorer effectively.


Earn a digital badge! To receive the ADX Microhack digital badge, you will need to complete the challenges marked with 🎓 in each Microhack. Please submit the KQL queries/commands using the answer sheets found at the beginning of each Microhack.

Scenario

Contoso is a supply chain logistics company that runs a fleet of ships, trucks, and cargo planes to transport and deliver goods around the world. Some of the world’s largest enterprises rely on Contoso’s logistics capabilities to deliver goods to their end customers. Contoso has invested in connecting its fleet with sensors that measure temperature, pressure, humidity, tilt, shock, and light exposure. These sensors emit telemetry data every minute, property data whenever a device property changes, and command data whenever a new command is executed.

Contoso is looking for a suitable data storage and analytics solution that provides out-of-the-box integration with Azure IoT services such as IoT Hub and Event Hubs, and that can also read data from storage accounts. Contoso is developing a SaaS application that will allow its customers to track, trace, and monitor their shipments. Contoso wants to offer out-of-the-box, interactive visualizations that let its customers drill in and out of the data. Customers will be able to view and analyze the last 6 months of data, every customer’s data will be retained for up to 1 year, and visualizations must load very quickly.

This MicroHack walks through the steps of designing, creating, and configuring Azure Data Explorer clusters with these requirements in mind. Once the cluster is deployed, this MicroHack lists the steps to ingest data into ADX databases and tables using various integration methods, such as One Click ingestion.
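To make the retention requirements concrete, here is a minimal KQL sketch of the database-level policies that would keep 1 year of data while serving roughly the last 6 months from hot cache. The database name ContosoLogistics is an illustrative assumption; the actual policies are configured later in the Microhack.

// Hypothetical database name, used for illustration only.
// Keep each customer's data for up to 1 year (soft delete after 365 days).
.alter-merge database ContosoLogistics policy retention softdelete = 365d

// Serve roughly the last 6 months from hot (SSD) cache for fast-loading visualizations.
.alter database ContosoLogistics policy caching hot = 180d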

Prerequisites

  • An Azure subscription
  • Access to Azure Cloud Shell to deploy the IoT Central application, create simulated devices, and create data exports to Event Hubs and Storage Accounts (the steps to create this infrastructure are given below in this guide).
  • Authorization to create an Azure Data Explorer cluster or Synapse Data Explorer Pool

Overview - The microhack architecture

The following architecture is deployed for you by the script in the deployment instructions below, except for the ADX cluster and its integration with other Azure services. IoT Central acts as the source of telemetry generated by Contoso’s sensors installed on its fleet of trucks, vessels, and airplanes. Telemetry data is streamed continuously to the Event Hub. Device logs, device property changes, and commands executed on the devices are stored in a Storage Account as blobs.

[Architecture diagram: Screen capture 1]

Deployment Instructions

You can deploy this architecture using the steps below:

In Azure Cloud Shell, use Bash to run the following commands to deploy the solution:

  1. Log in to Azure:
az login

Note: You must do this step and log in using the prompted URL, even if you're already logged in to Azure. Otherwise, you will see errors when the script connects to IoT Central.

  2. If you have more than one subscription, select the appropriate one:
az account set --subscription "<your-subscription-ID>"
  3. Get the latest version of the repository:
git clone https://github.com/MSUSSolutionAccelerators/ADX-IoT-Analytics-Solution-Accelerator.git

Optionally, you can update the cloned iotanalyticsLogistics.parameters.json file to personalize your deployment.

  4. Deploy the solution:
cd ADX-IoT-Analytics-Solution-Accelerator
. ./deploy.sh
  5. Choose option 2 from the deployment options provided.

  6. This is the expected result:

Tip
📝 Write down the name of the Resource Group that has been created (shown in green in the deployment output above).

(Optional checks) Confirm deployment success

After the deployment is complete, perform the following checks to confirm that data is flowing into the Event Hub.

  1. Open the newly created Resource Group and check that one Event Hub, one IoT Central application, and one Storage Account have been created.

  2. Open the IoT Central application and, from the overview page, click the IoT Central Application URI.

  3. A new web page with the Azure IoT Central application will open. From the "Devices" menu, check that the test devices have been created and are being simulated.

  4. Similarly, from the "Data export" tab in the left-hand menu, check that both the Export and the Destination are healthy.

  5. Now, from the Azure portal, open the Event Hub that was created and check the "Messages" graph on the overview page. Messages from the simulated devices should be flowing, and the message count should not be zero.

Note: Data from the Event Hub is crucial for successfully setting up ingestion into ADX. Please ask a proctor for help if messages are not flowing into the Event Hub.

What is Azure Data Explorer and when is it a good fit?

Azure Data Explorer is a fully managed, high-performance, big data analytics platform that makes it easy to analyze high volumes of data in near real time. The Azure Data Explorer toolbox gives you an end-to-end solution for data ingestion, query, visualization, and management.

By analyzing structured, semi-structured, and unstructured data across time series, and by using Machine Learning, Azure Data Explorer makes it simple to extract key insights, spot patterns and trends, and create forecasting models. Azure Data Explorer is scalable, secure, robust, and enterprise-ready, and is useful for log analytics, time series analytics, IoT, and general-purpose exploratory analytics.

Azure Data Explorer capabilities are extended by other services built on its powerful query language, including Azure Monitor logs, Application Insights, Time Series Insights, and Microsoft Defender for Endpoint.

How to start with ADX

Generally, when starting with Azure Data Explorer, you will follow these steps (the ADX Microhacks cover all of them):

  1. Create an ADX cluster: To use Azure Data Explorer, you first create a cluster. An Azure Data Explorer cluster is the most basic unit.
  2. Create a database: Each cluster holds one or more databases. An Azure Data Explorer cluster can hold up to 10,000 databases, and each database up to 10,000 tables.
  3. Ingest data: Load data into database tables so that you can run queries against it. Azure Data Explorer supports several ingestion methods (a short KQL sketch of steps 3 and 4 follows this list).
  4. Query data: Azure Data Explorer uses the Kusto Query Language, which is an expressive, intuitive, and highly productive query language. It offers a smooth transition from simple one-liners to complex data processing scripts, and supports querying structured, semi-structured, and unstructured (text search) data. Use the web application to run, review, and share queries and results. You can also send queries programmatically (using an SDK) or to a REST API endpoint.
  5. Visualize results: Use different visual displays of your data in the native Azure Data Explorer Dashboards. You can also display your results using connectors to some of the leading visualization services, such as Power BI and Grafana.
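As a preview of steps 3 and 4, the following is a minimal KQL sketch you could run in the Azure Data Explorer web UI once a cluster and database exist. The table name TelemetryRaw, its columns, and the mapping name TelemetryMapping are illustrative assumptions, not the exact schema used in the Microhack challenges.

// Hypothetical raw telemetry table; the column names are for illustration only.
.create table TelemetryRaw (deviceId: string, enqueuedTime: datetime, temperature: real, humidity: real)

// JSON ingestion mapping so incoming Event Hub payloads land in the right columns.
.create table TelemetryRaw ingestion json mapping 'TelemetryMapping'
    '[{"column":"deviceId","path":"$.deviceId"},{"column":"enqueuedTime","path":"$.enqueuedTime"},{"column":"temperature","path":"$.temperature"},{"column":"humidity","path":"$.humidity"}]'

// A simple query: average temperature per device over the last hour.
TelemetryRaw
| where enqueuedTime > ago(1h)
| summarize avg(temperature) by deviceId

In practice, One Click ingestion can infer a similar table schema and mapping for you from a sample of the source data, which is one of the integration methods used later in this Microhack.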

Ready to go? It's Microhack time!
