- Overview
- Architecture
- Prerequisites
- Installation
- Headless Connector
- Consuming IoT data in USD
- Joining a Live Session
- API Key Authentication
- Using Environment Variables
Note: Before you clone the repo, ensure you have Git LFS installed and enabled. Find out more about Git LFS
Developers can build their own IoT solutions for Omniverse by following the guidelines set out in these samples.
The IoT Samples guide you on how to:
- Connect IoT data sources (CSV, message broker, etc.) to Omniverse
- Incorporate IoT data into the USD model
- Visualize IoT data using an OmniUI extension
- Perform transformations of USD geometry using IoT data
- Incorporate Omniverse OmniGraph/ActionGraph with IoT data
Directory Item | Purpose |
---|---|
.vscode | VS Code configuration details and helper tasks |
content/ | Assets used by the samples |
readme-assets/ | Images and additional repository documentation |
source/ | Source code for the sample applications and extensions |
templates/ | Template Applications and Extensions. |
tools/ | Tooling settings and repository specific (local) tools |
.dockerignore | dockerignore file. |
.editorconfig | EditorConfig file. |
.gitattributes | Git configuration. |
.gitignore | Git configuration. |
Dockerfile | Dockerfile. |
LICENSE | License for the repo. |
README.md | Project information. |
premake5.lua | Build configuration - such as what apps to build. |
repo.bat | Windows repo tool entry point. |
repo.sh | Linux repo tool entry point. |
repo.toml | Top level configuration of repo tools. |
repo_tools.toml | Setup of local, repository specific tools |
tar_ignore.txt | List of files to ignore when building the Docker image |
When opening the `iot-samples` folder in Visual Studio Code, you will be prompted to install a number of extensions that will enhance the Python experience in Visual Studio Code.
The architecture decouples the IoT data model from the presentation in Omniverse, allowing for a data-driven approach and a separation of concerns similar to a Model/View/Controller (MVC) design pattern. The diagram above illustrates the key components of a solution. These are:
- Customer Domain - represents the data sources. Industrial IoT deployments require connecting operational technology (OT) systems, such as SCADA and PLC, to information technology (IT) systems to enable use cases that improve efficiency, productivity, and safety across industries. These deployments provide a data ingestion endpoint to connect OT data to IT and cloud applications. Some of the widely adopted methods for connecting OT data include MQTT and Kafka. The samples in this repository use CSV and MQTT as data sources, but you can develop your IoT project with any other connectivity method.
- Connector - is a stand-alone application that implements a bidirectional bridge between the customer domain and USD-related data. The logic implemented by a connector is use-case dependent and can be simple or complex. The CSV Ingest Application and MQTT Ingest Application transit the data as-is from source to destination, whereas the Geometry Transformation Application manipulates USD geometry directly. Depending on the use case, the connector can run as a headless application locally, on-prem, at the edge, or in the cloud.
  - USD Resolver - is a package dependency with the libraries for USD and Omniverse. Find out more about the Omniverse USD Resolver.
- Nucleus - is Omniverse's distributed file system agent that runs locally, in the cloud, or at the enterprise level. Find out more about the Omniverse Nucleus.
- Consumer - is an application that can manipulate and present the IoT data served by a Connector.
  - USD Resolver - is a package dependency with the libraries for USD and Omniverse.
  - Fabric - is Omniverse's sub-system for scalable, real-time communication and updates of the scene graph among software components, the CPU and GPU, and machines across the network. Find out more about the Omniverse Fabric.
- Controller - implements application or presentation logic by manipulating the flow of data from the Connector.
  - ActionGraph/OmniGraph - is a visual scripting language that provides the ability to implement dynamic logic in response to changes made by the Connector. Find out more about the OmniGraph Action Graph.
  - Omniverse Extension - is a building block within Omniverse for extending application functionality. Extensions can implement any logic required to meet an application's functional requirements. Find out more about the Omniverse Extensions.
- USD Stage - is an organized hierarchy of prims (primitives) with properties. It provides a pipeline for composing and rendering the hierarchy. It is analogous to the Presentation Layer in MVC while additionally adapting to the data and runtime configuration.
Note: Connectors implement a producer/consumer pattern that is not mutually exclusive. Connectors are free to act as producer, consumer, or both. There may also be multiple Connectors and Consumers simultaneously collaborating.
Before running the installation, a number of prerequisites must be met.
Follow the Getting Started with Omniverse guide to install the latest Omniverse Launcher.
If you've already installed Omniverse Launcher, ensure you have updated to the latest version.
- Python 3.10 or greater
- Nucleus 2023.1 or greater
Once you have the prerequisites installed, please run the following to install the needed Omniverse USD resolver, Omni client, and related dependencies.
Begin by cloning the `iot-samples` repository to your local workspace:
git clone https://github.com/NVIDIA-Omniverse/iot-samples
cd iot-samples
Build the application with the following command:
Linux:
./repo.sh build
Windows:
.\repo.bat build
If you experience issues related to build, please see the Usage and Troubleshooting section for additional information.
Start the application using:
Linux:
./repo.sh launch
Windows:
.\repo.bat launch
Select `iot_samples.usd_explorer.kit` with the arrow keys and press Enter.
NOTE: The initial startup may take 5 to 8 minutes as shaders compile for the first time. After initial shader compilation, startup time will reduce dramatically
Headless connectors are stand-alone applications that implement a bidirectional bridge between the customer domain and USD-related data. The logic implemented by a connector is use-case dependent and can be simple or complex.
There are two sample connector applications - the CSV Ingest Application and the MQTT Ingest Application - that transit the data as-is from source to destination, whereas the Geometry Transformation Application manipulates USD geometry directly in the connector. Depending on the use case, a connector can run as a headless application locally, on-prem, at the edge, or in the cloud.
To execute the application, run the following:
python source/ingest_app_csv/run_app.py
-u <user name>
-p <password>
-s <nucleus server> (optional default: localhost)
Or if you are using Environment Variables (see Using Environment Variables)
python source/ingest_app_csv/run_app.py
The username and password are those of the Nucleus instance (running on a local workstation or in the cloud) that you will be connecting to for your IoT projects.
You should see output resembling:
2023-09-19 20:35:26+00:00
2023-09-19 20:35:28+00:00
2023-09-19 20:35:30+00:00
2023-09-19 20:35:32+00:00
2023-09-19 20:35:34+00:00
2023-09-19 20:35:36+00:00
2023-09-19 20:35:38+00:00
2023-09-19 20:35:40+00:00
2023-09-19 20:35:42+00:00
2023-09-19 20:35:44+00:00
The CSV ingest application can be found in the `./source/ingest_app_csv` folder. It performs the following (a minimal sketch of the playback loop follows the list):

- Initialize the stage
  - Open a connection to Nucleus.
  - Copy `./content/ConveyorBelt_A08_PR_NVD_01` to `omniverse://<nucleus server>/users/<user name>/iot-samples/ConveyorBelt_A08_PR_NVD_01` if it does not already exist. Note that you can safely delete the destination folder in Nucleus and it will be recreated the next time the connector is run.
  - Create or join a Live Collaboration Session named `iot_session`.
  - Create a `prim` in the `.live` layer at path `/iot/A08_PR_NVD_01` and populate it with attributes that correspond to the unique field `Id` types in the CSV file `./content/A08_PR_NVD_01_iot_data.csv`.
- Playback in real-time
  - Open and parse `./content/A08_PR_NVD_01_iot_data.csv`, and group the contents by `TimeStamp`.
  - Loop through the data groupings.
    - Update the prim attribute corresponding to the field `Id`.
    - Sleep for the duration of the delta between the previous and current `TimeStamp`.
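Conceptually, the playback loop groups rows by timestamp, writes each row's value to the matching prim attribute, and sleeps for the gap between timestamps. Below is a minimal sketch of that loop; it assumes pandas for CSV parsing, a locally opened stage, and CSV columns named `TimeStamp`, `Id`, and `Value` (these names and the local stage are illustrative assumptions, not the sample's exact implementation, which writes to the `.live` layer on Nucleus).

```python
# Minimal sketch of the CSV playback loop; column names and the locally opened
# stage are assumptions for illustration, not the sample's exact implementation.
import time
import pandas as pd
from pxr import Usd

stage = Usd.Stage.Open("ConveyorBelt_A08_PR_NVD_01.usd")  # stand-in for the Nucleus stage
iot_prim = stage.GetPrimAtPath("/iot/A08_PR_NVD_01")

data = pd.read_csv("content/A08_PR_NVD_01_iot_data.csv")
data["TimeStamp"] = pd.to_datetime(data["TimeStamp"])

previous_ts = None
for ts, group in data.groupby("TimeStamp"):
    for _, row in group.iterrows():
        attr = iot_prim.GetAttribute(row["Id"])  # one attribute per unique field Id
        if attr:
            attr.Set(row["Value"])
    if previous_ts is not None:
        # Replay at the cadence recorded in the CSV file.
        time.sleep((ts - previous_ts).total_seconds())
    previous_ts = ts
```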
In your Omniverse application, open `omniverse://<nucleus server>/users/<user name>/iot-samples/ConveyorBelt_A08_PR_NVD_01/ConveyorBelt_A08_PR_NVD_01.usd` and join the `iot_session` live collaboration session. See Joining a Live Session for detailed instructions.
Once you have joined the `iot_session`, you should see the following:
Selecting the `/iot/A08_PR_NVD_01` prim in the `Stage` panel and toggling the `Raw USD Properties` in the `Property` panel will provide real-time updates from the data being pushed by the Python application.
To execute the application, run the following:
Windows:
python source/ingest_app_mqtt/run_app.py
-u <user name>
-p <password>
-s <nucleus server> (optional default: localhost)
Or if you are using Environment Variables (see Using Environment Variables)
Windows:
python source/ingest_app_mqtt/run_app.py
The username and password are those of the Nucleus instance (running on a local workstation or in the cloud) that you will be connecting to for your IoT projects.
You should see output resembling:
Received `{
"_ts": 176.0,
"System_Current": 0.003981236,
"System_Voltage": 107.4890366,
"Ambient_Temperature": 79.17738342,
"Ambient_Humidity": 45.49172211
"Velocity": 1.0
}` from `iot/A08_PR_NVD_01` topic
2023-09-19 20:38:24+00:00
Received `{
"_ts": 178.0,
"System_Current": 0.003981236,
"System_Voltage": 107.4890366,
"Ambient_Temperature": 79.17738342,
"Ambient_Humidity": 45.49172211
"Velocity": 1.0
}` from `iot/A08_PR_NVD_01` topic
2023-09-19 20:38:26+00:00
The MQTT ingest application can be found in the `./source/ingest_app_mqtt` folder. It performs the following (a sketch of the consume path follows the list):

- Initialize the stage
  - Open a connection to Nucleus.
  - Copy `./content/ConveyorBelt_A08_PR_NVD_01` to `omniverse://<nucleus server>/users/<user name>/iot-samples/ConveyorBelt_A08_PR_NVD_01` if it does not already exist. Note that you can safely delete the destination folder in Nucleus and it will be recreated the next time the connector is run.
  - Create or join a Live Collaboration Session named `iot_session`.
  - Create a `prim` in the `.live` layer at path `/iot/A08_PR_NVD_01` and populate it with attributes that correspond to the unique field `Id` types in the CSV file `./content/A08_PR_NVD_01_iot_data.csv`.
- Playback in real-time
  - Connect to MQTT and subscribe to the MQTT topic `iot/{A08_PR_NVD_01}`.
  - Dispatch data to MQTT
    - Open and parse `./content/A08_PR_NVD_01_iot_data.csv`, and group the contents by `TimeStamp`.
    - Loop through the data groupings.
      - Publish data to the MQTT topic.
      - Sleep for the duration of the delta between the previous and current `TimeStamp`.
  - Consume MQTT data
    - Update the prim attribute corresponding to the field `Id`.
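A minimal sketch of the consume path is shown below. It assumes the paho-mqtt client, a broker reachable on `localhost`, and a hypothetical `update_prim_attribute` helper; the sample's actual broker settings and attribute handling live in `source/ingest_app_mqtt`.

```python
# Minimal sketch of subscribing to the IoT topic and forwarding values to USD.
# The broker address and update_prim_attribute helper are illustrative assumptions.
import json
import paho.mqtt.client as mqtt

TOPIC = "iot/A08_PR_NVD_01"

def update_prim_attribute(name, value):
    """Hypothetical helper: set the matching attribute on /iot/A08_PR_NVD_01,
    as in the CSV ingest sample."""
    ...

def on_message(client, userdata, msg):
    print(f"Received `{msg.payload.decode()}` from `{msg.topic}` topic")
    for name, value in json.loads(msg.payload).items():
        update_prim_attribute(name, value)

client = mqtt.Client()  # paho-mqtt 1.x constructor; 2.x also takes a CallbackAPIVersion
client.on_message = on_message
client.connect("localhost", 1883)
client.subscribe(TOPIC)
client.loop_forever()
```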
In your Omniverse application, open `omniverse://<nucleus server>/users/<user name>/iot-samples/ConveyorBelt_A08_PR_NVD_01/ConveyorBelt_A08_PR_NVD_01.usd` and join the `iot_session` live collaboration session. See Joining a Live Session for detailed instructions.
Once you have joined the `iot_session`, you should see the following:
Selecting the `/iot/A08_PR_NVD_01` prim in the `Stage` panel and toggling the `Raw USD Properties` in the `Property` panel will provide real-time updates from the data being pushed by the Python application.
The following is a simple example of how to deploy a headless connector application into Docker Desktop for Windows. The steps assume the use of:
- WSL (comes standard with the Docker Desktop installation) and
- Ubuntu Linux as the default OS.
The following has to be done in the WSL environment and NOT in the Windows environment. Make sure you are in WSL, else you may encounter build and dependency errors.
- If you have an earlier version of the repo cloned, you may want to delete the old repo in WSL and start with a freshly cloned repo in WSL. Otherwise you could end up with file mismatches and related errors.
- Before you clone the repo, ensure you have Git LFS installed and enabled. Find out more about Git LFS.
- Clone a new repo from within WSL. Once you have the new repo cloned, run the following from within WSL:
Linux:
./repo.sh build
- Share the Nucleus services using a web browser by navigating to http://localhost:3080/. Click on 'Enable Sharing'. This will enable access to Nucleus services from WSL.
- Record the WSL IP address of the host machine for use by the container application.

```
PS C:\> ipconfig

Windows IP Configuration
...
Ethernet adapter vEthernet (WSL):

   Connection-specific DNS Suffix  . :
   Link-local IPv6 Address . . . . . : fe80::8026:14db:524d:796f%63
   IPv4 Address. . . . . . . . . . . : 172.21.208.1
   Subnet Mask . . . . . . . . . . . : 255.255.240.0
   Default Gateway . . . . . . . . . :
...
```
- Open a Bash prompt in WSL, navigate to the source repo (for example, `~/github/iot-samples/`), and launch Visual Studio Code. Make sure you're launching Visual Studio Code from the WSL environment and not editing the Dockerfile from within Windows.
code .
- Modify the Dockerfile `ENTRYPOINT` to add the WSL IP address to connect to the host's Nucleus server. Also, include the username and password for your Omniverse Nucleus instance.

```dockerfile
# For more information, please refer to https://aka.ms/vscode-docker-python
FROM python:3.10-slim

# Keeps Python from generating .pyc files in the container
ENV PYTHONDONTWRITEBYTECODE=1

# Turns off buffering for easier container logging
ENV PYTHONUNBUFFERED=1

WORKDIR /app
COPY . /app

# Creates a non-root user with an explicit UID and adds permission to access the /app folder
# For more info, please refer to https://aka.ms/vscode-docker-python-configure-containers
RUN adduser -u 5678 --disabled-password --gecos "" appuser && chown -R appuser /app
USER appuser

# During debugging, this entry point will be overridden. For more information, please refer to https://aka.ms/vscode-docker-python-debug
ENTRYPOINT [ "python", "source/ingest_app_csv/run_app.py", "--server", "<host IP address>", "--username", "<username>", "--password", "<password>" ]
```
- Create a Docker image named `headlessapp`.
Linux:
tar -czh -X tar_ignore.txt . | docker build -t headlessapp -
- Run a container with the latest version of the `headlessapp` image.
Windows:
docker run -d --add-host host.docker.internal:host-gateway -p 3100:3100 -p 8891:8891 -p 8892:8892 headlessapp:latest
- Watch the application run in Docker Desktop.
Consume the IoT data served by a connector by building your own application logic to visualize, animate, and transform the USD stage. The application logic could use one of the following approaches, or all of them:
- Extension
- Action Graph
- Direct to USD from headless connector
The sample IoT Extension uses Omniverse Extensions, which are the core building blocks of Omniverse Kit-based applications.
The IoT Extension demonstrates:
- Visualizing IoT data
- Animating a USD stage using IoT data
Visualizing IoT data
The IoT Extension leverages the Omniverse UI Framework to visualize the IoT data as a panel. Find out more about the Omniverse UI Framework
- Launch Omniverse
Start the application using:
Linux:
./repo.sh launch
Windows:
.\repo.bat launch
Select `iot_samples.panel_extension.kit` with the arrow keys and press Enter.
NOTE: The initial startup may take 5 to 8 minutes as shaders compile for the first time. After initial shader compilation, startup time will reduce dramatically
- Load the stage
In your Omniverse application, open `omniverse://<nucleus server>/users/<user name>/iot-samples/ConveyorBelt_A08_PR_NVD_01/ConveyorBelt_A08_PR_NVD_01.usd`.
- Join the IoT Live session
- Select the IoT Topic
Click on the play icon on the bottom right of the application's viewport and then start the timeline. The extension will animate according to the `Velocity` value changes in the IoT data.
Then run one of the following to start pushing IoT data:
Windows:
python source\ingest_app_csv\run_app.py
-u <user name>
-p <password>
-s <nucleus server> (optional default: localhost)
or
Windows:
python source\ingest_app_mqtt\run_app.py
-u <user name>
-p <password>
-s <nucleus server> (optional default: localhost)
If you are using Environment Variables (see Using Environment Variables) then run one of the following:
Windows:
python source/ingest_app_csv/run_app.py
or
Windows:
python source/ingest_app_mqtt/run_app.py
The username and password are for the target Nucleus instance (running on a local workstation or in the cloud) that you will be connecting to for your IoT projects.
You will see the following animation with the cube moving:
When the IoT velocity value changes, the extension will animate the rollers (`LiveRoller` class) as well as the cube (`LiveCube` class).
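One way a consumer can react to such attribute changes is through USD change notifications. The snippet below is a rough sketch using `pxr`'s `Tf.Notice`, with an assumed cube path and a simple translate-by-velocity rule; it is not the extension's actual `LiveCube`/`LiveRoller` implementation.

```python
# Rough sketch: move a cube whenever the IoT Velocity attribute changes.
# The cube path and the translate-by-velocity rule are illustrative assumptions.
from pxr import Usd, UsdGeom, Tf, Gf

stage = Usd.Stage.Open("ConveyorBelt_A08_PR_NVD_01.usd")  # stand-in for the live session stage
cube = UsdGeom.Xformable(stage.GetPrimAtPath("/World/cube"))  # hypothetical prim path
translate_op = cube.AddTranslateOp()

def on_objects_changed(notice, sender):
    # Only react to changes under /iot, so our own cube edits don't re-trigger us.
    if not any(str(p).startswith("/iot/") for p in notice.GetChangedInfoOnlyPaths()):
        return
    velocity_attr = stage.GetAttributeAtPath("/iot/A08_PR_NVD_01.Velocity")
    velocity = velocity_attr.Get() if velocity_attr else None
    if velocity is not None:
        current = translate_op.Get() or Gf.Vec3d(0.0)
        translate_op.Set(current + Gf.Vec3d(velocity, 0.0, 0.0))

listener = Tf.Notice.Register(Usd.Notice.ObjectsChanged, on_objects_changed, stage)
```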
The `ConveyorBelt_A08_PR_NVD_01.usd` stage contains a simple `ActionGraph` that reads, formats, and displays an attribute from the IoT prim in the Viewport (see Omniverse Extensions Viewport).
To access the graph:
- Select the `Window/Visual Scripting/Action Graph` menu
- Select `Edit Action Graph`
- Select `/World/ActionGraph`
You should see the following:
The Graph performs the following (an equivalent Python sketch follows the list):
- Reads the `_ts` attribute from the `/iot/A08_PR_NVD_01` prim.
- Converts the numerical value to a string.
- Prepends the string with `TimeStamp:`.
- Displays the result on the Viewport.
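For reference, the same read/format/display flow expressed in plain Python (outside the graph) might look like the sketch below; the `print` call stands in for the Viewport display node, and the attribute path comes from the list above.

```python
# Equivalent of the graph's logic, assuming a stage with the IoT prim loaded;
# print() stands in for the Viewport display node.
from pxr import Usd

stage = Usd.Stage.Open("ConveyorBelt_A08_PR_NVD_01.usd")  # stand-in for the live stage
ts_attr = stage.GetAttributeAtPath("/iot/A08_PR_NVD_01._ts")  # read the _ts attribute
label = "TimeStamp: " + str(ts_attr.Get())                    # convert and prepend
print(label)                                                  # display the result
```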
This sample demonstrates how to execute USD transformations from a headless connector using arbitrary values.
To execute the application, run the following:
Windows:
python source/transform_geometry/run_app.py
-u <user name>
-p <password>
-s <nucleus server> (optional default: localhost)
The username and password are those of the Nucleus instance (running on a local workstation or in the cloud) that you will be connecting to for your IoT projects.
The sample geometry transformation application can be found in the `source\transform_geometry` folder. It performs the following (a rough sketch of the loop follows the list):

- Initialize the stage
  - Open a connection to Nucleus.
  - Open or create the USD stage `omniverse://<nucleus server>/users/<user name>/iot-samples/Dancing_Cubes.usd`.
  - Create or join a Live Collaboration Session named `iot_session`.
  - Create a `prim` in the `.live` layer at path `/World`.
  - Create a `Cube` at path `/World/cube`.
    - Add a `Rotation`.
  - Create a `Mesh` at path `/World/cube/mesh`.
- Playback in real-time
  - Loop for 20 seconds at 30 frames per second.
  - Randomly rotate the `Cube` along the X, Y, and Z axes.
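A rough sketch of that loop using the `pxr` USD API is shown below; it uses an in-memory stage as a stand-in, whereas the actual sample targets the `.live` layer of the Nucleus session.

```python
# Rough sketch of the random-rotation loop on an in-memory stage; the real
# sample writes to the live session layer on Nucleus instead.
import random
import time
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateInMemory()  # stand-in for Dancing_Cubes.usd on Nucleus
cube = UsdGeom.Cube.Define(stage, "/World/cube")
rotate_op = UsdGeom.Xformable(cube.GetPrim()).AddRotateXYZOp()

FPS = 30
for _ in range(20 * FPS):  # loop for 20 seconds at 30 frames per second
    rotate_op.Set(Gf.Vec3f(random.uniform(0, 360),
                           random.uniform(0, 360),
                           random.uniform(0, 360)))
    time.sleep(1.0 / FPS)
```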
If you open `omniverse://<nucleus server>/users/<user name>/iot-samples/Dancing_Cubes.usd` in `Composer` or `Kit`, you should see the following:
Here's how to join a live collaboration session. Click on `Join Session`.
Select `iot_session` from the drop-down to join the already created live session.
To authenticate the connector application using an API Key, start Nucleus Explore from the Omniverse Launcher application, right-click on the server you wish to connect to, and select `API Tokens`.
Provide a token name and click `Create`.
Copy the token value and store it somewhere safe.
If you are using the run_app.py
application launcher you can do the following:
Windows:
python source/ingest_app_csv/run_app.py
-u $omni-api-token
-p <api token>
-s <nucleus server> (optional default: localhost)
Or if you are using Environment Variables (see Using Environment Variables) you can do the following:
Windows:
python source/ingest_app_csv/run_app.py
The samples support Nucleus authentication via environment variables.
Set the user name and password environment variables:
Linux:
export OMNI_HOST=<host name>
export OMNI_USER=<user name>
export OMNI_PASS=<password>
Windows:
$Env:OMNI_HOST = "<host name>"
$Env:OMNI_USER = "<user name>"
$Env:OMNI_PASS = "<password>"
Set the API token environment variables:
Linux:
export OMNI_HOST=<host name>
export OMNI_USER=\$omni-api-token
export OMNI_PASS=<API Token>
Windows:
$Env:OMNI_HOST = "<host name>"
$Env:OMNI_USER = "`$omni-api-token"
$Env:OMNI_PASS = "<API Token>"
Development using the Omniverse Kit SDK is subject to the licensing terms detailed here.
The Omniverse Kit SDK collects anonymous usage data to help improve software performance and to aid in diagnostics. Rest assured, no personal information such as user email, name, or any other field is collected.
To learn more about what data is collected, how we use it, and how you can change the data collection settings, see the details page.
We provide this source code as-is and are currently not accepting outside contributions.