This is an official implementation of the paper "NARUTO: Neural Active Reconstruction from Uncertain Target Observations".
Ziyue Feng*† 1,2,
Huangying Zhan*†‡ 1,
Zheng Chen† 1,3,
Qingan Yan1,
Xiangyu Xu1,
Changjiang Cai1,
Bing Li2,
Qilun Zhu2,
Yi Xu1
1 OPPO US Research Center, 2 Clemson University, 3 Indiana University
* Co-first authors with equal contribution
† Work done as an intern at OPPO US Research Center
‡ Corresponding author
- Repo Structure
- Installation
- Dataset Preparation
- Running NARUTO
- Evaluation
- Run on Customized Scenes
- License
- Acknowledgement
- Citation
# Main directory
├── NARUTO (ROOT)
│   ├── assets                  # README assets
│   ├── configs                 # experiment configs
│   ├── data                    # dataset
│   │   ├── MP3D                # Matterport3D for Habitat data
│   │   ├── mp3d_data           # Matterport3D raw dataset
│   │   ├── Replica             # Replica SLAM dataset
│   │   ├── replica_v1          # Replica Dataset v1
│   ├── envs                    # environment installation
│   ├── results                 # experiment results
│   ├── scripts                 # scripts
│   │   ├── data                # data-related scripts
│   │   ├── evaluation          # evaluation-related scripts
│   │   ├── installation        # installation-related scripts
│   │   ├── naruto              # scripts for running NARUTO
│   ├── src                     # source code
│   │   ├── data                # data code
│   │   ├── evaluation          # evaluation code
│   │   ├── layers              # PyTorch layers
│   │   ├── naruto              # NARUTO framework code
│   │   ├── planning            # planning code
│   │   ├── simulator           # simulator code
│   │   ├── slam                # SLAM code
│   │   ├── utils               # utility code
│   │   ├── visualization       # visualization code
│   ├── third_parties           # third parties
│   │   ├── coslam              # Co-SLAM
│   │   ├── habitat_sim         # habitat-sim
│   │   ├── neural_slam_eval    # evaluation tool
# Data structure
├── data                            # dataset dir
│   ├── MP3D                        # Matterport3D data
│   │   ├── v1
│   │   │   ├── tasks
│   │   │   │   ├── mp3d_habitat
│   │   │   │   │   ├── 1LXtFkjw3qL
│   │   │   │   │   ├── ...
│   ├── replica_v1                  # Replica Dataset v1
│   │   ├── apartment_0
│   │   │   ├── habitat
│   │   │   │   ├── replicaSDK_stage.stage_config.json
│   │   ├── ...
│   ├── Replica                     # Replica SLAM Dataset
│   │   ├── office_0
│   │   ├── ...
# Configuration structure
├── configs                         # configuration dir
│   ├── default.py                  # default overall configuration
│   ├── MP3D                        # Matterport3D
│   │   ├── mp3d_coslam.yaml        # CoSLAM default configuration for MP3D
│   │   ├── {SCENE}
│   │   │   ├── {EXP}.py            # experiment-specific overall configuration, inherits from default.py
│   │   │   ├── coslam.yaml         # scene-specific CoSLAM configuration, inherits from mp3d_coslam.yaml
│   │   │   ├── habitat.py          # scene-specific Habitat-Sim configuration
│   │   ├── ...
│   ├── Replica                     # Replica
│   │   ├── ...
NOTE: default.py/{EXP}.py has the highest priority and can override configurations in the other config files (e.g. mp3d_coslam.yaml/coslam.yaml, habitat.py).
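For illustration, an experiment config {EXP}.py might look like the hypothetical sketch below; the override field names (slam_iteration, enable_vis) are illustrative assumptions, not this repo's actual configuration keys. See configs/default.py for the fields actually used.

# Hypothetical sketch of configs/MP3D/{SCENE}/{EXP}.py; field names are
# illustrative assumptions, not the repo's actual configuration keys.
from configs.default import *  # inherit all default settings

# Values set here take the highest priority and override the corresponding
# entries loaded from mp3d_coslam.yaml / coslam.yaml / habitat.py.
slam_iteration = 5000  # illustrative override
enable_vis = False     # illustrative override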
# Result structure
├── results                         # result dir
│   ├── MP3D                        # Matterport3D results
│   │   ├── GdvgFV5R1Z5
│   │   │   ├── {EXPERIMENT_SETUP}
│   │   │   │   ├── run_{COUNT}
│   │   │   │   │   ├── eval_result.txt
│   │   │   │   │   ├── coslam
│   │   │   │   │   │   ├── checkpoint
│   │   │   │   │   │   ├── mesh
│   │   │   │   │   │   ├── ...
│   ├── Replica                     # Replica results
│   │   ├── office_0
│   │   │   ├── {EXPERIMENT_SETUP}
│   │   │   │   ├── run_{COUNT}
│   │   │   │   │   ├── eval_result.txt
│   │   │   │   │   ├── coslam
│   │   │   │   │   │   ├── checkpoint
│   │   │   │   │   │   ├── mesh
│   │   │   │   │   │   ├── ...
# Clone the repo with the required third parties.
git clone --recursive https://github.com/oppo-us-research/NARUTO.git
# We assume ROOT as the project directory.
cd NARUTO
ROOT=${PWD}
In this repo, we provide two types of environment installation: Docker and Anaconda. Users can install either one. The installation process includes:
- installation of Habitat-Sim, where we install our updated version in which the geometry compilation is updated;
- installation of Co-SLAM, which is used as our mapping backbone;
- installation of other packages required to run NARUTO.
Follow these steps to install the Docker environment:
# Build Docker image
bash scripts/installation/docker_env/build.sh
# Run Docker container
bash scripts/installation/docker_env/run.sh
# Activate conda env in Docker Env
source activate naruto
# Install tinycudann as required in Co-SLAM
# We tried to include this installation in the Docker build, but it failed for an unknown reason.
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
Follow these steps to install the conda environment:
# Build conda environment
bash scripts/installation/conda_env/build.sh
# Activate conda env
source activate naruto
Follow these steps to download the Replica Dataset.
# Download Replica data and save as data/replica_v1.
# This process can take a while.
bash scripts/data/replica_download.sh data/replica_v1
# Once the download is complete, create modified Habitat Simulator configs that adjust the coordinate system direction.
# P.S. we adjust the configs so that the simulator coordinates match the mesh coordinates.
bash scripts/data/replica_update.sh data/replica_v1
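For intuition, the update script's effect is roughly as in the minimal Python sketch below; the exact up/front values written by scripts/data/replica_update.sh are assumptions here, not its verified output.

# Sketch of adjusting a Habitat-Sim stage config's coordinate frame; the
# "up"/"front" values are assumptions, not the script's exact output.
import json

cfg_path = "data/replica_v1/apartment_0/habitat/replicaSDK_stage.stage_config.json"
with open(cfg_path) as f:
    cfg = json.load(f)

cfg["up"] = [0.0, 0.0, 1.0]     # assumed up axis matching the mesh frame
cfg["front"] = [0.0, 1.0, 0.0]  # assumed forward axis

with open(cfg_path, "w") as f:
    json.dump(cfg, f, indent=2)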
[Optional] We also use the Replica (SLAM) data for some tasks, e.g. passive mapping and initializing the starting position.
# Download Replica (SLAM) Data and save as data/Replica
bash scripts/data/replica_slam_download.sh
To download the Matterport3D dataset, please refer to the instructions on the Matterport3D website.
The download script is not included here because there is a Terms of Use agreement for the Matterport3D data.
However, our method does not require the full Matterport3D dataset; users can download only the data related to the habitat task.
# Example use of the Matterport3D download script:
python download_mp.py -o data/MP3D --task_data habitat
# Unzip data
cd data/MP3D/v1/
unzip mp3d_habitat.zip
rm mp3d_habitat.zip
cd ${ROOT}
Here we provide the script to run the full NARUTO system described in the paper. This script also runs the evaluation process described below. We also provide a flowchart to help users better understand the logic flow of the NARUTO planner.
# Run Replica
bash scripts/naruto/run_replica.sh {SceneName/all} {NUM_TRIAL} {EXP_NAME} {ENABLE_VIS}
# Run MP3D
bash scripts/naruto/run_mp3d.sh {SceneName/all} {NUM_TRIAL} {EXP_NAME} {ENABLE_VIS}
# examples
bash scripts/naruto/run_replica.sh office0 1 NARUTO 1
bash scripts/naruto/run_mp3d.sh gZ6f7yhEvPG 1 NARUTO 0
bash scripts/naruto/run_replica.sh all 5 NARUTO 0
We evaluate the reconstruction using the following metrics with a threshold of 5cm:
- Accuracy (cm)
- Completion (cm)
- Completion ratio (%)
We also compute the mean absolute distance, MAD (cm), between the estimated SDF values and the ground truth, evaluated at all vertices of the ground-truth mesh.
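For clarity, the sketch below shows how the first three metrics can be computed from point sets sampled on the predicted and ground-truth meshes; this is a minimal sketch, not this repo's evaluation code (see scripts/evaluation for that).

# Minimal sketch of the reconstruction metrics, assuming pred_pts and gt_pts
# are (N, 3) arrays of points (in meters) sampled from the predicted and
# ground-truth meshes. Not this repo's evaluation code.
from scipy.spatial import cKDTree

def reconstruction_metrics(pred_pts, gt_pts, threshold=0.05):
    acc = cKDTree(gt_pts).query(pred_pts)[0]   # predicted -> GT distances
    comp = cKDTree(pred_pts).query(gt_pts)[0]  # GT -> predicted distances
    return {
        "accuracy_cm": 100.0 * acc.mean(),
        "completion_cm": 100.0 * comp.mean(),
        "completion_ratio_pct": 100.0 * (comp < threshold).mean(),
    }

# MAD additionally requires querying the learned SDF at the GT mesh vertices
# and averaging the absolute differences from the ground-truth values.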
In line with the methodology employed in previous studies [65, 66], we refine the predicted mesh by removing unobserved regions, as well as noisy points that are within the camera frustum but outside the target scene, using a mesh culling technique. Refer to [65] for a detailed explanation of the mesh culling process.
# Evaluate Replica results
bash scripts/evaluation/eval_replica.sh {SceneName/all} {TrialIndex} {IterationToBeEval}
# Evaluate MP3D results
bash scripts/evaluation/eval_mp3d.sh {SceneName/all} {TrialIndex} {IterationToBeEval}
# Examples
bash scripts/evaluation/eval_replica.sh office0 0 2000
bash scripts/evaluation/eval_mp3d.sh gZ6f7yhEvPG 0 5000
# Draw trajectory in the scene mesh
bash scripts/evaluation/visualize_traj.sh {MESH_FILE} {TRAJ_DIR} {CAM_VIEW} {OUT_DIR}
# examples
bash scripts/evaluation/visualize_traj.sh \
results/Replica/office0/NARUTO/run_0/coslam/mesh/mesh_2000_final_cull_occlusion.ply \
results/Replica/office0/NARUTO/run_0/visualization/pose \
src/visualization/default_camera_view.json \
results/Replica/office0/NARUTO/run_0/visualization/traj_vis_view_1
1. Prepare a 3D model (e.g. PLY/GLB files).
2. Use Blender to adjust the coordinate system (we currently use the RDF convention in Blender; when the model is properly adjusted, you should be looking at the front of the model and the model should be standing upright).
3. Delete all other unnecessary objects in the `Scene Collection`.
4. Scale and translate the object to a proper location and size (steps 4-6 can also be scripted; see the sketch after this list).
5. Add a cube (Shift+A) as the bounding box. Note: scale the cube with negative scale values!
6. Export the model as a GLB file.
7. Create configuration files as listed in `configs/NARUTO/hokage_room`.
8. Run NARUTO!
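For reference, steps 4-6 can be scripted from Blender's Python console. The snippet below is a minimal sketch under assumed names and values (the object name "YourModel", the scale/location numbers, and the output path are all illustrative); it is not part of this repo.

import bpy

# Step 4: scale and translate the model (name and values are assumptions)
obj = bpy.data.objects["YourModel"]
obj.scale = (0.5, 0.5, 0.5)
obj.location = (0.0, 0.0, 0.0)

# Step 5: add a cube as the bounding box; note the negative scale values
bpy.ops.mesh.primitive_cube_add()
cube = bpy.context.active_object
cube.scale = (-3.0, -3.0, -3.0)

# Step 6: export the scene as a GLB file
bpy.ops.export_scene.gltf(filepath="/tmp/hokage_room.glb", export_format='GLB')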
NARUTO is licensed under the MIT License. For the third parties, please refer to their respective licenses:
- CoSLAM: Apache-2.0 License
- HabitatSim: MIT License
- neural-slam-eval: Apache-2.0 License
We sincerely thank the owners of the following open-source projects, which our released code uses: CoSLAM, HabitatSim.
@article{feng2024naruto,
title={NARUTO: Neural Active Reconstruction from Uncertain Target Observations},
author={Feng, Ziyue and Zhan, Huangying and Chen, Zheng and Yan, Qingan and Xu, Xiangyu and Cai, Changjiang and Li, Bing and Zhu, Qilun and Xu, Yi},
journal={arXiv preprint arXiv:2402.18771},
year={2024}
}