This repository contains our experiments under the openmask3d setting reported in the paper "Context-Aware Replanning with Pre-Explored Semantic Map for Object Navigation".
Context-Aware Replanning with Pre-Explored Semantic Map for Object Navigation
Po-Chen Ko*, Hung-Ting Su*, Ching-Yuan Chen*, Jia-Fong Yeh, Min Sun, Winston H. Hsu
CoRL 2024
@inproceedings{
su2024contextaware,
title={Context-Aware Replanning with Pre-Explored Semantic Map for Object Navigation},
author={Hung-Ting Su and CY Chen and Po-Chen Ko and Jia-Fong Yeh and Min Sun and Winston H. Hsu},
booktitle={8th Annual Conference on Robot Learning},
year={2024},
url={https://openreview.net/forum?id=Dftu4r5jHe}
}

To set up the openmask3d environment, first follow their Setup instructions.
Then, update openmask3d to our version:
conda activate openmask3d
pip install -r requirements.txt
We preprocess the data to put everything in the correct position for OpenMask3D. First, put the Matterport3D dataset under ./dataset/. The dataset can be downloaded here.
The resulting dataset structure should be
dataset/
|--scans/
|---{scan_ID_1}/
| |---region_segmentations/
| |---undistorted_camera_parameters/
| |---undistorted_color_images/
| |---undistorted_depth_images/
|
|---{scan_ID_2}/
...
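Before preprocessing, it may help to sanity-check that every scan folder contains the four expected subdirectories. A minimal sketch (the helper names and the check itself are ours, not part of the repo):

```python
import os

# Subdirectories each Matterport3D scan is expected to contain
REQUIRED_SUBDIRS = [
    "region_segmentations",
    "undistorted_camera_parameters",
    "undistorted_color_images",
    "undistorted_depth_images",
]

def missing_subdirs(scan_dir):
    """Return the required subdirectories absent from one scan folder."""
    return [d for d in REQUIRED_SUBDIRS
            if not os.path.isdir(os.path.join(scan_dir, d))]

def check_dataset(root="dataset/scans"):
    """Map each scan ID to its list of missing subdirectories (empty = OK)."""
    return {scan: missing_subdirs(os.path.join(root, scan))
            for scan in sorted(os.listdir(root))}
```

Any scan ID mapped to a non-empty list is incomplete and will likely fail during preprocessing.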
Next, run preprocess_mp3d.py
python preprocess_mp3d.py
Before running OpenMask3D, you need to download their pretrained models for 3D mask proposal. For your convenience, we provide download_om3d_models.sh, which downloads the pretrained model checkpoints and puts them in the correct place (under ./resources/).
bash download_om3d_models.sh
To obtain the OpenMask3D semantic map, run extract_mp3d_feature.py:
python extract_mp3d_feature.py
Finally, the OpenMask3D experiments with the introduced replanning strategies can be executed by
python evaluate_mp3d_top_category.py
python evaluate_mp3d_top_confidence.py
The results will be saved as "topk_confidence_replanning_raw.json" and "topk_category_replanning_raw.json", respectively.
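The raw JSON files can then be inspected directly. A minimal sketch (the result schema is not documented here, so we only assume the files contain valid JSON; the helper name is ours):

```python
import json

def load_results(path):
    """Load a raw replanning result file and report its top-level shape."""
    with open(path) as f:
        results = json.load(f)
    # Print the container type and number of top-level entries
    print(type(results).__name__, len(results))
    return results
```

For example, load_results("topk_confidence_replanning_raw.json") returns the parsed results for further analysis.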
This project is developed from the following repositories: