# Animodel-Points

## Benchmarking protocol

  1. Accepts one (20000, 3) point cloud per target instance.
  2. Fits each instance with 200 steps of Iterative Closest Point (ICP) alignment, using 10k points subsampled from the input.
  3. Computes bidirectional L2 Chamfer distances pairwise on the final alignments and takes their root mean square (see the sketch after this list).
  4. Reports summary statistics for the results.

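The repository's exact implementation may differ, but as a rough sketch, the subsampling and metric in steps 2–3 could look like the following (the helpers `subsample_points`, `bidirectional_chamfer`, and `chamfer_rms` are illustrative names, not part of the codebase):

```python
import numpy as np
from scipy.spatial import cKDTree

def subsample_points(points: np.ndarray, k: int = 10_000, seed: int = 0) -> np.ndarray:
    """Randomly subsample an (N, 3) point cloud to k points (step 2)."""
    idx = np.random.default_rng(seed).choice(len(points), size=k, replace=False)
    return points[idx]

def bidirectional_chamfer(pred: np.ndarray, gt: np.ndarray) -> float:
    """Bidirectional L2 Chamfer distance between two (N, 3) point clouds."""
    d_pred, _ = cKDTree(gt).query(pred)   # nearest-neighbour distance, pred -> gt
    d_gt, _ = cKDTree(pred).query(gt)     # nearest-neighbour distance, gt -> pred
    return float(d_pred.mean() + d_gt.mean())

def chamfer_rms(distances: np.ndarray) -> float:
    """Root mean square over the per-instance Chamfer distances (step 3)."""
    return float(np.sqrt(np.mean(np.square(distances))))
```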
The ground truth is camera-aligned: the rotation from the model-view matrix has been applied, along with the xz component of the translation. The benchmark contains fixed draws of 20k-point point clouds from the camera-aligned versions of the Animodel dataset.
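As a rough illustration of that transform, a camera-alignment step could look like the sketch below (assuming a 4x4 model-view matrix whose upper-left 3x3 block is a pure rotation and a y-up coordinate convention; `camera_align` is a hypothetical helper, not part of the repository):

```python
import numpy as np

def camera_align(points: np.ndarray, model_view: np.ndarray) -> np.ndarray:
    """Apply the model-view rotation and the xz translation to an (N, 3) cloud.

    Assumes `model_view` is a 4x4 matrix whose upper-left 3x3 block is a pure
    rotation; the y component of the translation is dropped, keeping only xz.
    """
    R = model_view[:3, :3]
    t = model_view[:3, 3].copy()
    t[1] = 0.0                      # keep only the x and z translation components
    return points @ R.T + t
```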

Users must provide camera-aligned point clouds with exactly 20k points for every instance in the benchmark. Note that ICP can be unstable, so a better initial alignment will give better results.
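If a method produces a different number of points, one simple way to meet the 20k-point requirement is to resample (the `resample_to_20k` helper below is illustrative, not part of the repository):

```python
import numpy as np

def resample_to_20k(points: np.ndarray, seed: int = 0) -> np.ndarray:
    """Resample an (N, 3) point cloud to exactly 20,000 points."""
    rng = np.random.default_rng(seed)
    replace = len(points) < 20_000   # sample with replacement only if too few points
    idx = rng.choice(len(points), size=20_000, replace=replace)
    return points[idx]
```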

## Usage

  1. Create a config; see config/animodel_points as a template.
  2. Call scripts/run_animodel_points.py to evaluate a given animal, using the interface below.

```
python scripts/run_animodel_points.py --animal <horse/cow/sheep> --method <method_name> (--no_rotation) --config <path/to/config, e.g. config/animodel_points> (<any hydra overrides>)
```
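For example, to evaluate a hypothetical method named `my_method` on the horse category with the template config:

```bash
python scripts/run_animodel_points.py --animal horse --method my_method --config config/animodel_points
```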