Nerfies demonstrates deformation-aware neural radiance fields that reconstruct and render dynamic, real-world scenes from casual video. Instead of assuming a static world, the method learns a canonical space plus a deformation field that maps changing poses or expressions back to that space during training. This lets the system generate photorealistic novel views of nonrigid subjects—faces, bodies, cloth—while preserving fine detail and consistent lighting. The training pipeline handles imperfect captures by modeling camera poses, exposure variations, and background segmentation, producing stable geometry and appearance. A set of utilities manages dataset preparation, pose estimation, and checkpoints so researchers can reproduce results on their own footage. The work sits at the intersection of graphics and vision, showing how learned volumetric rendering can handle human motion without dense markers or studio rigs.
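The core idea above — warp an observation-space point into a shared canonical space, then query a single canonical radiance field there — can be sketched in a few lines. This is a toy illustration, not the project's actual code: `deformation_field`, `canonical_nerf`, and the `warp_code` conditioning latent are hypothetical stand-ins for the learned MLPs described in the paragraph.

```python
import numpy as np

def deformation_field(x, warp_code):
    # Stand-in for the learned warp MLP: maps an observation-space
    # point x to canonical space, conditioned on a per-frame latent.
    offset = 0.1 * warp_code * np.sin(x)  # toy nonrigid offset
    return x + offset

def canonical_nerf(x_canonical):
    # Stand-in for the canonical NeRF MLP: returns (density, rgb)
    # for a point in the shared canonical space.
    density = np.exp(-np.sum(x_canonical ** 2))
    rgb = 0.5 * (np.tanh(x_canonical) + 1.0)
    return density, rgb

def query(x, warp_code):
    # Observation-space query = warp to canonical, then evaluate
    # the single canonical field, so all frames share one geometry.
    return canonical_nerf(deformation_field(x, warp_code))

x = np.array([0.2, -0.1, 0.5])
density, rgb = query(x, warp_code=1.0)
```

Because every frame routes through the same canonical field, changing only `warp_code` moves the sample point, which is what lets the method explain nonrigid motion without rebuilding geometry per frame.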

Features

  • Canonical-space NeRF with learned nonrigid deformation field
  • High-quality novel-view synthesis of moving, deformable subjects
  • Robust training with pose, exposure, and segmentation handling
  • Utilities for dataset prep, experiments, and checkpoints
  • Support for reenactment and expression/pose interpolation
  • Reproducible baselines for dynamic NeRF research

License

Apache License 2.0


Additional Project Details

Programming Language

Python

Related Categories

Python Neural Network Libraries

Registered

2025-10-10