
One is All: Bridging the Gap Between Neural Radiance Fields Architectures with Progressive Volume Distillation (AAAI 2023 Oral)

🥳 New: 🥳 Code for the more powerful PVD-AL is now available here

Introduction

In this paper, we propose Progressive Volume Distillation (PVD), a systematic distillation method that enables any-to-any conversion between different neural-field architectures, including MLPs (NeRF), sparse tensors (Plenoxels), low-rank tensors (TensoRF), and hash tables (INGP).

Installation

We recommend using Anaconda to set up the environment. Run the following commands:

Step 1: Create a conda environment named 'pvd'

conda create --name pvd python=3.7
conda activate pvd
pip install -r ./tools/requirements.txt

Step 2: Install the extension modules (drawn from the great torch-ngp project that we mainly build on).

bash ./tools/install_extension.sh

Datasets & Pretrained teacher models

You can download the Synthetic-NeRF/LLFF/Tanks&Temples datasets from google or from baidu.

And download the pretrained teacher models from google or from baidu.

You can also train a teacher model by following the guidance below.

Train a teacher

# train a hash-based (INGP) teacher
python main_just_train_tea.py ./data/nerf_synthetic/chair --model_type hash --data_type synthetic --workspace ./log/train_teacher/hash_chair

# train a low-rank-tensor-based (TensoRF, VM decomposition) teacher
python main_just_train_tea.py ./data/nerf_synthetic/chair --model_type vm --data_type synthetic --workspace ./log/train_teacher/vm_chair

# train an MLP-based (NeRF) teacher
python main_just_train_tea.py ./data/nerf_synthetic/chair --model_type mlp --data_type synthetic --workspace ./log/train_teacher/mlp_chair

# train a sparse-tensor-based (Plenoxels) teacher
python main_just_train_tea.py ./data/nerf_synthetic/chair --model_type tensors --data_type synthetic --workspace ./log/train_teacher/tensors_chair
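The four commands above differ only in `--model_type` and the workspace path, so they can be scripted in one loop. A minimal sketch, assuming a POSIX shell and the flags shown above; the loop only prints each command as a dry run (remove the leading `echo` to actually train):

```shell
# Dry run: print one training command per teacher architecture.
for mt in hash vm mlp tensors; do
  echo python main_just_train_tea.py ./data/nerf_synthetic/chair \
      --model_type "$mt" --data_type synthetic \
      --workspace "./log/train_teacher/${mt}_chair"
done
```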

Distill a student

# teacher: hash (INGP),  student: vm (TensoRF)
python3 main_distill_mutual.py  ./data/nerf_synthetic/chair \
                    --data_type synthetic \
                    --teacher_type hash \
                    --ckpt_teacher ./log/train_teacher/hash_chair/checkpoints/XXX.pth \
                    --model_type vm \
                    --workspace ./log/distill_student/hash2vm/chair
                    
# teacher: MLP (NeRF),  student: tensors (Plenoxels)
python3 main_distill_mutual.py  ./data/nerf_synthetic/chair \
                    --data_type synthetic \
                    --teacher_type mlp \
                    --ckpt_teacher ./log/train_teacher/mlp_chair/checkpoints/XXX.pth \
                    --model_type tensors \
                    --workspace ./log/distill_student/mlp2tensors/chair
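Since PVD supports any-to-any conversion, any teacher architecture can in principle be paired with any student architecture. A dry-run sketch that prints a distillation command for every cross-architecture pair, assuming the flags and workspace layout shown above (`XXX.pth` is a placeholder for a real checkpoint, as in the examples):

```shell
# Dry run: print a distillation command for every teacher->student pair.
# Remove the leading "echo" to actually distill.
for tea in hash vm mlp tensors; do
  for stu in hash vm mlp tensors; do
    [ "$tea" = "$stu" ] && continue  # skip same-architecture pairs
    echo python3 main_distill_mutual.py ./data/nerf_synthetic/chair \
        --data_type synthetic \
        --teacher_type "$tea" \
        --ckpt_teacher "./log/train_teacher/${tea}_chair/checkpoints/XXX.pth" \
        --model_type "$stu" \
        --workspace "./log/distill_student/${tea}2${stu}/chair"
  done
done
```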
                   

Evaluation

# evaluate a hash teacher
python main_distill_mutual.py ./data/nerf_synthetic/chair  --teacher_type hash --ckpt_teacher PATH/TO/CKPT.pth --test_teacher --data_type synthetic --workspace ./log/eval_teacher/hash_chair

# evaluate an MLP student
python main_distill_mutual.py ./data/nerf_synthetic/chair --model_type mlp --ckpt PATH/TO/CKPT.pth --test --data_type synthetic --workspace ./log/eval_student/mlp_chair
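For example, the vm student distilled from the hash teacher earlier can be evaluated by pointing `--ckpt` at its workspace. A dry-run sketch assuming the flags shown above and that student checkpoints land in a `checkpoints/` subdirectory of the workspace, as they do for teachers (`XXX.pth` is a placeholder):

```shell
# Dry run: print the evaluation command for the hash->vm student above.
echo python main_distill_mutual.py ./data/nerf_synthetic/chair \
    --model_type vm \
    --ckpt ./log/distill_student/hash2vm/chair/checkpoints/XXX.pth \
    --test --data_type synthetic \
    --workspace ./log/eval_student/vm_chair
```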

More detailed parameter descriptions and running commands

Please refer to the more detailed running description for training different types of datasets, parameter adjustment, key settings, etc.

Citation

If you find our code or paper useful, please consider citing:

@article{fang2022one,
  title={One is All: Bridging the Gap Between Neural Radiance Fields Architectures with Progressive Volume Distillation},
  author={Fang, Shuangkang and Xu, Weixin and Wang, Heng and Yang, Yi and Wang, Yufeng and Zhou, Shuchang},
  journal={arXiv preprint arXiv:2211.15977},
  year={2022}
}

Acknowledgement

We would like to thank ingp, torch-ngp, TensoRF, Plenoxels, and nerf-pytorch for their great frameworks!

Also check out Arch-Net for more on general progressive distillation.
