diegoroyo / mitsuba2-transient-nlos

This project is a fork of mitsuba-renderer/mitsuba2


Code for "Non-line-of-sight transient rendering" - Transient path tracing and non-line-of-sight capture simulation for Mitsuba 2

License: Other

Languages: CMake 1.56%, C++ 82.87%, C 0.76%, Shell 0.07%, Makefile 0.01%, Cuda 0.83%, Batchfile 0.01%, Python 13.88%, PowerShell 0.01%

Topics: mitsuba2, non-line-of-sight, transient-path-tracing

mitsuba2-transient-nlos's Introduction

Authors: Diego Royo, Jorge Garcia, Adolfo Muñoz, Adrián Jarabo and the original Mitsuba 2 authors.

Mitsuba 2, extended for transient path tracing and non-line-of-sight data capture.



⚠️⚠️ Please use the new and updated Mitsuba 3 version instead! ⚠️⚠️
This repository is no longer maintained. Please check out the new version, which uses Mitsuba 3 instead of Mitsuba 2. It has much faster render times and makes it very easy to use vectorized CPU and GPU modes through Dr.Jit. See the new Mitsuba 3 version of this project.




Documentation and usage

See the tal Python library, which provides utilities to run Mitsuba and visualize its results.
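
As a rough sketch of that workflow (the tal function and attribute names below are assumptions quoted from memory of the tal documentation and may differ in your installed version; matplotlib is used only for the preview):

import numpy as np
import matplotlib.pyplot as plt
import tal  # assumed installed, e.g. via `pip install y-tal`

# Load a transient capture rendered with mitsuba2-transient-nlos
# (tal.io.read_capture and the .H attribute are assumptions, check the tal docs)
data = tal.io.read_capture('capture.hdf5')

# Assume data.H holds the transient volume with the time axis first;
# summing over time gives a steady-state preview image
H = np.array(data.H)
plt.imshow(H.sum(axis=0))
plt.title('Time-integrated transient capture')
plt.show()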

Transient path tracing

We provide the transientpath, transientstokes and streakhdrfilm plugins. An example scene is shown below, followed by a sketch of how to render it from Python.

<scene version="2.2.1">
   <!-- Use transientpath integrator -->
   <integrator type="transientpath">
      <!-- Discard paths with depth >= max_depth -->
      <integer name="max_depth" value="4"/>
   </integrator>

   <!-- Use transientstokes integrator -->
   <!--
   <integrator type="transientstokes">
      <integrator type="transientpath">
            <integer name="max_depth" value="10"/>
      </integrator>
   </integrator>
   -->

   <!-- Geometry, etc. -->

   <sensor ...>
      <!-- Sensor configuration etc. -->

      <!-- Streak (transient) film with dimensions width x height x time -->
      <film type="streakhdrfilm" name="streakfilm">
         <integer name="width" value="1024"/>
         <integer name="height" value="1024"/>
         <integer name="time" value="400"/>
         <float name="exposure_time" value="8"/>
         <float name="time_offset" value="500"/>
         <rfilter name="rfilter" type="gaussian"/>
         <boolean name="high_quality_edges" value="true"/>

         <!-- NOTE: tfilter is not yet implemented -->
         <rfilter name="tfilter" type="gaussian"/>
      </film>

   </sensor>
</scene>
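
For reference, a scene like the one above can be rendered through Mitsuba 2's standard Python bindings. The following is a minimal sketch: it assumes a transient-capable variant has been compiled, the file name is illustrative, and post-processing of the streakhdrfilm output is best done with the tal library rather than a plain EXR dump.

import os
import mitsuba

# Use any variant enabled in your mitsuba.conf
mitsuba.set_variant('scalar_rgb')

from mitsuba.core import Thread
from mitsuba.core.xml import load_file

scene_path = 'scenes/transient_scene.xml'  # hypothetical file containing the scene above
# Let Mitsuba's file resolver find meshes/textures referenced by the scene
Thread.thread().file_resolver().append(os.path.dirname(scene_path))

scene = load_file(scene_path)
sensor = scene.sensors()[0]

# Standard Mitsuba 2 entry point; with the transientpath integrator the
# streakhdrfilm accumulates a width x height x time volume
scene.integrator().render(scene, sensor)

# Develop the film to disk (how the transient dimension is written out
# depends on the streakhdrfilm configuration)
film = sensor.film()
film.set_destination_file('transient_render.exr')
film.develop()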

Non-line-of-sight data capture

We provide the nloscapturemeter plugin; see its plugin documentation for additional details. An example scene is shown below.

Note that variables starting with $ should be replaced with your own values (a sketch of how to set them from Python follows the scene).

<scene version="2.2.1">
   <integrator type="transientpath">
      <!-- Recommended 1 for progress bar (see path integrator) -->
      <integer name="block_size" value="1"/>
      <!-- Discard paths with depth >= max_depth -->
      <integer name="max_depth" value="4"/>
      <!-- Only account for paths with depth = filter_depth -->
      <!-- <integer name="filter_depth" value="3"/> -->
      <boolean name="discard_direct_paths" value="true"/>
      <!-- Next event estimation for the laser through the relay wall (recommended true) -->
      <boolean name="nlos_laser_sampling" value="true"/>
      <boolean name="nlos_hidden_geometry_sampling" value="false"/>
      <boolean name="nlos_hidden_geometry_sampling_do_mis" value="false"/>
      <boolean name="nlos_hidden_geometry_sampling_includes_relay_wall" value="false"/>
   </integrator>

   <!-- Relay wall and hidden geometry materials -->
   <bsdf type="diffuse" id="white">
      <rgb name="reflectance" value="0.7, 0.7, 0.7"/>
   </bsdf>

   <!-- Hidden geometry -->
   <shape type="obj">
      <string name="filename" value="$hidden_mesh_obj"/>
      <bsdf type="twosided">
            <ref id="white"/>
      </bsdf>

      <transform name="to_world">
            <scale x="$hidden_scale" y="$hidden_scale" z="$hidden_scale"/>
            <rotate x="1" angle="$hidden_rot_degrees_x"/>
            <rotate y="1" angle="$hidden_rot_degrees_y"/>
            <rotate z="1" angle="$hidden_rot_degrees_z"/>
            <translate x="0" y="0" z="$hidden_distance_to_wall"/>
      </transform>
   </shape>

   <!-- Relay wall -->
   <shape type="rectangle">
      <ref id="white"/>

      <transform name="to_world">
            <rotate z="1" angle="180"/>
            <scale x="$relay_wall_scale" y="$relay_wall_scale" z="1"/>
      </transform>

      <!-- NLOS capture sensor placed on the relay wall -->
      <sensor type="nloscapturemeter">
            <sampler type="independent">
               <integer name="sample_count" value="$sample_count"/>
            </sampler>

            <!-- Laser -->
            <emitter type="projector">
               <spectrum name="irradiance" value="400:0, 500:80, 600:156.0, 700:184.0"/>
               <float name="fov" value="1.5"/>
            </emitter>

            <!-- Account for the time of flight of the laser->relay wall and relay wall->sensor paths -->
            <boolean name="account_first_and_last_bounces" value="$account_first_and_last_bounces"/>

            <!-- World-space coordinates -->
            <point name="sensor_origin" x="-0.5" y="0" z="0.25"/>
            <point name="laser_origin" x="-0.5" y="0" z="0.25"/>
            
            <!-- alternative to laser_lookat_pixel -->
            <!-- <point name="laser_lookat_3d" x="0" y="0" z="0"/> -->
            
            <!-- Screen-space coordinates (see streakhdrfilm) -->
            <point name="laser_lookat_pixel" x="$laser_lookat_x" y="$laser_lookat_y" z="0"/>

            <!-- Transient image I(width, height, num_bins) -->
            <film type="streakhdrfilm" name="streakfilm">
               <integer name="width" value="$sensor_width"/>
               <integer name="height" value="$sensor_height"/>

               <!-- Recommended to prevent clamping -->
               <string name="component_format" value="float32"/>

               <integer name="num_bins" value="$num_bins"/>

               <!-- Auto-detect start_opl (and also bin_width_opl if set to a negative value) -->
               <boolean name="auto_detect_bins" value="$auto_detect_bins"/>
               <float name="bin_width_opl" value="$bin_width_opl"/>
               <float name="start_opl" value="$start_opl"/>

               <rfilter name="rfilter" type="box"/>
               <!-- NOTE: tfilters are not implemented yet -->
               <!-- <rfilter name="tfilter" type="box"/>  -->
               <boolean name="high_quality_edges" value="false"/>
            </film>
      </sensor>
   </shape>
</scene>
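
As a reference for filling in those $ variables, here is a minimal sketch from Python. It relies on Mitsuba's parameterized-scene mechanism: the assumption is that keyword arguments passed to load_file substitute the corresponding $-parameters, and all file names and values below are purely illustrative.

import os
import mitsuba

mitsuba.set_variant('scalar_rgb')

from mitsuba.core import Thread
from mitsuba.core.xml import load_file

scene_path = 'scenes/nlos_scene.xml'  # hypothetical file containing the scene above
Thread.thread().file_resolver().append(os.path.dirname(scene_path))

# Assumption: keyword arguments are forwarded to the scene's $-parameters;
# all values below are illustrative and should be adapted to your setup
scene = load_file(
    scene_path,
    hidden_mesh_obj='meshes/hidden_object.obj',
    hidden_scale='0.5',
    hidden_rot_degrees_x='0', hidden_rot_degrees_y='180', hidden_rot_degrees_z='0',
    hidden_distance_to_wall='1.0',
    relay_wall_scale='1.0',
    sample_count='10000',
    account_first_and_last_bounces='false',
    laser_lookat_x='32', laser_lookat_y='32',
    sensor_width='64', sensor_height='64',
    num_bins='300',
    auto_detect_bins='true',
    bin_width_opl='0.006',
    start_opl='0',
)

scene.integrator().render(scene, scene.sensors()[0])

If your build's mitsuba command-line tool supports -D definitions for scene parameters, the same substitutions can be made there as well (e.g. -Dsample_count=10000).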

Refer to the original Mitsuba 2 repository for additional documentation and instructions on how to compile, use, and extend Mitsuba 2.

License and citation

See the original repository. Additionally, if you are using this code in academic research, we would be grateful if you cited our publication:

@article{royo2022non,
    title = {Non-line-of-sight transient rendering},
    journal = {Computers & Graphics},
    year = {2022},
    issn = {0097-8493},
    doi = {https://doi.org/10.1016/j.cag.2022.07.003},
    url = {https://www.sciencedirect.com/science/article/pii/S0097849322001200},
    author = {Diego Royo and Jorge García and Adolfo Muñoz and Adrian Jarabo}

mitsuba2-transient-nlos's People

Contributors

4str0m, acrlakshman, alhirzel, andipotal, arpit15, aspurdy, bathal1, belcour, diegoroyo, diiigle, dnakath, dvicini, fritschy, jammm, jczh98, jgarciapueyo, kopetri, leroyvn, loubetg, matthewpurri, mehrab2603, merlinnd, nathan96g, pidgeybe, pjessesco, schunkes, sergeyreznik, speierers, tizian, wjakob


mitsuba2-transient-nlos's Issues

Three minor questions about this code

Hi! I've been following your group's work and recently shifted my focus from BunnyKiller to mitsuba2-transient-nlos. I have three minor questions about this code, and I'd appreciate it if you could answer them:

Medium support for transient path sampling

In transientpath.cpp there is a function called sample, which I believe is responsible for obtaining the transient time samples. It overrides the virtual function defined in the class TransientSamplingIntegrator. However, the implementation in transientpath.cpp (from line 197) is:

void sample(const Scene *scene, Sampler *sampler, const RayDifferential3f &ray_,
           const Medium * /* medium */,
           std::vector<FloatSample<Float>> & /* aovs_record */,
           std::vector<RadianceSample<Float, Spectrum>> &timed_samples_record,
           Float max_path_opl, Mask active) const override {

and medium does not seem to be used at all. Also, the logic in this function seems to consider only surface interactions, not medium interactions. So I wonder: am I misunderstanding something and the code actually supports transient path sampling with participating media? Or does the code not support medium interactions? If so, will this feature be added, or, if I were to implement it myself by referring to the original volpath.cpp and your code, would that be tractable? I think this is a very important feature (BunnyKiller and mitsubaToF both have it).

Mutex problem

In streakfilm.cpp, lines 126-127, global_max_opl and global_min_opl are two global variables that might be modified by multiple threads:

/* Critical section: update min/max OPL */ {
     global_max_opl = std::max(global_max_opl, local_max_opl);
     global_min_opl = std::min(global_min_opl, local_min_opl);
}

Why is there no mutex locking these variables, like what is done in lines 132-134 (right below)? These two variables are not atomic, and min/max operations are not atomic either. Is this some special feature of Enoki, or is there another reason behind it?

mitsuba3

Do you have any plans to move to Mitsuba 3, since Mitsuba 2 is no longer maintained? I lack a good understanding of these algorithms and have no experience with Enoki or Dr.Jit, so although I would really like to try porting your transient-imaging and NLOS-related code to Mitsuba 3, it looks like a major refactoring effort...

Compile issue

  • [🔨 compilation issue]

Summary

I got compile errors when I followed the documentation, but the official original repository compiles without problems.

System configuration

  • Platform: Windows 11/ Ubuntu 22.04 LTS
  • Compiler: VS 2022/ clang 14.0
  • Python version: 3.9/ 3.9
  • Mitsuba 2 version: v2.2.1-nlos1.1
  • Compiled variants:
    • scalar_rgb
    • packet_mono

Description

As described above, many data-type errors occur when compiling, which is quite puzzling.
Could you share your system configuration as a reference?
Thanks a lot!
