[✨ feature] Emissive Volume Rendering #678
base: master
Conversation
I've identified the issue with the failing tutorial notebook; it's specifically related to the …
Hi @Microno95, I quickly skimmed through this PR. It's looking good! I'll need to spend more time to carefully review it and think about some of the apparently necessary interface changes 😄
No worries! I've made a list of the necessary features for NEE sampling of volumes; hopefully it helps with deciding on the interface, and I'm very happy to discuss it in more detail as well 😊 Extensions to …
I think these should belong to the … The emitter then calls through to the appropriate method while forwarding the extra random sample as needed.
Force-pushed from 331b87d to 2ca6695
Nice work! Sorry I somehow missed this new PR. Just a high-level comment for now: this adds quite a lot of code complexity; would it make sense to have special integrators for emissive media? It's somewhat of a special case, and the existing volpath integrators are already hard to parse (e.g., for someone coming from using NeRF...). Also, would it make sense to move some of those "medium probabilities" functions to the medium class?
I have significantly simplified the sampling of emission and reduced the complexity of the integrators in my latest code, which I will push soon. Generally, I think it does make sense to move these to the medium class, as the different medium event sampling modes are better suited for different media: e.g., if a medium has zero scattering for, say, the green channel, the analogue probabilities actually underestimate the number of scatters when using hero wavelength sampling and lead to significant fireflies.

It turns out you can sample radiance at every medium interaction (at least when not using NEE for medium emitters; that's a separate case to be worked on), and this has simplified the inclusion of radiative media effects to a single additional line in both …

I have held off on pushing any updates as I was waiting on how I should go about updating the endpoint/emitter interface to support a third random sample.
I see, thanks for clarifying. I don't think adding an extra sample to the endpoint interface is a big deal. How do you deal with sampling the volume of complex shapes? Do you simply sample the bounding box and fail the sample if it falls outside the shape? I think it's currently still tricky to use an arbitrary number of random samples within a virtual function call (e.g., to do rejection sampling).
Does it make sense to add the third volume sample by changing the Point2f sample to a Point3f?

I have not yet decided on the best way to sample triangle mesh-based shapes. My two ideas were to either sample within the bounding box and fail the sample if it falls outside, as you said, or to build a discrete distribution using a simple tetrahedral decomposition of the mesh. The upside of using a tetrahedral representation (or really any discretisation of the volume) is that it would be faster to sample and easier to check whether a sample falls inside/outside the shape. The alternative is iterating over every triangle to calculate a winding number, or using ray intersections (counting the number of crossings). The issue with ray intersections is choosing a direction for testing, and I've found that it can be problematic depending on the mesh; the winding-number calculation scales linearly with the number of triangles.
I think changing the sample to be Point3f isn't a big deal, better than adding another parameter.

I don't think adding a tetrahedral representation makes sense (from an engineering point of view). As far as I know (correct me on that), computing a tetrahedralization is far from trivial and not something that can easily be implemented in a standalone helper function. I would suggest checking whether a point is inside the mesh by simply tracing a ray in a fixed (or randomized) direction and checking whether the triangle we hit is facing away from the ray direction. This should work for watertight meshes, which is in any case the assumption for any of the volumetric integrators to work correctly.
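For reference, here is a minimal sketch of this inside test plus bounding-box rejection sampling using the Python bindings. This is not code from the PR; the helper names are hypothetical, and it assumes the scene contains only the watertight mesh in question:

```python
import mitsuba as mi
import drjit as dr

mi.set_variant('scalar_rgb')

def point_inside(scene, p, d=None):
    # Trace a ray in a fixed direction; if the closest hit faces away from the
    # ray (geometric normal and ray direction point the same way), the origin
    # lies inside the watertight mesh.
    d = mi.Vector3f(0, 0, 1) if d is None else d
    si = scene.ray_intersect(mi.Ray3f(p, d))
    return si.is_valid() and dr.dot(si.n, d) > 0

def sample_point_in_volume(scene, shape, sampler, max_tries=64):
    # Rejection sampling: draw uniform points in the shape's bounding box and
    # return the first one that lies inside the mesh.
    bbox = shape.bbox()
    for _ in range(max_tries):
        u = mi.Point3f(sampler.next_1d(), sampler.next_1d(), sampler.next_1d())
        p = dr.lerp(bbox.min, bbox.max, u)
        if point_inside(scene, p):
            return p
    return None
```

In the vectorized llvm_*/cuda_* variants the Python-level loop would of course have to be expressed with masked Dr.Jit operations instead.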
That makes sense! I will work on those and push an update ASAP.
Hey @dvicini, I am in the process of implementing the interface for next event estimation of volume emitters, but I'm struggling with estimating the PDF of a direction sampled in the emitter, specifically implementing the … method.

The issue I'm having is that while I know the PDF of sampling a given point in the medium, the PDF of a direction is with respect to solid angle, whereas the PDF of sampling a point is with respect to volume, and I'm not sure how to make the two commensurate. I know the following relation should hold, i.e. the integral of the solid-angle PDF over all solid angles should equal the integral of the volume PDF over the whole volume (and both should equal 1 for a normalised PDF):

∫_{S²} p_Ω(ω) dω = ∫_V p_V(x) dx = 1

Knowing that, I can rewrite the volume integral in spherical coordinates around the reference point x_ref (using dx = t² dt dω):

∫_{S²} p_Ω(ω) dω = ∫_{S²} ∫_0^∞ p_V(x_ref + t·ω) t² dt dω

And (this is the part I'm unsure about) this should imply that

p_Ω(ω) = ∫_0^∞ p_V(x_ref + t·ω) t² dt

If all that is correct, then the PDF of sampling a given direction should be straightforward to compute based on the line segments that pass through the medium, which are easily found via ray intersections. I'm not sure how this is to be treated in the case of medium vs. BSDF interactions, given that …
The Jacobian to change between solid angle and volume sampling is indeed the squared distance.

I am not sure I fully understand your question. If this is primarily about MIS, you will just want to make sure that both PDFs are in the same measure. The conceptually simplest way is to integrate the volume PDF over the ray segment (e.g. using ray marching), but that is costly and potentially inaccurate. It's good to keep in mind that MIS weights don't have to use exact PDFs, they just need to be computed consistently to ensure unbiasedness. Another option could be to compute the MIS in the volume measure; I think that would forgo the need to integrate over segments (potentially at an increase in noise, but with cheaper per-sample cost). In that case, you would just have to convert the solid-angle PDF to a volume PDF using the geometry term.
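As a quick numerical sanity check of that relation (a standalone numpy sketch, not Mitsuba code): for uniform volume sampling inside a unit sphere viewed from an outside point, integrating p_V · t² along each ray's chord gives a directional PDF that integrates to one over all directions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

x0 = np.array([0.0, 0.0, 3.0])             # reference point outside the unit sphere
p_V = 1.0 / (4.0 / 3.0 * np.pi)             # uniform volume pdf inside the unit sphere

# Uniformly distributed directions on the unit sphere
d = rng.normal(size=(N, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)

# Ray/sphere intersection: |x0 + t*d|^2 = 1  =>  t^2 + 2*b*t + c = 0
b = d @ x0
c = x0 @ x0 - 1.0
disc = b * b - c
hit = (disc > 0.0) & (b < 0.0)              # both roots positive => chord lies in front

t0 = -b[hit] - np.sqrt(disc[hit])
t1 = -b[hit] + np.sqrt(disc[hit])

# Solid-angle pdf of each direction: integral of p_V * t^2 over the chord
p_omega = np.zeros(N)
p_omega[hit] = p_V * (t1**3 - t0**3) / 3.0

# Monte Carlo estimate of the integral over directions (uniform pdf = 1/(4*pi))
print(4.0 * np.pi * p_omega.mean())         # should be close to 1.0
```

The same identity underlies the line-integration approach to the directional PDF discussed further down the thread.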
Right, I think I understand better now. I was thinking I'd need to convert the volume PDF to a solid-angle PDF in order to do MIS with the volume emitter, but you're saying I can do it the other way around (which makes perfect sense). In terms of converting the solid-angle PDF to a volume PDF, would it be correct to divide by the squared distance to the sampled point?
Yes, this sounds good to me. You will need to make sure that the MIS is consistent with the evaluation of emission after direction sampling. I.e., the simplest method would again be to evaluate the emission along the entire ray segment.
It seems that simply converting the solid-angle PDF to a volume PDF does not work, or perhaps I'm doing something wrong. I've verified that integrating the volume PDF over the ray length works, but it is slower (by a lot, around 20x-40x, due to the ray casts required) and raises OptiX warnings in CUDA modes. At first I thought it would be a simple change of variables using the differential element dx = t² dt dω. Is it perhaps to do with the fact that …
Force-pushed from a9ebdb5 to a424aed
The current implementation correctly tracks the ray entry and exit points in a given shape and calculates the exact directional PDF. In …

~~Was wrong about this~~ I am working on improving and fixing the implementation, but I would welcome a second set of eyes over the code to see if there's anything I'm missing in how I'm accumulating the PDF.

Okay, never mind, it seems that both are incorrect by the same amount for an as-of-yet unknown reason. With OptiX I was getting the correct results in that case due to a bug, so I'll investigate down that path.
Force-pushed from a424aed to 09bf724
Looking further into it, it seems that … I've attached two renders, with and without NEE (512×512 @ 4096 spp), that show the discrepancy between the NEE and non-NEE results. This is of a …

The discrepancy definitely has to do with how the PDF is computed, but I'm not sure what's wrong, as I've verified that the calculations are correct by computing the analytic solution to the directional PDF and comparing it to the ray-traced solution for the …

I've also attached the .exr files themselves for better numerical comparison in TEV.
Hmm, really hard to say what's wrong just from these two images. I would say that for a homogeneous sphere of emissive volume it should really be possible to get a perfect match, as it doesn't have any of the complexity of complex shapes or delta tracking for heterogeneous media. My only suggestion would be to further reduce the complexity: 1) only render direct illumination, no scattering, and 2) completely disable absorption. I.e., can you match the total emission from an emissive sphere volume that does not absorb any light?
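For that minimal test the ground truth is available in closed form: with no absorption and no scattering, the radiance arriving along a camera ray is just the emitted radiance per unit length times the chord length through the sphere. A small standalone sketch (hypothetical helper, not code from the PR):

```python
import numpy as np

def emissive_sphere_radiance(o, d, center, radius, emitted_per_unit_length):
    """Radiance along the ray o + t*d through a homogeneous, purely emissive sphere
    (no absorption, no scattering): emission integrates to eps * chord length."""
    d = d / np.linalg.norm(d)
    oc = o - center
    b = np.dot(oc, d)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc <= 0.0:
        return 0.0
    t0 = max(-b - np.sqrt(disc), 0.0)   # clamp to the part of the chord in front
    t1 = max(-b + np.sqrt(disc), 0.0)
    return emitted_per_unit_length * (t1 - t0)

# Example: ray from (0, 0, 3) through the center of a unit sphere at the origin
print(emissive_sphere_radiance(np.array([0., 0., 3.]), np.array([0., 0., -1.]),
                               np.array([0., 0., 0.]), 1.0, 0.5))  # -> 1.0 (chord = 2)
```

Comparing the rendered image against this per-pixel value isolates the emission sampling from any PDF/MIS issues.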
With a homogeneous sphere where …

In the literature, the closest I've seen to the implementation of the solid-angle PDF is in Equation 15 of Line Integration for Rendering Heterogeneous Emissive Volumes, where they estimate the solid-angle PDF by integrating the volume PDF along the ray direction with the appropriate Jacobian. Perhaps there is additional consideration in how NEE estimates of illumination are combined with unidirectional estimates for volume emitters.

I should add that a purely emissive medium is not possible with the current homogeneous medium handling in volpath/volpathmis, due to the way distance sampling in the medium is computed and how homogeneous media are optimised in the code paths. I.e., to gather emission from homogeneous emissive media with no absorption, there is no way to sample free-flight distances and gather samples at different points along the ray, since the ray immediately exits the medium. Thus I achieve the renders by using a heterogeneous medium with …
I see, yes, it makes sense that for this setting you need to use some non-zero null density to even sample the emission contributions. I would investigate this case further; this seems like a good minimal test case to debug.
I've looked further into it, and I think there's actually a bug in how …

Here's the reference render using the …

And here's the render from …

I've also included the renders from …

I've tested it and the issue exists in the …
Oh, that's a good catch. Not sure what's wrong here, but maybe something related to the ShadowEpsilon or so. I will take a look; it would be good to fix.
Had a chance to take a look while wrapping my head around the …

mitsuba3/src/integrators/volpath.cpp, lines 419 to 425 in ffcde97:
remaining_dist = ds.dist * (1.f - math::ShadowEpsilon<Float>) - total_dist;
ray.maxt = remaining_dist;
active &= (remaining_dist > math::ShadowEpsilon<Float>);
if (dr::none_or<false>(active))
    break;
I think the problem is a bit more subtle. By adding the epsilon to the check for the remaining distance, you are essentially doubling the shadow epsilon. The epsilon is already included in the computation of remaining_dist.
… volume sampling. Modified every dependency accordingly and added a flag to signal when the third sample is used. Updated all tests to correctly account for the use of Point3f over Point2f.
… both shapes and volumelights.
…implemented tests.
…olumetric integrators
…ded more tests for volumelight.cpp
…last_scatter_event assignment
…`cuda_*` modes and removed TODO as it's technically implemented
…ed flags to volpath for disentangling emitter and unidirectional sampling
…g with analytic sphere shape, needs mesh shape to be fixed
…ers, needs further testing
…e emitters, needs further testing
…ed up volumetric integrators
… 2d for volume sampling. Modified every dependency accordingly and added a flag to signal when the third sample is used. Updated all tests to correctly account for the use of Point3f over Point2f.
…roperty for sampling emission of medium
…phere and cylinder, and rejection-based sampling to mesh-based shapes
…ters and added appropriate python interfaces/documentation.
…cases in test_ad_integrators.py
TODO: Revert back to mitsuba3/mitsuba-data when merged
… on the git test runner.
… differentiation to prbvolpath.py - probably needs further testing. Optimized volpath.cpp for better ray coherence by looping through the medium traversal for all lanes.
…rd-mode differentiation to prbvolpath.py - probably needs further testing. Optimized volpath.cpp for better ray coherence by looping through the medium traversal for all lanes.
…ium looping as it is slower to render in many cases.
… to failing test.
Force-pushed from bb1fc9d to 23cf67a
Force-pushed from 23cf67a to 4e191ce
Description
This PR adds emissive event sampling in the null-scattering volpathmis integrator for sampling the volume rendering equation (VRE) and MIS for emission. It supersedes the previous PR #40.
Adds emissive volume support by adding a volume emitter. The interface adds a volumelight emitter to a given shape that is then sampled as part of the medium sampling routines in volpath/volpathmis/prbvolpath.

The current implementation modifies none of the interfaces of endpoint and thus does not work with MIS as of yet. I am unsure of the best way to implement this in terms of modifying the interface. The cleanest solution seems to be that the endpoint sampling methods take an extra float to sample a 3D point, but whether this should be implemented by changing the existing Point2f sample to Point3f or adding an additional argument is unclear to me.
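To make the two options concrete, here is a hypothetical illustration with simplified stand-ins (not the PR's actual interface; the names only loosely mirror Endpoint.sample_position in the Python bindings):

```python
from typing import Tuple

# Option A: widen the existing positional sample from 2D to 3D. Surface emitters
# simply ignore the third component; volume emitters use it to pick a point
# inside the shape's volume.
def sample_position_option_a(time: float, sample: Tuple[float, float, float]):
    u, v, w = sample
    # ... sample a surface point from (u, v), or a volume point from (u, v, w)
    raise NotImplementedError

# Option B: keep the 2D sample and thread an extra 1D sample through the call.
def sample_position_option_b(time: float, sample: Tuple[float, float],
                             sample_extra: float):
    u, v = sample
    w = sample_extra
    # ... same information, but every caller now passes an additional argument
    raise NotImplementedError
```

As the commit history above indicates, the PR ultimately went with the first option (changing the Point2f sample to Point3f).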
The current implementation is feature complete except for NEE for volume emitters. I have added and tested the Python interfaces by bringing the prbvolpath implementation to parity with volpath.
Checklist
- ~~Generates warnings about unused parameters in volumelight; these methods will be implemented and the parameters used.~~ There are no longer any new warnings.
- … cuda_* and llvm_* variants: I have compiled every single variant and run all the tests, and everything passes except for the tutorial notebooks, but these throw a Unicode char encoding error, so I think it's unrelated. I will check with a fresh master copy whether the same error occurs; it might be a Windows-only issue.
- ~~This is in progress; I have added emissive volume scenes and tests that I will make a PR for in the mitsuba-renderer/mitsuba-data repo.~~ Created the following PR to merge emissive volume tests: Emissive Participating Media reference data mitsuba-data#17

Notes
volpath and prbvolpath have implementations of the analogue, maximum, and mean event probability calculations for testing. The issue of inconsistency only appears with analogue probabilities in non-scalar modes, and only for volpathmis and prbvolpath.