LightProbeVolume: Baked lighting for dynamic objects #18371
base: dev
Conversation
Great job @donmccurdy, this is amazing stuff!
To start chipping away at the unresolved parts of this PR:
Unless there's any strong argument against it, I'd like to require grid-shaped volumes for now. That should let us drop the triangulation dependency (smaller build!) and means that if we later want to represent the volume as a GPU texture, it won't require a breaking change. I can't think of a way to represent large, non-grid probe volumes efficiently as GPU textures.
Ok, grids only for now. I've dropped the 'delaunay-triangulate' dependency, and tetrahedra are tessellated directly from the grid structure. Next question — per-object SH. This could be addressed in a separate PR, but the light volume isn't much use if only one dynamic object can receive lighting. Discussion in #16228 covered a few possible naming conventions and APIs, but mostly focused on scene-wide lights... Do any of these make sense?
Or is there another way we'd prefer to have the LightProbeVolume applied to individual objects?
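As an aside on the grid tessellation mentioned above, the idea can be sketched in isolation. This is an illustrative sketch of one standard decomposition (six tetrahedra per cell, sharing the cell's main diagonal — the Kuhn decomposition), not necessarily the exact tessellation used in this PR; `CELL_TETRAHEDRA`, `corner`, and `tetVolume` are hypothetical names.

```js
// Sketch: split one grid cell (a unit cube) into six tetrahedra that share
// the main diagonal from corner 0 to corner 7.
// Corner i of the cube has coordinates ( i & 1, ( i >> 1 ) & 1, ( i >> 2 ) & 1 ).

const CELL_TETRAHEDRA = [
	[ 0, 1, 3, 7 ], [ 0, 3, 2, 7 ], [ 0, 2, 6, 7 ],
	[ 0, 6, 4, 7 ], [ 0, 4, 5, 7 ], [ 0, 5, 1, 7 ],
];

function corner( i ) {

	return [ i & 1, ( i >> 1 ) & 1, ( i >> 2 ) & 1 ];

}

// Volume of a tetrahedron from the scalar triple product.
function tetVolume( [ a, b, c, d ] ) {

	const u = b.map( ( n, i ) => n - a[ i ] );
	const v = c.map( ( n, i ) => n - a[ i ] );
	const w = d.map( ( n, i ) => n - a[ i ] );

	const det = u[ 0 ] * ( v[ 1 ] * w[ 2 ] - v[ 2 ] * w[ 1 ] )
		- u[ 1 ] * ( v[ 0 ] * w[ 2 ] - v[ 2 ] * w[ 0 ] )
		+ u[ 2 ] * ( v[ 0 ] * w[ 1 ] - v[ 1 ] * w[ 0 ] );

	return Math.abs( det ) / 6;

}

// Sanity check: the six tetrahedra exactly fill the cell (volumes sum to 1).
const total = CELL_TETRAHEDRA
	.map( ( tet ) => tetVolume( tet.map( corner ) ) )
	.reduce( ( sum, v ) => sum + v, 0 );
```

Because every cell is split the same way, neighboring cells share tetrahedron faces, so interpolation is continuous across cell boundaries.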
I was hoping I could work around the issue above without core changes, by updating the single global probe with:

```js
var volume = new THREE.LightProbeVolume().fromJSON( data );
var probe = new THREE.LightProbe();
scene.add( probe );

meshes.forEach( ( mesh ) => {

	mesh.onBeforeRender = () => volume.update( mesh, probe.sh );

} );

function render () {

	renderer.render( scene, camera );

}
```

... unfortunately, that doesn't work. The light probe's uniforms are (understandably) not updated between each object. Some change or addition to the current use of LightProbe in the renderer will be necessary, I think. If this code were mature and production-ready (it isn't yet...), and we wanted it in the core of three.js, I would say that probe volumes should be assigned to objects very much like we do for environment maps today, for consistency:
Does that seem like the right direction? If so, maybe we start with supporting either a LightProbe or SH assigned to a material initially, and the probe volume — or user code — can update that?
Makes sense to me 👌
Thanks! Trying to fit the environment map pattern, then. Attaching an Object3D subclass like LightProbe directly to a Material feels awkward, so I think I'd rather do something like this:

```js
material.indirectDiffuseSH = new THREE.SphericalHarmonics3().zero();
```

... zero by default, and added to any AmbientLight, HemisphereLight, or LightProbe already in the scene for backward compatibility. Does that make sense to you @WestLangley + @bhouston? I took the name from an older comment (#16228 (comment)) by Ben:
I would also be happy with names derived from "global illumination" or "irradiance".
Well, it works in any case. 🙂
The environment map could have been named
I am not really happy with any of those suggestions, TBH.
I think this PR is excellent. However, I also think some changes are in order... for a later PR, that is. In spite of what I said elsewhere, a light probe is not a source of light. It is a probe of light. It measures illuminance. A probe measures light at a given location/direction. Consequently, I no longer think LightProbe should extend Light. Furthermore, I think we should begin thinking in these terms, and modify the library accordingly. Exactly how we go about doing that is up for discussion.
@WestLangley thanks for the comments!
I think these two issues are closely related. I agree that LightProbe does not need to extend Light, and arguably should not. Similarly, LightProbeVolume in this PR is a (compound) probe rather than a light, as it "measures light at [many] given location/direction[s]". In practice, these locations are usually predetermined, e.g. with grid structures as I've done here. Those are the locations at which probes measure light. What is left after that is not measurement but interpolation — an object moves through a volume, and an estimation of the light at the object's location (stored as SH3) is contributed to the object's lighting calculation. While you could store that derived SH3 in a LightProbe again, I find that unsatisfying as the resulting probe is not being used to "measure." For the same reason, assigning a probe to a material seems unsatisfying. That is the argument I see for assigning an SH3 to the material rather than a probe — the SH3 simply represents a value.
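The interpolation step described above can be sketched in isolation: blend the four vertex probes of the enclosing tetrahedron by barycentric weights. A minimal sketch, assuming probe data is stored as flat coefficient arrays (three.js stores SH3 as nine Vector3s; the flat layout and the function names here are illustrative only):

```js
// Sketch: estimate lighting at a point inside a tetrahedron by blending the
// four vertex probes' SH coefficients with barycentric weights.

function signedVolume( a, b, c, d ) {

	const u = b.map( ( n, i ) => n - a[ i ] );
	const v = c.map( ( n, i ) => n - a[ i ] );
	const w = d.map( ( n, i ) => n - a[ i ] );

	return ( u[ 0 ] * ( v[ 1 ] * w[ 2 ] - v[ 2 ] * w[ 1 ] )
		- u[ 1 ] * ( v[ 0 ] * w[ 2 ] - v[ 2 ] * w[ 0 ] )
		+ u[ 2 ] * ( v[ 0 ] * w[ 1 ] - v[ 1 ] * w[ 0 ] ) ) / 6;

}

function barycentricWeights( p, [ a, b, c, d ] ) {

	const v = signedVolume( a, b, c, d );

	// Each weight is the relative volume of the sub-tetrahedron formed by
	// replacing one vertex with p. The weights sum to 1 when p is inside.
	return [
		signedVolume( p, b, c, d ) / v,
		signedVolume( a, p, c, d ) / v,
		signedVolume( a, b, p, d ) / v,
		signedVolume( a, b, c, p ) / v,
	];

}

// Blend per-vertex SH coefficient arrays (same length each) by the weights.
function blendSH( weights, vertexSH ) {

	const out = new Array( vertexSH[ 0 ].length ).fill( 0 );
	weights.forEach( ( w, i ) => vertexSH[ i ].forEach( ( c, j ) => out[ j ] += w * c ) );
	return out;

}
```

At a vertex the blend reproduces that probe exactly; at the centroid it averages all four — which is why the result is an estimate rather than a measurement.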
Assigning the property an instance of SH3 is fine. However, remember that illuminance can be encoded not only as SH3 but also as SH4, for example, or perhaps more commonly as an equirectangular texture. I am not a fan of using the encoding method in the material property name. So personally, I do not like using `indirectDiffuseSH`. I am having difficulty finding a name I do like, but I am leaning toward `indirectDiffuse`.
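For context on what the SH3 encoding contains: the nine coefficients reconstruct irradiance for any surface normal via the standard formula of Ramamoorthi & Hanrahan. A per-channel sketch follows; the constants follow three.js's `SphericalHarmonics3.getIrradianceAt()`, but the scalar formulation and the function name here are just for illustration.

```js
// Sketch: reconstruct irradiance for surface normal ( x, y, z ) from nine
// SH coefficients (one color channel). Convolution with the clamped-cosine
// kernel is folded into the constants.
function shIrradiance( c, [ x, y, z ] ) {

	return 0.886227 * c[ 0 ] // band 0
		+ 2.0 * 0.511664 * ( c[ 1 ] * y + c[ 2 ] * z + c[ 3 ] * x ) // band 1
		+ 2.0 * 0.429043 * ( c[ 4 ] * x * y + c[ 5 ] * y * z + c[ 7 ] * x * z ) // band 2
		+ c[ 6 ] * ( 0.743125 * z * z - 0.247708 )
		+ 0.429043 * c[ 8 ] * ( x * x - y * y );

}
```

An SH4 encoding would simply extend the array to sixteen coefficients; an equirectangular texture would skip the reconstruction and sample directly — which is the point about keeping the encoding out of the property name.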
Ok, that all makes sense! Without the "SH", "material.indirectDiffuse" does not sound as compelling — it may be confusing, since "diffuse" is often used to refer to a material's base color in other software. For the sake of exploring, here's all I can come up with:
I am happy with any of these except the last. I could also imagine something with an […]
I think I might prefer to work with irradiance rather than illuminance. The real-time rendering resources I'm able to find that are relevant to this topic typically use it. See the Filament docs, for example, discussing light probes, "irradianceSH", and "irradianceEnvMap". This implies physical units of W/m² or W/cm².
I think the trend now is to use photometric terms (lumens, candela, lux, nits) — not the radiometric terms based on watts — so I would use the term "illuminance". We already use the photometric terms in the three.js docs. I do not know why the Filament docs are so inconsistent in terminology — both luminance and radiance are used. Maybe it is just force of habit.
I think that is true for lights — note the Filament docs use illuminance only there — but that probes and volumes are always discussed in terms of "irradiance." I don't know why that is, but will try to find out. The intro to this talk was also quite helpful: https://www.gdcvault.com/play/1026182/. |
A pairing like […]. Regardless of what we name this, I'm not completely happy with my implementation in dev...donmccurdy:feat-indirectdiffusesh. It seems odd that two sets of SH3 coefficients have to be stored as uniforms (2 × 9 × vec3), where one is a light uniform (and therefore not updated between mesh draws) and the other is a material uniform, both representing global illumination for the object. Do you think those could be combined somehow, in the renderer?
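On combining the two coefficient sets: since SH lighting is linear, two SH3 sets can in principle be summed coefficient-wise into a single 9 × vec3 uniform before upload — this is the same operation `SphericalHarmonics3.add()` performs per coefficient. A minimal sketch over flattened arrays, with hypothetical names:

```js
// Sketch: fold two SH3 coefficient sets (here flattened to 27 floats,
// 9 coefficients x RGB) into the single array handed to the shader.
// Which set "owns" the combined uniform is exactly the open renderer question.
function combineSH( lightSH, materialSH ) {

	if ( lightSH.length !== materialSH.length ) throw new Error( 'coefficient count mismatch' );
	return lightSH.map( ( v, i ) => v + materialSH[ i ] );

}
```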
@donmccurdy The lighting effect of this scene seems to have a lot of noise.
@FishOrBear if you mean the grain/noise on the hallway itself, that's unrelated to the light probe volume — I just baked the light there with a raycaster, and did so in a hurry, causing the noise. The light probe volume is contributing only the light on dynamic objects (the sphere) in this scene. |
- LightProbeVolume provides baked lighting for dynamic objects.
- LightProbeVolumeHelper displays debug info about a volume.
- Adds HallLightTest.glb and example.
Is this patch still progressing? It seems like really promising work.
@donmccurdy Note that three.js PBR materials automatically account for the irradiance implied by the environment map. Consequently, if you use an environment map, and simultaneously store irradiance in a light probe volume, you will be double-counting the irradiance. three.js does not currently support a PBR workflow where the user provides both a radiance map and an irradiance map.
At the risk of derailing this thread even further... I am working on some very related changes for Hubs (still WIP), so I wanted to chime in since it's quite relevant to the current discussion, and I'm wondering if there is overlap/consolidation that should happen.

I have things set up such that you can define AABBs in which a particular environment map will apply. Objects overlapping multiple environment map boxes ("reflection probes") will blend between the two most-overlapping boxes (or the scene's environment, in the case of partially overlapping with one box). This is quite similar to what Unity does for reflection probes, and the effect is quite nice (this scene has no dynamic lights, just lightmaps + "reflection probes"):

(video: simplescreenrecorder-2021-11-30_17.11.17.mp4)

Since environment maps on MeshStandardMaterials are now all PMREM, this is also sort of acting as a diffuse light probe, which is great. I think we will still want some higher-density probes for diffuse changes (e.g. across hard shadow boundaries), so I was planning to implement a solution like LightProbeVolume in Hubs as well. But it's interesting that apparently Unreal is just using cubemaps? (I have not been able to find a clear indication of this.) Assuming they do, I wonder if we should be doing the same? And if so, does that mean reflection probes would also want to just be in a grid? I suspect that even if we did use a grid of cubemaps for diffuse lighting, doing reflection probes as boxes might still be desirable, as you might want a few higher-resolution probes for actual reflective objects, and having a position and box defined is also useful for doing things like box projection — though you could argue that could be handled as special cases.

Current code for this is here: Hubs-Foundation/three.js@hubs-patches-133...MozillaReality:multiple-envmap
Note this is on top of our three-133 fork, but the only notable change as it pertains to lighting is that we do not apply the irradiance from environment maps to lightmapped objects (we assume you will bake that in already): Hubs-Foundation@eb6297b

Workflow-wise we are doing very much what is described above, just rendering out equirects in Blender. The current script I am using for this is here: https://gist.github.com/netpro2k/fe5a3b1348f3644d9b39e149b7901cf4, though the plan is to integrate this more deeply into our addon. Note that I am using Blender's ReflectionCubemap objects only for their gizmos/UI, as the underlying cubemap data is not actually accessible.
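The box-blending scheme described above can be sketched as follows: weight each box by how deep inside it the sample position lies, then normalize across the two most-overlapping boxes. `insideDepth`, `blendWeights`, and the falloff heuristic are illustrative assumptions, not the actual Hubs implementation.

```js
// Sketch: blend reflection-probe boxes by penetration depth.

function insideDepth( p, box ) {

	// Smallest distance from p to any face; <= 0 when p is outside the box.
	let depth = Infinity;
	for ( let i = 0; i < 3; i ++ ) {

		depth = Math.min( depth, p[ i ] - box.min[ i ], box.max[ i ] - p[ i ] );

	}

	return depth;

}

function blendWeights( p, boxes ) {

	const depths = boxes.map( ( box ) => Math.max( 0, insideDepth( p, box ) ) );
	const order = depths.map( ( d, i ) => i ).sort( ( a, b ) => depths[ b ] - depths[ a ] );

	const weights = new Array( boxes.length ).fill( 0 );
	const d0 = depths[ order[ 0 ] ];
	const d1 = order.length > 1 ? depths[ order[ 1 ] ] : 0;

	if ( d0 + d1 === 0 ) return weights; // outside all boxes: fall back to scene environment

	weights[ order[ 0 ] ] = d0 / ( d0 + d1 );
	if ( d1 > 0 ) weights[ order[ 1 ] ] = d1 / ( d0 + d1 );

	return weights;

}
```

An object centered between two overlapping boxes gets a 50/50 blend; one inside a single box gets that box's probe alone, with the scene environment filling in when no box applies.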
Hi, I'm coming two years late, but I'm working on a Blender plugin that bakes scene probes (the Eevee probes system) into a cubemap sheet. It's still in progress, but advanced enough to be tested in three.js. Here it is: https://github.com/gillesboisson/blender-probes-export It exports:
If people are still working on multi-light-probe integration with three.js, I would be interested in getting input, as it is a huge amount of work to implement a solution from scratch.
That's amazing! I feel like SH would be better for the web. That way we don't have to worry about HDR file sizes.
@gillesboisson Just so you are aware... the three.js PBR materials automatically account for the irradiance (i.e., global illumination) implied by the environment map. Consequently, if you include an environment map plus an irradiance probe in your scene, you will be double-counting irradiance.
Thanks for your feedback. I think it will be a bit tricky with the three.js env map, as I have a multi-level roughness map exported in the reflection map. Maybe for now I should prioritize supporting SH export in a texture; then I'll experiment with three.js. Is there anyone using this or working on this PR? It looks to be outdated.
I don't think there's anyone working on this PR currently.
Cheers, I read this thread, looked into the changes, and looked at the current state of the three.js code. Here are a few ideas on what I see in the Blender plugin and in three.js.

Light probe definition and standard

The plugin uses Blender's Eevee probe objects, which support irradiance grids and reflection cubemaps. It renders each probe element into an equirectangular image using Blender's Cycles rendering engine, in sRGB, but it seems I should switch to HDR for more accurate results. Then, in a second step, the rendered results are packed into sheets.
All of this is detailed further in the plugin repo's README.

From an engine perspective:
It would be interesting to have a new kind of object that defines a probe, as probes are currently defined as lights. It would have its own spatial segmentation (I saw an octogrid class somewhere, but I don't know how it is handled in the scene) and a common clipping/falloff for handling per-object probe influence. It would be very nice, pretty simple to implement, and compatible with future kinds of probes (I saw a tetra/triangle system for irradiance ideas in this thread, and also in an old Unity paper). I'm not a three.js expert and don't yet know the roadmap for supporting more probes, but I would be interested in helping on this part. Maybe first trying to implement a simple solution with a custom shader material extending the PBR material, objects, and scene, and then, with your help, making a proposal based on three.js standards and good practice.
Agreed – for physically-based shading using the light probe data, we'll need HDR; some comments earlier in this thread have come to the same conclusion. Also note that Blender v4 is moving in the direction of supporting wide-gamut color, and this might affect the color space of light data sampled from probes... Probably the best place to start would be Linear Rec. 709 ("Linear sRGB") data in OpenEXR format. We can package the data into something more efficient and web-friendly later down the road. Or, JSON is always convenient too. I have no strong preference on using spherical harmonics vs. low-res cubemaps. @bhouston had commented earlier that low-res cubemaps seem to be what most tools prefer these days, in #18371 (comment). Do others have a preference? We can convert cubemaps to SH but not necessarily the other way around... We should aim for something we can sample efficiently on the GPU. For that reason, I'm no longer really interested in supporting arbitrary tetmesh probe layouts, as I'd tried in earlier versions of this PR. Let's stick to a grid with well-defined falloff, using either SH or low-res cubemaps. Unfortunately I haven't been actively working on this PR for a while, and I don't have bandwidth to pick it back up right now. If anyone else is interested, please do!
Using SH to estimate irradiance from HDR radiance probes can be problematic due to "ringing". There are various workarounds, but I would be inclined to use low-res cube maps instead.
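One common workaround for ringing, for reference, is windowing: attenuating the higher SH bands (the usual source of over/undershoot around bright sources) at the cost of some blur. A minimal sketch over a flat 9-coefficient array; the Hanning-style window and the parameter `w` follow common practice (e.g. Sloan's SH filtering notes), and the function name is hypothetical.

```js
// Sketch: apply a Hanning-style window to SH3 coefficients. Band l covers
// coefficient indices [ l*l, ( l + 1 )*( l + 1 ) ); a smaller `w` means
// stronger smoothing and therefore less ringing.
function windowSH( coeffs, w ) {

	return coeffs.map( ( c, i ) => {

		const l = Math.floor( Math.sqrt( i ) );
		const s = Math.cos( Math.PI / 2 * ( l / w ) ); // 1 at l = 0, falling toward 0 as l approaches w
		return c * s * s;

	} );

}
```

Band 0 (overall brightness) passes through untouched; bands 1 and 2 are progressively attenuated.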
Good, less work on the Blender side :) I'll stick to small cubemaps then. OpenEXR looks good; I'll check on it later, as I don't know yet how to handle it with the Blender OpenGL API (I use Blender's internal API to compute equirectangular panos into irradiance/reflectance cubemaps). I use 16-bit-per-channel PNGs for now. I still need to check how to keep consistency between my baked light probes and Blender's object baking tools' render settings (as I'd like it to integrate with fully light-baked static objects).
Just to give an update on this subject: I'm working on some integration tests in this repo: https://github.com/gillesboisson/threejs-probes-test When I have something working, I'll need some help to integrate it better with three.js standards.
I have some issues with my packing process. In my Blender plugin:
In cubemaps I have weird dots on the X axis of the irradiance maps and the Y axis of the reflectance maps. For reflectance I'm pretty sure it's because of the pano sampling; for irradiance, I really don't get why. It's out of the three.js scope, but if anyone has a reference for this kind of issue, it would be a great help.
Hi, just to give a follow-up: I found some time to work on the probe volume system. You can see a demo here: https://three-probes.dotify.eu I'll start working on extending Material. The roadmap is here: https://github.com/gillesboisson/threejs-probes-test#roadmap
I finally resolved my Blender export issue and added OpenEXR export support. I still need to finalize the global env export; then I'll be able to adapt the three.js prototype to support both SDR and HDR probes, and maybe implement in-engine baking based on what has been done in this PR.
Hi, here is a demo of the probe volume structure: https://three-probes.dotify.eu/ It displays volume influences and probes, and has a sphere following the camera which displays interpolated cubemaps. I could solve most of my compatibility issues and made a lot of improvements on the Blender side. I moved SH to low priority. I'll switch to material integration.
Just to give a quick update: I have an almost-working prototype here: https://three-probes.dotify.eu/ If anyone has docs or references on how shader caching and uniforms work, it would be a great help. Cheers.
Hi, quick report: I found some time to fix issues and do some cleaning. The prototype works and supports most of the targeted features. It would be helpful for me to have feedback from three.js experts, as I'm still learning the low-level parts of the three.js engine. I'll show the prototype to the community to see whether it makes sense for them to use it and add it to three.js. The docs explain the state of the prototype quite well; it's all there. Demo: https://three-probes.dotify.eu/ I'll switch to the Blender side in order to make a more user-friendly plugin. Cheers
@gillesboisson thanks for your patience with our delayed replies here. The demo is outstanding — I'm really thrilled with how this is looking! A few comments, questions, and ideas. Please feel free to respond or ignore them as you prefer. :)
Thanks for your feedback, here are a few ideas:

> I noted that the runtime "Sun" directed light is not visible in the reflection probes. Do you feel that's a user-specific decision, and that users could optionally include a sun lamp when baking in Blender? Or is there more to it? I know matching light intensities across Blender and three.js is not always easy...

Blender probes have a pretty robust solution: for each probe volume you can set a specific visibility collection, with which you decide what is baked or not. In my demo I had to hack around the sun, as the glTF importer seems not to recognize directional lights and in some cases ignores point lights.

> Do you have a preference about whether the runtime components of this should become part of the three.js repository, or would you prefer to manage a repository yourself? Personally I am happy with either. If it became part of three.js, I think I would mainly want to document the final structure of the probe data sufficiently that it could (at least theoretically) be produced by other tools in addition to Blender someday.

Ideally I would like this to be merged into three.js, but for now I don't have a robust solution. If you check the material extension part of the code, it gives some idea of how we could implement these into the engine (the current solution didn't require modifying three.js renderers, materials, or shaders).

> Do you know what portion of the demo causes the warnings […]?

> Currently we only support scene.environment (a single IBL for irradiance and reflections shared by an entire scene) in MeshStandardMaterial and MeshPhysicalMaterial. I suspect that if support for probe volumes or other global illumination were added to three.js core someday, it may be limited to only those two material types. In case this makes it easier for you to maintain in the meantime.
I actually tried to support any material that supports envMap. You can check my implementation; it was pretty straightforward for both the physical and basic materials. I only had issues with materials with "ifdef USE_ENV" in their root shader code (like the basic material). Thanks for your feedback; I'm definitely interested in a robust solution for handling the large amounts of textures/uniforms (for now, instanced rendering is not supported).
THREE.GLTFLoader does handle both, but (1) the punctual lights option must be enabled in the Blender glTF addon settings to export, and (2) getting visual appearance of lighting intensities to match (not just the units!) is not trivial.
Unfortunately not, only texture 2D arrays. In WebGPU this would be possible.
Ok – we'll need to look closer at the implementation to decide on this, I think. I'm not able to do that right now, but either someone else will, or I will get to it in the future!
@donmccurdy: I updated the data schema on the Blender plugin side, and the plugin's props explain how volumes are baked and how they are represented in the prototype implementation. I'm also working on supporting light baking in material textures:
A quick explanation of the probes.json data structure is here:
Quick update: I'm still working on this; I have been focused on the Blender side. I added lightmap baking support to my plugin and worked on the exported data format. Here is a simple integration with active objects using probes for GI, and static objects (walls, pillars, etc.) using lightmaps for GI.
A little update: I did some code optimisation and added features like:
I updated the demo scene on the repo; the OpenEXR files are pretty heavy. Repo: https://github.com/gillesboisson/threejs-probes-test I worked a lot on the Blender plugin; I will sell it on the Blender Market under a GPL license to finance part of its development. It will come quite soon and will support lightmap/probe authoring and baking. The three.js side of the project is made with the idea of having its own baking method. I don't know how it can become a proposal for three.js features, but it would be helpful to get some feedback on whether it can go in that direction.
Are you aware of HDR JPEG? Those ~70 MB of EXRs could become ~7 MB of JPEGs...
Thanks for your feedback. I've heard about JPEG HDR but haven't looked much into it. I have a general question about data, as my way of packing and loading glTF and textures is far from optimal. I'll work on a better packing solution for textures and glTF (glTF-Transform looks good for this). I'll support more texture format exports in my plugin; an SH texture pack for the irradiance grid and JPEG HDR should be among the supported formats. I'll also need to improve the loading part, as right now I'm not using GLTFLoader properly (I saw there is a way to customize the parser).
Any updates? :) |
Summary
A LightProbeVolume samples diffuse indirect lighting in a scene at each of several LightProbe locations, then provides approximate diffuse lighting for dynamic objects at any location within that space. The method complements baked lighting and lightmaps — which only support static objects — by providing high-quality lighting for dynamic objects as they move throughout a larger scene. Like lightmaps, LightProbeVolumes can be 'baked' offline and stored, then loaded and applied at runtime very efficiently.
Fixes #16228.
Thanks to @WestLangley, @bhouston, and @richardmonette for helpful discussions and code that contributed to this PR.
API
Bake:
Apply:
Lighting is sampled only at the center of each mesh, and larger objects like terrain and buildings will not receive realistic lighting from a single sample. Shadows are not cast by light probes.
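To make the per-object sampling concrete: the first step of an update is mapping the sampled world position (the mesh center) into the volume's grid. A hedged sketch — `min`, `max`, and `divisions` are assumed field names for illustration, not the PR's actual data layout:

```js
// Sketch: find the grid cell containing `position`, plus normalized local
// coordinates within that cell (used to interpolate between corner probes).
function locateCell( position, volume ) {

	const cell = [];
	const local = [];

	for ( let i = 0; i < 3; i ++ ) {

		const extent = volume.max[ i ] - volume.min[ i ];
		const t = ( position[ i ] - volume.min[ i ] ) / extent * volume.divisions[ i ];

		// Clamp so positions outside the volume sample the nearest boundary cell.
		const c = Math.min( volume.divisions[ i ] - 1, Math.max( 0, Math.floor( t ) ) );

		cell.push( c );
		local.push( Math.min( 1, Math.max( 0, t - c ) ) );

	}

	return { cell, local };

}
```

Because the lookup is pure index arithmetic, it is cheap enough to run per mesh per frame, and a real implementation can additionally start the search from the last known cell.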
Unresolved
- volume.update() should search incrementally from the last known cell to improve update times.
- In this demo, LightProbeVolume is in the scene graph but its probes are not (I don't want them all affecting the mesh). I'm open to other ways of structuring this. It's useful for the volume to have position in the scene.
- Should we allow arbitrarily-shaped volumes, or require grid-shaped volumes? Blender and Godot appear to be OK with grids. That would let us omit the Delaunay library (which generates 1.5x more tetrahedra than needed, for a grid?), improving volume creation time. And if (in the future) we want to upload the whole volume's SH coordinates to a GPU texture for sampling in the fragment shader, I think a grid would be necessary. But it is less flexible / optimizable. Grids only, for now.

Demo
https://raw.githack.com/donmccurdy/three.js/feat-lightprobevolume-build/examples/?q=volume#webgl_lightprobe_volume