(WIP) KTX2Loader: Add support for u8, f16, and f32 array and cube textures #26642
base: dev
Conversation
Yes, I can have a look 👍. How can I test this? Is there example code that you can share?
@Mugen87 thank you! If you open …
if ( path.startsWith( 'cube' ) ) {

	texture.mapping = THREE.CubeUVReflectionMapping;
This should be CubeReflectionMapping.
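For reference, a minimal sketch of the corrected assignment (assuming the same path check as in the snippet above):

if ( path.startsWith( 'cube' ) ) {

	// CubeReflectionMapping samples a cube map directly;
	// CubeUVReflectionMapping is meant for PMREM-style 2D atlases.
	texture.mapping = THREE.CubeReflectionMapping;

}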
I also think that you should assign instances of DataCubeTexture to envMap instead of map. Otherwise the renderer tries to use a 2D sampler (instead of samplerCube) with your cube textures, which results in the respective runtime error.
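For example (a sketch, not code from this PR; the material and texture variables are placeholders):

material.envMap = dataCubeTexture; // instance of the proposed DataCubeTexture; feeds a samplerCube uniform
material.map = null;               // map is sampled with a 2D sampler (sampler2D)
material.needsUpdate = true;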
But even then, the texture definition fails at this check:

three.js/src/renderers/webgl/WebGLTextures.js (line 1139 in 0e486b1):

if ( texture.image.length !== 6 ) return;
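In other words, the check expects texture.image to be an array with one entry per cube face. A minimal sketch of a layout that passes it (sizes and data are placeholders):

const size = 64;
const faces = [];

for ( let i = 0; i < 6; i ++ ) {

	// one DataTexture per face, ordered +X, -X, +Y, -Y, +Z, -Z
	faces.push( new THREE.DataTexture( new Uint8Array( size * size * 4 ), size, size ) );

}

texture.image = faces; // length === 6, so the early return above is not hit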
Maybe you can set up the instance of DataCubeTexture inside KTX2Loader, similar to: https://threejs.org/examples/webgl_materials_cubemap_mipmaps
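For what it's worth, that example builds a base cube texture and then assigns further cube textures as its mip levels. A hedged sketch of the same pattern with the proposed DataCubeTexture (its constructor signature here is assumed, and all face data and sizes are placeholders):

// level 0: six faces at full resolution (facesLevelN = array of six per-face data blocks)
const texture = new DataCubeTexture( facesLevel0, 256, 256 );

// each entry of .mipmaps is itself a cube texture, one per mip level
texture.mipmaps = [
	new DataCubeTexture( facesLevel1, 128, 128 ),
	new DataCubeTexture( facesLevel2, 64, 64 )
	// ...
];

texture.generateMipmaps = false;
texture.minFilter = THREE.LinearMipmapLinearFilter;
texture.needsUpdate = true;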
I also think that you should assign instances of DataCubeTexture to envMap instead of map...
Hm, I see! For testing purposes, I'm resolving this with a new TextureHelper class to eliminate any guesswork about which sampler is used.
Maybe you can setup the instance of DataCubeTexture inside KTX2Loader similar to: https://threejs.org/examples/webgl_materials_cubemap_mipmaps
So the idea here seems to be that each item in the mipmaps array is a DataCubeTexture itself? I've pushed code giving this a try, but the DataCubeTexture appears black, or, if a valid CompressedCubeTexture was previously used, the DataCubeTexture displays that instead. :/
Is DataCubeTexture really necessary? I think this type of texture is already supported. Check out the following code block in uploadCubeTexture():
three.js/src/renderers/webgl/WebGLTextures.js (lines 1275 to 1304 in 22d20b6):
if ( isDataTexture ) {

	if ( useTexStorage ) {

		state.texSubImage2D( _gl.TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, 0, 0, cubeImage[ i ].width, cubeImage[ i ].height, glFormat, glType, cubeImage[ i ].data );

	} else {

		state.texImage2D( _gl.TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, glInternalFormat, cubeImage[ i ].width, cubeImage[ i ].height, 0, glFormat, glType, cubeImage[ i ].data );

	}

	for ( let j = 0; j < mipmaps.length; j ++ ) {

		const mipmap = mipmaps[ j ];
		const mipmapImage = mipmap.image[ i ].image;

		if ( useTexStorage ) {

			state.texSubImage2D( _gl.TEXTURE_CUBE_MAP_POSITIVE_X + i, j + 1, 0, 0, mipmapImage.width, mipmapImage.height, glFormat, glType, mipmapImage.data );

		} else {

			state.texImage2D( _gl.TEXTURE_CUBE_MAP_POSITIVE_X + i, j + 1, glInternalFormat, mipmapImage.width, mipmapImage.height, 0, glFormat, glType, mipmapImage.data );

		}

	}

} else {
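For what it's worth, a hedged sketch of the "already supported" layout that should reach the isDataTexture branch above, using only existing classes (sizes, data, and filter setup are placeholders):

const size = 64;
const faces = [];

for ( let i = 0; i < 6; i ++ ) {

	// six DataTexture faces make texture.image[ 0 ].isDataTexture true
	faces.push( new THREE.DataTexture( new Float32Array( size * size * 4 ), size, size, THREE.RGBAFormat, THREE.FloatType ) );

}

const cubeTexture = new THREE.CubeTexture( faces );
cubeTexture.generateMipmaps = false;
cubeTexture.minFilter = THREE.LinearFilter;
cubeTexture.magFilter = THREE.LinearFilter;
cubeTexture.needsUpdate = true;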
BTW: What was the reason for adding CompressedCubeTexture? You could already set up a normal CompressedTexture with six images. I know there is a use case for CompressedCubeTexture, but I don't remember it anymore.
Mostly, the error about THREE.WebGLState: TypeError: Failed to execute 'texSubImage2D'... in the OP, in which I can't get WebGLRenderer to interpret the given data as a cube texture. I'm finding it difficult to construct valid textures in these layouts, and I think having distinct classes with documented parameters might help.
I imagine we don't technically need as many texture classes as we already have, and everything else could just be parameterization. In theory I'm fine with fewer classes, but we'd need documentation of those parameters. I haven't been able to figure it out from reading the renderer code so far, other than that it's theoretically supported. 😰
The difference between DataTexture and CompressedTexture is pretty fuzzy to me, especially since CompressedTexture can use internal formats that aren't technically compressed, but neither name seems to encompass the purpose of the other.
Um, if I had to choose, I would distinguish between a normal texture class and a cube texture class. We already do that with Texture and CubeTexture, and I think it is consistent to transfer this policy to DataTexture. It's just important to me that we deprecate the "old" approach at some point and force users towards DataCubeTexture. Same for CompressedCubeTexture.
Mostly, the error about THREE.WebGLState: TypeError: Failed to execute 'texSubImage2D'... in the OP, in which I can't get WebGLRenderer to interpret the given data as a cube texture.
Whether a texture is interpreted as a normal texture or a cube texture depends on the shader. WebGLUniforms decides, based on the uniform type, which setTexture*() function is called from WebGLTextures. The given texture must be compatible with the respective sampler.
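As an illustration (a sketch, not code from this PR): with a raw ShaderMaterial, it is the GLSL uniform type that selects the cube code path, so the assigned texture has to be a cube texture:

const material = new THREE.ShaderMaterial( {

	uniforms: {
		envMap: { value: cubeTexture } // must be a cube texture to match the sampler below
	},

	// the samplerCube declaration is what makes WebGLUniforms pick the cube variant of setTexture*()
	fragmentShader: `
		uniform samplerCube envMap;
		void main() {
			gl_FragColor = textureCube( envMap, vec3( 0.0, 0.0, 1.0 ) ); // fixed direction, just for the sketch
		}
	`

} );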
The difference between DataTexture and CompressedTexture is pretty fuzzy to me, especially since CompressedTexture can use internal formats that aren't technically compressed, but neither name seems like it encompasses the purpose of the other.
CompressedTexture was initially introduced to make it easier to implement a code path in WebGLTextures for calling the compressed* WebGL texture commands. There would of course be other ways to make that distinction (like checking the texture format), but I think using a separate class for that is fine.
It's true that CompressedTexture can be configured with plain RGBAFormat. However, there is at least one use case where this is in some sense "required": DDSLoader supports loading S3TC textures that hold uncompressed RGBA data. Since the loader is derived from CompressedTextureLoader, it has to return an instance of CompressedTexture. Hence, CompressedTexture does support RGBAFormat (the only exception next to the compressed formats). If we want to stop that, DDSLoader either needs to return a data texture, or we drop support for uncompressed S3TC data completely.
It's just important to me that we deprecate the "old" approach at some point and force users towards DataCubeTexture. Same for CompressedCubeTexture.
I see! Yes, that is fine with me. I don't fully understand the old approach, but that's OK.
Whether a texture is interpreted as a normal texture or cube texture depends on the shader.
I think one issue here, maybe, was scene.background. It could be a Texture or a CubeTexture, and we need to decide based on the texture, not just the shader. Perhaps NodeMaterial has similar cases?
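For example (a sketch; the texture variables are placeholders):

// both assignments are valid, so the background pass has to branch on the texture
// itself (e.g. texture.isCubeTexture / texture.mapping), not on a shader uniform
scene.background = cubeTexture;

equirectTexture.mapping = THREE.EquirectangularReflectionMapping;
scene.background = equirectTexture; // plain 2D Texture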
It sounds like we're OK with adding these classes then, but just need to figure out how to ensure they work correctly?
Status: …

I've also added a …
I'll pull TextureHelper into a separate PR for now, since this one is currently not ready. #27151
Work in progress to support u8, f16, and f32 array and cube textures.
My plan here is to use DataTexture classes when the data is uint8, float16, or float32, and to use CompressedTexture classes for the GPU-specific formats. This implies that a new DataCubeTexture class is necessary. But I'd also be OK with using CompressedTexture-based classes for all of this.
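To make that concrete, a hedged sketch of the intended selection logic (the helper itself and the DataCubeTexture / CompressedCubeTexture constructor signatures are illustrative, not existing API):

// hypothetical helper illustrating the class selection described above
function createTexture( { isCube, isCompressedFormat, data, mipmaps, faces, width, height, format, type } ) {

	if ( isCompressedFormat ) {

		// GPU block formats (ASTC, BC, ETC2, ...) stay on the CompressedTexture classes
		return isCube
			? new CompressedCubeTexture( faces, format, type )               // WIP class
			: new CompressedTexture( mipmaps, width, height, format, type );

	}

	// uint8 / float16 / float32 data uses the DataTexture classes
	return isCube
		? new DataCubeTexture( faces, width, height, format, type )         // WIP class
		: new DataTexture( data, width, height, format, type );

}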
@Mugen87 this isn't urgent, but would you be willing to have a look at why these textures aren't rendering? The data going into the texture looks reasonable to me, but there are errors.
For cube textures:
The renderer seems to treat the DataCubeTexture as a 2D texture, but I can't understand why.
For array textures: