WebGLRenderer: Improve `RGBA_INTEGER` support.
tomhsiao1260 opened this issue · comments
Description
I'm facing the same issue that @gkjohnson mentioned 3 years ago. It seems the error occurs because three.js doesn't support clearing or rendering to integer render targets (as Garrett Johnson said). Is there any update on this issue?
https://arc.net/l/quote/gnimfrqg
Reproduction steps
Please check the fiddle; you can compare the difference between `RGBAFormat` and `RGBAIntegerFormat` in the console panel.
Code
See fiddle for complete example.
const data = new Uint8Array(2 * 3 * 4)
const texture = new THREE.DataTexture(data, 2, 3)
// texture.format = THREE.RGBAFormat // this works
texture.format = THREE.RGBAIntegerFormat // this does not work
texture.type = THREE.UnsignedByteType
texture.minFilter = THREE.NearestFilter
texture.magFilter = THREE.NearestFilter
texture.needsUpdate = true
const renderTarget = new THREE.WebGLRenderTarget(2, 3)
renderTarget.texture = texture
renderer.setRenderTarget(renderTarget)
Live example
https://jsfiddle.net/yaohsiao/np97zdve/21/
Version
r166
Device
Desktop
Browser
Chrome
OS
MacOS
There are a couple of issues in your code. Here is a fixed version: https://jsfiddle.net/5hyf3jzo/1/
- You have to define the internal format of the texture, in your case: `texture.internalFormat = 'RGBA8UI'`.
- Your fragment shader has to render the correct data type. In your case it is `uvec4`.
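A minimal sketch of the shader side of that fix, assuming a WebGL2 context and an `RGBA8UI` render target; the uniform and varying names (`map`, `vUv`, `outColor`) are illustrative and not taken from the original fiddle:

```javascript
// GLSL ES 3.00 fragment shader for an RGBA8UI render target: integer
// textures are sampled with usampler2D, and the fragment output must be
// declared as uvec4 rather than vec4.
const fragmentShader = /* glsl */ `
precision highp usampler2D;

uniform usampler2D map; // the RGBAIntegerFormat texture
in vec2 vUv;
out uvec4 outColor;     // uvec4 matches the RGBA8UI attachment

void main() {
  outColor = texture( map, vUv );
}
`;
```

With three.js this would be used via a `ShaderMaterial` or `RawShaderMaterial` with `material.glslVersion = THREE.GLSL3`, so the renderer emits the `#version 300 es` directive required for unsigned-integer samplers and outputs.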
Next time, please ask for help at the forum first.
It's possible that `glInternalFormat` could be a bit smarter here. It looks like integer types are handled more gracefully for `Red` and `RG` formats but not for `RGBA`.
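As a rough illustration of that suggestion, here is a hypothetical helper showing how a sized internal format could be inferred for `RGBAIntegerFormat` from the texture's type, the way the `Red`/`RG` integer cases are already handled. The function name, its lookup table, and the use of three.js type names as strings are all assumptions for this sketch, not the actual `glInternalFormat` implementation:

```javascript
// Hypothetical sketch: map a three.js texture type (by name) to the sized
// WebGL2 internal format an RGBAIntegerFormat texture would need. Returns
// null for types with no integer RGBA equivalent.
function inferRGBAIntegerInternalFormat( type ) {
  const table = {
    ByteType: 'RGBA8I',
    UnsignedByteType: 'RGBA8UI',
    ShortType: 'RGBA16I',
    UnsignedShortType: 'RGBA16UI',
    IntType: 'RGBA32I',
    UnsignedIntType: 'RGBA32UI',
  };
  return table[ type ] ?? null;
}

inferRGBAIntegerInternalFormat( 'UnsignedByteType' ) // 'RGBA8UI'
```

With something like this in place, the original example would not need to set `texture.internalFormat = 'RGBA8UI'` by hand.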
That sounds like a good addition. Would you make a PR with your suggestion?
Perhaps at some point but I don't think I can commit time to implementing and testing it right now. It may be worth keeping the issue open to keep track of it for the moment.