Add mipmapping to renderer for downscaling in hyprexpo
KZDKM opened this issue
Description
For both hyprexpo and my plugin Hyprspace, the textures of window and layer surfaces are squished down to a smaller size in overviews and rendered as-is with the `GL_LINEAR` filter. This makes windows and layers in the overview look very pixelated, with sharp, shimmering edges.
Would it be possible to implement an additional field in the `m_RenderData` struct (similar to `useNearestNeighbor`) that tells `renderTexture` calls to generate a mipmap, or to use some other method to downscale the surface textures?
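For context on what such a flag would toggle: `glGenerateMipmap` builds a chain of progressively half-resolution, pre-averaged copies of the texture, and a mipmap minification filter then samples from the level closest to the on-screen size. A minimal CPU illustration of that chain (illustrative only, not Hyprland code; single-channel image, 2x2 box filter per level):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Number of levels in a full mip chain, as glGenerateMipmap would build:
// floor(log2(max(w, h))) + 1.
int mipLevelCount(int w, int h) {
    return static_cast<int>(std::floor(std::log2(std::max(w, h)))) + 1;
}

// One mip step: 2x2 box filter on a single-channel image, the CPU
// analogue of producing the next-smaller mip level.
std::vector<uint8_t> downsample2x(const std::vector<uint8_t>& src, int w, int h) {
    int dw = std::max(1, w / 2), dh = std::max(1, h / 2);
    std::vector<uint8_t> dst(dw * dh);
    for (int y = 0; y < dh; ++y)
        for (int x = 0; x < dw; ++x) {
            int sum = src[(2 * y) * w + 2 * x] + src[(2 * y) * w + 2 * x + 1]
                    + src[(2 * y + 1) * w + 2 * x] + src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * dw + x] = static_cast<uint8_t>(sum / 4);
        }
    return dst;
}
```

Because each level is pre-averaged, sampling a deep level touches every source texel the output pixel covers, which is exactly what a single `GL_LINEAR` tap of the full-size texture fails to do when the overview shrinks a window by a large factor.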
I've tried (there is `scale`), but it doesn't scale the shaders, so it looks wrong, and many calculations tend to break.
TL;DR: I tried, dunno how. Feel free to make an MR.
Changing `GL_LINEAR` to `GL_NEAREST_MIPMAP_LINEAR` and adding `glGenerateMipmap(tex.m_iTarget)` to `renderTextureInternalWithDamage` results in a completely garbled texture.
Interestingly, it only corrupts window textures and works completely fine with layer surfaces (they look fine and properly downscaled in the overview). I guess this has something to do with `calculateUVForSurface`? I'm not familiar with OpenGL.
Most windows should have an unset (i.e. default) UV after the calculations, though; it's only modified when a window is clipped or stretched.
Might be related: https://gitlab.freedesktop.org/wlroots/wlroots/-/issues/3814 and https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1461. It seems like generating mipmaps with OpenGL would require upstream support; I'm pretty sure the texture corruption is not caused by the UV or the shader. Using a shader for bilinear/trilinear filtering might still be possible, though.
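The shader route mentioned above amounts to averaging all the texels an output pixel covers instead of taking a single `GL_LINEAR` tap. A CPU stand-in for such a fragment-shader box filter (shape and names are illustrative, not Hyprland or GLSL code; `scale` < 1 is the downscale factor, so the footprint radius grows as the window shrinks):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// For an output pixel mapped to source position (cx, cy), average the
// roughly (1/scale) x (1/scale) block of texels the pixel covers.
// Single-channel float image; edges are clamped, as a shader would
// clamp its UVs.
float boxSample(const std::vector<float>& img, int w, int h,
                float cx, float cy, float scale) {
    int radius = std::max(1, static_cast<int>(std::ceil(0.5f / scale)));
    float sum = 0.0f;
    int count = 0;
    for (int dy = -radius; dy <= radius; ++dy)
        for (int dx = -radius; dx <= radius; ++dx) {
            int x = std::clamp(static_cast<int>(cx) + dx, 0, w - 1);
            int y = std::clamp(static_cast<int>(cy) + dy, 0, h - 1);
            sum += img[y * w + x];
            ++count;
        }
    return sum / static_cast<float>(count);
}
```

In a real fragment shader this would be a small loop of `texture()` taps; it costs more per fragment than a mipmap lookup but needs no mip chain, which sidesteps the external-texture limitation entirely.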