elm-community / webgl

Moved to elm-explorations/webgl

Home Page: https://package.elm-lang.org/packages/elm-explorations/webgl/latest


Arrays in shaders

SjorsVanGelderen opened this issue

The type of an array in a shader is incorrectly inferred as just the primitive type.
Example:

uniform int palette[64];

yields a shader whose uniforms record collapses the array down to its element type. That is, the definition shown above is inferred as:

    Shader
        {}
        { palette : Int }
        {}

where palette should probably have been inferred as Array Int.
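For comparison, raw WebGL (outside Elm) does distinguish the two cases: an array uniform like the one above must be uploaded with the vector form of the setter. A minimal sketch, assuming a valid gl context and uniform location (uploadPalette is a hypothetical helper name, not part of this package):

```javascript
// Hypothetical helper: upload a 64-entry integer palette to the
// GLSL uniform `int palette[64]`. WebGL requires the vector setter
// uniform1iv for array uniforms; the scalar uniform1i would only
// set palette[0].
function uploadPalette(gl, location, palette) {
  // WebGL expects a typed array, not a plain JS array.
  gl.uniform1iv(location, new Int32Array(palette));
}
```

This is exactly the distinction the current type inference erases, since an Int uniform only ever reaches the scalar setter.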

Combined with the inability to generate textures at runtime, this is a real obstruction when, for example, building a graphics editing application.

I think the createUniformSetter function should check the size of the incoming uniform and choose the appropriate setter accordingly. The biggest hurdle is probably that the GLSL parser currently in use doesn't infer array types correctly.
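The size check suggested above could look roughly like this. This is a sketch, not the package's actual createUniformSetter: it assumes a WebGLActiveInfo-style record as returned by gl.getActiveUniform (which, per the WebGL spec, reports size 64 and the name "palette[0]" for the declaration above), and the setter table only covers scalar int/float for brevity:

```javascript
// GLSL type enums as reported by gl.getActiveUniform (values from the
// WebGL spec): 0x1404 = INT, 0x1406 = FLOAT.
const GL_INT = 0x1404;
const GL_FLOAT = 0x1406;

// Sketch: pick the gl.uniform* setter name from a WebGLActiveInfo-like
// record { name, type, size }. For an array uniform the reported size
// is the declared length (> 1) and the name ends with "[0]", so we
// switch to the vector ("v"-suffixed) setter.
function setterFor(info) {
  const isArray = info.size > 1 || info.name.endsWith("[0]");
  switch (info.type) {
    case GL_INT:
      return isArray ? "uniform1iv" : "uniform1i";
    case GL_FLOAT:
      return isArray ? "uniform1fv" : "uniform1f";
    default:
      throw new Error("unhandled uniform type: " + info.type);
  }
}
```

The name check covers the edge case of a length-1 array, which reports size 1 but still gets the "[0]" suffix in its active-uniform name.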

@SjorsVanGelderen Hi, this is a known limitation; see #36.

What kind of graphics editing application do you have in mind? It would be better first to know more about the use case.

The current plan is to rewrite the GLSL parser and integrate it with the Elm compiler; only after that can we think about supporting more types in shaders.

Closing because this is tracked in #36.