While using WebGL in Elm, I discovered that the GLSL parser doesn't seem to infer the types of arrays correctly.
For instance: uniform int array[32]; is inferred as having just the type Int.
I'm not sure about Haskell, but in Elm I expect this type to be inferred as Array Int.
Is this simply not supported, or is it a bug?
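For concreteness, here is a minimal sketch of the kind of shader I mean (the shader and its annotation are made up for illustration, assuming the usual [glsl| |] shader blocks from the WebGL package):

```elm
import WebGL exposing (Shader)


-- Hypothetical shader, only to show the inference question: the GLSL
-- declares an array of 32 ints, but the parser seems to type the
-- `array` uniform as plain Int, which is what the annotation reflects.
lookupShader : Shader {} { array : Int } {}
lookupShader =
    [glsl|
        uniform int array[32];

        void main () {
            gl_Position = vec4(float(array[0]), 0.0, 0.0, 1.0);
        }
    |]
```

What I would expect instead is a uniforms record along the lines of { array : Array Int }, or some other array-like type that reflects the declared size.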
My guess is that it's a bug. The parser has a couple of bugs in it. I've found that when parsing function parameters it will sometimes mis-parse function(int value) as function(in tvalue), or something like that.