gpujs / gpu.js

GPU Accelerated JavaScript

Home Page: https://gpu.rocks


"Unhandled declaration" in production

mikhailsdv opened this issue · comments

What is wrong?

I get the following error when trying to use gpu.js in production (optimized code):

Uncaught Error: Unhandled declaration
    at WebGL2FunctionNode.astVariableDeclaration (gpu-browser.js:12600:17)
    at WebGL2FunctionNode.astGeneric (gpu-browser.js:8023:23)
    at WebGL2FunctionNode.astBlockStatement (gpu-browser.js:12555:14)
    at WebGL2FunctionNode.astGeneric (gpu-browser.js:8007:23)
    at WebGL2FunctionNode.astForStatement (gpu-browser.js:12455:12)
    at WebGL2FunctionNode.astGeneric (gpu-browser.js:8017:23)
    at WebGL2FunctionNode.astFunction (gpu-browser.js:11962:12)
    at WebGL2FunctionNode.astFunctionExpression (gpu-browser.js:8095:17)
    at WebGL2FunctionNode.astGeneric (gpu-browser.js:7991:23)
    at WebGL2FunctionNode.toString (gpu-browser.js:7437:32)

Where does it happen?

On my static website built with Next.js, which loads gpu.js from a CDN. However, it works perfectly on a local server with unoptimized code in dev mode.

How do we replicate the issue?

  1. Create a Next.js project.
  2. Add the gpu.js script tag to _document.js: <Script src="https://unpkg.com/gpu.js@2.15.2/dist/gpu-browser.js" strategy="beforeInteractive" /> (see the sketch after this list).
  3. Build the project with next build (npm run build).
  4. Generate the static site with next export.
  5. Host the static site on a server and try to use gpu.js there.
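
For step 2, a minimal _document.js along these lines (just a sketch, assuming the Next.js pages router and the next/script component; the exact placement of a beforeInteractive script can vary between Next.js versions):

// pages/_document.js (minimal sketch, not the exact file from the project)
import { Html, Head, Main, NextScript } from "next/document"
import Script from "next/script"

export default function Document() {
    return (
        <Html>
            <Head />
            <body>
                <Main />
                <NextScript />
                {/* Load gpu.js from the CDN before hydration */}
                <Script
                    src="https://unpkg.com/gpu.js@2.15.2/dist/gpu-browser.js"
                    strategy="beforeInteractive"
                />
            </body>
        </Html>
    )
}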

I suppose Babel does something to my kernel code, but I'm not sure.

How important is this (1-5)?

4

Other Comments

I use gpu.js v2.15.2. Here is my kernel code; it seems like nothing special:

const createKernel = outputLength => new window.GPU({
    mode: "gpu",
}).createKernel(
    function (videoPixels, emojiPixels, emojiPixelsLength, brightness) {
        let lowestDistanceIn3d = Infinity
        let lowestDistanceEmojiPixelIndex = 0

        for (let j = 0, i = 0; j < emojiPixelsLength; j += 3) {
            const emojiPixel = [emojiPixels[j], emojiPixels[j + 1], emojiPixels[j + 2]]

            const videoPixelOffset = this.thread.x * 4
            const videoPixel = [
                videoPixels[videoPixelOffset],
                videoPixels[videoPixelOffset + 1],
                videoPixels[videoPixelOffset + 2],
            ]

            const distanceIn3d = Math.sqrt(
                Math.pow(emojiPixel[0] - videoPixel[0] * brightness, 2) +
                    Math.pow(emojiPixel[1] - videoPixel[1] * brightness, 2) +
                    Math.pow(emojiPixel[2] - videoPixel[2] * brightness, 2)
            )

            if (lowestDistanceIn3d > distanceIn3d) {
                lowestDistanceIn3d = distanceIn3d
                lowestDistanceEmojiPixelIndex = i
            }

            i++
        }
        return lowestDistanceEmojiPixelIndex
    },
    {
        output: [outputLength],
        loopMaxIterations: 9999999,
    }
)

I'm having the same problem. Did you find a solution for this?

I assumed that Babel breaks my kernel code so that it becomes unparsable for the gpu.js parser. First I tried extracting the kernel code into a separate file and ignoring it in the Babel config. It didn't help.

Then I tried putting the kernel code inside dangerouslySetInnerHTML to be 100% sure that Babel doesn't touch it. And it worked!
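
Roughly what that looks like (just a sketch; the component name, the window.createKernel global, and the placeholder kernel body are for illustration only):

// Inject the kernel source as a raw string in an inline <script>,
// so Babel/Terser never parse or transform it.
function RawKernelScript() {
    const kernelSource = `
        window.createKernel = function (outputLength) {
            return new window.GPU({ mode: "gpu" }).createKernel(
                function (videoPixels, emojiPixels, emojiPixelsLength, brightness) {
                    // real kernel body goes here (same code as posted above);
                    // trivial placeholder so the sketch stays valid:
                    return this.thread.x
                },
                { output: [outputLength], loopMaxIterations: 9999999 }
            )
        }
    `
    return <script dangerouslySetInnerHTML={{ __html: kernelSource }} />
}

The component just has to render before the app calls window.createKernel, and gpu-browser.js has to be loaded first.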

Yep, I just tried loading the kernel code as a static file from a script tag and it worked, thank you. Interestingly, it also worked when I then minified that file.
So I agree it must be some sort of optimisation that is breaking the kernel code.
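
For anyone trying the same thing, the setup looks roughly like this (the /kernel.js path, the window.createKernel global, and the sample values are assumptions for illustration):

// In the page: plain <script> tags, so nothing in the build pipeline rewrites the kernel:
//   <script src="https://unpkg.com/gpu.js@2.15.2/dist/gpu-browser.js"></script>
//   <script src="/kernel.js"></script>  (defines window.createKernel, same code as posted above)

// Then, from the bundled app code. Tiny sample values keep this sketch self-contained;
// the real code passes the current video frame and the emoji palette.
const width = 2, height = 1
const videoPixels = [10, 20, 30, 255, 200, 100, 50, 255] // RGBA per video pixel
const emojiPixels = [255, 0, 0, 0, 255, 0, 0, 0, 255]    // RGB per emoji pixel

const kernel = window.createKernel(width * height)
const closest = kernel(videoPixels, emojiPixels, emojiPixels.length, 1.0)
// closest[i] is the index of the emoji pixel nearest in colour to video pixel i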

Not the most ideal workaround; hopefully someone comes up with a more robust solution.