How to determine the maximum grid size?
MrOlegus opened this issue
MrOlegus commented
What is wrong?
const gpu = new GPU();
const multiplyMatrix = gpu.createKernel(function(a, b) {
  let sum = 0;
  for (let i = 0; i < 512; i++) {
    sum += a[this.thread.y][i] * b[i][this.thread.x];
  }
  return sum;
}).setOutput([512, 512]);
const c = multiplyMatrix(a, b);
I am using your example for matrix multiplication. For 512x512 everything is OK, but at 1024x1024 it fails.
Where does it happen?
All the time.
How do we replicate the issue?
const gpu = new GPU();
const multiplyMatrix = gpu.createKernel(function(a, b) {
  let sum = 0;
  for (let i = 0; i < 1000000; i++) {
    sum += a[this.thread.y][i] * b[i][this.thread.x];
  }
  return sum;
}).setOutput([1000000, 1000000]);
const c = multiplyMatrix(a, b);
How important is this (1-5)?
3
Expected behavior (i.e. solution)
Throwing an out-of-memory error instead of trying to execute.
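In the meantime, a pre-flight check in user code can give you that behavior. A minimal sketch, assuming the output is backed by a single WebGL texture; the assertOutputFits helper is hypothetical, not part of the GPU.js API:

// Hypothetical guard: throw before creating the kernel if the requested
// output cannot fit in a single WebGL texture on this machine.
function assertOutputFits(width, height) {
  const gl = document.createElement('canvas').getContext('webgl');
  if (!gl) throw new Error('WebGL is not available');
  const max = gl.getParameter(gl.MAX_TEXTURE_SIZE);
  if (width > max || height > max) {
    throw new Error('Output ' + width + 'x' + height +
      ' exceeds MAX_TEXTURE_SIZE (' + max + ')');
  }
}

assertOutputFits(1000000, 1000000); // throws instead of trying to execute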
Other Comments
Jacob Bogers commented
You can query the limits from WebGL, for example:

var maxTextures = gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS);

Of course a super-duper card is going to have a higher limit than (let's say) your feature phone.
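For the size of the output grid specifically, the texture-unit count matters less than the size limits; a sketch of the WebGL parameters worth querying (assumes a WebGL 1 context can be created):

// Query the WebGL limits that bound how large a kernel's output can be.
var gl = document.createElement('canvas').getContext('webgl');
console.log('MAX_TEXTURE_SIZE:', gl.getParameter(gl.MAX_TEXTURE_SIZE));
console.log('MAX_VIEWPORT_DIMS:', gl.getParameter(gl.MAX_VIEWPORT_DIMS));
console.log('MAX_RENDERBUFFER_SIZE:', gl.getParameter(gl.MAX_RENDERBUFFER_SIZE));
console.log('MAX_TEXTURE_IMAGE_UNITS:', gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS));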