Min reduction on unsigned integer produces incorrect result.
MoFtZ opened this issue
When using the `Reduce` algorithm with the `MinUInt32` or `MinUInt64` reduction, the output is not the expected value. The following code should output `Reduced[0] = 0`, but instead outputs `Reduced[0] = 4294967295`.
```csharp
using ILGPU;
using ILGPU.Algorithms;
using ILGPU.Algorithms.ScanReduceOperations;
using ILGPU.Algorithms.Sequencers;
using ILGPU.Runtime.OpenCL;
using System;
using System.Linq;

namespace AlgorithmsReduce
{
    class Program
    {
        static void Main()
        {
            using var context = new Context();
            context.EnableAlgorithms();
            using var accelerator = new CLAccelerator(context, CLAccelerator.CLAccelerators.First());

            // Fill the buffer with 0, 1, 2, ..., 63, then reduce to a single minimum.
            using var buffer = accelerator.Allocate<uint>(64);
            using var target = accelerator.Allocate<uint>(1);
            accelerator.Sequence(accelerator.DefaultStream, buffer.View, new UInt32Sequencer());
            accelerator.Reduce<uint, MinUInt32>(
                accelerator.DefaultStream,
                buffer.View,
                target.View);

            var data = target.GetAsArray(accelerator.DefaultStream);
            for (int i = 0, e = data.Length; i < e; ++i)
                Console.WriteLine($"Reduced[{i}] = {data[i]}");
        }
    }
}
```
NOTE: This only applies to the OpenCL accelerator, and affects ILGPU.Algorithms v0.9.2 and v0.10.0-beta1.