pytorch / xla

Enabling PyTorch on XLA Devices (e.g. Google TPU)

Home Page: https://pytorch.org/xla

Define Custom Operator based on a C++ program

rwbfd opened this issue · comments

commented

🚀 Feature

Define a custom operator based on a C++ program.

Motivation

In PyTorch it is possible to define a custom operator backed by a C++ or CUDA program. However, I am not entirely sure whether the same can be done in PyTorch/XLA.

If so, could you provide a code example for it? Or is it the same as defining a custom operator in plain PyTorch?
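
For reference, this is roughly what defining a custom C++ operator looks like in plain PyTorch, using the TORCH_LIBRARY registration API; the myops namespace and my_clamp function below are hypothetical examples, not existing ops.

#include <ATen/ATen.h>
#include <torch/library.h>

// Plain C++ kernel for a hypothetical custom op.
at::Tensor my_clamp(const at::Tensor& self, double min, double max) {
  return at::clamp(self, min, max);
}

// Declare the op schema under a custom namespace...
TORCH_LIBRARY(myops, m) {
  m.def("my_clamp(Tensor self, float min, float max) -> Tensor");
}

// ...then register a kernel for a dispatch key (CPU here); other
// backends register kernels the same way.
TORCH_LIBRARY_IMPL(myops, CPU, m) {
  m.impl("my_clamp", my_clamp);
}

Once built as a torch C++ extension, such an op is callable from Python as torch.ops.myops.my_clamp. My question is whether the XLA backend needs anything beyond this.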

Can you be a bit more specific about what you are trying to do? There is a way to define a C++ op like:

at::Tensor XLANativeFunctions::clamp(const at::Tensor& self,
                                     const c10::optional<at::Scalar>& min,
                                     const c10::optional<at::Scalar>& max) {
  TORCH_LAZY_FN_COUNTER("xla::");
  return bridge::AtenFromXlaTensor(
      tensor_methods::clamp(bridge::GetXlaTensor(self), min, max));
}

but you will either need a Python binding to expose this op, or you will need to call it directly from C++ somewhere.
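
As a rough sketch of the Python-binding route, a standard torch C++ extension can expose such an op with pybind11. The my_xla_clamp wrapper below is a hypothetical example that simply forwards to at::clamp (which dispatches to the XLA lowering when given XLA tensors); it is not the code path torch_xla itself uses for its bindings.

#include <torch/extension.h>

// Hypothetical wrapper: a torch_xla-internal implementation would call
// into the bridge / tensor_methods code shown above; forwarding to
// at::clamp keeps this sketch self-contained while still dispatching to
// the XLA backend for XLA tensors.
at::Tensor my_xla_clamp(const at::Tensor& self, double min, double max) {
  return at::clamp(self, min, max);
}

// Expose the wrapper to Python via pybind11.
PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
  m.def("my_xla_clamp", &my_xla_clamp,
        "Clamp wrapper exposed to Python (hypothetical example)");
}

Built with torch.utils.cpp_extension.load, the resulting module can then be imported and called from Python on XLA tensors.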