# fakecuda

A convenient package for the lazy torch programmer: leave all your :cuda() calls as-is when running on CPU.

### Usage:

    require('fakecuda').init(true) -- true enables fakecuda; false or no argument won't initialize it
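
For example, you might gate initialization on a command-line flag so the same script runs on CPU or GPU. The `-cpu` flag and the wrapper below are just an illustration, not part of fakecuda's API:

    require 'torch'

    -- hypothetical wrapper: pick CPU mode from a command-line flag
    local cmd = torch.CmdLine()
    cmd:option('-cpu', false, 'stub out CUDA calls and run on the CPU')
    local opt = cmd:parse(arg)

    require('fakecuda').init(opt.cpu) -- does nothing when opt.cpu is false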

### Installation:

    luarocks install https://raw.githubusercontent.com/soumith/fakecuda/master/fakecuda-scm-1.rockspec

### Example:

Let's say your code is sprinkled with CUDA calls like this:

    require 'torch'
    require 'nn'

    use_cpu = true
    require('fakecuda').init(use_cpu)

    a = torch.randn(10)
    ac = a:cuda()                          -- ignored: ac stays a CPU tensor

    b = nn.SpatialConvolution(3, 16, 5, 5)
    b:cuda()                               -- ignored: b stays on the CPU

    cutorch.synchronize()                  -- stubbed out to do nothing

These are all nicely ignored with fakecuda.
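
Under the hood, the trick is essentially to install no-op stand-ins for the CUDA-specific API. The sketch below only illustrates the idea; it is not fakecuda's actual implementation:

    require 'torch'
    require 'nn'

    -- illustrative sketch only, not the package's real code:
    -- provide a fake cutorch table with a do-nothing synchronize()
    cutorch = { synchronize = function() end }

    -- make nn.Module:cuda() return the module unchanged
    function nn.Module:cuda() return self end

    -- make tensor:cuda() hand back an ordinary CPU FloatTensor
    local mt = torch.getmetatable('torch.DoubleTensor')
    rawset(mt, 'cuda', function(self) return self:float() end)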

The only place you have to tread carefully is with calls like:

    a = torch.CudaTensor(3, 4)

Explicitly constructing a torch.CudaTensor is murkier territory, where the side effects are not easy to ignore.
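
If you need such code to run under fakecuda, one workaround (my suggestion, not something the package prescribes) is to build an ordinary tensor and convert it, so the conversion is just another call that gets ignored:

    -- instead of constructing a CudaTensor directly:
    -- a = torch.CudaTensor(3, 4)
    a = torch.Tensor(3, 4):cuda() -- the :cuda() call is ignored on CPU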
