CUDA :norm()
jonathantompson opened this issue
me: dude
x = torch.rand(1,1,d,c):cuda()
print(x:norm(2,3))
print(x:float():norm(2,3))
wtf
Sent at 4:26 PM on Wednesday
soumith: dude, if you paste random lines, i don't know wtf you're talking about
me: d and c are arbitrary. Shouldn't the two printed results be identical?
Sent at 4:28 PM on Wednesday
soumith: ah
so you found a bug in cuda's norm
open a bug report love
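For context, the equivalence the chat is asserting can be sketched in NumPy terms (an analogy only, not cutorch itself; d=4 and c=5 are placeholder sizes standing in for the arbitrary d and c, and Torch's 1-indexed dim 3 corresponds to NumPy's axis=2):

```python
import numpy as np

# Placeholder sizes for the chat's arbitrary d and c (assumption for this sketch).
d, c = 4, 5
rng = np.random.default_rng(0)
x = rng.random((1, 1, d, c))

# Torch's x:norm(2, 3) is the L2 norm along 1-indexed dim 3
# (the size-d axis), i.e. NumPy's axis=2.
norm64 = np.linalg.norm(x, axis=2)
norm32 = np.linalg.norm(x.astype(np.float32), axis=2)

# Up to float32 rounding, the two results should agree; the bug
# reported here is that the CUDA path deviated far beyond rounding.
print(norm64.shape)                              # (1, 1, 5)
print(np.allclose(norm64, norm32, rtol=1e-5))    # True
```

The same check against `x:cuda()` vs `x:float()` in Torch is what exposed the discrepancy.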
Note: This was Ross sending a bug report from my account... He will open the bug report again from his account.
:-) Wait, is it still valid?
Yeah, it's still a problem.
I felt that the issue should be filed under his account instead of mine, since he's the one who found the bug :-) It should be done in the next 10-15 minutes.
The long story is that I logged into my account on his computer and I forgot to log out again. Then I suppose Ross didn't realize that his browser was logged into the wrong account.
Ok :-) I like the chat, it's very nice!