andersbll / deeppy

Deep learning in Python


Problem with Examples: Activation Units

filmo opened this issue · comments

Hi, I just installed this a few minutes ago and ended up using the example code on the preliminary site (rather than the included code). It looks like the old activation methods are still being used, which causes an error with the current version of deeppy.

For example:

dp.Activation('relu')

http://andersbll.github.io/deeppy-website/examples/index.html

You may want to update the code on the site in case anybody else runs across this.

(The example code on GitHub does work)
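For reference, a minimal sketch of the old call from the website and its current counterpart (the new name comes from the updated API; the surrounding setup is illustrative only):

    import deeppy as dp

    # Old call from the website examples; it fails under current deeppy
    # because dp.Activation is no longer available:
    #   relu = dp.Activation('relu')

    # Current equivalent:
    relu = dp.ReLU()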

Hey, thank you for the info! I have a larger update of the codebase pending and will make sure this is fixed at that point.

For those trying the examples, here are the updates needed that I've seen so far (with deeppy==0.1.dev0); a short sketch applying them follows the list:

  • dataset.arrays replaces dataset.data
  • dp.SupervisedFeed replaces dp.SupervisedInput
  • dp.Feed replaces dp.Input
  • dp.ReLU() replaces dp.Activation('relu')
  • dp.GradientDescent takes different parameters
  • it looks like the epoch count is now passed to dp.GradientDescent.train_epochs, since max_epochs isn't a parameter of dp.GradientDescent anymore.
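As a rough sketch, here is how the input setup from the website examples maps onto the new names. The renames come from the list above; the array shapes, dtypes, and keyword arguments below are illustrative assumptions, so check the bundled GitHub examples for the exact calls.

    import numpy as np
    import deeppy as dp

    # Dummy data standing in for what dataset.arrays() would return.
    x = np.random.randn(256, 784).astype(np.float32)
    y = np.random.randint(0, 10, size=256)

    # Old: train_input = dp.SupervisedInput(x, y, batch_size=128)
    train_feed = dp.SupervisedFeed(x, y, batch_size=128)

    # Old: test_input = dp.Input(x, batch_size=128)
    test_feed = dp.Feed(x, batch_size=128)

    # The trainer changed too: max_epochs is no longer a dp.GradientDescent
    # argument; the epoch count appears to be passed to train_epochs() instead.
    # (Constructor arguments omitted here; see the bundled GitHub examples.)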