RubixML / ML

A high-level machine learning and deep learning library for the PHP language.

Home Page: https://rubixml.com

Getting "undefined method Tensor\Matrix::maximum()" error using AdaMax optimiser

MarkLuds opened this issue

Hey, this isn't a pressing issue for me, as I'm happy to use the other optimisers, which are working fine. With some settings I occasionally get errors from what I guess is the Tensor extension. Below is the classifier I am using, followed by the error. I'm not sure if this is something I'm doing wrong, but if I use the same setup/data with the Adam, Cyclical, or AdaGrad optimisers it works fine.

Let me know if this issue needs any further info from me. Thanks, Mark

The logger output of the model:
[2022-02-09 16:41:16] INFO: Multilayer Perceptron (hidden_layers: [0: Dense (neurons: 20, alpha: 1.0E-6, bias: true, weight_initializer: He, bias_initializer: Constant (value: 0)), 1: Activation (activation_fn: ELU (alpha: 2.5)), 2: Batch Norm (decay: 0.3, beta_initializer: Constant (value: 0), gamma_initializer: Normal (std_dev: 1)), 3: Dense (neurons: 20, alpha: 1.0E-6, bias: true, weight_initializer: He, bias_initializer: Constant (value: 0)), 4: Activation (activation_fn: ELU (alpha: 2.5)), 5: Batch Norm (decay: 0.3, beta_initializer: Constant (value: 0), gamma_initializer: Normal (std_dev: 1)), 6: Dense (neurons: 20, alpha: 1.0E-6, bias: true, weight_initializer: He, bias_initializer: Constant (value: 0)), 7: PReLU (alpha_initializer: Normal (std_dev: 0.25))], batch_size: 64, optimizer: AdaMax (rate: 0.0001, momentum_decay: 0.02, norm_decay: 0.999), alpha: 1.0E-5, epochs: 20, min_change: 1.0E-8, window: 20, hold_out: 0.1, cost_fn: Cross Entropy, metric: MCC) initialized
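For reference, here is a rough PHP reconstruction of that configuration, based only on the hyper-parameters in the logger line above. The constructor argument order follows the current Rubix ML docs and may differ slightly in 0.4.x, so treat this as a sketch of the setup rather than the exact code that was run:

```php
<?php

use Rubix\ML\Classifiers\MultilayerPerceptron;
use Rubix\ML\NeuralNet\Layers\Dense;
use Rubix\ML\NeuralNet\Layers\Activation;
use Rubix\ML\NeuralNet\Layers\BatchNorm;
use Rubix\ML\NeuralNet\Layers\PReLU;
use Rubix\ML\NeuralNet\ActivationFunctions\ELU;
use Rubix\ML\NeuralNet\Initializers\He;
use Rubix\ML\NeuralNet\Initializers\Constant;
use Rubix\ML\NeuralNet\Initializers\Normal;
use Rubix\ML\NeuralNet\Optimizers\AdaMax;
use Rubix\ML\NeuralNet\CostFunctions\CrossEntropy;
use Rubix\ML\CrossValidation\Metrics\MCC;

// Three hidden blocks of 20 neurons, matching the logged hyper-parameters.
$estimator = new MultilayerPerceptron([
    new Dense(20, 1e-6, true, new He(), new Constant(0.0)),
    new Activation(new ELU(2.5)),
    new BatchNorm(0.3, new Constant(0.0), new Normal(1.0)),
    new Dense(20, 1e-6, true, new He(), new Constant(0.0)),
    new Activation(new ELU(2.5)),
    new BatchNorm(0.3, new Constant(0.0), new Normal(1.0)),
    new Dense(20, 1e-6, true, new He(), new Constant(0.0)),
    new PReLU(new Normal(0.25)),
], 64, new AdaMax(0.0001, 0.02, 0.999), 1e-5, 20, 1e-8, 20, 0.1, new CrossEntropy(), new MCC());
```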

The error:

Fatal error: Uncaught Error: Call to undefined method Tensor\Matrix::maximum() in /home/trendalix1/vendor/rubix/ml/src/NeuralNet/Optimizers/AdaMax.php:45
Stack trace:
#0 /home/trendalix1/vendor/rubix/ml/src/NeuralNet/Layers/Dense.php(255): Rubix\ML\NeuralNet\Optimizers\AdaMax->step(Object(Rubix\ML\NeuralNet\Parameter), Object(Tensor\Matrix))
#1 /home/trendalix1/vendor/rubix/ml/src/NeuralNet/FeedForward.php(212): Rubix\ML\NeuralNet\Layers\Dense->back(Object(Rubix\ML\Deferred), Object(Rubix\ML\NeuralNet\Optimizers\AdaMax))
#2 /home/trendalix1/vendor/rubix/ml/src/NeuralNet/FeedForward.php(181): Rubix\ML\NeuralNet\FeedForward->backpropagate(Array)
#3 /home/trendalix1/vendor/rubix/ml/src/Classifiers/MultilayerPerceptron.php(412): Rubix\ML\NeuralNet\FeedForward->roundtrip(Object(Rubix\ML\Datasets\Labeled))
#4 /home/trendalix1/vendor/rubix/ml/src/Classifiers/MultilayerPerceptron.php(366): Rubix\ML\Classifiers\MultilayerPerceptron->partial(Object(Rubix\ML\Datasets\Labeled))
#5 /home/trendalix1/vendor/rubix/ml/src/PersistentModel. in /home/trendalix1/vendor/rubix/ml/src/NeuralNet/Optimizers/AdaMax.php on line 45

What version of Rubix ML is this? What version of the Tensor extension are you using?

Hi Andrew.
Rubix ML version: 0.4.1
Tensor version: 3.0.0

EDIT: Okay, this seems to be resolved by installing the latest version.

I noticed there is a much newer rubix/ml version, which is strange, as I installed it only a couple of months ago using 'composer require rubix/ml', which installed 0.4.

Thanks, Mark

Hmmm, interesting. I'll try to figure out why Composer installed an older version, but yes, Rubix ML < 1.0 required Tensor 2.0, I believe. Rubix ML >= 1.0 will throw an exception if the wrong version is detected. If you still want to use ML 0.4.0, you can downgrade the Tensor extension to 2.0.
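(For anyone landing here later: one way to catch a mismatch like this up front is to check the loaded extension version before training, rather than letting it surface as an undefined-method fatal error inside an optimizer. The snippet below is only an illustrative sketch; the '3.0.0' floor is an assumption based on the pairing described above, not an official compatibility table.)

```php
<?php

// Illustrative pre-flight check (not part of Rubix ML): warn if the loaded
// Tensor extension doesn't match what this application assumes it needs.
$expected = '3.0.0'; // assumed floor for Rubix ML >= 1.0, per the comment above

$loaded = phpversion('tensor');

if ($loaded === false) {
    echo 'Tensor extension not loaded; the pure-PHP implementation will be used.' . PHP_EOL;
} elseif (version_compare($loaded, $expected, '<')) {
    echo "Tensor {$loaded} is loaded but {$expected}+ is expected; " .
        'upgrade the extension or pin rubix/ml to a compatible release.' . PHP_EOL;
}
```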