business-science / modeltime.gluonts

GluonTS Deep Learning with Modeltime

Home Page: https://business-science.github.io/modeltime.gluonts/

Example Code - No Result - what happened? Approach & Questions...

Steviey opened this issue

Windows 7 (64-bit), R 4.0.5, latest packages

I get no result/plot. What could be the cause?

```r
library(modeltime.gluonts)
library(tidymodels)
library(tidyverse)

model_fit_deepar <- deep_ar(
  id                    = "id",
  freq                  = "M",
  prediction_length     = 24,
  lookback_length       = 36,
  epochs                = 10,
  num_batches_per_epoch = 50,
  learn_rate            = 0.001,
  num_layers            = 2,
  dropout               = 0.10
) %>%
  set_engine("gluonts_deepar") %>%
  fit(value ~ ., training(m750_splits))

# Forecast with 95% Confidence Interval
modeltime_table(model_fit_deepar) %>%
  modeltime_calibrate(new_data = testing(m750_splits)) %>%
  modeltime_forecast(
    new_data      = testing(m750_splits),
    actual_data   = m750,
    conf_interval = 0.95
  ) %>%
  plot_modeltime_forecast(.interactive = FALSE)
```


```
WARNING:gluonts.dataset.loader:Multiprocessing is not supported on Windows, num_workers will be set to None.
100%|##########| 50/50 [00:04<00:00, 10.73it/s, epoch=1/10, avg_epoch_loss=8.82]
100%|##########| 50/50 [00:03<00:00, 14.48it/s, epoch=2/10, avg_epoch_loss=7.65]
100%|##########| 50/50 [00:03<00:00, 14.33it/s, epoch=3/10, avg_epoch_loss=7.51]
100%|##########| 50/50 [00:03<00:00, 14.71it/s, epoch=4/10, avg_epoch_loss=7.33]
100%|##########| 50/50 [00:03<00:00, 14.32it/s, epoch=5/10, avg_epoch_loss=7.21]
100%|##########| 50/50 [00:03<00:00, 14.12it/s, epoch=6/10, avg_epoch_loss=7.07]
100%|##########| 50/50 [00:03<00:00, 14.73it/s, epoch=7/10, avg_epoch_loss=7.01]
100%|##########| 50/50 [00:03<00:00, 14.12it/s, epoch=8/10, avg_epoch_loss=6.94]
100%|##########| 50/50 [00:03<00:00, 14.53it/s, epoch=9/10, avg_epoch_loss=6.91]
100%|##########| 50/50 [00:03<00:00, 14.44it/s, epoch=10/10, avg_epoch_loss=6.88]
learning rate from ``lr_scheduler`` has been overwritten by ``learning_rate`` in optimizer.
learning rate from ``lr_scheduler`` has been overwritten by ``learning_rate`` in optimizer.
```
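
One possible explanation (an assumption on my part, depending on how the script is executed): R only auto-prints the value of a top-level interactive expression, so when the pipe above runs inside `source()`, `Rscript`, a function, or a loop, the plot object returned by `plot_modeltime_forecast()` is silently discarded. A minimal sketch of forcing the plot to render by wrapping the final result in `print()` (the same idea as the explicit `print()` calls in the update below):

```r
library(modeltime.gluonts)
library(tidymodels)

# Same pipeline as above; print() forces rendering when the code is not
# run interactively at the top level (e.g. inside source() or a function).
modeltime_table(model_fit_deepar) %>%
  modeltime_calibrate(new_data = testing(m750_splits)) %>%
  modeltime_forecast(
    new_data      = testing(m750_splits),
    actual_data   = m750,
    conf_interval = 0.95
  ) %>%
  plot_modeltime_forecast(.interactive = FALSE) %>%
  print()
```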

UPDATE:

Splitting the pipe into separate steps and printing the results explicitly gives me the desired result/plot, although the warnings remain...

```r
library(modeltime.gluonts)
library(tidymodels)
library(tidyverse)

# View(m750_splits)

model_fit_deepar <- deep_ar(
  id                    = "id",
  freq                  = "M",
  prediction_length     = 24,
  lookback_length       = 36,
  epochs                = 10,
  num_batches_per_epoch = 50,
  learn_rate            = 0.001,
  num_layers            = 2,
  dropout               = 0.10
) %>%
  set_engine("gluonts_deepar") %>%
  parsnip::fit(value ~ ., training(m750_splits))

model_table <- modeltime_table(model_fit_deepar)

calibration_table <- model_table %>%
  modeltime_calibrate(testing(m750_splits))

info <- calibration_table %>%
  modeltime_accuracy() %>%
  table_modeltime_accuracy(.interactive = FALSE)

fc <- calibration_table %>%
  modeltime_forecast(
    new_data      = testing(m750_splits),
    actual_data   = m750,
    conf_interval = 0.95
  )

myPlot <- fc %>% plot_modeltime_forecast(.interactive = TRUE)

print(info)
print(myPlot)

old <- 0
if (old > 0) {
  # Forecast with 95% Confidence Interval
  modeltime_table(model_fit_deepar) %>%
    modeltime_calibrate(new_data = testing(m750_splits)) %>%
    modeltime_forecast(
      new_data      = testing(m750_splits),
      actual_data   = m750,
      conf_interval = 0.95
    ) %>%
    plot_modeltime_forecast(.interactive = FALSE)
}
```

----------------------------------
```
WARNING:gluonts.dataset.loader:Multiprocessing is not supported on Windows, num_workers will be set to None.
100%|##########| 50/50 [00:04<00:00, 10.73it/s, epoch=1/10, avg_epoch_loss=8.63]
100%|##########| 50/50 [00:03<00:00, 14.16it/s, epoch=2/10, avg_epoch_loss=7.64]
100%|##########| 50/50 [00:03<00:00, 14.47it/s, epoch=3/10, avg_epoch_loss=7.45]
100%|##########| 50/50 [00:03<00:00, 14.17it/s, epoch=4/10, avg_epoch_loss=7.39]
100%|##########| 50/50 [00:03<00:00, 13.96it/s, epoch=5/10, avg_epoch_loss=7.15]
100%|##########| 50/50 [00:03<00:00, 14.02it/s, epoch=6/10, avg_epoch_loss=7.01]
100%|##########| 50/50 [00:03<00:00, 14.37it/s, epoch=7/10, avg_epoch_loss=6.97]
100%|##########| 50/50 [00:03<00:00, 14.10it/s, epoch=8/10, avg_epoch_loss=6.9]
100%|##########| 50/50 [00:03<00:00, 14.56it/s, epoch=9/10, avg_epoch_loss=6.85]
100%|##########| 50/50 [00:03<00:00, 14.18it/s, epoch=10/10, avg_epoch_loss=6.8]
learning rate from ``lr_scheduler`` has been overwritten by ``learning_rate`` in optimizer.
```

[Screenshots: the accuracy table (`info`) and the forecast plot (`myPlot`) render as expected.]

Question I: Are there any hints regarding the warnings?
Question II: Is this construct already fully compatible with tune_bayes() from tidymodels? (A rough sketch of what I have in mind follows below.)
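
For reference on Question II, here is the rough, unverified sketch I have in mind. It assumes that deep_ar()'s `epochs`, `learn_rate`, and `dropout` arguments accept `tune()` placeholders and that matching dials parameters can be supplied manually via `param_info`; whether modeltime.gluonts already registers these as tunable is exactly what I am unsure about. The resampling window sizes are illustrative only.

```r
library(modeltime.gluonts)
library(tidymodels)

# Hypothetical tuning setup (unverified): mark a few deep_ar() arguments for tuning
deepar_spec <- deep_ar(
  id                = "id",
  freq              = "M",
  prediction_length = 24,
  epochs            = tune(),
  learn_rate        = tune(),
  dropout           = tune()
) %>%
  set_engine("gluonts_deepar")

deepar_wflw <- workflow() %>%
  add_model(deepar_spec) %>%
  add_formula(value ~ .)

# Time-series resamples built with rsample; window sizes are illustrative only
resamples <- rolling_origin(
  training(m750_splits),
  initial    = 200,
  assess     = 24,
  skip       = 24,
  cumulative = FALSE
)

# Manually supplied dials parameters, assuming their ids match the tune() argument names
params <- parameters(list(
  epochs     = epochs(range = c(5L, 20L)),
  learn_rate = learn_rate(),
  dropout    = dropout(range = c(0, 0.3))
))

set.seed(123)
tuned <- tune_bayes(
  deepar_wflw,
  resamples  = resamples,
  param_info = params,
  metrics    = metric_set(rmse),
  initial    = 5,
  iter       = 10,
  control    = control_bayes(verbose = TRUE)
)

show_best(tuned, metric = "rmse")
```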

Thank you Matt.