google / lifetime_value

Invalid input or output shapes

IvanUgrin opened this issue · comments

Hello,

I am using the zero_inflated_lognormal_loss function as described in the regression notebook. I haven't changed anything in the notebook itself; I only swapped in my own input data, which follows the same format. This is the DNN summary I am getting:

[screenshot of the model summary, 2024-06-06]
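For context, this is roughly how the model is built and compiled (simplified; the feature count and layer sizes below are placeholders rather than my real values, and I am assuming the loss is exposed as `ltv.zero_inflated_lognormal_loss`, as in the notebook):

```python
import tensorflow as tf
import lifetime_value as ltv  # assuming the loss is exposed at the package top level

NUM_FEATURES = 10  # placeholder; my real feature count differs

inputs = tf.keras.Input(shape=(NUM_FEATURES,))
hidden = tf.keras.layers.Dense(64, activation='relu')(inputs)
logits = tf.keras.layers.Dense(3)(hidden)  # 3 output units, as the ZILN loss expects
model = tf.keras.Model(inputs, logits)

model.compile(
    optimizer=tf.keras.optimizers.Adam(),
    loss=ltv.zero_inflated_lognormal_loss,
)
```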

However, when I try to fit the model, I receive the following exception from the loss function:

Epoch 1/400
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[15], line 1
----> 1 history = model.fit(
      2     x=x_train,
      3     y=y_train,
      4     batch_size=1024,
      5     epochs=EPOCHS,
      6     verbose=2,
      7     callbacks=callbacks,
      8     validation_data=(x_eval, y_eval)).history

File /opt/conda/lib/python3.11/site-packages/keras/src/utils/traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
    119     filtered_tb = _process_traceback_frames(e.__traceback__)
    120     # To get the full stack trace, call:
    121     # `keras.config.disable_traceback_filtering()`
--> 122     raise e.with_traceback(filtered_tb) from None
    123 finally:
    124     del filtered_tb

File /opt/conda/lib/python3.11/site-packages/lifetime_value/zero_inflated_lognormal.py:63, in zero_inflated_lognormal_loss(labels, logits)
     60 positive = tf.cast(labels > 0, tf.float32)
     62 logits = tf.convert_to_tensor(logits, dtype=tf.float32)
---> 63 logits.shape.assert_is_compatible_with(
     64     tf.TensorShape(labels.shape[:-1].as_list() + [3]))
     66 positive_logits = logits[..., :1]
     67 classification_loss = tf.keras.losses.binary_crossentropy(
     68     y_true=positive, y_pred=positive_logits, from_logits=True)

ValueError: Shapes (None, 3) and (3,) are incompatible

Any clue what I am doing wrong?
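Reading the assertion at line 63 of zero_inflated_lognormal.py, the expected shape is built as `labels.shape[:-1] + [3]`, so the `(3,)` in the message suggests the labels reach the loss as a 1-D tensor (shape `(batch,)` rather than `(batch, 1)`). The snippet below is only my attempt to reproduce that check in isolation; I have not confirmed this is exactly what Keras passes in:

```python
import tensorflow as tf

logits = tf.zeros([8, 3])   # what the Dense(3) head produces: (batch, 3)
labels = tf.zeros([8])      # labels with no trailing axis: (batch,)

# Mirrors the shape check inside zero_inflated_lognormal_loss:
expected = tf.TensorShape(labels.shape[:-1].as_list() + [3])
print(expected)             # (3,) -- the same "(3,)" as in the error message

# Raises: ValueError: Shapes (8, 3) and (3,) are incompatible
logits.shape.assert_is_compatible_with(expected)
```

(In the real traceback the batch dimension is symbolic, hence `(None, 3)` instead of `(8, 3)`.)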

EDIT: The same error occurs if you run the https://github.com/google/lifetime_value/tree/master/notebooks/kdd_cup_98/regression.ipynb notebook directly with the provided KDD dataset.

Hello, I have the same problem. Have you solved it?

Hello, unfortunately, no. If you have any idea, please let me know.
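For what it's worth, the only workaround I can think of so far is to wrap the loss so the labels are explicitly reshaped to `(batch, 1)` before the check runs. This is just an unverified sketch (the wrapper name is my own, and I have not checked whether the resulting training behaviour is correct):

```python
import tensorflow as tf
import lifetime_value as ltv  # assuming the same import as in the notebook

def ziln_loss_with_2d_labels(y_true, y_pred):
    # Force a trailing axis of size 1 so that inside the loss
    # labels.shape[:-1] + [3] matches the (batch, 3) logits.
    y_true = tf.reshape(tf.cast(y_true, tf.float32), [-1, 1])
    return ltv.zero_inflated_lognormal_loss(labels=y_true, logits=y_pred)

# Usage: compile with the wrapper instead of the raw loss, e.g.
# model.compile(optimizer=tf.keras.optimizers.Adam(), loss=ziln_loss_with_2d_labels)
```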