google-deepmind / deepmind-research

This repository contains implementations and illustrative code to accompany DeepMind publications.

Enformer: can I increase the input sequence size in training data?

exnx opened this issue

Hi, I have some questions about the Enformer model.

I was wondering what steps are involved in increasing the context length, both for the input sequences in the training data and for the model's input.

For example, suppose I'd like to decrease the bin size and increase the length of the sequences I process.
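
For concreteness, here's roughly the trade-off I have in mind, sketched in Python. The pooling-stage count and the default lengths below are my own assumptions about how the 128 bp output bins arise (repeated 2x pooling), not something I've verified against the Enformer code:

```python
# Sketch of the resolution arithmetic I have in mind (my assumptions, not
# necessarily how Enformer is configured internally).

def output_bins(seq_length_bp: int, num_pool_stages: int) -> tuple[int, int]:
    """Return (bin size in bp, number of output bins) for a given input length,
    assuming each pooling stage halves the resolution."""
    bin_size = 2 ** num_pool_stages  # e.g. 7 stages of 2x pooling -> 128 bp bins
    assert seq_length_bp % bin_size == 0, "input length must be a multiple of the bin size"
    return bin_size, seq_length_bp // bin_size

# Default-ish setting: 196,608 bp input at 128 bp resolution -> 1,536 bins (before any cropping).
print(output_bins(196_608, num_pool_stages=7))

# What I'd like to try: a longer input at a finer resolution, e.g. 64 bp bins.
print(output_bins(393_216, num_pool_stages=6))
```

So my question is really about which parts need to change together: the training data (longer sequences, re-binned targets), the number of pooling stages, and anything else in the model that assumes the default input/output lengths.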

Has anyone done that before? Thanks!