James Ingram's repositories
AlphaFold-x
Exploring AlphaFold with Transformers
Language: Jupyter Notebook
DALLE-pytorch
Fork of lucidrains' DALL-E implementation
Transformers
Transformers playground
Language: Jupyter Notebook
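The playground above centers on Transformer attention; a minimal NumPy sketch of scaled dot-product attention is shown below (function and variable names are illustrative, not taken from the repository):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # numerically stable softmax over keys
    return weights @ V                               # each output is a weighted sum of values

# Toy example: 3 queries, keys, and values of dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

With a single key/value pair the softmax weight is 1, so the output reduces to the value vector itself, which is a quick sanity check on the implementation.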
DeBERTa
Exploring NLP models
Language: Python
market-charts
Crypto and US index models, charts, etc.
GPTx-output-dataset
GPTx output dataset
Language: Python · License: MIT
GPTx
Exploring GPT-2 and other attention-based NLP Transformers, with a focus on a specialized domain
Language: Python · License: NOASSERTION
BERT
Exploring Bidirectional Encoder Representations from Transformers (BERT)
Language: Python · License: Apache-2.0
Model-Evaluation-Test-Harness
Model Evaluation Test Harness, for developing deep learning models for univariate time series forecasting. Source: an article by Jason Brownlee, Ph.D. ("Making Developers Awesome at Machine Learning"), accessed Oct 31, 2018.
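A test harness of this kind typically evaluates forecasts step by step over a held-out window (walk-forward validation). A minimal sketch under that assumption follows; the function names, the persistence baseline, and the RMSE metric are illustrative choices, not the repository's actual API:

```python
# Hypothetical sketch of a walk-forward evaluation harness for
# univariate time series forecasting (all names are illustrative).
from math import sqrt

def train_test_split(series, n_test):
    """Hold out the last n_test observations for evaluation."""
    return series[:-n_test], series[-n_test:]

def persistence_forecast(history):
    """Naive baseline: predict the last observed value."""
    return history[-1]

def walk_forward_validation(series, n_test, forecast=persistence_forecast):
    """Forecast one step at a time over the test window; return RMSE."""
    train, test = train_test_split(series, n_test)
    history = list(train)
    errors = []
    for actual in test:
        yhat = forecast(history)
        errors.append((actual - yhat) ** 2)
        history.append(actual)  # reveal the true value before the next step
    return sqrt(sum(errors) / len(errors))

# Toy series: a simple upward trend; persistence lags by 1 at each step
rmse = walk_forward_validation(list(range(10)), n_test=3)
print(rmse)  # 1.0
```

The key design point is that the model only ever sees observations up to the current step, mirroring how a forecaster would be used in production; a real harness would swap the baseline for a fitted deep learning model.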