chkla / codecarbon-huggingface

Testing CodeCarbon 💨 with Huggingface 🤗


In this notebook, I played around with the new CodeCarbon 💨 package, integrated into Comet ☄️, using Huggingface 🤗 to measure the carbon footprint of a fine-tuned language model.

In 2019, the paper "Energy and Policy Considerations for Deep Learning in NLP" appeared, discussing the carbon footprint of machine learning models. Taking this as food for thought, the community started thinking about the long-term effects and consequences.

The CodeCarbon 💨 project is a software package that tracks the carbon footprint of your code. It is already integrated into Comet ☄️, a tool for analyzing and tracking your models (similar to wandb).

To exemplify the use of CodeCarbon 💨, I reused code from this HuggingFace notebook to define a simple fine-tuning task for a language model (feel free to try any other task).

Note: The current HuggingFace integration seems a bit buggy when logging experiments in the right format to obtain a carbon score.

Languages

Language: Jupyter Notebook 100.0%