cvdfoundation / google-landmark

Dataset with 5 million images depicting human-made and natural landmarks spanning 200 thousand classes.

Memory Requirements

vijayannepu opened this issue

Can someone tell me the minimum disk space required to store this entire dataset?

As described in the README:

  • Each train TAR is about 1GB, there are 500 of them --> 500GB
  • Each index TAR is about 0.85GB, there are 100 of them --> 85GB
  • Each test TAR is about 0.5GB, there are 20 of them --> 10GB

Downloading the entire dataset thus requires ~595GB.
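
For reference, here is a minimal Python sketch of the same arithmetic, using the approximate per-file sizes quoted above (the split names and sizes are taken from the README; actual TAR sizes vary slightly from file to file):

```python
# Back-of-the-envelope estimate of the total download size,
# based on the approximate per-TAR sizes from the README.
splits = {
    "train": {"count": 500, "size_gb": 1.0},
    "index": {"count": 100, "size_gb": 0.85},
    "test":  {"count": 20,  "size_gb": 0.5},
}

total_gb = 0.0
for name, split in splits.items():
    subtotal = split["count"] * split["size_gb"]
    total_gb += subtotal
    print(f"{name}: {split['count']} TARs x {split['size_gb']} GB = {subtotal:.0f} GB")

print(f"total: ~{total_gb:.0f} GB")
```

Running this prints 500 GB for train, 85 GB for index, 10 GB for test, and a ~595 GB total. Note this covers only the downloaded TARs; extracting the images alongside them will need additional free space.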