mholt / timeliner

All your digital life on a single timeline, stored locally -- DEPRECATED, SEE TIMELINIZE (link below)

Home Page: https://timelinize.com


Google Photos (Location)

garrettdowd opened this issue · comments

Hi all,
Just came across this project. The wiki mentions that location data is stripped when using the Google Photos API?
I'm currently using rclone to pull and back up my Google Photos account to my personal server through Google Drive. All of the media pulled with rclone retains its GPS data (lat/lon/alt).

Two thoughts.

  1. I would rather use my own script (or some other program) to sync my Google Photos account. Would it be easy for timeliner to import from a local folder?
  2. I am not familiar with the Google Photos API, but location data is pretty important (especially if you don't have Location History). Alternative methods of importing photos that preserve location data seem highly important.

I think that would be better suited for a Google Drive data source (but it would have to work somewhat generically, not just photos).

Importing from a local file (or folder) is also a possibility, as other data sources support this (and some require it).

However, with both of these options, the resulting items will be much less rich and structured. You may lose album information entirely, as well as item/people relationships and other metadata available through the API or official export mechanisms.

Google Photos will stop syncing to Google Drive: https://gsuiteupdates.googleblog.com/2019/06/google-photos-drive-sync-stopping.html

Starting on July 10, 2019, Google Photos will no longer sync to Google Drive. From that date forward, if you add or delete files in Photos, they won’t be automatically added or deleted in Drive.

Possible workaround:

I have a test script locally (not committed, since it's just spike code) which attempts to scrape location data using a headless browser. However, I haven't been able to get it to work quite yet. Would anyone be interested in this approach? It's super hacky and slow, but an alternative nonetheless.

Another option might be Google Takeout, to download (all?) your Google Photos in one lump sum -- I assume they'd keep the location metadata but I haven't tested this.

I can still see the location data for the files in Google Takeout.

@nelsonjchen Good, then there is some hope at least!

If we went this route, we'd need a way to augment our database with locations for any existing media items. Hmm, this might have to be treated as a special case; I don't know how it would generalize to other data sources and metadata fields.

Also this still is not awesome because it involves basically downloading the whole archive again. 😕

FYI there's an open issue about this on Google's issue tracker.
https://issuetracker.google.com/issues/80379228

Just in case anyone else feels like yelling into the abyss of hopelessness. 😉

It's so weird. I'm getting very inconsistent results from the Google Photos UI. Some pictures I download with that UI do not have the location stripped out. I'm not sure what the deciding factor is for whether location data is included.

@nelsonjchen

It's so weird. I'm getting very inconsistent results from the Google Photos UI. Some pictures I download with that UI do not have the location stripped out. I'm not sure what the deciding factor is for whether location data is included.

I've not experienced inconsistencies here. For photos that have location data, manually downloading them from the web app (menu button, then "Download") always preserves the location data. Only when downloaded via the API is the location data stripped out, I'm pretty sure.

One thing to note with the Google Photos API: it looks like you can only pull the compressed version of your photos/videos. I store all my content in original quality, and a complete fetch of content using Timeliner is half the size (in total bytes) of an export using Takeout. With the Photos to Drive sync, direct download from the Photos web interface, and with Takeout, a sha256 sum of the media matches the sum of the source content pulled directly off my phone. With the change in Google policy, it looks like Takeout may be the only way to bulk export source content in original quality.
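The checksum comparison described above is easy to reproduce. A small Go sketch of the idea (illustrative only; in practice you'd hash the actual exported and original files, e.g. with sha256sum or by streaming each file into the hasher):

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// hashBytes returns the hex-encoded SHA-256 digest of data — the same
// kind of checksum used to verify that an exported photo is
// byte-identical to the original pulled off the phone.
func hashBytes(data []byte) string {
	sum := sha256.Sum256(data)
	return hex.EncodeToString(sum[:])
}

func main() {
	original := []byte("photo bytes from phone")   // stand-in for the source file
	exported := []byte("photo bytes from phone")   // stand-in for the Takeout copy
	fmt.Println(hashBytes(original) == hashBytes(exported)) // true: byte-identical export
}
```

A matching digest proves the export is the untouched original; API downloads, being recompressed, would hash differently.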

@SpaethCo Good to note, thanks. Timeliner does support importing data from an export file, so if someone wants to contribute a Google Photos importer (just needs to honor the Options.Filename field) that works with Google Takeout, I'd happily review a PR. Apparently, that would allow you to preserve full/original quality AND location metadata.

Maybe ping this guy (nicely): https://twitter.com/dflieb

Product Lead, Google Photos.

There seem to be a couple of projects attempting to upload to Google Photos using the unofficial web API. Maybe they could be extended to support download/backup as well?

https://github.com/3846masa/upload-gphotos
https://github.com/simonedegiacomi/gphotosuploader
https://github.com/canhlinh/gphoto

I have added early, experimental support for Google Photos via Takeout archives: c2d1332

I have confirmed, insofar as I can tell, that it does include the location metadata as well as original file uploads with full EXIF data.

Use like so:

$ timeliner import takeout-whatever-001.zip google_photos/...

HOWEVER please be aware of caveats:

  • This opens a firehose. Be prepared for your CPU to be busy for a while as it works through the entire archive file.
  • Google can change the Takeout archive format at any time, breaking this implementation. Please help maintain this feature if you use it!
  • The importer does not currently support all variants of a file, for example those with names suffixed with -edited or (1) before the extension. It basically works by finding metadata JSON files, then looking for the version of the filename without the .json extension.
  • Do not import from both Takeout and the API; you can potentially duplicate your entire library. Google Takeout does not provide item IDs, so we have to make up our own, and thus it is practically impossible to prevent duplicates from the API. Update! (See comment below.) You can now combine the use of the API and Takeout, but you have to enable soft merging to avoid duplicating your entire library. The Google Photos data source wiki page is updated with further information as well.
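To illustrate the third caveat: the sidecar-matching strategy amounts to stripping the .json suffix from each metadata file's name to find its media file. A rough Go sketch of that idea (not Timeliner's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// mediaNameForSidecar derives the media filename that a Takeout
// metadata JSON file describes, by dropping the ".json" suffix.
// Variants such as "IMG_0001-edited.jpg" or "IMG_0001(1).jpg" have
// no sidecar of their own, so this strategy cannot find them.
func mediaNameForSidecar(jsonName string) (string, bool) {
	if !strings.HasSuffix(jsonName, ".json") {
		return "", false // not a metadata file
	}
	return strings.TrimSuffix(jsonName, ".json"), true
}

func main() {
	name, ok := mediaNameForSidecar("IMG_0001.jpg.json")
	fmt.Println(name, ok) // IMG_0001.jpg true
}
```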

Further contributions and enhancements are welcome.

I've improved the situation quite a bit!

You can now use both the Google Photos API and Google Takeout to fill your timeline with your Google Photos library.

This is possible with a feature called "soft merging". The readme has been updated to describe how this works.

Basically, when you plan to use both (or have used one method and are now using the other), always specify at least -merge=soft. Then also specify any other fields you want to prefer from the current data source. For example, when using the API, I specify -merge=soft,id,file,meta, which keeps the ID, data file, and metadata from the Google Photos API; when I use Takeout, I specify just -merge=soft, which lets it fill in the missing location data.

(Location data is different from "metadata" in the general Timeliner sense: Timeliner promotes location to first-class item data. There is a separate "metadata" field that a data source may optionally fill in, which the Google Photos API does, but Takeout does not.)
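As an illustration of the idea (a sketch, not Timeliner's actual implementation), soft merging can be modeled as a per-field preference: fields listed after -merge=soft are taken from the current data source, and every other field only fills in blanks in the existing item:

```go
package main

import "fmt"

// item is a simplified stand-in for a timeline item; the real
// structure in Timeliner differs.
type item struct {
	ID       string
	DataFile string
	Metadata string
	Location string
}

// softMerge combines an existing item with an incoming one.
// Fields named in keep are preferred from the incoming (current)
// data source; otherwise incoming values only fill in blanks.
func softMerge(existing, incoming item, keep map[string]bool) item {
	pick := func(field, ex, in string) string {
		if keep[field] && in != "" {
			return in // prefer the current data source for this field
		}
		if ex == "" {
			return in // fill in missing data only
		}
		return ex
	}
	return item{
		ID:       pick("id", existing.ID, incoming.ID),
		DataFile: pick("file", existing.DataFile, incoming.DataFile),
		Metadata: pick("meta", existing.Metadata, incoming.Metadata),
		Location: pick("location", existing.Location, incoming.Location),
	}
}

func main() {
	api := item{ID: "api-123", DataFile: "compressed.jpg", Metadata: "exif"}
	takeout := item{ID: "t-1", DataFile: "original.jpg", Location: "44.97,-93.26"}
	// Plain -merge=soft: only the missing Location is filled in.
	fmt.Printf("%+v\n", softMerge(api, takeout, nil))
	// -merge=soft,file: prefer the Takeout data file too.
	fmt.Printf("%+v\n", softMerge(api, takeout, map[string]bool{"file": true}))
}
```

The key design point is that a plain soft merge never overwrites anything, so running it with either source first yields the same combined item.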

See the readme for more info, as well as the wiki page for the Google Photos data source, which has been updated with instructions and details.