It comes with an adapter for Google Photos, so you can use it to download or back up the photos and videos in your library.
Note: there are some limitations to this approach; in particular, rclone can't download your original image files, even if you uploaded at 'original' quality. These limitations are documented on the rclone website: https://rclone.org/googlephotos/
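For anyone who hasn't tried it, the basic flow looks roughly like this (the remote name `gphotos` is whatever you picked when you ran `rclone config` interactively; the local path is just an example):

```shell
# Sanity-check the remote by listing albums:
rclone lsd gphotos:album

# Pull one year of photos down to local storage.
# Note: the API only serves re-compressed files, not originals.
rclone copy gphotos:media/by-year/2023 /mnt/backup/photos/2023 --progress
```

The `media/by-year` / `media/by-month` / `album` virtual directories are part of rclone's Google Photos backend layout.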
I store these on a local openSUSE box I use as a NAS. It has a primary drive, plus a second drive that's mounted and synced to weekly. The important stuff is then copied offsite to a cloud server: rsync for pictures and other various "family" stuff, Borg for home drives with more private information.
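A sketch of that offsite push, with hypothetical paths and hostnames (the Borg repo would need to be initialized once with `borg init --encryption=repokey`):

```shell
# rsync for pictures and shared "family" data:
rsync -avz --delete /srv/nas/pictures/ backup@cloud.example.com:/backups/pictures/

# Borg for home directories with more private data
# (deduplicated and encrypted client-side):
borg create --stats --compression zstd \
    ssh://backup@cloud.example.com/backups/homes::'{hostname}-{now}' \
    /home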
The internal drive I back up to runs btrfs. After each backup completes, I take a snapshot, and I keep monthly snapshots going back at least a year.
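The snapshot step is a one-liner; paths here are hypothetical, with snapshots kept in a separate subvolume so they don't nest inside the one being snapshotted:

```shell
# Read-only snapshot of the backup subvolume, named by month:
btrfs subvolume snapshot -r /mnt/backup /mnt/snapshots/backup-$(date +%Y-%m)

# Prune a monthly snapshot once it's older than ~a year:
btrfs subvolume delete /mnt/snapshots/backup-2023-01
```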
I use Syncthing to copy from my phone to my ZFS NAS. The phone also backs up to Google Photos (Pixel phone). From there I import them into a Lightroom catalog (looking at digiKam though) and move the photos to a ZFS dataset that has auto snapshots enabled (protection against accidental deletions, malware, etc.) and is on a RAIDZ2 pool (survives two drive failures).
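For reference, a setup along those lines might look like this (device names are placeholders, and the `com.sun:auto-snapshot` property is the one the common `zfs-auto-snapshot` tool keys off of; other snapshot tools like sanoid use their own config):

```shell
# Pool that survives two simultaneous drive failures:
zpool create tank raidz2 sda sdb sdc sdd sde sdf

# Dataset for the photo library, with auto-snapshots enabled:
zfs create tank/photos
zfs set com.sun:auto-snapshot=true tank/photos
```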
From there rclone encrypts and copies them to AWS Glacier Deep Archive (eleven nines of durability, etc.). I'll also start copying them to Google Workspace (was G Suite).
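That's rclone's crypt backend layered over an S3 remote. A minimal config sketch (remote and bucket names are hypothetical; the passwords would be generated by `rclone config`):

```ini
[s3glacier]
type = s3
provider = AWS
region = us-east-1
storage_class = DEEP_ARCHIVE

[crypt]
type = crypt
remote = s3glacier:my-photo-bucket/photos
```

Then `rclone copy /tank/photos crypt: --progress` encrypts client-side on the way up, so AWS never sees plaintext.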
This removes a lot of product-change risk, such as Google's APIs not returning the original files. AWS Glacier is secure and cheap unless I have to restore; if I ever need a full restore, it's likely my cheapest option anyway, so at that point I won't care about the retrieval cost.
Glacier Deep Archive works out to about $1/TB/month, and because it's priced per GB it removes the risk of a company changing plans or an "unlimited but not really" problem.
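The back-of-the-envelope math, using the published us-east-1 rate at the time of writing (check current AWS pricing before relying on it):

```python
# Glacier Deep Archive storage cost per TB per month.
price_per_gb_month = 0.00099  # USD, us-east-1 rate (verify it's current)
gb_per_tb = 1024

monthly = price_per_gb_month * gb_per_tb
print(f"${monthly:.2f}/TB/month")  # roughly $1/TB/month
```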
One piece of advice: think about your local disaster risks. I'm in an earthquake risk area, so I ruled out most off-site backup options that involve moving drives around, since those would all be in my same disaster area, plus the cost of safe-deposit boxes, etc.
After spending too much time on it, the Google Photos API limitations (only compressed files via the API, incomplete EXIF data) are what really rub me the wrong way. I've now turned on Takeout backup creation every 2 months and expect to just flat-out pump those files into Glacier, while retaining a local copy of the latest version on a NAS here, but this is really a sad state of affairs.
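If anyone wants to do the same, pushing the Takeout archives straight into Deep Archive is one command with the AWS CLI (bucket name and paths are hypothetical):

```shell
aws s3 cp /mnt/nas/takeout/ s3://my-photo-archive/takeout/ \
    --recursive --storage-class DEEP_ARCHIVE
```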
Is iCloud better for this?
As Google Photos is my main use case for being stuck in the Google ecosystem, is this situation any better with iCloud?
Both Timeliner and rclone (which are both great projects) are at the mercy of the Google Photos API, which removes GPS metadata and elides the majority of other tags. Even something as critical as the captured-at time can be modified from the original.
(Source: my personal experience with Google Pixel "original quality" backups, and providing customer support to several hundred PhotoStructure beta testers).