submitted 4 months ago* (last edited 4 months ago) by gedaliyah to c/selfhosted
 

Sorry, but I can't think of another word for it right now. This is mostly just venting, but if anyone has a better way to do it, I wouldn't hate to hear it.

I'm trying to set up a home server for all of our family photos. We're on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way it lets you do so is through Google Takeout. First you have to order it. Then you wait anywhere from a few hours to a day or two while Google "prepares" the download. Then you have one week before the takeout "expires," and that's one week to the minute from the time of the initial request.

I don't have some kind of fancy California internet, just normal home internet, and there is no way to download a 50 gig (or even a 2 gig) file in one go: there are always interruptions that force a restart of the download. But if you try to download the files too many times, Google gives you another error and you have to start over and request a new takeout. Google doesn't let you download the entire archive at once, either; you have to select each file part individually.

I can't tell you how many weeks I've spent trying to download all of the files before they either expire or Google throws another error.

butitsnotme 37 points 4 months ago

I know it’s not ideal, but if you can afford it, you could rent a VPS from a cloud provider for a week or two, do the Google Takeout download on that, and then use rsync or similar to copy the files down to your own server.
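For that last step, something like the sketch below works once the archives are sitting on the VPS. It's a minimal example, and the hostname, user, and paths are all placeholders you'd swap for your own:

```python
#!/usr/bin/env python3
"""Pull the finished Takeout archives from the VPS to the home server."""
import subprocess

# Placeholders: swap in your VPS login and the real paths on each machine.
VPS = "user@vps.example.com"
REMOTE_DIR = "/home/user/takeout/"   # where the archives were downloaded
LOCAL_DIR = "/srv/photos/takeout/"   # destination on the home server

# --partial keeps half-transferred files around, so an interrupted run
# picks up where it left off instead of starting the file again.
subprocess.run(
    ["rsync", "--archive", "--partial", "--progress",
     f"{VPS}:{REMOTE_DIR}", LOCAL_DIR],
    check=True,
)
```

Since rsync only transfers what hasn't arrived yet, you can just re-run it after every dropped connection instead of starting over, which is exactly the property the Takeout downloads are missing.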

gedaliyah 10 points 4 months ago

I don't know how to do any of that, but it helps to know anyway. I'll look into it. Thanks

[email protected] 11 points 4 months ago

Keep it completely simple and install a desktop OS like Ubuntu Desktop on the VPS. Then remote into it and use the browser as normal to download the stuff onto it. We'll help you with moving the data off to your local machine afterwards. Critically, the machine has to have enough storage to hold your entire download.

Blue_Morpho 3 points 4 months ago

Instead of having to set up an operating system with a cloud provider, maybe another cloud backup service would work. Something like Backblaze B2 can receive your Google files, and then you can download from Backblaze at your leisure.

https://help.goodsync.com/hc/en-us/articles/115003419711-Backblaze-B2
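If the files do land in B2, pulling them down later can be scripted. Here's a rough sketch using Backblaze's official b2sdk Python client; the key ID, application key, and bucket name are all placeholders, and I'm assuming the Takeout archives ended up in a single bucket:

```python
import os
from b2sdk.v2 import B2Api, InMemoryAccountInfo

# Placeholders: use your own application key and bucket name.
KEY_ID = "your-key-id"
APP_KEY = "your-application-key"
BUCKET = "my-takeout-bucket"

api = B2Api(InMemoryAccountInfo())
api.authorize_account("production", KEY_ID, APP_KEY)
bucket = api.get_bucket_by_name(BUCKET)

# Walk the whole bucket and download each file. Unlike Takeout's
# one-week window, these can be fetched (and re-fetched) at leisure.
for file_version, _folder in bucket.ls(recursive=True):
    local_path = file_version.file_name
    os.makedirs(os.path.dirname(local_path) or ".", exist_ok=True)
    bucket.download_file_by_name(file_version.file_name).save_to(local_path)
```

Getting the files from Google into B2 in the first place is the part I'd verify; the GoodSync article linked above is one way to bridge it.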

Or use the date filters to limit the amount of takeout data that's created? Then repeat with different filters for the next chunk.

[email protected] 3 points 4 months ago

I was gonna suggest the same.

[email protected] 2 points 4 months ago

Use this. It's finicky, but it works for me. You have to start the download on one device, pause it, copy the command to your file server, and run it there. It's slow and you can only do one at a time, but it's enough to leave it idling.
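If anyone wants to script that trick instead of babysitting the copied command, here's a rough sketch of the same idea in Python: paste in the URL and cookies from the paused browser download, and it keeps resuming with HTTP Range requests until the file is complete. The URL and cookie values are placeholders, and it assumes Google's download servers honor Range requests for your session, which is what the pause-and-copy trick relies on anyway:

```python
import os
import time

import requests

# Placeholders: paste the real URL and cookie header from the paused
# browser download (e.g. via "Copy as cURL" in the browser dev tools).
URL = "https://takeout.google.com/..."
HEADERS = {"Cookie": "paste-your-session-cookies-here"}
OUT = "takeout-part-001.zip"

while True:
    done = os.path.getsize(OUT) if os.path.exists(OUT) else 0
    headers = dict(HEADERS)
    if done:
        headers["Range"] = f"bytes={done}-"  # ask the server to resume here
    try:
        with requests.get(URL, headers=headers, stream=True, timeout=60) as r:
            if r.status_code == 416:
                break  # asked past the end of the file: it's already complete
            r.raise_for_status()
            # 206 means the server resumed; a plain 200 means start over.
            mode = "ab" if r.status_code == 206 else "wb"
            with open(OUT, mode) as f:
                for chunk in r.iter_content(chunk_size=1 << 20):
                    f.write(chunk)
            break  # the stream finished without an interruption
    except requests.RequestException:
        time.sleep(10)  # dropped connection: wait, then resume the loop
```

It's still one file at a time, like the comment says, but it can sit in tmux on the file server and grind through a part unattended.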