Hi,
A few days ago, I created a project on gitea.com to store large files (~50 GB) from a workstation using Git LFS. The upload rate was very good and the upload took around 20 minutes. That workstation has since been erased. Today I attempted to clone the repository on another workstation and was surprised to see a download rate of around 20 KB/s. At this speed, I can hope to download my project in 7 days. Maybe…
I suspect that gitea.com has speed limitations, but only on downloads (both Git LFS and plain HTTP downloads). If I'd known there were bandwidth limits on downloads, I'd have chosen another solution.
Is there a way to extract the project at a reasonable bandwidth, even if it means closing the repository afterwards?
It could very well be gitea.com being slow, but it's worth checking your Git client settings first. Something like this may help speed it up:
git -c lfs.concurrenttransfers=100 lfs clone http://Username:Password@URL/Project PullTest9
There are plenty of things worth trying on the receiving end before moving on to something you can't control (gitea.com). Maybe one of the server owners can chime in for that side.
Thank you! The suggested setting does help retrieve the files, but only at around 90 KB/s. That's still not enough.
I don't really know how to tweak the command line further to improve this, so I'm open to other ideas.
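In case it helps, one approach worth trying is to separate the Git clone from the LFS download: clone with the LFS smudge filter skipped, then fetch the large objects in a dedicated step where retries and concurrency can be tuned. A sketch (the URL and `Project` name are placeholders, and the config values are just starting points to experiment with):

```shell
# Clone the repository without downloading any LFS content
GIT_LFS_SKIP_SMUDGE=1 git clone https://gitea.com/Username/Project.git
cd Project

# Tune LFS transfer settings for this repository
git config lfs.concurrenttransfers 16   # default is 8; raising it helps when each stream is slow
git config lfs.transfer.maxretries 10   # retry flaky transfers instead of failing

# Download the LFS objects, then replace the pointer files in the working tree
git lfs pull
```

If `git lfs pull` gets interrupted, re-running it resumes with the objects already in the local LFS cache, which at 90 KB/s over several days may matter more than raw speed.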
Hi, yes. Speed limitations are expected because of the increasing number of repositories. Also, uploading to AWS is free, but downloading (egress) is not.