If you’ve got a slow internet connection and are trying to pull a large collection of images, you may find that failed layer downloads run out of retries and fail the entire pull, not to mention multiple concurrent downloads sucking up all your bandwidth.
I found that the following Docker daemon configuration changes helped:
- Set `max-concurrent-downloads` to `1`
- Bump `max-download-attempts` to something relatively high, like `20`
The resulting JSON would look a little like this:
```json
{
  "builder": {
    "gc": {
      "defaultKeepStorage": "20GB",
      "enabled": true
    }
  },
  "experimental": false,
  "max-concurrent-downloads": 1,
  "max-download-attempts": 20
}
```

It won’t speed up your pulls, but at least you can let them run in the background without the retries failing on you.
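If you already have a `daemon.json` (on Linux it usually lives at `/etc/docker/daemon.json`; Docker Desktop exposes it in its settings UI), you’ll want to merge these two keys into the existing file rather than replace it wholesale. A minimal sketch of that merge — the relative `daemon.json` path here is just for illustration:

```python
import json
from pathlib import Path

# Illustrative path; on Linux the real file is usually /etc/docker/daemon.json
config_path = Path("daemon.json")

# Load the existing config if present, otherwise start from an empty object
config = json.loads(config_path.read_text()) if config_path.exists() else {}

# Merge in the two settings without disturbing anything else in the file
config["max-concurrent-downloads"] = 1
config["max-download-attempts"] = 20

config_path.write_text(json.dumps(config, indent=2) + "\n")
```

After saving the file, restart the Docker daemon for the changes to take effect (e.g. `sudo systemctl restart docker` on Linux, or restart Docker Desktop).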