If you recall, one of the automated systems I have on my home network is a PVR named SickGear. I had covered backing up its configuration files, but I missed its database. Too late for that now, but not entirely too late to solve it.
To remediate, I used a script on the RPi that calls s3cmd (from the s3cmd package). The script is fairly simple: it touches an empty marker file every time the backup job runs, and if the database file is newer than that marker, it performs the upload.
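For reference, here is a minimal sketch of that logic. The paths, marker file name, and bucket are placeholders I made up for illustration; yours will differ.

```bash
#!/bin/bash
# Minimal sketch of the backup logic (paths and bucket are placeholders).
DB="/home/pi/.sickgear/sickbeard.db"        # database to back up
MARKER="/home/pi/.sickgear/.last_backup"    # empty file touched on every run
BUCKET="s3://my-backup-bucket/sickgear/"    # destination bucket/prefix

# Upload only if the database changed since the last run.
if [ "$DB" -nt "$MARKER" ]; then
    s3cmd put "$DB" "$BUCKET"
fi

# Touch the marker on every run so the next comparison is against this run.
touch "$MARKER"
```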
When I tested the script, I was in for a surprise. The s3cmd upload on the RPi was not working, and the errors were a mix of:
WARNING: Upload failed: /sickbeard.db ([Errno 104] Connection reset by peer)
WARNING: Retrying on lower speed (throttle=0.0x)
WARNING: Waiting..
--- OR ---
WARNING: Upload failed: /sickbeard.db ([Errno 32] Broken pipe)
WARNING: Retrying on lower speed (throttle=0.0x)
WARNING: Waiting..
The retry output goes on for quite a while and the upload eventually fails. It seems to work for small files, but fails as the uploaded files increase in size. The workaround suggested on several forums is to ditch s3cmd and replace it with the AWS CLI, and several folks have confirmed that it works. But I wasn't inclined to rewrite my scripts just for a workaround, so I continued experimenting with s3cmd.
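For context, the AWS CLI route those forums suggest would boil down to something like the line below, with the same placeholder bucket as before. I did not end up going this way.

```bash
# AWS CLI equivalent of the upload (placeholder bucket/prefix).
aws s3 cp /home/pi/.sickgear/sickbeard.db s3://my-backup-bucket/sickgear/
```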
The workaround I discovered was a very simple one: I just replaced "put" with "sync". The difference between the two is explained on the s3tools webpage. That was just what I needed.
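In the sketch above, that amounts to changing a single line, shown here with the same placeholder variables:

```bash
# Before: fails on larger files from the RPi.
# s3cmd put "$DB" "$BUCKET"

# After: the upload completes without connection resets.
s3cmd sync "$DB" "$BUCKET"
```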
As you can see from the screenshot above, the same file now uploads to the AWS S3 bucket in a few seconds, with no more errors.
I hope this workaround helps you as well.