This blog explains the steps to move files or folders from an SSH server (where AWS credentials are not configured) to an AWS S3 bucket, without missing any files or folders.
I was working on migrating static file hosting from a legacy server to an S3 bucket. Being a legacy server, it had many files and folders, and it did not have AWS access enabled. Due to storage limitations, I could not run the zip command to compress the files for download. Those files and folders then had to be moved to the S3 bucket.
While googling, I found a good solution to tackle this problem.
Let's follow the steps below:
This assumes you have SSH access to the server and AWS credentials for the S3 bucket.
- First, pull the files to your local machine with the `rsync` command. The `scp` command is also available, but if the internet connection is interrupted you have to start over again, and FileZilla takes more time than a terminal command.
- `/sshdirectorypath/` is the SSH server directory from which to fetch the files and folders (it will do all of this recursively), and `/localmachine/directory/` is the destination directory on your local machine.
> rsync -avz -e ssh username@hostname:/sshdirectorypath/ /localmachine/directory/
receiving file list ... done
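One reason to prefer `rsync` here: if the connection drops, re-running the same command picks up roughly where it left off instead of starting over. A sketch of a resume-friendly variant, assuming the same placeholder username, hostname, and paths as above:

```shell
# -a archive mode (recursive, preserves permissions/timestamps), -v verbose,
# -z compress data in transit.
# --partial keeps partially transferred files so a re-run can resume them;
# --progress shows per-file transfer progress.
# username, hostname and both paths are placeholders - substitute your own.
rsync -avz --partial --progress -e ssh \
  username@hostname:/sshdirectorypath/ /localmachine/directory/
```

Re-running this exact command after an interruption skips files that already arrived intact.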
If the SSH server already has AWS credentials (the access key and secret key) configured under `~/.aws/credentials`, you can skip the first step.
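For reference, `~/.aws/credentials` is a plain INI-style file. A minimal sketch (the key values below are placeholders):

```
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```

You can also create it by running `aws configure`, which prompts for both keys and a default region.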
Then run the following command to copy the local backup to the S3 bucket.
> aws s3 cp ./ s3://bucketname/ --recursive
Or, if you have to target a specific folder inside the bucket:
> aws s3 cp ./ s3://bucketname/folder/folder --recursive
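An alternative worth noting: `aws s3 sync` compares source and destination and only uploads files that are new or changed, so it can be safely re-run if the upload is interrupted. A sketch, using the same placeholder bucket name:

```shell
# --dryrun prints what would be uploaded without actually copying anything;
# drop it to perform the real upload. bucketname is a placeholder.
aws s3 sync ./ s3://bucketname/ --dryrun
```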
Benefits of using `rsync`:
- no data loss will happen
- even if the internet connection is interrupted, it will resume from where it last stopped
- it is faster, and it lists the contents first and then copies
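Once the upload finishes, a quick sanity check is to compare file counts on both sides. A sketch, again with a placeholder bucket name:

```shell
# Count local files (current directory, recursively):
find . -type f | wc -l
# Count objects in the bucket (or under a prefix inside it):
aws s3 ls s3://bucketname/ --recursive | wc -l
```

The two counts should match if nothing was missed.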
I hope this blog helps you learn something new. Feel free to reach out to me on my Twitter handle @AvinashDalvi_ or comment on the blog.
Keep learning and keep sharing!