April 25, 2020 / Nirav Shah
In this blog we will explain how to upload large amounts of data to Amazon Simple Storage Service (S3). AWS offers many tools and services for transferring your on-premises data to AWS:
<ol class="listing">
<li><a class="linkcolor" href="https://aws.amazon.com/storagegateway/" target="_blank" rel="noopener">AWS Storage Gateway.</a></li>
<li><a class="linkcolor" href="https://aws.amazon.com/datasync/" target="_blank" rel="noopener">AWS DataSync.</a></li>
<li><a class="linkcolor" href="https://aws.amazon.com/directconnect/" target="_blank" rel="noopener">AWS Direct Connect.</a></li>
<li><a class="linkcolor" href="https://aws.amazon.com/snowball/" target="_blank" rel="noopener">AWS Snowball Family.</a></li>
<li><a class="linkcolor" href="#" target="_blank" rel="noopener">Amazon S3 Transfer Acceleration.</a></li>
<li><a class="linkcolor" href="#" target="_blank" rel="noopener">Using the AWS CLI.</a></li>
</ol>
Here we will look at the best ways to transfer your data: S3 Transfer Acceleration and the AWS CLI.
But first, let's understand what a multipart upload is.
S3 supports multipart uploads for large files. For example, using this feature you can break a 5 GB upload into as many as 1,024 separate parts and upload each one independently, as long as each part (except the last) is at least 5 megabytes (MB) in size. If the upload of a part fails, it can be retried without affecting any of the other parts. Once you have uploaded all the parts, you ask S3 to assemble the full object with one final call.
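The part-count arithmetic above can be sketched in a few lines of shell (the sizes here are just the illustrative figures from the paragraph, not tuning advice):

```shell
#!/bin/sh
# Illustrative arithmetic only: a 5 GB object split into 5 MB parts
# (the minimum part size) gives the 1,024 parts mentioned above.
FILE_SIZE=$((5 * 1024 * 1024 * 1024))   # 5 GB in bytes
PART_SIZE=$((5 * 1024 * 1024))          # 5 MB minimum part size
# Ceiling division: round up so a trailing partial part is counted.
PARTS=$(( (FILE_SIZE + PART_SIZE - 1) / PART_SIZE ))
echo "$PARTS parts"
```

In practice the AWS CLI picks the part size for you (and grows it for very large files, since S3 allows at most 10,000 parts per object), as the upload example later in this post shows.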
Consider the following options for improving the performance of uploads and optimizing multipart uploads:
<ul class="listing">
<li>Enable Amazon S3 Transfer Acceleration.</li>
<li>Using the AWS CLI.</li>
</ul>
1) Enable Amazon S3 Transfer Acceleration
<a class="linkcolor" href="https://docs.aws.amazon.com/AmazonS3/latest/dev/transfer-acceleration.html" target="_blank" rel="noopener">Amazon S3 Transfer Acceleration</a> can provide fast and secure transfers over long distances between your client and Amazon S3. Transfer Acceleration uses Amazon CloudFront's globally distributed edge locations.
Transfer Acceleration incurs additional charges, so be sure to <a class="linkcolor" href="https://aws.amazon.com/s3/pricing/" target="_blank" rel="noopener">review the pricing</a>.
If you want to see the transfer speeds for your use case, try the <a class="linkcolor" href="http://s3-accelerate-speedtest.s3-accelerate.amazonaws.com/en/accelerate-speed-comparsion.html" target="_blank" rel="noopener"><strong>Amazon S3 Transfer Acceleration Speed Comparison</strong></a> tool.
How to use
There are several ways to enable Transfer Acceleration, so the link below covers them all; use whichever fits your requirements.
<a class="linkcolor" href="https://docs.aws.amazon.com/AmazonS3/latest/dev/transfer-acceleration-examples.html#transfer-acceleration-examples-console" target="_blank" rel="noopener">https://docs.aws.amazon.com/AmazonS3/latest/dev/transfer-acceleration-examples.html#transfer-acceleration-examples-console</a>
Note: Transfer Acceleration does not support cross-Region copies using <a class="linkcolor" href="https://docs.aws.amazon.com/AmazonS3/latest/API/API_CopyObject.html" target="_blank" rel="noopener">CopyObject</a>.
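If you prefer the command line over the console, Transfer Acceleration can also be enabled with the AWS CLI. A minimal sketch, assuming a hypothetical bucket name and an already-configured CLI:

```shell
# Enable Transfer Acceleration on the bucket.
# "my-bucket" is a placeholder; substitute your own bucket name.
aws s3api put-bucket-accelerate-configuration \
    --bucket my-bucket \
    --accelerate-configuration Status=Enabled

# Tell the CLI to route its s3 commands through the accelerate endpoint.
aws configure set default.s3.use_accelerate_endpoint true
```

After this, ordinary `aws s3 cp` commands against that bucket use the accelerated endpoint automatically.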
2) Using the AWS CLI
The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts.
You can install the AWS CLI on any major operating system: <a class="linkcolor" href="https://docs.aws.amazon.com/cli/latest/userguide/install-macos.html" target="_blank" rel="noopener">macOS</a>, <a class="linkcolor" href="https://docs.aws.amazon.com/cli/latest/userguide/install-linux.html" target="_blank" rel="noopener">Linux</a>, or <a class="linkcolor" href="https://docs.aws.amazon.com/cli/latest/userguide/install-windows.html" target="_blank" rel="noopener">Windows</a>.
You can customize the following <a class="linkcolor" href="https://docs.aws.amazon.com/cli/latest/topic/s3-config.html" target="_blank" rel="noopener">AWS CLI configurations for Amazon S3.</a>
How to use
The examples below assume a Linux operating system. Install the AWS CLI with pip:
<pre>$ pip install awscli</pre>
Get your access keys
<p class="number">1) Open the IAM console to manage your <a class="linkcolor" href="https://console.aws.amazon.com/iam/home?#home" target="_blank" rel="noopener">access keys</a>.</p>
<p class="number">2) Go to Users.</p>
<img class="img-responsive" src="https://www.eternalsoftsolutions.com/blog/wp-content/uploads/2022/11/s3image6.webp" />
<p class="number">3) Click on your <strong>user name</strong>.</p>
<img class="img-responsive" src="https://www.eternalsoftsolutions.com/blog/wp-content/uploads/2022/11/s3image2.webp" />
<p class="number">4) Go to the Security credentials tab.</p>
<img class="img-responsive" src="https://www.eternalsoftsolutions.com/blog/wp-content/uploads/2022/11/s3image3.webp" />
<p class="number">5) Click Create access key.</p>
<img class="img-responsive" src="https://www.eternalsoftsolutions.com/blog/wp-content/uploads/2022/11/s3image4.webp" />
<p class="number">6) You'll see your Access key ID. Click "Show" to see your Secret access key, then download it and keep it safe.</p>
<img class="img-responsive" src="https://www.eternalsoftsolutions.com/blog/wp-content/uploads/2022/11/s3image5.webp" />
Once the AWS CLI is installed, open a terminal and execute the commands below.
<ol class="listing">
<li>First, execute aws configure to configure your account (this is a one-time process).</li>
<li>It will prompt for your AWS access key ID, secret access key, region name, and output format. Enter each value and press Enter.</li>
</ol>
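The prompts look roughly like this (the key and region values shown are AWS's documented example placeholders, not real credentials):

```shell
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json
```

The values are stored under ~/.aws/, so subsequent aws commands pick them up automatically.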
<img class="img-responsive" src="https://www.eternalsoftsolutions.com/blog/wp-content/uploads/2022/11/s3image6.webp" />
<strong>Uploading large files</strong>
Here, assume we are uploading a large 150 GB data file to s3://systut-data-test/store_dir/ (that is, the directory store_dir under the bucket systut-data-test), and that the bucket and directory already exist on S3.
The command is:
<pre>$ aws s3 cp ./150GB.data s3://systut-data-test/store_dir/</pre>
After it starts uploading the file, it prints a progress message like
<pre>Completed 1 part(s) with … file(s) remaining</pre>
at the beginning, and a message like the following as it nears the end:
<pre>Completed 9896 of 9896 part(s) with 1 file(s) remaining</pre>
Once the file is uploaded successfully, it prints a message like
<pre>upload: ./150GB.data to s3://systut-data-test/store_dir/150GB.data</pre>
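For uploads of this size, the multipart behaviour can be tuned through the AWS CLI's s3 configuration settings linked earlier. A sketch with illustrative values (tune them for your own bandwidth and memory budget):

```shell
# Illustrative tuning values, not recommendations.
aws configure set default.s3.multipart_threshold 64MB    # files above this size use multipart upload
aws configure set default.s3.multipart_chunksize 64MB    # size of each uploaded part
aws configure set default.s3.max_concurrent_requests 20  # parts uploaded in parallel
```

Larger chunk sizes mean fewer parts (S3 allows at most 10,000 per object); more concurrent requests can speed up uploads on fast links at the cost of memory and CPU.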
But the AWS CLI can do much more. Check out the comprehensive documentation in the <a class="linkcolor" href="https://docs.aws.amazon.com/cli/latest/index.html" target="_blank" rel="noopener">AWS CLI Command Reference.</a>

Nirav Shah is the Director of Eternal Web Pvt Ltd, an AWS Advanced Consulting Partner and certified Odoo Partner based in the UK. With over a decade of experience in cloud computing, digital transformation, and ERP implementation, Nirav helps enterprises adopt the right technology to solve complex business challenges. He specialises in AWS infrastructure, Odoo ERP, and web development solutions for businesses across the UK and beyond.
Have queries about your project idea or concept? Please drop in your project details to discuss with our AWS Global Cloud Infrastructure service specialists and consultants.