
Supercharge Website Performance With AWS S3 and CloudFront


We live in a world where people expect websites to be faster and faster. In fractions of a second, your website can lose valuable visitors and, in turn, money. Although most people think CDNs are for the "big dogs", they're actually super cheap and incredibly easy to use these days.

In this tutorial I'll show you how to set up and use Amazon Web Services' S3 and CloudFront to decrease website load times, and I'll show the performance differences you can expect.

What is a CDN?

A CDN is a Content Delivery (or Distribution) Network: a network of servers placed at different geographic locations, each holding the same data. When someone requests a file, it is served from the system nearest them or the one with the least current load. This results in lower latency and shorter file download durations. To learn more about CDNs, see "Content delivery network" at Wikipedia.

[Image: CDN graphic example]

In the example image above, visitors access the server nearest them, which will provide the best possible performance. That network of servers is the CDN. A regular web host would have one central server that all those visitors would have to access. That one server could be located only in the US, or maybe Europe, and would result in longer latency and load times for visitors farther away.

Using more than one server, even on just one continent, will make a difference in performance.

Why & The Proof

I've had quite a few people ask me why a CDN is important, even for smaller websites, and why they should bother paying for yet another web service. The simple answer is: the faster, the better. And why not offer your customers (visitors) the best you can?

The smaller the website, the less of an impact a CDN will make. That said, if your visitors translate into money for you, then every little bit helps.

  • In 2006 Google's tests showed that increasing load time by 0.5 seconds resulted in a 20% drop in traffic.
  • In 2007 Amazon's tests showed that for every 100ms increase in load time, sales would decrease 1%.
  • This year (2009) Akamai (a CDN leader) revealed in a study that 2 seconds is the new threshold for eCommerce web page response times.

It's cheap. It's easy. And it can translate into more money in terms of customers and savings on your regular web host expenses.

Amazon Web Services (AWS)

Amazon provides a whole slew of fantastic web services. We'll be using Amazon's Simple Storage Service (S3) and CloudFront. S3 is a data storage solution in the cloud which can be tied to CloudFront, Amazon's CDN.

If you're looking for a slightly simpler, all-in-one solution, Rackspace Cloud Files is another great option. They've partnered with Limelight Networks' CDN, which currently has slightly better performance than Amazon's CDN. However, their service has a few drawbacks you won't find with Amazon. I won't get into all of them here, but one of the bigger ones for me was the lack of custom CNAME support, which is supposedly coming at some point in the future. With CNAME support you can set up a custom subdomain to access your files, such as "cdn.yourdomain.com".

To see recent performance comparisons, visit http://www.cloudclimate.com/cdns/

Pricing

Here's Amazon's S3 pricing for the US. For other areas, click the image to see full pricing.

[Image: Amazon S3 pricing for the US]

Here's Amazon's CloudFront pricing for the US. For other areas, click the image to see full pricing.

[Image: Amazon CloudFront pricing for the US]

Use Amazon's monthly calculator to get a better idea of your end bill. Last month, my total bill was less than $5, with the majority of that incurred from 20GB+ of data storage. As you can see, it's very, very cheap, especially when you take into consideration the performance and flexibility benefits.
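To put rough numbers on that: at the 2009 US rate of roughly $0.15 per GB of storage per month, 20GB works out to about $3.00 before request and data transfer charges, which lines up with the bill above.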

Setup S3 & CloudFront

To get started, we need to sign up for Amazon's S3 and CloudFront services. If you already have an account with Amazon, you'll just need to log in and finish the signup. If not, you'll need to create an account and then sign up for S3 and CloudFront. The signup is simply adding the service to your account; there's nothing complicated involved.

Click each image to go to the service's information and signup page.

[Image: Amazon S3 signup]
[Image: Amazon CloudFront signup]

Once you've signed up, you'll get an Access Key ID and Secret Access Key which can be found under "Your Account" > "Security Credentials". This is basically your username and password for accessing S3.
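This tutorial does everything through desktop apps, but if you'd prefer to work from code, here's a minimal sketch using Python and the boto3 SDK (a more recent AWS library, not part of the original workflow) that lists your buckets to confirm the keys work. The key values are placeholders.

```python
# A minimal sketch: connect to S3 from Python with the boto3 SDK and list
# the account's buckets to confirm the credentials work. The key values
# below are placeholders -- use your own from "Security Credentials".
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",          # placeholder
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",  # placeholder
)

for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```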

[Image: AWS access keys under Security Credentials]

Setup S3 Bucket For Files

First we need to create a bucket to store all our files in. For more information on "buckets" read "Amazon S3 Buckets Described in Plain English".

To do this, we'll first log into our S3 account using the Access Key ID and Secret Access Key with an application like Transmit (OS X), which is what I'll be using. To see more apps or browser add-ons for accessing S3 see "Amazon S3 Simple Storage Service – Everything You Wanted to Know".

[Image: Logging into S3 with Transmit]

Once signed in, we'll create a bucket to put our files in. I've named mine "files.jremick.com". Buckets must have unique names, need to be between 3 and 63 characters, and can contain lowercase letters, numbers, periods and dashes (but can't end with a dash).

By unique, they mean unique across the entire AWS network, so it's a good idea to use something like a domain name.

[Image: Creating a bucket in Transmit]

The files we put in this bucket can now be accessed at "files.jremick.com.s3.amazonaws.com". However, this URL is pretty long, and we can quickly set up a shorter one. We'll set up a new CNAME entry at our web host to do this.
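For reference, here's what the same bucket creation looks like as a minimal Python/boto3 sketch, assuming your credentials are already configured; the bucket name is the one used in this tutorial.

```python
# Sketch: create the same bucket from code instead of Transmit.
import boto3

s3 = boto3.client("s3")  # assumes credentials are already configured

bucket_name = "files.jremick.com"
s3.create_bucket(Bucket=bucket_name)
# Outside the default us-east-1 region you'd also pass
# CreateBucketConfiguration={"LocationConstraint": "<your-region>"}.

# Objects in this bucket are then reachable at:
#   http://files.jremick.com.s3.amazonaws.com/<key>
print(f"http://{bucket_name}.s3.amazonaws.com/example.jpg")  # hypothetical key
```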

Setup Custom S3 Subdomain

To shorten the default URL we'll create a CNAME entry as I've done below (this is at your web host). I've chosen "files" as my subdomain but you could use whatever you like.

[Image: CNAME entry for the S3 subdomain]

Now we can access these bucket files at "files.jremick.com". Much better! Then simply upload the files you want within the "files.jremick.com" bucket.

Once your files are uploaded, you'll want to set the ACL (Access Control List) to allow everyone to read the files (if you want them public). In Transmit you simply right-click, select Get Info, set "Read" to "World" under permissions and click "Apply to enclosed items...". This makes all the files within this bucket readable by everyone.

[Image: Bucket permissions in Transmit]

By default, files uploaded to your S3 account will only allow read and write access to the owner. So if you upload new files later on, you'll need to go through these steps again or apply different permissions for just those files.
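If you'd rather script the permissions step, the equivalent of Transmit's "Apply to enclosed items..." is to walk the bucket and set each object's ACL to public-read. Here's a sketch with boto3 (note that newer AWS accounts may block public ACLs at the bucket level by default):

```python
# The scripted equivalent of "Apply to enclosed items...": list every object
# in the bucket and grant public read access to each one.
import boto3

s3 = boto3.client("s3")
bucket_name = "files.jremick.com"

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket_name):
    for obj in page.get("Contents", []):
        s3.put_object_acl(Bucket=bucket_name, Key=obj["Key"], ACL="public-read")
        print("made public:", obj["Key"])
```

You can also skip the second pass entirely by passing ACL="public-read" when you upload each file in the first place.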

Create CloudFront Distribution

Now that we have set up S3, created a shorter URL and uploaded our files, we'll want to make those files accessible through CloudFront to get the super low latency that reduces our load times. To do this we need to create a CloudFront distribution.

Log into your AWS account and navigate to your Amazon CloudFront management console (under "Your Account" drop down menu). Then click the "Create Distribution" button.

[Image: Create Distribution button in the CloudFront console]

We'll select the origin bucket (the bucket we created earlier), turn on logging if you would like, specify a CNAME and comments, and finally either enable or disable the distribution. You don't have to enter a CNAME or comments, but we'll want to set up a shorter URL later like we did for S3. I would like to use "cdn.jremick.com" so that's what I'm setting here.

[Image: CloudFront distribution settings]

As you can see, the default URL is pretty ugly. That's not something you're going to want to try to remember. So now let's set up a CNAME for the pretty, short URL.

[Image: Deployed distribution with its default URL]
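If you want to script this step instead of using the console, here's a hedged sketch with boto3. The DistributionConfig shape has changed since 2009, so treat the exact fields as illustrative; the intent matches the settings above (S3 bucket as the origin, "cdn.jremick.com" as the CNAME, distribution enabled).

```python
# Sketch: create a CloudFront distribution with the S3 bucket as the origin
# and "cdn.jremick.com" as a CNAME. Field names follow the current boto3 API
# and may differ from the 2009 console shown above.
import time
import boto3

cloudfront = boto3.client("cloudfront")

response = cloudfront.create_distribution(DistributionConfig={
    "CallerReference": str(time.time()),  # any unique string
    "Comment": "CDN for jremick.com static files",
    "Enabled": True,
    "Aliases": {"Quantity": 1, "Items": ["cdn.jremick.com"]},
    "Origins": {
        "Quantity": 1,
        "Items": [{
            "Id": "s3-files-jremick-com",
            "DomainName": "files.jremick.com.s3.amazonaws.com",
            "S3OriginConfig": {"OriginAccessIdentity": ""},
        }],
    },
    "DefaultCacheBehavior": {
        "TargetOriginId": "s3-files-jremick-com",
        "ViewerProtocolPolicy": "allow-all",
        "ForwardedValues": {"QueryString": False,
                            "Cookies": {"Forward": "none"}},
        "MinTTL": 0,
    },
})

# The assigned *.cloudfront.net domain name -- this is what your CNAME
# record should point at.
print(response["Distribution"]["DomainName"])
```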

Setup Custom CloudFront Subdomain

To set up the custom CloudFront subdomain, we'll go through the same process as we did for S3.

[Image: CNAME entry for the CloudFront subdomain]

Now we can access files through CloudFront using "cdn.jremick.com".
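As a quick sanity check that the distribution and CNAME are working, you can fetch any of your public files through the new subdomain. A minimal sketch, where the file name is a hypothetical placeholder:

```python
# Fetch a (hypothetical) public test file through the CloudFront subdomain
# and print the response status and content type.
import urllib.request

url = "http://cdn.jremick.com/example.jpg"  # placeholder file name
with urllib.request.urlopen(url) as response:
    print(response.status, response.headers.get("Content-Type"))
```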

How It All Works

When someone accesses a file through your S3 bucket, it acts just like a regular file host. When someone accesses a file through CloudFront, though, it requests the file from your S3 bucket (the origin) and caches it at the CDN server closest to the original request for all subsequent requests. It's a little more complicated than that, but that's the general idea.

Think of a CDN as a smart network that is able to determine the fastest possible route for request delivery. For example, if the closest server is bogged down with traffic, it may be faster to get the file from a server a little farther away but with less load, so CloudFront will deliver the requested file from that location instead.

Caching Problems

Once a file is cached on the CloudFront network servers, it does not get replaced until it expires and is automatically removed (after 24 hours by default). This can be a major pain if you're trying to push updates out immediately. To get around this you'll need to version your files. For example, "my-stylesheet.css" could be "my-stylesheet-v1.0.css". Then when you make an update that needs to go out immediately, you would change the name to "my-stylesheet-v1.1.css" or something similar.
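Versioning can be as simple as a tiny helper that bakes a version string into the filenames your pages reference, so bumping the version forces CloudFront to fetch the new file. A minimal sketch:

```python
# A tiny helper that bakes a version into asset filenames for the URLs your
# templates reference. Bump ASSET_VERSION whenever a file changes so
# CloudFront treats it as a brand new object.
ASSET_VERSION = "1.1"

def versioned(filename: str, version: str = ASSET_VERSION) -> str:
    """Turn 'my-stylesheet.css' into 'my-stylesheet-v1.1.css'."""
    name, dot, ext = filename.rpartition(".")
    return f"{name}-v{version}{dot}{ext}" if name else filename

print(versioned("my-stylesheet.css"))  # my-stylesheet-v1.1.css
```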

Performance Testing

Our content is uploaded to our S3 bucket, our CloudFront distribution is deployed and our custom subdomains are set up for easy access. It's time to put it all to the test and see what kind of performance benefits we can expect.

I've set up 44 example images ranging in size from approximately 2KB up to 45KB. You might be thinking that this is more images than most websites load on a single page. That may be true, but there are many websites, such as portfolios, ecommerce sites and blogs, that load just as many images and possibly more.

[Image: The 44 test images]

Although I'm only using images for this example, what matters for the comparison is the file size and the quantity. Today's websites load several JavaScript, CSS, HTML and image files on every page. 44 file requests is probably fewer than most websites actually make, so a CDN could have an even greater impact on your website than we'll see in this comparison.

I'm using Safari's Web Inspector to view the performance results. I disabled caches and hit shift + refresh 10-15 times (about every 2-3 seconds) for each test to get a decent average of total load time, latency and duration. (If you'd rather script a rough version of this measurement, see the sketch after the list below.)

  • 45 Total files (including HTML document)
  • 561.13KB Total combined file size
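The measurements below come from Safari's Web Inspector, but as a rough, scriptable stand-in you could use something like the following, which fetches each test image and reports per-file and total times. The URLs are hypothetical placeholders, and the requests run sequentially rather than in parallel like a browser, so treat the numbers as relative rather than absolute.

```python
# A rough, scriptable stand-in for the Web Inspector measurements: fetch
# each test image and record how long the request takes. The URLs are
# hypothetical placeholders for the 44 test images.
import time
import urllib.request

urls = [f"http://cdn.jremick.com/test-images/image-{i:02d}.jpg"
        for i in range(1, 45)]

total_start = time.perf_counter()
for url in urls:
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()
    print(f"{url}: {(time.perf_counter() - start) * 1000:.0f}ms")

print(f"total: {time.perf_counter() - total_start:.2f}s")
```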

Regular Web Host

Here are the performance results when hosted via my regular web host. Sorted by latency.

[Image: Web host test results sorted by latency]
  • 1.82-1.95 Seconds total load time
  • 90ms Fastest latency (last test)
  • 161ms Slowest latency (last test)
  • ~65% of the images had a latency of less than 110ms

Sorted by duration.

[Image: Web host test results sorted by duration]
  • 92ms Fastest duration (last test)
  • 396ms Slowest duration (last test)

Amazon S3

The exact same files were used for testing S3. Sorted by latency.

[Image: S3 test results sorted by latency]
  • 1.3-1.6 Seconds total load time
  • 55ms Fastest latency (last test)
  • 135ms Slowest latency (last test)
  • ~90% of the images had a latency of less than 100ms

Sorted by duration.

[Image: S3 test results sorted by duration]
  • 56ms Fastest duration (last test)
  • 279ms Slowest duration (last test)

S3 is faster than my regular web host but only marginally. If you didn't feel like messing around with a CDN, S3 is still a great option to give your website a decent speed boost. I still recommend using a CDN though and we'll see why in this next test.

Amazon CloudFront

The exact same files were used for testing CloudFront. Sorted by latency.

[Image: CloudFront test results sorted by latency]
  • 750-850ms Total load time
  • 25ms Fastest latency (last test)
  • 112ms Slowest latency (last test)
  • ~85% of the images had a latency of less than 55ms.
  • Only one file had a latency of more than 100ms.

Sorted by duration.

[Image: CloudFront test results sorted by duration]
  • 38ms Fastest duration (last test)
  • 183ms Slowest duration (last test)

Comparison

Here's a quick breakdown of the performance comparison between my regular web host and the same files on Amazon's CloudFront service.

  • 1.82-1.95 seconds vs 0.75-0.85 seconds total load time (~1.1 seconds faster)
  • 90ms vs 25ms fastest latency (65ms faster)
  • 161ms vs 112ms slowest latency (49ms faster)
  • CloudFront: Only one file with latency greater than 100ms and 85% of the files with latency less than 55ms
  • Regular Web Host: Only 65% of the files had a latency of less than 110ms

Duration comparison

  • 92ms vs 38ms Fastest duration (54ms faster)
  • 396ms vs 183ms Slowest duration (213ms faster)

50ms or even 100ms doesn't sound like a very long time to wait (0.1 seconds) but when you repeat that for 30, 40, 50 or more files then you can see how it quickly adds up to seconds.

Visual Comparison

Here's a quick video to show just how noticeable the difference in load time is. I've disabled caches and do a forced refresh (shift + refresh) to make sure images aren't being cached.

Other Ways To Increase Performance

There are several other ways to increase website performance when using a CDN.

  • Create different subdomains for different types of files to maximize parallel downloads. For example, load images from "images.jremick.com" and other files like scripts and CSS from "cdn.jremick.com". This will allow more files to load in parallel, reducing total load time.
  • Gzip files like JavaScript and CSS
  • Configure ETags

See Best Practices for Speeding Up Your Web Site for more.

Serving Gzipped Files From CloudFront

One of the options above for increasing performance even further was serving gzipped files. Unfortunately, CloudFront isn't able to automatically determine whether a visitor can accept gzipped files and serve up the correct version. Fortunately, all modern browsers support gzipped files these days.

Create & Upload Your Gzipped Files

To serve gzipped files from CloudFront, we can give our website some logic to serve up the right files, or we can set the Content-Encoding and Content-Type on a few specific files to keep things a little simpler. Gzip the files you want and rename them so they don't end in .gz. For example, "filename.css.gz" would become "filename.css", or to remind yourself that it's a gzipped file, name it "filename.gz.css". Now upload the gzipped file to the location you want in your S3 bucket (don't forget to set the ACL/permissions).

If you're not sure how to gzip a file, see http://www.gzip.org (OS X can do this in Terminal).
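Alternatively, here's a minimal Python sketch that gzips a stylesheet and writes it out using the "filename.gz.css" naming convention described above; the file names are placeholders.

```python
# A minimal sketch: gzip a stylesheet and write it out using the
# "filename.gz.css" naming convention. File names are placeholders.
import gzip

with open("my-stylesheet.css", "rb") as source:
    data = source.read()

with open("my-stylesheet.gz.css", "wb") as target:
    target.write(gzip.compress(data))
```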

Set Content-Encoding and Content-Type

We need to set the Content-Encoding and Content-Type (if it isn't already set) on our files so that when a file is requested, the browser knows the content is gzipped and can decompress it properly. Otherwise it will look like this.

[Image: Gzipped file served without Content-Encoding]

We can do this easily with Bucket Explorer. Once you've downloaded it, enter your AWS Access Key and Secret key to log into your S3 account. Find the gzipped file you uploaded earlier, right click and select "Update MetaData".

[Image: Update MetaData option in Bucket Explorer]

As you can see, it already has the Content-Type set to text/css, so we don't need to set that (JavaScript would be text/javascript). We just need to add the right Content-Encoding. Click "Add" and in the popup dialog enter "Content-Encoding" in the Key field and "gzip" in the Value field. Click OK, then Save, and you're done! Now the browser will handle the file correctly.
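If you'd rather set the metadata from code than through Bucket Explorer, S3 lets you rewrite an object's headers by copying it onto itself with replaced metadata. A boto3 sketch, with a hypothetical key:

```python
# Sketch: set Content-Encoding (and Content-Type) on an object that's already
# in the bucket by copying it onto itself with replaced metadata.
import boto3

s3 = boto3.client("s3")
bucket_name = "files.jremick.com"
key = "css/my-stylesheet.gz.css"  # hypothetical key

s3.copy_object(
    Bucket=bucket_name,
    Key=key,
    CopySource={"Bucket": bucket_name, "Key": key},
    MetadataDirective="REPLACE",
    ContentType="text/css",
    ContentEncoding="gzip",
    ACL="public-read",  # keep the file publicly readable
)
```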

[Image: Saving the Content-Encoding metadata]

Gzipping a file can greatly reduce its size. For example, this test stylesheet was around 22KB and was reduced to approximately 5KB. For my blog, I've combined all my jQuery plugins with jQuery UI Tabs; after minification it was 26.49KB, and after being gzipped it was reduced to 8.17KB.

Conclusion

There are a lot of ways to increase the performance of your website and in my opinion they're worth trying. If visitors are only 0.5 seconds or even 1 second away from leaving your website, a CDN could keep that from happening. Plus, most of us are speed freaks anyway so why not crank up the performance of your website if you can? Especially if it could save you money in the process.

If you have any questions, please let me know in the comments and I'll try to respond to them. Thanks!

