
Supercharge Website Performance With AWS S3 and CloudFront

We live in a world where people expect ever-faster websites. In fractions of a second, your site can lose valuable visitors and, in turn, money. Although most people think CDNs are for the "big dogs", they're actually very cheap and incredibly easy to use these days.

In this tutorial I'll show you how to set up and use Amazon Web Services' S3 and CloudFront to decrease website load time, and demonstrate the performance differences.

What is a CDN?

A CDN is a Content Delivery (or Distribution) Network: a network of servers placed at different geographic points, each holding the same data. When someone accesses the network, they're served by the server nearest them or the one with the least current load. This results in lower latency and shorter file download times. To learn more about CDNs, see "Content delivery network" at Wikipedia.

[Image: CDN example diagram]

In the example image above, visitors access the server nearest them, which provides the best possible performance. The network of servers is the CDN. A regular web host has one central server that all visitors must access. That one server could be located only in the US, or maybe Europe, resulting in longer latency and load times for visitors farther away.

Using more than one server, even on just one continent, will make a difference in performance.

Why & The Proof

I've had quite a few people ask me why a CDN is important, even for smaller websites, and why they should bother paying for yet another web service. The simple answer is: the faster, the better. And why not offer your customers (visitors) the best you can?

The smaller the website, the less of an impact a CDN will make. Still, if your visitors translate into money for you, then every little bit helps.

  • In 2006 Google's tests showed that increasing load time by 0.5 seconds resulted in a 20% drop in traffic.
  • In 2007 Amazon's tests showed that for every 100ms increase in load time, sales would decrease 1%.
  • This year (2009) Akamai (a CDN leader) revealed in a study that 2 seconds is the new threshold for eCommerce web page response times.

It's cheap. It's easy. And it can translate into more money in terms of customers and saving on your regular web host expenses.

Amazon Web Services (AWS)

Amazon provides a whole slew of fantastic web services. We'll be using Amazon's Simple Storage Service (S3) and CloudFront. S3 is a data storage solution in the cloud which can be tied to CloudFront, Amazon's CDN.

If you're looking for a slightly simpler, all-in-one solution, Rackspace Cloud Files is another great option. They've partnered with Limelight Networks' CDN, which currently has slightly better performance than Amazon's. However, their service has a few drawbacks you won't find with Amazon. I won't get into all of them, but one of the bigger ones for me was the lack of custom CNAME support, which is supposedly coming at some point in the future. With CNAME support you can set up a custom sub-domain to access your files, such as "cdn.yourdomain.com".

To see recent performance comparisons, visit http://www.cloudclimate.com/cdns/

Pricing

Here's Amazon's S3 pricing for the US. For other areas, click the image to see full pricing.

[Image: Amazon S3 pricing table]

Here's Amazon's CloudFront pricing for the US. For other areas, click the image to see full pricing.

[Image: Amazon CloudFront pricing table]

Use Amazon's monthly calculator to get a better idea of your end bill. Last month, my total bill was less than $5, with the majority of that incurred from 20GB+ of data storage. As you can see, it's very, very cheap, especially when you take into consideration the performance and flexibility benefits.

Setup S3 & CloudFront

To get started, we need to sign up for Amazon's S3 and CloudFront services. If you already have an account with Amazon, you'll just need to log in and finish the signup. If not, you'll need to create an account and then sign up for S3 and CloudFront. The signup simply adds the service to your account; there's nothing complicated involved.

Click each image to go to the service's information and signup page.

[Image: S3 signup]
[Image: CloudFront signup]

Once you've signed up, you'll get an Access Key ID and Secret Access Key which can be found under "Your Account" > "Security Credentials". This is basically your username and password for accessing S3.

[Image: AWS access keys]

Setup S3 Bucket For Files

First we need to create a bucket to store all our files in. For more information on "buckets" read "Amazon S3 Buckets Described in Plain English".

To do this, we'll first log into our S3 account using the Access Key ID and Secret Access Key with an application like Transmit (OS X), which is what I'll be using. To see more apps or browser add-ons for accessing S3 see "Amazon S3 Simple Storage Service – Everything You Wanted to Know".

[Image: Transmit S3 login]

Once signed in, we'll create a bucket to put our files in. I've named mine "files.jremick.com". Buckets must have unique names, need to be between 3 and 63 characters, and can contain letters, numbers and dashes (but can't end with a dash).

By unique, they mean unique across the entire AWS network, so it's a good idea to use something like a domain name.
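The naming rules above can be captured in a small helper. This is just an illustrative sketch of the rules as stated (3-63 characters, letters, numbers, dots and dashes, not ending with a dash); AWS enforces a few additional rules not covered here, and the function name is my own.

```python
import re

def is_valid_bucket_name(name):
    """Check a bucket name against the rules described above:
    3-63 characters; letters, numbers, dots and dashes;
    and the name must not end with a dash.
    (Illustrative only -- AWS enforces a few extra rules.)"""
    if not 3 <= len(name) <= 63:
        return False
    if name.endswith("-"):
        return False
    return re.fullmatch(r"[A-Za-z0-9.-]+", name) is not None

print(is_valid_bucket_name("files.jremick.com"))  # True
print(is_valid_bucket_name("my-bucket-"))         # False (ends with a dash)
```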

[Image: Creating a bucket in Transmit]

The files we put in this bucket can now be accessed at "files.jremick.com.s3.amazonaws.com". However, this URL is pretty long, and we can quickly set up a shorter one by creating a new CNAME entry at our web host.

Setup Custom S3 Subdomain

To shorten the default URL we'll create a CNAME entry as I've done below (this is at your web host). I've chosen "files" as my subdomain but you could use whatever you like.

[Image: S3 CNAME entry]

Now we can access these bucket files at "files.jremick.com". Much better! Then simply upload the files you want into the "files.jremick.com" bucket.

Once your files are uploaded, you'll want to set the ACL (Access Control List) to allow everyone to read the files (if you want them public). In Transmit, simply right-click, select Get Info, set "Read" to "World" under permissions, and click "Apply to enclosed items...". This grants world-read access to all files within this bucket.

[Image: Bucket permissions in Transmit]

By default, files uploaded to your S3 account will only allow read and write access to the owner. So if you upload new files later on, you'll need to go through these steps again or apply different permissions for just those files.

Create CloudFront Distribution

Now that we have set up S3, created a shorter URL and uploaded our files, we'll want to make those files accessible through CloudFront for much lower latency and reduced load times. To do this we need to create a CloudFront distribution.

Log into your AWS account and navigate to your Amazon CloudFront management console (under "Your Account" drop down menu). Then click the "Create Distribution" button.

[Image: Create Distribution button]

We'll select the origin bucket (the bucket we created earlier), turn on logging if you would like, specify a CNAME and comments and finally either enable or disable the distribution. You don't have to enter a CNAME or comments but we'll want to setup a shorter URL later like we did for S3. I would like to use "cdn.jremick.com" so that's what I'm setting here.

[Image: Distribution settings]

As you can see, the default URL is pretty ugly. That's not something you're going to want to try to remember. So now let's setup a CNAME for the pretty, short URL.

[Image: Distribution default URL]

Setup Custom CloudFront Subdomain

To set up the custom CloudFront subdomain, we'll go through the same process as we did for S3.

[Image: CloudFront CNAME entry]

Now we can access files through CloudFront using "cdn.jremick.com".

How It All Works

When someone accesses a file through your S3 bucket, it acts just like a regular file host. When someone accesses a file through CloudFront, though, it requests the file from your S3 bucket (the origin) and caches it at the CDN server closest to the original request for all subsequent requests. It's a little more complicated than that, but that's the general idea.

Think of a CDN as a smart network that can determine the fastest possible route for request delivery. For example, if the closest server is bogged down with traffic, it may be faster to get the file from a server a little farther away with less load, so CloudFront will deliver the requested file from that location instead.
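The origin-and-edge behavior described above can be sketched as a toy model: the first request for a file goes back to the origin (the S3 bucket), and later requests are served from the edge cache until the entry expires. The class and names here are purely illustrative, not CloudFront's actual internals.

```python
import time

class EdgeCache:
    """Toy model of a CDN edge location: the first request for a file
    is fetched from the origin (the S3 bucket), then served from the
    local cache until the entry expires (24 hours by default)."""

    def __init__(self, origin, ttl_seconds=24 * 60 * 60):
        self.origin = origin          # dict standing in for the origin bucket
        self.ttl = ttl_seconds
        self.cache = {}               # path -> (content, fetched_at)

    def get(self, path, now=None):
        now = time.time() if now is None else now
        entry = self.cache.get(path)
        if entry and now - entry[1] < self.ttl:
            return entry[0], "edge-hit"
        content = self.origin[path]   # cache miss: go back to the origin
        self.cache[path] = (content, now)
        return content, "origin-fetch"

origin = {"/logo.png": b"png-bytes"}
cdn = EdgeCache(origin)
print(cdn.get("/logo.png", now=0))       # first request: "origin-fetch"
print(cdn.get("/logo.png", now=100))     # cached: "edge-hit"
print(cdn.get("/logo.png", now=90000))   # past 24h TTL: "origin-fetch" again
```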

Caching Problems

Once a file is cached in the CloudFront network servers, it does not get replaced until it expires and is automatically removed (after 24 hours of inactivity by default). This can be a major pain if you're trying to push updates out immediately. To get around this you'll need to version your files. For example, "my-stylesheet.css" could be "my-stylesheet-v1.0.css". Then when you make an update that needs to go out immediately you would change the name to "my-stylesheet-v1.1.css" or something similar.
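The versioning convention above is easy to automate. A minimal sketch (the helper name is my own) that inserts a version tag before the extension, so bumping the version forces CloudFront to fetch the new file instead of serving a stale cached copy:

```python
import os

def versioned(filename, version):
    """Insert a version tag before the extension, e.g.
    "my-stylesheet.css" + "1.1" -> "my-stylesheet-v1.1.css"."""
    root, ext = os.path.splitext(filename)
    return "%s-v%s%s" % (root, version, ext)

print(versioned("my-stylesheet.css", "1.0"))  # my-stylesheet-v1.0.css
print(versioned("my-stylesheet.css", "1.1"))  # my-stylesheet-v1.1.css
```

Reference the versioned name in your HTML, and each update gets a fresh URL that has never been cached.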

Performance Testing

Our content is uploaded to our S3 bucket, our CloudFront distribution is deployed and our custom subdomains are set up for easy access. It's time to put it all to the test and see what kind of performance benefits we can expect.

I've set up 44 example images ranging in size from approximately 2KB up to 45KB. You might be thinking that this is more images than most websites load on a single page. That may be true, but many websites such as portfolios, eCommerce sites and blogs load just as many, and possibly more.

[Image: Test images]

Although I'm only using images for this example, what matters for the comparison is the file size and quantity. Today's websites load several JavaScript, CSS, HTML and image files on every page. 44 file requests is probably fewer than most websites actually make, so a CDN could have an even greater impact on your website than we'll see in this comparison.

I'm using Safari's Web Inspector to view performance results. I've disabled caches and refreshed (shift + refresh) 10-15 times (about every 2-3 seconds) for each test to get a decent average of total load time, latency and duration.

  • 45 Total files (including HTML document)
  • 561.13KB Total combined file size

Regular Web Host

Here are the performance results when hosted via my regular web host. Sorted by latency.

[Image: Web host test results, sorted by latency]
  • 1.82-1.95 Seconds total load time
  • 90ms Fastest latency (last test)
  • 161ms Slowest latency (last test)
  • ~65% of the images had a latency of less than 110ms

Sorted by duration.

[Image: Web host test results, sorted by duration]
  • 92ms Fastest duration (last test)
  • 396ms Slowest duration (last test)

Amazon S3

The exact same files were used for testing S3. Sorted by latency.

[Image: S3 test results, sorted by latency]
  • 1.3-1.6 Seconds total load time
  • 55ms Fastest latency (last test)
  • 135ms Slowest latency (last test)
  • ~90% of the images had a latency of less than 100ms

Sorted by duration.

[Image: S3 test results, sorted by duration]
  • 56ms Fastest duration (last test)
  • 279ms Slowest duration (last test)

S3 is faster than my regular web host but only marginally. If you didn't feel like messing around with a CDN, S3 is still a great option to give your website a decent speed boost. I still recommend using a CDN though and we'll see why in this next test.

Amazon CloudFront

The exact same files were used for testing CloudFront. Sorted by latency.

[Image: CloudFront test results, sorted by latency]
  • 750-850ms Total load time
  • 25ms Fastest latency (last test)
  • 112ms Slowest latency (last test)
  • ~85% of the images had a latency of less than 55ms.
  • Only one file had a latency of more than 100ms.

Sorted by duration.

[Image: CloudFront test results, sorted by duration]
  • 38ms Fastest duration (last test)
  • 183ms Slowest duration (last test)

Comparison

Here's a quick breakdown of the performance comparison between my regular web host and the same files on Amazon's CloudFront service.

  • 1.82-1.95 seconds vs 0.75-0.85 seconds total load time (~1.1 seconds faster)
  • 90ms vs 25ms fastest latency (65ms faster)
  • 161ms vs 112ms slowest latency (49ms faster)
  • CloudFront: Only one file with latency greater than 100ms and 85% of the files with latency less than 55ms
  • Regular Web Host: Only 65% of the files had a latency of less than 110ms

Duration comparison

  • 92ms vs 38ms Fastest duration (54ms faster)
  • 396ms vs 183ms Slowest duration (213ms faster)

A saving of 50ms or even 100ms (0.1 seconds) doesn't sound like a very long time to wait, but repeat that for 30, 40, 50 or more files and you can see how it quickly adds up to whole seconds.
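The back-of-the-envelope arithmetic looks like this. Note the caveat in the comments: browsers download several files in parallel, so the real-world saving lands somewhere between this worst case and the measured ~1.1 seconds above.

```python
# A 50ms saving per request doesn't sound like much, but across many
# files it adds up. This assumes strictly sequential requests; real
# browsers download several files in parallel, so the actual saving
# is smaller than this upper bound.
files = 44                 # the number of images in the test above
saving_per_file_ms = 50    # roughly the per-file latency improvement seen

total_saving_ms = files * saving_per_file_ms
print(total_saving_ms / 1000, "seconds")  # 2.2 seconds
```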

Visual Comparison

Here's a quick video to show just how noticeable the difference in load time is. I've disabled caches and do a forced refresh (shift + refresh) to make sure images aren't being cached.

Other Ways To Increase Performance

There are several other ways to increase website performance when using a CDN.

  • Create different subdomains for different types of files to maximize parallel downloads. For example, load images from "images.jremick.com" and other files like scripts and CSS from "cdn.jremick.com". This will allow more files to load in parallel reducing the total load time.
  • Gzip files like JavaScript and CSS
  • Configure ETags

See Best Practices for Speeding Up Your Web Site for more.
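The first tip above (splitting assets across subdomains) works best when each file is always served from the same subdomain, so its cached copy stays valid. A minimal sketch of one way to do that, hashing each path to pick a host deterministically (the function name is mine; the subdomains are the ones used in this tutorial):

```python
import hashlib

SUBDOMAINS = ["cdn.jremick.com", "images.jremick.com"]

def shard_url(path):
    """Deterministically assign each asset to one subdomain so the
    browser can download from both hosts in parallel, while any given
    file always resolves to the same URL (keeping caches effective)."""
    digest = int(hashlib.md5(path.encode("utf-8")).hexdigest(), 16)
    host = SUBDOMAINS[digest % len(SUBDOMAINS)]
    return "http://%s%s" % (host, path)

print(shard_url("/img/photo-01.jpg"))
print(shard_url("/css/style.css"))
```

Use `shard_url()` wherever your templates emit asset URLs, and the browser's per-host connection limit effectively doubles.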

Serving Gzipped Files From CloudFront

One of the options above for increasing performance even more was providing gzipped files. Unfortunately CloudFront isn't able to automatically determine if a visitor can accept gzipped files or not and serve up the correct one. Fortunately, all modern browsers support gzipped files these days.

Create & Upload Your Gzipped Files

To serve gzipped files from CloudFront, we can give our website some logic to serve up the right files, or we can set the Content-Encoding and Content-Type on a few specific files to keep things a little simpler. Gzip the files you want and rename them so the name doesn't end in ".gz". For example, "filename.css.gz" would become "filename.css", or, to remind yourself that it's a gzipped file, "filename.gz.css". Now upload the gzipped file to the location you want in your S3 bucket (don't forget to set the ACL/Permissions).
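If you'd rather script the compress-and-rename step than do it by hand, here's a minimal sketch using Python's standard gzip module (the helper name and file names are illustrative):

```python
import gzip
import os
import shutil
import tempfile

def gzip_for_s3(src, dest):
    """Gzip-compress src and write it to dest. Following the naming
    convention above, dest keeps the original extension (e.g.
    "filename.gz.css") so browsers still request a .css file."""
    with open(src, "rb") as f_in, gzip.open(dest, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

# Example: compress a stylesheet before uploading it to the bucket.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "filename.css")
dest = os.path.join(tmp, "filename.gz.css")
with open(src, "w") as f:
    f.write("body { margin: 0; }\n" * 100)  # a repetitive, compressible file
gzip_for_s3(src, dest)
print(os.path.getsize(dest) < os.path.getsize(src))  # True: gzipped copy is smaller
```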

If you're not sure how to gzip a file, see http://www.gzip.org (OS X can do this in terminal)

Set Content-Encoding and Content-Type

We need to set the Content-Encoding and Content-Type (if it isn't already set) on our files so that when a browser requests them, it knows the content is gzipped and can decompress it properly. Otherwise it will look like this.

[Image: Browser output without Content-Encoding]

We can do this easily with Bucket Explorer. Once you've downloaded it, enter your AWS Access Key and Secret key to log into your S3 account. Find the gzipped file you uploaded earlier, right click and select "Update MetaData".

[Image: Update MetaData in Bucket Explorer]

As you can see, it already has the Content-Type set to text/css, so we don't need to set that (for JavaScript it would be text/javascript). We just need to add the right Content-Encoding. Click "Add" and, in the popup dialog, enter "Content-Encoding" in the Key field and "gzip" in the Value field. Click OK, then Save, and you're done! Now the browser will handle the file correctly.
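If you upload many files, the metadata rules above are easy to express as a small lookup. This is a hypothetical helper of my own, using the Content-Type values mentioned above and the "filename.gz.css" naming convention to flag gzipped uploads:

```python
def s3_headers(filename):
    """Suggest Content-Type and Content-Encoding metadata for a file,
    using the ".gz." naming convention above to detect gzipped uploads.
    (Illustrative helper; the mapping covers only the types discussed.)"""
    types = {".css": "text/css", ".js": "text/javascript"}
    headers = {}
    for ext, ctype in types.items():
        if filename.endswith(ext):
            headers["Content-Type"] = ctype
    if ".gz." in filename:
        headers["Content-Encoding"] = "gzip"
    return headers

print(s3_headers("filename.gz.css"))
# {'Content-Type': 'text/css', 'Content-Encoding': 'gzip'}
print(s3_headers("app.js"))
# {'Content-Type': 'text/javascript'}
```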

[Image: Saving gzip metadata]

Gzipping a file can greatly reduce its size. For example, this test stylesheet was around 22KB and was reduced to approximately 5KB. For my blog I've combined all my jQuery plugins with jQuery UI Tabs; after minification the file was 26.49KB, and after gzipping it was 8.17KB.

Conclusion

There are a lot of ways to increase the performance of your website and in my opinion they're worth trying. If visitors are only 0.5 seconds or even 1 second away from leaving your website, a CDN could keep that from happening. Plus, most of us are speed freaks anyway so why not crank up the performance of your website if you can? Especially if it could save you money in the process.

If you have any questions, please let me know in the comments and I'll try to respond to them. Thanks!

