We have an application where I work that is used extensively for business reporting. The audience for these reports includes all of the top management of the company, including the CEO. The problem we face is that its performance is slow for some of the most important users, and they are very vocal about that problem. A few of us have been working on how to speed up the user experience, but where do we look first? In this article I will tackle one of the simple solutions we found and how we proved to ourselves that it really did make things faster.
This particular application is all web, with no client-side installation necessary. The fact that it is a reporting tool is not nearly as important as the fact that it is a full web application, with all the benefits and downsides that go with it. Another important fact is that the users are most often over 2000 miles away in another office, and their experience is much worse than that of the users where the app is hosted. Those remote users, however, are very important customers and include people who report to the CEO of the company. Suffice it to say that when they complain, it is heard loud and clear.
So where does one begin in the effort to increase the speed of a web app? In our situation the network stood out as a primary candidate, since the speed was better in the main office than in the remote office. Our network folks did their own analysis and decided to recommend an upgrade from 45 megabits to 1000 megabits (1 gigabit), and we all thought this would be the answer to the problem. This represented a speed increase of over 22 times our old connection and surely would fix the problem, but as it turned out it did not. The measurements we had been taking every day showed that the speed remained slower for the remote users by a factor of 4-5, so if a user in the main office saw response times of 4 seconds, the remote users would see 16-20 seconds for the same page. So we had to start looking elsewhere.
During this same time frame some other colleagues turned me on to the recommendations written up by Yahoo that they call "Best Practices for Speeding Up Your Web Site". While digging into that information I learned that a surprisingly large share of response-time issues in web apps are due to client-side issues, not to the deeper tiers of the application server and database. They go on to list the changes one can make to improve speed and even provide some ways of measuring how bad the problems might be. Armed with this new (to me) information, off I went to analyze and make recommendations on how to improve performance.
The first tool I tried was the YSlow add-on, which works with Firebug and Firefox. This tool gives grades for pages based on the Yahoo recommendations, just like those grades we got in school: A is the best and F is failing. In our case we got a lot of D's and F's because of gaps in our knowledge when the servers were set up. Some of the recommendations were not possible for us to follow because it is not a public application and we purchased it from a well-known vendor, so any changes to the internals of the app were beyond our capability. However, based on some of the failing grades we could make some simple setting changes on the web server and possibly get a speed increase. The problem is: how do we know when things have gotten faster, and how much faster?
The first rule of experimentation is to limit yourself to one change at a time in order to observe the effect of that change. It is tempting to make a whole bunch of changes at once, but doing this makes it next to impossible to tell which changes contributed positively to the speed and which ones negatively. This led to a simple approach: change just one setting on the web server and then measure the difference. We chose to start with one of our grades of F, in the area of adding Expires headers to the content. Yahoo in this case recommends what they call "far future" Expires headers for static content. In other words, content that does not change very often, like pictures, icons, CSS, JavaScript, etc., should get an expiration date that is a long way out, like 10 years. Just to see how this might affect things, we made the change on the development environment and ran some tests. The procedure seemed simple enough:
- Configure the web server to add Expires headers to the static content
- Clear the browser cache completely to simulate a new user on the site
- Using YSlow to measure the speed and give the grades, load the page the first time and observe the speed
- Navigate to a different page
- Navigate back to the page in question and observe the speed again.
- The difference in speed should be the performance improvement.
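Since the app came from a vendor, I do not show our actual server configuration here; but to make the first step concrete, here is a minimal sketch of what "add Expires headers" looks like if the web server happens to be Apache httpd with mod_expires available (IIS and other servers expose the same idea through different settings):

```apache
# Hypothetical sketch: far-future Expires headers for static content
# on Apache httpd. Requires mod_expires to be loaded.
<IfModule mod_expires.c>
    ExpiresActive On
    # Static assets that rarely change get a long cache lifetime
    ExpiresByType image/png              "access plus 90 days"
    ExpiresByType image/gif              "access plus 90 days"
    ExpiresByType text/css               "access plus 90 days"
    ExpiresByType application/javascript "access plus 90 days"
    # Anything without a rule keeps the server default (no Expires)
</IfModule>
```

The content types and the 90-day lifetime are illustrative; the point is that the change is a handful of configuration lines, not application code.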
The problems arose quickly when I found out that most of the users of this particular web app use Internet Explorer, and many did not know Firefox even existed. I then had to switch my experimentation to another set of tools that work with IE. Since I am focused on free and/or cheap tools, I ruled out some whose cost I could not justify. One that stood out was a tool called dynaTrace AJAX Edition.
HttpWatch also gets an honorable mention, but the full version costs money and the free version is a little too limited for most uses. dynaTrace AJAX Edition also provides grades and recommendations, but adds much more detail beyond YSlow and HttpWatch.
I decided to modify my experiment because we have two environments to test on, QA and DEV, so we could use Expires headers on one environment and leave them off on the other. This would give us side-by-side comparisons without having to make changes over and over again. The second change I made was to define a set of test steps to follow so that we could repeat the same click-stream on each environment, again keeping everything as constant as possible. Another change to my procedure was to perform the steps at least once on each environment beforehand to make sure the application server was primed and ready, since the first use of the app after a restart is very slow as objects get loaded into memory on the server; all subsequent requests are faster (a subject for another article). The new procedure looks more like this:
- Set the DEV environment to use Expires headers with an expiration of 90 days.
- Set the QA environment the same as PROD, using no Expires headers
- Restart the servers if necessary
- Run through the click-stream script once to wake up the servers, either manually or with an automated script in Selenium or another automation tool (a subject for another article)
- Clear the cache on the browser.
- Run through the click-stream script and let dynaTrace AJAX Edition take its measurements.
- Repeat the click-stream script and take a second set of measurements. Optionally close the browser and re-open it, just to make sure it is using the permanent cache, not just memory.
This modified procedure should produce a long response time right after the cache is cleared, because the browser needs to download all of the components. The second run through the click-stream script should produce noticeably faster times than the first, because the cache is primed with non-expiring content and the browser only needs to load the components that do not have Expires headers. The difference between the two speeds shows how much of an improvement the Expires headers made. Here is a great article on how the browser cache works: "Best Practices on Browser Caching".
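A quick way to sanity-check what the server is actually sending is to look at the response headers themselves. Here is a small Python sketch (the header values are made-up examples, not our app's real responses) that computes how long a browser may treat a response as fresh, based on the Cache-Control and Expires headers:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def freshness_seconds(headers, now):
    """How many seconds a cached copy stays fresh, based on
    Cache-Control: max-age (which takes precedence) or Expires."""
    for directive in headers.get("Cache-Control", "").split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            return int(directive.split("=", 1)[1])
    if "Expires" in headers:
        expires = parsedate_to_datetime(headers["Expires"])
        return int((expires - now).total_seconds())
    return 0  # no caching headers: the browser must re-request

# Made-up example: a 90-day Expires header, like our DEV setup
now = datetime(2011, 1, 1, tzinfo=timezone.utc)
demo = freshness_seconds({"Expires": "Fri, 01 Apr 2011 00:00:00 GMT"}, now)
print(demo)  # 7776000, i.e. 90 days in seconds
```

A result of 0 for a static image or CSS file is exactly the grade-F situation YSlow flags: every visit forces the browser back to the server for that component.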
The results of one of my experiments showed a marked improvement in page times. The tool breaks out a few measurements: First Impression time, Onload time, and Fully Loaded time, and in each of those I saw an improvement:
| | Expires Disabled | Expires Enabled | Delta |
|---|---|---|---|
| Rank | F (0) | A (100) | upgraded from F to A |
| First Impression time (ms) | 2826 | 839 | -1.987 seconds, or 337% faster |
| Onload time (ms) | 3535 | 1715 | -1.820 seconds, or 206% faster |
| Fully Loaded time (ms) | 3746 | 1930 | -1.816 seconds, or 194% faster |
| Remarks | 69 requests, 63 uncached | 70 requests, 63 cached | |
Comparing the results from the two configurations, we can see that the user experience is greatly improved by having a primed cache. In practice a user would see slow response times with an empty or expired cache, then fast ones after their cache was loaded with the static content. Expiration time frames as long as 10 years may not work well with every app, which is why we decided to use 90 days in our experiment.
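The "percent faster" figures in the table are simply the ratio of the two timings. A few lines of Python reproduce the deltas from the raw millisecond values (the numbers are taken from the table above; the helper function itself is just my illustration):

```python
def speedup(before_ms, after_ms):
    """Delta in seconds and 'percent faster', computed as the
    ratio of the before and after timings."""
    delta_s = (after_ms - before_ms) / 1000.0
    pct_faster = round(before_ms / after_ms * 100)
    return delta_s, pct_faster

# Raw dynaTrace timings in ms: Expires disabled vs. enabled
print(speedup(2826, 839))   # First Impression: (-1.987, 337)
print(speedup(3535, 1715))  # Onload:           (-1.82, 206)
print(speedup(3746, 1930))  # Fully Loaded:     (-1.816, 194)
```

This is worth doing once by hand: it confirms the tool's percentages line up with the raw timings before you quote them to anyone.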
dynaTrace results without Expires headers (screenshot)
dynaTrace results with Expires headers (screenshot)
Conclusion:
Setting content to use Expires headers can measurably increase the performance of web pages. This increase is limited to return visitors to a page or web site and does not help new visitors as much. In this particular case, a web site that is internal to a company and has nothing but return visitors, utilizing cache-control headers showed a marked increase. There are many other steps that can be taken to improve web site performance, but this one represents an easy setting change because it does not require any deep dives into the application code and logic. To know for sure whether this setting helps or hinders performance, take measurements that reveal the actual response times and compare them with and without the setting enabled.