Chapter 10. Website Optimization Metrics
Without quantifiable metrics, website optimization (WSO) is a guessing game. But with hundreds of billions of e-commerce dollars at stake, most companies cannot afford to guess.
With web metrics, you can progressively improve your search engine marketing (SEM) campaigns, conversion rates, and website performance. Running controlled experiments and acting on the results yields more profit, happier customers, and a higher return on investment (ROI). The folks at Amazon.com have a saying that nicely sums up the gist of this chapter: “data trumps intuition.”
Website owners, however, are awash in a sea of data, and the variety of metrics available to analyze can be overwhelming. You can use web analytics software such as WebTrends to analyze server log data and produce standardized reports. But how do you choose the best metrics to measure website success? How do you best run controlled experiments such as A/B split tests, multivariate tests, and parallel flights? What is the best Overall Evaluation Criterion (OEC) for your particular goal?
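At its core, an A/B split test compares the conversion rates of two page variations and asks whether the difference is larger than chance alone would produce. A minimal sketch, using hypothetical traffic numbers and a standard two-proportion z-test (not any particular vendor's implementation):

```javascript
// Compare conversion rates of variations A and B with a two-proportion z-test.
// Inputs: conversions and visits for each variation (hypothetical numbers below).
function zTest(convA, visitsA, convB, visitsB) {
  const pA = convA / visitsA;                // conversion rate, variation A
  const pB = convB / visitsB;                // conversion rate, variation B
  const pPool = (convA + convB) / (visitsA + visitsB); // pooled rate under H0
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / visitsA + 1 / visitsB));
  const z = (pB - pA) / se;                  // standard score of the difference
  const p = 2 * (1 - normalCdf(Math.abs(z))); // two-tailed p-value
  return { z: z, p: p, significant: p < 0.05 };
}

function normalCdf(x) {
  // Phi(x) = (1 + erf(x / sqrt(2))) / 2
  return 0.5 * (1 + erf(x / Math.SQRT2));
}

function erf(x) {
  // Abramowitz & Stegun approximation 7.1.26 (max error ~1.5e-7)
  const sign = x < 0 ? -1 : 1;
  x = Math.abs(x);
  const t = 1 / (1 + 0.3275911 * x);
  const y = 1 - (((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
      - 0.284496736) * t + 0.254829592) * t) * Math.exp(-x * x);
  return sign * y;
}

// Variation B converts 120/2000 visits vs. A's 90/2000:
console.log(zTest(90, 2000, 120, 2000)); // p < 0.05, so B's lift is significant
```

Note that a significant result is only trustworthy with adequate sample size; running the test until it "looks significant" inflates the false-positive rate.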
This chapter boils down this statistical tsunami to highlight the most effective metrics and techniques for optimizing your website. We present the most effective metrics for each subject area (SEM and performance), show selected metrics in action, and survey the best tools you can use to measure and tweak websites. Let the optimization begin!
What follows is an outline of this chapter:
- Website Optimization Metrics
- Website Success Metrics
- Figure 10-1. Google Analytics dashboard showing site usage trends
- Popular Web Metrics
- Measuring SEM success
- Types of Web Analytics Software
- Web Server Log Analysis
- JavaScript Page Tagging
- Code: JavaScript page tagging
- Multivariate testing with Google Website Optimizer
- Hybrid Analytics Systems
- User Experience Testing Software
- Search Engine Marketing Metrics
- Search Marketing Strategy
- Optimal paths
- Classes of Metrics
- Volume
- Content
- Objectives
- Means
- Volume Metrics
- Page views
- Visits or sessions
- Unique visitors
- New visitors
- Repeat visitors
- Instances
- Content Metrics – Measuring Each Component
- Entries
- Single-access visits
- Bounce rate (and simple engagement)
- Revenue per visit(or)
- Page attrition
- PathWeight and ProxyScoring
- Primary content consumption
- PathLoss
- Exit rate (or page exit ratio)
- Objectives
- Understanding objectives
- Ad clicks
- Goal pages
- Comments
- Orders
- Signups
- Cart additions
- Conversion
- Measuring the Means
- CPC – Cost per click
- CTR – Clickthrough rate
- ROAS – Return on ad spend
- ROI – Return on investment
- Success Metrics = Reaching Goals
- Search Marketing Strategy
- Web Performance Metrics
- Keeping Score
- Excel: Scorecard.xls – Create your own performance scorecard
- Speed checklist
- Request statistics
- Load times
- Scorecard tips
- Designing a Sample Test
- Find your audience
- Clear cache and cookies
- Flush DNS
- Simulate connection speeds
- It’s Measuring Time
- Under the hood – waterfall reports
- Firebug: A simple alternative
- Figure 10-18. Firebug output display of Digg.com, first view
- Figure 10-19. Firebug JavaScript profiler
- What about keeping score?
- AOL Pagetest
- Figure 10-20. AOL Pagetest waterfall report, Digg.com cached view
- Inside an HTTP request
- Special Responses
- Summarizing Load Time and Request Statistics
- Speed Up Your Site
- Figure 10-23. AOL Pagetest Optimization Report
- Enhance Firebug with YSlow
- Reporting the Numbers
- Figure 10-26. Digg.com performance scorecard
- A movie is worth a thousand scorecards
- Start render
- Useful content display
- Graphics loaded
- Ads loaded
- Commercial Monitoring Tools
- Overlooked Web Performance Issues
- Keeping Score
- Summary
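The "means" metrics in the outline above (CPC, CTR, ROAS, and ROI) reduce to simple arithmetic once you have campaign totals. A sketch with hypothetical campaign numbers:

```javascript
// Hypothetical totals for one paid search campaign:
const impressions = 50000; // times the ad was shown
const clicks = 1000;       // times the ad was clicked
const adSpend = 500;       // ad cost, in dollars
const revenue = 2000;      // revenue attributed to the campaign, in dollars
const costOfGoods = 800;   // cost of the goods sold, in dollars

const ctr = clicks / impressions;   // clickthrough rate: 0.02 (2%)
const cpc = adSpend / clicks;       // cost per click: $0.50
const roas = revenue / adSpend;     // return on ad spend: 4.0 ($4 per $1 spent)
const profit = revenue - costOfGoods - adSpend;
const roi = profit / adSpend;       // return on investment: 1.4 (140%)

console.log({ ctr: ctr, cpc: cpc, roas: roas, roi: roi });
```

Note the difference between ROAS and ROI: ROAS divides gross revenue by ad spend, while ROI divides net profit (after subtracting cost of goods and ad spend) by the investment, so a campaign can have a healthy ROAS yet a negative ROI.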
Footnotes
- Eisenberg, B. November 26, 2007. “Future Now’s 2007 Retail Customer Experience Study.” Future Now, http://www.grokdotcom.com/2007/11/26/cyber-monday-future-nows-2007-retail-customer-experience-study/ (accessed February 21, 2008). Forrester Research projects that U.S. online retail sales will grow to $316 billion by 2010.
- Kohavi, R. et al. 2007. “Practical Guide to Controlled Experiments on the Web: Listen to Your Customers, not to the HiPPO.” In KDD 2007 (San Jose, CA: August 12-15, 2007), 959-967. Don’t listen to the Highest Paid Person’s Opinion (HiPPO), but rather pay attention to experimental data. The researchers stress the importance of statistical power and sample size.
- Kohavi, R., and M. Round. 2004. “Front Line Internet Analytics at Amazon.com.” In eMetrics Summit 2004 (Santa Barbara, CA: June 2-4, 2004), http://ai.stanford.edu/~ronnyk/emetricsAmazon.pdf (accessed February 21, 2008).
- As of May 2008, there were 2,199 open jobs for “web analytics” at http://www.simplyhired.com/a/jobs/list/q-%22web+analytics%22.
- Eisenberg, B. et al. 2006. Call to Action: Secret Formulas to Improve Online Results (Nashville, TN: Thomas Nelson), 218.
- Kohavi, R. 2005. “Focus the Mining Beacon: Lessons and Challenges from the World of E-Commerce.” In PKDD 2005 (Porto, Portugal: October 3-7, 2005). Invited keynote.
- Kohavi, R., and R. Longbotham. 2007. “Online Experiments: Lessons Learned.” Computer 40 (9): 103-105. This is an Amazon statistic taken from a presentation by Greg Linden at Stanford: http://home.blarg.net/~glinden/StanfordDataMining.2006-11-29.ppt, November 29, 2006.
- Riley, E., I. Mitskaviets, and D. Card. 2007. “Optimization: Maximizing ROI Through Cross-Tactic Optimization.” JupiterResearch, http://www.jupiterresearch.com (accessed February 12, 2008).
- SEMPO. December 2006. “Search Engine Marketing Professional Organization survey of SEM agencies and advertisers, December 2006. Global Results.” SEMPO, http://www.sempo.org (accessed February 12, 2008).
- Atterer, R. et al. 2006. “Knowing the user’s every move: User activity tracking for website usability evaluation and implicit interaction.” In WWW 2006 (Edinburgh, Scotland: May 23-26, 2006), 203-212.
- Burby, J., and A. Brown. August 16, 2007. “Web Analytics Definitions.” Web Analytics Association, http://www.webanalyticsassociation.org (accessed February 5, 2008).
- Shields, D. December 15, 2007. “Definitions of Web Analytics Metrics by Classification of Function.” Wicked Business Sciences, http://wickedsciences.com/research/white-papers/5163,0512,1215,2007.pdf (accessed February 4, 2008).
- PathLoss is a metric developed by Paul Holstein of CableOrganizer.com.
- Roast, C. 1998. “Designing for Delay in Interactive Information Retrieval.” Interacting with Computers 10 (1): 87-104.
- Balashov, K., and A. King. 2003. “Compressing the Web.” In Speed Up Your Site: Web Site Optimization. Indianapolis: New Riders, 412. A test of 25 popular sites found that HTTP gzip compression saved 75% on average off text file sizes and 37% overall.
- Bent, L. et al. 2004. “Characterization of a large web site population with implications for content delivery.” In WWW2004 (New York: May 17-20, 2004), 522-533. In a 2004 trace, 47% of requests used cookies; 34% of all requests were for cookied images; and 73% of all cookied requests were for images, showing that judicious use of cookies would cut their use by half and enable caching.
- Theurer, T. January 4, 2007. “Performance Research, Part 2: Browser Cache Usage–Exposed!” Yahoo! User Interface Blog, http://yuiblog.com/blog/2007/01/04/performance-research-part-2/ (accessed February 22, 2008).
- Bent, L., and G. Voelker. 2002. “Whole Page Performance.” In WCW 2002 (Boulder, CO: August 14-16, 2002), 11. The average DNS lookup in the United States takes about 7.1 milliseconds.
- Cardwell, N. et al. 2000. “Modeling TCP Latency.” In INFOCOM 2000 (Tel Aviv, Israel: March 26-30, 2000): 1742-1751. Found that 70 ms is a reasonable round-trip time (RTT) for web objects.
- Habib, M. A., and M. Abrams. 2000. “Analysis of Sources of Latency in Downloading Web Pages.” In WebNet 2000 (San Antonio, TX: October 30-November 4, 2000), 227-232. Round-trip times range from 20 to 90 ms across the United States. Overseas RTT ranged from 140 to 750 ms for a satellite link to Bangladesh. About 40% to 60% of total web page latency is from the initial request to receiving the first byte, due mainly to overhead, not server delay.
- Touch, J. et al. December 1998. “Analysis of HTTP Performance.” USC/ISI Research Report 98-463.
- Nielsen, J. 2007. “Response Times: The Three Important Limits.” Useit.com, http://www.useit.com/papers/responsetime.html (accessed January 18, 2008).
- A service level agreement (SLA) is a formally negotiated agreement between two parties that records a common understanding of the level of service.