Why Measuring the Real Mobile Web User Experience Is So Valuable to Your Bottom Line

Quality of Experience (QoE) is the ultimate barometer of whether or not an IT service is “successful” from a customer/consumer perspective. New application environments such as mobile require new types of metrics to assess not just response time and availability, but effective navigation and interaction with the application and application-supported processes at hand. (source: EMA Research’s Advisory Note: An Adopter’s Guide to User Experience Management – How to Pick the Right QoE Solution for You)

The PC industry is clearly in transition. We are entering the “post-desktop” era, in which mobile devices will dominate. Technologies that were designed to measure desktop Web site performance will become increasingly irrelevant in this mobile era. Tools considered adequate for the desktop user experience are not precise enough, and lack context, when it comes to accurately assessing the response time and availability of mobile services.

If the desktop era was defined by Quality of Service, the mobile era will be defined by Quality of Experience. A simple slide illustrates the stark differences:

[Slide: Quality of Service (desktop era) vs. Quality of Experience (mobile era)]

We’ve identified six critical “real-time” elements to consider when measuring the quality of the consumer’s experience (QoE):

  1. The Carrier Network
  2. The user’s current location
  3. The OS and Browser
  4. The Device’s capabilities
  5. The need for a custom timing framework to further refine the experience (see the sketch below this list)
  6. The ability to personalize the Web page for the consumer and then measure its effectiveness
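
Item 5, the custom timing framework, is easiest to see in code. Below is a minimal sketch of what such a framework might capture in the browser, assuming the Navigation Timing API plus, where available, the Network Information and Geolocation APIs. The QoeSample shape and the /qoe-beacon endpoint are illustrative, not part of any particular product.

```typescript
// Minimal sketch of a custom QoE timing beacon (illustrative only).
interface QoeSample {
  url: string;
  userAgent: string;        // OS + browser (items 3 and 4)
  connectionType?: string;  // carrier/Wi-Fi hint, where supported (item 1)
  latitude?: number;        // user's current location, with consent (item 2)
  longitude?: number;
  pageLoadMs: number;       // navigationStart -> loadEventEnd
  timeToFirstByteMs: number;
}

function collectQoeSample(): QoeSample {
  const t = performance.timing; // Navigation Timing entries
  const connection = (navigator as any).connection; // not available in every browser
  return {
    url: location.href,
    userAgent: navigator.userAgent,
    connectionType: connection ? connection.effectiveType : undefined,
    pageLoadMs: t.loadEventEnd - t.navigationStart,
    timeToFirstByteMs: t.responseStart - t.navigationStart,
  };
}

// Send the sample after the page has fully loaded; location is only attached
// if the user grants permission.
window.addEventListener("load", () => {
  // loadEventEnd is only populated after the load handler returns,
  // so defer the measurement by one tick.
  setTimeout(() => {
    const sample = collectQoeSample();
    navigator.geolocation?.getCurrentPosition(
      (pos) => {
        sample.latitude = pos.coords.latitude;
        sample.longitude = pos.coords.longitude;
        navigator.sendBeacon("/qoe-beacon", JSON.stringify(sample));
      },
      () => navigator.sendBeacon("/qoe-beacon", JSON.stringify(sample))
    );
  }, 0);
});
```

Beaconing the sample from the real device is what separates this from a simulated desktop test: the carrier network, location, OS, browser, and device capabilities are captured as they actually are.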

Here’s a sample test that illustrates the first four items above.

Methodology:

The test URL was http://m.cnn.com. We first used Blaze.io to run a performance test from an iPhone based in Ottawa, Canada. Next, we tested with Firefox on an iMac configured to send an iPhone User Agent (telling CNN that a mobile browser is making the request), using the Network Link Conditioner utility (part of Xcode on the Mac) to simulate both a good 3G network and one with 5% packet loss on the download. Finally, we ran two tests on an actual Android device in Castle Rock, CO: one on Sprint’s 3G network and one connected to the Internet via Wi-Fi.
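
For reference, here is a rough sketch of the “desktop pretending to be mobile” part of the test: send an iPhone User Agent so the server returns its mobile page, then time the response. It assumes Node 18+ for the built-in fetch, it only times the base HTML document (not the 20+ sub-requests counted in the table), and the User-Agent string is an example rather than the exact one used in the original test.

```typescript
// Fetch a page while claiming to be an iPhone, and time the round trip.
async function timeMobileFetch(url: string): Promise<void> {
  const iphoneUA =
    "Mozilla/5.0 (iPhone; CPU iPhone OS 5_0 like Mac OS X) " +
    "AppleWebKit/534.46 (KHTML, like Gecko) Version/5.1 Mobile/9A334 Safari/7534.48.3";

  const start = performance.now();
  const response = await fetch(url, { headers: { "User-Agent": iphoneUA } });
  const body = await response.text();
  const elapsedMs = performance.now() - start;

  console.log(`${url}: HTTP ${response.status}, ${body.length} bytes, ${elapsedMs.toFixed(0)} ms`);
}

timeMobileFetch("http://m.cnn.com").catch(console.error);
```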

It’s clear from the table below that, in this instance, “simulating” the network, browser, and OS all led to under-reporting the actual user experience (other examples may overstate performance).

Test URL: http://m.cnn.com

| Test | # of Requests | Size (KB) | Time (sec) | % Faster |
|---|---|---|---|---|
| Blaze.io – Ottawa, Canada (Eastern Time Zone) | 25 | 144 | 10 | – |
| Firefox Desktop (User Agent = iPhone, 3G avg., 5% download packet loss) | 23 | 67 | 10 | – |
| Firefox Desktop (User Agent = iPhone, 3G good, no packet loss) | 22 | 66 | 8.0 | 20% |
| Android HTC on Sprint EVDO_A (3P Mobile: Detail mode, Castle Rock, CO) | 26 | 148 | 4.0 | 60% |
| Android HTC on Sprint Wi-Fi (3P Mobile: Detail mode, Castle Rock, CO) | 23 | 121 | 2.72 | 73% |
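
A note on the “% Faster” column: it appears to be computed relative to the roughly 10-second result reported by the simulated baselines. That baseline is our reading of the first two rows, not something stated explicitly. A quick sketch of the arithmetic:

```typescript
// Percent improvement versus the assumed 10-second simulated baseline.
function percentFaster(baselineSec: number, measuredSec: number): number {
  return Math.round(((baselineSec - measuredSec) / baselineSec) * 100);
}

console.log(percentFaster(10, 8.0));  // 20 -> Firefox, good 3G
console.log(percentFaster(10, 4.0));  // 60 -> Android on Sprint 3G
console.log(percentFaster(10, 2.72)); // 73 -> Android on Wi-Fi
```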

The Bottom line…

The takeaway is that new metrics are required to measure the quality of the mobile user experience. The first three tests represent the old way of doing things, and the lack of precision is clear. Moving to a set of metrics that assesses not just response time and availability, but also effective navigation and interaction with the application and the application-supported processes at hand, is now imperative if an IT service is to be considered “successful” from a customer/consumer perspective.

Here’s who benefits from a B2B and B2C perspective, and, more importantly, how they benefit.

[Chart: who benefits, B2B and B2C, and how they benefit]

Posted in: #mobile, #webperf, #wpo, Performance, User Experience

