7. [Diagram: the transactional site model. Visitors arrive via organic search and ad campaigns on the search network and reach the advertiser site; they see an offer, move through purchase steps to conversion, and may be upsold or enrolled in mailings, alerts, and promotions; abandonment and disengagement are the negative paths. Legend: $ positive / $ negative impact on site.]
10. [Diagram: the collaboration site model. Visitors arrive via social networks, search results, and invitation links; they create content, which moderation sorts into good content and bad content (spam and trolls); good content drives engagement and viral spread through the social graph, while disengagement is the negative path. Legend: $ positive / $ negative impact on site.]
12. [Diagram: the SaaS site model. An enterprise subscriber pays for end users (employees) to use the SaaS site. Good performance, usability, and productivity lead to renewal, upsell, and reference; bad ones lead to helpdesk and support escalation costs, refunds, SLA violations, and churn. Legend: $ positive / $ negative impact on site.]
14. [Diagram: the media site model. A visitor arrives at the media site and enrols; an ad network serves targeted embedded ads; clicking one sends the visitor to the advertiser’s site, and departure is the negative path. Legend: $ positive / $ negative impact on site.]
19.–24. [Timeline build: 1970: the right hardware → 1980: the right application → 1990: the right integration → 2000: the right adoption, with client-server architectures, vendor dominance, and Web/SaaS/XML marking the transitions. Enterprise application adoption is the new frontier.]
26.–27. The Complete Web Monitoring framework:
“Hard” data: Analytics (what did they do on the site?), Usability (how did they interact with it?), Performability (could they do what they wanted to?)
“Soft” data: Community (what were they saying?), VoC (what were their motivations?), Competition (what are they up to?)
33.–36. [Funnel build. ATTENTION: new visitors, searches, tweets, mentions, ads seen — feeding growth and number of visits, with loss and bounce rate as leakage. ENGAGEMENT: pages per visit, time on site. CONVERSION: conversion rate × goal value.]
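The CONVERSION stage of this funnel is typically monetized as visits × conversion rate × goal value. A minimal sketch, with all numbers made up:

```python
def funnel_value(visits: int, conversion_rate: float, goal_value: float) -> float:
    """Revenue attributable to a funnel: visits that convert, times value per goal."""
    return visits * conversion_rate * goal_value

# Hypothetical figures: 10,000 visits, a 2% conversion rate, $45 per completed goal.
print(f"${funnel_value(10_000, 0.02, 45.0):,.2f}")  # → $9,000.00
```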
46. [The Complete Web Monitoring framework slide, repeated.]
48.–52. [2×2 matrix build, adapted from Gaver (1991). Horizontal axis: Affordance (was I supposed to interact with it?); vertical axis: Perceptual information (did I see it?). Quadrants: affordance present and perceived = seen (perceptible) affordance; no affordance but perceived = false affordance; affordance present but not perceived = unseen (hidden) affordance; neither = correct rejection.]
55. [The Complete Web Monitoring framework slide, repeated.]
62. [The Complete Web Monitoring framework slide, repeated.]
68.–70. Levels of community engagement:
Search: anonymous, but little insight into what’s going on behind closed doors
Join: permission-based access to activity (friends, forums)
Moderate: some administrative control, but you have to earn it
Run: complete control and visibility, but no guarantee anyone will show up
72.–73. [The Complete Web Monitoring framework slide, repeated.]
95. [Figure 3: Interactive user productivity versus computer response time for human-intensive interactions for system A. Vertical axis: 0–600; horizontal axis: computer response time (s), 0–5. Curves: interactive user productivity (IUP), human-intensive component of IUP, and measured data (human-intensive component). A. J. Thadhani, IBM Systems Journal, Volume 20, Number 4 (1981).]
128. Now we can ask
Does poor performance cause bad KPIs?
130. [Chart: impact of page load time on average daily searches per user, 0% down to −0.60%. Conditions: 50ms pre-header, 100ms pre-header, 200ms post-header, 200ms post-ad, 400ms post-header.]
132. [Chart: impact of additional delay (50, 200, 500, 1000, 2000 ms) on business metrics, 0% down to −5.00%: queries/visitor, query refinement, any clicks, satisfaction, revenue/visitor.]
133. Shopzilla had another angle
• Big, high-traffic site: 100M impressions a day, 8,000 searches a second, 20-29M unique visitors a month, 100M products
• 16-month re-engineering: page load from 6 seconds to 1.2, uptime from 99.65% to 99.97%, 10% of previous hardware needs
http://en.oreilly.com/velocity2009/public/schedule/detail/7709
141.–148. [Diagram build: VISITOR / ACCELERATOR / WEB SERVER swimlanes. The decision of whether to optimize is made per visitor; a segment marker is inserted into the normal content, and the page is served either accelerated or unaccelerated. The visitor receives the page, processes its scripts, and sends analytics to Google Analytics.]
149. [Bar chart: traffic levels (total number of visits) by visitor experience: optimized 8,505 vs. unoptimized 4,740.]
151. [Bar chart: % of visits marked “new” (no returning cookie) by visitor experience: optimized 10.85% vs. unoptimized 13.61%.]
152. [Bar chart: that means, splitting visits into returning and new: optimized 7,582 returning and 923 new; unoptimized 4,095 returning and 645 new.]
153. [Bar chart: average time on site (minutes) by visitor experience: optimized 30.17 vs. unoptimized 23.83.]
154. [Bar chart: pages per visit (average pages seen) by visitor experience: optimized 15.64 vs. unoptimized 11.04.]
155. [Bar chart: difference due to optimization: conversion rate +16.07%, order value +5.51%.]
156.–166. [Chart build. “This is just one case”: a histogram of number of visits vs. visitor latency (0–10,000), with the optimized group marked; different visitors experienced different performance levels. “With one outcome”: right now we have a single experiment and a single resulting business impact (21.58% better), and the optimized visitors fall into a range, the 5th to 95th percentile. “Lots of different results”: with several experiments (24%, 18%, 14%, 12%, 9.5%) plotted as $ per day vs. visitor latency, we can understand the relationship better. “You have your own curve”: every web business has a curve like this hidden inside it.]
167. [Chart: a week of site metrics, Mon–Sun. Left axis: count (logarithmic, 10 to 100,000) for visits, Twitter mentions, blog comments, conversions, and Facebook members; right axis: availability (96%–100% uptime).]
168. [The Complete Web Monitoring framework slide, repeated.]
172.–176. Some takeaways
Myopic silos of monitoring won’t last
Outcomes aren’t just e-commerce: revenue, adoption, productivity, contribution
Everything needs to tie back to end users and outcomes
Your fastest path to revenue may be performance improvement
Starting with the end user is a good way to detect and triage problems in complex environments
You don’t know what’s broken where until you look
In July, we released a book with O’Reilly called Complete Web Monitoring. It differs from other O’Reilly books in that it’s concentrated not on a programming language but on business outcomes.
Every business has a goal hidden inside it.
Amazon: what do they want you to do?
Maximize your shopping cart size
They’re a transactional site. They make money when people complete a process, usually involving a purchase or subscription.
But Amazon also wants you to leave reviews
And add something to a wishlist
These are forms of collaboration, where communities create content.
What about another kind of site? What does GMail want you to do?
GMail is first and foremost a SaaS site. It wants you to be productive, so you can get work done and keep using the system. A paid SaaS site is the same thing.
Of course, GMail is also another kind of site -- a media site. That’s an ad up there.
Media sites want you to click on targeted advertising.
Analytics is about measuring.
Here’s the simplest possible analytics model.
This is a “funnel” -- the usual way to visualize the conversion of web visitors to folks who do what you want them to.
For example, the percentage of people who come to a site but then leave right away is called the bounce rate.
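As a minimal sketch (the visit counts here are hypothetical), bounce rate is just single-page visits divided by total visits:

```python
def bounce_rate(single_page_visits: int, total_visits: int) -> float:
    """Share of visits that saw only one page before leaving."""
    if total_visits == 0:
        return 0.0  # no visits, no bounces
    return single_page_visits / total_visits

# Hypothetical: 4,200 of 10,000 visits left from the landing page.
print(f"{bounce_rate(4_200, 10_000):.1%}")  # → 42.0%
```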
In addition to bounce rate,
There are KPIs for shopping cart abandonment
Or traffic volumes
Or content creation rate
It’s one thing to know what people did on your site. But often you want to know how they did it -- did they click on the red button or the blue text? Did they scroll all the way down?
Designers call things like buttons and doorknobs “affordances.” They worry about things like whether the user perceived the affordance, and whether it was in fact intended as one.
For example, the “Xiti Pro” and “Xiti Free” links aren’t actually URLs. They’re text that people mistake for hyperlinks.
You can drill down to the individual form components
Companies like Expedia, Travelocity, and Priceline had problems with abandonment.
Visitors would search for a hotel, find one they liked, check rates and availability—and then leave.
The sites tried offering discounts, changing layouts, modifying the text, and more. Nothing.
“Why did you come to the site?”
Visitors weren’t planning on booking a room, only checking availability.
The reason they thought visitors were coming to their site was wrong.
The site’s operators had a different set of goals in mind than visitors did, and the symptom of this disconnect was the late abandonment.
With this new-found understanding of visitor motivations, travel sites took two important steps.
First, they changed the pages of their sites, offering to watch a particular search for the customer and tell them when a deal came along, as shown in Figure 7-1.
Second, they moved the purchasing or bidding to the front of the process, forcing the buyer to commit to payment or to name their price before they found out which hotel they’d booked. This prevented window-shopping for a brand while allowing them to charge discounted rates.
The results were tremendous, and changed how online hotel bookings happen. Today, most travel sites let users watch specific bookings, and many offer deeper discounts than the hotel chains themselves if customers are willing to commit to a purchase before they find out the brand of the hotel.
PMOG and Webwars.
In these games, players install browser plug-ins that let them view websites in different ways than those intended by the site operator.
In PMOG, a user can plant traps on your website that other players might trigger, or leave caches of game inventory for teammates to collect.
Other “overlays” to the web let people comment on a site using plug-ins like firef.ly—shown in Figure 7-3—or use site content for address books and phone directories as Skype does.
Or volume of comments
It’s easy to craft a message. Getting genuine attention is the hard part.
Online marketing made advertising accountable thanks to web analytics.
Viral marketing approaches make it easy to spread messages that have high returns.
Community marketing now makes it possible for people to take a genuine interest in a product, because they don’t feel they’re getting messages from a company with an ulterior motive to sell.
There are really 8 major types of communities today, with four levels of engagement for each.
And there are four levels of engagement you can have with them. More engagement means more visibility, at the expense of anonymity.
So -- what you monitor and what you get out of it depends on what approach you take and what platform you’re engaging with. Here are some examples of the approaches and platforms.
Of course, many of these techniques -- particularly community ones -- can be used to “stalk” your competitors too.
Sites still fail in lots of ways. It’s scary how much things break. This is just a sample of pages for Canadians...
All of this analytics is good. But it’s only half of the job of web monitoring. Because try as you might, websites have a problem.
For example . . . imagine that you decided to launch a kick-ass survey. You’ve bought the latest shiny tool,
you’ve carefully crafted the questions,
you hired outside help to make sure they’re worded properly,
you had them sent to a professional copy editor to get the final tone just right,
it went through legal,
you segmented your campaign according to the demographic whose voice you need to understand the most,
and as your finger hovers over the big red send button, you can’t help but feel that you’ve covered all your bases.
Satisfied, you press the button and out it goes into the world.
That was the case with PayPal recently.
We don’t have insight into their numbers, so we can’t tell for sure what the particular conversion rate for this survey was, but we suspect that the pickup wasn’t as good as anticipated.
Their web analytics and VoC tools don’t have the necessary functions built in to determine that their SSL cert was mismatched, causing Safari and other browsers to come up with a nasty message saying “we can’t verify the identity of paypal-surveys.com”.
After all, think about it: if it’s coming from PayPal and the identity can’t be verified, would you go on the site and fill anything out?
We know of a case of a marketing officer whose job was put in question because of a string of failed campaigns. The company jumped the gun on this one. Thanks to a friend in the web operations department, he was able to show that the network was at fault. Even though the company load tested diligently, they only did so from their internal network. It turns out the problems were related to the last mile, something that was hidden until the company implemented synthetic monitoring.
Even though overall sentiment was a little more negative than usual during the campaigns, the conversion rates skyrocketed once better transit was installed.
This is a scary one, and a true one. If you haven’t heard, SiteMeter took down every single website that was a client of theirs. If you were on IE and wanted to access sites like TechCrunch, Gizmodo, and so on, you were out of luck in August, because the code crashed the browser.
Think about it - your site isn’t just vulnerable to whatever goofy code your development team throws at the Internet, it’s also vulnerable to your very own web analytics tracking codes!
This would take hours of troubleshooting to reveal without synthetic monitoring, or one simple alert would be triggered with the proper tools in place.
I don’t mean to pick on SiteMeter, by the way; I’m sure they have a great service. But these types of errors can kill substantial amounts of revenue until you catch them.
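A synthetic monitor is, at heart, a scheduled fetch plus assertions. This sketch is illustrative only: the URL, thresholds, and alerting rule are all made up, and it uses plain standard-library calls rather than any particular monitoring product:

```python
import ssl
import time
import urllib.request

def fetch_like_a_probe(url: str, timeout_s: float = 10.0) -> dict:
    """Fetch a page the way a synthetic probe would: real TLS validation, timed."""
    ctx = ssl.create_default_context()  # rejects mismatched certs, as browsers do
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout_s, context=ctx) as resp:
        body = resp.read()
        status = resp.status
    return {"status": status, "bytes": len(body), "seconds": time.monotonic() - start}

def should_alert(result: dict, max_seconds: float = 2.0) -> bool:
    """Alert on non-200 responses or on pages slower than the threshold."""
    return result["status"] != 200 or result["seconds"] > max_seconds

# Hypothetical usage: run fetch_like_a_probe("https://www.example.com/") on a
# schedule; a mismatched certificate raises ssl.SSLCertVerificationError here,
# just as Safari warned visitors to paypal-surveys.com.
print(should_alert({"status": 200, "bytes": 1024, "seconds": 0.4}))  # → False
```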
Once upon a time, performance was a dark art. We struggled to deliver “good enough” without really knowing why.
We managed by anecdote. We were sure faster was better, but we couldn’t tie it to specific business outcomes.
The notion that speed is good for users isn’t new. The concept of “Flow” – a state of heightened engagement that we experience when we’re truly focused on something – was first proposed by Mihaly Csikszentmihalyi.
There’s a study from IBM in 1981 that shows strong evidence of the relationship between performance and productivity. As systems get faster, users get EXPONENTIALLY more productive.
It turns out that attention and engagement drop off predictably. At ten milliseconds, we actually believe something is physically accessible – think clicking a button and seeing it change color. At 100 milliseconds, we can have a conversation with someone without noticing the delay (remember old transatlantic calls?) At a second, we’re still engaged, but aware of the delay. At ten seconds, we get bored and tune out, because other things come into our minds.
How much was fast enough? It was anybody’s guess.
And guess they did.
This is Zona’s formula for patience, the basis for the “eight second rule.” Unfortunately, things like tenacity, importance, and natural patience aren’t concrete enough for the no-nonsense folks that run web applications.
IT operators and marketers are completely different people. What convinces an IT person to fix performance doesn’t convince a marketer. They want to know how it will impact the business fundamentals.
By now, we know that everything matters. Usability, page latency, visitor mindset, and even sentiment on social media platforms all contribute to the business results you get from a site.
Imagine for a minute that you’re the mayor of a sleepy little beach town. You track all kinds of things about the city (because you’re an analyst.) You track tourism. And drowning rates. And hotel room vacancies. And ice cream consumption. And grains of sand. And all kinds of things.
You have a problem with drowning, and you’ve ruled out the usual causes.
One day, someone is crunching ice cream numbers
They notice there’s a correlation between ice cream and drowning.
This is useful: knowing ice cream consumption trends, you can predict demand for funeral homes.
Or tell local merchants how much ice cream to stock based on drowning rates. You have CORRELATION, which can be used to make predictions.
But what’s really going on? It turns out that both ice cream and drowning are correlated to something else -- something causal: summertime.
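A minimal sketch of the correlation the analyst might compute; the monthly figures below are invented so that both series peak in summer:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented monthly totals, January through August.
ice_cream_sales = [20, 25, 40, 80, 120, 110, 60, 30]
drownings       = [1, 2, 3, 7, 11, 10, 5, 2]

# Prints a value near 1: strongly correlated, yet neither causes the other.
print(f"{pearson(ice_cream_sales, drownings):.3f}")
```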
Knowing this, you can minimize deaths (through CPR)
or lifeguards
And maximize ice cream sales (perhaps by locating them near lifeguard stands just to be sure.)
One example of this is performance experimentation that Google’s done. Google’s a perfect lab. Not only do they have a lot of traffic, they also have computing resources to do back-end analysis of large data sets. Plus, they’re not afraid of experimentation – in fact, they insist on it. So they tried different levels of performance and watched what happened to visitors.
The results, which they presented at Velocity in May, were fascinating. There was a direct relationship between delay and the number of searches a user did each day – and to make matters worse, the numbers often didn’t improve even when the delay was removed. You may think a 0.7% drop isn’t significant, but for Google this represents a tremendous amount of revenue.
Microsoft’s Bing site is a good lab, too. They looked at key metrics, or KPIs, of their search site.
They showed that as performance got worse, all key metrics did, too. Not just the number of searches, but also the revenue (earned when someone clicks) and refinement of searches.
Shopzilla overhauled their entire site, dramatically reducing page load time, hardware requirements, and downtime.
They saw a significant increase in revenues
The site improvement increased the number of Google clicks that turned into actual visits
It also affected search engine scores. By improving load time, search engines (in this case Google UK) “learned” that this was a good destination. We, and many others, had been claiming this for a while; but Google refused to acknowledge it officially.
Google finally admitted that this was in fact the case -- on April 9 of this year.
By tying performance and availability to Key Performance Indicators – KPIs – business and operations can finally have a conversation.
But KPIs are different for different sites.
Strangeloop agreed to set up an experiment using their technology which would help measure this.
First, traffic. Despite splitting visitors evenly between optimized and unoptimized, we had many more optimized sessions captured by the analytics. This may be a result of slower-loading pages failing to execute the analytics script, or of visitors abandoning the visit before the page had time to load.
Unoptimized visitors are roughly 1% more likely to leave the site immediately, without proceeding to other pages.
The unoptimized visits consisted of more new visitors than the optimized ones did. While this might seem counter-intuitive, remember that these are visits:
This likely means that optimized visitors came back more often.
Optimized visitors spent more time on the site
And looked at more pages during their visit – if you’re a media property, this means more impressions for your advertisers.
On a second e-commerce site running roughly the same experiment, conversions were 16 percent higher and orders were 5.5% higher.
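The arithmetic behind such comparisons is a relative lift between the two groups’ conversion rates. The visit totals below are the ones reported above; the conversion counts are hypothetical, chosen to reproduce a roughly 16% lift:

```python
def relative_lift(treated_rate: float, control_rate: float) -> float:
    """Relative improvement of the treated group over the control group."""
    return (treated_rate - control_rate) / control_rate

# Visit totals from the experiment above; conversion counts are made up.
optimized_visits, unoptimized_visits = 8_505, 4_740
optimized_conversions, unoptimized_conversions = 380, 182  # hypothetical

lift = relative_lift(optimized_conversions / optimized_visits,
                     unoptimized_conversions / unoptimized_visits)
print(f"{lift:.1%}")  # → 16.4%
```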
Ultimately, you want a single, comprehensive view of your web presence across all of these platforms in order to make good decisions and communicate what you’re doing to the rest of the organization.