Wednesday, October 03, 2007
More on WA Standards
Judah Phillips makes an excellent argument for standardization of web analytics data that is clearer and more concise than my own post on the topic. I still don't see people talking about industry-specific libraries of high-value events, though. Without these, practical interoperability between analytics products and peripheral products will be limited. A "purchase" event really isn't the same as a "flight reservation booked" event or a "credit card application submitted" event, and they shouldn't be treated or described the same. Without high-value business events, you're still just talking about the core measures and dimensions that are the same across all sites. Enterprise analytics solutions go well beyond these core dimensions and measures -- and so must any data definition standard.
I'm happy to see that there appears to be interest in this topic among the big thinkers. It won't be easy to accomplish, for sure. But I believe open standards will be beneficial to vendors in the long run.
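To make this concrete, here's a minimal sketch of what a vertical-specific event library might look like. Everything below is hypothetical -- the interfaces, field names, and event types are my own illustration, not part of any published standard:

```typescript
// Core fields every standardized analytics event would share.
interface BaseEvent {
  eventId: string;    // unique identifier for this event instance
  timestamp: string;  // ISO 8601 time the event occurred
  visitorId: string;  // anonymous visitor identifier
  sessionId: string;  // visit/session identifier
}

// Generic retail event from the shared core library.
interface PurchaseEvent extends BaseEvent {
  eventType: "purchase";
  orderId: string;
  revenue: number;    // order total in the site currency
  currency: string;
}

// Travel-vertical event: same core, but a vertical-specific payload.
interface FlightReservationBookedEvent extends BaseEvent {
  eventType: "flightReservationBooked";
  recordLocator: string;  // a PNR, not an order id
  origin: string;         // IATA airport code
  destination: string;
  fareClass: string;
  totalFare: number;
}

// Financial-services-vertical event.
interface CreditCardApplicationSubmittedEvent extends BaseEvent {
  eventType: "creditCardApplicationSubmitted";
  productCode: string;    // which card product was applied for
  decision: "approved" | "declined" | "pending";
}
```

The point is that all three events share a common core but carry payloads that only make sense within their verticals -- which is exactly why a flat, one-size-fits-all "purchase" definition falls short.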
Thursday, September 27, 2007
Thoughts on WAA's Standards Committee's Web Analytics Definitions Document
I finally had the time today to start poring over this document, released by the Web Analytics Association (WAA) on August 16th. I haven't finished reading all of it in detail yet, but I have these thoughts (well, ok, they're rants) so far.
Let me start by saying that I'm excited about the publication of this document, as it represents a step toward the set of standards for web analytics data collection and data definitions that will be required for true interoperability between web analytics products and the products peripheral to web analytics that, together, create true business solutions. It's a small step, but a step nonetheless. In my dreamland, I see a standards-based library of vertical-specific business events that can be used as the basis for collecting, reporting, analyzing, and integrating analytics data. Web analytics has to move beyond simple counts and ratios of dimensions based on browser/client actions, toward standardized handling and reporting of high-value business events.
So, with that in mind, here are my unfiltered thoughts (rants).
Firstly, I'm left wondering what the purpose of the document is. Is it to put a stake in the ground as to what the standards should be, or is it to act as a sort of "meta user manual" that takes all the variant vendor behaviors into account, attempting to make each standard definition wide enough to include any vendor's product? In places, it seems to be the latter. Consider this definition of Visit Duration:
The length of time in a session. Calculation is typically the timestamp of the last activity in the session minus the timestamp of the first activity of the session.
Should not the purpose of a standards document be to declare the standard? Explaining typical behavior belongs in a book about web analytics, not in a standards document. That's not to say that you're going to get all the vendors to agree on the "right" way to calculate visit duration. But if there isn't a standard (or if you aren't declaring one) it doesn't belong in the document (IMHO).
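For concreteness, here's the calculation the quoted definition describes, as I read it -- a toy sketch of the typical approach, not any particular vendor's implementation. Note the well-known limitation baked into this definition: time spent on the last page is unmeasured, so a single-page visit gets a duration of zero.

```typescript
// A literal reading of the quoted definition (illustrative only):
// visit duration = timestamp of last activity - timestamp of first activity.

interface Activity {
  timestamp: number; // epoch milliseconds of a page view or other hit
}

function visitDurationMs(session: Activity[]): number {
  if (session.length === 0) return 0;
  const times = session.map(a => a.timestamp);
  return Math.max(...times) - Math.min(...times);
}

// Example: hits at 0s, 30s, and 95s => 95s. A single hit => 0s.
const demo: Activity[] = [{ timestamp: 0 }, { timestamp: 30_000 }, { timestamp: 95_000 }];
console.log(visitDurationMs(demo) / 1000); // 95
```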
Secondly, I'm a little disappointed that several of the definitions are self-referential (if you will) -- they use the word being defined as a core part of the definition. Sometimes the definitions are almost ethereal. Of course, as a long-time practitioner, I understand what is meant, but I think this will make it difficult for new practitioners and managers to grasp the definitions. Here's an example from the definition of Dimension:
A general source of data that can be used to define various
types of segments or counts and represents a fundamental dimension of
visitor behavior or site dynamics.
What? IMHO, I should be able to use a standards document to develop an entirely new web analytics product from scratch. This definition doesn't tell me what a dimension really is, so I couldn't build a product on it.
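To illustrate the level of precision I'd like to see, here's the kind of concrete, non-circular definition I have in mind, sketched as code. These types are my own illustration of the concept, not anything from the WAA document:

```typescript
// A dimension is a named attribute of a hit, visit, or visitor whose
// discrete values are used to group, segment, and count data.
interface Dimension {
  name: string;                             // e.g. "Country" or "Browser"
  appliesTo: "hit" | "visit" | "visitor";   // the level the attribute describes
  values: string[];                         // the discrete values observed
}

// A measure is a numeric quantity aggregated within the groups that a
// dimension's values define.
interface Measure {
  name: string;                             // e.g. "Visits" or "Revenue"
  aggregate: (values: number[]) => number;  // e.g. sum, count, or average
}
```

A definition at that level of specificity would let two vendors implement "Dimension" the same way -- which is the whole point of a standard.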
To be sure, there's lots that's good about this document, and I don't mean to minimize that. But I would love to see this pushed more toward true standards and have some of the core definitions crystallized even further.
That's all for now. More later (maybe).
X Change Wrap Up
Eric Peterson wrote a very nice big-picture summary post regarding the X Change conference (thanks for the kudos on my huddle, Eric!). I have to agree with him: it was a very positive experience, primarily because it was so interactive. I tend to get bored at conferences (both as presenter and listener) because what I crave is impassioned discussion, not a one-way dissertation. No matter how much you know, there is always more to learn, even if you're the so-called expert in the room. X Change created the opportunity to have these impassioned discussions, and I, for one, took advantage of that. I learned a ton. And I think the people in my huddles learned a ton, too -- not from me, but from each other.
Here's Eric's closing statement:
I’ll leave you with this parting shot about X Change, a comparison I’m shocked that nobody smarter than I has already made:
- Emetrics is the Web 1.0 conference for web analytics where you will learn a ton and be very happy
- X Change is the Web 2.0 conference for web analytics where you will contribute a ton and be very satisfied
Well said.
Friday, September 21, 2007
X Change Day 1
Day one at X Change was pretty enlightening. I liked the huddle format, where leaders and participants were encouraged to engage in an open discussion on a topic, rather than fall into a presenter-listener relationship. Some huddle leaders were better at facilitating a discussion than others (some really just presented), but overall each of the huddles I participated in involved a good healthy dose of discussion and debate.
Of particular interest to me was the session on "deploying measurement systems across the global enterprise" with Judah Phillips. I came away from this session having synthesized this key idea:
In any global enterprise execution of a web analytics solution, there are two frameworks (for lack of a better term) at work, and each framework contains multiple potential models. There's a framework for solution design models, and a framework for solution deployment models. The models in each framework can be mixed and matched -- there isn't necessarily a correlation between model 1 for design and model 1 for deployment. The combination of models that works for you will depend largely on the political and structural ecosystem you work in.
Here are the models:
Models for Design
- Decentralized business units or entities with a unified measurement model
- Decentralized business units or entities with a unique measurement model per business unit
- Centralized business with a single measurement model
The benefit of Model 2 is that everybody gets what they want.
Model 3 probably only applies if your businesses around the world are essentially identical and managed from one HQ location. I can't think of where else this would apply.
Models for Deployment
- Crawl, Walk, Run (i.e. roll out basic analytics, then, as Judah put it, roll out dimensions that have meaningfulness to the business, then integrate external data)
- Deploy "meaningful" solution slowly across globe
Model 2, in my experience, is necessary when you have a decentralized organization that cannot handle a quick series of small changes, but instead offers you only one window per year (or less) to deploy a solution -- or where the decentralized nature means you have different windows at different times across the different parts of the enterprise around the world, preventing you from orchestrating a carefully controlled series of phases.
Of course, I think there are hybrids, too. I've worked with companies employing multiple combinations of the models for design and deploy...this is what I've seen. What am I missing? What other models are there?
Wednesday, September 19, 2007
More OLAP Fun
I've taken on a new project in the last few days. I'll be working to help an enterprise-class company integrate existing customer data into their Visitor Intelligence solution, allowing them to segment existing reports or build new reports, on the fly, with any combination of customer attributes from outside data stores and web analytics data from the page tag.
The power of the reporting and ad-hoc segmentation is as I wrote about here, but this is even more interesting: rather than segmenting and constructing reports only from data collected through page tagging, we'll be leveraging web services and OLAP-style reporting to integrate data about a single visitor from multiple databases and third-party systems.
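Conceptually, the integration boils down to joining web analytics records to outside customer attributes on a shared visitor key, so reports can be segmented by attributes from either source. Here's a loose sketch of that idea -- the types and names are my own illustration, not the product's actual API:

```typescript
// Illustrative record shapes; real schemas would be far richer.
interface VisitRecord { visitorId: string; visits: number; pageViews: number; }
interface CustomerRecord { visitorId: string; tier: string; region: string; }

// Join web analytics visit data to customer attributes from an outside store.
function joinVisitorData(
  visits: VisitRecord[],
  customers: Map<string, CustomerRecord>
) {
  return visits.map(v => ({
    ...v,
    tier: customers.get(v.visitorId)?.tier ?? "unknown",
    region: customers.get(v.visitorId)?.region ?? "unknown",
  }));
}
```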
I'll post more as the project moves forward.
Adding to Eric T. Peterson's Commentary on Dainow
So I've been keeping quiet on the Dainow post re: Google displacing all other web analytics "products". Partly because this has been fun to watch, but mostly because I work for one of the other supposedly "dead" competitors. I wanted to see where this landed before I got in the mix.
Eric Peterson's thoughts on this are spot on. Here's Eric:
Dainow demonstrates a near complete lack of understanding of web analytics and the web analytics marketplace. Google Analytics already dominates the market in terms of total domains coded, but dominance isn’t defined by the breadth of your coding, it’s defined by the success your customers have using your application!
I'll go one further. What Dainow fails to see is the difference between a product and a solution, where a solution is a product and a set of services combined to solve a business problem. While the market well served by Google doesn't really require a "solution" as much as it simply needs a tool to do a job, the customers served by the big players (my employer included) tend to need (and have the money for) services to ensure successful solution design, implementation, deployment, and adoption of the tool set and the business processes required to make good use of it.
The farther you go up the market, out of the mid-market and into true enterprise-class solutions, the more this is true. In fact, I would argue that in true enterprise-class solution deployments (the area of consulting I specialize in), the services are more important than the tool set. The greatest tool in the world isn't worth anything if it can't be successfully deployed across a global enterprise with a standards-based approach. Google, neat tool that it is, is nowhere near displacing the few vendors who can play at this level.
See you at X Change?
I'll be at Semphonic's X Change conference in Napa, California on Thursday and Friday. I'm leading two "huddle" sessions on using web behavioral data to optimize customer experience and drive business result improvements.
I'm looking forward to some lively discussion, and I'm hoping to learn as much as I share. I'll post a summary of the discussion points, ah-ha moments, and key take-aways from each session.
Hope to see you there!
Tuesday, July 31, 2007
The Power of OLAP Reporting
With today's announcement of WebTrends' new OLAP reporting solution, Visitor Intelligence, which adds reporting capability to the evolving suite of tools built on top of the WebTrends warehouse architecture, I can finally talk about the power that is coming to the world of web analytics.
If you're not familiar with OLAP, or multi-dimensional reporting tools, the first page and a half of this article are worth a read. It's a good introduction to what's coming.
Essentially, OLAP tools -- and WebTrends Visitor Intelligence is no exception -- allow you to do deep, ad-hoc drilling and rearranging of data on the fly, while maintaining the proper relationships and correlations between the dimensional data. While there are pre-configured reports in Visitor Intelligence, called Starting Points, that will meet the needs of "just give me the data" end-users, curious analysts will find themselves in a playground of possibilities.
Don't like the order of the dimensions in the report you're looking at? Rearrange them, and the relationships stay intact (just like in a pivot table). Don't like the measures in the report? Grab any available measure and drop it in. No longer are you confined to a report view you defined up front, and no longer are you working in a cumbersome environment where the relationships between data are not clearly represented and easily manipulated. The real power is that with OLAP reporting, all you need to know is which dimensions and which measures you want to report on. From there, you can construct whatever report you want, on the fly, or you can set up Starting Points that are essentially pre-built reports. The ability to create custom measures on the fly is absolutely awesome, too. No processing time, no re-analysis. It's just there.
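To illustrate the concept, here's a toy rollup that groups the same underlying rows by whatever dimension order you choose and sums whatever measures you choose. This is purely a conceptual sketch of OLAP-style pivoting, not how Visitor Intelligence works internally:

```typescript
// A row of pre-collected data: dimension values plus measure values.
interface Row {
  dims: Record<string, string>;      // e.g. { Country: "US", Brand: "A" }
  measures: Record<string, number>;  // e.g. { Visits: 3, Revenue: 120 }
}

// Group rows by the chosen dimension order and sum each requested measure.
function rollup(rows: Row[], dimOrder: string[], measureNames: string[]) {
  const out = new Map<string, Record<string, number>>();
  for (const row of rows) {
    const key = dimOrder.map(d => row.dims[d]).join(" > ");
    let acc = out.get(key);
    if (!acc) {
      acc = {};
      for (const m of measureNames) acc[m] = 0;
      out.set(key, acc);
    }
    for (const m of measureNames) acc[m] += row.measures[m] ?? 0;
  }
  return out;
}

// Re-pivoting is just a different dimension order over the same rows:
// rollup(rows, ["Country", "Brand"], ["Visits"]) versus
// rollup(rows, ["Brand", "Country"], ["Visits"]) -- no re-processing needed.
```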
Here's a real-life example from a customer I've been working with to develop a robust reporting solution using Visitor Intelligence. This customer has a globally distributed and decentralized online business, which is organized roughly by regions of the world (each country is a division), brands operated by each division, and customer groups serviced on each site. Of particular importance to the customer is understanding how much of each division's business comes from a country other than where that division operates, and what services those "out-of-country" customers are consuming. This insight will help the business better understand who their customers are, and how to market to them.
In this case, the customer has tagged each of their web sites with a single, universal meta-data model that describes every web page they operate worldwide and how it fits into the global organization. The model passes values for the region, country, and division, in addition to descriptive data about product lines and the divisional business units offering those product lines. The result is that we collect a rich set of data easily turned into dimensions in an OLAP environment. The icing on the cake is the visit and visitor geo-location data built into WebTrends that allows us to determine who is "out-of-country" in a particular visit, and who is not.
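As a sketch, here are the kinds of values such a model might pass on every page, along with the "out-of-country" test it enables. The field names and values are my own illustration, not the customer's actual schema:

```typescript
// Hypothetical per-page meta-data payload under a universal model.
const pageMetadata = {
  region: "EMEA",
  country: "DE",                    // the country this division operates in
  division: "Germany",
  brand: "ExampleBrand",
  businessUnit: "Consumer Banking",
  productLine: "Savings",
};

// The "out-of-country" test: compare the visitor's geo-located country
// (supplied by the analytics tool) to the country the page belongs to.
function isOutOfCountry(visitorGeoCountry: string, pageCountry: string): boolean {
  return visitorGeoCountry !== pageCountry;
}

console.log(isOutOfCountry("FR", pageMetadata.country)); // true
```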
Upon launch, the business will have both default report views tailored to their specific needs and the flexibility to build exactly the right views of the available data. User A, a global business manager who wants to see Unique Visitors by Country, Division, Brand, and Product, can easily create that view. User B, a product manager, can build a view that shows Visits and Unique Visitors by Product and Brand. And User C, an analyst, can build a view showing Visitors, Visits, and Visits per Visitor for "out-of-country" visits only, broken down by Division, then Country of Visitor, then Product usage by the Business Units offering that product.
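Expressed against the toy rollup sketch from earlier in this post (reusing its Row type and rollup function), User C's view might look like this -- data values invented purely for illustration:

```typescript
const rows: Row[] = [
  { dims: { Division: "Germany", Country: "DE", VisitorCountry: "FR", Product: "Savings" },
    measures: { Visitors: 1, Visits: 2 } },
  { dims: { Division: "Germany", Country: "DE", VisitorCountry: "DE", Product: "Savings" },
    measures: { Visitors: 5, Visits: 9 } },
];

// Keep only out-of-country visits, then break down by Division, then
// Country of Visitor, then Product.
const outOfCountry = rows.filter(r => r.dims["VisitorCountry"] !== r.dims["Country"]);
console.log(rollup(outOfCountry, ["Division", "VisitorCountry", "Product"], ["Visitors", "Visits"]));
// => Map { "Germany > FR > Savings" => { Visitors: 1, Visits: 2 } }
```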
I've never before worked with a web analytics tool that is this powerful, and opens up so many possibilities -- and I've worked at three different analytics vendors. It still comes down to business results, though, and what a full-featured OLAP solution brings to the table is the ability to easily explore and manipulate the data to discover the insight needed to make business improvements with measurable impact.
Monday, February 05, 2007
Join me for 5 Insights from Online Marketing Experts
Join me on Thursday, February 8th at 11:00 AM PST when I'll be delivering the BtoB webcast 5 Insights from Online Marketing Experts.
In this webcast, you'll learn how to:
- Profitably acquire new customers through search marketing automation
- Develop a consistent marketing measurement framework
- Leverage KPI dashboards to keep your finger on the pulse of your performance
- Segment your customers and leverage cross channel data to target them effectively
- Build profitable, long-lasting relationships with your customers
Update: 2/14/2007
The webinar went very well. Most exciting was the fact that there were lots of good questions from the audience. Many of the questions centered on KPIs. I came away with a very clear picture of the lack of clarity that still exists about what KPIs are, let alone which ones we should be looking at as marketers and business owners.
Here's what you need to know about KPIs:
- They are KEY performance indicators, not just any-old-performance-indicators. Internalizing that distinction will help you weed out data that's good to have, but doesn't belong on a KPI scorecard or dashboard.
- They should reflect performance of the drivers of your business that you can influence. As an e-commerce marketer or business owner, I can influence the number of people arriving at my site, the quality of people arriving at my site, the % of those arriving who buy or transact, the average value of a transaction, the % of buyers who become repeat buyers, how many visits it takes before a shopper becomes a buyer, etc. And all of these things roll up to impact one number -- revenue -- the ultimate KPI (see the sketch after this list). If you're not tracking the revenue associated with your products, you should be.
- If it measures something you can't influence, it isn't a KPI. For example, your business may be influenced by the weather. But you can't influence the weather, so average daily temperature wouldn't be a KPI. It may be good supporting data that helps you make sense of your business, but it isn't a KPI.
- KPIs will vary based on the model of your business, and what drivers influence your business. There are no universal KPIs, though there are many similarities across consumer products retailers and banking/financial retailers.
- KPIs shouldn't change unless your business model changes. Remember, they measure the drivers of your business. There aren't too many drivers when you boil it all down...we're talking 5 to 10 metrics.
- Everyone across the company should be looking at the same KPIs. Everyone has the ability to influence these 5 or 10 drivers of the company's business. That doesn't mean that a site producer won't have a scorecard tuned especially for his needs and his area of influence...he will. But his scorecard is subordinate to the corporate scorecard with the highest-order KPIs.
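Here's the arithmetic behind the "roll up to revenue" point above, sketched with hypothetical numbers (the drivers and values are illustrative, not from any real business):

```typescript
// Revenue as a product of the drivers a marketer can influence:
//   revenue = visits * conversion rate * average order value
const visits = 100_000;        // people arriving at the site
const conversionRate = 0.02;   // % of arrivals who buy or transact
const averageOrderValue = 85;  // average value of a transaction
const revenue = visits * conversionRate * averageOrderValue;

console.log(revenue); // 170000 -- improving any single driver moves the ultimate KPI
```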