Web analytics
Web analytics is the measurement, collection, analysis and reporting of internet data for purposes of understanding and optimizing web usage.[1]
Web analytics is not just a tool for measuring website traffic; it can also be used for business research and market research. Web analytics applications can help companies measure the results of traditional print advertising campaigns, for example by estimating how traffic to a website changed after the launch of a new advertising campaign. Web analytics provides data on the number of visitors, page views and other measures of a site's popularity that are useful in market research.
There are two categories of web analytics: off-site and on-site web analytics.
Off-site web analytics refers to web measurement and analysis regardless of whether you own or maintain a website. It includes the measurement of a website's potential audience (opportunity), share of voice (visibility), and buzz (comments) that is happening on the Internet as a whole.
On-site web analytics measure a visitor's journey once on your website. This includes the visit's drivers and conversions; for example, which landing pages encourage people to make a purchase. On-site web analytics measures the performance of your website in a commercial context. This data is typically compared against key performance indicators, and used to improve a web site's or marketing campaign's audience response.
Historically, web analytics has referred to on-site visitor measurement. However, in recent years this distinction has blurred, mainly because vendors are producing tools that span both categories.
The remainder of this article concerns on-site web analytics.
On-site web analytics technologies
Many different vendors provide on-site web analytics software and services. There are two main technological approaches to collecting the data. The first method, logfile analysis, reads the logfiles in which the web server records all its transactions. The second method, page tagging, uses JavaScript on each page to notify a third-party server when a page is rendered by a web browser. Both collect data that can be processed to produce web traffic reports.
In addition, other data sources may be added to augment the data: for example, e-mail response rates, direct mail campaign data, sales and lead information, user performance data such as click heat mapping, or other custom metrics as needed.
Web server logfile analysis
Web servers record some of their transactions in a logfile. It was soon realized that these logfiles could be read by a program to provide data on the popularity of the website. Thus arose web log analysis software.
In the early 1990s, web site statistics consisted primarily of counting the number of client requests (or hits) made to the web server. This was a reasonable method initially, since each web site often consisted of a single HTML file. However, with the introduction of images in HTML, and web sites that spanned multiple HTML files, this count became less useful. The first true commercial log analyzer was released by IPRO in 1994.[2]
Two units of measure were introduced in the mid-1990s to gauge more accurately the amount of human activity on web servers: page views and visits (or sessions). A page view was defined as a request made to the web server for a page, as opposed to a graphic, while a visit was defined as a sequence of requests from a uniquely identified client that expired after a certain amount of inactivity, usually 30 minutes. Page views and visits are still commonly displayed metrics, but are now considered rather unsophisticated measurements.
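In code, the core of such an analyzer is small. The sketch below is a minimal illustration rather than a production tool: it assumes the server writes Common Log Format, treats .html files and directory roots as "pages", identifies clients by IP address (real analyzers prefer cookies, for the reasons discussed below), and applies the customary 30-minute inactivity timeout.

```python
import re
from collections import defaultdict
from datetime import datetime, timedelta

# Matches the start of a Common Log Format line:
# host ident authuser [date] "METHOD path protocol" ...
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?:GET|POST) (?P<path>\S+)'
)
TIMEOUT = timedelta(minutes=30)

def is_page(path):
    """Treat HTML documents and directory roots as pages; images, .css and .js files count only as hits."""
    return path.split('?')[0].endswith(('.html', '.htm', '/'))

hits = 0
page_views = 0
last_seen = {}                  # client -> timestamp of most recent page request
visit_counts = defaultdict(int) # client -> number of distinct visits

with open('access.log') as f:
    for line in f:
        m = LOG_PATTERN.match(line)
        if not m:
            continue
        hits += 1                        # every request is a hit
        if not is_page(m.group('path')):
            continue
        page_views += 1                  # only page requests are page views
        when = datetime.strptime(m.group('time').split()[0], '%d/%b/%Y:%H:%M:%S')
        client = m.group('host')
        # A new visit starts when the client has been idle longer than the timeout.
        if client not in last_seen or when - last_seen[client] > TIMEOUT:
            visit_counts[client] += 1
        last_seen[client] = when

print(f'{hits} hits, {page_views} page views, {sum(visit_counts.values())} visits')
```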
The emergence of search engine spiders and robots in the late 1990s, along with web proxies and dynamically assigned IP addresses for large companies and ISPs, made it more difficult to identify unique human visitors to a website. Log analyzers responded by tracking visits by cookies, and by ignoring requests from known spiders.
The extensive use of web caches also presented a problem for logfile analysis. If a person revisits a page, the second request will often be retrieved from the browser's cache, and so no request will be received by the web server. This means that the person's path through the site is lost. Caching can be defeated by configuring the web server, but this can result in degraded performance for the visitor to the website.
Page tagging
Concerns about the accuracy of logfile analysis in the presence of caching, and the desire to be able to perform web analytics as an outsourced service, led to the second data collection method, page tagging or 'Web bugs'.
Web counters were commonly seen in the mid-1990s: these were images included in a web page that showed the number of times the image had been requested, which was an estimate of the number of visits to that page. In the late 1990s this concept evolved to include a small invisible image instead of a visible one, and, by using JavaScript, to pass along with the image request certain information about the page and the visitor. This information can then be processed remotely by a web analytics company, and extensive statistics generated.
The web analytics service also manages the process of assigning a cookie to the user, which can uniquely identify them during their visit and in subsequent visits. Cookie acceptance rates vary significantly between web sites and may affect the quality of data collected and reported.
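Concretely, the tag requests an invisible image from the collection server with details encoded in the query string (for example, a request like http://collect.example.com/pixel.gif?page=/index.html&screen=1920x1080), and the collection server sets the visitor cookie when it responds. Below is a minimal sketch of such an endpoint using only Python's standard library; the hostname, parameter names and the vid cookie are illustrative assumptions, not any vendor's actual protocol.

```python
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# A 1x1 transparent GIF, returned to the browser after the beacon is logged.
PIXEL = (b'GIF89a\x01\x00\x01\x00\x80\x00\x00\xff\xff\xff\x00\x00\x00'
         b'!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00'
         b'\x00\x02\x02D\x01\x00;')

class Collector(BaseHTTPRequestHandler):
    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)
        # Reuse the visitor ID cookie if the browser sent one; otherwise mint a new one.
        cookie = self.headers.get('Cookie', '')
        visitor_id = (cookie.split('vid=')[1].split(';')[0]
                      if 'vid=' in cookie else uuid.uuid4().hex)
        # A real service would write this record to a database, not stdout.
        print(visitor_id, params.get('page', ['?'])[0], params.get('screen', ['?'])[0])
        self.send_response(200)
        self.send_header('Content-Type', 'image/gif')
        self.send_header('Set-Cookie', f'vid={visitor_id}; Max-Age=31536000')
        self.end_headers()
        self.wfile.write(PIXEL)

HTTPServer(('', 8080), Collector).serve_forever()
```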
Collecting web site data using a third-party data collection server (or even an in-house data collection server) requires an additional DNS look-up by the user's computer to determine the IP address of the collection server. On occasion, delays in completing a successful or failed DNS look-up may result in data not being collected.
With the increasing popularity of Ajax-based solutions, an alternative to the invisible image is to implement a call back to the server from the rendered page. In this case, when the page is rendered in the web browser, a piece of Ajax code calls back to the server and passes information about the client, which can then be aggregated by a web analytics company. This approach is limited in some ways by browser restrictions on the servers that can be contacted with XMLHttpRequest objects.
Logfile analysis vs page tagging
Both logfile analysis programs and page tagging solutions are readily available to companies that wish to perform web analytics. In some cases, the same web analytics company will offer both approaches. The question then arises of which method a company should choose. There are advantages and disadvantages to each approach.[3]
Advantages of logfile analysis
The main advantages of logfile analysis over page tagging are as follows:
- The web server normally already produces logfiles, so the raw data is already available. To collect data via page tagging requires changes to the website.
- The data is on the company's own servers, and is in a standard, rather than a proprietary, format. This makes it easy for a company to switch programs later, use several different programs, and analyze historical data with a new program. Page tagging solutions involve vendor lock-in.
- Logfiles contain information on visits from search engine spiders. Although these should not be reported as part of human activity, this information is useful for search engine optimization.
- Logfiles require no additional DNS lookups. Thus there are no external server calls which can slow page load speeds, or result in uncounted page views.
- The web server reliably records every transaction it makes. Page tagging may not be able to record all transactions. Reasons include:
- Page tagging relies on the visitors' browsers co-operating, which a certain proportion may not do (for example, if JavaScript is disabled, or a hosts file prohibits requests to certain servers).
- Tags may be omitted from pages, either by oversight or when pages are added or changed without the tagging being updated.
- It may not be possible to include tags in all pages. Examples include static content such as PDFs or application-generated dynamic pages where re-engineering the application to include tags is not an option.
Advantages of page tagging
The main advantages of page tagging over logfile analysis are as follows.
- Counting is activated by opening the page, not by requesting it from the server. If a page is cached, it will not be counted by the server, and cached pages can account for up to one-third of all pageviews. Not counting cached pages seriously skews many site metrics; for this reason, server-based log analysis is not considered suitable for analysis of human activity on websites.
- Data is gathered via a component ("tag") in the page, usually written in JavaScript, although Java or, increasingly, Flash can also be used.
- It is easier to add additional information to the tag, which can then be collected by the remote server. For example, information about the visitors' screen sizes, or the price of the goods they purchased, can be added in this way. With logfile analysis, information not normally collected by the web server can only be recorded by modifying the URL.
- Page tagging can report on events which do not involve a request to the web server, such as interactions within Flash movies, partial form completion, mouse events such as onClick, onMouseOver, onFocus, onBlur etc.
- The page tagging service manages the process of assigning cookies to visitors; with logfile analysis, the server has to be configured to do this.
- Page tagging is available to companies who do not have access to their own web servers.
Economic factors
Logfile analysis is almost always performed in-house. Page tagging can be performed in-house, but it is more often provided as a third-party service. The economic difference between these two models can also be a consideration for a company deciding which to purchase.
- Logfile analysis typically involves a one-off software purchase; however, some vendors cap annual page views, charging additional costs to process additional information. In addition to commercial offerings, several open-source logfile analysis tools are available free of charge.
- For logfile analysis, you have to store and archive your own data, which often grows very large quickly. Although the cost of the hardware to do this is minimal, the overhead for an IT department can be considerable. For example, if you run out of disk space, your database may start overwriting old entries, a loss which can be irreparable.
- For logfile analysis, you need to maintain the software, including updates and security patches.
- Page tagging vendors typically charge a monthly fee based on volume, i.e. the number of pageviews collected per month.
Which solution is cheaper to implement depends on the amount of technical expertise within the company, the vendor chosen, the amount of activity seen on the web sites, the depth and type of information sought, and the number of distinct web sites needing statistics.
Regardless of the vendor solution or data collection method employed, the cost of web visitor analysis and interpretation should also be included. That is, the cost of turning raw data into actionable information. This can be from the use of third party consultants, the hiring of an experienced web analyst, or the training of a suitable in-house person. A cost-benefit analysis can then be performed. For example, what revenue increase or cost savings can be gained by analysing the web visitor data?
Hybrid methods
Some companies are now producing programs which collect data through both logfiles and page tagging. By using a hybrid method, they aim to produce more accurate statistics than either method on its own. The first hybrid solution was produced in 1998 by Rufus Evison, who then spun the product out to create a company based upon the increased accuracy of hybrid methods.[2][4]
Visitor geolocation
With IP geolocation, it is possible to track visitors' locations. Using an IP geolocation database or API, visitors can be geolocated to city, region or country level.[5]
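A minimal sketch of such a lookup, assuming the geoip2 Python package and a local copy of MaxMind's freely downloadable GeoLite2-City database (any geolocation database or web API would be used similarly; the database path and example IP are illustrative):

```python
import geoip2.database  # pip install geoip2

# Resolve an IP address to country, region and city using a local database.
with geoip2.database.Reader('GeoLite2-City.mmdb') as reader:
    response = reader.city('128.101.101.101')
    print(response.country.iso_code)                 # country, e.g. 'US'
    print(response.subdivisions.most_specific.name)  # region, e.g. 'Minnesota'
    print(response.city.name)                        # city, e.g. 'Minneapolis'
```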
Click analytics
Click analytics is a special type of web analytics that gives special attention to clicks (point-and-click). According to Google Analytics, "a 'Click' refers to a single instance of a user following a hyperlink from one page in a site to another".[6] This definition is mostly accurate when talking about click analytics.
Commonly, click analytics focuses on on-site analytics. An editor of a web site uses click analytics to determine the performance of his or her particular site, with regards to where the users of the site are clicking.
Click analytics may happen in real time or retrospectively, depending on the type of information sought. Typically, front-page editors on high-traffic news media sites will want to monitor their pages in real time, to optimize the content. Editors, designers or other stakeholders may analyze clicks over a wider time frame to help them assess the performance of writers, design elements or advertisements, etc.
Data about clicks may be gathered in at least two ways. Ideally, a click is "logged" when it occurs, and this method requires some functionality that picks up relevant information when the event occurs. Alternatively, one may make the assumption that a page view is a result of a click, and therefore log a simulated click that led to that page view.
Other methods
Other methods of data collection are sometimes used. Packet sniffing collects data by sniffing the network traffic passing between the web server and the outside world. Packet sniffing involves no changes to the web pages or web servers. Integrating web analytics into the web server software itself is also possible.[7] Both these methods claim to provide better real-time data than other methods.
Key definitions
There are no globally agreed definitions within web analytics, as industry bodies have been trying for some time to agree on definitions that are useful and definitive. The main bodies that have had input in this area are JICWEBS (the Joint Industry Committee for Web Standards, UK and Ireland), ABCe (Audit Bureau of Circulations electronic, UK and Europe), the WAA (Web Analytics Association, US) and, to a lesser extent, the IAB (Interactive Advertising Bureau). This does not prevent the following list from being a useful guide, suffering only slightly from ambiguity. Both the WAA and the ABCe provide more definitive lists for those who are declaring their statistics using the metrics defined by either.
- Hit - A request for a file from the web server. Available only in log analysis. The number of hits received by a website is frequently cited to assert its popularity, but this number is extremely misleading and dramatically over-estimates popularity. A single web page typically consists of multiple (often dozens of) discrete files, each of which is counted as a hit as the page is downloaded, so the number of hits is really an arbitrary number more reflective of the complexity of individual pages on the website than the website's actual popularity. The total number of visitors or page views provides a more realistic and accurate assessment of popularity.
- Page view - A request for a file whose type is defined as a page in log analysis. An occurrence of the script being run in page tagging. In log analysis, a single page view may generate multiple hits as all the resources required to view the page (images, .js and .css files) are also requested from the web server.
- Visit / Session - A visit is defined as a series of page requests from the same uniquely identified client with a time of no more than 30 minutes between each page request. A session is defined as a series of page requests from the same uniquely identified client with a time of no more than 30 minutes between page requests and no intervening requests for pages from other domains. In other words, a session ends when someone goes to another site, or when 30 minutes elapse between pageviews, whichever comes first; a visit ends only after a 30-minute time delay. If someone leaves a site, then returns within 30 minutes, this will count as one visit but two sessions. In practice, most systems ignore sessions and many analysts use both terms for visits. Because the time between pageviews is critical to the definition of visits and sessions, a single, isolated pageview does not constitute a visit or a session (it is a "bounce").
- First Visit / First Session - A visit from a visitor who has not made any previous visits.
- Visitor / Unique Visitor / Unique User - The uniquely identified client generating requests on the web server (log analysis) or viewing pages (page tagging) within a defined time period (i.e. day, week or month). A unique visitor counts once within the timescale. A visitor can make multiple visits. Identification is made to the visitor's computer, not the person, usually via cookie and/or IP plus user agent. Thus the same person visiting from two different computers will count as two unique visitors. Increasingly, visitors are uniquely identified by Flash LSOs (Local Shared Objects), which are less susceptible to privacy enforcement.
- Repeat Visitor - A visitor that has made at least one previous visit. The period between the last and current visit is called visitor recency and is measured in days.
- New Visitor - A visitor that has not made any previous visits. This definition creates a certain amount of confusion (see common confusions below), and is sometimes substituted with analysis of first visits.
- Impression - Each time an advertisement loads on a user's screen. Each display of a banner, for instance, counts as one impression.
- Singletons - The number of visits where only a single page is viewed. While not a useful metric in and of itself, the number of singletons is indicative of various forms of click fraud, and is also used to calculate bounce rate and, in some cases, to identify automated traffic (bots).
- Bounce Rate - The percentage of visits where the visitor enters and exits at the same page without visiting any other pages on the site in between (a computation sketch follows this list).
- % Exit - The percentage of users who exit from a page.
- Visibility time - The time a single page (or a blog post, ad banner, etc.) is viewed.
- Session Duration - Average amount of time that visitors spend on the site each time they visit. This metric can be complicated by the fact that analytics programs cannot measure the length of the final page view.[8]
- Page View Duration / Time on Page - Average amount of time that visitors spend on each page of the site. As with Session Duration, this metric is complicated by the fact that analytics programs cannot measure the length of the final page view unless they record a page close event, such as onUnload().
- Active Time / Engagement Time - Average amount of time that visitors spend actually interacting with content on a web page, based on mouse moves, clicks, hovers and scrolls. Unlike Session Duration and Page View Duration / Time on Page, this metric can accurately measure the length of engagement in the final page view.
- Page Depth / Page Views per Session - Page Depth is the average number of page views a visitor consumes before ending their session. It is calculated by dividing total number of page views by total number of sessions and is also called Page Views per Session or PV/Session.
- Frequency / Session per Unique - Frequency measures how often visitors come to a website. It is calculated by dividing the total number of sessions (or visits) by the total number of unique visitors. Sometimes it is used to measure the loyalty of your audience.
- Click path - the sequence of hyperlinks one or more website visitors follows on a given site.
- Click - "refers to a single instance of a user following a hyperlink from one page in a site to another"[9]. A growing community of web site editors use click analytics to analyze their web sites.
- Site Overlay - A technique in which graphical statistics are shown beside each link on the web page. These statistics represent the percentage of clicks on each link.
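Several of the ratios above fall out directly once visits have been reconstructed. Below is a minimal sketch of bounce rate, page depth and frequency; the input structure (a mapping from visitor ID to that visitor's visits, each visit a list of page views) is an illustrative assumption:

```python
# Toy sessionized data: visitor ID -> list of visits, each a list of page views.
visits_by_visitor = {
    'visitor-a': [['/home'], ['/home', '/products', '/checkout']],
    'visitor-b': [['/home', '/about']],
}

all_visits = [v for visits in visits_by_visitor.values() for v in visits]
singletons = sum(1 for v in all_visits if len(v) == 1)

bounce_rate = singletons / len(all_visits)                      # singletons / visits
page_depth = sum(len(v) for v in all_visits) / len(all_visits)  # page views / visits
frequency = len(all_visits) / len(visits_by_visitor)            # visits / unique visitors

# 33% bounce rate, 2.0 pages per visit, 1.5 visits per visitor
print(f'{bounce_rate:.0%} bounce rate, {page_depth:.1f} pages/visit, '
      f'{frequency:.1f} visits/visitor')
```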
Common sources of confusion in web analytics
The hotel problem
The hotel problem is generally the first problem encountered by a user of web analytics. The term was coined by Rufus Evison while explaining the problem at one of the Emetrics Summits, and it has since gained popularity as a simple expression of the problem and its resolution.
The problem is that the unique visitors for each day in a month do not add up to the same total as the unique visitors for that month. This appears to an inexperienced user to be a problem in whatever analytics software they are using. In fact it is a simple property of the metric definitions.
The way to picture the situation is by imagining a hotel. The hotel has two rooms (Room A and Room B).
|        | Day 1 | Day 2 | Day 3 | Total          |
|--------|-------|-------|-------|----------------|
| Room A | John  | John  | Jane  | 2 Unique Users |
| Room B | Mark  | Jane  | Mark  | 2 Unique Users |
| Total  | 2     | 2     | 2     | ?              |
As the table shows, the hotel has two unique users each day over three days. The sum of the totals with respect to the days is therefore six.
During the period each room has had two unique users. The sum of the totals with respect to the rooms is therefore four.
In fact, only three visitors have been in the hotel over this period. The problem is that a person who stays in a room for two nights will be counted twice if counted once on each day, but only once when looking at the total for the period. Any web analytics software will sum these correctly for whatever time period is chosen, thus leading to the problem when a user tries to compare the totals.
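The same arithmetic is easy to reproduce with sets, since a unique-visitor count is just the size of a deduplicated set over the chosen period; a small sketch of the table above:

```python
# The hotel problem in code: per-day unique counts do not sum to the
# period's unique count, because a repeat guest is deduplicated only
# within whatever period is being totalled.
days = {
    'Day 1': {'John', 'Mark'},
    'Day 2': {'John', 'Jane'},
    'Day 3': {'Jane', 'Mark'},
}

daily_sum = sum(len(guests) for guests in days.values())
period_uniques = len(set.union(*days.values()))

print(daily_sum)       # 6: the sum of each day's unique users
print(period_uniques)  # 3: unique users over the whole period
```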
New visitors + repeat visitors do not equal total visitors
Another common misconception in web analytics is that the sum of the new visitors and the repeat visitors ought to be the total number of visitors. Again this becomes clear if the visitors are viewed as individuals on a small scale, but still causes a large number of complaints that analytics software cannot be working because of a failure to understand the metrics.
Here the culprit is the metric of a new visitor. There is really no such thing as a new visitor when considering a web site from an ongoing perspective. If a visitor makes their first visit on a given day and then returns to the web site on the same day, they are both a new visitor and a repeat visitor for that day. So if we look at them as an individual, which are they? The answer has to be both, so the definition of the metric is at fault.
A new visitor is not an individual; it is a fact of the web measurement. For this reason it is easiest to conceptualise the same facet as a first visit (or first session). This resolves the conflict and so removes the confusion. Nobody expects the number of first visits to add to the number of repeat visitors to give the total number of visitors. The metric will have the same number as the new visitors, but it is clearer that it will not add in this fashion.
On the day in question there was a first visit made by our chosen individual. There was also a repeat visit made by the same individual. The number of first visits and the number of repeat visits will add up to the total number of visits for that day.
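A small sketch makes the distinction concrete; the one-day visit log below is illustrative:

```python
# One day's visits as (visitor, nth-visit-ever) pairs: Alice makes her
# first and second visits today, Bob makes his first.
visits = [('alice', 1), ('alice', 2), ('bob', 1)]

# First visits and repeat visits partition the day's visits:
first_visits = sum(1 for _, n in visits if n == 1)
repeat_visits = sum(1 for _, n in visits if n > 1)
print(first_visits + repeat_visits == len(visits))  # True: 2 + 1 == 3 visits

# But "new visitor" and "repeat visitor" both describe Alice today,
# so those counts do not add up to the number of unique visitors:
new_visitors = {v for v, n in visits if n == 1}    # {'alice', 'bob'}
repeat_visitors = {v for v, n in visits if n > 1}  # {'alice'}
unique_visitors = {v for v, _ in visits}           # {'alice', 'bob'}
print(len(new_visitors) + len(repeat_visitors))    # 3, not len(unique_visitors) == 2
```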
Web analytics methods
Problems with cookies
Historically, vendors of page-tagging analytics solutions have used third-party cookies sent from the vendor's domain instead of the domain of the website being browsed. Third-party cookies can handle visitors who cross multiple unrelated domains within the company's site, since the cookie is always handled by the vendor's servers.
However, third-party cookies in principle allow tracking an individual user across the sites of different companies, allowing the analytics vendor to collate the user's activity on sites where he provided personal information with his activity on other sites where he thought he was anonymous. Although web analytics companies deny doing this, other companies, such as companies supplying banner ads, have done so. Privacy concerns about cookies have therefore led a noticeable minority of users to block or delete third-party cookies. In 2005, some reports showed that about 28% of Internet users blocked third-party cookies and 22% deleted them at least once a month.[10]
Most vendors of page tagging solutions have now moved to provide at least the option of using first-party cookies (cookies assigned from a subdomain of the client's own site).
Another problem is cookie deletion. When web analytics depend on cookies to identify unique visitors, the statistics are dependent on a persistent cookie to hold a unique visitor ID. When users delete cookies, they usually delete both first- and third-party cookies. If this is done between interactions with the site, the user will appear as a first-time visitor at their next interaction point. Without a persistent and unique visitor id, conversions, click-stream analysis, and other metrics dependent on the activities of a unique visitor over time, cannot be accurate.
Cookies are used because IP addresses are not always unique to users and may be shared by large groups or proxies. Other methods of uniquely identifying a user are technically challenging and would limit the trackable audience or would be considered suspicious. Cookies have become the default option because they reach the lowest common denominator without using technologies regarded as spyware.
Unique landing pages vs referrals for campaign tracking
Tracking the amount of activity generated through advertising relationships with external web sites through the referrals reports available in most web analytics packages is significantly less accurate than using unique landing pages.
Secure analytics (metering) methods
All the methods described above (and some other methods not mentioned here, like sampling) share the central problem of being vulnerable to manipulation (both inflation and deflation). This means these methods are imprecise and insecure (in any reasonable model of security). This issue has been addressed in a number of papers,[11][12][13][14] but to date the solutions suggested in these papers remain theoretical, possibly due to lack of interest from the engineering community, or because of the financial gain the current situation provides to the owners of large websites. For more details, consult the aforementioned papers.
See also
- Mobile Web Analytics
- Eurocrypt
- List of web analytics software
- Web bug
- Web log analysis software
- Online video analytics
References
- ↑ The Official WAA Definition of Web Analytics
- ↑ 2.0 2.1 Web Traffic Data Sources and Vendor Comparison by Brian Clifton and Omega Digital Media Ltd
- ↑ Increasing Accuracy for Online Business Growth - a web analytics accuracy whitepaper
- ↑ Hosted v software v hybrid web analytics tools - blog article
- ↑ IPInfoDB (2009-07-10). "IP geolocation database". IPInfoDB. http://ipinfodb.com/ip_database.php. Retrieved 2009-07-19.
- ↑ http://www.google.com/support/googleanalytics/bin/answer.py?hl=en&answer=32981
- ↑ Web analytics integrated into web software itself
- ↑ http://blog.clicktale.com/2009/10/14/what-google-analytics-cant-tell-you-part-1/
- ↑ http://www.google.com/support/googleanalytics/bin/answer.py?hl=en&answer=32981
- ↑ clickz report
- ↑ Naor, Moni; Pinkas, Benny (1998). "Secure and Efficient Metering". Advances in Cryptology - Eurocrypt 1998: International Conference on the Theory and Application of Cryptographic Techniques. Espoo, Finland. http://www.wisdom.weizmann.ac.il/%7Enaor/PAPERS/meter_abs.html. Retrieved 2007-12-27.
- ↑ Naor, Moni; Pinkas, Benny (April 14-18, 1998). "Secure Accounting and Auditing on the Web". Seventh International World-Wide Web (WWW) Conference, 1998. Brisbane, Australia. http://www.pinkas.net/PAPERS/www7paper/p336.htm. Retrieved 2007-12-27.
- ↑ Franklin, Matthew; Malkhi, Dahlia (1998). "Auditable Metering with Lightweight Security" (PostScript). Journal of Computer Security, 1998 6 (4): 237–256. http://research.microsoft.com/~dalia/pubs/meter-ftp.ps. Retrieved 2007-12-27.
- ↑ Johnson, R.; J. Staddon (2007). "Deflation-secure Web metering" (PDF). International Journal of Information and Computer Security 1. http://www2.parc.com/csl/members/jstaddon/publications/03_Staddon.pdf. Retrieved 2008-12-01.