Many of you may have noticed a small checkerboard icon in the corner of Google Analytics. This subtle icon has been showing up more and more over the past few months, and it indicates that Google Analytics is applying data sampling to the report you are viewing. At first it seemed to appear only on screens with advanced segments enabled, but recently it has begun to show up even on standard reports and data views with low sample sizes. The growing problem is that higher-traffic sites are forced into sampling mode for every report, and the default precision slider sits in the middle, balanced between "faster processing" and "greater accuracy." For a high-traffic website viewing several months of data with a custom segment, even the "greater precision" setting can produce a report based on less than 25% of actual traffic.

When Big Data Means Shortcuts

We have a Big Data problem. Google collects an incredible amount of data across Analytics properties worldwide, and it is understandable that the load on its data centers is enormous. To save energy and reduce processing time, Google applies data sampling to reports. In theory, sampling should not reduce accuracy, because the sampled data should be statistically significant and representative of the whole. If sampled reports are a problem for you, you can pay six figures for enterprise-class analytics, but most companies simply cannot afford Premium Analytics.

Beware of the Checkerboard

Although in theory a sampled report should be as statistically accurate as an unsampled report in Google Analytics, you should question that assumption before trusting it. In real-world reporting, there is actually a wide range of variation from one refresh to the next during normal viewing of a report.
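To see why "statistically significant" does not mean "identical to the full data," it helps to look at the math. The sketch below is illustrative only: Google does not publish its exact sampling algorithm, so this assumes a simple random sample of visits and uses the standard error of a proportion to approximate the uncertainty a sampled conversion rate carries.

```python
import math

def sampled_cr_interval(visits, conversions, sample_fraction, z=1.96):
    """Approximate 95% confidence interval for a conversion rate
    estimated from a random subsample of visits.

    Illustrative assumption: the report samples visits uniformly
    at random, which may not match Google's actual method.
    """
    n = int(visits * sample_fraction)   # visits actually counted
    p = conversions / visits            # underlying conversion rate
    se = math.sqrt(p * (1 - p) / n)     # standard error at sample size n
    return p - z * se, p + z * se

# A ~3.3% conversion rate on 460,000 visits, sampled at 25%:
lo, hi = sampled_cr_interval(460_000, 15_180, 0.25)
print(f"plausible range: {lo:.4%} .. {hi:.4%}")
```

Even at a 25% sample the interval spans roughly a tenth of a percentage point either way, which is small in absolute terms but large relative to a month-over-month change of a few percent.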
This level of variation is unacceptable from an accountability standpoint, as pay-per-click advertisers, marketers, and e-commerce managers need accurate data to make critical decisions about marketing campaigns and strategies. Let's look at a real-world example below. In this report, two date ranges are compared, and a custom segment excluding paid advertising campaigns is applied. Traffic for each date range is more than 460,000 visits.

Default Sample Rate

In the first report view, the default sampling rate is used, which bases the report on 21.59% of visits. Note that this figure is the sum of visits across the two date ranges. The report calculates that the conversion rate fell 4.01% month over month, from 3.36% to 3.23%. It also shows the "Message list" goal increasing 3.10% month over month.

Adjusted Sampling

Now look at this report, in which we moved the slider all the way to "higher precision" for sampling. The data is still sampled, but the report is now based on 41.54% of total visits, nearly twice the default sample size. Note that the conversion rate actually rose 9.05% rather than falling 4.01%. The "Message list" goal now shows a 22.43% increase over the month, which is quite different from the 3.10% reported earlier.
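The swing from a reported 4% drop to a 9% rise is easier to believe once you simulate it. The toy model below (an assumption for illustration, not Google's implementation) draws repeated random subsamples from the same underlying traffic and recomputes the month-over-month change each time, mimicking what different report refreshes can show.

```python
import random

random.seed(1)

def sampled_rate(visits, true_rate, fraction):
    """Estimate a conversion rate from a random subsample of visits.
    Toy model: each sampled visit converts with probability true_rate."""
    n = int(visits * fraction)
    conversions = sum(random.random() < true_rate for _ in range(n))
    return conversions / n

# Same underlying traffic, measured three times at the default ~22% sample:
for trial in range(3):
    last_month = sampled_rate(460_000, 0.0336, 0.22)
    this_month = sampled_rate(460_000, 0.0323, 0.22)
    change = (this_month - last_month) / last_month * 100
    print(f"trial {trial}: month-over-month change {change:+.2f}%")
```

The true underlying change here is about -3.9%, yet individual trials can easily report anything from a steep drop to a modest gain, because the sampling noise on each month's rate is comparable in size to the change being measured.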
This is just one of many possible comparisons, but it shows the radical difference data sampling can make in your Analytics reports. The more visitors your site has, the more frequently sampling occurs. In the example above, even choosing maximum precision did not produce a report that included all visits. For smaller, lower-traffic websites, sampling may have little impact.

Takeaways

All Google Analytics users need to pay attention to the sampling setting when viewing reports. Sampling will occur more and more frequently for higher-traffic sites and whenever more data is requested in a report. You may find that a report is based on 100% of a single month's data, yet sampled down to 60% when viewing six months. Agencies must educate their clients about these differences and potential inconsistencies, and in-house marketers need to educate their colleagues. Sampling may put significant pressure on high-volume sites to upgrade to Premium Analytics, but as long as sampling and its potential pitfalls are understood, free Analytics is likely to remain the standard for most websites.

Paolo Vidali is a digital media strategist specializing in PPC, e-commerce, and conversion optimization.