The number of websites has grown exponentially in the past few years, so much so that the web has been inundated with information (Evoy 2017). Consequently, webmasters have been pushed to pay attention to what lies beneath the surface of their content, which has brought to light the need to take action so as not to be lost in the masses.

This study centers on Alexa Internet (referred to here simply as Alexa), an analytics tool that collects most of its data through a toolbar installed by consenting users around the world (Alexa Internet 2017).

In order to evaluate the quality of the results obtained with Alexa, they will be compared with the results captured by other tools, most notably Google Analytics (GA) (Google 2017).

So then, where to begin in the complex world of web metrics? Here begins the journey of finding data that will allow this study to evaluate the real quality of Alexa: its transparency, its reliability, its trustworthiness, and even how it stands up to the competition.

Data: What to look for

First off, the manner in which data is collected needs to be identified for each platform. In this study, three different methods are considered: JavaScript, the toolbar and web logs (Evoy 2017, Gawron 2017).

JavaScript is the method associated with Google Analytics. Simply put, it is a snippet of code integrated into your website that allows GA to track numerous metrics and statistics attached to the site. More recently, Alexa has started using a similar method to collect data, through a paid service that certifies the website.
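For reference, the kind of code involved is Google's 2017-era analytics.js async tracking snippet, pasted into a page's HTML; `UA-XXXXXXX-Y` is a placeholder that stands in for a site's own property ID:

```html
<!-- Google Analytics async tracking snippet (analytics.js) -->
<script>
window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
ga('create', 'UA-XXXXXXX-Y', 'auto'); /* placeholder property ID */
ga('send', 'pageview');
</script>
<script async src="https://www.google-analytics.com/analytics.js"></script>
```

Every page load then reports a "pageview" hit back to Google's servers, which is what ultimately feeds the metrics visible in the GA interface.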

The toolbar is an instrument installed by users of their own volition; it then tracks and collects information on each website visited and sends it back to the maker of the tool. This is the primary method used by Alexa.

Web logs are records collected by the website's own server, and they are used by multiple tools. This method will be less important to the study than the others.
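To make the web-log method concrete, here is a minimal Python sketch that parses one request record in the Common Log Format, the classic layout many web servers use for access logs (the log line itself is fabricated for illustration):

```python
import re

# One fabricated request record in Common Log Format, as a server might write it.
LOG_LINE = '127.0.0.1 - - [18/May/2017:10:00:00 +0200] "GET /index.html HTTP/1.1" 200 2326'

# Regex capturing the fields relevant to basic traffic metrics.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_log_line(line):
    """Parse one Common Log Format line into a dict, or None if malformed."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

record = parse_log_line(LOG_LINE)
print(record['path'], record['status'])
```

Counting such records per path is, in essence, how log-based tools derive a metric like "page views" without any code on the page or toolbar in the browser.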

After identifying the tools to use, data can be collected for each one using a specific sample of chosen websites.

This also means a set of metrics must be chosen, more specifically ones that will highlight the pros and cons of each tool.

However, not all of the tools offer the same metrics, hence the importance of creating a benchmark that lists all the comparable indicators.

Furthermore, some metrics will be useful when it comes to analyzing and then interpreting the data, such as “page views” (the number of pages viewed by visitors), whereas something like “page value” (an estimate of how much a page is worth in monetary terms) will have little to no data attached to it (Kaushik 2007).
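To make the benchmark idea concrete, here is a minimal Python sketch; the metric names and their per-tool availability are illustrative assumptions, not the study's actual findings:

```python
# Hypothetical availability table: which tool exposes which metric.
METRICS = {
    "page views":   {"Google Analytics": True,  "Alexa": True},
    "bounce rate":  {"Google Analytics": True,  "Alexa": True},
    "page value":   {"Google Analytics": True,  "Alexa": False},
    "traffic rank": {"Google Analytics": False, "Alexa": True},
}

def comparable_metrics(table):
    """Return, sorted, only the metrics that every tool in the table reports."""
    return sorted(m for m, tools in table.items() if all(tools.values()))

print(comparable_metrics(METRICS))  # the indicators a fair comparison can use
```

Only the indicators surviving this intersection can be compared across tools; the rest can still describe one tool, but not arbitrate between them.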

Data: Our experience

Thus far, a number of difficulties have arisen even though data capture has not yet begun. These issues predominantly concern establishing a sample of websites and the way the tools function. For the two main platforms chosen (Alexa and GA), multiple setbacks were identified:

Alexa only opens up certain metrics in its free version, which means the metrics to be used have to be chosen carefully.

The way Alexa works also has to be taken into account: it does not separate websites that belong to the same domain, so all data is pooled at the domain level. This complicates the choice of the website sample, because GA does differentiate between the websites.

Another problem with Alexa is that if a website does not attract enough traffic, there will not be sufficient data, rendering any comparison with GA meaningless.

Google Analytics requires a website to have signed up for its service before any data is available. It also requires that the website's analytics be made public (or shared) in order to access the platform.

Throughout the research project most of the data will be “on screen”, so it will need to be captured and stored. Since the data from each tool must be compared, it will have to be captured simultaneously to ensure consistency. This could become problematic given the quantity of data to be processed. To ensure that no data is lost during gathering, multiple methods will be explored, such as data export or even website crawling.
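One simple way to keep simultaneously captured readings aligned is to stamp every exported batch with its capture time. A minimal Python sketch, with fabricated example readings (the file name and column layout are assumptions for illustration):

```python
import csv
from datetime import datetime, timezone

def record_snapshot(rows, path):
    """Write one batch of captured metrics to CSV, stamping every row with the
    capture time so readings from different tools can be matched up later."""
    captured_at = datetime.now(timezone.utc).isoformat()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["captured_at", "tool", "site", "metric", "value"])
        for tool, site, metric, value in rows:
            writer.writerow([captured_at, tool, site, metric, value])

# Fabricated example readings, purely for illustration:
record_snapshot(
    [("Alexa", "example.org", "page views", 1234),
     ("Google Analytics", "example.org", "page views", 1198)],
    "snapshot.csv",
)
```

A shared timestamp per batch makes later discrepancies attributable to the tools themselves rather than to readings taken at different moments.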

Apples and oranges: There is no such thing as a perfect tool

No matter which approach is used, there will never be a perfect tool. Just like apples and oranges, each analytics platform is in essence comparable to its counterparts, yet numerous factors will affect the data and the results (Evoy 2017).

The tool should go hand in hand with the goals you set for yourself, which sometimes creates the need to use multiple analytical means.

When weighing the options, some important elements to consider emerge:

  • Transparency: identifying the origin of the data
  • Reliability: using multiple tools to validate the results
  • Trustworthiness: interpreting the meaning of the information
  • Flexibility: choosing a tool depending on the metrics available

Ultimately, to guarantee that this study is well and truly reliable, it will be important to be mindful of all the factors surrounding each tool that ends up being used.

by Megan Fuss and Sophie Johner


ALEXA INTERNET, 2017. Alexa an Amazon.com company [online]. [Consulted 6 March 2017]. Available at: https://www.alexa.com/

EVOY, Ken, 2017. The Definitive Guide to Alexa. Business2Community [online]. 5 May 2017. [Consulted 18 May 2017]. Available at: http://www.business2community.com/online-marketing/definitive-guide-alexa-01834263#bXqQlRxGBYbyl9xD.97

GAWRON, Karolina, 2017. Infographic: JavaScript Tracking vs. Web Log Analytics. PIWIK PRO [online]. [Consulted 18 May 2017]. Available at: https://piwik.pro/blog/javascript-tracking-web-log-analytics/

GOOGLE, 2017. Google Analytics Solutions [online]. [Consulted 19 March 2017]. Available at: https://www.google.com/analytics

KAUSHIK, Avinash, 2007. Web Metrics Demystified. Occam’s Razor by Avinash Kaushik [online]. 11 December 2007. [Consulted 8 April 2017]. Available at: https://www.kaushik.net/avinash/web-metrics-demystified/