This article won’t recount the history of Safari’s Intelligent Tracking Prevention (ITP) initiative, nor will it offer explicit guidance on implementing one of the many workarounds which are now proliferating. Instead, my goal here is to provide you with a digestible summary of what the forthcoming update to ITP means for you, as a site owner, and what you need to be doing to prepare for it.
Many analysts will first encounter the concept of scope in Google Analytics while exploring custom dimensions and metrics. But scope is one of the fundamental characteristics of GA’s data model, and a thorough understanding of it will enable you to use the platform far more effectively!
Instead of wondering why there isn’t a Pageviews metric on the channels report, you’ll understand this as something which follows logically from how GA collects and stores data. What previously seemed like frustrating limitations will make complete sense. Understanding the interplay between dimensions, metrics, and their scopes is a hugely powerful skill for an analyst, so let’s get started.
Over the last few years, the Google Tag Manager product team have done an incredible job responding to requests and developing the platform. From enterprise functionality like Zones to time-saving features like RegEx table variables and element visibility triggers, the past 18 months have seen GTM go from strength to strength.
That said, in the spirit of the festive season – specifically the issuing of outrageous demands – I’ve decided to put together a list of everything we don’t have yet. This is by no means a comprehensive list, nor is it in any particular order; instead, it’s a mixture of little time-saving improvements and major new features which would benefit power users.
Today I’m going to show you how to harness data on the real-world usage of your website to shed light on what’s breaking and where.
Last week, Google announced that in July 2018 it would make another major stride towards the complete normalisation of HTTPS encryption. Version 68 of the Chrome browser will be the first to explicitly mark all HTTP pages (i.e. every URL served over the legacy protocol) as “not secure”. Operating a secure checkout on a predominantly insecure site is no longer a viable option.
While moving to HTTPS is easier and cheaper than ever before, it is nevertheless vital that any protocol migrations be carried out carefully and with SEO oversight. The onus is on you to ensure a smooth transition, and one of the most common roadblocks is mixed content.
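Mixed content arises when a page served over HTTPS loads sub-resources — scripts, images, stylesheets — over plain HTTP, which browsers will block or flag. As a rough illustration (not a substitute for a full crawl-based audit, which would also need to cover CSS `url()` references and dynamically injected assets), a small scanner can flag insecure sub-resource references in a page's HTML:

```python
from html.parser import HTMLParser


class MixedContentScanner(HTMLParser):
    """Flags sub-resources referenced over plain HTTP.

    Illustrative sketch only: checks src attributes generally, and
    href only on <link> tags (anchor hrefs are navigation, not
    mixed content).
    """

    def __init__(self):
        super().__init__()
        self.insecure = []  # (tag, url) pairs found in document order

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if value and value.startswith("http://"):
                if name == "src" or (name == "href" and tag == "link"):
                    self.insecure.append((tag, value))


def find_mixed_content(html):
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure


# Hypothetical page: one secure stylesheet, two insecure assets.
page = """
<html><head>
  <link rel="stylesheet" href="https://example.com/site.css">
  <script src="http://example.com/legacy.js"></script>
</head><body>
  <img src="http://example.com/logo.png">
</body></html>
"""

print(find_mixed_content(page))
# [('script', 'http://example.com/legacy.js'), ('img', 'http://example.com/logo.png')]
```

In practice you would run a check like this across a full crawl of the site before flipping the switch, then fix the offending references at source or rewrite them to protocol-relative or HTTPS URLs.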
When it comes to direct traffic in Google Analytics, there are two deeply entrenched misconceptions.
The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.
Last week, the Google Tag Manager team launched the Element Visibility trigger. If you’re not excited by this, you should be. In this short post I outline how to use and configure this trigger and its associated new built-in variable types, and offer a few tips for how to derive actionable insight from element visibility tracking.
In the three and a half years since I launched this website, a lot has changed. Since I last wrote, I’ve refactored my CSS, switched hosting provider, changed my development toolset, migrated to HTTPS, moved to GitHub, and lots more. In fact, until recently, the only thing that hadn’t changed was my approach to actually adding content: while on other projects I’ve experimented with a variety of CMSs and frameworks, this website has remained firmly hand-coded.