Welcome Mr. Viewthrough…

The time has come for Viewthrough.org to be folded into the Tip of the Spear blog.

Many years after first seeing the glaring problem with ad measurement, we now have bigger fish to fry: rampant ad fraud. While clicks are loaded with fraud as a means to keep up appearances, viewthroughs are very difficult to fake unless forms are being filled out and logins faked… not impossible, but that veers into lead-gen fraud.

That said, viewthrough is still an important measure for discerning digital marketers.

Common Sense Digital Ad Measurement

Been too long since the last post but hopefully more to come!

Digital analytics and ad fraud expert Augustine Fou provides an overview of why and how to start upping your digital advertising measurement game. Especially handy is his approach to getting started with simple on/off tests in different markets.
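For the quantitatively inclined, here is a minimal sketch of the lift math behind such an on/off test: run ads in test markets, hold them out of similar control markets, and compare conversion rates. Every market name and number below is hypothetical.

    # Hypothetical matched-market on/off test: ads ON in test markets,
    # ads OFF in otherwise similar control markets.
    test_markets = {"Columbus": (1200, 95000), "Tucson": (840, 71000)}   # (conversions, audience)
    control_markets = {"Omaha": (900, 92000), "Fresno": (700, 74000)}

    def conversion_rate(markets):
        conversions = sum(c for c, _ in markets.values())
        audience = sum(a for _, a in markets.values())
        return conversions / audience

    test_rate = conversion_rate(test_markets)
    control_rate = conversion_rate(control_markets)
    lift = (test_rate - control_rate) / control_rate
    print(f"Test: {test_rate:.3%}  Control: {control_rate:.3%}  Lift: {lift:.1%}")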

Getting Started is Better than Not

Not perfect, but excellent training wheels for harried digital marketers in need of a starting point for an experimental design approach.

See: Context, Correlation, Coincidence, Causation; and plan to spend time on the considerable wealth of articles on the Fou Analytics Practitioners blog.

Response to Openly Selling Counterfeit “bAds”

Thanks to Dr. Augustine Fou for articulating, in his latest post Openly Selling Counterfeit bAds, that programmatic media clearly has an ongoing IVT/NVT problem. In my experience, a toxic mix of weak technology, sloppy process and biased analytics is often to blame.

The bAds framework is clever, yet as presented it is Manichean: the options are either a) $35 CPM “real pubs” or b) $3 CPM Cucci/Bluberry. The problem is that reality is more nuanced, and this line of thinking risks “throwing the baby out with the bath water.” To keep it simple, there are (at least) three categories: 1) established media brands, 2) legit mid-tier/indie “long-tail” sites and 3) clickbaity content-farm sites (“scammers”).

The subtle implication that advertisers “get what they pay for” might suggest to some that they are misguided, cheap and even penny-wise/pound-foolish. An uncomfortable truth, as far too many certainly are! However, advertisers must manage considerable digital ad execution risk, which in practice they do by preferring lower-CPM programmatic media – if they can show it works. For some it does; others are just going through the motions.

As an advertiser, that $35 CPM may or may not be overpriced. Value is in the eye of the beholder when it comes to programmatic advertising. Just because a given ad is being served up by a big-brand “real pub” is not justification enough for paying 10x more, especially since it can still be a bAd. Given the recent glitches that we’ve heard about (USA Today, Facebook, etc.), these “real pubs” have plenty of work to do on the basics, let alone Fou Analytics-level housecleaning. Scarier is what we have not heard about!

While not an exact analog, in a prior life at a major retailer, we had occasion to compare the performance of high-CPM “real pubs” and a far-lower-CPM ad network (still around today). At the time we were piloting Adometry (pre-Google) to understand multi-touch attribution. Additionally, we leveraged an advanced integration between Adobe Analytics and the DoubleClick ad server instance of our media agency (Digitas Chicago). It turned out that the “real pub” digital ad inventory (foisted upon us as part of a TV buy) performed awfully compared to this ad network. As in, an order-of-magnitude difference in both traffic and ecommerce revenue (clickers plus viewthrough – even after calibrating for incrementality). As a second measure, we used algorithmic attribution (Adometry used Shapley values), where the story was even starker. We were able to see the actual contribution of various media relative to each other, and the big-brand media only made sense if the CPM rates could be renegotiated to a fraction of what they were getting.
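Adometry’s exact model was proprietary, but the Shapley idea itself is simple: average each channel’s marginal contribution over every ordering of the channels. A toy sketch with made-up coalition values for two channels:

    from itertools import permutations

    # Hypothetical conversions credited to each subset (coalition) of channels.
    v = {
        frozenset(): 0,
        frozenset({"search"}): 100,
        frozenset({"display"}): 40,
        frozenset({"search", "display"}): 180,  # together they beat the sum apart
    }
    channels = ["search", "display"]

    def shapley(channel):
        orders = list(permutations(channels))
        marginal = 0.0
        for order in orders:
            before = frozenset(order[:order.index(channel)])
            marginal += v[before | {channel}] - v[before]
        return marginal / len(orders)

    for ch in channels:
        print(ch, shapley(ch))  # search: 120.0, display: 60.0 (sums to 180)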

With all that said, it seems dicey to advocate across the board that advertisers should spend more on $35 big media brands – essentially, a “flight to quality” risk mitigation strategy. Still, it all comes down to nuance (again) while doing the hard work of rooting fraud out of programmatic media and defining measurable objectives upfront. Given the same ceteris paribus target audience and a choice of $35 CPM “real pubs” vs. $3 CPM “TBD,” the choice is obvious.

Going beyond bAd/bAudience quality, most of these “real pubs” are arguably compromised by bContent across both news and entertainment categories. You might expect that bottom-of-the-barrel quality from “scammers,” but the sad truth is that many big media are no better than mid-tier and in some ways are worse when it comes to “quality.” Big media news brands were built on years of journalistic integrity by being objective and presenting both sides of a story. That is a quaint relic of yesterday, as most have sacrificed their brands at the partisan altar.

In any case, let’s hope that more real marketers do stand up for better measurement.

And in the meantime, the question is: do “real pubs” deserve a premium?

PS: Discerning digital marketers can up their game by immersing themselves in Fou Analytics’ wealth of articles; make sure you have plenty of coffee or tea!

Anti-Big G Knuckleheads and Unintended Consequences

I recently learned of the Knuckleheads (yes, that is their name) initiative. On the surface they advocate for more competition in search engines. However, after digging deeper, what they are really about is enabling said competition through the creation of a new “US Bureau of the Shared Internet Cache.” And how? By essentially nationalizing Internet search, specifically by confiscating Google’s successful Internet crawler search cache.

Now, past readers of this blog are aware that I’m not a fan of Big G, and 99% of the time I use Bing. However, it is said that two wrongs don’t make a right.

The beef is supposedly about the de facto preference of millions and millions of Web site owners – some big, some tiny in terms of traffic volume. Some of them explicitly give preference to Big G via their robots.txt configuration file by disallowing (blocking) all the others! Knuckleheads claim these independent site owners are biased and present detailed, documented evidence.
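For illustration only, a robots.txt of the kind being documented might look like this – Big G’s crawler welcomed, everyone else shut out:

    # Hypothetical robots.txt showing explicit preference for one crawler
    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /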

The first problem is that what Knuckleheads call “bias” is the site owner’s freedom of choice. Predictably, the search wannabes don’t like that and presume to know better.

Lipstick on the Pig

All told, what these search engine wannabes are pushing to do is a really bad idea but all-too-familiar rent-seeking. Instead of developing something better, they seek special privilege by appropriating from the successful incumbent.

Here are a few reasons to consider that suggest darker motives:

  1. Robots.txt can be ignored by crawlers (think red herring)
  2. Calls for outright property theft and/or a massive taxpayer gift to Big G
  3. Terrible precedent for more and new digital grifters
  4. Unintended consequence of likely disinvestment

First, robots.txt truly is an optional guideline for crawlers and bots. According to Moz.com: “Some user agents (robots) may choose to ignore your robots.txt file.” New crawlers in stealth mode, student CS projects and black hats all ignore it. That means the supposed rationale doesn’t even jibe; a sketch of how voluntary compliance works follows.
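Python’s standard library even ships a parser for this, and it makes the voluntary nature obvious: a crawler must choose to consult robots.txt before fetching (URL below is a placeholder):

    from urllib.robotparser import RobotFileParser

    # Polite crawlers opt in to this check; impolite ones simply skip it.
    rp = RobotFileParser("https://www.example.com/robots.txt")
    rp.read()
    if rp.can_fetch("MyNewCrawlerBot", "https://www.example.com/some-page"):
        pass  # fetch the page
    # Nothing stops a crawler from fetching the page without ever calling can_fetch().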

Most Web sites with freely available content only restrict access where user authentication is enforced by the Web server. Everything else is essentially “come and get it.” Would-be competitors to Big G can and do crawl Web sites, which makes for serious misdirection about what Knuckleheads is really about. Kinda’ shady.

Second, the essential facilities doctrine that is referenced is a euphemism that economically resembles eminent domain, in that government force is used for the taking. The difference is that eminent domain is laid out in the 5th Amendment and requires just compensation. It makes sense that would-be competitors and their investors would be on board.

Perhaps less immoral is that Big G might get compensated for the taking. With real estate that is easy enough to do, with recent comparables used to establish the price to be paid for seizing the asset. How would the price be established for Big G’s Internet crawler cache? By politicians, bureaucrats and eager competitors, of course. How would that be paid for? Most likely one or both of: baking the cost into some budget covered by the taxpayers or, worse still, pumping up the money supply just a little bit more. It is also not hard to imagine the process becoming punitive as well, e.g. a one-time fine, covering legal costs or maybe ongoing administrative fees. As to which Faustian bargain, it just depends on the legal argument.

Next would be the setting of a precedent, i.e. the path being cleared for even more Internet regulations built on the tortured logic of essential facilities. Ginning up government-enabled franchises – what could go wrong? For professional lobbyists this is a feature, not a bug.

Last, as usual there are unintended consequences of such privatization of gain (the windfall to the Big G wannabes) at public cost (all searchers). In recent years economics has circled around to the idea of externalities, which can be good as well as bad. Consider the example of demand shifts in the price of alcohol vs. education: more expensive booze is supposedly better for society (definitely better for government revenues), while a more educated populace has positive “spillover effects.”

In their defense, the main positive externality of Big G is that everyone benefits from everyone else’s search behavior via their PageRank algorithm. As for the Knuckleheads, they want to put the future quality of search at risk to clearly benefit themselves and MAYBE benefit some abstract Internet searchers. Once the cache becomes public, one problem gets solved but several others pop up. For Big G, the incremental benefit of spending on crawler cadence, better algorithms or speedy servers just went through the floor. That means the future state is seriously put at risk with worse quality (file under: Tragedy of the Commons).

Be Careful What You Wish For…

End Result: Flexible Ethics

In the Knucklehead dream world, US taxpayers/Internet users trade the technical availability of more search choice (which they were always free to use beforehand) from new players in exchange for a more invasive/powerful government bureaucracy, combined with either abject theft of private property or a big gift of cash to Big G (that said US taxpayers pay for). Plus the added benefit of likely disinvestment by Big G, yielding worse search quality in the future.

These flexible ethical standards are illustrated by the diagram below. Nearly everybody understands that #1 is immoral behavior: they will not put a gun to John’s head to force him to give up his possessions. Further, they also get that it is wrong to pay a 3rd-party intermediary to do it, as shown in #2. They still get that it is immoral, criminal and wrong for society. However, when we engage a 4th entity, the ethics change and far too many are now OK with force being used.

Knuckleheads purport that their position is ethical – it is not. If they somehow stole the technology we’d all see it for what it is; even if they had an offshore team or hackers do the dirty work it would still be plain to see. Only by whitewashing through a 4th party, one that is supposed to be disinterested (the government), is it possible to pull this off. Knuckleheads is #3, plain and simple.

Visual Showing Illogic of Flexible Ethics

Alternatives

It is disappointing, then, that there isn’t much in the way of free market solutions being proposed. They could incent site owners or Big G to modify their behavior. Behavioral economics and game theory could likely solve this problem without the heavy hand and nasty unintended consequences. Some thought-starters:

  • Open source consortium offering alternative cache
  • Tax incentives to make it more appealing for Big G (and any others) sharing cache or cloud server capacity
  • Crowdfunding campaign to support the effort
  • Education of the market
  • Etc…

Interesting Framework on Rent-seeking

I ran across the above framework, which is telling and relevant. Seeking a monopoly from the government is the strategy used by weak competitors and weak co-operators. Not a good look.

Not that I say it lightly, but it would be better to break Big G up into many smaller businesses and force more competition/fairer dealings within the advertising, analytics, hardware, consumer products and data businesses. Even that is a tall order since nobody is forced to use Big G. Sadly, it is easier for Knuckleheads to find some contemptible career politician looking for a cause and a donation.

Bad Math Apparatchiks Find New Tax Bunny: Digital Ads

Magic Tax Bunny

Still reeling from regulations pushed by privacy zealots and the usual agitprop, here comes a disturbing development: a new digital advertising tax that continues the tradition of corrupt and poorly managed state governments never being accountable nor living within their means. Here is the story from David McCabe @ The New York Times: Maryland Approves Country’s First Tax on Big Tech’s Ad Revenue.

A terrible precedent with loads of unintended consequences; it is too bad the authors could only find one tiny quote from the IAB on the considerable counter-arguments to an advertising tax. Here are a few thought-starters:

  • More expensive ads: Regulation has a way of ensuring that what is being regulated will be of poorer quality/worse level of service and, of course, cost more. Economist Dan Mitchell nicely shows the trend. He could have added: K-12 education, drugs, banking and insurance.
  • More central planning, less free markets: It is interesting that digital ad revenue is being taxed in such a punitive way, but that is proudly mentioned as a feature, not a bug, i.e. picking winners and losers. Do legacy media like broadcast, cable nets, radio, newspapers and magazines get a pass?
  • Collateral damage: What about the impact on the rest of the ad tech stack and the media agencies that derive their revenue from a share or split of the ad revenue (as opposed to licensing fees)? More expensive advertising will mean less advertising.
  • Tip of the iceberg: Looking past Maryland, what about when the other 49 states that couldn’t balance their books before Covid decide to “get theirs”? Surely the Feds will want in on this.
  • Reverse Uno card: Can’t wait for the first rent-seekers to say they want the tax… this typical counter-move is usually for brownie points while raising costs for everybody else, which of course stifles competition: think Amazon or unions being in favor of higher minimum wages.
  • Subsidies: Will “big tech media” then become “too big to fail” and require bizarre subsidies?
  • Hidden costs: How will the new bureaucracy be administered? If the new department is paid for by the taxed, then you can be sure there is an incentive to pack it with cake jobs filled by people looking to keep busy finding more to tax. Bottom line: more civil servant jobs, no doubt.
  • Anti-growth: Advertising facilitates business growth for advertisers; that is why they do it. Unfortunately, when you tax something you get less of it – that is “settled science.” Taxing advertising will therefore retard growth and restrict job creation. That makes no sense but is as sure as gravity. More from the Mises Institute’s Defending the Advertiser; file under screwing the pooch.

Sad but All Too Often True…

Here is the Association of National Advertisers’ take on the situation.

DAA Chicago Chapter Event | Intelligent Tracking Prevention

Safari was First but Firefox and even Chrome…?

Today, many businesses are tracking and programmatically targeting website interactions through cookies and other pixels across the web. However, with the new GDPR regulations in the EU, people have become even more sensitive to how their data and web activity are being used by 3rd parties. Enter Intelligent Tracking Prevention (ITP), created by Apple’s WebKit team, which effectively limits 3rd-party browser cookies from specific websites without users explicitly enabling it.

As practitioners in the field of web analytics, one of our greatest aims is to preserve the integrity of the data we are collecting that drives measurement and optimization. With the release of ITP 2.2 earlier this year, even 1st-party cookies set via JavaScript can expire in 24 hours. Mozilla and even Google are considering similar moves to limit 3rd-party tracking cookies. Join the DAA Chicago Chapter as our panel of experts covers:

  • How does ITP work?
  • What does the new version ITP mean to you?
  • Are there any actions you should immediately take to update your web analytics?

Drinks and Potbelly sandwiches will be provided!

Speakers:
Prunil Patel, Measurement Lead at Google
Prunil has over 7 years of experience in data-driven marketing and led strategic involvement & application of insights for practice areas including B2B, Social Media, E-Commerce, and CRMs. Currently he is the measurement lead at Google.

Domenico Tassone, Founder of Viewthrough Measurement Consortium
Domenico is an analytics consultant with 25 years of experience across agency, client, consulting and ad tech sides of digital. In 1994 he co-founded Streams, one of the first Web agencies and developers of the Lilypad analytics platform.

Dave Fimek, Lead Analytics Consultant at InfoTrust
Dave works closely with InfoTrust clients to provide technical and analytic marketing services such as implementation of Google Analytics 360, development of key performance reporting, and development of unique solutions to complicated marketing problems!

Agenda:
4:30-5:00pm: Check-in, drinks & dinner
5:00-6:30pm: ITP Panel and Q&A
6:30-7:00pm: Networking

Registration:
DAA Members: $5
Non-Members: $10

Start Date: 9/24/2019 4:30 PM CDT
End Date: 9/24/2019 7:00 PM CDT

Venue Name: 1871

Location:
222 W Merchandise Mart Plaza #1212
Chicago, IL  United States 

Organization Name: Digital Analytics Association
Contact: DAA Membership Team
Email: info@digitalanalyticsassociation.org
Phone: +1 781-876-8887

REGISTER HERE
https://www.digitalanalyticsassociation.org/ev_calendar_day.asp?date=9/24/2019&eventid=468

Part IV – Behavioral Data Trojan Horse

4th Post in a 6-Part Series About the Behavioral Data Land Grab

Big G astutely recognized the latent value of the data being collected by a TMS (tag management system) several years ago when they released their free GTM platform. Such precious behavioral data could be used to power the ad targeting business directly and, in a more clandestine manner, via algorithmic modeling. In so doing they brilliantly disrupted the tag management niche… not unlike what happened with Web site analytics.

For those already using Big G’s free analytics for measurement, GTM was another way to get locked into the platform, thus reinforcing the original decision. Even more valuable to Big G was that a free tag manager might entice others using Adobe Analytics, Webtrends or IBM Coremetrics, giving Big G’s data-ingestion footprint a second chance at net-new behavioral data. Not surprisingly, these days Adobe, IBM, Webtrends and even Matomo all offer companion tag management systems of their own.

Data Leakage by Design: Tracking Tags Disguised as Analytics

Free is Not Without Cost

The fact is that Big G’s new global site tag is not needed on Web sites for clickthrough measurement of digital campaigns for two reasons:

#1: Clients already have this tracked in their own enterprise Web site analytics systems and much better at that…

  • Clients can already thoroughly measure paid search and other click-based digital channels with their own enterprise Web site analytics platforms, which were purpose-built for measuring activity on the site and linking it to sources of traffic. Site analytics campaign tracking relies on query string parameters appended to clickthrough landing page URLs – either a single parameter (CID, SID) or multi-parameter (utm_) values (see the sketch after this list). This enables Web site analytics systems to associate site behavior like engagement or conversions with specific sources of incoming traffic, e.g. visit rate by channel, bounce rate by publisher, registrations by keyword, etc. The fact is that neither Big G Ads nor Search nor AdWords tracking can provide this level of detail. Such tracking is usually 1st party and fairly reliable.
  • For this reason, site analytics platforms are superior for measuring on-site engagement in granular interaction detail (downloads, exits, video watching, form abandonment) as well as having a comprehensive view of all the digital channels that are causing site visits (social, email, direct, organic search, referrals). Such a view allows for basic attribution of conversion events back to sources of traffic. For these reasons, and notwithstanding Big G Analytics, Big G cannot have information as detailed as Adobe Analytics, Matomo, Webtrends or IBM Digital Analytics.
  • What’s more, if Big G Analytics is not used, then the advertiser’s Web site behavioral data is private and only for the client to use. That means it cannot be repurposed or data-mined to enhance somebody else’s ad campaigns elsewhere… including the advertiser’s competition. That is like unwittingly being in a de facto marketing co-op.
  • Advertisers’ media agencies should be using the enterprise Web site analytics platform of record’s data for performance optimization. Oftentimes this does not happen, and for a variety of related reasons advertisers end up with multiple site analytics systems. These seeds, once planted, tend to yield digital measurement chaos and costs.
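As promised above, a quick sketch of the utm_ convention: a tracked landing page URL is nothing more than query string parameters appended to the clickthrough destination (all values below are hypothetical):

    from urllib.parse import urlencode

    landing_page = "https://www.example.com/offer"
    params = {
        "utm_source": "publisherX",     # where the click came from
        "utm_medium": "display",        # channel
        "utm_campaign": "spring_sale",  # campaign name
        "utm_content": "728x90_v2",     # creative/version
    }
    tracked_url = f"{landing_page}?{urlencode(params)}"
    print(tracked_url)
    # https://www.example.com/offer?utm_source=publisherX&utm_medium=display&...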


#2: Big G is already keeping their own count, albeit not as accurate – effectively due to 3rd party cookie limitations. Big G’s own confidential and proprietary technical PDF document, “Introducing the New DoubleClick Global Site Tag,” states that prior to the global site tag (and in lieu of it), a portion of conversions has always been reported using statistical modeling, since Big G cannot measure traffic subject to 3rd party cookie blockers/deleters.

Why Do Advertisers Need to Spend Time and Money to Help Big G Make Conversion Counting More Accurate When Advertisers Already Have a Universe Count of Their Own?

Big G Search/Ads/AdWords cannot measure the universe of all of a Web site’s conversions using their 3rd party Floodlight tag – conversions that Web site analytics easily track using a 1st party tracking cookie method.


Bottom line: The reality is that the global site tag only improves Big G’s measurement and does not inherently benefit client-owned site analytics reporting at all. In practice and by Big G’s design, getting GTM in place will often create dysfunction by encouraging stakeholder interest in using Big G Analytics, along with the operational cost of maintaining two TMSs and maybe two site analytics systems.

In Other News…

With Big G’s Chrome browser privacy virtue-signaling making their walled garden walls even higher, it will be interesting to see if and how the FTC plays the next move. Also of note is the IAB pushing their DigiTrust solution for other ad tech stack players.

Next Post in Series: Part V – Implications for Digital Marketers

Part III – Big G & Media Minion Maneuvers

3rd Post in a 6-Part Series About the Behavioral Data Land Grab

Big G and many of their media minions are quick to point out that by using the new global site tag, they can get around ITP’s 3rd party tracking limitations. The reason is that the GTM tag architecture tricks the user’s Web browser into treating it as 1st party by changing the context. The legacy DFA Floodlight tag cannot do this, as it is a plain and simple 3rd party tracker in a 3rd party context. That DoubleClick impression cookie, served up on ad delivery (on the media publisher’s site) and then later checked for by the DFA Floodlight (on the advertiser’s site), is notorious enough at this point to be easily blacklisted by blockers and anti-virus platforms.

Voted Most Likely to be Blocked, Deleted or Purged

Manufactured Crisis?

The global site tag (gtag.js) request often comes from the media agency team as a panicked rush to install the new code snippet – tout de suite. The implication is that if these new global site tags are *not* used, then campaign measurement and therefore campaign performance will dramatically suffer or become questionable. The implied benefit of the new global site tag is that, at minimum, current paid search measurement accuracy will be better. What this really means is that Big G AdWords conversions (clickthrough and post-clickthrough) can be more accurately counted.

Most advertisers and their agencies will miss the nuance. They may not realize that simply showing more conversions in AdWords reports does not mean that Big G paid search actually caused more of them to happen. Questionable incrementality is a broader problem with paid search attribution and Big G’s walled garden of search performance data. That aside, more accurately showing the universe of conversions that an advertiser is already receiving only means that Big G’s AdWords reporting approaches the conversion tracking accuracy of site analytics like Adobe’s. Stated differently, Big G fixed their conversion tracking problem (caused by 3rd party blocking by ITP, plug-ins and anti-virus deleters), which before the global site tag relied on a predicted count. That is what has been reported out for years in AdWords. It is all about Big G more confidently taking greater credit for more of the conversions in their analytics system (not advertisers’).

Dropping the Ball: Who Do They Work for Anyway?

Instead of pushing back on Big G on behalf of their clients or suggesting alternative solutions, too many media agencies are not doing their due diligence. They are pressuring clients to just go along with the request, merely parroting that Big G recommends this without much question. The implication is that the global site tag is needed for the media agency to measure better and therefore to do their very job. At the same time, most digital advertisers today do not want to give their media agency another excuse for bad analytics and poor measurement. Meanwhile, Apple ITP is conveniently blamed for the problem.

Judas Goat Leads the Sheep Up to Slaughter at Chicago Stockyards

Expected to be stewards of their clients’ digital media business, this is an unabashed agency fail. All told, the new global site tag combined with an expanded tag footprint on the Web site is a shifty way for Big G to also ingest more highly valuable behavioral data at the expense of digital advertisers. Even more unseemly is that this is a clever end-run to thwart advertisers that sought to limit Big G’s behavioral data access by not using their Analytics/Tag Manager products in the first place. Worse, the end result is dysfunction for analytics teams and the hidden operational costs of maintaining a redundant de facto tag management system.

Such conduct by those representing themselves as agents of marketers is disappointing. Unfortunately, it is consistent with the unflattering issues of undisclosed incentives and rebates from tech companies, media vendors and others that were revealed in the ANA Media Transparency Report of 2015. Digital advertiser clients themselves are not blameless: the buck needs to stop with them.

Next: Part IV – A Trojan Horse for Digital Marketers

Part II – Apple ITP: Convenient Scapegoat for Data Grab?

Apple Safari’s Intelligent Tracking Prevention

In September 2018, Apple released Safari 12 and updated Intelligent Tracking Prevention to 2.0 as a follow-up privacy enhancement for their customers. ITP 2.0 was even stricter than ITP 1.0 (released in September 2017), going from expiring 3rd party tracking cookies after 24 hours to immediate expiration, i.e. effectively blocking all of them. The ad tech and ad network industry is rightfully concerned about this existential threat to their way of doing business. However, it should not have been a surprise to them.

For digital marketers and particularly advertisers, canny moves by Big G are taking advantage of the “crisis.” With long-time business tentacles deep and wide, the intrinsic value and security of advertisers’ campaign and Web site behavioral data should be of particular concern. Consider how data, once captured, can be leveraged by and between the paid media business (search, display ad network), the ad stack (ad server, DSP, DCO) and the site-side stack (search console, analytics, attribution, optimization and tag management).

Super Basic Overview of 3rd Party Cookie Tracking

To recap, 3rd party tracking cookies are the kind of cookies used by ad tech/networks, in whole or in part, for analytics and optimization. These cookies are different from the 1st party tracking cookies used by Web sites that users actually visit to authenticate and personalize the user experience (custom content), e.g. ecommerce sites and social networks. By contrast, 3rd party cookies are delivered to the browser by a different Web server from the Web site domain the user is accessing, e.g. Ford ads on the New York Times Web site. This is often called cookie “context.” Under the auspices of convenience, AdWords leverages an internal “link” to Big G analytics and the Big G ad server (DoubleClick/DART for Agencies/DFA/DCM/Big G Ads).
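A toy way to see “context” in code: a cookie counts as 3rd party when the server setting it does not belong to the site the user is actually on. (Real browsers compare registrable domains/eTLD+1; the helper below is deliberately naive.)

    def registrable_domain(host):
        # Naive: takes the last two labels; real logic needs the public suffix list.
        return ".".join(host.split(".")[-2:])

    def is_third_party(page_host, cookie_host):
        return registrable_domain(page_host) != registrable_domain(cookie_host)

    print(is_third_party("www.nytimes.com", "ad.doubleclick.net"))  # True: 3rd party context
    print(is_third_party("shop.example.com", "www.example.com"))    # False: 1st party context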

To put the tracking impact of ITP 2.0 on analytics and targeting in perspective: as of this writing, Safari 12 (ITP 2.0) represents about 15-20% of all Web site visits for a typical brand, with the prior Safari 11 (ITP 1.0) representing another 15-20%. That puts the combined total at 35-40% of Web site visits. In terms of unique visitors to a Web site, this is probably more like 15-20% that are at risk of being miscounted, misreported and therefore largely un-actionable. Maciej Zawadinski, one of the founders of Piwik/Matomo, wrote an informative and objective piece on ITP and tracking.

For advertisers, it is also likely that there is a very distinct sociodemographic skew to this hard-to-reach and highly perishable user segment. The group tends to be more educated, more male and higher income. Think along the lines of an oldie but a goodie: ComScore’s Natural Born Clickers from 2007, updated in 2009.

Big G’s Timing and Blaming the Victim

Although 3rd party cookie tracking has been getting worse over the years, there has been a paucity of published research on this far-reaching problem. Despite having the biggest advertiser and publisher ad server market shares, Big G has nothing to say about how reliable their 3rd party tracking cookies really are in practice. Publishers and agencies are left to figure it out themselves, though most do not have the time or resources. If the industry has such research, it is sequestered at an account level or within a QA group, as it is unflattering at best and devastating at worst.

In one published study, programmatic tech company Digilant observed in 2012 that 3rd party ad cookies degraded quickly: by 43% in the first seven days and by 75% within twenty-eight days. That should not be a surprise, with ad networks’, ad servers’ and other analytics/targeting platforms’ cookies increasingly being blocked, blacklisted and actively deleted by users.

Instead, on October 17, 2017, Big G proclaimed through the now-retired DoubleClick brand’s blog:

The two options offered (click the above link and scroll down) are:

  1. Use the fake gtag.js (see: GTM disguised as a Floodlight)
  2. Avoid any such pretense and use the naked GTM with the easy linking capability

The instructions to expand their presence and put it up on *all* pages almost seem helpful.

GTM Presents As A Helpful Measurement Solution…But Helping Whom?

The inconvenient truth for many digital advertisers, their media agencies and 3rd party cookie-reliant ad servers/analytics is that, as a result of the miscounting, important digital metrics are severely compromised. Ad tech company Flashtalking regularly quantifies and publishes research on how inaccurate ad metrics that depend on 3rd party tracking cookies can be. Most recently, they bravely put the user cookie rejection rate at 47%! For individual users, they estimate 25% are rejecters. That means that when it comes to analytics and optimization, reach is effectively overstated while frequency and viewthrough conversions are understated. For more information, see Viewthrough.org’s recent post, 3rd Party Tracking Weakening, Viewthrough Impacted.

What to Do About It?

Digital marketers do have options to improve both targeting and analytics without risking more Web site behavioral data leakage to hidden players in the ad tech stack. Fortunately for advertisers, these days there is a new breed of ad server that is more transparent and uses more reliable tracking methods: Flashtalking, TrueEffect and possibly Sizmek. Are there others out there? Let TOTSB know!

Next up: Part III – Big G Media Minion Maneuvers

3rd Party Tracking Weakening, Viewthrough Impacted

A recent Webinar by Flashtalking, an ad serving/attribution technology company, brought up the digital media analytics elephant in the room: 3rd party cookie tracking by display ad servers is unreliable and providing poor measurement to advertisers and their agencies.

The slow and steady decay of cookies is nothing new (Digilant 2012 Cookie Deletion Study), but seeing actual numbers that impact specific metrics is troubling. For digital advertising analytics, several key campaign metrics are severely compromised.

Ad server metrics like reach, frequency and viewthrough conversions are way off!

Given the focus of Viewthrough.org, it is worth specifically focusing on the relevant findings of Flashtalking’s recent Cookie Rejection Index research that was shared during the Webinar.

By testing legacy and proprietary methods together in the same campaign, Flashtalking is able to observe a strictly 3rd party cookie-based count alongside a machine fingerprinting (plus cookie) count, compare them, and then determine and report on actual 3rd party cookie measurement efficacy. Discerning digital marketers need to check for themselves and benchmark their current tech stack’s accuracy.

The typical Viewthrough Conversion Process

For the recurring research study, Flashtalking analyzes 25 of their clients’ brands, tracking simultaneously with both cookies and their proprietary machine fingerprinting methodology, the ftrack ID, which probabilistically establishes digital identity on a given device. Not perfect, but a major step forward and better than legacy ad serving methods.

Machine Fingerprinting 101

Before diving into the affected metrics, a quick primer on machine fingerprinting will be helpful. This alternative method of digital identification arose because of the problems with cookies, which require a Web server to place them (via the browser) on a user’s machine and then retrieve them later. Better analytics solutions and ad networks have been using this method for years. Unlike cookies, machine fingerprinting cannot be rejected, blocked or deleted. However, it is not as precise, hence the term probabilistic (likely).

For those unfamiliar with this passive method of tracking, visit and test a Web browser on Panopticlick from the Electronic Frontier Foundation. Machine fingerprinting relies on the unique combination of browser characteristics, e.g. plug-ins and extensions, that are shared with the Web server being accessed. Nothing new there; a toy version appears below.
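For intuition, here is a toy fingerprint: hash a handful of browser traits into one stable ID. Real systems use many more signals plus entropy estimates; the traits below are made up.

    import hashlib

    traits = {
        "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
        "language": "en-US",
        "timezone": "America/Chicago",
        "screen": "1920x1080x24",
        "plugins": "Chrome PDF Plugin;Widevine CDM",
    }
    blob = "|".join(f"{k}={v}" for k, v in sorted(traits.items()))
    fingerprint = hashlib.sha256(blob.encode()).hexdigest()
    print(fingerprint[:16])  # same traits -> same ID, no cookie required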

Recent Web browser test results showing a sample of the fairly unique combination of traits that enable probabilistic identification

Unreliable Ad Server and Attribution Metrics

The fact is that legacy ad servers and certain analytics and multi-touch attribution platforms that have not dealt with this problem are providing junk metrics. None of them provide any transparency on how bad their counts really are, nor any trend data. As a result, most digital marketers are blissfully unaware.

Unreliable and unstable 3rd party tracking cookies impact ad servers, analytics, attribution platforms, ad networks and more.

On the flip side, many ad networks have solved this problem, as it immediately impacted their bottom line. Worse campaign optimization on their side leads to worse advertiser results. That said, rarely is it mentioned, or internal research about it shared, with digital advertiser clients or their agencies. Most are usually only concerned with results and don’t ask questions.

For their part, Flashtalking’s research reveals just how bad it is, finding the following ad metrics seriously compromised:

  • Conversions are understated by an average of 40% (this includes viewthrough conversions and, by extension, viewthrough visits)
  • Reach is overstated by 97%, meaning ad server reporting nearly doubles the real figure
  • Frequency is understated by 41%, meaning ad server reporting misses a large share of repeat impressions

If digital Reach/Frequency matter… better look elsewhere for accurate metrics.
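To make those percentages concrete, a back-of-envelope adjustment with hypothetical reported figures (the precise definitions behind Flashtalking’s percentages may differ):

    # Reported (cookie-based) metrics vs. an estimate of reality.
    reported_reach = 1_000_000
    reported_frequency = 3.0
    reported_conversions = 5_000

    est_reach = reported_reach / 1.97                    # overstated by 97%
    est_frequency = reported_frequency / (1 - 0.41)      # understated by 41%
    est_conversions = reported_conversions / (1 - 0.40)  # understated by 40%

    print(f"Reach ~{est_reach:,.0f}, frequency ~{est_frequency:.1f}, "
          f"conversions ~{est_conversions:,.0f}")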

The above figures include desktop, mobile and tablet device types. Flashtalking also provided additional reporting comparing desktop to mobile, showing the true story. Ad server performance for mobile metrics is significantly worse:

  • Conversions understatement goes from 11% to 79%, about 7x worse on mobile
  • Reach overstatement goes from 26% to 128%, about 5x worse on mobile
  • Frequency understatement goes from 20% to 50%, about 2.5x worse on mobile

The above should be cause for concern for digital marketers seeking to understand display’s passive branding effects, multi-touch attribution and even ad viewability. Considering that many digital advertisers are not incrementality testing or calibrating viewthrough measurement, the metrics from ad servers are almost worthless.

Learn more about Big G’s solution to this problem at Tip of the Spear Blog.