Every once in a while, we get a question about how Tracking First differs from Adobe's built-in Classification Rule Builder (CRB). It's a good question, and a fairly easy one to answer. The simplest and most important thing to say is this: using Tracking First and Adobe CRB is not an either/or proposition. In fact, a best practice would be to use both. The main difference is whether your tracking codes are prepared and described before the campaign goes live (using Tracking First), or only after the campaign goes live (using just Classification Rule Builder).

If you're going to use Adobe CRB, it's important to take care that your regular expression (aka your RegEx) does not overmatch. In other words, you must not use too broad a brush in your matching logic. If you do, you run the risk of inadvertently destroying or overwriting your previous rules, creating values for classifications that never should have been created. In designing your rules, you have to think not only of what data matches your logic, but of what might accidentally match, and of what doesn't match. You can ruin good data by accident with a logic expression that isn't constrained enough.
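To make the overmatching risk concrete, here's a minimal sketch in Python (the tracking codes and the "em" email prefix are made up for illustration; CRB itself applies rules through its own interface, not Python):

```python
import re

# Hypothetical tracking codes; assume the "em" prefix means the Email channel.
codes = ["em2024_spring", "sm2024_spring", "em2024_promo", "remnant_banner"]

# Too broad: matches "em" anywhere in the code.
loose = re.compile(r"em")
# Constrained: the channel prefix must start the code.
strict = re.compile(r"^em")

loose_matches = [c for c in codes if loose.search(c)]
strict_matches = [c for c in codes if strict.search(c)]

print(loose_matches)   # includes "remnant_banner" -- the "em" inside "remnant" overmatches
print(strict_matches)  # only the genuine email codes
```

The loose rule would silently classify `remnant_banner` as email, which is exactly the kind of accidental overwrite described above.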
Tracking First interviewed Dominic Tassone of the Indegene Encima Group for this two-part series focused on the unique campaign tracking challenges faced by marketers in the pharmaceutical industry. Part one covered the challenge, and part two explores how pharma marketers are tackling it.

Tracking First: Thanks for your helpful explanation of the unique tracking challenges facing pharma marketers. Can we talk a bit about the regulatory environment that impacts drug advertising, and how marketers work with it?

DT: Pharma and medical device companies need to be very careful, particularly about how they market to consumers and, to a lesser extent, to physicians. All drug marketing has to pass medical, legal, and FDA reviews. The regulatory hurdle creates another trickle-down of complexity, like requiring different collateral for consumers and practitioners. The drug companies have to be transparent and consistent in their messaging to practitioners, while simultaneously working to create demand or stimulate interest on the consumer side.
It was a staple of the cartoons from my childhood: seated on a riverbank, an eager fishing enthusiast casts a line into the water and begins reeling it in, imagining trout for dinner. Cue the laugh track -- what breaks the surface of the water is a sodden old boot.

And so it is with marketing teams enjoying the newfound freedom being pitched to them by various ad platforms. These platforms emphasize their ease of use in launching new campaigns. "You don't have to wait for internally generated tracking codes to deploy your marketing," they say. "You can get the data you need with no hassle." And marketers respond, because it's mostly true. The vast majority of campaign tracking codes are no longer generated by human analysts, but by the Facebooks and DoubleClicks of the world. Within their ecosystems, these platforms accurately track and monitor, dutifully feeding data into the tag manager.

But this presents a challenge to marketing analytics, one that can sneak up even when the tag management system is humming perfectly. When it comes time to analyze performance holistically, having outside ad platforms create cloned variations of codes that were carefully designed by the analytics team works against your integrated marketing picture. Marketing teams don't realize that in reaching for "freedom," they're also pulling in a lot of noise.
Have a look at this image. Sound familiar? Web analytics has long held out the elusive promise of being a set-it-and-forget-it kind of thing: "Set up your reports, and the data will fill itself in." That promise has largely held true for every part of web analytics except marketing. That's because with marketing, the web page you have today isn't the one you had yesterday. There's constant change: new information, new deals, new parameters. What everyone wants is a system that runs itself. Otherwise, as the figure shows, you spend all your time making sure the reporting is right. Data correction is a necessary evil, but it takes time away from the analysis that will really help the company. Wouldn't it be great if we could get marketing data to the same set-and-forget place as the rest of our web analytics?
In the 2017 world of IT and systems engineering, test-driven development (TDD) is quickly becoming the new mantra. No one writes a line of code these days without the intent to have that code check and test itself. If there is a bug in that code, it gets caught and fixed before it goes live, reducing the risk of breakage.

This kind of system has never been deployed on the analytics side. By convention, analytics work has relied on hacks: quick and dirty patches that frequently go awry, and are as likely to backfire on the analyst as to clear her obstacles. If the analyst is winging it to fill a little gap in the proverbial data wall, she can unwittingly open a huge chasm with a single stroke. Bringing a TDD approach to analytics would go some way toward changing that. It would require that whenever you make any change to your analytics, the change is fully tested before it's deployed. This method takes more time -- and may frustrate management -- but it results in better quality control.
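As a sketch of what a TDD-style check might look like on the analytics side (the classification function and prefix scheme here are hypothetical, not any specific product's API):

```python
import re

def classify_channel(code: str) -> str:
    """Map a tracking code to a marketing channel (hypothetical prefix scheme)."""
    if re.match(r"^em", code):
        return "Email"
    if re.match(r"^sm", code):
        return "Social"
    return "Unclassified"

# Written before the rule change ships; a failing assertion blocks deployment.
assert classify_channel("em2024_spring") == "Email"
assert classify_channel("sm2024_spring") == "Social"
assert classify_channel("remnant_banner") == "Unclassified"  # guards against overmatch
print("all checks passed")
```

The point isn't the specific function; it's that every change to the classification logic gets a failing-test safety net before it touches live data.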
Looking at the world of campaign analytics in 2017, it can be challenging for anyone who didn't grow up in the industry to make sense of its complexity. Seasoned analysts and marketers have a history with the technology, but they've often witnessed so much change so quickly that it can feel at times like someone snuck up and piled a bunch of new challenges on top of old ones, before solutions to the old problems were fully worked out.

When it comes to URL tracking and campaign analytics, the tools at our fingertips are impressively precise. Not that long ago, the only data you could meaningfully derive from a referring URL was how much traffic you were getting from various websites. At a high level, that allowed companies to see which partnerships and publishing platforms were bringing eyes to their sites, but that's about it.
I’m pleased to be one of the featured speakers at the upcoming ObservePoint Analytics Summit. It’s a free, virtual event, and I hope you’ll sign up for my session. To get you excited about it, here’s a sneak peek at what I’ll be talking about: Closing the Loop on Data Validation.

Everybody knows the secret to delivering quality data: you check it. You check it right before release. You check it every time a change is made to the campaign or the website, whether through a dev release or a tag management release. You check it once it’s pushed to production. Then you put it on the list of things to check again periodically.
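Part of that periodic checking can be automated. Here's a minimal sketch (hypothetical: it assumes the tracking code lives in a `cid` query parameter, which varies by company) that flags landing URLs whose tracking parameter is missing, duplicated, or empty:

```python
from urllib.parse import urlparse, parse_qs

def validate_landing_url(url, param="cid"):
    """Return a list of problems found with a campaign landing URL."""
    problems = []
    # keep_blank_values lets us distinguish "cid=" (empty) from no cid at all.
    values = parse_qs(urlparse(url).query, keep_blank_values=True).get(param, [])
    if not values:
        problems.append(f"missing {param} parameter")
    elif len(values) > 1:
        problems.append(f"duplicate {param} parameter")
    elif not values[0].strip():
        problems.append(f"empty {param} value")
    return problems

print(validate_landing_url("https://example.com/offer?cid=em2024_spring"))  # []
print(validate_landing_url("https://example.com/offer"))  # ['missing cid parameter']
```

Run against a list of live campaign URLs on a schedule, a check like this turns "remember to look at the data" into something that fails loudly on its own.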
So many people have written about the pros and cons of Adobe versus Google Analytics (GA). A quick search on the comparison will turn up a huge number of opinions. As a tracking-code solutions provider with a considerable interest in the debate, what fresh perspective can we offer? Let's start with a bit of background.

Origin of GA -- launched in November 2005
Task 3: Embed the Tracking Code within Landing Page Links

Tracking codes are typically appended to landing pages in the query-string section of the URL. Any time you click on a sponsored ad anywhere on the internet, you'll see not just your destination's domain in the URL, but usually a question mark followed by a long, unintelligible character string. Somewhere in that morass you'll find the tracking code, but its placement is something that companies configure independently of one another.
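To show where the code lives in that morass, here's a small sketch that pulls a tracking code out of a query string (the URL and the `cid` parameter name are made up for illustration; as noted above, real placement varies by company):

```python
from urllib.parse import urlparse, parse_qs

url = ("https://example.com/landing"
       "?gclid=Cj0KCQjw&utm_source=newsletter&cid=em2024_spring")

# Everything after the "?" is the query string; parse it into a dict of value lists.
params = parse_qs(urlparse(url).query)
tracking_code = params.get("cid", [None])[0]
print(tracking_code)  # em2024_spring
```

The ad platform's own parameters (like `gclid` here) ride along in the same query string, which is why each company has to know exactly which parameter carries its own code.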