#104: Getting the Data Collection Right with Adam Greco

Have you ever had stakeholders complain that they’re not getting the glorious insights they expect from your analytics program? Have you ever had to deliver the news that the specific data they’re looking for isn’t actually available with the current platforms you have implemented? Have you ever wondered if things might just be a whole lot easier if you threw your current platform out the window and started over with a new one? If you answered “yes” to any of these questions, then this might be just the episode for you. Adam “Omniman” Greco — a co-worker at Analytics Demystified of the Kiss sister who is *not* a co-host of this podcast — joined the gang to chat about the perils of unmaintained analytics tools, the unpleasant taste of stale business requirements, and the human-based factors that can contribute to keeping a tool that should be jettisoned or jettisoning a tool that, objectively, should really be kept!

Episode Transcript


00:04 Announcer: Welcome to the Digital Analytics Power Hour. Tim, Michael, Moe and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at facebook.com/analyticshour and their website analyticshour.io. And now, the Digital Analytics Power Hour.

00:28 Michael Helbling: Hi, everyone. Welcome to the Digital Analytics Power Hour. This is episode 104. You know, it’s a poor craftsman who blames his tools. This tired cliche gets rolled out from time to time in our industry, and well, it’s definitely within the purview of this show to go into the differences between WebTrends 8 and Google Analytics. No, we’re definitely not going to. But it is the case a lot, and I mean a lot, where companies abandon tools because they “don’t work” or “don’t meet our needs.” But are all of the analytics tools fundamentally flawed? According to Tim Wilson, yes they are. Hey, Tim.

01:10 Tim Wilson: Hey, Michael.


01:12 MH: I just… I didn’t actually ask you if you believe that, but it seemed like a good default position for you.

01:17 TW: I’ll take it.

01:18 MH: And Moe, do you even use any enterprise analytics tools? Or are you just using all those cool state-of-the-art SQL query database engine, deep learning algorithm AI things?

01:33 Moe Kiss: Yeah. Howdy. That’s totally me.

01:35 TW: I knew it.

01:35 MK: Yep. I still use some tools and stuff.

01:38 MH: Excellent and I’m Michael and I don’t do analytics anymore, because I read this blog post one time that said web analytics is dead. Okay, but what is the actual story? In true consulting fashion, it truly depends but our guest thinks you might be abandoning your tool too quickly. Who, you might ask, would be so bold? Well, it’s just Adam Greco. You might know him as Omni Man…

02:07 TW: Omni Man.

02:08 MH: That top gun guy…

02:09 TW: Top gun guy.

02:10 MH: Day-to-day he is a senior partner at Analytics Demystified. He’s also on the Board of the Digital Analytics Association, and before that he was the guy in charge of all of the Omniture implementation at Salesforce, and before that he even worked at Omniture itself. But wait, there’s more. He was also one of the co-hosts of the Beyond Web Analytics podcast, the spiritual predecessor to this show. You’ve read his blog, you’ve definitely heard him speak at an analytics industry event. And finally, he is our guest. Welcome, Adam.

02:46 Adam Greco: Thank you all for having me.


02:48 MH: Yeah. So do people get rid of their analytics tool too quickly? Is that what we all agree and why?

02:56 TW: Well for consultants you know, keep swapping you know. It’s great work.

03:00 MH: It’s great work, right? Like, “Hey, we wanna switch tools.” “Okay.”

03:03 AG: Well, I think… I think a lot of people do switch their tools pretty quickly. I think that it’s always easy to blame the tool. I’ve seen that ever since my early days at Omniture. The tool is the scapegoat of the analytics industry as I’m sure it is in many other industries as well.

03:20 TW: Didn’t you basically have a job where, when somebody was unfairly blaming the tool, it was kind of your job to go and actually check and see if it really was a tool shortcoming, or if it was an implementation and usage shortcoming?

03:34 AG: Yeah, yeah. Back in the old days, pre-Adobe, when I worked at Omniture, I had the job which was known officially as Client Success. But the real name of the job is I was known as the Wolf of Omniture. Very different than The Wolf of Wall Street, I will say, and more of the Harvey Keitel wolf from Pulp Fiction. And my job was, any time someone called up Omniture and said, “We wanna get rid of Omniture SiteCatalyst, the tool stinks,” that meant I had to board a plane and go try to save the account from leaving and going to some other tool, like Google. So in that role, I ended up visiting lots of countries, lots of cities, lots of companies. And I’d say over that time, I learned a number of reasons why people failed in analytics, and not all of it was tool-based, but most people’s going-in position was that it was always the tool that was the problem. And since I worked for the vendor, I was the person who got yelled at for an hour in those fun meetings.


04:39 MK: Sorry, when you say the client said the tool was the problem, and I obviously have very limited experience versus yours, and from a very specific company, what I don’t understand is, is the marketer saying that because it doesn’t do something? ‘Cause in my team, everyone’s always like, “Okay, how do we fix what we haven’t implemented well?” So is it more like the marketing and the product, or whoever it is that’s making the decisions, that are saying the tool isn’t working, or is it actually the analyst-type person in the business, from your experience?

05:14 AG: Well, I could tell you back then, it was normally the person who was paying the bill for Omniture SiteCatalyst, who was saying, “Our renewal is up. I don’t feel like I’m getting a lot of value.” Then they go to the team and the team who’s supporting it, of course, they did nothing wrong so they never want to admit they did anything wrong. So that would quickly turn into a conversation of, “Well, Mr or Mrs. Executive, it must be the tool. So, let’s yell at the vendor for a while, and that’ll make us feel better.”

05:46 TW: Well, did you run into people who… I mean, there are things that even a web analytics tool isn’t the best platform to answer, or historically wasn’t. Especially if you go back four, five, seven, eight years, I guess. I feel like there were questions that would come up that, yes, you’d have to do something very specific in the implementation to capture. Scroll depth, or something. Somebody would wake up one morning and say, “I wanna know scroll depth,” and they know we’re paying a bunch of money for this tool, so they’d say, “Well, I want a scroll depth report,” and they didn’t like hearing the response of, “No, that’s not automatically… ” ‘Cause you can go two ways. You can say, “Let’s talk about what you think you’re gonna get from scroll depth. Hey, that’s a good point, let’s implement it. We’ll have that data in a week.” But it’s easier to say, “Well, no, that’s not in our implementation, and it’s actually maybe kind of tricky to get in a meaningful way, ’cause you’re not fully thinking through your requirement.” And maybe they read an article that had, it was ClickTale that was showing scroll depth. And so they said, “What do you mean Adobe or Google is not giving us this out of the box?” It seems like one of those avenues.

07:00 AG: So the way I look at it is, there are definitely cases where someone wants to do something that a tool can’t do. There’s no tool that’s perfect and does everything. But in my experience over, I can’t believe it’s been like 15 years, the reasons why I think people generally fail, in no particular order, are, one, people leave the company, and when they leave, they never tell anybody anything about what was going on in the implementation and how to use it. So there’s really no transfer. Most people I work with today, if I ask them, “Were you the one who actually did the initial implementation?” they’re like, “No way, that person was like five people ago.” Two…

07:39 TW: And it might have been a fine implementation at the time, but it wasn’t documented and it wasn’t necessarily maintained and that may be part of the reason they left.


07:47 AG: Yeah, exactly. But also, I think the other reason that I see is there weren’t really good business requirements around the implementation. Like, what were the questions that you actually wanted to answer? Or it might be that there were, to your point, Tim, at one point, but businesses move pretty quickly nowadays, and the questions that people want answered today are very different. When we first started doing analytics, there wasn’t a huge portion of people using mobile devices. So now many requirements are more focused on mobile. I think there are cases where people just do a bad implementation: they hard-code tags on the page, they’ve never heard of tag management systems, they don’t know about data layers, and things get screwed up. And the one that I try to solve a lot is the fact that sometimes people just don’t know how to use the tool that they bought, whether it’s Adobe or Google. It’s kinda like you’re driving a car, but you don’t know how to drive, you don’t know how all the gauges work.

08:44 AG: And that’s what I cover in a lot of my training classes. I always tell people that learning how to use the tool that your company may have spent hundreds of thousands or millions of dollars on, that’s actually the easiest thing that you can solve. There are a lot of other bigger things in our industry that are harder to solve. But I think those are the main reasons that I see people failing, and once they fail, they tend to go right to the tool. My favorite story is, when I was at Omniture one time, we had a new customer who came to us, and we had a kick-off call. This had to have been late ’90s… or no, actually, probably early 2000s. They said, “You know, we had WebTrends and that didn’t work, so then we went to WebSideStory, and that didn’t work. So now we’re going to Omniture, and here’s how we wanna do it, because this is how we always do it.”


09:39 AG: And I thought they were kidding or that they were punking me at first and then they kept talking and they’re like, “Well this is how we’d like to do it.” I said, “Well why are we doing it the same way…

09:47 TW: Failed twice. [chuckle]

09:48 AG: If you’re on the third tool?” They’re like, “Well this is how we always do it.” I said, “And how has that worked out for you in the past?” So I just, I found it comical.

09:57 TW: Which is the challenge of switching over. Like, people are going from Google to Adobe, I’ve seen that happen. They’re so used to the page in Google basically being the URL, with or without parameters stripped. And so they say, “Well, what do you mean you want this page name to be a clear and concisely named thing? Why don’t we just make the page name the URL, because that’s what we had in Google.” And you kinda have to go back and say, “Wait a minute, you’re switching from one tool to another without wanting to think through the opportunities to actually think about your business.” Or even switching from Adobe Insight to Adobe Analytics and having requirements saying, “Well, let’s replicate what we had with this other tool in the new tool.” I may start using that example. You probably are giving something up if you do switch. I don’t think it’s human nature to recognize that there will be something you liked about the old world that is gonna be not available, or very difficult to get, in the new world. It’s not like you get everything you had in the last tool, and you get more, and it’s better and cleaner and cheaper and faster. I don’t feel like, as an industry, we are very good about actually recognizing trade-offs.

11:19 MK: So that’s something that the Snowplow guys spent a lot of time chatting to us about with our Snowplow implementation. We basically took exactly what we had in Google Analytics, and we’re like, “Let’s just replicate that in Snowplow.” And the guys were like, “Is that the right thing to do? Like, should we just be replicating it?” And we’re like, “But that’s the easiest thing to do,” and they’re like, “But that doesn’t mean it’s the best decision.” And that for me was… Yeah, a really big learning that only just clicked in my head as you were saying that just now, sorry.

11:50 AG: Yeah, I think when you do think about moving tools, if you’ve made the decision, “I’m gonna change tools,” to your point, Moe, it is a great opportunity for you to think what’s working, what’s not working and I always tell people to take what you love about your current implementation and you wanna bring that forward. But more often than not, I don’t think people make the decision on the right criteria. I think a new person becomes the CMO, they’ve always used Google Analytics, they think Adobe Analytics is ugly, so they’re moving to Google Analytics no matter what you say. But I think that’s… It’s tough. I mean, I had a company last… Two weeks ago, who said to me, “Alright, we’ve been on Adobe for years, and we wanna move to Google,” and I said, “Why?” And they said, “Well we’re heavy users of Salesforce and Google has a new integration with Salesforce.”


12:41 AG: And I said, “Okay, well, what part of the integration is it that’s most appealing to you?” And they’re like, “Well, actually, we haven’t really seen the integration, but we heard that they have an integration with Salesforce.” I said, “Well, does the integration meet your business needs? Like, what is it that you’re gonna get?” Because you could actually integrate any tool with Salesforce. And it just jumped out at me that they hadn’t really thought through why they were thinking of making the change. It was just more of a wholesale change, and they hadn’t already done a whole feature-by-feature comparison. And I actually think doing a feature-by-feature comparison between analytics tools is the wrong way to go about it.

13:19 TW: It’s really… It’s almost impossible to do. I mean in a…

13:23 AG: Yeah. I mean I just think the better way to do it is to say, “Here’s the 50 business questions we wanna answer, which tool is better at answering these business questions?”

13:32 MH: Because I… So, a couple things. I love this conversation. I think companies also set themselves up for failure because they don’t think about what it means to own and administer a tool. A lot of times, to your point earlier, Adam, this tool gets set up by someone and then just left alone. That’s not how the web works, that’s not how digital works. New stuff is happening on websites, new functionality is coming out all the time. So your analytics implementation has to be a living, breathing implementation, just like your website or your mobile app or whatever you’re measuring. And so people tend to think in this mentality of: I go, I set it up, and then I’m done setting it up, and now I’m just gonna use it. It’s gonna work perfectly from now on.

14:17 MK: I think that’s… Part of that is the stakeholders in the business expect that. So basically, they’re like, “You spent a year implementing this, so now you’re gonna spend all of your time using it.” You’re not actually gonna have to spend your time making sure that everything’s running properly, QA-ing data, like tweaking, making changes, improvements. They’re like, “But you got all this time to implement, and that’s done now, so now you’re gonna do the work with it.”

14:43 AG: But see that approach I think is dangerous because I think instead of thinking of implementation as being done, it’s like an ever-evolving thing.

14:52 TW: Yes.

14:53 AG: There’s kind of like… There’s some things that are done being implemented, but at the same time, you’re actually challenging stuff you implemented two years ago…

15:00 TW: Yes. Exactly.

15:01 AG: Do we still need this? And so I think there’s this ever-flow… Now, the actually ironic thing, and again, I’m kind of in the Adobe world more so: back when there were only 50 success events or 50 eVars… You know, when I worked at Salesforce, we were always trashing eVars. We were like, “Okay, we’re out of eVars again, so which ones are on the chopping block and getting the guillotine?” And it’s great that these tools are getting better, where now you can store more data points. But I think it also makes you a little lazier.

15:32 S?: Oh. [chuckle]

15:34 AG: We had to be more nimble back then and really think about, is this eVar 42 worth it, because we got a hot new requirement that needs an eVar? I used to prune our Salesforce implementation of Omniture; probably every three months I would chop stuff away. And there’s some stuff that you could analyze and find out the answer, and it’s not gonna change that often, so you could check it a year later. You don’t need to have that data all the time.

16:02 MH: The creativity that comes from constraints.

16:05 TW: I remember having that. Actually, I remember when they made that announcement, Adam, and you and I had a… That was my reaction: I was terrified that it was gonna just let people say, “Oh, let’s make four versions of this eVar so we can get every possible allocation and expiration setting,” which I’m thinking is gonna confuse the business user even more. Luckily, Adobe’s moved to where, in many cases, you can kinda control that on the back end.

16:32 MK: But I just wanna return to it. So how do you get the business to understand that implementation is not set and forget? Like you guys are like, “Yeah, it’s constantly evolving,” but for the person that’s sitting in the business trying to be like, “I need to get time to review and improve.”

16:50 MH: Well, ideally, they would hire a big name, well-regarded industry expert consultant, like Adam Greco.


17:00 MH: They’ll believe him when he says it.

17:03 MK: Well, that’s true.

17:05 AG: No, but seriously. I think, Moe, the way I think about it is, and I’m not sure if everyone does it this way, but I tend to run implementations strictly by the business requirements. I think that is the glue that ties everything together. And so, using your example, if you were to go to your stakeholders and say, “It’s been six months, here are the 50 top requirements we have right now. Let’s quickly run through these. Are there any of these that are not super important anymore? And are there new ones that need to be added to this list?” And that’s what kills me when I go to companies, even companies who I’ve worked with and forced to have a list of requirements. I’ll run into them two years later, and I’ll say, “How’s everything going? How’s your requirements list?” And they’ll bow their head and be a little bit ashamed, and they’ll say, “You know, we actually are using the requirements list you gave us, but we haven’t added any new ones to it.” And I think the point is that you wanna have a list of things that are evolving with your business. That’s just, again, how I view the world.

18:14 TW: But the business isn’t… I mean, the idea of a business requirement is an abstraction. I’m kind of intrigued, because I feel like with Adobe implementations, the various processes and approaches I’ve seen all kind of have this, “Let’s come up with the requirements, now let’s come up with our variables, and let’s do the mapping to the requirements.” And that doesn’t seem to be the same sort of structure of how Google implementations are run, and I don’t know how Snowplow ones are. But there’s still a layer of abstraction with the business requirements, where, when it comes to a business user, they’re saying, “I don’t know what question I’m asking tomorrow. You’re asking me to envision what data I’m going to want.” That initial list of business requirements, in my experience, has done well.

19:02 TW: Yeah, they can ask all sorts of questions they hadn’t even envisioned, and they’re covered by the business requirements. But I think it is a legitimately tough leap to say, “We have a new business requirement,” this abstracted middle layer that, if we solve it technically, is going to enable a whole bunch of additional analysis or reporting. I mean, I struggle with both seeing it be very effective, but then also seeing how it would be hard to say, “What new requirements do we have,” outside of “we’ve added new functionality” or “we’ve added something new to the site,” just in the nature of the evolution of the business. Does that make sense?

19:49 MH: I do think the business cycles change, and what we care about as a business shifts over the years, from acquisition to retention and those kinds of things, which then gives a different focus to how you wanna set up measurement and tracking. So I do think there are other forces at play beyond just, “Hey, we added a new function to our website.”

20:09 AG: Yeah, and I don’t wanna… I know there’s a huge debate about whether you do requirements upfront or whether you don’t. There’s kind of the Heap of the world that says, “Hey, we’re just gonna track everything and we’ll kind of figure it out later,” and so…

20:21 TW: Well, that’s just where I’m headed.


20:23 AG: I think that’s a different discussion. But to your point, Tim, I don’t think it has to be very formal. If Moe, for example, is in a meeting and someone says, “Here’s a question I have,” and maybe Moe’s never gotten that question before, well, at that point you can still add that to the list and say, “Here’s a question that we got, that’s an interesting question. Do we have the data to answer this question today? Yes or no.” If we can’t hack something together to answer that, then I think, if it’s important enough, it would go into a cycle, and the cycle is then, “Okay, we need to do some tagging, we need to get the data,” and so on. And at that point it iterates. But there are always gonna be millions of questions you get that are just kind of one-off questions, and some tools are better than others at that. But I still think having a list of the questions that your team can answer is good for your stakeholders. And at the end of the year, if you wanna look back and say how much value the analytics team has provided, why not have a list of all the questions we’ve answered, who asked those questions, and so on? I think that’s not a bad thing.

21:31 MK: So what about the cool… I say the cool, sexy new trend, because all of our analysts and data scientists are like, “Let’s move away from anything bought and let’s go to open source because then we’re gonna have the opportunity to put on our resume that we’ve contributed to an open source product.” Which seems to be like the new reason for switching tools.


21:54 MK: So I posed that situation. But then also in the context of, there’s a guy here in Sydney who’s a member of the analytics community, and he actually has the opposite approach where he refuses to use anything free because he thinks that unless there is budget for it, he won’t get resources and they won’t understand that they need to dedicate time to maintaining it. So he always pays for something, even if there’s a free option. What is… Like what’s your experience then on…

22:23 TW: We do have to be clear that nothing is free. So, like, “Hey, Snowplow… Snowplow’s free.” I mean, I think he’s got one strategy for dealing with the gross misperception that open source software is free. You know, read the 27-part series by Simon on event taxonomy, or whatever it was, on open source Snowplow. It’s not free; you gotta pay for some smart people to figure out how to implement it well, but…

22:54 AG: Yeah, I mean… What I respond to that is… You had a two-part question. Let me answer the second one first, ’cause I’ve seen this a lot. Why do people wanna go with paid, or in general, why would they stay with their existing vendor? Okay, ’cause it’s kind of the opposite of what we talked about earlier. If I see a company who is miserable with their analytics vendor, no matter who it is, but they won’t leave, I’ve asked them why. “Why are you staying with this turd?” you know. And basically, the answers I tend to get are, I think, somewhat lame. One is, “We have year-over-year data we can’t walk away from.” Which, again, if you have your data in a data lake and you export it, you have that. But that is a huge one. I mean, it comes up a lot.

23:36 AG: Two, we’re too lazy to get another vendor through procurement, and procurement takes a long time at our company. Three, the employees at our company know how to use one tool, and there are some people who, honestly, are so tool-bigoted that if they change the tool, they will leave and go work at another company, because they wanna work with Google or they wanna work with Adobe, they don’t wanna work with another one. And some companies love to have, I call it, that one throat to choke. They want that vendor that they can call and complain to, and if they go with open source, they’re worried, like, “Okay, well, who am I gonna complain to?” But I’d say the biggest reason why people don’t, to your second point, just go open source is that I think a lot of people have that old IBM mentality of “no one ever got fired for bringing in Google or Adobe.” And if you go open source and build your own, some people get a little nervous that they’re really putting themselves out there, and if it doesn’t work, are they exposing themselves to a lot of risk, and are they gonna be the one who gets fired when that happens? So that’s why I see people kind of staying.

24:44 AG: But I think, to your point, a lot of people who are entrepreneurial in the industry wanna play around with all of the new tools out there. And I think if you work for a company and they use Adobe or Google, there’s no reason you can’t also play around with some open source stuff, and eventually, if you start doing more and more stuff there, then you can make a case to your company that says, “Hey, we’re able to answer 50% or 60% of the questions.” And then it comes down to a cost issue: would you rather pay for the tool, or would you rather pay for some highly expensive data engineers?

25:18 TW: But the risk there, though, right? ‘Cause I like that: “Hey, throw the open source one on as well, play around with it, and here’s the stuff we can do with this that we can’t do with the other tool.” And then you have to be really careful that that doesn’t bubble up, and all of a sudden it’s like, “Oh wait, you’re saying we can get a tool for free that does everything our other tool does?” It’s like, “No, no, no. I said that there are some things that this open source tool does, and does better than what we’re paying for, but let’s be clear about everything that the paid tool is doing.” And then if it’s like, “Great, keep both.” It’s always easier to keep adding stuff, whereas shedding a tool is just brutal. And paid tools, the people you’re paying, are gonna fight like crazy to not let you shed them, which just becomes an increasingly complicated tech stack.

26:11 MH: Well, I have gone on record, and I’ll say it again: if I have to use Lotus Notes again, I will quit. So that is one technology where I draw the line.

26:21 TW: I’ve… I had a coworker from the ’90s…

26:22 AG: Michael, you’re killing me. You’re killing me. I was a Lotus Notes developer for five years.


26:29 TW: A co-worker from the ’90s, who posted on Facebook just this last week, “For the first time since the ’90s, I have a machine that does not have Lotus Notes installed on it.” He’s from that same company, and it was a very… It was a Facebook-worthy note, apparently.

26:46 MH: Well, I think there’s another challenge where companies just don’t house…

26:50 TW: No, no, no. Let’s talk LotusScript ’cause I think you guys can drop off…

26:51 MH: Oh, okay.

26:54 MK: Jesus.


26:55 AG: Tim? Tim.

26:58 MH: Separate… Yeah, that’s right.

27:00 AG: Separate podcast just about Lotus Notes.

27:00 MH: Beyond email provider… Softwares.

27:02 AG: At commands, at functions.

27:04 TW: Oh, okay.

27:04 MH: So there is that challenge where a company just does a terrible job and they’ve had terrible support. I remember being on a call in the early days of tag management, and the client says, and this is their exact quote, “I don’t like this product. It literally slows everything down and breaks everything.” And I was like, “Oh, that’s a pretty big bucket. We should hone in on some of those things that it’s slowing down and/or breaking. Let’s start making a list.” But it’s also that the people who purport to be experts go in and do a terrible job sometimes, across the vendor landscape, across the consulting landscape. So that’s one of the things we lack: some standardization of what is a good way to use these tools. ‘Cause we can stretch these tools into a lot of different shapes, and a lot of very industrious people have taken tools and made them into weird balloon animals.

28:03 MK: But here’s the crux of the problem: what happens when the company has already been, I don’t wanna say poisoned, to think something about a particular tool? We had this experience with a BI tool, where everyone in the business was just like, “It’s slow, it’s rubbish, it takes 20 minutes to load my reporting.” And it didn’t matter how many times you were like, “It’s got nothing to do with the tool. It’s to do with the huge data set that it’s pulling from, the fact that someone’s built a really inefficient query, and if we spend time to rebuild it, that will be very evident.” But the business is just so past it. They’re already like, “No, we need something else.” How do you manage that?

28:46 AG: Yeah. That’s what I call, Moe, the point of no return. And that’s what I saw a lot in my wolf days, where I would talk to a company and I would tell them 100 ways that they were using, back then, Omniture SiteCatalyst the wrong way, and that everything they wanted to do, it could do. But even after I argued with them, they still said, “I just wanna get rid of the tool.” And personally, I was kind of annoyed. I’m like, “Well, why did you fly me here, and what was the whole point?” But anyway, I think sometimes you need to just get away from a tool. It’s like a relationship, you just need to get away. And I’ve actually seen a number of companies who’ve gone away from one tool, gone to another tool, and then, ironically, outgrown that tool’s functionality and gone back to the tool they had first. But now they have better process, they have all that stuff. And I think the thing that frustrates me the most about the whole tool debate is that if you switch to a new tool, for the first three months it’s like you’re in the honeymoon stage. Everything is great again.

29:52 MK: Really?

29:53 TW: Well you’re very forgiving. You’re very, very forgiving, right? You’re just like, “Oh, I’m sure that’s coming later. I’m sure we’ll have it in four months.”

30:00 MK: Okay. Okay.

30:00 AG: Yeah. But my point is that what a new tool does, that people don’t realize, is it masks the underlying issues your company has. If you screwed up the first tool, you probably don’t have a good process for implementing. You may not have a good process for identifying your requirements. You may not have a good process for QA. And now you have a new tool. It’s working for three months, but then everything is gonna go to shit six months later, and then you’re like, “Okay.” Well, now at that point you can officially say, “It’s not you, it’s me.”

30:31 TW: I think there is that very… When you go to implement a new tool, the other thing that you’re getting… And I don’t know if you know that, Adam, this was on your list is that whether it’s conscious or subconscious, when we say, “Hey, we’re gonna swap the tool out.” Well, you’ve actually just bought yourself three to six to nine months, while you’re implementing because everybody knows that you’re fixing the problem, you’re swapping out the tool, which means there’s limited scrutiny. And then the rollout gets delayed and people are like, “Yeah, yeah. Every IT project gets delayed.” And so you’ve kind of just bought another six to nine months and then you’re in that phase where people are saying, “Well, it must be me. I must not know how to use the tool.”

31:12 MK: That’s what I actually think is the problem with so many implementations. If you could get that six to nine months or whatever it is, three to nine months of buffer to go back and fix the wrong thing, that would actually solve the problem in a lot of cases. But it’s like, “I’ll go to the new tool because then I’m gonna have that time to do it properly.” When if you dedicated that time to do it properly… But I just don’t think there’s a way that you can do that with a business.

31:38 AG: Yeah, it’s hard to get that. But I’ll tell you a funny story. When I left Omniture and went to work at Salesforce, their implementation was not good, and I looked at it, and everyone was exactly like you were saying, Moe, people had just been done with it. They’re like, “I hate this thing, I’m not using it.” And I told them, I said, “It’s not the tool.” So what I actually did, is I shut it down and we were… And basically, people were freaking out. They’re like, “What do you mean?” And I said, “Well, I’m actually taking everyone’s ID away, and we’re gonna shut it down for four weeks and we’re gonna re-implement it with just a couple metrics, a couple dimensions.” And everyone’s like, “Wait a minute, I need my data.” And I said, “Wait a minute, you just sat and told me for an hour how crappy the data was. You don’t trust any of it. So you’re telling me I can’t take the stuff you don’t like away from you for four weeks?” And I totally caught them, and they were like, “Oh, I guess you just… Yeah, I guess, it’s okay.”

32:29 AG: So I shut it down. And it’s funny ’cause my boss at the time, his name is Chandra, he said, “Wait a minute. Let me understand this. You just came to the company to head up analytics, and the first thing you wanna do is shut down analytics at the company?” And I said, “Yeah, I know it sounds weird, but it’s like no one trusts it, and I feel like we need a clean slate.” And so, we reimplemented a very mini-implementation with just a couple of metrics, couple eVars. And I actually took… Since I’d taken everyone’s ID away, the only way they got back into it, is if they came to a training class with me, and I made them pass a test. And so, we only had five or 10 people who were in the tool, and then the word started getting out that, “Hey, there’s actually a new implementation, and the data is actually right this time.” So then they told their co-workers, then they wanted an ID, and then people were beating down my door to get training. And this is something that like six months ago had 400 users that never… I couldn’t get to log in to save my life. But now they felt it was new and cool, and hip.

33:29 AG: And I’ve also found that the reputation of a team and the implementation itself is a huge thing. The fact that they just thought that our team was… It was a new day, like now everyone wanted to jump on the bandwagon, so I know not everyone is gonna have that capability. When they asked me, they’re like, “Are you sure you know what you’re doing?” I said, “Well, I just worked at the company who made our product for five years, so I think you could trust me, I know what I’m doing here.” But not everyone has the courage to do that, or has the… Be able to just say I have the background to do that. But it was fun, and I think it was revolutionary, and it’s the reason why the analytics program kind of took off when I was there.

34:08 MK: I think that’s the coolest shit I’ve ever heard. Legitimately. I feel like that’s the next level to just stopping sending reports.

34:17 AG: Yeah, but in all honesty, I left Salesforce after two years, and I heard that afterwards the implementation went to shit. So, you know, it’s just… It’s tough, it’s where you have to be vigilant.

34:29 MK: I do think what that also says, though, is that the credibility of the tool is tied up with the credibility of the person leading the project at such a fundamental level. Like, if you have a voice in the organization where people trust you and trust your experience, then I think you can essentially get them on board with whatever you think is best, but if you don’t have that trust, then I think you would struggle to get users to use anything.

34:54 TW: Yeah. That actually… It was making me think of the… I’ve watched analysts slam their tool. And a lot of times, it is a… If somebody new comes in who is not as familiar with the tool, so that’s the easiest way for them to… And this is not the Salesforce example ’cause you were coming in and saying, “Fix what you already have.” I’ve watched it happen where somebody comes to a company and they know Adobe, but the company is running on GA, so they pretty much after like a month, every meeting, you can just count on them saying, “Well, here’s how you do this in Adobe. I don’t know anything about GA.” And then it gives them something very tangible they can do, even if it’s not necessarily adding a lot of value, but it gives them the… They can say, “We’re gonna improve analytics, we’re gonna swap out this tool.” And I’ve seen it happen both ways. I’ve seen people come in who are familiar with GA. Maybe they came in from a background where they were using the AdWords integration a lot, and they get really frustrated, so they’re like, “Well, yeah, we’re using Adobe, we don’t need Adobe.”

35:55 TW: And I’ve watched clients go both ways because it is giving somebody new in a leadership position an opportunity to show that they’re doing something very tangible. And oh, by the way, they’re buying themselves that three to nine months of runway to swap stuff out, and it’s preventing them from needing to actually learn a new platform. And I get incredibly frustrated with people on… And it’s usually Google or Adobe. You give them about five minutes into any conversation, and it happens again, and again, and again, and they’re trashing the tool that was not the first one they learned. It infuriates me.

36:37 MK: What do you say to them? How do you… Everyone knows people that have moved to a company and within five minutes they’re like, “Oh, hold up. We’re gonna swap tools,” because they are more comfortable with one or the other. How do you manage that? How do you deal with that person?


36:54 MH: It’s a process that has to begin at the org level and work its way down because probably you shouldn’t be hiring people who are that stuck in a tool.

37:05 TW: In some cases, I’ve learned a lot about Adobe from one of those people at an agency, and so I’m like, “Well, I’ll take what I can learn, and this will help me try to do the ma…” ‘Cause I was more familiar with Google than Adobe. So actually I spent some time questioning myself. I was like, “Oh, maybe this really is a big shortcoming of Google.” I was a little naive at the time, whereas…

37:32 MH: Yeah. Well, I remember when I first went to go work at Lands’ End, and this was after spending the first four years of my career being all Webtrends all the time, and our Omniture sales rep comes to visit us, and we have lunch. And after lunch, we’re walking back inside the building, and he’s like, “You’re pretty cool. I was really worried that the first thing out of your mouth was, ‘Let’s switch to Webtrends or something.’” And I was like, “Why would you expect me to say that?” But that’s what people, I guess, he expected, was that people would just be like, “Oh, that’s the tool I know, so we should obviously use that.”

38:06 TW: It feels like you were just trying to force a story where somebody said that you were pretty cool.

38:11 MH: I was pretty cool. I mean, that guy was cool.


38:17 MH: Well, that’s all the time we have, Tim Wilson, so… No, I’m just kidding. [laughter]

38:21 TW: Well, where do supplemental tools come in? ‘Cause there’s kinda that… We’re talking this core… You’re switching from Adobe to Google or Google to Adobe, or whatever, but it does seem like there’s a lot more cases to supplement with a heat mapping tool or an event tracking tool, or something else. How does that… ‘Cause that’s adding complexity, adding another tool set and adding something else that needs to be implemented and maintained. And you’re probably now dealing with the Venn diagram where there are cases where the functionality actually overlaps.

38:57 AG: Yeah. I think that’s probably the biggest frustration that I see when I work with people. For example, everyone knows the infamous click map or activity map in the Adobe Analytics tool. I tend to find that there are certain things that tools don’t do super well. I haven’t had great experience with click map or activity map. And I think there’s other tools like Chartbeat or HotJar, or Decibel Insight, or ClickTale that do a better job. And I think that if you’re a big enough company and you could do a best of breed, then you should integrate those tools.

39:32 AG: And I oftentimes will find a situation where I’ll say, “You know what, we’re gonna build a segment in Google or Adobe, and find the visitors we care about that met this particular criteria, but then, we’re going to… Since we haven’t integrated with, say, Decibel Insight, we’re gonna go look at Decibel Insight and watch the people who are using the session… That were in the sessions that had the problem we’re trying to solve.” So I think it’s very easy to integrate these tools, whether it’s OpinionLab, ForeSee Results… And then, you basically use your primary tool, but then you kind of supplement it with all of these additional tools out there. If you don’t have the big bucks, and you can’t afford to have lots of different tools, then you might just use the activity map and get the best data you can.

40:18 TW: Oh, yeah, heaven. I guess nobody actually switches from Google to Adobe, or Adobe to Google, because of their click mapping capabilities.

40:27 MH: Yeah.

40:28 MK: Yeah. I wanted to ask, do you think… And again, I feel like I keep bringing it back to client side, but let’s be honest, they’re the ones that make all the mistakes about implementation. So if you decide, as the person who’s leading this, “Okay, I’ve got one of those big tools, implementation’s okay, but I wanna go play around with some of this other stuff on the side.” For me personally, the way that I would probably look to tackle that, which says something about possibly how I plan my work days, would be like, “Okay, maybe I can find a little bit of time on the weekend so I can have a play around with that and see whether it works or not.” For people listening, do you think that’s the only way to actually get… ‘Cause I… Immediately when you’re like, “Oh, just play around with this on the side,” I’m like, “When the hell would you have time to do that in your job?” Is the only way to do it that you just take it on as a bit of a personal project, or do you think you just fight to make the time? Because in the end, it will add business value.

41:27 AG: Well, then I could tell you, having watched Tim go through his cathartic exercise with R over the last couple of years, I think Tim started in that vein, but I’ll let him speak for… But I’m sure that he has found cases where that has been very useful for clients, what he’s learned where now it becomes something that’s highly relevant to what a client might need. So you might need to start as kind of playing around in an area. Or if you could find a particular thing that a tool, that your big tool, can’t do, then I would say, make that a part of your job is to solve that with another tool that can, and then go from there.

42:09 TW: And once you call it a proof of concept with a real use case… For one thing, especially if you’re client side, you tell them, “Hey, I wanna just put this on our blog. I’m only gonna run it for a month. You can nuke the data afterwards, but otherwise I can’t make the case.” But that then puts a time pressure, it seems like, on you to do that. So I guess it probably sort of depends on are you pursuing it from a professional perspective of, “I think I should kind of learn and grow,” or pursuing it because, “We’ve got a gap with our current implementation or our current capabilities, and this might solve it.” You know, it becomes a prioritization issue. I’m definitely not the one to provide any counsel about how to not force yourself to fuck around in the evenings or weekends. So I’m uniquely terrible.

43:00 MK: I think you actually don’t have any personal time. Occasionally I see you go kayaking, and I’m like, “I don’t actually know when you fit that in.” I think you don’t sleep.

43:07 TW: I’m just using the Google’s Vision API to find pictures of people kayaking…

43:13 MH: So you just post those?

43:13 TW: And then just post those to, as though it’s me.

43:18 AG: But you know, Moe, in the past our firm had this thing called the Analysis Exchange which is still floating out there, which was a way for people to go play around with the data set for helping a non-profit. At the same time, it was very focused on GA. I almost feel like hearing you talk, it would be cool if there was a way to do that with Snowplow, be able to say, “Hey, here’s a set of data that anyone can use.” And I think you guys have talked about this on past podcasts where there’s open sources of data that you could use, but it’d be cool if there was a way to update the Analysis Exchange and do that more for open source technology and help churches, hospitals, non-profits and kind of kill two birds, so you’re helping people, but you’re also learning a new skill at the same…

44:07 MK: Yeah, nice.

44:08 TW: I’m a good year behind getting Snowplow implemented on Analyticshour.io. It is an active to-do. It just keeps getting punted by…

44:16 MK: Didn’t someone offer to do it as well?

44:17 TW: Yeah, one of the Snowplow… This is terrible. I’m gonna say Anthony. He’s the guy in New York, Snowplow person.

44:25 MH: Oh, Mandelli?

44:26 TW: Mandelli, yeah, offered to help. So now I think we’re on record.

44:30 MH: Yeah. Sheesh.

44:30 TW: It’s on the list. Let’s make it happen.

44:32 MH: Sheesh, Tim.

44:33 MK: Okay. I need to… I’ve got to squeeze in another question. So, Adam in your experience do you think that there is a particular structure that leads to an implementation not going over a cliff, and just being a disaster? And by structure I mean like having one person in the business who is the implementation guru that everything goes with them. Or do you think it’s about everyone in the team knowing a little bit or what do you think is the best structure to make sure that you don’t get to that implementation that no one’s touched in two years and suddenly becomes incredibly outdated?

45:07 AG: Yeah, that’s a little bit of a tough one. I’d say it depends on the company, but I’ll tell you a couple of stories. So when I was at salesforce.com, we had a very small team and we took on what we called the ambassador model, which was our team would do the hard core analysis that the CMO wanted, but then we had an ambassador in Japan, an ambassador in Dublin, an ambassador in a couple of our key business units, who would know a little bit more than the average bear when it came to our analytics and we would do brown-bag lunches with them and we would educate them so that they would be in meetings that we weren’t in and they would hear stuff and then if they couldn’t do it on their own, they would come to me and say, “Adam there’s a new question that came up that your team might wanna solve.”

45:55 AG: Then we went to Agile, and so everything was Agile marketing sprints. So what we did is we actually had about 10 different groups, and we took our analytics team, and every sprint team had to have one analytics individual on it. So as the sprint was kicking off, we would hear whether there were any analytics needs for that sprint. And that would either be we need to implement something new, or we need to just do an analysis when it’s done. And there would be some sprints where we weren’t needed, so our team would just disappear for two weeks and go back to other stuff. But I think being embedded in the business is the biggest thing, and what I’ve seen is that when companies have an analytics team that is off in its own world, they forget and they don’t hear what’s relevant to the business anymore.

46:43 AG: Just a real scary story. I happened, accidentally, to be in a meeting at Salesforce with an events team, and I don’t even know how I ended up in that meeting, it was for some weird reason. But while we were in the meeting, I heard them talking for 20 minutes about trying to figure out where they should have training events for salesforce.com. And while they were talking, I pulled up my laptop and said, “Well, by IP address, I can tell you that the people who are looking and searching for keywords related to training and clicking on our training pages are in these 10 cities.” And they all stopped, and their jaws like hit the table. They’re like, “Wait a minute, we’ve had all this information all along? We’re just here making crap up. You actually know where people want training? Why don’t we do training in those 10 cities?” And that group started asking, “How do I get a login to Omniture SiteCatalyst? How do I use this?” And they loved it, but I was accidentally in that meeting. And think about how many meetings are happening in companies where you don’t hear that people have questions that you could answer, if you could subdivide Moe and have mini-Moes everywhere, listening to what’s going on. And so, I think that figuring out at your company what’s the right structure to make sure that your analytics team is hearing what’s going on at the company is the way you stay relevant in the future.

48:03 MH: Yeah. Alright. Well, we’ve got to start wrapping up. This has been great though. So one thing we love to do on the show is just go around the horn and share a last call, something we think will be of interest to our listeners. Adam, you’re our guest. Do you wanna go first?

48:18 AG: Sure, I’m gonna do a shameless plug here. For those of you who may have been to one in the past, our company has done these fun little conferences that we call ACCELERATE, and they are fun, one day events where you just… We do top 10 tips in 20 minutes. So if you… It’s like a fire hose of information. If you hate one, then 20 minutes later, you’re on to the next one. So we are doing one in sunny California at the end of January, ACCELERATE 2019. We’d love it if your listeners would check it out and maybe come out. For people like me who live in Chicago, it’s nice to get away from winter even for a weekend. So that’s gonna be in the end of January. And for those of you going to Super Week, it’s a great stop off in San Francisco before you take your long flight over to Europe.

49:09 MH: Awesome. Yeah. Well, Tim and I have definitely been to an ACCELERATE before, maybe even more than one. A good time.

49:17 AG: Yeah, and some of your past guests on the podcast will be speaking there. We’re gonna have Christa there. Ben Gaines is gonna be there. So we’re gonna have some fun people and do some fun events.

49:28 MH: Okay, Moe what’s your last call?

49:29 MK: Well, it’s like a half of a last call because most people in our industry have read this book, but I’m currently re-reading Daniel Kahneman’s “Thinking Fast and Slow”. And for those who haven’t read it, I just really encourage you to read it. Yeah, I think after reading “The Undoing Project”, I’m kind of re-reading it with all this new context and that sort of thing. And since Dan is now my bestie, I thought I’d better give his book another read.

49:57 MH: [chuckle] You have gotten an email from him.

50:00 AG: Oh, and Michael since I totally forgot, sometimes people do twofers and Moe reminded me, ’cause she mentioned a book. If I could throw one more in there.

50:07 MH: Yeah, go for it.

50:08 AG: I highly recommend that everyone read the book, “It Doesn’t Have to Be Crazy at Work”, by Jason Fried. He is one of my favorite authors. Many years ago I read his book “Rework”, which completely changed the way I thought about working, especially if you work at a client side company. And then he wrote a great book called “Remote”, which is all about how you should be open to remote workers. But this one, it’s not great for people who love Slack ’cause it’s not… It’s kind of a little bit against Slack interrupting your day all day, but I think that DHH and Jason Fried have a really good book going there, and it’s a really quick read, highly recommend it.

50:49 MH: Very nice. Okay, Tim, what’s your last call?

50:54 TW: So my last call is an article in Technology Review that is an interview with Andrew Moore, who I think back in September became the head guy at Google Cloud’s AI business. But the article was called “AI is Not Magic Dust for Your Company says Google Cloud’s AI Boss”, which is an interesting read in that in some cases he does sort of speak of AI as magic dust. But he also talks about all of the hard work and, kind of relevant to this episode, the need to actually be figuring out what problem you’re trying to solve rather than just saying we’re gonna go get some AI magic dust and it’s gonna just give us wonderful stuff. We have data, we’ll throw AI at it and magic will emerge. He says that’s probably kind of a wrong-headed way of thinking about it. So it’s a nice little read.

51:47 MH: When will we have marketing magic? It seems like it’s…

51:50 TW: I don’t know. What’s your last call, Michael?

51:53 MH: Well, I’m glad you asked. So mine’s kind of a twofer too. First off, I just wanted us to stop and kinda laugh at Simon Britton on The Measure Slack. He thinks a lot of times that maybe we might laugh at him for asking a dumb question, but really we’re just laughing at him, just generally because that is a silly thing to think that anyone would laugh at you for asking a stupid question on such a friendly place as The Measure Slack. So Simon…

52:19 TW: The question is whether he will realize when he actually wrote that, since it occurred during the recording. You can actually…

52:25 MH: During the recording, Yeah.

52:27 TW: And this will have long since floated off into the Measure Slack archives.

52:30 MH: Exactly. So a long time from now he’ll get this message. Alright, no, my actual last call is a little bit personally painful, but I have to highlight when stuff happens that’s really, really good. So there’s a company that Search Discovery is slightly competitive with, but the podcast is not. And so 33 Sticks, and specifically Jenn Kunz, recently wrote an amazing set of articles around migrating from Adobe DTM to Adobe Launch, and I highly recommend you read through those. There’s a ton of amazing tips. They’ve really done incredible work there. So as much as I’m sort of like, “Oh, we should have done that at Search Discovery,” from a podcast perspective, we gotta let the audience know. Great job, 33 Sticks and Jenn Kunz, for putting that out there. Highly recommend everybody go give that a read if you’re on the Adobe stack.

53:25 MH: Alright. I am sure as you’ve been listening and hearing what Adam’s been saying about implementations and their challenges and switching tools too soon, you are probably thinking to yourself, “Oh my gosh, I just ran into this same scenario.” So we would love to hear from you. You can reach out to us on The Measure Slack, or on our Facebook page, or on our website, and we’d love to hear from you, love to hear your thoughts. And you know where to run into some of us in the coming months, the ACCELERATE Conference in January, or the… Sounds like Tim’s gonna be at the Super Week following that. And I am a hermit and will not be seen anywhere in the next six months, so you can just hear my voice and that’s all you get. Alright, no, we would love to hear from you. Thank you again, Adam, for coming on the show. It’s really great to have you.

54:18 AG: Oh, thank you so much for having me. It’s great to be here.

54:21 MH: Love it. And for my two co-hosts, Tim and Moe, I think I can confidently tell all of you out there, whether you love your tool or you hate it, keep analyzing.


54:35 S1: Thanks for listening and don’t forget to join the conversation on Facebook, Twitter, or Measure Slack group. We welcome your comments and questions. Visit us on the web at analyticshour.io, facebook.com/analyticshour or at Analytics Hour on Twitter.

54:56 S6: So smart guys want to fit in so they made up a term called analytics. Analytics don’t work.

55:05 TW: Webtrends. I just want some Webtrends. How nice was their gift basket?

55:11 MK: I don’t do gift baskets. I think I paid for coffee. I really am getting this wrong, I paid for coffee. I don’t wanna accept coffee and then feel that I always need to say nice things about your platform.

55:21 AG: I think that’s a kind of a low barrier.

55:23 TW: Yeah. Wow. [chuckle]

55:26 AG: I mean, if that’s the case, then Moe I could buy you out. I’m gonna buy you out pretty cheap. Your sister was much more expensive.


55:40 MH: We’d prefer it if you’d use profanity, which I think you’re probably not gonna do. So we’ll probably have to do it for you.

55:47 AG: Not unless Tim gets me all worked up.

55:49 MH: Okay. That’s what I’m talking about.


55:53 AG: But, Tim, I will tell you that listening to your podcast on one and a half is perfect for you.

56:00 TW: Just one and a half?

56:01 MH: That’s about all I can handle.

56:04 AG: Did Simon buy you a cup of coffee or something? ‘Cause now you’re gonna speak positively about…


56:09 MH: Great last call, Tim. Great, thanks. Okay. Moe.


56:14 AG: Tim’s bringing us down on my last call here. So I was really trying to… Well, never mind. I’ll stop.


56:22 AG: I’m sorry. Yeah, I’m not sure what happened, but I don’t know. Moe can you do yours again?

56:28 MH: Nice attempt to get out from underneath that joke about your Los Gatos guitar playing, whatever that was. I know what you’re doing here, Tim. I just hit the refresh button.

56:40 MK: Hey, I wasn’t being open about that. Thanks for talking about the elephant in the room.

56:45 AG: Oh, sorry.


56:49 MK: I was gonna let people make crazy assumptions.

56:51 MH: Truth in advertising, Moe. We’re all about that on this show.

56:57 TW: Rock Flag and Omniman.
