Happy new year! Nothing says a new year like a new year’s resolution. And, what’s a better professional resolution than to work with stakeholders more effectively? Unfortunately, we’ve all come across business users who have no interest in the data, have too much interest in the data, or maybe even have the right amount of interest…but in the wrong data. Interactions with those stakeholders can be enormously frustrating and entirely unproductive, yet neither you nor they are going anywhere. What is an analyst to do? On this episode, the gang chews on this very topic with Rusty Rahmer, 20-year veteran of Vanguard, and the incoming president of the DAA’s board of directors. Give it a listen for some practical perspectives and topical tips!
0:00:04 Speaker 1: Welcome to the Digital Analytics Power Hour. Tim, Michael, Moe and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at Facebook.com/analyticshour and their website analyticshour.io. And now, the Digital Analytics Power Hour.
0:00:27 Michael Helbling: Hi, everyone. Welcome to the Digital Analytics Power Hour. This is Episode 79 and you know what else? Happy New Year. This is the first episode of 2018. And you know it’s that time of year when we forget all of our acquaintances for someone named Auld Lang Syne, whatever that means. But in the spirit of brotherhood and bonhomie, we wanna handle a topic of some indelicacy. It’s ‘What do you do when stakeholders just don’t get it?’ We barely started this year and I’m already frustrated. Moe, you’ve been in 2018 longer than any of us, how is the year going for you?
0:01:15 Moe Kiss: Yeah, the year’s going great so far, what with my extra time.
0:01:21 MH: Excellent. And Tim, our other co-host, how’re you doing?
0:01:27 Tim Wilson: Ah, I’ve gotta say, as we say it down under, I’m a little dusty coming off the new year. So I’m sure everybody [laughter] in the US knows what that means or they can look up in their Australian to American English translation guide.
0:01:42 MH: That’s right. If there’s one thing this show does it sort of brings those two worlds together so nicely. [laughter] And of course, I’m Michael Helbling, but we need a guest. Frankly, we need a guest who is way more patient and persuasive than we are. So we turn to the one person, he’s the next President of the DAA Board and he’s also the Head of Enterprise Analytics in Digital Intelligence at Vanguard, it’s Rusty Rahmer. And people often comment on his salt-and-pepper hair, but today we’re mostly interested in the brains underneath it. Welcome to the show, Rusty.
0:02:17 Rusty Rahmer: Great thank you. Happy new year, welcome. Thank you.
0:02:20 MH: Yeah.
0:02:21 TW: But people often comment, as I understand, that you met a drunken lady on the streets of New Orleans during DA Hub, just to be clear.
0:02:28 RR: Yes, yes. Thank you, thank you.
0:02:30 MH: Listen, I was there to witness the birth of a legend. [laughter] So, it’s my job now just to share that story with anyone I meet.
0:02:40 RR: Yeah, thank you.
0:02:41 MH: Anyway, so, no we’re talking about how to get through to stakeholders that just don’t get it. But before we get into the meat of it, I think Rusty, it’d be great for our audience just to get a sense of some of your background and how you got to where you are today.
0:02:57 RR: Yeah, well. It’s an interesting story. It’s been 20 years at Vanguard. It’s been a long time. But, it’s never actually felt like the same job at Vanguard. It’s a very rotational company, so roughly about every two to three years, they sort of create a rotation for you. I got lucky early on in my career and I got involved with Six Sigma and DMAIC when Vanguard started going to operational excellence. That was really my first sort of endeavor with using analytics to improve processes. I rotated around a lot of different operational roles to do startups and turnarounds there and eventually got into web services which was developing the websites for Vanguard and applying analytics to web experiences and eventually running the digital program at Vanguard.
0:03:43 MH: Excellent. Well then you’re perfect ’cause I think along the way you’ve had many opportunities to run into folks who don’t necessarily understand the concept of analytics, understand the concepts of digital analytics, all the reports and the data that might be coming out of them, so what have been some of the things that are high level and then we can dip into like different key topics but what are some things you’ve observed and obviously Tim and Moe, feel free to jump in as well.
0:04:13 TW: Well, I’m just wondering, does everybody get it now at Vanguard? I mean, you’ve had enough time? If not, what the fuck are they doing, plank?
0:04:21 MH: No. Didn’t you hear him, Tim? Every two years they have to start over again.
0:04:27 TW: Yeah, no, I’m just kidding.
0:04:32 RR: No. I wish that were true but it’s still as much a challenge today as it was then. I think it’s the analytics involved, the challenge to use them appropriately continues to be there and you get new folks coming in the pipeline, new leaders, and so it’s a constant education and advocacy process for how to apply analytics to problems. I don’t think it’s ever done. I mean, I think all of us have been in this for probably 10 years or so. And it’s still the first thing that analysts say they get challenged with, when you speak to them, new or old. It just seems like it’s an eternal problem or challenge for the analysts.
0:05:12 MK: And what do you think really that challenge comes down to? Is it not trusting the person or the data or the process? From your perspective, what has been the most difficult, I guess, obstacle?
0:05:25 RR: I think people are aligned or oriented to a specific piece of a puzzle. Not the entire… I think we in analytics try to look at a broader, contextual data set and put things in a place within that context and I think organizations are typically aligned where someone is in charge of a piece of that puzzle. Hey, I know when I’m on the other side of that table and I’m responsible for a product or a service or a campaign, I want the analysts to say the efforts that I’m making are resulting in contributions to this company’s output. That’s my lens on the data and typically the analyst is, “Well let’s weigh that out against the context of other things that we’re doing and how that could fit into the puzzle.” I think it’s a natural rub from the very beginning of what perspective you have on the analytics just to start out with. So it’s my goal to show success, if you’re the business line, yes; if it’s the analyst, it’s to talk about progress and context and learning, and so it’s two different goals, I think. And I think that’s where the rub typically comes from, in my opinion.
0:06:30 TW: I like that framing. I just had a little bit of a flashback to a recent episode with a client that, sadly, and I think this is probably the case for all of us, I won’t speak for you, but it’s like “Oh, I’ve been here before.” And that’s like dealing with a media company, a digital media, and watching that they wanna measure through the click. But then you’re actually looking at the site analytics and saying, “Well there’s a 96% bounce rate”, and it’s like laughably small, and trying to work with the business and saying “What are we really trying to do with this media campaign?”. But they’ve got groups they’re working with that are really focused on, “Oh, what’s our cost per click? How many impressions do we have?” And we have visited that, well many times, and it’s very, very frustrating. But I hadn’t really thought about that as being in any organization. They’ve got their window, there’s where it starts and where it ends, and an analyst is often saying “Yeah, yeah, yeah,” but what happens before and after and try to kind of frame that whole thing. And that’s just kind of a challenge ’cause in some sense they’re saying legitimately, “Well, I don’t control that stuff, I don’t care about it. I care about what I control.” Right?
0:07:47 RR: Yeah, absolutely. And I think that’s just one of the challenges. I think that you also have the… That the other one you see a lot is, “I’ve already made the decision, we’ve already invested the money, we’ve already done this thing, and now I need the analytics to show this worked.” And so again, from the analyst’s side you’re looking at it and going, “I had other options that I could have spent that money on.” I really want to be on the informed side of that decision rather than the, “Just tell me how the mortar landed in the field and how many people I hit with it.” How do you say, “Well, maybe you should aim before you fire next time; use my analytics to help guide that.” So I think that’s the second big challenge. I think it’s one, having a narrow slice of, “This is the thing that I’m responsible for and I’m working really hard at it and I want analytics to show that I’ve got success.” And then “I’ve already made the decision, I’ve already made the investment, bring the analytics to the table to show that was the right decision to have made, or the right investment to have been made.”
0:08:45 MK: So how have you actually… That example about the mortar really makes sense to me. How do you, I guess, get those stakeholders that are really kind of involving you right at the tail end like, “Here’s where it fell”, etcetera, and bring them along to actually, “Next time let’s use the analytics to figure out where to aim”? How have you shifted stakeholders perspectives from one to the other?
0:09:10 RR: Yeah, so this might be different because I’m inside an organization, so I’m married to the outcomes of an entire business even if we’re just talking with one particular stakeholder who only has a piece of it. So, to me, that kind of mindset is symptomatic of a broader conversation that isn’t happening somewhere in that business line. So where is the contextual conversation of all marketing channels rather than this particular marketing channel? Where is our scoring of marketing success? Where do we have the broader conversation happening? Where do we have the engagement with analytics happening that prevents us from looking at such a myopic piece of data or outcome? So at the point you’re dealing with it you can apply some ointment to the one particular stakeholder that you’re dealing with, you can try and help them think a little more broadly about some other options, but I see that as an opportunity to figure out somewhere else in that business line where a different conversation needs to be created and how can I get in there and start bringing data to the table that’s gonna change the conversation, that’s what I see that as an opportunity to do.
0:10:20 TW: Which you’ve kind of done, if I’m kind of recalling from when I have seen you present, you do have a sort of strong, mature point of view and process that you can use to evangelize to the organization. So if Person X is a stakeholder who’s not getting it, it seems like you sort of just don’t take your eye off the ball of what is the more systemic, broader organizational approach. And there’s what’s said on the slides but then there’s what’s being said when you’re being grilled by tough podcast interviewers like… [chuckle] It seems that would work, so it does seem like countering when somebody’s coming at you with a… How important is it for you to have sort of a strong, organizational-wide, “This is the system and the process,” how much does that bolster you against myopia and some of the other challenges?
0:11:23 RR: Right. So it’s changed for me. A few years back when we didn’t have a center of excellence or hub and spoke model, you had to do the business level reporting and analytics to support the daily decisions being made in the business, and it was really hard to find the time and the resource to get around the conversation to try and create a more strategic conversation upstream of where you’re having these tactical conversations about analytics. That was hard. I think since we’ve moved to a center of excellence model and we’ve built that out, we’ve got spokes in the business line who are serving that daily question. And we can separate ourselves. It’s sort of the Google 20%. That 20% either has to be one team who’s serving the daily analytics needs, finding 20% of their effort to go around on that conversation to be more strategic, or you create a center of excellence where 100% of their time is dedicated to that, but that resource pulls only 20% of the resource pool that you have available working on analytics, if that makes any sense.
0:12:28 TW: So how is your hub… So when you say hub and spoke is that members of your team who are positioned in the departments or is that members of the departments who are supposed to be working on analytics and their dotted lines into you?
0:12:40 RR: Yeah. We’ve helped each business line. And business line isn’t necessarily division, it might be… In a division you might have an operational group who wants to use digital analytics to optimize e-commerce and fallout from the web. And then you might have experience groups who wanna improve the customer experience, optimize some marketing campaigns. So for each one of those within a division, we have helped them staff up and build their own analytics teams within that. And we stitch them all together through a dotted line as a community of practice. And my team sits at the enterprise level supporting that community of practice, making sure they have the right tools to do their job, training, working with them on projects, and then asking that, “Hey, they’re really weighed down with the daily reporting and ad hoc questions of the department; where do we think we need to mature the department’s strategic analytics insight?” If they’re not getting enough value out of analytics because they’re asking tactical questions, how can we help them use the data that’s available to them to be more strategic, where they create a broader conversation? Create a marketing scorecard rather than a campaign-level report, or integrate the data more and have a more multichannel view of the data in your real-time analytics of the operations, that sort of thing.
0:13:54 TW: So will there be analysts in one of those spokes who say, “I’m getting clobbered by stakeholders who don’t get it. I only joined this three months or six months ago. You were part of bringing me in. I don’t have the trust and the credibility to shift my group. Hey, Rusty, can you and your team come and have the more… Be the hammer a little bit?” Does it work that way or…
0:14:24 RR: That’s a good question. We play… For us, in the Center of Excellence we have this service model that goes, lots and lots of consulting at the top of the funnel. We try to mine opportunities to broaden the conversation in a department or mature a department with solutions, or turn that into a proof of concept for the business line. And then we’ll work very, very closely with the business line analyst if that is successful to enable them, so we have an enablement phase where we teach the business line analyst to use the solution that we developed. We refine it a little bit more and make it theirs. But through the consulting layer, is where we’re always attending different meetings, where we’re working with the marketing managers or the business operations leaders and try to guide that conversation up out of the tactical reporting and more to the strategic level, that’s what we’re there to do. And then when the opportunity presents itself, to say, “Hey, don’t you really wanna be thinking more strategically and holistically about your marketing efforts?” I’m thinking you wanna know, when I’ve got a goal of product or service awareness, what is the best channel that I should be launching that in to get that result?
0:15:35 RR: It’s hard to do if we’re analyzing everything at a campaign level, we’ll never assemble that insight up to where we can answer questions like that. I’d love to help your team come in to a place where it’s answering that more strategic question for you and then work with the team to get ’em up into that place. To give ’em a solution that they don’t have time to develop to move them up into that place. So, we’ve always got one hand in with the leadership and one hand in with the analyst team supporting both sides of that, really.
0:16:01 MK: I’m just thinking about the analysts in positions that don’t have that group and also that trust that you’ve built clearly in your organization where people really listen to you when you’re like, “Okay, here is the strategic direction, here is the bigger picture. How do we take a step back?” For those I guess less experienced analysts that are really at the coalface, I’m just thinking about how do you really develop to that point where you’re getting hammered with request after request? I still get requests like this, like, “Give me the click through rate for this. Give me the bounce rate for this. Give me the number of users, the number of sessions”. And that’s the ticket that someone will create and be like, “Give me that.” And I have to go back and be like, “Actually what are you trying to understand? What is the question we’re trying to answer?” What kind of techniques have worked for you in challenging stakeholders who are really, I don’t wanna say pushy, but maybe not sure what they’re doing quite yet?
0:17:07 RR: Yeah, I think… There’s a couple of techniques that just stretch the conversation a little bit broader. I had a great… My former lead analyst… He moved on, he’s leading the analyst organization over at Project Management Institute of America. And he had a great way about him. He’d go into a meeting and folks would say, “We need to know the bounce rate of this”. And Joe would go in there and he’d say, “15,000”. He made that number up, totally made it up. And he’d say, “Where are we going with that? What are we doing with that? What’s our next decision off of that? Are we driving, are we pulling the lever differently because we know that now?” And he just had a really good way that was getting at, I can go and spend a lot of time to get that answer for you but on the back of a napkin, I just gave you something… Oh, that’s just good to know. Okay. Well, I might save us a lot of time by not going off and starting to dig into that answer. Let’s get the data that’s gonna help us pull the lever differently the next time we do this. And what is that? He had a really good way of reframing the conversation. I think every analyst has to be part-time consultant. That is part of the responsibility of the job, I think. And as you mature, that becomes a greater part of your job.
0:18:18 RR: Now, you bring up the trust question. One thing I’ve learned through my rotations throughout the company, one thing it teaches you is that if you’re gonna be an influencer within a couple of years within that organization, and you might be sitting among people who have been doing that particular job for a long time, if you’re gonna come in there and understand that, you’ve gotta get really good at learning the business quickly, at listening to the people around you, understanding what their challenges are, what their opportunities are, what’s the competition in this space doing, where are we trying to place ourselves? I think analysts have to do that. If you’re gonna build the trust of the people around you, you’ve gotta know where they live and understand it, and bring that to the table. Like, “Hey, I think you are asking me about bounce rate because we’ve made a big investment here,” and whatever. There are other ways of looking… I can give you the bounce rate, but I can also do these other things for you which might help you make that decision differently next time.
0:19:11 MK: Yeah, nice. That’s a really great practical example.
0:19:14 MH: Yeah, it’s not just asking them to tell you their business objectives, it’s you digging in yourself and trying to understand why they’re trying to do what they’re doing and bringing some examples. I like that. It’s very proactive.
0:19:27 RR: I think business acumen is something we talk a lot about with our analysts. Now, it’s hard in the Center of Excellence when you bring somebody in from the outside because you’re talking about a lot of different divisions with a lot of different goals and a lot to know, but every analyst has a responsibility, in my mind, to have business acumen about the business that they’re supporting and to understand those things, I just think it’s a key part of what they’re trying to do.
0:19:51 TW: I almost flip it. And I’m curious what you guys think that…
0:19:54 MH: I think you’re wrong, Tim.
0:19:57 TW: Okay, that’s a given, which means that I am right. But, and this gets a little bit into the communication. I can’t remember… It’s almost like a given that people love to talk about themselves. So there’s the difference between totally understanding the business and arriving with a knowledge of the business, which can be really, really tough, and that’s great. But you can also walk in with just having a business-oriented mindset and a curiosity, and saying, “I’ve done enough work to… I think I understand how selling index funds to the Hispanic market or something, help me understand.” And just asking a couple of reasonable and smart questions that aren’t, “Do you want the bounce rate compared to the previous week or the previous month?”
0:20:49 TW: And I feel like analysts will sometimes… They’ll retreat into their domain of expertise, which is the specific metrics, as opposed to saying, “No, I don’t really understand where they’re coming from.” And if I actually ask engaging questions about what are you really trying to do, and I can couch that in terms of, “Yeah, I know you want me to pull the data for you, I just wanna make sure I’m pulling the right and most useful data. So let me ask what… Everybody in your department knows this cold, but I’m not in your department. I’m in the Center of Excellence or I’m external, or however I’m set up.” I don’t know. I feel like that goes a long way as well ’cause they realize you’re… And sometimes, it actually uncovers that maybe they are not really clear on exactly what they’re trying to do with some investment, but that’s valuable too. You’re not asking in a threatening way.
0:21:45 RR: Yeah. A lot of the experience that we have working with different clients or working with different divisions, so much of the experience we have in analytics really applies to every business line. I may know less about some of our international business lines and what they’re trying to accomplish and what the market looks like, but I know from an advertising standpoint they’re trying to drive brand awareness, product awareness, service awareness and acquisition goals. So, bringing some of that to the table, asking just a few questions about what does acquisition mean to your business line, how do you know when you’ve created product success in your business line? But I think that’s a lot of years of experience, Tim, that you are talking about. As an analyst, that then allows you to bring that, that is sort of domain expertise to the table that builds trust pretty quickly.
0:22:33 TW: And when you guys, when you’re bringing people, say you were helping build, create or grow analytics expertise in one of those lines of business or in some group that’s gonna be a spoke. Does that include… Are you bringing in… Sometimes it’s a fairly junior analyst, and they’re really gonna have to rely heavily on you guys for some level of mentorship and guidance? Or do you have to go in and say, “We’ve gotta bring in… You’ve gotta have somebody who’s got a little bit of experience before you bring in a junior person”? How do you… It just seems terrifying you could have a bad analyst, maybe ineffective, I’m learning my language. It’s not good and bad, it’s effective or ineffective. And you could bring in an ineffective analyst, and now they’re off potentially steering one of those spokes that they’re supporting in a bad direction because maybe they don’t have experience or… I don’t know, how does that work?
0:23:34 RR: You wish they were all top… I’m proud to say that I think everyone we put in the seats are good analysts at some level. They aren’t doing wrong things with data. They may be at different points of maturity in the work that they’re doing, but I think they’re all good analysts. And we’ve mixed that from… The initial seeding of it came from a prior group of analysts that we had in the organization. We seeded out with that. We brought in some outside folks and then we brought in some internal hires that we felt had the analyst genome that could join that collective group. Now, that said, not every department has three or four analysts, some only have one, so you really need to make sure you got the one right.
0:24:15 RR: We’ve been pretty lucky that way. I think as things grow we might be a little bit more challenged with that, but I think we’ve been, in the initial start-up phases and staffing of this model, I think we were really particular knowing who we put in these seats were going to be really, really critical to the outcome of the entire model. We were pretty particular about who we put there. Now, hey, in some business lines, you’re one analyst among, I don’t know, maybe 50 marketers. So your challenge is still teaching 50 marketers what to do with analytics and how to apply what you’re capable of doing to the business problems that they have. Some people are better at the consulting side of that than others. And I think where we see that happening, we spend a little bit more time to try and support that group and give them the extra support that they need on that conversation.
0:25:01 MK: So my question is really about how you manage stakeholders, and this seems to be something that comes up a bit. I have a friend in the industry who said the other day that one of her stakeholders basically said data is subjective and that you can just manipulate it every which way to give you the answer that you want. Have you had any experience working, managing relationships where, for whatever reason, the stakeholder is just like… Like you can get them this answer or that answer, but the truth is it doesn’t matter what answer you give them ’cause they’re gonna push back. They actually question the integrity of the data that you’re presenting and how have you managed that?
0:25:40 RR: Sure. I think when the data doesn’t tell them what they expect you often get that. Moe, I think you talked about an experiment that you ran not too long ago where the results didn’t prove anything out and so the answer was, don’t do this. And that’s not the answer. What do I do with that? I don’t quite understand, what I do with an answer like that. I think it’s a learning process. Data is about a learning process. It doesn’t tell you what to do. We’re trying to say… We’re putting a framework together to tell us what is working and what isn’t working. I try to coach people around that and try and get them to understand that it isn’t always gonna tell you what you want, but we’re gonna learn something from what we have in front of us. And all of this data driving decision making, and powering conversations with customers is really transformational to a lot of business folks.
0:26:28 RR: So I think… I respect, I’ve been in their seats. I respect what they’re trying to do and I respect it is a lot of change for them and just try to coach them and help them through it. It isn’t always successful. I’d love to tell you there’s some magic bullet that I found that they can pour a little of this in their drink and suddenly they get it. It doesn’t always work out that way, but I think it’s… The water on the stone. You just gotta keep the message going. You just gotta keep saying we’re gonna learn things. We are gonna use data. We’re going to power our next decisions off the data and just keep going with it. I just can’t let up on… I might have a detractor, but I can’t let up on this conversation in the business line.
0:27:08 MK: So I have a colleague who has this tactic which I really admire because this is not something that I would intuitively do, but when I see him do it, I am like, “Damn, that is a good idea”. And when he has people or when he’s presenting anything he actually goes to every single stakeholder before the meeting. He talks them through the results and gets any questions or, I guess, hesitations out of the way and then he goes to that meeting where he’s presenting his findings and all of the questions he had have now been answered and resolved so that when literally he goes into the meeting every single person agrees. Have you guys tried that? Has it worked for you?
0:27:48 TW: Absolutely.
0:27:49 RR: Yeah, maybe I take that for granted. Yeah, I probably take that for granted. But we are always meeting with our stakeholders before we put them in front of an audience where the data is gonna be read out. So, we do early on consulting on the project. We’re trying to teach a very good project management discipline with our analysts. So meet with your stakeholder, understand what their business problems are, what they’re looking to prove, ask those critical questions. Go off and frame out what you think the project’s gonna look like, the time [0:28:14] ____ and share that. And when you come back with your results, you are definitely reviewing that with your stakeholder first and foremost before you bring that into an audience of people. It’s gonna land much better that way, and give them an opportunity to color the conversation the way that they want to in front of a broader audience. Maybe I take that for granted, but that’s definitely part of our equation.
0:28:37 TW: That is super, super powerful and the more contentious the stakeholder tends to be the more you wanna work with them one on one, but sometimes they ask a question, they question the data and they’re right. And it’s much better in my mind to… I’ve laid an egg in a group when it turns out that there was some consideration I didn’t know. But I watch analysts who are like, “No, we provide the data and the data is right”, and that’s kind of a bad attitude. So, the more surprising of a thing you find the more you do wanna run it by somebody and say, “Hey, we got this. Let’s partner on this”, like, “This looks terrible and it’s looking like this thing was a disaster”, “So let’s talk about it one on one to make sure that we are looking at the data in an appropriate way.” There’s not some factor, “Oh, we are running an AB test.”
0:29:30 TW: I ran into this with a client: running an AB test, they somehow managed to not have web analytics tied in on the B variation. So basically, we didn’t even know they were running a test. It was just half the data. I’m like, “Well, this sucks”, and it’s like, well, yeah, half of the traffic is lost once it gets to that point. And if we hadn’t been collaborating before we were going public with it, then as the analyst, we would have looked bad. So I don’t know, I think there are lots of benefits, it’s just we… It’s painful ’cause we’re like, “Oh, that takes time and they’re gonna question our stuff, we’re gonna have to go back and recheck our data”. It’s like, “Yeah, but you’re actually gonna get a good result at the end of the day if you do that”.
0:30:14 RR: Yes. So, I mentioned earlier… I sort of consider these really tactical questions when we’re talking about folks that bring really tactical questions to the table like bounce rate. I view those as a sales lead to a broader conversation with, probably, that person’s manager or the audience of that person’s peers. When we get there and we come to the table and we’ve got a solution that creates a broader conversation, part of our enablement process is actually teaching leaders how to engage with data and how to interact with it, and how to interrogate it and ask the right questions. And so, we’ll literally put glass jars in front of us at the table, when we come with monthly data about a broad… “Here’s what happened in your site with your marketing efforts this month, let’s look at this data”. And those jars will be labelled, “Things we’d like to know but we don’t know because we didn’t tag things properly.” And we’ll go, “Oh, that sounds like an action item, we can write that down”. And we write them, we put that in the “things to tag” jar. And then…
0:31:12 MK: That’s amazing.
0:31:12 TW: Oh my god.
0:31:14 RR: And sometimes there's, "I don't trust the data, or I don't think we're measuring this properly, or I don't like that metric." And we go, "That's a really good question, we'll take that offline". We write that down and we go, "Here's the jar for 'Let's do more research into whether this is the right data point'", we're channeling it. What ends up happening at the end of that is you've got these jars with pieces of paper in them that are a visual manifestation of good, positive action coming out of interacting with data. And then we go [0:31:41] ____ on the stuff. But if you wanna interrogate it, if you don't wanna believe this metric, that's fine. This is how you positively channel that and still move on engaging with the data, and don't throw the baby out with the bathwater and say, "I don't engage with data". This is how you channel it in a positive way.
0:31:56 MH: Right. Tim would like to know if there's also a swear jar there?
0:32:00 TW: I was kidding.
0:32:03 MK: So, this is actually really good timing, because just before the break, and yes, in Australia we get quite a lengthy Christmas break, which is really nice, I was meeting with some stakeholders, and they basically have moved teams and gave me a laundry list of things that they wanna know. And it was literally like, percentage of customers that do this, percentage of customers that pay with this card. And I'm like, "Okay, what are we gonna do with this?". If we know that 60% pay with this payment method and 40% pay with that, well like, "What are we gonna do with this information?" And it [0:32:39] ____ legit comes back to me like, "Oh we just wanna know", like, "What do you guys do?" And so for me, what I tend to say is…
0:32:49 MH: I believe that is a sales lead to a deeper conversation with leadership. That’s what I just heard.
0:32:56 RR: That’s correct. You know what jig? If you guys ever saw the show, Lost, I use this analogy all the time. So, plane crash, island, people always trying to attack you, smoke monsters. There’s always a fire to be put out every day. So, 80% of the people are gonna focus on surviving the day. But somebody’s gotta be thinking about how we get off of this island. And I think… I don’t know that you can stop the tidal-wave of those kind of questions that are gonna come to you, but you’ve got to spend time working on changing the conversation. I think that’s where you just have to find, even if you’re not a center of excellence model, your one team, you’ve gotta find the 20% time to get upstream of that and change that conversation, or else you’ll just be stuck on the island surviving every day.
0:33:42 MK: I totally agree with that. The only question is… I actually spend a lot of time talking to women in our industry about developing trust with your stakeholders, and, I don't know why, but what I can tell you is that women do seem to struggle with developing that trust. And I don't know if that experience is different for guys in the industry, but I know that it's something that's really important to a lot of women that I talk with. Is it ever worth sometimes doing some of those things that take you five seconds to do? Is it worth doing some of those tasks in order to develop that trust, or is it more, "I should try and reframe this conversation"? Because if you don't have that trust, then the reframe can seem like pushback.
0:34:30 TW: I mean, to me, it's this progression… If you just start saying, "No, no, no", as you said, I don't think, regardless of gender, if you're saying, "Yeah, you're asking for X", and in your head you're saying that there's no value, you can probe a little bit, like, "What are you gonna do with that?". Or, I like the example of saying… Just for shits and giggles, let's say that it's 60,000. What would you do? You can come with that, but I don't think you can go very far. You've gotta do quick little hits and then say, "You're right, I can pull that for you easily", and establish that good relationship. And I do think it is a… You're walking a bit of a… It's a balancing act, because if you continually do that then you just kinda become their bitch, I guess, would be the phrase that's popping to mind… And you've kinda trained them that they can just tell you, "Give me this, give me this, give me this".
0:35:27 TW: So, to me that is part of the art of being an effective analyst, is not looking like you’re stonewalling and holding the data hostage, but also gently trying to guide them. And if you can actually say, “Oh, here’s your number, 60,000”, but also, “It seemed like you were really curious about this, so I also did this other thing that you didn’t ask for. I drew a picture, I just drew a diagram to show how this fall off is happening”. Something they didn’t ask for but you sort of detected and it does take more time, but that’s when all of a sudden they’re like, “Oh.” And then all of a sudden they spend all their time talking about this thing they didn’t ask for directly, and you’re off to the races. So inappropriate language aside, I think that’s key.
0:36:17 RR: I do wonder if the person that's bringing those questions is the right person to spend that additional time with, though. I don't know the organization. I don't know the people in that organization, but there are some folks where I would say we can sort of get this person their tactical answer and get this piece of it over with, and view the fact that they're asking these kinds of questions as an opportunity to talk to that person's boss about a more value-added conversation that we could create for them. I don't know, I think that's probably really particular to every business, and who you're dealing with, and what the politics of that organization are. But it's a math equation: this is definitely a sales lead to a broader conversation, but is it this person that I'm gonna have that conversation with, or do I get this pain over with as quickly as possible and go offer the additional services to a different person, upstream of that person?
0:37:10 TW: But what happens if the person is not the right person, and all they'll tell you is that the person upstream, that's what they want? That actually just sets off another one, like, "Oh, yeah. No, my manager just wants it, I can't… " But what are we really trying to do? "No, no, no, the person two or three levels up, they just wanna know this metric." Like, can we talk with them? "No, that's just what they want." So A, I guess, do you guys run into that? And B, do you have any good strategies for addressing that?
0:37:41 RR: Yeah. Maybe I'm lucky in my organization, there's sort of a very open door policy to any level of any department, so it's simply just sending an email going, "Hey, chief marketing officer, would love to sit down with you, get 15 minutes, run some ideas past you." And that's a perfectly acceptable practice. I imagine it's not that way at every organization, so maybe I'm just lucky that way.
0:38:03 TW: I will say, in my experience, the CMOs and the directors are actually totally open to it. It's just they wind up with people two or three levels down who don't quite get that. So it's weird, the barrier isn't there. You won't get into trouble walking in to talk to that person; they think it's awesome. It's the people who are two levels down who are, whatever, terrified. The executive asked some casual question, and the person just decided to behave as an order taker, and ran, and just wanted to get it off of their plate and onto the analyst's, whereas the executive would've loved for somebody to actually come back and ask a smart question. So I do try to remind myself, and analysts, of that often.
0:38:49 MK: If you didn’t have that policy, I’m pretty lucky I have that kinda policy too, where I could go to anyone in the C-Band and say, “Hey I heard you have this question, I just wanna talk through a little bit of the details around it.” 100% that wouldn’t be a problem. But if you were in a company that wasn’t like that, do you reckon it’s still worth going back to that person two or three levels above and being like, “Hey, I heard you have this question” or would you be worried about fracturing that relationship with the person who asked for it?
0:39:19 MH: Sometimes I think you gotta give the person what they asked for just as a way to keep continuity to get that next opportunity to build the relationship and the trust.
0:39:29 TW: But you can ask the deeper question.
0:39:31 MH: You can.
0:39:33 TW: You can return your glass jar with it. You can actually say, "Here's the question you asked, and oh, by the way, slide two says, or the footnote says, 'Really not clear if this is the bigger priority or that's the bigger priority.'" I've definitely been subversive in my deliverables in those situations.
0:39:49 MH: And if it doesn’t clear up over time then that’s a signal that the organization’s maybe not gonna long term be the right thing.
0:39:56 TW: Then searchdiscovery.com/…
0:40:01 MH: I don’t even have to do it even anymore.
0:40:07 TW: We didn’t let our guest answer it. I don’t know, Rusty?
0:40:09 MH: I know, yeah, Rusty.
0:40:12 RR: Yeah. I mean, I don’t know. There’s decision makers and then there are sort of the working level and at some point, the conversation you really wanna be having about a broader analytics application. I find it hard to believe showing up in that person’s office and saying, you know… Maybe not a very specific scenario, but after a series of those scenarios that played out where I’ve been asked very tactical questions, to have a lunch with someone a little bit further upstream and go, “Look, I love the team, I love being a part of this. I love serving the business with data, I feel like I’m being underutilized. I actually think there’s a more strategic conversation we could be having around these analytics. How can I help us create that? How can I help you and your organization get up from where we are to a more strategic conversation, ’cause I feel like the potential’s there?” I can’t imagine anybody not wanting to have that conversation.
0:41:05 TW: I 1000% agree, but I'm also realizing what that means… Because I've watched people say, "No, we just wanna keep elevating the conversation", and they're not really prepared to go and really dig in and execute. So as soon as… You are raising the stakes. When you say, "I wanna elevate the conversation", you've inherently committed to delivering something of value on that, and that may be much harder data to get. It may mean you really having to work really hard to deliver it, because it's a killer if you're not gonna be able to provide the totally tactical simple thing because you wanted to push and elevate the conversation. Once you open that door and say, "I'm elevating the conversation", you are committing that you can participate in that conversation on an ongoing basis, and so you damn well better actually be delivering. That should be scary and exciting, but a lot of people, I think, shy away from making that step.
0:42:08 RR: Lesson 361 of my career is being too much of a visionary, talking about what's possible instead of what we can do, so you get this reputation of not delivering. Now what I do is I really work on creating at least a working model or a plan of how to achieve that, and bringing that to the table as well. If I can literally pull it up and show them, or bring some kind of data to the table and go, "Look, here's just a little piece of what's possible here. Imagine if we scaled this up." I bring something with me. I've gotta bring a tangible piece of it and keep it a little bit closer to practical for them.
0:42:49 MH: That’s great.
0:42:50 MK: Mm-hmm.
0:42:51 MH: Okay, so I wanna turn this just a little bit, but I'm interested in your opinion, and I don't know if there's a right or wrong answer. But as we've been discussing, there are a number of really cool little key things: turning the conversation a certain way, the levers you can pull as a business, how to model things. You gave the example of one of the analysts you had who was really great at just turning the question. So there are emerging, to me, two sides of this: one is a process and structure side, how we align with the business, how we approach the business as a unit. And then there's the individual side of how I interact with a set of stakeholders, the way that I deflect, change, ideate around their challenges and problems. What is the most important… Let's say a VP who's going to try to build an organization is listening: is the structure and process part more important, or is the individual talent of the team more important? Or is it somewhere in the middle? I'm really just curious what you think.
0:44:00 RR: Both, but I think there’s an order, if that makes any sense.
0:44:03 MH: Yeah, yeah. No, that’s good.
0:44:05 RR: I think you won’t get there if you don’t have both sides of that equation. But I think the faster way to the answer is helping an executive get what they need at the table at a very high level and then driving down through the organization, the analytics that are gonna… So help them get a baseline of how well their marketing is performing. Go after that problem, ’cause money is following that problem. Their investment, their major decisions they’re having to make year over year, there’s not a chief marketing officer in a company that isn’t having to do more with less every year. So, analytics can help them spend that money more wisely and get good results. Getting to that person and helping with that problem, if you can do that and then drive that down will put the priority in the right place. And then you’ve got to help the team mature around being able to answer those questions. I think that’s where I’ve seen it work best. I’ve seen a lot of bottom-up and I see it stall at a certain level. You can get good ground level, but the numbers are not in their favor. It’s just never going to achieve what it can achieve if the top isn’t on board with that and driving it through the organization.
0:45:12 MH: Okay, no, I think that’s a really good point and that’s a tough one. That’s tough for people.
0:45:19 MK: What happens when you have stakeholders that are really data-savvy? They believe in using data to make decisions. They're totally across the process, everything. But what ends up happening is that every time they ask you for something, they give you the context, they give you, "Here's the decision that I'm making with it." But then it's also, "This is exactly how I want you to do this analysis. And then can you visualize it this way and use these data sources." I imagine this is something that you've encountered in your career. Do you do exactly what they say? How have you managed that, if it's someone who really does believe in the process but is also highly educated about data?
0:46:08 RR: I think we have people who are really, really savvy about data but I don’t think we have people who are savvy about digital data. I’ve never had a person that’s come with such a strong opinion. They might’ve framed the problem very specifically and said, “I need data that does this.” But I’ve never had them actually go to the extent of telling me how to perform the analysis or how they want to visualize the results of that. I’ve never actually come across that. So, I don’t have a good answer for you there, unfortunately.
0:46:37 MK: Mm-hmm.
0:46:39 RR: That’s maybe sort of keep your enemies close… Keep your friends close and your enemies closer. That’s somebody who can be… In some ways, they’re good client because they’re really passionate about this and in some ways they’re going to be the most challenging client you’ve ever dealt with. And so I’m gonna wanna collaborate with them. I’m gonna wanna figure out how to shape them a little bit and partner with them, that’s what I’m going to look for there.
0:47:01 TW: I feel like when somebody says, "I'm all about the data. I love the data", that's like an uh-oh, this is gonna be… They wanna drive, and they might not have learned that there's value and power in the brainstorming and the collaboration, in trying to work together. And to me that is just, "Okay, strap on your patience boots". 'Cause a lot of the time they'll say, "I'm totally bought into the process", but the process they're bought into is their process, which is the process they learned at the previous organization that was a media agency, and therefore was a complete disaster. So all I can do is say, "Be patient and try to figure out how not to piss them off". Or if they go too nuts, it's like, "Well, I'll send you a log-in. Sounds like you know exactly what you want. You should have an email in your inbox with the log-in. If you'd like me to review your results before you present them, I'm happy to do so." [laughter] I haven't actually done that, but…
0:48:04 MH: It’s great. [laughter] Here’s a length of rope, good luck out there. [laughter] Alright, well Tim, listen. I know Tim, you got lots more questions, but unfortunately, we’ve gotta head to last call. This is a really good conversation, and actually surprisingly… Well, not surprisingly, [chuckle] I’m very impressed by the level of depth that you are covering here, Rusty, and that’s… I’ve got a lot of really awesome notes that I’m excited to think through. One of the things we like to do in the show’s called ‘The Last Call.’ We just go around the horn and talk about anything that we think is interesting and might be interesting to our audience. Rusty, you’re our guest, do you have a last call?
0:48:49 RR: I do. Can I have two? Is that okay? Is that acceptable?
0:48:52 MH: Wow.
0:48:52 MK: You get bonus.
0:48:53 MH: That’s pretty presumptuous, but…
0:48:53 MK: You get bonus points.
0:48:55 MH: Alright. Bonus last call. Of course, you can. Go for it.
0:49:00 RR: Okay.
0:49:00 TW: That’s okay. Michael and I will have the same one so you’ll make up for it.
0:49:03 MH: I mean technically, you are the king of the DAA, so that’s…
0:49:11 TW: You now have one for the salt and one for the pepper.
0:49:14 MH: Rusty.
0:49:15 TW: There you go.
0:49:16 RR: Alright. Well, I think about a decade ago, I remember working with Gerry McGovern. I don't know if you guys know Gerry McGovern or not, but he was really big on customer carewords and framing your user experience around customer carewords. I find myself coming back to Gerry McGovern's blog again as we think more about journeys. I think he's really talking a lot about marrying user experience, analytics data, customer carewords, content strategy and voice of customer together to really drive out a better customer journey. I think he's always been passionate about that, but he's sort of added those other disciplines in, and I find myself peering over the journey-analytics edge going, "This is a lot of what we're gonna have to do." So I find myself reading his blog a lot more these days.
0:50:09 TW: Literally, what's a customer careword?
0:50:13 RR: Yeah, so Gerry, when we first brought him in at Vanguard years ago, his thing was he worked with an airline company that was offering really cheap flights, but they were using on their website discounted airfare, and they were practically giving the seats away, but no one was taking them up on the offer. So Gerry went out and really found that… He did a study with their customer base about what their brand stood for and what words resonated for them. And it turned out cheap airfare was the way they thought about that. So, the content strategy of the site didn’t align with the way the users thought about the process and so it was about optimizing to the words that are in your customer’s head about your brand and how to think about various actions.
0:50:55 RR: But he's expanded that into a more journey-based, analytics-driven approach. Again, I find myself reading his blog a lot more again these days. And I guess the second one, and Michael, I know you're always talking about developing analysts, and you're pretty passionate about that. We've talked about that a lot. There's a concept out there called the Fermi Principle. It's sort of a scientific thing that I use with my analysts. I love this. It's really about how you use limited data sets and do back-of-the-napkin math to estimate what an outcome could be, or would be. To me, it's all about how a scientist, or an analyst, thinks about how they would tackle a problem, an analytic problem, and really getting people thinking about that. One of the famous applications of that is: how do we know we're the only beings in the universe? There's no perfect data set. You only have the data set that's available to you from your perspective. How do you tackle that problem? And it really helps me work with analysts to develop that next-level thinking about how you assemble analytics to answer really, really difficult questions where there isn't a perfect data set, so how do you approach that.
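The Fermi-style estimation Rusty describes — decomposing an otherwise unanswerable question into a chain of rough, individually defensible guesses — can be sketched in a few lines. This is a hypothetical, Drake-equation-flavored example; every number in it is an illustrative guess for the sketch, not data from the episode or any real source:

```python
# Back-of-the-napkin Fermi estimate: multiply a chain of rough,
# explicitly stated guesses to bound an otherwise unanswerable question.
# All numbers below are illustrative assumptions, not measured values.

factors = {
    "stars_in_galaxy": 1e11,           # guess: ~100 billion stars in the Milky Way
    "fraction_with_planets": 0.5,      # guess: half of stars host planets
    "habitable_planets_per_star": 0.1, # guess: 1 in 10 systems has a habitable planet
    "fraction_developing_life": 0.01,  # guess: 1% of habitable planets develop life
}

estimate = 1.0
for name, value in factors.items():
    estimate *= value

print(f"Rough estimate: {estimate:.1e} planets with life")
# prints: Rough estimate: 5.0e+07 planets with life
```

The value of writing it this way is that each assumption sits on its own labeled line, so a stakeholder can challenge any single factor without throwing out the whole estimate — the same "interrogate the data" habit the glass-jar exercise encourages.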
0:52:13 TW: I don’t know if Michael can remember, but we had a guest, who had a…
0:52:18 MH: I do remember.
0:52:18 TW: It was around the Fermi… Who was it though?
0:52:21 MH: Was it Dennis Mortensen?
0:52:23 TW: I was thinking Dennis Mortensen.
0:52:25 MH: Maybe the Fermi Paradox, yeah.
0:52:26 TW: It was the Fermi Paradox, but it was a post that was kind of coming at it from both the "we're so insignificant" angle and… It was another label that was on top of the Fermi Paradox.
0:52:37 MH: That’s right.
0:52:38 TW: I will track it down.
0:52:39 MK: That sounds really interesting, though. I’m keen to have a read.
0:52:42 TW: It's funny, in my last public presentation of 2017, I actually had a back-of-the-napkin thing. I need to start… It would sound smarter if I referred to it as applying the…
0:52:53 MH: Exactly. [chuckle]
0:52:53 TW: Fermi principle.
0:52:54 MH: I’m not just making this up. I’m applying the Fermi principle here, folks.
0:53:00 TW: I have a picture of a napkin. That just doesn't seem as classy, perhaps.
0:53:05 MH: Stand back, I'm about to do data science. [laughter] That's right. Alright. Hey, Moe, what's your last call?
0:53:11 MK: Oh, well, it's hard to follow that up. Today, I just have a little tip. It's nothing super significant, but I have been using Tableau for a while, and I used to just use a color picker tool to pick out the colors in a logo. And someone from Data Runs Deep recently introduced me to a tool called Pictaculous, which tells you the colors of logos. And as we all know, I really get into my colors and making sure the graphs meet brand guidelines. If you haven't checked it out, Pictaculous is a really cool tool.
0:53:47 MH: Nice.
0:53:48 TW: Excellent. Michael, you wanna go?
0:53:49 MH: Yeah, I'll go first, because I am very nervous that Tim will have stolen mine again. It is actually in the data science realm. Recently, I ran across an author and speaker who does a lot of work in machine learning and AI named Joel Grus, I don't know how to pronounce his last name, G-R-U-S. Anyways, he has a blog, joelgrus.com, and he has this hilarious article where he goes through as if he were in a Fizz Buzz interview situation, but he uses TensorFlow to do it. [chuckle] So anyways, if you like data science stuff and AI, that's a fun blog to read. And he's actually got some books, so check those out. Anyways, that's my last call. [chuckle] It's more of a fun one.
0:54:45 TW: So what it was: the Wait But Why post on the Fermi Paradox was what Dennis referred to, which is a little mind-blowing. That was more of a blow-your-mind paradox for me. I feel like maybe I need to go read a biography of Fermi, too. Smart dude. Okay, I'm gonna sneak in two as well. So one is, at the beginning of the year, you're figuring out where you're going. I've not actually plugged it on the podcast, but womeninanalytics.org, the Women in Analytics conference in Columbus, is on March 15th, 2018. If you're a student, there are ways to apply to come to it for free. Women in Analytics is an analytics conference where all the speakers and presenters are gonna be women, but it is open to both genders for attending. There's also a data visualization competition, open to both genders, with a $1,000 prize. So check it out. Columbus in March can be awesome. It can also be a little cold and rainy, so it might be awesome, hard to say.
0:55:51 TW: The other is just a random fun little tool. If you are heading into the year and you've got a budget, you've got a job description, I saw this tool, textio.com, T-E-X-T-I-O.com, and it is, allegedly, machine learning under the hood. But you paste in a job description and it basically highlights, "Oh, you know what, this word actually skews feminine, so think about that." Or, "This is passive", or, "You're using the same phrase 17 times, that's maybe overused." I saw a guy present it, and it just was very, very cool. 'Cause nobody, as he made the point, nobody likes to write job descriptions, and they all suck, they all sound the same, and many of them are very, very ineffective. Check that out with the next job description you have, and I bet you will feel like you have a much richer job description after using it.
0:56:54 MH: Nice. Yeah, as someone who frequently writes job descriptions, I'm all over that one. Alright. Well, listen, you've probably been sitting there thinking, "How do I engage with these people? Especially this Rusty character who seems so intelligent." And I tell you, there is a way, there is a way, and it's through our Facebook page, and Measure Slack, and on Twitter. Rusty, do you use Twitter?
0:57:23 RR: I do not. I do not use the Twitter.
0:57:25 MH: Oh, he does not use the Twitter. Well, good for you, it's going away. It's 2018, it's the year of Twitter just crashing and burning.
0:57:34 RR: I was ahead of the curve.
0:57:36 TW: We’ll just post his cell phone number on the show notes and you could just text…
0:57:42 MH: Just throw a text over there. No, but we would love to hear from you. And actually I wanna talk just… I wanna talk to the people right now, just right out there in Radio Land. [laughter] We do this thing, where we need you to go review our show on iTunes. I know that sounds pretty self-serving, but it is something that we’re trying to do a better job of. So if you are listening in a car, just call or leave yourself a voicemail after this to say, “Hey, once I get to my desk, log into iTunes, leave a show review for the Digital Analytics Power Hour,” and we would very much appreciate it. Alright.
0:58:23 TW: Jim Sterne, you're excused, 'cause you've already done it. Andre Richardson, you're excused, 'cause you've already done it. Thank you.
0:58:28 MH: Actually, Jim Sterne is the person who recommended Rusty as a guest, so that's a big shoutout to Jim as well for that. Well, you can come see us in person, and there are two places you can do that this year. One is at Superweek in February, and the other sounds like it's gonna be in, is it June? At Marketing Analytics World, and that's pretty exciting.
0:58:56 TW: Marketing Evolution World.
0:58:57 MH: Marketing Evolution World. I don't even know the name of it, that's how exciting it is. Anyways, we'll be doing some live shows this year, so we're excited for that. Alright. Well, Rusty, thank you so much for coming on the show. For my co-hosts Moe and Tim, and for all of you out there: keep analyzing.
0:59:21 S1: Thanks for listening. And don’t forget to join the conversation on Facebook, Twitter, or Measure Slack group. We welcome your comments and questions, visit us on the web at analyticshour.io, facebook.com/analyticshour, or @analyticshour on Twitter.
0:59:41 S?: So smart guys want to fit in, so they’ve made up a term called analytic. Analytics don’t work.
0:59:50 S?: What're you drinking, Holmes?
0:59:53 S?: Just a bottle of whiskey from my desk drawer.
0:59:58 MH: Tim, how much is that rig that we could get, from NPR?
1:00:02 TW: That's only about $3,000 to $4,000 apiece, so we don't need to get…
1:00:06 MH: Worth it. Yeah.
1:00:08 TW: Yeah. I’m really addressing Michael. [laughter] You get that. [laughter] I’m pretending I’m telling you that, but Michael is like “Oh yeah, yeah. Oh, that was fucked up, let me try that again. Okay.” [laughter] So then…
1:00:19 MH: Believe in your skill, Tim.
1:00:23 MH: I try, I don't know how to deal with stakeholders who don't get it. I give up and go home.
1:00:31 TW: I’ll take that.
1:00:34 MH: That’s what I should have said.
1:00:35 TW: That’s good, stroke Tim’s ego.
1:00:38 MH: Rusty is the king of analytics in the United States.
1:00:47 TW: Also, other than being a few time zones away on the wrong continent, it was completely spot on.
1:00:52 MH: You were a little dusty there, Tim.
1:00:53 TW: Okay.
1:00:56 MK: And Tim is shaking his head furiously.
1:01:00 MH: Great job too, alright.
1:01:02 MK: You summarized it so well. No one needs to say anything, you just nailed it. [laughter] And feel free, Tim and Michael, to jump in if you want.
1:01:12 MH: Oh, Tim will jump in alright.
1:01:19 TW: So, if you don't review the show, you're insulting both Jim Sterne and Rusty.
1:01:26 MK: Nothing like an insult to finish the show.
1:01:29 TW: Oh man, my voice connections are rusty.
1:01:31 MH: Yeah, we lost Rusty about 30 seconds before, and I was like, “Oh no”.
1:01:36 MK: I was praying that you’d noticed that and so you wouldn’t be like, “And thanks Rusty,” silence.
1:01:43 TW: Rock flag and clueless stakeholders.