If you didn’t have a visceral reaction to the title for this episode, then you are almost certainly not in our target audience. There are few more certain ways to get a room full of analytics folk fired up than to raise the topic of dashboards. Are they where data goes to die, or are they the essential key to unlocking self-service access to actionable insights? Are they both? Is the question irrelevant, because, if they exist to inform business users, aren’t they soon going to be replaced by an AI-powered chatbot, anyway? We thought a great way to dig into the topic (and, BTW, we were right) would be to have someone on the show who has co-penned multiple books on the topic. As luck would have it, Andy Cotgreave, one of the co-authors of both 2017’s The Big Book of Dashboards: Visualizing Your Data Using Real-World Business Scenarios and the imminently releasing Dashboards That Deliver: How to Design, Develop, and Deploy Dashboards That Work agreed to join us for a lively chat on the topic!
This episode’s Measurement Bite from show sponsor Recast is a quick explanation of power analysis from Michael Kaminsky!
Photo by Foto Phanatic on Unsplash
00:00:05.73 [Announcer]: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.
00:00:08.46 [Michael Helbling]: Friends, analysts, and dashboard builders, lend me your eyes.
I come to speak of dashboards, not to praise them.
The charts we build may live beyond our hands, yet meaning’s oft entombed within the data.
Dashboards are loved by executives who click, marketers who scroll, and analysts who sigh.
But comprehension rarely walks these halls, and insight haunts the tooltips like a ghost.
O judgment, thou hast fled to decks of slides, and reason’s lost.
My heart is with the charts, and I must pause until it come again.
Okay, well, as the Bard might say, aw, shit, here we go again. We’re talking about dashboards. The staple consumable of our little industry, but always a little bit hard to define. And now with AI, are they maybe finished for good? I don’t know, we’ve got a lot to talk about. So let me introduce my co-hosts, Julie Hoyer. You’ve built a dashboard or 500, I would guess. Oh yeah, well-versed. And Tim Wilson, I know this is your first time talking about dashboards, but I think you might be excited.
00:01:28.06 [Tim Wilson]: I should have checked what we were talking about before.
00:01:30.15 [Michael Helbling]: Yeah, yeah. Well, we’ll try to keep you up to speed because I know you probably don’t have too many opinions about it. And I’m Michael Helbling. I’m super excited for our guest today. Andy Cotgreave is the co-author of The Big Book of Dashboards and former technical evangelist at Tableau. He is the co-host of Chart Chat and writes the How to Speak Data newsletter. He’s got more than 15 years of experience in data visualization and business intelligence, first honing his skills as an analyst at the University of Oxford. He has inspired and trained thousands of people with technical advice and ideas on how to identify trends in visual analytics and develop their own data discovery skills. And finally, Andy has a new book out this month, Dashboards That Deliver. Welcome to the show, Andy.
00:02:17.05 [Andy Cotgreave]: Hello, everybody. It is a delight to be here. I’m looking forward to a lot of banter and possibly arguments.
00:02:25.35 [Michael Helbling]: I like your style. That’s not what our show is about. No, it’s awesome to have you, Andy. And I think, you know, obviously you’ve got a new book about dashboards that’s just coming out in like a week or so. I think maybe you’re taking the line that maybe dashboards aren’t dead, but let’s jump into it.
00:02:48.93 [Andy Cotgreave]: Yeah, let’s do it. I’m very excited because I was listening to one of your recent episodes about AI, and I don’t know your voices too well, but Michael or Tim, one of you said, dashboards are shit. I think that was the line in the AI episode. That was my call. That was my call.
00:03:07.71 [Tim Wilson]: That was something I might say.
00:03:09.68 [Andy Cotgreave]: So I’m looking forward to taking that one on.
00:03:14.12 [Tim Wilson]: So can we start by actually defining a dashboard? Because I feel like that is one of those things where people get into raging debates, and the underlying debate is actually what a dashboard is.
00:03:30.07 [Andy Cotgreave]: In The Big Book of Dashboards we defined, and in Dashboards That Deliver we define, a dashboard as a visual display of data that is used to monitor and/or facilitate understanding. 15 words. When we wrote The Big Book of Dashboards, the definition got longer and longer and longer as we were trying to capture all the caveats. In the end, we were like, it’s such a vague term, we just reduced it to virtually nothing. Stephen Few then chose to write about a 1,500-word blog post destroying our definition, even questioning in his blog post whether myself, Steve, and Jeff, the co-authors of that book, had the authority to teach about dashboards. Thank you, Steve. He had very specific criteria, things like it must fit on a single screen, it must be interactive. But it’s so easy to demonstrate that that is patently wrong, right? I wrote a whole chapter in the new book about this challenge of defining a dashboard. And in the end, what is a dashboard? It’s a piece of wood or leather that used to sit on a stagecoach between the horses and the driver to stop water dashing onto the driver’s legs. When people made cars, they stuck that kind of thing on the front of the car, put gauges on it, and called it a dashboard. So it’s a term whose etymology is borrowed from something else, and it’s a catch-all for so many things. So we have a really loose definition of what a dashboard is. But if you’re building a chart, or a collection of charts, or even, very controversially, a single text table, we can make an argument that it is a dashboard, even though many people might say it isn’t. And we’re fine with that.
00:05:11.10 [Tim Wilson]: But you’ve said it’s displaying information and promoting understanding. That stops short, I think, of being the interface to an underlying data system for deep exploration. To pick on Adobe a little bit: they’ve got Analysis Workspace, where you can build visual displays of information and then share them. In some cases, I think that’s a challenge for their product. It’s built more to be an interactive tool for digging and drilling and ad hoc analysis, but they also say you can use it to create a workspace for the display and understanding of data for end users, which I think it struggles at, because that’s not its core basis. Do you see a line there between displaying and promoting understanding on one hand, and exploring, shallowly or deeply, on the other?
00:06:21.05 [Andy Cotgreave]: Absolutely. So that question, to me, also touches upon the audience, in that there are data analysts and engineers, and there are business people, or rather non-data analysts, not necessarily business people, just people who are not data analysts. As data analysts, we have access to and use the systems to visually explore data, and just ask and answer questions and chase insights to find them. But much as I would love everyone in the world to be as skilled as all of us in exploring data, I’ve discovered that’s not a real option. Some people are freaked out by data, or they don’t have time to learn data. So a dashboard is a window on a predefined, finite set of questions, allowing those people to explore, in a relatively narrow way, the answers to those predefined questions.
00:07:18.05 [Tim Wilson]: I feel like that’s some of the copy that was in the larger, the longer definition that then got shortened.
00:07:22.82 [Andy Cotgreave]: Oh, yeah. So the definition is short and vague, and then we’ve written two books about what that actually means.
00:07:29.62 [Julie Hoyer]: So then, within that broader definition of dashboard. I’m glad, though, Tim, that you had us touch on the difference between what’s a dashboard and where an analyst or an engineer is digging in, because I think that’s really helpful. My other question, then, is within that broad bucket of dashboards, do you talk about them, Andy, as different categories of dashboards depending on the audience? Or, like you said, is it a table of text, or does it have a lot of visualizations? Is it a page long? Is it being put on a single screen, or is it a whole Adobe workspace with lots of panels? Are there categories for format? Are there categories for audience? Are there categories for any other characteristic?
00:08:14.63 [Andy Cotgreave]: So, we don’t formally create that categorization in our book. In our book, we refer to Nick Desbarats, who’s written Practical Dashboards. He sort of inherited Stephen Few’s teaching course and then built it into his own. He’s brilliant. He created a taxonomy of dashboards, I think it’s 13 different types of dashboards. Is it an infographic? Is it used for monitoring ongoing processes? Is it used to monitor a short-term project? He’s got a really good taxonomy. We chose not to go down that route, but that’s a really useful reference point if people want it for types of dashboards. I wholly endorse Nick’s approach. Another aspect of the question was around form factor. Should it be a single page? Should it have interactivity? Should it have multiple views? It depends. It depends on the vast infinity of user requirements: what they need, how they’re going to consume it, how much time they’re going to spend with it, and so on. So we explore that in depth in the book, but we don’t categorize it.
00:09:26.53 [Julie Hoyer]: Just because I’m thinking, when we talk about dashboards a lot, I know Tim has the point of view, and I generally agree, that one of the best uses of a dashboard is for performance measurement. And so, Tim, when you talk about that, do you believe that’s the only way a dashboard should be used? Because I think that’s where we end up getting into debates a little bit. Should it only be used for performance measurement, where it’s very clear, concise, it’s against a target, it’s a quick snapshot? Does it make you bristle to hear there are lots of categories and uses of dashboards, or are you okay with that?
00:10:04.73 [Tim Wilson]: I think I take a simplistic view, and I just live in that one category. And I follow Nick. He is brilliant. He’s engaging a lot on LinkedIn, so I will further endorse him, like Andy, like Steve. Everybody who’s going to get mentioned here is really, really sharp. So I see mine as kind of a narrow view: if we use dashboards just for measuring performance, monitoring how it’s going, having targets, it really works. Some of the stuff that Nick and Andy have done says, yeah, there are broader uses. I’ve had an overly simplistic view, and they spend a lot more time thinking about it and have a much more nuanced view. Although I do come back to things like data storytelling. I’ll see people, none of the people that I’ve just mentioned, talk about how a dashboard should tell a story, that it’s data storytelling. And that’s one of the ones that makes me cringe and want to run back to my no, no, no, here’s this one specific area where I think dashboards can be used. I’ll put it this way: I can so readily defend a dashboard that is generally one screen, with some caveats, for monitoring the performance of a project or an initiative or a campaign. And doing that well is, to me, easily, easily defensible. When people head off into other areas and start making broad proclamations, they’re kind of setting up things that they can then knock down, saying, it’s bad for this. And I’m like, well, yeah, but you’re talking about something that maybe isn’t really what should be a dashboard.
00:11:49.47 [Michael Helbling]: It’s that time of year. Half your team is on vacation, your inbox is quiet, and maybe, just maybe, you’re thinking about taking a little time off yourself. Well, with Fivetran, you actually can. Their fully managed data integration platform keeps your analytics humming even when you’re out of office. I mean, you could be off hiking, surfing, just sleeping in, and Fivetran is syncing all your sales, marketing, and product data into your warehouse: accurately, securely, and always on. It’s like a reliable teammate that never takes PTO, so relax, unplug, stop worrying about late-night Slack alerts. Your data is not just covered, it’s thriving. That’s the Fivetran difference. Check them out at fivetran.com slash APH to see interactive demos or start a 14-day free trial. That’s F-I-V-E-T-R-A-N dot com slash A-P-H. Let’s get back to the show.
00:12:48.43 [Andy Cotgreave]: I’m desperate to come back to storytelling, but first I want to talk about your definition and some of the things you’ve attached to it, right? One of the things that really frustrates me is what I’ve seen in lots of organizations: they go, we need to build a dashboard, or we need to do BI, right? And then the project is sponsored by the CEO or some C-suite person. And so they define and build a monitoring system. Let’s say it’s sales. So they are tracking sales by quarter against target, and they can drill down to region and product type, perhaps, right? Brilliant. That is a high-level, aggregated sales dashboard fitting the criteria you’ve described. But the problem with those kinds of projects is that they are aggregated, and they’re useless for the millions, the mere normal people, who are actually trying to do granular work at the coalface. So a dashboard we had at Tableau, which was hugely used, was, I think we called it the Who’s Hot dashboard, and it was used by the sales organization. It would show individual leads, individual people in our CRM, and the activity they’d been doing over a timeline: activity on the website, or activity related to Tableau. You could see a dot plot where one row was one person. You could read it like: visited the website, visited the website, did some training, visited the website, logged on, downloaded. That dashboard allows a salesperson to look at their opportunity list and be like, Michael has watched all the training videos. I’m going to call Michael up and say, Michael, you’ve been doing some training, we should be talking about Tableau. And that is a dashboard that allows people to monitor disaggregated data. And it is a fundamental use of a dashboard. It’s disaggregated data visualized on a single screen, or on multiple screens in that particular example.
But that’s useless for the Chief Revenue Officer, because he doesn’t get into the weeds like that, right? And so you have that spectrum of dashboards that need to be included inside any data strategy. And I get really frustrated when it’s like, hey, we’ve built the sales tracking dashboards, off you go. And everyone else is like, what can I do with that? Nothing.
00:14:59.51 [Tim Wilson]: But I think you nailed it. There is this idea that we have all this data, and the dashboard is kind of the answer, so therefore, if we just build the right dashboards, then that is how, as an organization, we become data driven. I’m trying to square the circle. I do tend to say a performance measurement dashboard is what maybe could be considered a scorecard, though I’ve got some beefs with scorecards. What you just described, I don’t think I’m in the habit of calling a dashboard, but I absolutely think there’s value in building something that is not aggregated, that is specific, that is part of a process. What you just described, I would say: there is an operational process by which leads are being monitored, and the people in that process need information displayed in a way they understand, so that they can follow their individual lead follow-up process, each of them having multiple leads. I haven’t been in the habit of putting the label dashboard on it, but I think I may start, because it makes sense. I’m loving this.
00:16:19.42 [Julie Hoyer]: I have a little bit of a question for the group then. It’s interesting, because I do feel like what happens, to Tim’s point, and the pain I’ve felt a lot of times with clients, is they want a dashboard with big, lofty promises, and they feel like, if I just surface the data for them, then all the interpretation and actual synthesis of what those individual data points mean will just happen, and they’ll know what to do. And obviously we know that’s false. So it’s interesting: how much interpretation should a dashboard do for you? In practice, a lot of times when I talk to stakeholders about a dashboard, I’ve ended up making documents where we actually outline, here’s the outcome you’re going for. Sometimes, Tim, I can get them to a performance measurement type dashboard. Sometimes I can’t, but at least I can take that and map it to, these are the things you’re trying to achieve or trying to understand. I can turn that into a list of very concise, specific business questions that actually have answers that are impactful for them to understand, and then try to get them to something that is usable. In the dashboards, then, I usually put headers with the question each data point is answering, but that’s baking in a lot of interpretation again. How much interpretation from an analyst has to be baked into a dashboard for it to be a good dashboard? That’s kind of an all-over-the-place question, but that’s where my brain’s going.
00:17:53.20 [Tim Wilson]: I’m so proud. I’m so proud of that question you asked, Julie.
00:17:55.99 [Julie Hoyer]: I was worried I was like, Tim, come for me.
00:17:57.89 [Tim Wilson]: No, no, it’s just all over the place and, you know, I’m like, it’s not just me.
00:18:03.25 [Andy Cotgreave]: So Tim, I think your statement underlines the reason we don’t really care about definitions, because what some people describe as reports, others might call applications, or might call dashboards. It’s like, what are you trying to achieve with this data, and what do the customers need? I’m very invested in the idea that the semantics of what we’re doing isn’t important. It’s the end goal that is. Julie, coming on to your question, which I’ve now just forgotten. Now I’m answering Tim’s question, I’ve just forgotten. I have the answer. Sorry. Who does the interpretation? Right. This is brutally hard, right? Because are all your users at the same level of data literacy or data fluency? That will change the way you design some of it. What we talk about in Dashboards That Deliver, we’ve got this whole framework. And it’s about application design, agile development. It’s really about thinking about user stories. So go to your stakeholders and work out: as an account executive, I need to see who’s active on my website in order to identify who has a high chance of turning into a sale. Those user stories help define the answer to the question. And there isn’t a single answer to the question, Julie. I wish there was, or at least I’ve not found one.
00:19:36.92 [Tim Wilson]: But as you described it, Julie, I’m flashing back to my early days at Search Discovery. There was, I think, a monthly report that was getting built entirely in what was Google Data Studio at the time. It was basically a presentation being built inside a BI platform, and every month it was kind of a, what’s the story we’re going to tell, let’s build it with these multiple tabs. And that was a lot of work to build something that was kind of a one-time delivery. So when you’re saying what question is being answered, I feel like, and I think this might get to Andy’s user stories: is this a question that I’m going to ask every day, or every week, or on a recurring basis? In which case a dashboard is a good way to say, here’s the answer to that question. Or is this a question that I’m asking one time? Like, hey, was there any impact from that big traffic accident that happened wherever, or the airline strike? What was the impact? Answering that should be kind of a one-time thing. Maybe it’s a different type of dashboard. There are times where I’ve seen somebody ask a question once, and we figure we might as well build it so it’s automated and they can just continually get the answer to that question, but they’re not going to keep asking that question. Now we’ve just created another bit of jumbled clutter that they have to wade through to get the answer.
00:21:23.22 [Julie Hoyer]: And that kind of comes back to what we were talking about too. Is that something that should live in an actual dashboard that a business, non-analyst user is going and looking at, having to look at 10 different data points to come to the conclusion of the one-time answer that the analyst did? Or are we saying that’s the type of question, and the type of data visualization work, that’s best left to the analysts and the engineers, because they had to go into the interactive tool, dig, and find it, and it’s not easily repeatable in some single visual that should live in a dashboard?
00:21:57.61 [Andy Cotgreave]: I totally agree. I think anybody building out data strategies needs to realise that dashboards are just one tool in a toolbox, right? I think a great example of this is, I’ve done work with the Cabinet Office, which is the arm of the UK government that works under the Prime Minister. They have dashboard teams, but they also have a team that builds data narratives. When there are big cabinet meetings, or COBRA, our emergency crisis meetings, they don’t build dashboards. They build out narratives, and there will be charts with explanation, charts with explanation, because they’ve recognized that in that use case, and every business has use cases like this, a dashboard isn’t the right thing, it’s a narrative. Something similar happened in the White House during the COVID years, with both dashboard builders and narrative builders. In the end, the narrative builders were more favored by the administration in the White House at that point. And I think it’s really important, and I’m sure we’ll go on to AI, but AI brings another paradigm of exploring data. So dashboards just exist within this collection of options, and the hard job is trying to find the right one for the right purpose.
00:23:14.15 [Tim Wilson]: So is that distinction of narrative versus the dashboard? Is that getting back to the storytelling dashboard?
00:23:21.17 [Andy Cotgreave]: Yeah, absolutely. My co-authors on the book are Steve Wexler, Jeff Shaffer, and Amanda Makulec. Steve wrote a chapter in Dashboards That Deliver about dashboards being for story finding, not storytelling. And I think that’s a really important distinction. About 10 years ago, data storytelling was the hype in the industry, and the claim was that dashboards can tell data stories. It’s like, well, dashboards can answer four questions, maybe five questions, but probably four questions at most. You might be able to design a guided flow that goes overview, zoom and filter, details on demand, like Ben Shneiderman’s mantra, but it’s not really a story, is it? A dashboard helps you find the insights, and then there’s a whole skill and paradigm of how to then tell that story, which we don’t address in this book, but I might in future books.
00:24:10.45 [Tim Wilson]: I can’t believe you’re actually already thinking about the next book when the first one isn’t even out. We must not have chewed you up and spit you out yet.
00:24:17.80 [Michael Helbling]: I just like that you’ve got things to say. You touched on AI before, and I want to spend a good amount of time on that, but before jumping off, we touched on this process you define in the book of how to build a dashboard. I would love to actually go through that in a more step-by-step fashion, because I feel like that’s such a helpful thing that’s right there in the book, and I want to give people a little taste of it. Do you mind just stepping through it?
00:24:49.58 [Andy Cotgreave]: Yeah, so the framework is largely the mastermind work of Amanda Makulec, and listeners might know her. She was executive director at the Data Visualization Society and is just an exceptional talent in process. And so the framework is largely her brainchild. It’s basically, what happens when you’re building a dashboard? First of all, there is a spark. That spark can come from many different places, but at some point, somebody goes, we need a data asset, and that is going to be a dashboard. So we talk about when it is or isn’t a dashboard. After that, you’ve got to go and talk to your users. So that is about discovering and prototyping. We took inspiration there from the double diamond design approach. Go and talk to loads of users, get their requirements, and then prototype, with wireframing or with data. Oh, and there was a debate we had internally about that. But all the way through this framework, talk to your users. Your users are part of the development team. So you’ve got your prototype, and then the next stage of the framework is development, and that’s slightly circular, because again, that’s very iterative. Then you go to user acceptance testing, release and adoption, how you train your users. Then, before you loop back to the start and spark again, when the time comes to redevelop or rehash the dashboard, ask: do you still need it? Can you deprecate it? Because we don’t deprecate enough dashboards, and many, many dashboards should die. I’m not here to say they should all live. So that’s the framework: spark, prototyping, development, release, maintenance. And yeah… what was the prototyping question?
00:26:31.43 [Tim Wilson]: Prototyping with data versus without data? What was that debate?
00:26:36.73 [Andy Cotgreave]: Should you wireframe a dashboard in the same way you would wireframe an application or a website?
00:26:46.89 [Tim Wilson]: I would say yes, without data, because as soon as you start, it’s so hard to start putting in either dummy data or, yeah.
00:26:56.96 [Andy Cotgreave]: I mostly agree. But once I got sent to a client site in Rotterdam; this was a couple of sales opportunities ago. They had worked with Jan Willem Tulp, who’s a really great designer, and he had prototyped this dashboard. And they wanted me to go and build it in Tableau. And I’m like, brilliant. So I flew to Rotterdam. And they gave me this beautiful dashboard, sketched with marker and pen: beautiful line charts and bar charts and scatter plots. It was gorgeous. And I built it in Tableau. Tableau, I say, with the American accent. Tableau. All right. Fifteen years working for this American company. Tableau. I’ll stick to the British accent. Anyway, I built it, and it looked like garbage. And I was like, why does it look like garbage? It’s because when you prototype, you draw a beautiful line chart, right? And when you draw a scatter plot, you go, oh look, a perfect regression, and the bar charts are tidy. But then when you put real data in, your line charts go haywire, and the scatter plots show chaos, and the bar charts have just one outlier. Wireframing is amazing, but when you wireframe a static website or data application, the interface doesn’t change; with a dashboard, the data drives the way it will look as well. So in the end, the customer wasn’t very happy, but they did learn that the dashboard they were building wasn’t actually a good fit for the trends in their data.
00:28:23.15 [Tim Wilson]: But that is getting at something so fundamental: how people believe that their data is cleaner than it is, that there are trends, there are patterns. I think that gets back to the top-down thing we were on earlier, when somebody says, we need to build a dashboard. And Julie, it’s kind of to your point: there’s this assumption that if we just have the data visualized, we’re going to see the scatter plot with everything and that one outlier, and that outlier is going to just magically tell us how to drive the business forward, or you name it. And then the dashboard becomes a way for them to learn otherwise. When users actually look at it, the first time they say, I’ve never been able to see this data before, this is great. And the third time they look at it, they’re like, this all just looks kind of like noise to me. It’s what I asked for, but I’m not getting anything new from it. Which I think goes back to your user stories.
00:29:23.84 [Julie Hoyer]: The other thing I was going to ask about too, in the data and design question of whether you design with data when you’re prototyping: my brain actually jumped to making sure that what they assume is possible with their data actually is. Because a lot of times, like I was saying, we go through, what are your most crucial questions? What are you trying to achieve? And they say, we have all XYZ data, all sitting there waiting for you to go build this for me. And I’m like, okay, well, we’ve got to get really specific. What you’re actually asking means we would need this type of rate and this type of data point and this type of relationship. And lo and behold, it wasn’t designed that way. So we either have to take the time to redesign your data, and work with the engineering team to get it to do what we’re asking, and make sure that it can do that. And I think people are usually shocked by that. So when you were saying, prototype it with data or not, my brain kind of went there. Is that something that you commonly run into? Is it kind of baked into the process?
00:30:25.43 [Andy Cotgreave]: Yeah, we talk about that in the book as well, because I worked in marketing for so long at Tableau. And the amount of times it’s like, we’re doing a campaign, can you just build us a dashboard for the campaign? It’s like, sure, I’ll build you a dashboard. All right, how long do you think that takes?
00:30:38.94 [Julie Hoyer]: This is the scariest question.
00:30:40.73 [Andy Cotgreave]: Yeah, I’m a Tableau pro, so you assume that’s half a day. It’s like, it’s going to take me seven days to find the data, goddammit. Right. So you’re dead right, Julie. We talk about that in the book. Getting to the data early in the process, not at the end, while still wireframing, reveals whether the trends are actually in the data, and whether you’ve got the data in the first place or not.
00:31:04.19 [Tim Wilson]: So yeah. I know we’re going to get to AI, but I’ve got to ask you about using the dashboard as a way to drive some data collection; I’m sure there’s somewhere in the book on this. Having the dashboard, but having placeholders saying, this data is not available. You have declared that it’s important and you want it, but you can’t get it yet. We can’t not roll this out, so we’re going to go ahead, because this is a designed entity, and we’re going to put a box here that says data not yet available. I like doing that because, to me, it’s a constant reminder that this is something you said was important, that you said you needed, but it’s not yet available. But I could also be not thinking about something, and maybe that’s a terrible idea.
00:31:52.62 [Andy Cotgreave]: We don’t have any examples in the book of empty boxes, but we do talk about that in the framework section. Don’t let Perfect be the enemy of published.
00:32:04.32 [Michael Helbling]: I had one thought that, and you touched on this a little earlier, which is sometimes the dashboard needs to be deprecated. A lot of companies struggle with this for various reasons and end up with hundreds, maybe even thousands of data artifacts, dashboards, reports that are almost impossible to sift through and need to be maintained. It basically burns out the data team because they’re simply like carrying a massive carcass of old dashboards.
00:32:37.43 [Andy Cotgreave]: Yeah.
00:32:37.77 [Michael Helbling]: I’m trying to be a grizzly. I’m a grizzly dragging a carcass along.
00:32:43.24 [Tim Wilson]: Technical debt for dashboards is carcassness. Okay.
00:32:46.06 [Michael Helbling]: Yeah. So to refine that into a question is, in your experience and then all the work you’ve been doing over the years, how do you advise teams how to go through and step through a process where they do that hygienic step of getting rid of the ones that don’t work or refactoring or whatever?
00:33:07.01 [Andy Cotgreave]: So one of the things you can do, that we talk about doing, is right at the start, if you’re building a new dashboard, put in an end date. At what point will this be deprecated? And you could define that as nobody’s looking at it, but we’ll come back to that. Or it might just have a finite date because it’s related to a campaign. Or it might be that the business owner will sit down and do a review in a year’s time or something like this. There are various aspects like that. At Tableau, we’ve got a great process, because we’re all allowed to publish dashboards internally, but they all get deprecated after two years. Or, after two years, if nobody’s looking at it, it just gets deleted. The builder, the owner, gets warned, and they can go and refresh it or just delete it, but we just flush them out. There is a peril in deleting a dashboard that only one person looks at once a year, because it could be that one person looking at that one dashboard once a year makes a $10 million decision, right? And it’s hard to capture that. But if they are doing that and the dashboard is gone, they’ll be like, where is my goddamn dashboard? You’re like, oh, now we realize that user had a use case. In fact, one of the dashboards in the book is by Michael Gathers, who works at an IndyCar team. His definition of dashboard success is the amount of support requests he gets, because every support request is somebody engaging with the dashboard and frustrated they can’t do what they wanted to do with it, right? So that’s a slight aside. But yeah, it’s those kinds of things you can build into the process early to deprecate a dashboard, and then good governance: monitor usage, and just flush the sandbox-published ones away.
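The process Andy describes, declare an end date up front, monitor usage, warn owners of stale dashboards, flush the rest, could be sketched roughly as below. This is a hypothetical illustration, not any BI platform’s actual API: the `Dashboard` fields and the two-year threshold are assumptions made up for the example.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Dashboard:
    name: str
    owner: str
    last_viewed: date   # from the platform's usage logs (assumed available)
    end_date: date      # deprecation date declared when the dashboard was built

def triage(dashboards, today, stale_after=timedelta(days=730)):
    """Bucket dashboards into keep / warn / delete, following the
    'end date up front, monitor usage, flush the rest' process."""
    keep, warn, delete = [], [], []
    for d in dashboards:
        if today >= d.end_date:
            delete.append(d)      # past its declared end date: flush it
        elif today - d.last_viewed > stale_after:
            warn.append(d)        # unused for ~2 years: warn the owner first
        else:
            keep.append(d)
    return keep, warn, delete
```

The point of the sketch is only that deprecation becomes a mechanical sweep once the end date is captured at build time, rather than a painful archaeology project later.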
00:34:57.61 [Tim Wilson]: I will say, if you ask somebody, are you using this dashboard, that’s bad data. Because even if they aren’t, they’ll either think, oh, but I should be, and I’m going to start, so they’ll say yes. Or they’ll be like, well, I had somebody else build it, I never really got that much use out of it, but I don’t want to hurt their feelings, and I don’t want them asking why I even requested it. If there is not the ability within the platform to monitor usage, I’m a fan of dropping the big old red text at the top saying, this will be deleted on whatever date. When you put that up, give them plenty of warning. It doesn’t necessarily solve the once-a-year problem, but I had an analyst who worked for me years ago who just refused to, because the number of people she sent the report out to was vast and included some very senior people. She was like, this must be important, because I am spending hours of my week compiling this and sending it out. And I was like, but what are they using it for? She’s like, I don’t know, look at the size of this distribution list. So I finally told her she needed to start including in the email, we’re going to stop sending this, and tell me who comes back. My point being, instead of thinking you’re solving everything for everybody, there probably are some specific use cases in there. There are people who are going to freak out, but they need one thing once a quarter. You don’t need to be sending everything weekly so that once a quarter they can get it. But I love planning an end date at the outset. People will set it two years out and they’ll be like, it’s two years. Two years will be there before they know it. Yeah.
00:36:45.08 [Andy Cotgreave]: If they even release it in two years, hopefully they do.
00:36:49.63 [Michael Helbling]: We keep dashboards like we keep gym memberships. There’s a lot there for data people. Dashboards are exceptional and useful, but there’s a lot of pain buried in there as well over the years. All right, Julie, you were about to say something, I think.
00:37:13.91 [Julie Hoyer]: I’m going to push off our AI conversation with one more question.
00:37:16.55 [Michael Helbling]: Oh boy. But I’m sorry I have to. Okay, good.
00:37:20.35 [Julie Hoyer]: Because we were talking about that one person that’s making a $10 million decision on a dashboard. Do you have thoughts on, in practice, what are some of the best ways people should be sharing learnings that they’re getting from dashboards across a team? I think I had seen part of the book talking about this. Again, we’ve talked about different types of users, right? They’re looking at the data, asking questions with different lenses, different purposes that they need the answers for. Do you have any guidelines for people, or have you seen it done really well in practice? How do you communicate what people on the same team are gaining from a dashboard, or what people on different teams are getting from a dashboard? And it’s hard to say whether it will ever get back to the analyst who’s wondering, can I deprecate this thing or not? But I’m interested to hear.
00:38:10.40 [Andy Cotgreave]: One thing I’ve learned over speaking to thousands of customers in my time at Tableau is that a refrain I hear so often is, we spend the first half of every meeting arguing about the data; we need a solution to that. And I’m like, fine, that’s true. But then I try and spin it: what if that was actually a positive thing? What if you had a good, robust data reporting system that told you average sales are going up, but the mean and median are not doing the right thing, reliable data that they trusted, so that they could spend that half hour arguing with the data, because then they’re having a data-informed conversation rather than a data argument. Organizations should bring data into conversations. For a start, data is never clean. There are just so many problems in the data pipeline at every step of the way that data is never perfect. So it is never going to be able to be the 100% informer of any decision. There is human intuition. There is human politics and ego in making decisions. And so we should embrace that. And then it comes back a little bit to storytelling. One of our best European leaders at Tableau would always come to every all-hands meeting for Europe with a brand new chart. I mean, he had a couple of people who would work on these every month. I thought that was really good, because if you sit in a meeting every month and your head of sales shows the same dashboard every month, it looks the same every month. Three months into those meetings, you just don’t pay attention, because it’s like, oh, here’s the bit where James shows the dashboard. Good God, it’s the same. It’s boring. Whereas he would be like, okay, I’ve looked at the dashboard and I’m going to visualize one insight I’ve taken from that. You’d be like, oh, here’s James telling an insight visually with data.
So there’s no specific framework answer to that, Julie, but those are examples of where I’ve seen success and what I’ve seen talked about, right?
00:40:25.88 [Tim Wilson]: I mean, counterpoint on the arguing about the data, or the bringing a new chart. I’ve seen cases where the argument is about the messiness of the data: why do these numbers not match? And then everybody gets caught up on what is the right number. Or if somebody brings a new chart and it’s surprising, then people’s tendency is to say, well, did you exclude this? Did you do that? And in every case, there would probably eventually be some general consensus that, yes, maybe returns should have been filtered out, even if it doesn’t materially change what that person is showing. Then everybody gets sent off swirling on trying to get to agreement on the definition, and there’s a lot of work and discussion and deeper understanding of the data, not of the business. There’s a sense that, wow, we did something, because we had an argument and we resolved it. And all you’ve resolved is some definitional shit; you haven’t necessarily actually resolved anything that’s going to move the business forward. I think that’s where I struggle.
00:41:36.66 [Andy Cotgreave]: Tim, I’m going to force the segue now, because if humans can’t do it, how the hell would a generative AI have the context to know those things that even we can’t work out?
00:41:48.16 [Tim Wilson]: Well done.
00:41:49.80 [Michael Helbling]: This man is a professional.
00:41:51.36 [Tim Wilson]: Michael, he’s coming for your job.
00:41:55.53 [Michael Helbling]: But yeah, so that’s the latest hype, of course, in every space. We all have this mandate to leverage AI, use AI, where can AI do all this? So in the world of data and dashboards, yeah, what are you seeing in the space as it relates to AI?
00:42:10.15 [Andy Cotgreave]: Let’s talk about ThoughtSpot. They were early in natural language approach to querying data, and ThoughtSpot are great.
00:42:17.72 [Tim Wilson]: This was hilarious if this is going to go away.
00:42:21.55 [Andy Cotgreave]: Their genius marketing campaign was “dashboards are dead.” Come and do natural language. It’s a brilliant campaign. I love it. You go to ThoughtSpot’s homepage and watch the demo, and it says, you can ask any question. When you’ve got a chart you like, you save it to a Liveboard, which is what we call a dashboard. Right. So on the one hand, generative AI and LLMs promise infinite answers to infinite possible questions related to data. That’s fantastic, but I’m lazy. If I go ask the same question every week or every month, I don’t want to type that question every week. I just want to open my phone, go to, say, Tableau Pulse, for example, which is really good at this, and it’s just like, oh, my metrics are going up, my metrics are going down. Put my phone down and move on. I don’t want to type that every time. And ThoughtSpot recognize that with their Liveboards. People want to go back to those things. So on one level, people still want at-a-glance ways to monitor and explore data to facilitate understanding. Hey, that’s a dashboard. Then the second aspect of AI is this context thing we’ve just talked about. If we as humans can’t solve this context challenge, then generative AI isn’t going to do it. There’s a huge amount of effort around semantic layers across the industry to try to define business data. We have been trying to do that for about 50 years. We haven’t done it yet. So, I mean, there’s now more of a catalyst to get that shit done and get it done properly, but humans are lazy and under pressure. And then there’s using generative AI as a data analyst. I mean, when I do a data project, it’s, I’ve got to build a dashboard for this. So, okay, fine. Who’s got the data? I’ve got to email that person and get them to give me the password. They haven’t got a password, it’s a local file, so they send me a copy of the Google Sheet. Then I connect it somehow to the core data systems.
They don’t actually match properly, but you can’t really do the matching in the tool, so I have to manually connect the data. Then I have to find another data set, but I have to broadcast a message, because who owns this data, right? AI can’t get over that stuff. Those are human processes, because data is messy as anything, right? Right. So, so far in this answer, I’d be down on generative AI. I think it is unreliable for data analysis. It is unreliable. It doesn’t understand business context, and it can’t access the things a human can. It’s not all useless. There are some things it can do pretty well. But in all my experience of trying to use AI to do data projects, I’ve been left wholly underwhelmed and really frustrated. I just thought, I could have done this quicker myself.
00:45:10.32 [Tim Wilson]: There’s this idea, if you probe when somebody says, well, I need on the dashboard, I need to see sales trending over time broken down by geography. You’d say, well, okay, you could ask AI that and maybe it could produce it, but aren’t you always going to want to look at it? That’s a lot more work for you to continue, to your first point, to ask that question again and again. That’s still not really what they want. I feel like there’s this nebulous, optimistic thing that, no, I’m not even going to ask the question, I’m just going to have an agent that is just going to tell me insights and I’m not going to have to ask it anything. That’s the path of trying to completely cut out the human, which I don’t feel is all that different from past iterations. That’s what digital was going to do. All of a sudden, we’re going to have all of this data, and therefore this will be a shortcut; we’ll be able to do one-to-one marketing. Never worked out. There’s a little bit of everything old is new again. There’s this weird thing where people think this is the technology that, if you really probe, they want it to just tell them what to do. And then actually they’re like, well, no, just do it for me. It’s like, well, what are you as a human doing? Do you have such low regard for yourself that you think you should just be entirely replaced? Somewhere there’s thinking through, what do I want to look at? Where does that data come from? What does it mean? There is value in the friction, even for non-analysts, to understand the nuances of the business. What is a monthly active user? What does that really mean? And that’s not a data question. That is a, what do we in our business strategy think matters? And if we were just able to ask a chatbot, it’ll just tell me MAUs, tell me MAU anomalies, then I haven’t actually taken the time to understand what that really means. So yeah, I’m similarly frustrated.
00:47:28.31 [Andy Cotgreave]: I’ve tried the exercise with customers. If you want the AI to be able to ask and answer any question, write down, so let’s say it’s to do with active users, write down, as granular as you can, the list of every single thing that has to be accurately defined or anticipated in order for an AI to know where to look across loads of data sources. That list is so long. Then, you talked about how digital was a thing. I’m going to tell you my favorite quote, right? And ask you first, do you kind of agree with this? So, millions of dollars are spent every year collecting data with the assumption that having the data solves the problems being studied. Nod yes if that seems reasonable. Millions of dollars being spent yearly collecting? Yes. Right.
00:48:12.32 [Tim Wilson]: Oh, yeah.
00:48:13.12 [Andy Cotgreave]: So second question, what year is that quote from?
00:48:18.43 [Tim Wilson]: 1968. 68, says Tim.
00:48:21.33 [Michael Helbling]: I mean, I have no idea, but I’m with Tim that it probably goes back way longer than we even think.
00:48:30.47 [Andy Cotgreave]: Well, it’s from this book. I’m now holding up Graphic Methods for Presenting Facts by Willard C. Brinton, printed in 1914, 110 years ago. He said, on page one, we’re spending millions of dollars collecting data and it doesn’t solve the problems. Millions of dollars, right? And this whole book is about setting up a data culture, about being data analysts, creating curves and charts for executives and presenting data. And this book gives me an existential crisis, because we haven’t solved the problem in 110 years.
00:49:01.00 [Michael Helbling]: I was going to say that doesn’t make me feel any better at all.
00:49:04.15 [Andy Cotgreave]: But what I’ve come to realize is that there is a frontier. We have a bunch of data, and we have technology, and together they create a frontier of what that data and technology can do. And ever since Brinton’s time, our human ideas have just pushed beyond that frontier. So we will always be frustrated. Whatever happens in the next 20 years, the next technological innovations, they’ll be frustrating. But at that point, AI will have walked its way into our lives and we’ll just be using it, and things will be better. The problems are not new, right? I’ve had to wrestle with that every day since I read this book.
00:49:44.11 [Michael Helbling]: I do sort of see some really interesting use cases where people could leverage AI as part of the process. In the discovery phase, use AI to pull together common themes and make sure you’re getting good visibility into everything you’ve discovered in all those conversations with users. In the prototyping phase, get AI to help decide, am I using the right kind of visualization here? What other alternatives might be good for this use case? Or in development, help me extend my coding abilities or my data normalization abilities beyond what I’m capable of. I’ve got a friend, Donal Phipps, who said this when I was talking to him two days ago: augmentation over automation. And I think that’s the framework that, as analysts, we can all sort of say, okay, yeah, it’s not going to replace what we do. It can’t. But there are all these different places within a framework like the one in the book where we can leverage AI to accelerate our efforts or augment the efforts that we’re putting in.
00:50:50.22 [Andy Cotgreave]: Augmentation, not automation. I think that’s a brilliant three-word summary. The challenge is that that doesn’t satisfy the business model these tech companies are working on, because they are waiting for the massive enterprise-scale solution, and it isn’t there, because it can’t be trusted.
00:51:06.08 [Tim Wilson]: But it is funny how it goes that way. You start with, is it going to kill dashboards? And you wind up with the defender basically saying, well, it’s useful for prototyping the dashboard, and it’s useful for evaluating the dashboard and thinking about what’s going to go on the dashboard. And somehow it’s, see, I proved to you that GenAI is useful. And it’s like, no, nobody’s saying that it’s not useful and valuable, but that’s a different conversation from its use in the context of traditional analytic delivery of information.
00:51:45.13 [Julie Hoyer]: I’m curious, do you think that using AI in the interpretation of data would end up helping people? Tim, I’m thinking about how you build a dashboard. It has beautiful visualizations, it has clear trend lines, you can see what’s going on. But at the end of the day, we always run into, how do you help the business actually correctly interpret it? Like, this is time series data. Or here are the caveats. Or this is expected variation. Variation, time series, those types of things. Do you think that there is an opportunity that AI would be able to help people answer the questions around, is this a material change I’m seeing in the data? Or, I mean, I’m guessing that’s kind of far-fetched. I don’t even know if people would know to ask that question.
00:52:38.73 [Andy Cotgreave]: Yeah, I think there are huge opportunities, and many people across the industry are chasing that. So Tableau Pulse tracks metrics, but uses narrative science to interpret the data and put it into text. And you can paste charts into ChatGPT and ask it to explain the visuals. So that’s two things. The narrative based on the data is kind of bland, because it’s quite hard-coded as text. You’re almost pre-defining what the paragraphs are going to say. As an end user, if you see that 10 times, you probably stop paying much attention to it. But then if you just get the AIs to interpret a visual, it’s so unreliable. I’ve been through that pain: you ask it to explain this chart, and it just makes stuff up. Because it’s a probabilistic, stochastic approach, it just makes things up. It’s hard to rely on it. Yes, it’s got huge potential. The pitfalls are huge. As data analysts, we’ve spent decades trying to build up trust with our stakeholders. This is from Tableau Tim, from his video, I’m stealing his quote here: it takes minutes to lose that trust. And if we give them tools that say sales went up when they went down, that would be a terrible indictment of what we do as analysts. Huge opportunity, currently high risk.
00:54:18.63 [Tim Wilson]: I’ve got to make my point here, because I think there’s a parallel even in the human work. I had this happen years ago when I was at an agency: the client, a big CPG, I think it was P&G, had outsourced the pulling of the reports offshore, and this meant the onshore agency was going to get less of the work. And they said, they’re just going to send you the reports and you’ll look at the data and interpret them. So I think there’s opportunity for AI to move upstream with the business users to say, let’s help you really formulate questions and ideas, and not, I’m looking at a dashboard. I mean, I die a little bit anytime someone’s like, we’ve pulled all the data together, we’ve built all these dashboards, they’re amazing, they follow all the right principles (okay, this has happened to me multiple times in my career), can you come in, Tim, and tell us how to look at those dashboards and get insights from them? And I’ve gotten pushback when I’m like, let’s not look at those. Let’s talk about your business for a while. What’s keeping you up? What are your challenges? What are your ideas? And then come in with a much, much narrower view of what specific question we need answered, maybe even helping refine what data we would look at and what we should be concerned about. And then we arrive at the chart with a much, much clearer purpose: we’re going to see if sales went up. And when you say went up, are we clear? Are we talking about sales going up a material amount, an amount that doesn’t look like it’s just noise? I think there’s huge potential for AI to patiently sit there and introduce good friction to the process, so you’re not going to jump to looking at charts and then think AI is going to glean insights from them. It’s like giving somebody a bag of rocks and saying, find the valuable ore in this rock.
Well, they could bust them up. They could look at them. There’s no guarantee that there’s valuable ore in that rock. So you’re starting kind of farther, too far downstream.
00:56:37.08 [Andy Cotgreave]: Oh, well, I’ll say it sounds like you’ve read the first section of Dashboards That Deliver, which is all about exactly that.
00:56:43.11 [Michael Helbling]: Damn, to bring it full circle. Andy, incredible. We do have to start to wrap up. And that is the new book, Dashboards That Deliver: How to Design, Develop, and Deploy Dashboards That Work. This has been so fun to have this conversation, and I think very helpful and insightful. So thank you very much, Andy, for coming on the show. And now it’s time for something I’m really excited about. We’re going to take a quick break to hear from a friend of ours, Michael Kaminsky from Recast, the media mix modeling and geo-lift platform helping teams forecast accurately and make better decisions. Michael’s sharing some bite-sized marketing science lessons over the coming months, and we’re happy to have him on the show to help you measure smarter. Over to you, Michael.
00:57:31.27 [Michael Kaminsky (recast)]: What exactly is power analysis? I get asked this question a lot. Power analysis is the process of trying to understand the limitations of an experiment prior to actually conducting it. When we’re designing an experiment, we face trade-offs. We want to generate the most useful conclusions as cheaply as possible. Power analyses allow us to understand this trade-off between the conclusions we’ll be able to draw and the cost of the experiment. The most generalized form of a power analysis is a simulation, where we code up the statistical analysis we’re going to do once the experiment concludes, and we apply that same statistical test to hundreds or millions of simulated experiments, showing us under what conditions we’ll draw the correct conclusions. We might see that with the amount of lift we expect to see, we need a sample of at least 250 participants to draw the correct conclusion 99% of the time. But if we accept some risk and are okay if we only draw the correct conclusion 95% of the time, maybe we only need 100 participants. The main variables that affect our statistical power are the assumed underlying variation in the data, the assumed effect size generated by the intervention being tested, and the sample size. So what’s the takeaway? Power analyses are simulation exercises that help you understand the limits of your experiments and their expected cost before you run them.
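The simulation Michael describes can be sketched in a few lines of Python. This is a deliberately minimal, hypothetical version: it assumes a normally distributed outcome with known standard deviation and uses a two-sided z-test at alpha = 0.05, so the numbers are illustrative rather than a recipe for any specific experiment.

```python
import math
import random

def simulated_power(n_per_group, effect, sd, n_sims=2000, seed=42):
    """Estimate statistical power by simulation: the fraction of simulated
    experiments where a two-sided z-test detects the assumed effect."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        # Simulate one experiment under our assumptions about variation and effect size.
        control = [rng.gauss(0.0, sd) for _ in range(n_per_group)]
        treated = [rng.gauss(effect, sd) for _ in range(n_per_group)]
        mean_c = sum(control) / n_per_group
        mean_t = sum(treated) / n_per_group
        # Apply the same test we'd run on the real experiment (known-variance z-test).
        se = math.sqrt(2 * sd ** 2 / n_per_group)
        z = (mean_t - mean_c) / se
        if abs(z) > 1.96:  # two-sided test at alpha = 0.05
            hits += 1
    return hits / n_sims
```

Sweeping `n_per_group` over a grid and finding the smallest sample size where the estimated power clears your target (say, 0.95 versus 0.99) reproduces exactly the cost-versus-confidence trade-off described above.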
00:58:39.32 [Michael Helbling]: All right, well, if you enjoyed that mini lesson, Michael and the team at Recast have put together a library of marketing science content specifically for Analytics Power Hour listeners. For everything from building media mix models in-house to communicating uncertainty to your board, head over to www.getrecast.com slash aph. That’s www.getrecast.com slash aph. All right, well, one other thing we love to do on the show is go around the horn and share a last call, something that might be of interest to our audience. And you’re our guest, Andy. Do you have a last call you’d like to share?
00:59:13.83 [Andy Cotgreave]: Yeah, I’ll go. Let’s go to the movies, summer 2025. I recently, I’m not a massive Marvel Universe fan, but I did go and see First Steps, or Fantastic Four: First Steps. Why is that relevant to your listeners? Because they have some cool data displays in that film, right? In fact, the visual design of that movie is fantastic, and there are some really exceptional data displays. So go check out the movie. And also, now that I’ve pointed it out to the audience, you will never not see charts when they appear in movies. It’s brilliant. They’re everywhere. I’m a really good speaker.
00:59:50.31 [Michael Helbling]: Awesome. Awesome. All right, Julie, what about you? What’s your last call?
00:59:54.23 [Julie Hoyer]: Mine’s in the entertainment space a little bit too. A little different than my other last calls. But honestly, I am a sucker for any kind of sports documentary. And there was one I recently watched, Power Moves. It’s about Shaquille O’Neal coming in and helping Reebok basketball make a comeback. And I am really curious, because, of course, by the time those come out on Netflix, it’s so delayed. It was talking about last year, and I think the launch of their first basketball shoe was earlier in 2025. And I immediately was like, I want to know, is Reebok basketball really going to make a comeback from this? Do people love the shoes, hate the shoes? What do their dashboards look like internally? What’s their revenue doing? I was just kind of like, did he make the comeback that he was trying to make? So I’m kind of invested now to try to dig in and see.
01:00:44.25 [Michael Helbling]: All right, so if you’re from Reebok, reach out to Julie to give her the update on how things are going.
01:00:49.95 [Tim Wilson]: Yeah, I want to know. Well, who’s the kid from Maine who got drafted first in the NBA? Because he wears Converse? Anybody? Cooper Flagg? What does Cooper Flagg wear? His shoes are… I’ve heard the name.
01:01:05.41 [Michael Helbling]: I don’t know what shoes he wears.
01:01:06.91 [Tim Wilson]: New Balance, I think. Yeah. All right, Tim, what is your last call? Mine’s, like, right, totally in entertainment as well. It’s also pandering to our new sponsor, Recast, unintentionally: What Are Priors in MMM and Why They’re Difficult to Get Right, But You Need To. I don’t know that it’s on Prime Video just yet, but I’m sure it’s going to be made into a movie. It’s a long piece; Marty Sanchez at Recast wrote it. Because I’ve lived at the surface level of how Bayesian differs from frequentist: well, you have priors, then you adjust your priors. So it’s a pretty deep dive that I’ve kind of committed to reading two or three more times, and I’ll still be a little bit confused, but it’s really well written. That’s not a knock on Marty, it’s a knock on my understanding. But it’s a good read on priors and media mix modeling, and not the NBA or the Marvel Cinematic Universe. Yeah. And that was for the right person. Yeah. Oh, Tim, what do you do for fun? Michael, what’s your last call?
01:02:21.73 [Michael Helbling]: Yeah, we’re aware. Okay, so over the years, there’s been this conference in the US that’s gone by a couple different names: Exchange, DA Hub. And it’s always been one of my favorites. It’s got a huddle-based format, so you’re sitting down and having these conversations with fellow practitioners, people from all these different companies, and so the signal-to-noise ratio is super good. You’re not getting talked at by a bunch of speakers all the time. And it’s coming back. It’s called the Data Exchange Conference, and it’s happening October 27th to the 29th in Asheville, North Carolina. So I’m really excited about that. And I’m excited to see that conference emerge again, because I feel like we need something like that for business users and data users to be able to share insights and knowledge with each other in that kind of format.
01:03:18.43 [Tim Wilson]: So that’s my last call.
01:03:19.21 [Michael Helbling]: Is it going to be at the Biltmore Estate? It’s not at the Biltmore, but good question. I think there’s another hotel or property that they’re hosting it at, but Asheville is a beautiful city. And Asheville is also coming back from those major hurricanes that hit Western North Carolina, so even going there and being part of it is part of bringing that region back from a major disaster. You can feel good about that too, if you go. Okay. Obviously, we’ve been talking about dashboards and the design of them and how to do that, and I’m sure you have thoughts and questions. We would love to hear from you. The best way to reach us is through either LinkedIn or the Measure Slack chat group, or you can email us at contact at analyticshour.io. Andy is going to be out in the world, and his new book is available in just, like, a week. So go to Amazon or wherever you buy books and check that out. And bring your books with you so you can get him to sign them when you see him out in the world. And Andy, thank you so much. This has been such a fun conversation and such a needed one, I think, right at this moment in time. My pleasure. Thanks, everybody. And no matter what you do out there with your dashboards, remember, I think I speak for both of my co-hosts when I say, keep analyzing.
01:04:46.56 [Announcer]: Thanks for listening. Let’s keep the conversation going with your comments, suggestions, and questions on Twitter at, at Analytics Hour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.
01:05:04.41 [Charles Barkley]: Those smart guys wanted to fit in, so they made up a term called analytics. Analytics don’t work.
01:05:11.13 [Michael Wilbon]: Do the analytics say go for it, no matter who’s going for it? So if you and I won the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.
01:05:24.24 [Julie Hoyer]: I was waiting for Carlos to come back.
01:05:26.62 [Tim Wilson]: Oh no, that’s… Even though I’m the one who gets stuck doing that, I start panicking, like, as Michael’s wrapping up, saying, what? I know there was something. Oh. Yeah. I literally have to write them down during… I forget to do that. That would have been a… Oh, yeah. Damn it. Someday we’ll figure out how to actually do intros with asynchronous arrival.
01:05:52.32 [Michael Helbling]: Well, you know, I wanted to make sure to get some shop talk about book publishing in there, you know.
01:05:59.43 [Andy Cotgreave]: All right. Not out of that cool club. Last thing, I’m just going to tell my daughters to ensure that. Yeah, that’s totally fine. You’re all good.
01:06:08.81 [Tim Wilson]: Just scream it for where you’re standing.
01:06:10.53 [Andy Cotgreave]: Yeah, that’s the American way. I’ll be back in a minute.
01:06:19.39 [Michael Helbling]: Nobody respects what I’m trying to do here, so, you know.
01:06:36.60 [Tim Wilson]: Rock flag and dashboards are not dead. That’s the verdict.