Every listener of this show is keenly aware that they are enabling the collection of various forms of hyper-specific data. Smartphones are movement- and light-sensing biometric data collection machines. Many of us augment this data with a smartwatch, a smart ring, or both. A connected scale? Sure! Maybe even a continuous glucose monitor (CGM)! But… why? And what are the ramifications, both for changing the ways we move through life for the better (Live healthier! Proactive wellness!) and for the worse (privacy risks and bad actors)? We had a wide-ranging discussion with Michael Tiffany, co-founder and CEO of Fulcra Dynamics, that took a run at these topics and more. Why, it’s possible you’ll get so excited by the content that one of your devices will record a temporary spike in your heart rate!
0:00:05.8 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.
0:00:13.7 Michael Helbling: Hey everybody, welcome. It’s the Analytics Power Hour. This is episode 265. I think it was Socrates who said the unexamined life is not worth living. And I believe he said that right before putting on his Oura ring, slipping on his Whoop band, and jumping into his Eight Sleep bed. One thing for sure, though: we’ve got a lot more places to collect data about ourselves than we did back in his day. And I think it represents some interesting possibilities, maybe some challenges. So we wanted to talk about it. I mean, we’re data people, so who better to tackle this topic? And Julie Hoyer, manager of analytics at Further. Do you use any of these tools to, like, measure stuff about yourself?
0:01:01.0 Julie Hoyer: Funny enough, I religiously wear an Apple Watch and it’s collecting things, but I couldn’t tell you the last time I looked at the dashboard summary data in the app, if I’m honest.
0:01:11.8 MH: Nice. No, that counts though. That counts. So, Tim Wilson, head of solutions at facts & feelings. How about you, measuring your heart rate?
0:01:22.3 Tim Wilson: I’ve got my Polar H10 heart rate monitor on right now because just want to see how excited I get throughout this nice show.
0:01:30.1 MH: Nice. So we should run that in real time along with the podcast to see how excited or unexcited Tim is on a topic, or how stressed out it makes him. And I’m Michael Helbling, and yeah, I think I’ve got stuff on my phone that measures how many steps I take and things like that. Okay, but we needed a guest, somebody who could help shed some light on this topic and bring this discussion to you, our listeners. So we found one. Michael Tiffany is the CEO and co-founder of Fulcra Dynamics. He was the founding CEO, then president, of Human, a cybersecurity company. He also serves on various boards as well as advises startups. And today he is our guest. Welcome to the show, Michael.
0:02:11.5 Michael Tiffany: It’s a pleasure to be here. Me and all of my connected devices.
0:02:16.2 MH: Nice. Are you big… Do you do quite a bit of that? Or, I assume because of your company, you probably do a lot of testing, at least.
0:02:24.0 MT: Rocking an Apple Watch. I’m wearing an Oura ring. I’ve got a connected scale. I’ve got an Eight Sleep bed. I’m breathing into this Lumen device to instrument my metabolism by looking at my out-breaths. Here’s how weird I am: I’m rocking a smart, addressable breaker box. So among other things, I’m, like, monitoring power to the stove to just passively monitor how often I’m cooking.
0:02:55.8 JH: Wow.
0:02:55.8 MT: Yep.
0:02:58.2 JH: That is a whole nother level.
0:03:00.2 TW: What’s the Eight Sleep bed? What’s the Eight Sleep bed do?
0:03:02.9 JH: Yeah. I haven’t heard of that.
0:03:03.6 MT: Ah, it’s magnificent. It’s a bed that circulates water… Interwoven through the entire bed topper are small channels for water that run to a refrigeration-slash-heating unit. So the bed can either cool you down or warm you up. Or in my case, key for marital bliss, cool me down while warming my wife’s side.
0:03:30.5 JH: Whoa. Wow. That sounds nice.
0:03:32.9 MH: But it also does a lot of measuring of, like, your sleep quality and stuff like that at the same time, right?
0:03:37.8 MT: That’s exactly right. I was an early adopter. Owning this thing feels like owning a Tesla, where the same hardware has been getting better and better with OTA updates. So while I bought it mostly for that temperature regulation, I’ve seen its sleep monitoring, its measurement of my heart rate in the night, just get better and better and more accurate, which has been a delight.
0:04:03.4 JH: Wow.
0:04:04.2 MH: Nice.
0:04:04.6 TW: Wow.
0:04:06.0 MH: Yeah. I used to have an If This Then That routine running against my scale to dump my weigh-ins into a Google Sheet, but that was a long time ago. And I think the company that made that scale doesn’t have an API anymore.
0:04:19.9 MT: So that was like my gateway drug.
0:04:23.8 MH: Okay, nice.
0:04:25.4 MT: [0:04:25.4] ____ If This Then That scripts to gather this kind of stuff. Yeah. And look at me now.
0:04:30.9 MH: Nice.
0:04:31.8 TW: When I’m traveling, I miss a little bit that I don’t have a scale, because a weigh-in every morning is part of the routine, though not to the point of having a connected scale. I actually was given a connected scale for Christmas, I think, a year or so ago. And I’m like, I don’t think I need that. I can just take a measurement and punch it into my phone while my toothbrush is running. But yeah, who knows?
0:04:54.7 MT: Yeah, whatever works.
0:04:56.1 TW: Okay.
0:04:57.6 MH: All right, so…
0:05:00.2 TW: It’s not a competition.
0:05:00.2 MH: What is… Well, yeah, I hope not, because I’m not winning.
0:05:02.8 TW: Michael wins. Yeah.
0:05:06.4 MH: No. All right. So, yeah, Michael, what is the word for this? One of the terms that gets used a lot is sort of self-quantification, or self-data. But what is the holistic term for this, and what’s going on in this space? Because obviously we’ve mentioned a bunch of different companies and things like that, but there are more. There are many, many more.
0:05:26.1 MT: That’s right.
0:05:26.5 MH: And you can go beyond that to, like, DNA, like 23andMe and those kinds of things as well.
0:05:32.6 MT: Yes. So in the early days, I would say the pioneering hackers who were coming together and sharing tips and tricks were talking about the movement as “quantified self.” And that really was its pioneering phase. These days, like I just showed you my Oura ring: they’ve surpassed a million sales in North America. This is now a popular device, not just a niche device. And while that has taken off, quantified self as a term of art, I would say, has actually declined. And this is a good thing, not a bad thing, because what quantified self promises you, by just the meaning of the words, is a bunch of numbers. And that’s not what people want. They want insights, they want self-knowledge, and they want, increasingly, connected wellness.
0:06:19.7 MT: So you see now terms of art that are more about connected fitness, connected wellness, connected health. And I think that captures something important, which is the intention, the goals here. It’s not really about counting steps. It’s actually about 10 more years of a good life.
0:06:42.9 TW: Yeah. So what is… I guess, is there a term, or is there a singular idea or vision where everyone says, this is what we’re trying to get to, right?
0:06:54.8 MT: Yes. So, if I had to pick one, it would be connected wellness. And the reason why it’s those terms in particular is that we’re in a transition right now, based on the recognition that healthcare has for many, many years really been something more akin to sick care. It’s about fixing you after something is broken. And that’s not awesome. There are things you should be doing right now to improve your wellness that mean fewer things will go wrong. So that’s the… apart from just branding and marketing, that’s the true reason why you’re seeing the word wellness more. It’s to try to differentiate the proactive pursuit of optimal health from recovery after something goes wrong. And then we’re doing that in two ways that are new, signaled by the word “connected.” One is that we’re wearing increasingly smart devices that in effect make you like a type A personality, like a really good diarist, without you having to do any work. I just step on my scale; I don’t write anything down, which is nice. So it’s connected in that sense: the device is somehow actually sending bytes over the wire. And then it’s also connected in the sense that this data, by being digitally native, is more shareable with a doctor, with a loved one, maybe even just shared socially, because so much about staying fit and healthy depends on social engagement and doing it with others.
0:08:41.6 MT: So if I had to pick two words to capture everything that seems to be the ascendant term, it would be connected wellness.
0:08:51.9 TW: And who’s… This is funny. I think of the early days of the Internet of Things, where there was talk of… imagine your garage door being able to tell you that it’s got a bearing that needs to be greased and it’s gonna go out. Sometimes those seem kind of forced. I don’t spend a whole lot of time feeling like I need to preemptively maintain my garage door opener. It will break every 10 to 15 years. But when you talk about health, logically, early detection/preventative care makes sense. Is the thinking that that is in the hands of the person, or is the thinking that it’s in the hands of the… I mean, the data collection is physically tied to the human. But is it something where the healthcare provider will say, I need your historical data, if you have it? Who’s… yeah, where does it come from?
[overlapping conversation]
0:10:01.9 MT: Here’s how I’m approaching this in my own life, and I’ve found this to be transformative. It actually goes back to Michael’s opening observation about Socrates. Self-knowledge is incredibly hard. It’s actually incredibly difficult to achieve extraordinary self-knowledge. And the best way to achieve extraordinary self-knowledge and insight for the past several thousand years, going back to Socrates, going back to the Vedic religions in India, or even, I was just looking at the Rule of St. Benedict from some 1,500 years ago… everyone does the same thing worldwide, which is dramatic simplification. You live like a monk. This is the point of the monastic life. It’s to dramatically simplify your life so you can focus and achieve extraordinary self-knowledge and insight. And sometimes you peer into the very nature of reality as well. I don’t want to do that. I want to have the self-awareness of a monk while actually engaging with the world like a bon vivant. And so the challenge I set before myself, being a computer nerd, is: can I use computers to help me out in this regard? Because computers are infinitely patient. And honestly, they’re really good at counting stuff.
0:11:36.4 MT: I believe that kung fu masters centuries ago really could cultivate the ability to just be constantly aware of their own heart rate. And that was probably awesome. I’m not going to do that. I’m going to put on an Apple Watch. So that’s sort of an empowering view of the world. But I would say that something must be missing, because the people donning Apple Watches or Oura rings or other kinds of instrumentation are augmenting their bodies, they’re augmenting their lives, with breakthrough technology that was sci-fi just decades ago. But I don’t think we feel like the Six Million Dollar Man. It’s not like you strap this on and just feel magically empowered. So what’s missing? I think that siloization is a really big limiting factor. And I’ll give you a healthcare example, and we’ll come back to my connected breaker box.
0:12:39.0 MT: My bed has all these awesome instruments. It’s measuring my HRV. It’ll tell me how long I spent in deep sleep. But it knows nothing about what I did the day before that contributed to, or ruined, a good night’s rest. I, for instance, learned something, and other Fulcra users have seen the same thing, by getting passive telemetry on my eating. I’m not even a big food logger; that’s a little bit too much work for me. But I will put on a CGM. So I’ve done multiple experiments wearing a continuous glucose monitor that just passively records my blood glucose, and therefore is going to see blood sugar spikes when I’ve eaten a bunch of carbs.
0:13:31.2 MT: And what do you know, a few weeks’ worth of experimentation showed that if I want better, specifically deep, sleep, I should shift my carbs, if I’m gonna eat any, to the beginning of the day. So carbs before noon, I sleep well. Carbs after noon? Eh, you’re starting to get into a danger zone. Dessert after dinner? Forget about it. I’m gonna have an elevated heart rate and I’m gonna have shortened deep sleep.
0:13:58.4 MT: It is impossible to know about that causal relationship unless you somehow tie the data that’s drawn from the CGM together with the data that the bed knows about. So we’ve surrounded ourselves with these ostensibly smart devices, but they’re not really smart; they’re just data-producing devices. The smartness comes from a higher level of analysis. And I feel like people like me are on the leading edge. We’re geeking out on our own data; we’re doing data science on this raw data in a Python notebook, which is maybe too much to ask of the average person. But that’s going to be within the grasp of the average person, to some extent already, and to an increasingly large extent because of coding copilots. People who’ve never written a lick of code before are sometimes getting one-shot outputs of functional code from [0:15:00.0] ____ ChatGPT. That means that what used to be really esoteric data science skills are becoming increasingly within the grasp of ordinary people. But only if you’ve gathered and de-siloed the data. Hence my focus with Fulcra.
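To make that concrete, here is a rough sketch of the kind of notebook analysis Michael describes: joining CGM exports against per-night sleep summaries to compare deep sleep after early versus late glucose spikes. The file names, column names, and the 140 mg/dL spike threshold are illustrative assumptions; every vendor shapes its exports differently, which is exactly the de-siloing problem.

```python
# A rough sketch of a cross-silo notebook analysis: CGM data vs. bed data.
# File names, columns, and thresholds are assumptions for illustration.
import pandas as pd

cgm = pd.read_csv("cgm.csv", parse_dates=["timestamp"])    # glucose readings, mg/dL
sleep = pd.read_csv("sleep.csv", parse_dates=["night"])    # per-night bed summary

# Flag glucose spikes (assumed > 140 mg/dL) and find the last spike hour per day.
spikes = cgm[cgm["glucose_mg_dl"] > 140].copy()
spikes["date"] = spikes["timestamp"].dt.date
last_spike_hour = spikes.groupby("date")["timestamp"].max().dt.hour

# Join each night's deep-sleep minutes against that day's last glucose spike.
sleep["date"] = sleep["night"].dt.date
merged = sleep.set_index("date").join(last_spike_hour.rename("last_spike_hour"))

# Compare nights that followed late carbs vs. early carbs.
late = merged[merged["last_spike_hour"] >= 12]["deep_sleep_min"]
early = merged[merged["last_spike_hour"] < 12]["deep_sleep_min"]
print(f"Deep sleep after early carbs: {early.mean():.0f} min")
print(f"Deep sleep after late carbs:  {late.mean():.0f} min")
```

A few weeks of data and one join is all the "carbs before noon" finding takes, once the two silos land in the same table.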
0:15:16.8 JH: I think that’s something that I’ve been thinking about a lot: how, even once you have all the data in one spot so that you could use it to paint a bigger picture, ask more helpful questions about your health…
0:15:31.4 MT: Right.
0:15:32.0 JH: How do we determine what good looks like? Because what’s interesting is, some of the devices seem to be making some of that decision and determining what a good range of these metrics is; other ones don’t. They truly do just collect the data. So it’s interesting to think, when you start to connect those things and tie them together, kind of back to Tim’s question: does it become a place where that is baked in, so an individual can go ask these questions and get those types of answers? Or is it more that the value is it’s all together, and you could take it to a professional to help tell you, what does this mean? Is this good, is this bad? And based on that, what do I do about it?
0:16:12.6 MT: Yeah, yeah. I’m thinking about changing the world in this order. Once you have the self-knowledge that I’m describing, then you also have new ways of sharing how it’s going in your life with another person, which could be a doctor, but could just be a spouse, could be a group of friends. So everything starts with solving the observability problem. I think it’s too hard to get help, because it takes so much effort to just describe to anyone else: this is what’s going on with me, this is how I’ve slept the last week, or this is what’s stressing me out. All of that data you can think of as the human equivalent of what we call observability in DevOps. The instrumentation, these connected devices, they’re solving the observability problem. Then there’s this analysis problem, which we just sketched.
0:17:11.8 MT: And then finally, there are new forms of sharing. And I’m really excited about that. I want to know how my friends are sleeping in general. How is it going with people that I love but now live distant from me? And also, what’s normal? So what I’m hoping is that by reducing the friction and the risk of sharing personal observability data like this, by making it secure and controllable, we’ll also be able to pool this data to find out what’s normal across larger groups.
0:18:00.8 MT: So you can kind of compare yourself to averages. Right now it’s really hard to tell, am I a weirdo? And I think the internet is sort of good at solving those problems if you can build the bridge between the data collection and the kind of social sharing that you want to do.
0:18:21.9 JH: Hmm.
0:18:24.0 TW: I’ve got anxiety now. As it is with… I mean, with Strava, or… I had a Fitbit before, or Apple. There does feel like a broad parallel that is not encouraging, which is: move away from us measuring ourselves to just kind of the world of digital, where at a corporate level there is this obsession with, let’s gather everything we can. I mean, the 360-degree view of the customer taken to an extreme would be a marketer knowing how often you’re cooking so they can make easy-to-cook meals available to you or something.
0:19:08.4 MT: Yeah, right.
0:19:09.1 TW: I mean, there’s the nefarious which I feel like insurance and government we should get into as well.
0:19:15.3 MT: Right.
0:19:15.6 TW: But just the idea… I mean, there have to be people listening, because I’m experiencing it a little bit myself, like, oh my God, the sharing, the comparing. Don’t we already have a challenge with our youth just from the crude form of TikTok and Instagram, comparing themselves, and it’s not good for their mental health?
0:19:37.4 MT: That’s right.
0:19:40.7 TW: So it’s: gather all this data first, hope the analysis happens, and then we’re creating community. Is there a dark side or downside to that that we need to figure out?
0:19:53.7 MT: I think so. I think there’s extraordinary benefit and extraordinary risk, and that’s why I, an entrepreneur most known for starting cybersecurity companies… that’s why I’ve waded into this. Our design of Fulcra importantly starts with who we’re working for and how we make our money. When you create an account with Fulcra, your data belongs to you. You are not sharing it with us; you are sharing it with your future self. And our revenue model is asking for money for that service, and we need to re-earn our customers’ trust every day. And if we lose that trust, then they will stop paying us money and we will be very sad. So I think that being a force for personal data sovereignty in this way is something you have to choose to do at the foundation of your company and build into your DNA. I think that if you are an ad-funded company, even if you are a multibillion-dollar or multitrillion-dollar ad-driven company, you cannot just decide to pivot into a new line of business where customer data belongs to the customers and is encrypted in motion and at rest and is designed for whatever the customer wants to do with it and nobody else.
0:21:17.2 MT: The control that people have over their own data is actually going to be of increasing importance as AI agents become an increasingly important part of the future. Because as we can see over, especially, the last two years of rapid improvement in generative AI, it’s going to be very hard to control AI models by trying to put a cap on their capabilities. I don’t even see how that’s going to work. I don’t think we can say AI can only have an IQ of 140, no higher. That’s just not going to work. So how are ordinary people going to have any control over an agent that they’re asking for help? You want to get help from a helpful AI assistant. How are you going to be able to accept that help, share enough data with that agent that you can get some help, but make it a two-way door, make it a revocable commitment?
0:22:16.6 MT: And I think there’s only one way to do that, and that’s to control access to your own data. So you can grant it to an assistant. You say, sure, you can read my health data, but you can’t copy it. And if I change my mind, for any reason or no reason at all, I get to turn that off. If instead all of our data is going to live with some large tech provider that’s also running the models, if the only way you get the help is by uploading all of your data in a one-step process, you’ve completely lost control. And that’s not the future that I want to bring about.
0:22:52.6 MT: So what we’re trying to do here is empower people, as I said, with self-knowledge. But even more broadly, it’s building an important force for personal data sovereignty, so that we can have the benefits of AI but put people in control.
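A toy sketch of that “two-way door” in code: a personal data vault where access is checked on every read, so a revoked grant takes effect immediately. The class, scope names, and overall access model here are illustrative assumptions, not Fulcra’s actual design.

```python
# A toy model of revocable data-access grants. Purely illustrative;
# not Fulcra's (or any vendor's) actual access-control implementation.
from dataclasses import dataclass, field

@dataclass
class DataVault:
    streams: dict = field(default_factory=dict)  # e.g., {"heart_rate": [...]}
    grants: dict = field(default_factory=dict)   # agent_id -> set of stream names

    def grant(self, agent_id: str, scopes: set) -> None:
        """Owner grants an agent read access to specific streams."""
        self.grants[agent_id] = set(scopes)

    def revoke(self, agent_id: str) -> None:
        """Owner changes their mind: the door swings shut."""
        self.grants.pop(agent_id, None)

    def read(self, agent_id: str, stream: str):
        """Agents read through the vault, so access is checked on every call."""
        if stream not in self.grants.get(agent_id, set()):
            raise PermissionError(f"{agent_id} has no grant for {stream}")
        return self.streams[stream]

vault = DataVault(streams={"heart_rate": [62, 64, 61]})
vault.grant("assistant-1", {"heart_rate"})
print(vault.read("assistant-1", "heart_rate"))  # allowed while the grant stands
vault.revoke("assistant-1")
# vault.read("assistant-1", "heart_rate")  # would now raise PermissionError
```

The design point is that the agent never holds a copy: every read goes through the owner’s gate, which is what makes the commitment revocable rather than a one-way upload.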
0:23:07.1 JH: It’s interesting, too, it being health data. I think it brings a very different awareness to the world of AI and the sharing of your own data than people have today. Like, some people think, oh, you care about what I clicked on, what ads I saw, it’s your data. But it feels really different when you start to talk about your personal health metrics. So I’m really happy to hear you talk about it that way, and it’s really helpful for even just my understanding of what this could look like, what this should look like ethically in the future. But I really hope that it sparks that light bulb for other people: when we’re talking about your data and privacy and the importance of it and how it interacts with AI, think about all your other data the way you think about your personal health data. I don’t know, it really sparked some clarity for me.
0:24:01.7 MH: But it also highlights the gap we have in the United States around data ownership and data rights as a person, because there are no laws in the US about what, if you give that data to somebody else, they can use it for. And health data can be predictive of many different things, potentially. So, just like the car insurance companies want you to take the little thing and plug it in to track all your movements to save you money, when in reality it’s helping them create better predictive models for the likelihood you’re going to get in an accident and the risk you pose to them as an insured person. In the same way, where does that data go? Even if you take your data… and I think this came up with 23andMe, because I think they were contemplating selling the company, and it’s like, well, what happens to all that data if someone else comes and buys that company? What are they allowed to do with that data if they acquire it?
0:25:00.4 TW: What happens when meta buys Fulcra?
0:25:03.7 MH: Well, I mean, that’s a legitimate concern, because there’s no underlying regulatory structure that says what someone who comes along and buys a company like that can or can’t do with that data that they “own” now.
0:25:18.2 MT: Yeah. Right. I love this kind of thinking and I think that when you dig into privacy by design at many companies, you find that there is this end state where people just say, well, we’d never do that.
0:25:35.0 MH: Yeah.
[overlapping conversation]
0:25:35.5 MT: And that is an inadequate answer, because you cannot guarantee that you will always have your hands on the wheel. So, in fact, I would encourage anyone listening, as they’re thinking through what privacy by design at an Olympic level really looks like: you have to show how you are preserving privacy even if Ultra Super Mega Evil Corp acquires your company. You actually need to limit the powers you have as a business operator to mess with people’s data and inspect their data, so that even under the conditions where you’re acquired by a company that doesn’t share your values, they can’t just switch on the data vacuum mode and undo all of your work.
0:26:27.9 MT: And absolutely, this is not just me thinking about this. Happily, there are good patterns of privacy by design that are built to operate at that high level. And I think that’s absolutely the level that literally every company should aspire to.
0:26:40.7 TW: But there’s having… there’s following all the principles of privacy by design and putting something in place. And then there is also the… I mean, you sort of said it earlier, there needs to be a trust that somebody’s going to provide their data. And explaining that to the masses, to those million people with an Oura ring, still winds up coming down to trust. I would guess that most of them are saying, I don’t really care, I’m not giving it a whole lot of thought, take my data. But if you’re going to 300 million people and the truly paranoid fringe… We are in a very weird little subset of four people here who are happy to spend an hour talking and thinking about this, and we’re not remotely scratching the surface of what’s actually going on in the design to make that happen. So actually convincing Joe Smith that, no, this really is okay… and maybe this becomes just a societal breakdown thing. They’re like, says who? My cousin Vinnie said you’re going to use this for nefarious purposes, and no amount of rationalization will change their mind.
0:27:57.8 MT: So to me, this is a dimension of business design. I’m a business nerd. And an observation that I’ve had is that whatever a company says its mission is, if the execution on the mission is not exactly what earns them money, that’s not the mission. The revenue is the mission. Over time, if these two things are not in alignment, I’ll tell you which one wins: it’s the one that increases earnings. So you can just know that, and then you can consider that a constraint of business design, and then construct a revenue model that is truly consistent with, and in fact even supports, your mission. That’s one of the things that I’m most proud of with the magnificent success of Human: a cybersecurity company that fights cybercrime at scale, goes after the profit centers of cybercrime, and, importantly, doesn’t have to sell to the CISO as just another layer of protection. If you’re in the business of fraud detection, you actually reduce losses due to fraud.
0:29:06.4 MT: And so the reason why you get paid is that you charge less than the savings. Then every single customer knows exactly why they’re paying you. And the incentives of that company are such that Human makes the most money by going after the biggest source of cybercriminal profit, which therefore means that it is designed to have the biggest possible positive effect on the world, which is super cool.
0:29:35.2 MT: Here with Fulcra, here’s the way I see this playing out. Consider the universe of everywhere there’s data about you, everything you use that generates some data. So Facebook knows some stuff about you, and Apple knows some stuff about you, and maybe the Oura ring has a little bit of data. I don’t think you need to go around deleting all of that. But if you, and only you, have the superset, if you have all your data from every single one of those sources, then you’re the only one who has the complete picture. And you could decide to then invoke some right to be forgotten, or ask for all your data to be deleted, and then you’ll truly have the only copy. But I think it’s good enough that you are the master of the complete set, because that’ll alter incentives going forward. Some people already have some sliver of data about you; they don’t have to ask permission, they’ve already got it. But if they want access to the full picture to provide a better service or whatever, they have to ask you.
0:30:48.8 MT: And to a great extent, I think that’s winning. If individuals are just in charge and could say yes or no, if they’re asked at all, that would be pretty great. Right now, in the real-time bidding for most of the ads that are getting served to you, even though you’ve had to answer a bunch of nonsense cookie consent pop-ups, no one’s really asking your permission to do some kind of cookie or pixel sync connected to some email newsletter that you signed up for, which they’re using to figure out how many people are in your household and what your income is. You were just not involved in any of that. And that’s the little turn that I want to make on society.
0:31:33.1 MT: And we could do that through lawmaking. We could try to force people to ask for your consent. But I think what’s even better is to reward them, to rationally motivate them to deal with you. Because if they deal with you, they get better data and will deliver you a better experience. So they’ll do it if it’s in their best interest. And I think that happens when people are in control of the super-corpus.
0:31:58.4 JH: You bring up a point that I actually would love to circle back on, because it goes into two areas we’ve talked about. One, I do feel like if there was clarity… so say you were the owner of all your data. I feel like the only way to get people to share their data openly with companies on a large scale is if those companies could tell us, as the individuals: we would love this type of data from you, because then we could answer these types of questions. Here’s the benefit. Like that value trade-off they talk about: if you’re allowing cookies, what does it get you in return? Why should you share this with this company?
0:32:33.6 JH: But what’s really interesting is, we know that doesn’t happen. I think it would be amazing if it could. But we know that people aren’t always starting with a question in mind. There is still the obsession that we talk about a lot on the show, that companies have, of just collect all the data. And I do feel like it carries into the connected health conversation we’re having: people think, if I have all the data on myself, then I’ll be able to answer all these amazing questions. I don’t know exactly what questions I’m going to ask, but if I have all the data, I’ll be able to.
0:33:02.8 JH: And then you get into the reality that a lot of these questions you can’t answer, or you’re answering them with data that you inherently realize has biases or errors in it. So then it kind of takes you down the path of: there’s a whole area of the industry that’s spun up to collect more and better data, but we’re still probably going to miss the piece of, what’s the motivation for collecting all this data? What do companies want to ask and use it for? What do you yourself want to ask? What’s a helpful question to ask? What should you be collecting data for, to then get something out of it? So I know there are a lot of branches we could take off that, but it’s just been interesting hearing the last couple points you’ve made.
0:33:41.2 MT: I’ll throw this out there as a concrete prediction of the future. I think the way this plays out is that there’s too much data for a human to sort through. There are too many potential use cases for it all. But it really does seem to me like we’re headed to a place where helpful AI assistants are within everyone’s grasp. So what I think will happen is you will have a kind of concierge agent that only works for you, that has trusted access to your data, and it intermediates with other companies’ agents and essentially negotiates on your behalf. So instead of you having to deal with a whole bunch of questions about consent and individual offers, there’s just going to be too much to sort through, but you’ll be able to delegate it to your agent and just be like, show me the two marketing offers that you think are really going to land with me. That’s how much human attention I actually have.
0:34:47.0 MT: And so your agent might be dealing with countless kinds of unsolicited offers or ideas, and it provides the curation layer based on knowing you. And then, in a rules-based way, it can share the little subsets of data that are going to activate those offers or make them work.
0:35:04.0 JH: I see.
0:35:04.8 MT: If I’m right, that means that agent-to-agent communication is going to be the majority of internet traffic within, like, 10 years.
0:35:11.7 JH: That’s kind of scary to think about, though: they could be talking on your behalf in the background, and then that becomes its whole own black box. It’s cool, but it kind of scares me too.
0:35:21.1 MH: Just as long as they open the pod bay doors well.
0:35:27.2 TW: But, I mean, back to that. I think there is this… I know I’ve run through it with Fitbit, and I’ve probably gone through six Fitbits over I don’t know how many years. And when I switched to an Apple Watch, I genuinely felt a, oh my God, I’m losing all this historical data. And I draw that parallel to the business world. The reality is, I bet when somebody has a heart issue, they get sent home with a heart monitor and they say, let me collect a couple of weeks, wear this for a month. So as you’re talking AI agents, my brain went off on a: I want to lose weight, or I want to sleep better, or I want to do X or Y. Here’s all the data that I’m already collecting in an aggregated way; here’s what’s already there. What can you do with that? Have the agent tell me, you know what, you should put a CGM on for a while. But not turn it into this… There’s nothing in my entire history of working with analytics that makes me think that anyone is going to be good at saying, collect this data for a while for a specific purpose.
0:36:48.0 TW: Because there’s… well, just in case. Imagine the next time you ask: if you’ve already been collecting that, then you don’t need to collect it for another two weeks. The exchange you two just had had me thinking, is there a data… Because that’s one of the privacy by design principles: collect the minimal amount of data.
0:37:13.6 TW: Where does that fit into it? The: don’t collect it just in case you need it; collect it once you know what you need. But this nebulous, get everything and then we’ll have the most to work with. Some of it’s never going to matter. Or not matter enough to make it worth it.
0:37:30.9 JH: Yeah. And in the name of prevention, it’s kind of hard to make that case.
0:37:34.2 MT: Yeah, that’s right. Yes. So in the longevity context, I think if you ever want to train an AI on yourself, you kind of want to have as much data as you can possibly afford to have. So things get different when you think about data retention for your own purposes versus regulating businesses for their commercial purposes. One of the reasons why we felt, honestly, compelled to create Fulcra was because of the data loss that you just talked about. The fact is that GeoCities died. It turns out the internet isn’t forever. Data will just completely go away. And you’ve got a host of options for saving files: Dropbox, Google Drive, Apple’s iCloud. But there’s no streaming data store for consumers. There’s no Kafka for people. And that’s what you need for data like your location history, your calendars, any biometric. My heart rate just keeps happening, thank goodness.
0:38:39.8 MT: So it’s not a file. It’ll never be a file. It is a stream. So I need a streaming data store for it. And there literally were no options, so we had to write one ourselves. The way I see this being brought to bear over time is that all of these data streams that I have pouring into my Fulcra data store are capturing how I live and what’s going on with me. Situational awareness is one of the things you need to give to a potential assistant so they can actually be helpful. Right now we’re all experimenting with chatbots where you have to initiate every conversation, and that’s really limiting. I want to live in a world that’s more like what you just described, Tim, where some external source of intelligence points out what I’m missing, tells me about a thing I wouldn’t have thought of, and is like, dude, you need to put on a CGM for a couple of weeks. It’s not going to be forever; we just need to sample this diet of yours and see what is up. I think a lot of people want that kind of “I’m looking out for you” proactive guidance. Now I’m going to go weird, which is: I want to leave this data corpus behind for my heirs, and to make all of this data unambiguously mine and unambiguously inheritable.
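For the engineers listening, here is a bare-bones sketch of what a “Kafka for people” has to do that a file store doesn’t: accept unbounded appends of timestamped samples and serve time-range reads. This in-memory toy is purely illustrative of the shape of the problem; a real streaming store would add durability, encryption, and the access controls discussed above.

```python
# A toy append-only store for never-ending streams of samples.
# Illustrative only; not Fulcra's actual streaming data store.
import bisect
import time
from collections import defaultdict
from typing import Optional

class StreamStore:
    def __init__(self):
        self._streams = defaultdict(list)  # name -> sorted list of (ts, value)

    def append(self, stream: str, value, ts: Optional[float] = None) -> None:
        """Heart rate 'just keeps happening': samples arrive forever."""
        ts = time.time() if ts is None else ts
        bisect.insort(self._streams[stream], (ts, value))  # keep time-ordered

    def range(self, stream: str, start: float, end: float) -> list:
        """Read a time window, e.g. last night's samples for analysis."""
        samples = self._streams[stream]
        lo = bisect.bisect_left(samples, (start, float("-inf")))
        hi = bisect.bisect_right(samples, (end, float("inf")))
        return samples[lo:hi]

store = StreamStore()
store.append("heart_rate", 63, ts=1000.0)
store.append("heart_rate", 71, ts=1060.0)
print(store.range("heart_rate", 990.0, 1100.0))  # [(1000.0, 63), (1060.0, 71)]
```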
0:40:06.4 MT: I need to collect it before I die. My kids are not going to be writing to Amazon or whatever, saying, please let us export the data. It’s over by then. It needs to unambiguously be yours before the event. And what does all this data add up to? It adds up to how I lived. It adds up to who I did the living with. You’re going to be able to, in some cases, probably recreate my tone by transcribing this podcast and feeding it into ElevenLabs, capturing my voice and some of my vocal intonations. But none of that tells you about all the tacit stuff, all the procedural knowledge. So an AI model that’s trained on me, that lives on after me, is a model that I hope will bake cookies with my great-great-grandchildren. I’m extremely proud of my almond flour chocolate chip cookie recipe. And it’s not just about the ingredient list; it’s about how I do it.
0:41:11.2 MT: So you should be able to like, walk into the kitchen in the future and boot up Grandpappy Michael and we’re gonna bake cookies together. This is gonna be great.
0:41:21.5 MH: But only in the first part of the day, not later.
0:41:25.4 MT: That’s right. That’s right. Yes.
0:41:28.1 TW: I was thinking there would also be an agent saying: hey, I’m Grandpappy Michael, and you haven’t asked me to make cookies with you in a while.
0:41:36.8 MT: Oh, my God. Oh, that’s a little too on the nose.
0:41:40.9 TW: Don’t you want to connect with your ancestry?
0:41:47.4 MT: Guilt tripping beyond the grave.
0:41:50.9 MH: Yeah. You never call.
0:41:53.8 TW: Yeah, I mean, there’s something that says you could always be making better choices day to day. There is a bit of a bleak, hey, do you really want that next… I know you made the cookies. That’s good. But do you really need the third one? We’ve been monitoring you. And I don’t know. I mean, it’s…
0:42:20.3 MH: It is interesting, because obviously this vision of the world brings to life some very interesting possibilities, like you’ve been talking about, Michael, and then some concerns as well. So it’ll be very interesting to see how this progresses. Unfortunately, we can’t progress much further ourselves; we do have to start to wrap up, because we’re running out of time. But this is pretty fascinating. And at the same time, on the downside-risk part of it, we all sort of envision that guy, Bryan Johnson is his name, who measures every possible thing and wants to live forever, and we’re sort of like, yeah, I don’t think that’s me. But somewhere there’s a happy medium.
0:43:04.4 TW: Did you catch recently that he found out one of the things he was doing was actually working in the opposite direction? I can’t remember what it was, but…
0:43:15.9 MH: Well, that’s comforting, actually, a little bit. So that’s fine. But it also is kind of exciting to think of yourself like Neo in The Matrix, where you turn around and say, I know kung fu. Because I didn’t have to study to become a kung fu master, but now I have these AI assistants and data that help me do the things a master could do, like understand my heart rate and those kinds of things.
0:43:41.0 MT: Making a data-driven decision that one of your health interventions wasn’t working is kind of where we all need to be. Instead of absorbing the recommendations that supposedly worked for the 22 people in the double-blind clinical trial, but might not work for you, the question is, what works for you? Specifically you. Then you want to double down on those things and stop the ones that don’t work. So I’m optimistic about that kind of tuning over time. I think lots of people are going to live for a very, very long time from here.
0:44:09.2 MH: Yeah. Until we upload ourselves into the machine god. Oh, wait, did I say that out loud?
0:44:12.0 MT: Yes, that’s right. Yes. Bring on the silicon brains.
0:44:17.9 MH: I mean, we didn’t even touch on Neuralink, so that’s a second episode maybe. Okay, we do have to wrap up, but one thing we like to do is go around the horn, share something that we think might be of interest to our listeners. It’s been a really awesome conversation though, Michael, and thank you so much for joining us to do it. But yeah, you’re our guest. Do you have a last call you’d like to share?
0:44:38.8 MT: It is outrageously cold here in coastal New Hampshire. It’s going to get down to 3 degrees Fahrenheit tonight. So the first thing that pops into my mind is actually just my favorite new product. I got Inu heat gloves. I-N-U heat gloves. So get this.
0:44:57.6 MT: They’re gloves that take a battery pack. The battery pack doesn’t use some weirdo proprietary connector; it’s USB-C, thank goodness. So I charge the battery packs from a USB-C outlet, I snap them onto my gloves, and oh my God, they really do work. Just…
0:45:18.4 MH: That’s awesome.
0:45:19.6 TW: It’s so cool.
0:45:20.8 MH: That’s so cool. I just got a fleece for Christmas that does the same thing and it’s… You literally just hit a button and it turns on and it warms you up all over.
0:45:30.6 TW: Michael Helbling, this is… My son got my wife the same vest, because we were gonna go skiing. And Helbling had shown me his, and I was like, that’s weird. And then I realized that actually my wife had also gotten one for Christmas, and it’s similarly hooked up. And she’s had electric gloves for a while.
0:45:50.6 MT: I know, right? Here I showed up talking about how I have an addressable breaker box, like I’m doing all this crazy mad science, but I swear to God, the self-heating clothing makes me feel like I’m living in the future. Like, yes…
[overlapping conversation]
0:46:05.8 MH: That’s awesome.
0:46:07.3 JH: That’s so awesome.
0:46:08.3 MH: That’s awesome. All right, Julie, what about you? What’s your last call?
0:46:12.6 JH: My last call. I’m sure everyone’s heard about the congestion tax in New York; I know it’s a big thing. And I had found the link to the congestion pricing tracker, and it’s got some good data visualization. I’m really interested to see, as time goes on, what they find. They even did, I think, a good job of stating what they’re hoping will happen from it. So I love that they actually paired it with: hey, this is how we’re visualizing things, this is how we’re going to analyze it, this is what our conclusions might be. But my favorite part, Tim, is that when I got to the bottom of this tracker, it actually says that it is run by, I’m guessing, two students at Brown University, and supervised by Emily Oster. So I was like, no wonder I love this. Oh my God, it’s great. So I’ve just been peeking at it. It obviously hasn’t been running too long, just for this year so far. But I think it’s really cool, and I’m excited to see what comes out of it, especially knowing that Emily is involved.
0:47:08.8 TW: So I think that might be a last call that needs to become a future episode right there.
0:47:12.6 JH: I think so.
0:47:13.4 MH: Awesome. That’s so cool. I saw that same thing. I was like, oh, my gosh. So that’s so cool. All right, Tim, what about you? What’s your last call?
0:47:22.4 TW: So as I tend to do, I’m going to do a three [0:47:24.3] ____ I think. So one…
0:47:27.2 JH: Three.
0:47:27.3 TW: I want to call out three. Yeah, they’ll be quick. And Cassie Kozyrkov will be included in one of them. So, one: back when we first started talking to the Fulcra team about having Michael on for this, I actually tried it out, kind of hooked up what I could. And it was kind of interesting: even I can find ways that I’m not that connected. But there’s no Swarm connection. It’s crazy how many things we have that are tracking, and the challenge of tracking everything. But I think there is a seven-day trial if anybody wants it. You just download the app, you hook up whatever services, and you get to see what the aggregated data looks like. Is that right?
0:48:07.8 MT: Yeah, yeah, yeah. Everyone should give it a trial. I think most people are surprised by the data that they have and just didn’t know about. You quite likely have years’ worth of step count data that you didn’t even know about, because it was sort of silently turned on by your iPhone. So even if you just want to look and then delete the app after seven days, it’s a fascinating look.
0:48:30.7 TW: And it has a very well-documented API, from playing around with it. So this is not a paid endorsement, but if this whole discussion has got people thinking, oh, that’s kind of worth checking out… And I think actually hearing you talk about sort of the vision makes it a little more exciting, gets people thinking.
0:48:48.0 MT: Thank you.
0:48:48.3 TW: So that’s one. Number two, just a PSA: if you’re not already following Cassie Kozyrkov, like, what the fuck is wrong with you? But you should have caught that she’s moved over to Substack and has gone through her three weeks of acting training and whatnot. So that’s just in case: decision.substack.com. And then my core last call is completely off… not really analytics, but over the holidays there were reasons that I needed to explore new podcasts, and I did not realize that Mike Birbiglia had a podcast called Working It Out. And David Sedaris comes on for kind of his second appearance, and oh my God, I don’t know of two more delightful people than Mike Birbiglia and David Sedaris on the Working It Out podcast.
0:49:39.7 TW: So if you’re looking for something that is just… if you know either of those guys and their sensibilities and you’re into them, that was like, oh my God, just heaven to listen to for 40 minutes or however long it was. It has nothing to do with analytics, but I had to put in a plug for that as well. What about you, Michael? Do you have six last calls?
0:50:01.5 MH: This is just… What’s a four [0:50:02.8] ____? Is that a thing?
0:50:04.0 TW: No. A quad.
0:50:08.0 MH: A quad [0:50:08.0] ____. And I’ve got an octa [0:50:12.0] ____. No, it’s just really pushing the limits. I have an AI agent that will talk to your AI agent about my last calls. So we’ll negotiate. No, actually, one of the things I ran into recently that I thought could be a little bit helpful to our audience: a lot of folks use Google Analytics for website tracking, and one of the things that is kind of required to make that tool useful at all is to export it into BigQuery. But what’s in BigQuery is often quite different from what’s in GA4, and that’s kind of hard for business users. So Himanshu Sharma actually did a pretty comprehensive mapping from GA4 to BigQuery of those metrics and the calculations to get to them. So if you’re in that place where you’re trying to navigate that, it could actually be quite a good resource. Then, alternatively, you could also switch to a better tool. But in the meantime, that is one you could bookmark and use as a good reference.
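As a taste of that GA4-to-BigQuery mapping, here is one commonly cited way to compute the Sessions metric from the GA4 export schema, wrapped in a small Python script. The project and dataset names are placeholders, and the query is a sketch to adapt, not the canonical calculation from Himanshu’s reference.

```python
# A sketch of computing GA4's "Sessions" metric from the BigQuery export.
# Project/dataset names are placeholders; adapt the query to your property.
from google.cloud import bigquery

client = bigquery.Client()

SESSIONS_SQL = """
SELECT COUNT(DISTINCT CONCAT(
         user_pseudo_id,
         (SELECT CAST(value.int_value AS STRING)
          FROM UNNEST(event_params) WHERE key = 'ga_session_id')
       )) AS sessions
FROM `my-project.analytics_123456.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
"""

for row in client.query(SESSIONS_SQL).result():
    print(f"Sessions in January: {row.sessions}")
```

The point of the mapping resource is exactly this gap: a one-line metric in the GA4 UI becomes a user-plus-session-ID distinct count in the raw export, and that translation is not obvious to business users.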
0:51:14.5 MH: All right. So, I think this has been so fun to dive into this conversation, Michael. Thank you so much for joining us. It’s been really cool. I love hearing your vision for the future and the way you’re thinking about this; the way the world is progressing on these fronts, both with AI and with connected health and the connected self, is a very cool frontier that we’re on. So I really appreciate you sharing some of your thoughts about that.
0:51:46.9 MH: And I’m sure, as our listeners have been listening, they might have some thoughts and feelings about this as well. We’d love to hear from you. So go ahead, reach out to us. The best way to do that is probably on LinkedIn, or you can connect on the Measure Slack chat group, or also by email at contact@analyticshour.io. Michael, are you active on social media at all? Can people find you out there?
0:52:13.6 MT: Yeah, principally find me on LinkedIn, the company Fulcra Dynamics, also on LinkedIn. And I’m on X with my old school hacker handle of Kubla, K-U-B-L-A. Hit me up there and find Fulcra on X as well.
0:52:29.1 MH: Awesome. Thank you. So you can reach out to him and follow him on those channels as well, so you can hear what the latest and greatest is in this crazy, changing world. All right, well, hey, listen, one of the things we’re trying to do this year is make sure that people get access to this show. One of the ways that you can help with that is putting a rating or review wherever you listen to the show, whether it’s Spotify or Apple or wherever. We would love to have you rate the show, review the show. That really helps, apparently, algorithmically, until AI can take over and recommend us to the right people.
0:53:07.6 TW: iHeartRadio, is that right? That’s really where we’re [0:53:09.6] ____.
0:53:10.3 MH: Yeah, we’re really targeting that one. That’s where our listener lives… Anyways.
0:53:21.3 MT: AI Heart Radio.
0:53:22.5 MH: AI Heart Radio. I think we… Let’s package that up. We’ll get a Series A in no time. But yeah, if you’re listening and you haven’t done that before, we’d love it if you could. And we always just love hearing feedback about the show as well, and it helps us think about the future of the show. So we’d really appreciate it if you can. And no show would be complete without a huge shout-out and a thank-you to our producer, Josh Crowhurst, for everything he’s doing to make the show possible. So thank you, Josh. And I am sure I speak for both my co-hosts, Tim and Julie, when I say: no matter what your device is and what you’re measuring, just remember, keep analyzing.
0:54:10.2 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter @analyticshour, on the web at analyticshour.io, our LinkedIn group and the Measure chat Slack group. Music for the podcast by Josh Crowhurst.
0:54:28.7 Charles Barkley: Smart guys wanted to fit in, so they made up a term called analytics. Analytics don’t work.
0:54:34.7 S?: Do the analytics say, go for it no matter who’s going for it. So if you and I were on the field, the analytics say, go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.
0:54:53.1 TW: Rock flag and keep analyzing me.