Insights, Marketing & Data: Secrets of Success from Industry Leaders
NIKE, Sarah Beachler, Senior Director, Consumer Insights. Adapting best practice from Nike, Meta, Google and Sephora. The importance of multiple consumer touch points; how to balance product and marketing research; advice for agencies.
A pleasure to have Sarah Beachler on for a really great chat. At the time of recording, Sarah was Senior Director of Consumer Insights at Nike, and formerly at Meta as Global Head of Marketing Research for Reality Labs, at Google as Head of Global Consumer Insights for Hardware, Retail and Gaming, and Head of Market Research and Consumer Insights at Sephora.
So, a fantastic foundation of experience to get into a range of industry issues including:
- How to integrate research practitioners across major corporations
- Considerations in researching physical product vs software
- Best practice to knit together product, marketing and sales research
- The importance of integrating multiple consumer touch points
- What makes a good and a bad agency?
- Avoiding unnecessary research duplication
- Advice for young people coming into the industry
All episodes available at https://www.insightplatforms.com/podcasts/
Suggestions, thoughts etc to futureviewpod@gmail.com
"At their heart these companies, whether it's Nike or Google, they follow at the highest of levels the same kind of best practice product insights process. You know, you start with the white space opportunity and the market sizing. You go learn deeply about the consumer target and their needs. You get into the nitty-gritty of product testing, including feature hierarchy. Then it's stuff in support of marketing, execution, messaging, all the things that are media-related. I actually think the biggest difference, more so, is one of culture at these companies, where that ends up having a lot of ramifications for how efficiently and how deeply that best practice process I just outlined is followed."
Guest intro
Speaker 1: Welcome to FutureView, and a fascinating perspective from across the world of Nike, Facebook, Google and more, with the brilliant Sarah Beachler.
Speaker 1: At the time of recording, Sarah was at Nike as Senior Director of Consumer Insights for the North America Women's Division; listen on to get the lowdown on what I mean by "at the time of recording". Sarah has also been at Meta as Global Head of Marketing Research for Reality Labs, at Google as Head of Global Consumer Insights for Hardware, Retail and Gaming, in addition to being Head of Market Research and Consumer Insights at Sephora. So that's quite some perspective to bring to the podcast, and I think we have a great conversation around a wide range of subjects, including: the idea of an insights chief of staff who pulls together multiple strands internally; best practice differences and similarities researching hardware versus software in the tech space; creating the ideal balance between UX, product, marketing and in-store research; advice for agencies; and why you shouldn't over-research. Sorry, agencies, that one may result in fewer projects. Sarah's engaging, honest and super smart. All in all, a brilliant interviewee, so let's get into it. So, Sarah, firstly, thanks so much for joining today. Kind of early for you on the West Coast.
Speaker 2: Yes, but not too early. It's 8.40 am. I have a two-year-old. I mean, I'm waking up sharply at seven, if not a little before, every day.
Speaker 1: Oh wow, you're doing quite well actually to be making it to seven with a two-year-old. I'm quite impressed.
Speaker 2: We try to get her to bed on the earlier side, but it inevitably ends up being pretty close to nine, so she's kind of a later sleeper, I would say.
Speaker 1: Yeah, got it, got it. Well, that's a little bit of an icebreaker, but I also wanted to get at the icebreaker I tend to use for a lot of these interviews, which is to delve into something that people wouldn't know about you just from doing an online search or looking on social media. It doesn't have to be your deepest, darkest secret, but it could be.
Speaker 2: So what might that be? I'm a really unhealthy eater. I'm a sucker for anything that is deep fried. My favorite food is fried chicken. So I would say, on the outside I appear very fit and disciplined, but actually I'm a horrible eater.
Speaker 1: Really? Wow. That does surprise me, having met you kind of virtually before. And so how does that work? I mean, is that just the occasional binge, or are you just one of these lucky people who can kind of eat what you want and not worry about it?
Speaker 2: I think it's the latter, and I think having grown up in Oklahoma kind of gave me my sea legs, so to speak. That was not the, you know, healthiest of diets compared to now. I've been on the West Coast for so many years; it's so many salads, so many $20 salads abounding everywhere, and I feel like that was just not how I grew up, so maybe my body's a little bit used to it.
Speaker 1: Well, I know what you mean. I've spent quite a lot of time in the Midwest for one reason or another, and you know, I mean, it's a great place and great people. Salads, though, still tend to be just iceberg lettuce with a lot of blue cheese dressing on top, as far as I can see.
Speaker 2: Yes, yes, California has definitely the better salads.
Speaker 1: But not cheese? No, I'm going to get this wrong. Oklahoma isn't... which is the state that does cheese?
Speaker 2: Oh, I believe that is Wisconsin. Wisconsin is very well known for its cheese.
Speaker 1: Okay, sorry. Anyway, I'm betraying my ignorance yet again around the states of the US, amongst other things. Let's get back onto the subject. So you've had a great career across all sorts of brilliant businesses: Nike now, Google, Meta, and we'll get onto some of that, and I know you've also got some really interesting perspectives around UX and product research. But before we go through your career, could we just double-click on what you do at Nike, what the role is and what your responsibilities are?
Acting as a research 'chief of staff'
Speaker 2: Yeah, I mean, up until very, very recently I was Senior Director of Consumer Insights for the North America women's division, and what that entailed was I really sat on the senior leadership team for that division and directly served the GM. So it was pretty much anything she needed from an insights or analytics perspective, I would be sure to deliver to her. In a way, I almost thought of myself as like an insights chief of staff, and I had a small team of analysts and researchers that helped out. So it was a very kind of agile, more reactive role than, I think, a traditional insights role.
Speaker 1: It's really, really interesting. A friend of mine works at another business, actually at Sky in the UK, the broadcaster, and that's similar to her role. So, if I understand it correctly, you're working with the head of the division, but it's almost like you're their sort of consigliere in terms of what they need to know about the market, consumer insights, that type of thing. Is that a fair analogy?
Speaker 2: Yes, exactly.
Speaker 1: If there was something outside, or if there is something outside, the usual kind of sphere of information sources you've got, would you then go and commission custom research, or look at kind of custom projects with your analysts to pull together existing data sources?
Speaker 2: Yeah, we did a lot of synthesis of what was already done in Nike's primary research repository or just analytically elsewhere: public, off-the-shelf information, things like that. I think we did just a small handful of primary projects over the past year, for example; a very small handful. What we were really supposed to be adhering to was a model where the global insights team were considered the research practitioners, and they had all of the roadmapping and the budget decision-making, and so we would have to, you know, influence them in order to get what we needed roadmapped. And I think, as I mentioned, when you're in more of a fast-moving role, sometimes there's a real need to consider the timing and whether even commissioning a study that would come out in another, you know, three months would be worth it in the first place. So we were constantly balancing what was worth asking for versus what would be ultimately valuable.
Speaker 1: What type of issues have you been exploring over the last, I don't know, year, 18 months?
Speaker 2: I would say a lot of the usual suspects, actually: you know, product messaging, how to most effectively distribute certain product lines, predictive modeling of demand and of financial upside depending on price movements, and things of that nature. So I would say nothing outside the realm of the ordinary. It's more so in the moment: which franchise needs the most love in the moment? What is the decision on the table that really needs to be informed?
Researching hardware vs software
Speaker 1: So you've described how those responsibilities kind of intersect with other departments. I was also interested in some of your broader experience, in that I guess at Nike you're primarily dealing with physical products, whereas at Google you had quite a different role, heading up insights for hardware and retail and gaming. So how do the considerations and needs differ between those different types of products?
Speaker 2: Yeah, I mean, I still consider consumer tech hardware (smart home, phones, all that) a physical product, right? And that's actually what got me into tech in the first place. I avoided it for many, many years and ultimately ended up there because, oh, this is a thing that sits on a shelf, that's what I like working on, and so that's what got me there in the first place.
Speaker 2: And I would say, at their heart these companies, whether it's Nike or Google, they follow at the highest of levels the same kind of best practice product insights process. You know, you start with the white space opportunity and the market sizing. You go learn deeply about the consumer target and their needs. You get into the nitty-gritty of product testing, including feature hierarchy. Then it's stuff in support of marketing, execution, messaging, all the things that are media-related.
Speaker 2: I actually think the biggest difference, more so, is one of culture at these companies, where that ends up having a lot of ramifications for how efficiently and how deeply that best practice process I just outlined is followed. So in tech there's just a lot more iteration, even for a hardware product that sits on a shelf, and there's a much stronger thirst in those organizations for somehow capturing a giant mass market for what many of us in, you know, consumer insights would consider more of a niche audience. At the same time, tech companies have a lot more money to, you know, invest in each product's development and marketing and research.
Speaker 1: So when you say a lot of the tech companies are aiming more broadly, would that be, say, for instance: as you're going through that best practice cycle, you've got a very specific consumer target and product-market fit, whereas that's less so in the tech world, partially because you can actually iterate a lot of the software so quickly. Is that fair?
Speaker 2: To a certain point, yes, but I'll use one example where I actually had a conversation at Google with a product manager, and this was in smart home, where I was having to almost convince them that a consumer target was necessary for some of these products. And we're talking about a several-hundred-dollar smart display, for example, or mesh Wi-Fi all around the house. That's not an every-person product, and there was some convincing that had to be done, because they had actually come from more of the software side in Google, and they'd say, but we're Google, we build for everyone. And I said, yes, if you are a free search engine, you absolutely should and can be building for everyone. But I think as soon as you start talking about price point and features and interest in something like smart home, when at the time it was still kind of early majority at best for most of these products, and late majority for something more akin to just the puck smart speaker, like the Amazon Echo (at Google we called it the puck), the really small smart speaker that was, you know, 10 bucks or so. That's a very different product even than, as I mentioned, a smart display or mesh Wi-Fi. So I think that it just was a different level of education than I had experienced as a more traditionally trained market researcher, where, you know, I was steadfast on saying let's land a consumer target, let's build for the consumer target. And they were like, well, we're trying to build for as big of a target as possible, and they, you know, were focused on that. And I think that unto itself was iterative, because there ended up being so many debates about the, I'd say, order of operations in defining the market and the product themselves.
Speaker 1: It's also interesting, as I hear you describe it; it reminds me of something you mentioned to me when we were first talking, around the difference in consumer insight when you're working with product as opposed to marketing. Generally I've worked mainly on the marketing side, but product sounds kind of fun, I have to say. What you're describing there is really, really central to so many decisions around how the product gets made. Is that fair, and have you mainly spent most of the time on product, or is marketing as big a piece of it?
Speaker 2: I mean, I would say equal parts both. I would say that I've been lucky enough to really work, you know, all sides of the fence, so to speak, when it comes to insights and bringing a product to market. So again, all the way from up here, where it's at the more strategic level, whether that's market entry or category entry, and then it trickles into the actual product teams, the product development process, the designers, and then it's almost marketing. Even though they're connected throughout, marketing really comes in more so at the tail end, in all that I've personally experienced, and part of that is because those lead times are much, much shorter than what a product development or sourcing person has to think about. But I would say the nature of the work is just very different, even though there is a healthy sort of Venn diagram overlap of the core types of insights that both marketers and product folks need in order to bring a product to market successfully.
Constructing the ideal 'research flow'
Speaker 1: And to that point around the Venn diagram overlap: I mean, if you were constructing a research department now, would you have the same researchers in the product slash UX piece of it as the marketing piece? Or are the skill sets and the tools becoming so different that you'd have specialization within that department?
Speaker 2: I would say a mix of both. So instead of saying, so super definitively, this is the UX research team and this is the product insights team, I would combine, at a higher level, both marketing insights folks and user researchers. They research some of these very foundational needs of a consumer: jobs to be done. They are researching which features are most important and why, which has ramifications for that feature hierarchy, not just in terms of product build but in what you end up wanting to highlight for marketing later. There are a lot of things that actually, at a very foundational level, both teams research and need to understand, and their stakeholders need to understand, and so I would just put those folks together. But I'd still have a separate, for example, UX research team that was super, super focused on design principles and really looking at a very human-factors sort of, I'd say, tactical view: not in the operational sense, but tactical in the literal tactics of how to build this thing.
Speaker 2: I would still have them there, just like I would still have marketing insight specialists that would be really, I would say, more integrated with marketing analytics than I've seen in the past, so that there's a mix of primary researchers and analytics folks that are focused on end-to-end marketing mix modeling, consumer journey, touchpoint evaluation and lift at different points in that journey, and really the execution and measurement of what that whole marketing and retail experience sphere should be. I think that there's also, within that latter team, a core component that's a little bit separate, that would be more about omnichannel retail experience design. So, whereas there might be those UX researchers focused on very specific product design, there would also be researchers really focused on experiential insights: how do we create, for example, the best fitting-room experience, how do we create the best relationship between a sales associate and the customer who walks in? And their stakeholders would be field education and retail ops people, et cetera.
Speaker 1: Yeah, I hadn't really thought about that third area you just described. It's great because it kind of sits in the middle. I mean, it's product slash experiential, but by its nature it's also soft-touch kind of marketing, isn't it? Or maybe, you know, the selling piece of marketing, right?
Speaker 2: I think that this is where I've been really flabbergasted that there isn't enough link between primary research departments, marketing analytics departments and customer experience. I actually think that customer experience should be an integral part of those teams that I mentioned, and not just almost an addendum of, are we solving a consumer complaint on time, but really thinking end-to-end about this customer relationship, from first-touch brand awareness all the way through to what that day-to-day interaction is like, whether that's with a customer service agent or a sales associate on the floor in a store. Just what are all those touch points, and how does each one of them give an aggregate meaning to the relationship with the brand?
Speaker 1: Yeah, I guess that's also a very, very pertinent form of first-hand research and insight as well, isn't it? Actually, what's happening on the, you know, shop floor or the virtual shop floor. It's kind of interesting as well, on the point you're describing about marketing departments: I mean, from what you're seeing, what you're aware of, do you still find that the kind of traditional consumer insights world, focused on marketing, is still a bit separated from the marketing analytics side of it, in terms of share, in terms of where you should spend, and analytics in terms of what's working on digital channels? Because I still feel there's a little bit of a demarcation there, whereas, what you're saying, I totally agree with it, there shouldn't be, but for some reason there is.
Speaker 2: Yes, 100%, and I think historically some of this comes from just which stakeholder departments had the budget to fund the analytics team versus the primary research team, and it actually is historically as simple as that. But one of my favorite places I've ever worked was actually Sephora, and I, as the primary researcher, was embedded on the analytics team, and I felt like that sort of tight collaboration, those relationships, and being able to really truly do end-to-end mixed-methods studies on some of these things we've been talking about, was super powerful. I thought that was actually better done at Sephora than I've seen it in big tech or even at Nike.
Avoiding unnecessary research duplication
Speaker 1: So interesting about Sephora, because I guess in some ways they may well have been able to operate differently. How about Meta and Reality Labs? I mean, this was very much the hot subject of two, three years ago. What were your experiences like there, and how were research needs different?
Speaker 2: So the iterative thing I was saying earlier, about, you know, having to be a bit nimble in the process, that was just 10x true at Reality Labs, because I think a lot of the products that were being created were attempting to be zero to one, and so there just wasn't historic data or general precedent in most cases to draw from. So you had to be a true methodologist. You had to think about what the organization really needs to help it move along, and again, just move forward rather than create more spin.
Speaker 2: It's easy to have a stakeholder over here asking one question, and a stakeholder over there asking another question, and then you start trying to create this unwieldy roadmap that addresses everything. But you have to really be cognizant of taking a step back and asking yourself: at its heart, what are all of these questions about? What are the most uncertain things that the organization needs to understand and know in order to move this product line forward? And you just had to really be considerate of that there, because there were giant teams, there were giant budgets, and it was just easy to do too much, quite honestly, rather than be a little more careful and strategic.
Speaker 1: So it's almost like, you know, you had the budget, you could have commissioned lots and lots of research, but you're doing it in such a fast-moving world that actually you've got to be careful it's not redundant, you know, that people haven't just moved on.
Speaker 2: 100%, and I think that what we were talking about earlier, about the separation of, you know, UX research embedded on the product team and the marketing insights team embedded in marketing: that led, honestly, to a lot of duplication. And what I learned both at Google and at Meta was that it's best to just really have a tight relationship across all research and analytics teams, even if, you know, you're still technically siloed. And that engagement model that you create, those relationships and the trust that you build, and therefore the influence that you have: that's when and where people become way more informed, you know, as a collective, and also where research roadmaps on each respective team aren't redundant or duplicative. In these organizations that have a ton of different insights and analytics teams serving different stakeholder departments, duplication is one of the toughest things to make sure isn't happening. Things slip through the cracks all the time, but it's really important for leaders of all of those teams to be super connected and encourage collaboration and connection and just friendly relationships across all of the teams underneath them.
Speaker 1: Yeah, it makes so much sense, I guess, corporately and also within agency structures; by the way, it's just been more difficult to do than anticipated. To your point around the work at Meta: if we really need to understand these three things, and I've got information strands that have given me a great answer, we're very confident now on issue A, then we can move on to issue B and on to issue C. So again, Sarah, you may have touched on some of these, but if I could do a sort of rule-of-three type thing, what would you say the three biggest things are that you've learned across, say, Google, Facebook and now Nike?
Speaker 2: To recap a couple I've already covered, and then I'll actually probably choose my third one as the most important one I've personally found. So number one is collaboration: really just emphasizing what I was just talking about, you know, across research and analytics teams in particular, just tight collaboration, tight relationships. Second, I would say: know your methodology and tools, but be agile in their application, be fluid in their application, because really the same method could be applied in so many different ways to so many different questions, but knowing which method to pull out of the hat at which time is probably the most critical thing. So, broad methodologists: know enough about a lot of things to be dangerous. You don't have to be a functional or super-deep topical expert in every single one, but just know enough so you can choose your roadmap and your approach wisely. And then the third, which I will again say is most important.
Speaker 2: It's really one of culture, and I feel like the phrase "culture is everything" can almost feel overplayed sometimes; it takes on different meanings in different contexts. But from working at all of these companies, it's very clear to me that before an insights professional can be successful, they have to very deeply understand what the culture in the organization is: what it values, how relationships and influence work and, ultimately, therefore, what to focus on that will bring, you know, the team success, but also really success to the business, which is what we're there to do at the end of the day. And, you know, I think that I could point to a couple of instances in any company that I've worked at where maybe teams understand their external kind of customers, their consumers, their audiences, whatever they might be, but actually don't spend enough time understanding their own constituent groups and what it is they need.
Advice for young people coming into the insights space
Speaker 1: Thank you, it's great advice. And a variation on that question: how about young people coming into the insights space now? What would your advice for them be?
Speaker 2: I think it goes back to my point on methodology.
Speaker 2: Spending my first years on the supplier side was probably the best thing that I've done for my insights career longer term, because it gave me such a foundation across so many different approaches, be it qualitative or quantitative, that I was able to really draw from that toolkit throughout the rest of my career. You do have to be a lifelong learner in this industry. You do have to continue to reevaluate your toolkit, look at which tools might need sharpening or which tools need outright replacement. It's an ever-evolving thing, of course. But I would say for those just starting out, unless they are just gung-ho, super focused on one company or one industry, like if they're just like, oh, it's Nike or die, or it's beauty or die, or whatever, in which case my advice would be find the company that you love and stick to it and grow there, whether that's actually in insights or not, even though I know this is supposed to be about insights, young people: know your methods and be able to apply them in a variety of different contexts, situations, stakeholders.
What makes a good and a bad agency?
Speaker 1: Well, that is very sage advice, and on that subject I also wanted to pick your brain a little bit from a client perspective, and you see both sides, I guess. If I ask it just bluntly: I mean, what makes a good agency? I know there are lots of different specializations and lots of different areas, but what makes a good agency? And probably the flip side, I guess you might even say: what makes a not-good agency for you?
Speaker 2: So this seems really simple, but it's oddly just not always done, and it's really obvious when it's not done. And that is actively listening, and doing based on what was heard. So, whether that's feedback about a deck, or whether that's business context, or internal stakeholder context in the meetings, or relationships or objectives of the project: just really listen, and if you need to ask clarifying questions, ask clarifying questions, but listen, and then prove that you're listening by making those edits or doing something differently when you're given that feedback. It astounds me sometimes how often I just feel like it's in one ear, out the other with agencies, and I'm like, I really thought they understood on that call, on those three phone calls where we kept emphasizing this, and you just don't see it materialize in the outputs. So that's what makes a good agency: you listen and you do based on what you heard.
Speaker 2: And I think probably the worst agencies are the ones that, well, it's kind of a different take on the inverse of what I just said, which is they oversell. So they come in hot during the proposal stage, all these lofty promises, and then they underdeliver. And then it's like, are you ever going to use them again? No. Are your reviews of that agency going to be kind in the future when your colleague from the other company reaches out and asks you about them? No. A lot of agencies will shoot themselves in the foot because they're so eager to earn the business to begin with; they don't think about, well, what's that ultimate delivery at the tail end and what's that going to prove? So I'd say be honest about what your wheelhouse is and what it isn't, and don't oversell and don't underdeliver.
Speaker 1It's funny because I've also had some agencies when I've asked them the same question. They've said one of the characteristics of a good client is listening. They said sometimes the client won't listen either. I don't want to pick on, I'm definitely not going to name any names, but in terms of that question around, listening or possibly even overselling, like a little bit, is that more of a characteristic of the big agencies as opposed to the smaller boutiques?
Speaker 2I'd say I've seen it a mix across the board. Large and small agencies have different pros and cons, as I'm sure we're all aware. But this particular aspect that we're talking about, I've seen it big and small, and I've seen the inverse, big and small.
Speaker 1Okay, which is a very fair response of you. Part of the reason I was asking about the bigger agencies is that I think there's been so much focus in recent years about SaaS-type products, and I'd actually caught a lot of private equity money has come into the sector as well a lot of pressure to try to move in a more tech, standardized direction, as opposed to a more customized type of approach, which maybe in some cases equates to not listening very well, or possibly listening, but somewhere behind the scenes being discouraged from acting on what you've been told because you're trying to create a standardized product, but actually it sounds like you run into it may just be an endemic industry issue, and you've run into a similar issue with smaller agencies as well that aren't subject to those pressures, I mean.
Speaker 2I think that's where, like a lot of, I think, where you might be kind of pointing to. Well, there are two things. So one is, I think, that it's all about the team that you're given on the given project, and so some individuals are just better or worse at listening and all of these things than others. And I think that and not to overgeneralize I would hate to do that but I do think if statistically you were to look at ratings of listening and of executing based on what you heard, there might be just more under delivery or missed expectations if your team is comprised of nothing but junior staff. And I think that in larger agencies, if they get very large and they're very focused on margin, they start doing that. And I think it's fine to have a lot of junior staff. One reason the team and really guiding the work and the relationship with the client.
Speaker 2But you know, I know that larger research firms tend to get a bad rap because of that very model, and I've seen junior people rock it. I've seen them be better than senior researchers. So it really is individual by individual. But is that a pattern that has gotten large agencies into a bit of trouble here and there? I think so.
Quickfire round
Speaker 1Yeah, I could certainly see that. It's a balance, as with so many things, isn't it? The more "junior" people, in inverted commas, who are less experienced, may actually bring a really important perspective to the project, whatever it may be. But then you also need that seasoning to ask: how are these recommendations really going to impact the business? Are we listening properly? And so on. So, Sarah, thank you, you've been so kind with your time. I'm conscious of time, though, so I may just jump onto a quickfire round, if that's all right with you. Some slightly tricky questions, slightly cheeky ones as well, but you don't have to spend long answering them; that's the good news about a quickfire round. What have you changed your mind about recently?
Speaker 2I just came back from a trip to New York, a personal, just-for-fun vacation with my husband, away from the kid, and I have officially decided that I am severely limiting my alcohol intake henceforth. I feel like, now that I'm in my elder years, I just can't hang like I used to, and I just admitted that to myself as of last week. So that's a new thing.
Speaker 1Well, at least you've got the junk food, you know. So you've got some vices, yes. Okay, and so what would your partner say are your best and worst qualities?
Speaker 2He would say that my best quality is my resiliency, maybe memory, and my worst quality is probably not doing enough laundry or not doing some household basics here and there when I should have done them.
Speaker 1Funny little insight. Final couple of questions: what's your favorite book, or a recent book? Or it could actually be a piece of media, it doesn't have to be a book specifically. And why?
Speaker 2I was going to say I'm not a big reader. I hate to admit that, but I've just never been a huge reader. I would say that one of my favorite podcasts, which I tune into every week, is the All-In podcast. I think there are mixed takes on that particular podcast, but I've always found it really enjoyable. I think they do a really good job of balancing different sides and perspectives on political and business issues, and so I enjoy that one.
Speaker 1And what do you think, and indeed hope, you'll be doing personally in five years' time?
Speaker 2Well, I'm in the midst of co-founding a startup, so I hope that it will have taken off in five years' time.
Speaker 1Well, if you apply, as you said, the resilience and the smarts demonstrated in this interview, I'm sure it will. I'd love to hear more when the time is right. Thanks so much, really appreciate it. Hopefully we'll see you again soon, possibly in person.
Speaker 2Absolutely. Thank you so much for having me, Henry. Really enjoyed it, and I hope that I was entertaining, and at the least that something I said was mildly interesting.
Speaker 1Definitely entertaining and very insightful. Thanks so much. I loved doing that interview and hope you enjoyed listening too. The question around the siloing and lack of interconnectivity between data and insights departments, or even around integrating all consumer feedback, for instance in-store or at the point of digital experience, is such a prevalent issue, as Sarah identifies. Part of that is down to traditional demarcations of which department has held which budget. There's also been a long-term question around the storage and integration of data assets and insights, as well as, frankly, attitudes as to who owns what and where data sits. However, that's an area where AI-based solutions are beginning to have a major impact. More to follow on that in future episodes. Next time we have another great interviewee in Amy Cashman, head of Kantar in the UK. We look at Kantar's evolution and latest product innovations, as well as advice from Amy's long tenure in the industry. Thank you to Insight Platforms for their support on this episode, to Sarah for the interview, and to you for listening. See you next time.