Insights, Marketing & Data: Secrets of Success from Industry Leaders

ZAPPI - Steve Phillips, Founder & Chief Innovation Officer. Why insight specialists should drive marketing & innovation; raising $170m investment for growth; the short and medium term benefits of AI; managing the transition from CEO.

Henry Piney Season 4 Episode 11


Delighted to have on Steve Phillips, the Founder, former CEO and now Chief Innovation Officer of Zappi. Steve is one of the foremost thinkers and entrepreneurs in consumer insights, marketing and innovation, a veteran of several start-ups and the driving force behind Zappi, which took on a $170m investment from Sumeru Equity Partners in December 2022 to drive further innovation.

We have a great conversation including: 

  • The story behind Zappi’s creation and growth
  • Advice for entrepreneurs
  • How to realize true benefits from automation
  • The balance between self-service and conventional agency offerings
  • Using meta-data and AI models to ideate before testing
  • The use of AI agents
  • Managing the transition from CEO and ensuring staff still say ‘no’

With thanks to Insight Platforms for their support and MX8 Labs for sponsoring. 



All episodes available at https://www.insightplatforms.com/podcasts/

Suggestions, thoughts etc to futureviewpod@gmail.com



Speaker 1:

The thing that's really interesting here is that we can, as consumer insight specialists, because of the skills we have, we are great at understanding data. But we're not only great at understanding data. We're great at aligning different data assets intellectually, not necessarily technically at an IT level, but intellectually, because we're always looking at a brand tracking study plus an ad test, plus some focus groups that we've done, and so we naturally think about multiple data sources and coming up with an insight from them in a way that most data teams don't. The people involved in clickstream data don't think about focus groups, they just don't. They just think about clickstream data, and the people who think about sales data just think about sales data. So we as insight teams have the opportunity at clients to play the role of bringing this stuff together, and AI is making the technical process of bringing that data together much, much simpler.

Speaker 1:

Welcome to FutureView and the first episode of the new year 2025. It's been a little bit delayed, I'm afraid, but it's an episode that I know is worth waiting for. I have on this week Steve Phillips, one of the foremost thinkers and entrepreneurs in the consumer insights, marketing and innovation space. Steve is the founder, former CEO and now chief innovation officer of Zappi, the consumer insights platform that really shook up the industry by running core methodologies in an automated fashion, speeding up deliverables and allowing clients to run projects on a self-serve basis. Steve also led the process of taking on $170 million of investment from Sumeru Equity Partners in 2022, helping Zappi further accelerate their digital and AI product development. So there's a ton to get into here, including Steve's thoughts on the balance between self-service and more conventional agency offerings, applying metadata to evaluate and ideate before conventional testing, the use of AI agents, and why market researchers have the ideal intellectual skill sets to thrive in a new world. It's a fascinating conversation, I think, so I won't delay longer. Just a quick call out for our sponsor, MX8 Labs, who are at an earlier stage but also pioneering AI-based efficiencies and analytical tools in the insight sector. You can check them out at mx8labs.com.

Speaker 1:

Now onto the interview. Thank you for joining. It's a pleasure to have you on the podcast. A pleasure to be here, Henry. Thank you for inviting me. Fantastic. Well, the usual first question I ask most guests: what's one thing that most people wouldn't know about you? It doesn't have to be a deepest, darkest secret (it could be, if you want it to be), but just something that isn't easily available on LinkedIn or the web.

Speaker 1:

Okay, I've lived in multiple countries, but probably the most interesting place I've lived was on an old Chinese junk in the South China Sea. I lived there for a couple of years. It was very old and leaked a lot. So my girlfriend at the time, who's now my wife, used to force me to sleep in the damp patch caused by the leaks in the boat. Crikey, so why were you doing that, two years on the South China Sea? Well, actually I was working in Hong Kong. So I worked in Hong Kong for about four years in consumer insights and spent the first couple in an apartment downtown. And Hong Kong is very intense. It's a wonderful place but very intense.

Speaker 1:

And a friend of mine was living on a boat and was selling it, and I thought, oh, that would be a much more relaxed way of living for a bit. So I bought it. So it was almost like the equivalent of living on a houseboat in London? Exactly, except you didn't have a mooring, so I had to get a water taxi to the shore every morning, and I had no running water or electricity or anything. So it was slightly more difficult than that, but still a lot of fun. Yeah, well, I was about to say it sounded terribly glamorous and romantic, until you mentioned the no running water or electricity bit. Anyway, we should probably get on to a different type of journey.

Speaker 1:

That's probably a terrible one, but the background on Zappi: I was really interested to learn a little bit about your journey through the insights world, then leading to Zappi and why you decided to start the business. Yeah, sure. So I'd been in consumer insights and worked in multiple places around the world. I worked in London for about a year, then Asia for about six years, then New York for about six years, and at that point I decided I was going to start my first company. So I opened my first company when we moved back to the UK, and Zappi is my fourth company. And it was really a realization that clients love consumer insight. It's very powerful. It helps them make smart decisions.

Speaker 1:

But there were a couple of things I really didn't like about it, and that was that the process took too long and cost too much money. And I was trying to work that out. The genesis of Zappi was very much a single moment of inspiration. I was literally sitting outside in my garden with a glass or two of wine, thinking about how you could make it cheaper and faster, which was the main complaint that clients had, and automation was the obvious answer. And then I started thinking, oh, if you could automate market research, what would you do? And then I came up with the idea of creating an app store for market research. And probably in the first four or five years of the business, I came up with the entire sort of business plan in my head, in my garden, over a few glasses of wine, and then it was all about the, you know, 1% inspiration, 99% perspiration of actually making it work. So that took several years before we could launch. Yeah, yeah, I can well imagine.

Speaker 1:

And so the original iteration, in terms of the app store, was to focus then more on self-service, as in you'd be able to turn up as a certain size of business, from, I guess, SME up to big enterprise clients, and it would very much be a self-service proposition. Yeah, exactly. The idea was that we would create a technology company and that we would then automate great thinking from other market research companies, so a client could go onto the app store and say, oh, what ad test do I want to do today? Do I want to do a System1 or a BrainJuicer ad test, at the time? Do I want to do a Link test? Do I want to do an Ipsos test? And, you know, there would be consumer ratings and reviews of those different tests, and they would be at different price points, exactly like the Apple App Store. And so that was the original idea, and then it would be self-serve. Yeah, oh, so really, you were going to be almost like, not necessarily a broker, an exchange, I guess, in the first instance? Exactly, yeah. The idea was very much to repeat the Apple App Store, but for B2B market research, which didn't really work, but it set us on the path that did work. Yeah, that was going to be my next question, as to whether that really worked, where I guess you probably ran into some challenges, given that that's not how the business evolved.

Speaker 1:

And so then, which bits did work? Was it that the automation behind the scenes became quite attractive? Yeah. What did work was that clients very much liked the fact that we could turn around a project quickly. We could do great market research in an automated fashion, so we could do a project much cheaper than it had been and much, much faster than it had been, and that enabled people to work in a much more agile, iterative manner. And this was a time when, you know, we thought speed and price were going to be really attractive to clients. But, to be honest, it was the speed. It was the ability to be involved in decisions that otherwise would be made without consumer insights. If people were moving quickly in the development of a product or an advertising campaign, they'd go to consumer insights and say, oh, can you do a test on this? And consumer insights said, yeah, yeah, yeah, what I'd like you to do is stop working on what you're working on for four to six weeks while I go off and do a project. And it just didn't work. So the ability to say, yeah, sure, I'll give you some results in a couple of days, that was revolutionary at the time and really what clients wanted. That became the core insight that we then took into the second-stage development of what became Zappi.

Speaker 1:

Yeah, I see. And, Steve, I warned I might go a little bit off schedule, but I was having a conversation with somebody yesterday who's a pretty senior, successful guy who shall remain nameless, but he gave me this phrase where he said market research agencies have institutionalized complication as a means of protecting their market position. And it seems like, to some extent, whether you would agree with that or not, that's one of the problems you were looking to solve, of going, actually it doesn't need to be so complicated behind the scenes. You can still have very high quality, but at greater speed. Is that fair? Very fair. Lawyers do exactly the same thing. I mean, you read any legal document and you realize that, you know, it's just a cabal of people trying to make things as complicated as possible. And I say this all the time at Zappi. So we have researchers, and they're not doing market research, they're building market research products, and I fight with them all the time.

Speaker 1:

We have this tension between simplicity and complexity, and my point is always that you have that tension, but it should not show itself in the way you deal with clients. So my example is always Google. The Google search algorithm is incredibly complicated. It's really sophisticated, but your interaction as a user with Google is incredibly simple. Making complex things complicated is relatively easy. Making complicated things simple is really difficult, and that's the challenge that we always have internally, and I continue to fight the good fight.

Speaker 1:

I like to think. Take our sampling, because quality is so important in our industry. Our sampling is really sophisticated, but we can tell clients that if they want to know it, and we can show them a white paper, but we don't have to exhibit that in the self-service platform. That complexity can be hidden, in the same way as the algorithms involved in some of our text analytics. You don't want to know that detail. You don't need to know that detail. What you need to know is the great insight, and you need it quickly and simply so that you can use it easily.

Speaker 1:

Yeah, it's a common issue, isn't it, across the sector, whereby researchers, bless them, who we've known very well for years and years, love explaining how it was made. You know, how the project, the design of the project, the way the analysis works, and sometimes you could see the client kind of go, I just want the answer. You know, I trust you guys. That's why we hired you. Yeah, I remember when we released a product, I got really annoyed. It basically had a derived importance measure, and we had some text underneath it to explain what was going on.

Speaker 1:

It started off by saying, this is based on a Kruskal's analysis, and we use Kruskal's because of this. I thought, who cares about whether it's Kruskal's? What I want to know is, why am I doing a derived importance map? What is it giving me? What decision is it helping me make? But the people who put it together had been very excited. They'd looked at 37 different algorithms and decided Kruskal's is most useful in this situation, and so they wanted to tell someone. And it's like, go and tell someone else. I don't mind having a geek button somewhere that says, if you want to know the details, then go here. But that's not what you should start with. Yes, I think you're suggesting it's absolutely fair for them to want to tell somebody about this amazing methodology. It's just you've got to make sure you're telling the right person, I guess.

Speaker 1:

Yeah, yeah. So, trying to pull myself back onto schedule a little bit, and we've touched on some of this, but it seems like you had certain sorts of ideas or theses when you set up Zappi, and so I want to look at that question from a couple of different directions. What were some of the ingoing theses that didn't prove to be the case, and what were some of the ones that did? Well, the first idea, that market research companies would want to put their thinking onto a competitive app store, was partly right, but the business structure of it was wrong. It was a really interesting process going through with these large agencies, because they had a lot of pressure from clients to do things faster, and automation is the thing that helped them. So they did like the idea of being involved in Zappi, but only in a very tentative way.

Speaker 1:

So we would say to these companies, look, we can automate everything that you do. And they'd say, yeah, but we don't want to automate that. They wouldn't say it out loud, but we knew what the reason was: they charge for the labor of doing a project, and so if you take away the labor, which is roughly what we were saying, then you have to charge a lot less, and they've got a lot of people sitting around twiddling their thumbs. That didn't work as a business model for them. We had to end up creating our own systems and our own product, because of the fundamental business model. I mean, it's like you wouldn't automate the process of doing contract law for law firms. You'd do it for clients, because the law firms don't want you to automate it. So it's the same problem. So that was the thing that definitely didn't work. We've made lots of mistakes along the way, building bespoke code for different people in different situations and not having the right buy-in from different clients on different things, so there are lots of individual mistakes along the way, but I would say that's probably the core thing that I got wrong going in. And as I said, the core thing we got right was price and time, but particularly time, yeah.

Speaker 1:

And so, with the benefit of your experience of building out this business and other businesses, what would be your advice to others going along this journey? There could be early-stage companies or companies at a scale-up stage, in terms of evaluating the track that you're on. Are you about to make a mistake here? Where should you invest? I'm not going to ask for golden rules, necessarily, but your thoughts on that? Well, market research is a great industry to innovate in and to open your own business in. It's good for both of those things. And there are two sorts of businesses. There's one which is a consultative business, which my first few businesses were, but that's a very, very different type of business to Zappi, which is an innovation-focused technology company. So I would say, if you're starting a services business, so much of it is about clients and networking and doing great work.

Speaker 1:

You know, I used to love working in my old companies because the work itself was really interesting. Understanding consumers, understanding people and how they react and why, and helping businesses make smart decisions because of that, is a really interesting place to be. You really get to know people, so I loved doing that, particularly the qualitative work. But it's a hard business model. It's a really hard business model.

Speaker 1:

Next year bears no relation to this year. You have to redo everything. You really are only as good as your last project, whereas a product-led company like Zappi is a much better business model if you can make it work. It's much harder to make it work, because it's so easy to get things wrong. And if you're doing something much more like Zappi, then being flexible and iterative in your approach is so important. You've got to constantly be testing things with the market, constantly iterating, constantly updating, constantly talking to clients about what's working and what's not working, and have that ability to develop something but then look to continuously improve it, rather than rest on your laurels and go on to the next thing. And that's been a very difficult lesson for us to learn, but it's very true. I can really imagine.

Speaker 1:

It's a very hard balance, and it leads into one of the other questions I want to get into, in terms of how you manage the product development process, and particularly this balance between, I guess in investment jargon people would call it managed service as opposed to SaaS, or just self-service as opposed to how much have you got to be really, really working with the clients. I imagine that must be quite a tricky balance. It very much is. I would say the thing that we've come to over the years is that people have become more likely to self-serve. So 10 years ago, when we launched the business, no one really self-served, and we had a services layer on top that would serve for them.

Speaker 1:

But now a proportion, not quite half, but close, I would think, self-serve. The interesting thing is, the way we think about it, you have to make a platform that is self-servable, so it is simple to use, it is intuitive, it is easy to get results, and those results are explanatory, so you can self-serve. But then you have to meet the client where they are. So that can be, you can self-serve if you want, or you can have services on top to do that for you, and then you can iterate over time. We have lots of large clients, and they might have 30 users, and 20 of them self-serve and 10 of them don't, and that's acceptable too. So you have to develop as if it's self-serve, but then service as per the client need. I see. So it's a question of offering them really the optionality: if they want software, great, and if they want higher-touch services, that can be provided.

Speaker 1:

And then how would you charge for the managed service component of it? I don't mean in terms of specifics about the amount that Zappi actually charges, but do you tend to do that on kind of a modular basis, of going, is it per hour? Or, you want us to do the reports for you? Or additional analysis? Well, the reports are done by the system, right? The report is written by an AI, based on the data and based on our macro data asset, because we've obviously tested lots of different things. So everything on the reporting side is done by the computer.

Speaker 1:

Really, the service is about helping people do some meta-analytics, thinking through what stimulus to use in certain situations. Maybe they're doing a competitive review, so helping them launch a number of projects. So then it is done on exactly what you said, a modular basis, roughly based on the number of hours spent to help them with whatever task that might be. And some people do both. Some people will launch individual projects on their own but get a consultant to do more meta-analytic type or conventional evaluations. Ah, that's interesting.

Speaker 1:

I hadn't thought about that as a component. So, yeah, they genuinely are using the platform's capabilities, and they might say, I'm going to make this up as a scenario, but, I've got this consultant who knows the auto sector really well, and I'd like to use Zappi as the backbone but use this person for that really specific sector insight. Also, they may know the sector, and they may know me really well, right? So Zappi, the system, the technology, doesn't necessarily know the client and the client's ways of working very well. So fitting that can also be a skill.

Speaker 1:

And then, in terms of the visualization component of it, have you really embraced that element? The idea for the company, right from the beginning, was that we wanted to do great market research, automated. So I wanted to put a quantitative market research agency in a box, in software, and that meant sophisticated analytics, sophisticated design, sophisticated reporting. Now, the thing that has changed over the last couple of years is not so much the visualization, visualization was good before. It's the writing of the report that has become a lot better, because of AI, because of the advances in LLMs, so we've used those. So now, if you read a report from us, five years ago you'd say, oh, that's a good market research report, it's probably written by a junior researcher, whereas now you would say, that's a good market research report, that's written by a senior researcher. So it's sophisticated analytics, sophisticated visualizations, but we have been building them for 10 years. And, without getting too much into competitive evaluation,

Speaker 1:

do you see that as one of the major points of competitive differentiation, in terms of how smart the analytics has got, as opposed to the basics? Because, I won't name names, but there are various other companies that would claim, to steal your phrase, Steve, quantitative research in a box. How is that differentiated? I think we are better. Of course I do. You think you've got a better box? Yeah, yeah, yeah. But that's not where I see our genuine competitive advantage. I think we've got a brilliant ad test and a brilliant pack test and a brilliant product test. Of course I do, I was involved in designing most of them. I think we've taken the best thinking. But that's not where I would say we are particularly strong.

Speaker 1:

What we have now is a very sophisticated underlying data platform, and that data platform allows you to look across all of the projects you've ever done and, to a certain extent, all of the data that's in our system, and do meta-analytics. We can answer the question, say I'm a client, I've got an ad and I want to see, will this ad work with young men in Thailand? Well, we can tell you that, and so can lots of other companies. In our system, you can also ask the question, what type of advertising works with young men in Thailand? So you can look at your metadata instantly and find out within three, four minutes what type of advertising works well with them and what type of advertising doesn't work well with them. You can inform the brief instead of testing the outcome of the brief. And increasingly, and this is where we're using AI, based on the insights that we have in our data asset, we can help you create the innovation in the first place.
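A meta-analytic query of this kind, roughly "what type of advertising works with young men in Thailand?", amounts to an aggregation over a store of past test results. The sketch below is purely illustrative: the records, field names and scores are invented for this example and are not Zappi's actual schema or API.

```python
from collections import defaultdict

# Hypothetical slice of a normative data asset: every ad test ever run,
# tagged with market, audience, creative style and an effectiveness score.
past_tests = [
    {"market": "TH", "audience": "men 18-24",   "style": "humour",    "score": 72},
    {"market": "TH", "audience": "men 18-24",   "style": "humour",    "score": 68},
    {"market": "TH", "audience": "men 18-24",   "style": "celebrity", "score": 51},
    {"market": "TH", "audience": "women 25-34", "style": "humour",    "score": 60},
]

def what_works(market, audience, tests=past_tests):
    """Meta-analytic query: rank creative styles by average score for one segment."""
    by_style = defaultdict(list)
    for t in tests:
        if t["market"] == market and t["audience"] == audience:
            by_style[t["style"]].append(t["score"])
    # Highest-performing style first.
    return sorted(((sum(v) / len(v), s) for s, v in by_style.items()), reverse=True)

print(what_works("TH", "men 18-24"))  # [(70.0, 'humour'), (51.0, 'celebrity')]
```

In a real platform the aggregation would run over thousands of historical tests, but the shape of the query stays the same: filter by segment, group by a creative attribute, rank by average effectiveness, and feed the ranking into the brief rather than testing the finished work.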

Speaker 1:

So we are now building innovative products for our clients to test in our system, and they're built on the understanding of the consumer in that market, in that category. So it's very much the data asset that is the differentiator for us, rather than the individual test itself. Yep, I see. That's really fascinating. And so, playing it back a little bit, there's obviously the core product, you could just test, and fine. However, because you're able to aggregate all the data, and it's well organized, and you've got intelligent AI operating across the whole data set, you can actually, in effect, use it for ideation and development of the concepts that can then be plugged into the system for testing. Exactly, exactly. So if you think about it from an innovation perspective, I mean, we're live with this now, we have agentic tech. I don't know if you're following the agent stuff in AI, but we have a number of agents working together, and they can help you create a new product.

Speaker 1:

And in the old world, if you were a big CPG FMCG client, it would take you a week, 10 people in a room whiteboarding, to come up with a new product or a couple of new products, and then you'd go into testing. Well, now, if you're a market researcher in one of those organizations, you could create 30 different new product ideas, all with a slightly different angle based on consumer insight, in a day, and then you could test those overnight. You could come back the next day. And the next thing our system does is not just give you a market research report. If you test a product concept with us, you can get a market research report which tells you what people think about that product concept, or you can ask the system to revise the product concept based on that insight. So it will rebuild the concept for you based on that consumer insight. So you could come up with 30 ideas in a day, test them overnight, throw away 20 of them, have the 10 of them left, and have an automatic iteration on those concepts. So by day one and a half, one person in the consumer insights team has come up with 10 great ideas, tested with customers and already improved by that testing.

Speaker 1:

So that's a fundamentally different way of doing innovation. It changes who's involved, it changes how quickly you do it, and it changes the number of innovations you can create. It rapidly transforms the innovation process that these large companies have, and that's really exciting, and we'd like to do the same for advertising. And I imagine that's also changing the stakeholder and the buyer to some extent. So the market research department is likely to be central, but that's much less a classic, I'm doing inverted commas on the screen, market research proposition. That's much more a chief product officer or chief innovation officer using this as a central tool in terms of what they're doing. That's exactly right, although, I would say, I love the idea.

Speaker 1:

I did a speech at a conference a year and a half ago, I think, saying how market research will do a reverse takeover of marketing, and we can use AI to help us do this. The thing that's really interesting here is that, as consumer insight specialists, because of the skills we have, we are great at understanding data. But we're not only great at understanding data. We're great at aligning different data assets intellectually, not necessarily technically at an IT level, but intellectually, because we're always looking at a brand tracking study plus an ad test plus some focus groups that we've done, and so we naturally think about multiple data sources and coming up with an insight from them in a way that most data teams don't. The people involved in clickstream data don't think about focus groups, they just don't, they just think about clickstream data. And the people who think about sales data just think about sales data. So we, as insight teams, have the opportunity at clients to play the role of bringing this stuff together, and AI is making the technical process of bringing that data together much, much simpler.

Speaker 1:

So whilst, yes, in this case we need the chief innovation officer to be the key component of saying, this is the system that we're going to use throughout the organization, which is exactly what's happening with us at the moment, the implementer of that is not necessarily the marketing team, it's the consumer insight team taking that into the business, and that, I think, is really exciting for our industry. Yeah, I'd never thought about it from that perspective. In terms of good consumer insights people, on a very human level, when they've had a big brief come in, generally you've had a client who may well be saying, here's my problem, how should I address it? They will come back with a multi-mode approach, and therefore it's implicit within that of going, whatever it is, I'm going to do my quantitative segmentation, and then I'm going to do my focus groups amongst it, and then I'm going to do a bit more testing. And I'd never really thought about it like that. Actually, the good researchers are all trained to take on board data strands. I guess they've just been inhibited by the fact it's just them as humans doing it, whereas now the AI, the machine learning and all the rest of it is actually really, really able to accelerate that skill set. That's exactly right, and it accelerates the skill set, it accelerates the alignment of the data. The alignment of the data, the ability to utilize multiple data sets simultaneously, is technically very difficult but getting significantly easier, particularly in a world of agents, which is why directing these agents and managing these agents, and thinking about which agents you should use and which data streams you should effectively be using and aligning, is so important, and it's a skill I think we have. I'll get to the agents question in a moment, because it'd be good to understand that a little bit better.

Speaker 1:

But going back to the idea of analyzing all your data and coming up with recommendations, in terms of the training data sets that you're pulling on, how does that work? Are you looking within a specific client's training data set, or are you looking across the thousand clients, whatever it is, you've had over the years, and looking at the commonalities and strands between them? I mean, obviously we have norms, and that's looking across all clients. We also have some very large clients who are looking primarily at their own data sets. This takes me to, I think, the most important thing that insight people on the client side need to start doing, and that is they need to start thinking about becoming data asset managers.

Speaker 1:

Because in this world of meta-analytics and AI, you can tell an AI to do anything and it will do it, but if the data asset that you're asking it to work on top of is not well-structured, well-managed, diverse and high quality, then what comes out is poor. And the people, and I think it's us, I think it's consumer insights, are the people who need to say, okay, let me think about how I structure and manage that data asset underneath. I should stop thinking about an individual project. We can automate that. We've done it. That's not your job anymore. Your job is no longer thinking, how do I test this ad in Thailand? That's just automated. Your job now is: how can I think about the data I have which helps me understand people and their reactions to stimulus like advertising? Where is that data set, what insight needs to come out of it, who needs to have access to that insight, and how do I make sure they have access to it? That's the job we're moving into now. It's a job of data asset management. It's a job about understanding agents and AI, and, as I said, I think it's a very exciting place. I think we have the skill sets to be able to do it, and I think it's what businesses need.
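To make the "data asset manager" idea concrete, here is a minimal, purely illustrative sketch of what managing a data asset for AI use might look like: a small registry that tracks each asset's owner, freshness and quality status, and refuses to bless an asset for AI work if it is stale or unchecked. All class names, fields and thresholds are hypothetical, not a Zappi API.

```python
# Hypothetical sketch: a registry of data assets with the basic hygiene
# checks an insights team might enforce before building AI on top of them.
from datetime import date


class DataAsset:
    def __init__(self, name, owner, last_updated, quality_checked):
        self.name = name                      # e.g. "ad-tests"
        self.owner = owner                    # the team accountable for it
        self.last_updated = last_updated      # a datetime.date
        self.quality_checked = quality_checked  # has it passed QA?


class DataAssetRegistry:
    def __init__(self, max_age_days=90):
        self.assets = {}
        self.max_age_days = max_age_days      # freshness threshold (assumed)

    def register(self, asset):
        self.assets[asset.name] = asset

    def is_fit_for_ai(self, name, today):
        # An asset is fit to put an agent on top of only if it has passed
        # quality checks AND is reasonably fresh -- otherwise "what comes
        # out is poor", as the conversation puts it.
        a = self.assets[name]
        fresh = (today - a.last_updated).days <= self.max_age_days
        return a.quality_checked and fresh


registry = DataAssetRegistry()
registry.register(DataAsset("ad-tests", "insights-team", date(2024, 11, 1), True))
print(registry.is_fit_for_ai("ad-tests", date(2024, 12, 1)))  # True
```

The point of the sketch is only the shift in responsibility: the insights job moves from running individual projects to curating the asset that everything else is built on.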

Speaker 1:

Steve, what is an agent? Sorry, excuse my simple question. So the easiest way to visualize what an agent is: it's a chatbot managed by an LLM, focused on a particular data asset. Let's say you have a social media data feed. You're a client and you use, I don't know, Brandwatch or whoever for social media. What you would do is create an agent on top of that data asset. The agent is basically an AI. Your interface with that agent is a text box, but that AI has been prompted and trained to look very specifically at that social media data. So you would interact with it and say, what are the trends today? And it would look at that data and tell you what the trends are. So it's an AI trained and prompted in a specific way on a specific data asset. And the critical thing is that you can have lots of them. We've built a legal agent for a client that's trained on their legal understanding of what they need to think about when they're creating new products. So you can create 10, 20, 100, 1,000 different agents. You can have agents reporting to other agents. It's a very interesting world that we're going into, but that's the core way of thinking about it: it's effectively a chatbot aimed at a particular data asset.
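Steve's definition, an LLM chatbot prompted and scoped to one specific data asset, can be sketched in a few lines. This is an assumption-laden toy, not any real product: the class and the stub LLM are invented for illustration, and a real system would call an actual LLM API instead of the stub.

```python
# Minimal sketch of an "agent" as described above: an LLM-powered chatbot
# whose prompt is scoped to exactly one data asset. All names are
# hypothetical; stub_llm stands in for a real model call.

class DataAssetAgent:
    def __init__(self, name, system_prompt, data_asset, llm):
        self.name = name                  # e.g. "social-media-agent"
        self.system_prompt = system_prompt
        self.data_asset = data_asset      # the ONE data source this agent sees
        self.llm = llm                    # any callable: (prompt) -> answer

    def ask(self, question):
        # The scoping is what makes it an "agent" rather than a
        # general-purpose chatbot: its own data asset plus a
        # role-defining system prompt, nothing else.
        prompt = (
            f"{self.system_prompt}\n\n"
            f"Data asset:\n{self.data_asset}\n\n"
            f"Question: {question}"
        )
        return self.llm(prompt)


# Stub LLM so the sketch runs without an API key: it just reports
# whether the scoped data contained the signal it was asked about.
def stub_llm(prompt):
    return "trend: iced coffee" if "iced coffee" in prompt else "no signal"


social_agent = DataAssetAgent(
    name="social-media-agent",
    system_prompt="You analyse only this social listening feed.",
    data_asset="mentions: iced coffee +40% week on week",
    llm=stub_llm,
)
print(social_agent.ask("What are the trends today?"))  # trend: iced coffee
```

A legal agent, in this framing, is the same shell pointed at a different asset (the client's legal guidance) with a different system prompt, which is how you get to 10, 100, or 1,000 agents cheaply.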

Speaker 1:

If I take a slightly old-school analogy, it's almost like having a bunch of research managers, or SRMs or whatever they might be, who are specialists in certain sectors, helping facilitate and perpetuate in-depth knowledge on those sectors. And then the super agent, or the manager, is pulling it all together. But again, that can be an agent too. It's not a bit like that, it's exactly like that. That is exactly the parallel. And the manager absolutely can be an agent; in fact, the manager's manager can be an agent. But they don't necessarily replace, they augment. It's almost like having 30 great, smart grads working for you simultaneously, so for any task you want, you can get great input and great feedback. But there is an element of you understanding the business, understanding the history of the company, where it's been and where it wants to go, which is harder to put into an agent. The human in the loop, at least for the next five years, is still incredibly important. But that human has so much brainpower behind them that they can get to better places so much faster and so much cheaper than they ever could.
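The "agents reporting to other agents" idea maps directly onto the research-manager analogy: a manager agent fans a question out to its specialist reports and synthesizes their answers. The sketch below is illustrative only; the lambdas stand in for LLMs scoped to individual data assets, and a real manager agent would use an LLM to route and summarize rather than simple string joining.

```python
# Hypothetical sketch of an agent hierarchy: specialists scoped to one
# data asset each, and a manager agent that delegates to them.

class Agent:
    def __init__(self, name, answer_fn):
        self.name = name
        self.answer_fn = answer_fn    # stands in for an LLM over one data asset

    def ask(self, question):
        return self.answer_fn(question)


class ManagerAgent(Agent):
    def __init__(self, name, reports):
        self.name = name
        self.reports = reports        # specialist agents "reporting" to this one

    def ask(self, question):
        # Delegate to every specialist, then combine. Here we just collect
        # and label each answer; a real system would synthesize with an LLM.
        answers = [f"{a.name}: {a.ask(question)}" for a in self.reports]
        return " | ".join(answers)


social = Agent("social", lambda q: "iced coffee trending")
legal = Agent("legal", lambda q: "no claims issues flagged")
manager = ManagerAgent("insights-manager", [social, legal])
print(manager.ask("Can we launch the iced coffee campaign?"))
# social: iced coffee trending | legal: no claims issues flagged
```

Because `ManagerAgent` is itself an `Agent`, a manager's manager is just another level of the same structure, which is the "agents reporting to agents" point.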

Speaker 1:

And so I was going to ask, in terms of the time frame we're looking at, what would you see as the most immediately actionable, practical uses of AI, say in the next two to three years, as opposed to more medium and long term? I mean, there's a lot of talk about efficiency in the short term, and then it seems like the agents piece is being developed. But is that more of a medium-term thing, or is that happening right here and now? So we have very large clients, big multinational clients, who are now solidifying all of their innovation to happen in this way, through an agent-based universe. Salesforce came out recently saying all customer service interaction should happen through, or be powered by, agents. So the agents world is here. It's not in the innovation stage, it's in the scaling stage, and it will certainly be primarily scaling, I think, in '25 and '26.

Speaker 1:

I think the way people initially start thinking about using it is in terms of efficiency. I don't think that's the best way of thinking about it; it's the easiest way of thinking about it. What you can do with agents is change the game. So instead of making the game a bit slicker, you can actually just change the game. But that's what all new technologies tend to go through. It's that classic thing: the first radio ad was someone sitting there reading out a newspaper ad, and the first TV ad was someone reading a radio ad but with a TV camera in front of them. It took a while before people started thinking, okay, I've got a new medium, I've got new technology, I can actually do something genuinely different, and I think that is what will happen with AI.

Speaker 1:

So, whilst agents, I think, will scale, their primary tasks to begin with will be efficiency and effectiveness. But in a year or two, that's where we go: I would like to see our agents creating advertising campaigns, getting rid of the creative agency, literally doing it all based on an understanding of consumers and an understanding of advertising. And maybe there's still some scope for an ad agency. But Sora was released today. You can create an ad, you can create a movie with Sora, and you can integrate great prompting, but also great insight data, great consumer insight, into that prompting process, and Sora will start creating things: corporate films, digital campaigns, TikTok ads. It's a new world, and people need to start embracing it. And so, obviously, there are all sorts of questions we could get into around that specific issue. But if I imagine that scenario, how does Zappi help? For instance, we've got this plethora of ads being created by Sora or something similar, and then clients have got to decide what they're going to put into market, but there are so many different ads prospectively being created. Maybe I'm answering my own question as I think it through. So the way Zappi would approach it is that it's guiding the concepts in the first place. It's guiding the AI: here, we started off with 10 concepts, we've narrowed this down to three, get to work, create various iterations within the concept directions that are most likely to succeed.

Speaker 1:

Would you actually need to test then with consumers, or do you see the AI just going on and predicting what's most likely to be successful? I think it's a really interesting question, because you test things now because they were not made with consumer insights at the heart of them, and so you want to check whether consumers will like them. So there certainly is a question about whether, if you have consumer insight embedded in the process, you need to test at the end, whether with real or synthetic respondents. Well, maybe not. On the other hand, you need to keep your data asset up to date. So you need to be testing, maybe not so much your own ads as other people's ads, to make sure you understand what the trends are and how people are changing their views of creative, etc. So that goes back to my data asset management point.

Speaker 1:

I think people will also always test some things, but sometimes it will be with synthetics and sometimes with real respondents; if they've got consumer insight embedded in the creation process, then it's less necessary. Yeah, it's funny, isn't it? Because it goes back in some ways to that argument, which, by the way, I've always believed in: if you do good consumer insight early in the process, so you've done good concept work, good positioning studies and so on, you have less need for creative testing down the line, because you've done the strategic work up front and you're heading in the right direction. Absolutely. I think the interesting question is whether AI will come up with genuinely breakthrough ideas. I think it probably will, but humans will too. Some ideas are genuinely breakthrough, but lots of them aren't. Lots of them are tweaks, iterations, similar-tos, and those are very easy, I think, for an AI to create.

Speaker 1:

Yeah, now I'm conscious we're rambling all over the place, but it's really interesting picking your brain on this stuff. Just going back to what you're doing next and where Zappi is going to go: you've stepped back from the role as CEO and you're now the Chief Innovation Officer. To some extent, the clue is in the title in terms of what's involved there. But what is involved, and what was your motivation? Personally, I always said within the business that I would look at this. I'm a shareholder, I care about Zappi's future, and I always wanted to question whether I was the right person to be CEO. I think I'm more of a startup person than a scale-up person. What my brain is attracted to, interested in and good at is not necessarily the structure stuff, the organizational stuff, and so we needed someone with that skill set. So I think I'd always thought about it.

Speaker 1:

Maybe I went on too long. I certainly think it was time for me to hand over the reins of managing the business, and the thing I was always most interested in, most passionate about, was the innovation. That's the stuff that excites me: being able to focus my time on that, integrating us into the new world of AI, thinking of line extensions for our own work, how we can use data better, how we can work with clients better, and doing things like the book, which we released a couple of months ago. Those are the things that excite me, and so it was time for me to focus all of my time on that and let someone else take over the running of the business day to day. And so how does that work then, without being indelicate? When you've got a founder, the former CEO, the chairman, no idea what the shareholdings are, but I assume still a significant shareholder, and you're the Chief Innovation Officer running with certain ideas, how does the rest of the business say no to you? Oh, to be fair, the rest of the business always said no to me. We have a very open and challenging culture. It's never been deferential, so I have not noticed any difference between when I was CEO and now that I'm Chief Innovation Officer.

Speaker 1:

If I come up with an idea, or I'm helping someone with an idea, people will argue it with me. We have a phrase here that we use a lot, which is "strong opinions, loosely held". Everyone can have a strong opinion, and you argue it through, you look at data, you test it, and then you change your mind if it turns out to be wrong. So the place is full of people who have strong opinions, and hopefully they are loosely held, because data can trump opinions. That's the way we've always worked; it hasn't really made any difference. Fantastic. Now, on the subject of strong opinions loosely held, what have you changed your mind about recently? So we're trying to do some innovating around some syndicated offerings that we're going to launch, and some synthetic offerings, and we are constantly testing things right now, and it's also true with AI. I change my mind all the time. I go into something with an opinion, and it often turns out to be wrong, but I think if you're into innovation, that's fine. It's just a way of working. I'm wrong literally all the time, I think, and that's fine, because it just helps you design the experiment, and then you find out what the truth is. In terms of work-life balance.

Speaker 1:

Some people actually don't believe in the notion, but if you do, how do you maintain it? I think over the years work and life have blended in a way that just wasn't true 20 or 30 years ago. You used to have downtime, which was genuinely downtime, and work time, which was work time, and now the two meld into each other a bit. So I find myself in the gym, or going for a walk, thinking about business and coming up with an idea. Equally, I can be in the office and my wife can call, or my kids can call, and I'll chat with them, and that's fine too. It was probably harder when my kids were younger, but now there are a lot of people in the organization, so I don't need to be looking at reports on a Saturday afternoon the way I did 15 years ago. So I think that balance has probably naturally improved.

Speaker 1:

And in terms of any favorite books, or they don't have to be books, they could be pieces of media, movies, music, whatever: any favorites you would call out or recommend? I listen to podcasts a lot. I cycle a lot, go to the gym, etc., and I tend to listen to podcasts while I'm doing it. So I'd probably recommend a couple of podcasts. One would be the All-In podcast, which I'm a big fan of. It's a really interesting way of understanding what's happening in technology, but also, at times, what's happening in business and in politics. It can be somewhat controversial, but it's an interesting listen. The other would be The Rest Is History. I can get geeky about that; I'm a big history addict, so The Rest Is History would be the other one.

Speaker 1:

And final question: where will Zappi be in five years' time, or what do you hope it will be? I would hope it was deeply integrated within the CMO's tech stack. We are effectively becoming a data company. If I look at where we were five years ago, we were an automation software company. We've become a platform, and we're increasingly going to become a data company, and that data has to be integrated within the CMO's, within the organization's, data infrastructure. So thinking about what that integration looks like, where our data has value, where other types and streams of data have value, and how you can combine those streams to create value on top of them: that's where I think we need to spend more and more time.

Speaker 1:

And it's interesting you see it as being part of the CMO's data stack, not, for instance, the CPO's, as in the products we're talking about, or the CFO's. You see the applicability mainly around the marketing space? Marketing and innovation, certainly for us. If you look at, say, a Qualtrics, they're much more embedded in an operations stack. Whether clients can have different stacks within that is an open question; maybe they need to have one, and it needs to be integrated. So again, it's a really fascinating place. I was looking at what Palantir are up to, and what Salesforce are talking about, and where those data infrastructure players are. The ability to integrate is rapidly evolving, and the value on top of that is also rapidly evolving. It's a place where consumer insight people often get scared, but I think we have to jump in, and I think we have a lot to add.

Speaker 1:

100% agreed. Steve, I will let you go, otherwise I'm just going to sit here and ask you questions all day. But thank you so much, it's been a real pleasure. It's been fun, Henry. Thank you for inviting me. Wonderful to do that interview with Steve. I learned so much in the hour or so talking to him.

Speaker 1:

I love the balance of practicality and vision he's able to articulate, especially in the area of AI, where there can often be a lot of blue-sky thinking that isn't necessarily well grounded. It's refreshing to talk to someone who's done it, keeps on doing it, and is prepared to experiment on an ongoing basis. Now, a quick request: if you like these interviews, then please follow and rate on your preferred platform. It makes a difference with the algorithms and all that good stuff. On the subject of more interviews, we have some great ones coming up in January, with representatives from the likes of Diageo and IHG Hotels and Resorts. These include discussions on how to uncover real human motivations, the state of news in the US, and practical means to knit together elements of the marketing funnel. I can't wait to share these with you. In the meantime, thank you to Insight Platforms for their support, to Steve for doing the interview, to MX8 Labs for sponsoring and, of course, to you for listening. See you next time.
