Insights, Marketing & Data: Secrets of Success from Industry Leaders

MARKETCAST - Tom Weiss, CTO & Chief Data Scientist. Understanding Connected TV; why media mix modelling is on the way back; understanding the limits of AI.

October 12, 2023 | Season 3, Episode 2 | Henry Piney
Ever wondered how research and measurement work in the world of Connected TV? How the world of advertising has evolved over the years? Are you curious about the journey to leading tech and analytics at a major research and analytics firm?

Then tune into the conversation with Tom Weiss, CTO and Chief Data Scientist at MarketCast, who turned his childhood fascination with computers into a remarkable career, spanning from software developer to project manager launching mobile phone networks, and finally landing in the world of data science and advertising.

Tom's journey is not just about the job titles, it's about the evolution of an industry through the eyes of an insider. He takes us back to the dot-com era, sharing challenges, learnings, and how he honed his skills for quick scaling and managing in a larger corporate setting like T-Mobile. As we move through the discussion, Tom deep-dives into his current role in the world of advertising and data integration. He untangles the complexities of capturing and measuring advertising impact, and the shifting paradigm from linear TV to connected TV, driven by changing consumer behaviour. 

Tom also explains the role of Automatic Content Recognition (ACR) data, the integration of survey data with first-party and other non-survey datasets, and his view on the disruptive potential - but also the limitations - of AI in creative development. 

Tom's book recommendation:

SMALL IS BEAUTIFUL -  E.F. Schumacher


All episodes available at https://www.insightplatforms.com/podcasts/

Follow FutureView on Twitter at https://twitter.com/FutureView7


Speaker 1:

One of the key things is: let's not, as the research company, pretend to be operating in isolation. Let's work with our customers to integrate their data and our data together, because that tells the complete story.

Speaker 2:

Welcome to FutureView. That's the very engaging and quite brilliant Tom Weiss. He's the CTO and Chief Data Scientist at MarketCast. For those of you who don't know, MarketCast has been one of the traditional powerhouses in entertainment research and has expanded significantly in recent years, introducing a broader remit of services combining data science, marketing effectiveness solutions and primary research across sectors. In full disclosure, I remain a small shareholder in the company, but I haven't been involved operationally for a number of years, so it was great to catch up with Tom, find out what's going on and get his perspective across a whole range of areas, building from his almost 30 years of experience working with multiple data streams: launching mobile phone networks, measuring TV, building ACR and attribution solutions, and revamping brand tracking. There's always a ton to learn from Tom.

Speaker 2:

So on to the interview. Tom, thanks so much for joining me today. Real pleasure to have you on the podcast. Lots and lots to talk about, but first of all I want to start with a bit of an icebreaker and see if I can delve into your dark and possibly dubious past. No, I'm sure it isn't dubious, but what I wanted to know is just one thing that most people wouldn't know about you, that they wouldn't be able to find out just in the public domain.

Speaker 1:

Well, I did release a punk rock album last year that is not actually publicly associated with me, under the moniker Dr Ropemaker. It was a lockdown project. It's very angry music, but if you feel you need to channel that anger, then Dr Ropemaker is available on all good streaming services. I was very pleased, actually; I got my first royalty check from Spotify: $41.83, which I thought was not bad for a year's streaming.

Speaker 2:

Tom, I'm so impressed. I would dare to say it's about $40 more than I expected you might have made, without being rude, so I think you're doing very well. I also heard a rumor, I don't know whether you can talk about this on a podcast, that you may once have been in trouble for breaking into the police computer. Is that true?

Speaker 1:

No, I never got into trouble at all. It's certainly true that when I was younger, and this was back in the 1980s, everyone had a go at getting onto the police computer. It was just a dial-up number, and you dialed in and you could get in. You couldn't get far enough to really see very much without a little bit more work, but everyone with a modem knew it was there, and it was just one of the fun things that you did. And of course, back in the 80s, unauthorised access to a computer wasn't a criminal offence.

Speaker 2:

I hadn't thought about that. Of course it wouldn't be; I suppose the legislation wouldn't have even conceived of that at that point. Anyway, that is the start, clearly, of a glorious career and interest in all things tech and data. So before we get onto MarketCast and how the business is evolving, I wanted to take a little bit of a trip down memory lane through some of the highlights of your career. I mean, you've done a lot, but I was wondering if you could just pull out a few highlights from your perspective, Tom, in terms of major career milestones, what you did and some of the things you've learned along the way.

Speaker 1:

So I started playing around with computers when I was about 10. I think my first bit of software was probably published when I was about 12 or something like that, and then I became a proper professional software developer. I was doing software for electron microscopes, basically, and I worked with a company called Oxford Instruments, who were based in High Wycombe, and it was fabulous. The team was just moving to Windows when I first joined them, having previously done everything themselves from scratch. So I got to work with a bunch of really smart technologists who were just learning how to deal with off-the-shelf software, I'd say, for the first few years of my career.

Speaker 1:

Probably the highlight for me was when I graduated from being a software developer to being a project manager, and I used to basically run all the projects out in Japan. That was great for someone in their mid-20s, going out to Japan, learning how to eat sushi, how to drink sake, et cetera, but also learning a little bit more about what international business was like. And the thing that I got the bug for was that it's really interesting working with people from different cultures, and that probably characterized most of what I was doing, I guess. Then the dot-com era happened and, as everyone did, you had to get involved and do a dot-com, and we raised a load of money. But it all blew up very, very quickly.

Speaker 2:

Why was that, Tom? What was the business?

Speaker 1:

So, funnily enough, the business model was to do streaming games over broadband, so it was exactly the Steam business model. But in the late 1990s and the year 2000, broadband was only just starting and the bandwidth wasn't there for streaming games, so we just tried to do too much too quickly and it didn't work out. But again, I worked with a whole bunch of great people there. I moved out of that into a role where it was basically retraining COBOL programmers in how to do stuff on the internet, which I think a lot of people in my position were doing, and I ended up then working for a chap called Kevin Cunnington at T-Mobile. He was kind of the lead technologist for all things innovative, and I was basically running change management and stuff like that for him. If at Oxford Instruments I'd learned how to write code, in the dot-com era I'd learned how to scale things quickly. When I was at T-Mobile, I learned how to manage things within a larger organization.

Speaker 2:

What lessons did you take away in terms of scaling quickly?

Speaker 1:

Well, it was very interesting. I think the key thing is that when you're scaling, one of the biggest problems is getting the right people through the door and training them up, and I've always kind of focused on the new things, so it's always been about getting new people in. At the dot-com, we didn't have anything to hold us back and we were just trying to do too much too quickly. At T-Mobile there was an awful lot of organizational inertia, and so that kind of naturally gave you a slightly more conservative approach and slowed you down, and it was more about, okay, how do we work within that but still do things faster? I think the average cycle time was kind of 18 months, and we started doing things in three, four months.

Speaker 1:

I remember it was Mobile World Congress when we launched 3G. There was this great demo where you could see how fast 3G was, but it only worked if the phone was on the table, because the only 3G base station we had working was underneath the table. If you took it too far away, the signal wasn't strong enough anymore. So it was things like that: yes, we were doing a lot of innovation, but we weren't going too quickly, because we had these conservative people to hold us back a little. And I think one of the things I've always learned is that I've always been very much "let's go quickly, do the new thing", so I've always needed to find people I can partner with who are a little bit more "okay, let's think a little bit more about this, let's not be too impulsive", et cetera.

Speaker 2:

So then, moving on through your career, Tom, you got heavily involved in the TV data world. Can you talk to me a little bit about that?

Speaker 1:

Absolutely. So basically I left T-Mobile shortly after I got married, and I was kind of pushed by my wife into, let's try a little bit of entrepreneurialism. With that, I started off doing a TV recommendation engine, which was great fun when recommendation engines were a thing, then did some stuff around set-top box data when that was a thing, and they all went really well. I spent two years inside GfK, where I learned how to do market research. And then a few years later I started working with Zev and Michael, who had a company called Cognitive Networks outside of San Francisco, and I spent two weeks over there with them looking at their smart TV data. At this point there wasn't such a thing as smart TV data, but they had this ACR algorithm, automatic content recognition, that would detect what was on the glass.

Speaker 1:

The question was, do you think we can make things that look like, you know, Nielsen ratings out of this? So I went in there thinking, look, I don't know if I'm looking at the right tool, blah, blah, blah. But at the end of the two weeks I actually was starting to see shapes that did look correct. We were seeing those kind of time-of-day, day-of-week type shapes. We were seeing numbers that were looking similar to the kind of ratings that were published in the trade press, et cetera.

Speaker 1:

And for me, I thought, okay, well, actually here's something really interesting that I think could be heavily disruptive. So I spent most of the next few years working with them. Michael would go in and say, hey, we've got 7 million TVs. I think it's 20 million today, but at the time, we've got 7 million TVs. I knew we didn't quite have 7 million, but we had almost that number. And then it was like, let's go to the data, let's get on the phone, let's try and find out how it works. So this was my kind of break into the US media market, and for me it was a fascinating market.

Speaker 2:

Yeah, and then Tom, just to back up a little bit for those who are less familiar, can you explain what ACR data is and then how you were able to use it?

Speaker 1:

So what Michael and Zev had was this technology that sat on the TV, automatic content recognition, so basically it would detect any image that hit the glass. What they'd set up is a system where they would ingest video from the top few hundred TV networks in the US, and for each frame of those videos they'd create a fingerprint. The software that ran on the TV would detect when it saw the same fingerprint. So the software would say, hey, I'm seeing this fingerprint, and send out a stream of them, and then those fingerprints would be processed. It would say, okay, actually this TV is watching NBC, this TV is watching Fox, this TV is watching that. And that gave you a signal that would look like Nielsen ratings.
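
To make the fingerprint matching concrete, here is a minimal Python sketch of the loop Tom describes: index fingerprints of each network's reference feed, then match the stream coming off the TV. Everything in it is invented for illustration; a real ACR system uses perceptual fingerprints that survive scaling and compression, not the toy exact-match signature below.

```python
# Toy sketch of ACR-style matching (illustrative only; real systems use
# robust perceptual fingerprints, not exact binary signatures).
import random
from collections import Counter

def fingerprint(frame):
    # Binarise an 8x8 grayscale frame around its mean -> compact signature.
    avg = sum(frame) / len(frame)
    return tuple(1 if px >= avg else 0 for px in frame)

reference_index = {}  # fingerprint -> (network, seconds into the feed)

def ingest(network, frames, fps=1.0):
    # Server side: fingerprint every frame of a network's reference feed.
    for i, frame in enumerate(frames):
        reference_index[fingerprint(frame)] = (network, i / fps)

def identify(fingerprint_stream):
    # The TV sends up a stream of fingerprints; majority vote on matches.
    votes = Counter()
    for fp in fingerprint_stream:
        hit = reference_index.get(fp)
        if hit:
            votes[hit[0]] += 1
    return votes.most_common(1)[0][0] if votes else None

def fake_feed(seed, n=10):
    # Stand-in for ingested video: random 8x8 grayscale frames.
    rng = random.Random(seed)
    return [[rng.randrange(256) for _ in range(64)] for _ in range(n)]

nbc, fox = fake_feed(1), fake_feed(2)
ingest("NBC", nbc)
ingest("FOX", fox)
print(identify(fingerprint(f) for f in nbc[3:8]))  # -> NBC
```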

Speaker 2:

And was that ultimately how you ended up using it? I mean, I know there are lots of competitors to Nielsen out there and that's a big space that we may well get on to. So was it around actual TV ratings for programs or was it more about advertising?

Speaker 1:

So there were lots of use cases. I mean, as far as the competitors to Nielsen go, the Inscape data now really underpins a lot of the emerging currencies, but there were a lot of earlier use cases too. I'd stay out of the ratings discussions early on, because the data hadn't made that case yet. But retargeting was a thing for a while. Advanced audiences has always been a thing: how do we target beyond age and gender? But a lot of it comes back to Nielsen.

Speaker 1:

Everyone trades on Nielsen because it's currency, but they wish they didn't have to pay hundreds of millions of dollars a year for it. So if there's an alternative data set that they think they may be able to use, they're always going to be interested in trying it. I think for me the kind of poster child would probably be tune-in. A lot of people did very well out of tune-in, because with tune-in, what you do is basically see, okay, someone was exposed to a promo for this TV show; did they then go and watch that TV show? Because you've got the same device, you can see where they were exposed to the ad and you can see whether they converted. It's very easy to build attribution solutions around that kind of environment.
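
A hedged sketch of the tune-in attribution Tom outlines, on invented device-level events: because the promo exposure and the later viewing are logged by the same TV, attribution is a simple join on the device ID. A real study would also compare against an unexposed control group to estimate lift.

```python
# Tune-in attribution on device-level ACR data (hypothetical events).
from datetime import datetime, timedelta

promo_exposures = [  # (device_id, show promoted, exposure time)
    ("tv1", "ShowX", datetime(2023, 10, 1, 19, 0)),
    ("tv2", "ShowX", datetime(2023, 10, 1, 19, 5)),
    ("tv3", "ShowX", datetime(2023, 10, 1, 20, 0)),
]
viewing = [  # (device_id, show watched, start time)
    ("tv1", "ShowX", datetime(2023, 10, 3, 21, 0)),
    ("tv2", "OtherShow", datetime(2023, 10, 3, 21, 0)),
]

def tune_in_rate(exposures, viewing, show, window=timedelta(days=7)):
    # Same device on both sides, so no identity graph is needed.
    watched = {(d, s): t for d, s, t in viewing}
    exposed = {d for d, s, _ in exposures if s == show}
    converted = {
        d for d, s, t in exposures
        if s == show and (d, show) in watched
        and t <= watched[(d, show)] <= t + window
    }
    return len(converted) / len(exposed)

print(f"tune-in conversion: {tune_in_rate(promo_exposures, viewing, 'ShowX'):.0%}")
# -> 33%: tv1 watched ShowX within a week of exposure; tv2 and tv3 did not.
```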

Speaker 2:

Yeah, I can very much see that, because you're not dealing with, you know, recall, for instance, from a survey, and then trying to knit different data sets together. It's all through the same device.

Speaker 1:

It's all one thing. You don't need any identity graph, you don't need anything like that.

Speaker 2:

It's funny what you said about Nielsen. Someone gave me the phrase the other day that you date the other providers, but you end up going home with Nielsen. Anyway, which may or may not have been a comment on their dating behavior, but I just thought it was a funny little phrase around that type of world. So, moving on to where we are now and the connected TV landscape. People have been banging on about this forever, Tom, about this supposed panacea, this wonderful world in which you're going to get the detail of digital targeting with the quality of TV inventory. Are we at that point now, or where are we in terms of connected TV's progression?

Speaker 1:

So, yes, we are, absolutely, because if you look at CTV ad spend in the US, it's about 30% of TV advertising spend; it's a 70-30 split. And pretty quickly that's going to tip over. I think it's partly driven by advertising, but really it's driven by consumer behaviors. Consumers like streaming TV. It's cheaper than cable, and being able to watch what you want to watch when you want to watch it, you know, Netflix has paved the way in that experience. But there are a lot of ad-supported experiences as well; there's even a Netflix ad-supported tier now. Everyone's seeing that ad-supported is now a thing. It's more driven by consumers' desire to watch in the streaming environment than it is by advertisers wanting to do that.

Speaker 1:

But ultimately, what brands have to do is follow the audience, and there's a whole bunch of audiences that you cannot reach on linear TV and cable anymore. If you look at younger people, it was funny. The NFL has always been a staple of cable; you go and you watch it on your Fox or NBC. Then Amazon did a deal for Thursday Night Football. What was fascinating is the NFL has always skewed, certainly in recent years it's always skewed older. As soon as you have Thursday Night Football streaming on Amazon, you discover it's full of young people watching it. It wasn't that young people weren't interested in watching football; it's that young people didn't want to watch it on a linear TV network.

Speaker 2:

That's really, really interesting. So for the NFL, clearly that's a strategic decision, which is the type of thing I imagine the NBA, and whoever else's rights are coming up, will also be considering, in that if you want to keep the sport fresh for a younger audience, you've got to think about new distribution platforms.

Speaker 1:

That's exactly right, and I think that's the reason advertisers are following into CTV: because that's where that younger audience is going. If you're wanting to reach them, you have to buy CTV.

Speaker 2:

And, Tom, just on the point you made just now, that it's really been more the advertisers following the audience rather than the advertisers driving it. Why is that? Why were advertisers not keen? It seems to me, conceptually anyway, that connected TV or streaming TV is a great proposition for them.

Speaker 1:

Well, why would you want to do anything different? I mean, from an advertiser's perspective, if your current strategy is working, you want to continue doing that, and, yes, you're always going to have some budget that you can experiment with, that you can try on new things. But when you've got to make wholesale adjustments to your media plan, to shift hundreds of millions of dollars from one medium to another, there's always risk involved in that. And it goes back to that thing I was saying earlier, about me being one of the guys who wants to change everything: you actually need to be working together with the person who's going to slow things down a little bit. I think when you've got billions and billions of dollars being traded, there's a lot of inertia behind that, and you don't want to make changes that come back after you.

Speaker 2:

I see. And I guess, to be fair to everybody, there's probably inertia that's just inertia, because it's the way we've always done it. And then you've got issues like switching costs. You've got the CMO of big CPG company X who's going, okay, if I'm going to put, whatever it is, a couple of hundred million dollars against connected TV, it's going to cost me more. There's the opportunity cost of the people needed to set that up. Where's the return? I'd imagine that that's also the argument.

Speaker 1:

I think if you look at what people like NBC have done with Peacock, they will sell NBC and Peacock together in a package. So I think that's lower friction for a brand than if they have to decide, am I going to need to buy on Netflix as well? Because actually they're buying from the same person; they're just saying, okay, well, you shift some of those linear dollars onto streaming.

Speaker 2:

And then is there any kind of make-good provision between the two different outlets? So if they're not getting what they thought they were going to get on linear, they get made up on connected or streaming TV, or vice versa?

Speaker 1:

I don't know how widespread that is, but I certainly know it has occurred.

Speaker 2:

Interesting. And what about the, not necessarily newer entrants, but somebody like the YouTubes of the world, who I think of as more small-screen, mobile-first content providers, that weren't really in this game and now are heavily, heavily in this game?

Speaker 1:

Yeah, absolutely. It's funny, I've had several debates with people over the last year about whether YouTube is TV. It is a big-screen medium. You just have to go and ask anyone who's got a big YouTube channel to share their stats with you, and you'll see that probably over half of their viewing is on the big screen. And that'll be everyone, from things you'd expect to be business-to-business down to people who are showing stuff that you would expect to be on the big screen. YouTube is bundled with every smart TV. It's on every streaming stick you plug in. It's on every set-top box. It's absolutely a big-screen proposition, and that's how people are watching it.

Speaker 2:

And so it's one thing just taking a linear TV ad and putting it in the streaming environment, but that's probably not going to work as well, I would have thought anyway, as something that's native and actually designed for the different environments. And then you've got many different types of environments and user experiences. So how's that all working?

Speaker 1:

Yes, there's definitely more experimentation with different ad formats on CTV. We quite often see the QR-code type one. You've also got pause ads on some platforms. People are experimenting with different things, and I think you can also A/B test in the wild a little bit more easily. But there are also problems. Frequency is quite often a problem on CTV, because the advertising is sold by so many different people.

Speaker 1:

So, let's say I want to be targeted by Dashwater. They've worked out I drink their drink and they want to target me. They're going to put out the bids to target me, and all of the different streaming service providers I use will say, oh yeah, we'll take a bit of that. All of the ad networks will say, yeah, we'll take a little bit of that. And before I know it, I'll have had 30 ads for the same thing. It's very hard as an advertiser to de-duplicate, because they don't have a unique ID for me; they just know I'm in the target profile, and there's no individual ID. So frequency has been a problem on CTV. The other problem is that there either seems to be a glut of inventory or not enough. There are certainly times when it's impossible to get on what you want to get on, but there are other times when there's just too much inventory, and that makes the frequency problem even worse.

Speaker 2:

And by inventory you mean inventory you can buy. Yeah, exactly. What are some of the ways in which the research and data world is beginning to solve these issues? I mean, this is something you've explained to me in the past, Tom, but I'm thinking about how you knit together the different data sets.

Speaker 1:

Broadly speaking, there are two different ways to do it. You can do it behaviorally, or you can do it by asking people questions, and most people tend to do a combination of both. If you look at the behavioral one first, the great thing about behavioral is that, for what it can measure, it gives you a complete picture. But it can't measure everything. So, for example, we can put what we call a pixel tag into a lot of advertising, and we can get the IP address of everyone who's been exposed to it. We can license, for example, Vizio's Inscape data, and we can see who's been exposed to the ad on a Vizio TV and what their IP address is. We can match that together with our pixel tags and start to see, okay, this is exactly the frequency these people have been exposed at. The problem is that doesn't cover every form of advertising. So, for example, exposures on Hulu you wouldn't get to see, exposures on any of the Meta properties you wouldn't get to see, any kind of search activity you wouldn't get to see. So that will give you a precise but incomplete view of who's seen your ad.
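
A minimal illustration of that behavioural join, on invented logs: pixel-tag exposures and licensed CTV exposure data are both keyed on IP address here, so combining them reveals a total frequency that neither source shows alone. Real pipelines also have to deal with shared and rotating IPs, and with the privacy rules around using IPs at all.

```python
# Joining two behavioural exposure logs on IP address (invented data).
from collections import Counter

pixel_log = ["1.2.3.4", "1.2.3.4", "5.6.7.8"]           # digital pixel fires
acr_log = ["1.2.3.4", "9.9.9.9", "1.2.3.4", "1.2.3.4"]  # CTV ad exposures

freq = Counter(pixel_log) + Counter(acr_log)  # combined household frequency
for ip, n in sorted(freq.items()):
    print(ip, "saw the ad", n, "times across digital + CTV")
# 1.2.3.4 comes out at frequency 5: the over-exposure is only visible
# once the two logs are joined; neither source alone shows it.
```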

Speaker 1:

The other approach is you do some kind of recall-based survey initiative. So, for example, we've got this Brand Effect survey at MarketCast, where we talk to about 3.2 million people a year and we ask them questions about whether they remember seeing these ads. We've got other survey-based solutions as well, but it's similar in that you're relying on people recalling the ad. Now, the problem with this is that recall is highly influenced by how good the creative is and how many times you've seen it. It's also impacted by how well known the brand is: you're more likely to remember seeing a Coke ad than a Dashwater ad. So the advantage of survey approaches is you can cover everything, including search ads and Hulu, but you're slightly imprecise, in the sense that you're relying on human memory. So what most companies end up doing is a combination of both. You do something which is behavioral-based to give you full information on what that covers, and you do something which is survey-based to give you a full market view, albeit slightly less precise.

Speaker 2:

But how do you then go further in terms of tying those two data sets together, when you're also trying to go down the funnel and see what the outcome is? What are some of the techniques used to do that?

Speaker 1:

Absolutely. So, knitting the two together, you can either do it using that IP graph, if you've got the IPs of respondents, which you can do if the survey is online: you get the IP, that's the respondent, and if your respondents are the same people who are in your device graph, that's all great; you can knit it together that way. And, of course, the other way you can do it is just using a more standard fusion technique, where you say, okay, here's a cell of white women from Missouri in my survey data set and white women from Missouri in my behavioral data set, and you can tie things together that way.
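
Here is a toy sketch of that second, cell-based fusion technique, on invented records: both data sets are aggregated into demographic cells (the "white women from Missouri" example) and joined on the cell key rather than on any individual identifier.

```python
# Cell-based data fusion: no individual match, just matching cells.
survey = [  # respondent-level survey records (invented)
    {"gender": "F", "state": "MO", "recalls_ad": True},
    {"gender": "F", "state": "MO", "recalls_ad": False},
    {"gender": "M", "state": "TX", "recalls_ad": True},
]
behavioural = [  # device-level exposure records (invented)
    {"gender": "F", "state": "MO", "exposures": 4},
    {"gender": "F", "state": "MO", "exposures": 2},
    {"gender": "M", "state": "TX", "exposures": 1},
]

def by_cell(rows, field, agg):
    # Group rows into (gender, state) cells and aggregate one field.
    cells = {}
    for r in rows:
        cells.setdefault((r["gender"], r["state"]), []).append(r[field])
    return {c: agg(v) for c, v in cells.items()}

recall = by_cell(survey, "recalls_ad", lambda v: sum(v) / len(v))
freq = by_cell(behavioural, "exposures", lambda v: sum(v) / len(v))

for c in sorted(recall.keys() & freq.keys()):
    print(c, f"avg frequency {freq[c]:.1f}, ad recall {recall[c]:.0%}")
# ('F', 'MO') shows avg frequency 3.0 and recall 50%: the two sources are
# now comparable within a cell even though no individual was matched.
```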

Speaker 1:

In terms of outcomes, digital has grown a lot based on attribution. Google famously introduced last-click attribution, which means that if you click on a Google ad just before you convert, Google takes all the credit for the ad. A brilliant bit of marketing from Google. And that works basically because Google can see that you came through, they can see you saw the ad and clicked on it, and you can put stuff on the page so you can work out whether you converted. There are a bunch of other behavioral approaches you could take around multi-touch, and they all involve tying things together based on some kind of device graph, so it could be an IP address, it could be a mobile ad ID. And this is a whole industry that Apple is aggressively trying to manage out of existence.
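
A small sketch contrasting last-click with a simple linear multi-touch model over one invented conversion path (the touchpoint names are hypothetical): the same path yields a very different story depending on the credit model.

```python
# Last-click vs. linear multi-touch credit over one conversion path.
path = ["CTV ad", "social ad", "search click"]  # then the user converts

def last_click(path):
    # All credit to the final touch: the model Google made famous.
    return {path[-1]: 1.0}

def linear(path):
    # Equal credit to every touch in the path.
    return {touch: 1 / len(path) for touch in path}

for name, model in [("last-click", last_click), ("linear multi-touch", linear)]:
    print(name, model(path))
# last-click hands 100% to the search click; linear gives the CTV and
# social ads a third each. In practice the hard part is stitching the
# path together at all, via a device graph (IP address, mobile ad ID).
```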

Speaker 2:

Yeah, that was one of the obvious follow-on questions: the extent to which all of this is privacy compliant, or privacy forward. I guess it is, in that everybody's opted in at some level.

Speaker 1:

In theory, everyone's opted in. But I would say there's definitely a move toward media mix modelling, which is the more survey-based approach, and it's also a slightly more econometric approach, as opposed to the deterministic "were they exposed, did they convert" one. I think there's a whole bunch of categories where media mix modelling and econometric solutions are more appropriate than ones which are pure attribution-based. For example, CPG purchase data: you used to be able to license that in the US.

Speaker 1:

You can't license it in the US anymore on a one-to-one match basis. And if you look outside of the US, matching on IP addresses is more problematic. Some people will do it on a hashed IP address, but a lot of people take a more conservative view on the privacy side and decide they're not going to do it. So we're entering this world where there's a degree of deterministic attribution, but actually people are relying a lot more on statistical methods now, because if you're going to take the deterministic approach, you've got to have some form of deterministic data.
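
To make the econometric alternative concrete, here is a stripped-down sketch of a media mix model on invented weekly data: sales are regressed on adstocked channel spend, with no user-level identifiers anywhere. The carryover rates and coefficients are made up, and production MMMs add saturation curves, seasonality and often Bayesian priors; this is a minimal sketch, not anyone's production method.

```python
# Minimal media mix model: aggregate time series only, no IDs.
import numpy as np

rng = np.random.default_rng(0)
weeks = 104
tv = rng.uniform(0, 100, weeks)      # weekly TV spend (invented)
digital = rng.uniform(0, 50, weeks)  # weekly digital spend (invented)

def adstock(spend, carryover):
    # Geometric carryover: this week's effect includes a decaying tail
    # of previous weeks' spend.
    out = np.zeros_like(spend)
    for t in range(len(spend)):
        out[t] = spend[t] + (carryover * out[t - 1] if t else 0.0)
    return out

tv_a, dig_a = adstock(tv, 0.6), adstock(digital, 0.3)
sales = 200 + 1.5 * tv_a + 3.0 * dig_a + rng.normal(0, 10, weeks)  # ground truth

X = np.column_stack([np.ones(weeks), tv_a, dig_a])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"base {coef[0]:.0f}, TV {coef[1]:.2f}, digital {coef[2]:.2f} per dollar")
# Recovers roughly (200, 1.5, 3.0): per-channel contribution estimated
# purely from aggregate weekly data.
```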

Speaker 2:

That you can match against. And if you can't get that, you've got to go back to some of the old methods.

Speaker 1:

Absolutely, absolutely. And also, I think you can't just do lower-funnel advertising. Your lower-funnel advertising is what drives your conversions; that's always been your kind of direct response: call now and I will give you the special offer on the car. But if you look at someone like Coke, they spend most of their money on brand advertising.

Speaker 2:

Is anybody, as far as you're aware, trying to tie brand-effect advertising, and maybe I'm playing this directly into your hands, given that I think that's how one of MarketCast's products is described, but tying brand advertising down to the lower funnel and showing that if I can create this level of brand recall, or get these brand attributes up to a certain level, it has a direct effect on the bottom line? Is anybody managing to solve that conundrum?

Speaker 1:

Well, funnily enough, and I know you didn't set me up, we did just launch our Brand Tracking Plus product recently. As you know, I've always been more of a data guy and a software guy, and I guess my role at MarketCast is to try and take things that have historically been survey-based and research-based and make it so they use a combination of different data sets. One of the classic problems you have with survey data is that if you ask people about their intention to purchase, it doesn't necessarily correlate with whether they do purchase or did purchase. And if you ask people whether they purchased something, actually half the time they're wrong, because they misremember what brand they bought, et cetera.

Speaker 1:

So that's the kind of stuff where you really need to be bringing in behavioral data sets at the very bottom of the funnel and then combining them with mid- and upper-funnel stuff. For the last three years in my post I've been working on lots of things like that. Brand Tracking Plus is one of the ones we just happen to have launched, literally in the last month, so thank you very much for raising that.

Speaker 1:

I'm very proud of the work that's gone into that.

Speaker 2:

I should say, full disclosure, I'm still a small shareholder in MarketCast, but not operationally involved. But I do want to understand a little bit more about this. How does Brand Tracking Plus work, and how does it tie together the consumer funnel?

Speaker 1:

So basically, there are two main components to it. We built a fandom model, which is a basic kind of model of how brand activity works, and we have different drivers for that, so we can help explain the story of how the brand is working now.

Speaker 1:

But what funnels up to that is, yes, absolutely, survey data, but it's also other data sets. So, social monitoring: we pull in all of the Twitter data, we can look at TikTok data, Reddit data, et cetera, so we can see what people are saying about the brand. That's a great real-time indication of interest, and people talking about things and asking questions is a great indication of intent. Google search data is another great indication of intent, and we've got that in real time. And the other thing is then integrating in brands' first-party data. All of these consumer brands are building up mountains of first-party data. I think one of the key things is, let's not, as a research company, pretend to be operating in isolation. Let's work with our customers to integrate their data and our data together, because that tells the complete story.

Speaker 2:

That makes an enormous amount of sense on both points. And playing back the brand tracker: it sounds like there's a certain sort of model schematic around what MarketCast believes is important to track in terms of what drives fandom, and you're tracking those components, and I assume that will vary slightly for a given client, and then you're building in multiple data sets on top of that to actually see whether those perceptions and those brand attributes are moving, not just relying on survey.

Speaker 1:

That's exactly right, though I'd probably go a little bit stronger than saying these are the different components that we think make it up. From all of the years of data we've got and all of the extensive research we've done, when you start to include non-survey data, these are demonstrably the things that drive the value of your brand and fandom.

Speaker 2:

So, Tom, are you able to give me any sense of the components that you believe a brand should be tracking, the elements of fandom, or is that proprietary and part of the secret sauce?

Speaker 1:

No, no, it's based on the research that we've done. Basically, we've found fundamentally that there are three different things. We talk about them as presence, distinction and relevance. The first one is how much is the brand in the market and how much is it breaking into people's consciousness? How much do they recognize it?

Speaker 1:

Presence is very much survey-driven, but we're also looking at what's the media spend, what's out there, and how much people are talking about it, seeing it, remembering it. That's presence. The second one is distinction. How much is the brand unique? How much does it stand out? Is it trustworthy? A brand might have presence, but if people don't remember it and it doesn't stand out, then it's not going to be effective. Again, we can see a lot of that in social media: are people talking about it? Then the final one is relevance. A brand might be out there, and you might understand that it's distinctive, but you might not feel it's for you. The best brands trigger this feeling in everyone that they're relevant for them. We can see that not only through survey data, and that's really what drives consideration and intention, but we also see a lot of it in search data. So we tend to talk about presence, distinction and relevance. Those are the three components that come together to make up our fandom score, which is what really drives passion amongst consumers for your brand.
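
Purely as an illustration of how three pillars like these might roll up into one number, here is a hypothetical composite score; the input signals, normalisation bounds and equal weights are my assumptions, not MarketCast's proprietary fandom model.

```python
# Hypothetical composite brand score from three pillars (all inputs invented).
def normalise(value, low, high):
    # Map a raw signal onto 0..1 against assumed bounds for the category.
    return max(0.0, min(1.0, (value - low) / (high - low)))

pillars = {
    # presence: is the brand out there and breaking into consciousness?
    "presence": (normalise(0.62, 0, 1)             # aided awareness (survey)
                 + normalise(0.18, 0, 0.5)) / 2,   # share of voice (media data)
    # distinction: does it stand out?
    "distinction": (normalise(5400, 0, 20000)      # social mention volume
                    + normalise(3.9, 1, 5)) / 2,   # uniqueness rating (survey)
    # relevance: does it feel "for me"?
    "relevance": (normalise(71, 0, 100)            # search interest index
                  + normalise(0.44, 0, 1)) / 2,    # consideration (survey)
}

fandom_score = 100 * sum(pillars.values()) / len(pillars)
print({k: round(v, 2) for k, v in pillars.items()})
print(f"composite score: {fandom_score:.0f}/100")
```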

Speaker 2:

Yes, that's really interesting. I imagine in time you can evolve a model with each client as well, in terms of the type of scoring they should have against each pillar for their category. Then, if you've got their first-party data, as you said, you can also start to validate that in market, as in: you're a brand where this pillar is particularly important for you, and if we can see movements against these data strands, it's more likely to affect the bottom line.

Speaker 1:

Yes, absolutely. And what do you need to do if something's going wrong? If you lack presence, then you probably need to be spending more. If you lack distinction, maybe there's something wrong with your creative; you need to be a little bit more edgy perhaps. If it's relevance, you're not communicating; your messaging isn't quite right. Messaging is probably primarily about relevance: you know who your audience is, and if you're not resonating with your audience, you can think about what you need to change in your messaging.

Speaker 2:

Again, that makes a lot of sense. It also feels like a version of brand tracking that's a little bit more dynamic and actionable than a lot of the old solutions.

Speaker 1:

The key thing to me is it's got to be actionable and it's got to be fast. The main frustration that I see in the research area is that quite often a lot of this data arrives when it's too late to do anything about it. It's like, oh, I discovered at the end of the campaign that my consumers don't find it relevant. That's a bit disappointing, isn't it? We'll try and do better next time. A lot of what drives me, in a way, is how we can get survey results and big data results into people's hands faster, so they're not only actionable, but they actually come within a time frame where you can action them.

Speaker 2:

That's probably a very nice segue into the next subject we're going to talk about, in terms of the usage of AI, and AI in terms of efficiency and speed. I think there's also a sense, talking to brands and different research agencies, that for too long the research industry has made perfect the enemy of good. 80% there is, in many cases, good enough, because brands need to make decisions and they need to get on with it.

Speaker 1:

It's funny, I've actually got a post coming out later this afternoon where I talk about exactly that point. Survey data is crucial because it tells you about what goes on in people's minds, but if it arrives too late to be acted upon, then we're forcing marketeers to use imperfect signals to optimize on. The real opportunity for us as a research industry is to get that data really, really quickly. If you think about something like a copy test, where you want to get 200 respondents, you can get 200 respondents within an hour. So why does it take three weeks to get my copy test results? It's because there's time wasted up front when you're setting it up. There's time wasted when you're testing it. We're probably not targeting sufficiently when we're in field, or we're going for a segment which is too granular. We might have made mistakes and have to re-field. Then we're wasting time doing things manually afterwards, having to go through a bunch of data processing and reporting by hand.

Speaker 1:

For me, a research analyst should be thinking a lot about what questions they want to be asking people and how their minds are working, and they should be thinking a lot about what these results mean and how a brand can make adjustments in order to improve its performance. Whereas actually, at the moment, what they're spending far too much time worrying about is: how do I get this damn table working in Excel? How do I get this survey programmed in this way? And I think the promise of AI is that we can take a lot of that out of the research process and focus on the value-adding areas. If we can do that and really do a survey within an hour rather than weeks, then, first of all, whoever gets there first will have a massive first-mover advantage, but also, I think, research becomes so much more valuable again.

Speaker 2:

Yes, as you say, it actually creates the space and the capacity for humans to add the value on top. It's fascinating.

Speaker 1:

That's absolutely right. One of the things I've done a lot of in the data science area is predictive models, and one of the metrics we use is: is my model better than random? Because if you want to know which category someone goes into, basically random is your baseline. If you've got a model that's 10% better than random, that might sound like it's absolutely useless, but if the only alternative the client's got is random, then 10% better than random is an awful lot; it can have massive impact on their business. When we're building machine learning and predictive models, we're not always trying to shoot for, let's get it right 90% of the time. What we've got to do is just get it right better than it is at the moment, so that we can have more positive business impact.
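
A quick sketch of that "better than random" yardstick on invented data: a weak four-class model that is right only about 27-28% of the time still beats the 25% random baseline by roughly 10%, which is exactly the kind of lift Tom is describing.

```python
# Comparing a weak classifier to the random baseline (invented data).
import random

random.seed(7)
classes = ["A", "B", "C", "D"]
truth = [random.choice(classes) for _ in range(10_000)]

# Baseline: pure guessing. Weak model: a small edge over guessing.
random_preds = [random.choice(classes) for _ in truth]
model_preds = [t if random.random() < 0.034 else random.choice(classes)
               for t in truth]

def accuracy(preds, truth):
    return sum(p == t for p, t in zip(preds, truth)) / len(truth)

base, model = accuracy(random_preds, truth), accuracy(model_preds, truth)
print(f"random {base:.1%}, model {model:.1%}, lift {model / base - 1:.0%}")
# ~25% vs ~27.5%: "only" about 10% better than random, but if random is
# the client's only alternative, that lift can still move the business.
```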

Speaker 2:

Yeah, that makes a lot of sense. What was the phrase you used to use? In the land of the blind, the one-eyed man is king.

Speaker 1:

That's exactly right. In the land of the blind, the one-eyed man is king.

Speaker 2:

Fantastic, Tom. I'm conscious of time and that we should probably start to move on and let you go, so let's wrap up. Just a couple of final questions, then I want to do a quick-fire round, if that's all right. You've talked a lot about the importance of maintaining a human component in measurement, rather than just relying on, say, A/B testing: cutting lots of crazy creative, A/B testing it and seeing what works. Why doesn't that work, in your view? Why do you need a human component in measurement?

Speaker 1:

I'm so delighted that I can answer this question with someone who's got a UK audience. My favorite example of this one is the gorilla playing the drums ad for Cadbury's Dairy Milk. Most people in the UK would recognize that if they saw it. It was a huge hit for Cadbury's, and it was just so random. That ad would never have tested well with an AI looking at it. It didn't look like anything that came before it; it didn't look like anything that came after it. There are lots of ads like that, lots of ideas.

Speaker 1:

The best creative ideas an AI is never going to think of, and it's also never going to work out whether they're going to work. It's never going to work out what's really bad either, where something is just jarring culturally. Sometimes it takes humans seeing it to go, no, that makes me feel a bit squeamish. What AIs tend to do is regression to the mean, which is basically, let's try and do the standard stuff. It's kind of the enemy of great. So I think we should be thinking less about whether we can get an AI to test things. For sure there are uses for AI in the creative process, but if we're going to use AI in the creative process, we need to make absolutely certain that a human has a look at it afterwards. If you're investing even tens of millions of dollars in an advertising campaign, you don't want to put it in field unless you've had a good number of humans looking at it. You're not trying to sell products to machines yet, so I think you need to have people in the process.

Speaker 2:

Tom, as mentioned, conscious of time, I'm going to just wrap up with a few final questions. The first quick-fire question is a slightly cheeky one, if you don't mind me asking, but I'm interested to know: what would your partner say are your best and your worst qualities?

Speaker 1:

My best quality would definitely be my full head of hair.

Speaker 2:

This is not a video podcast. I'll put up a photo of Tom when we promote it.

Speaker 1:

My worst quality would be... sorry, could you repeat the question? I wasn't listening. What she would say is my worst quality is that I constantly say, sorry, I wasn't listening, could you repeat the question?

Speaker 2:

That sounds entirely accurate. A slightly more serious one, though it doesn't have to be: if you were a mid-sized research company, or even a larger research company like MarketCast, and you were hiring a CTO, what type of interview questions would you be asking them?

Speaker 1:

The first thing I would always make sure of is whether they still cut code and keep themselves up to date. I think a lot of people try to run technology teams without cutting code themselves, and I think the simple reality is that cutting code nowadays is totally different to cutting code four years ago. If you're hiring a CTO who hasn't written code for 20 years, you're going to find that he or she is managing development teams as if they were developing code 20 years ago. So I would always ask about that. I think also the research industry has a bunch of peculiarities, and if you're not hiring someone with the domain knowledge, then I think you need to understand how quickly they're going to be able to get up to speed on it.

Speaker 1:

There are complexities about how we code data. Survey data is not something that's very easy to structure; it's not well-structured data. You've got peculiarities around the way we have deeply nested data sets and how you deal with things like that. Those would be the kind of areas. Basically, I would ask: are you still hands-on technically? And I'd need to get some confidence in how quickly they're going to get up to speed on the domain.

Speaker 2:

Thank you, Tom. That's very good advice. Final couple of questions. If you could have one or two guests for dinner, who would they be and, even more importantly, what would be on the menu? What would you eat and drink?

Speaker 1:

Well, if I could only have two guests for dinner, probably my wife and my mother, and we would probably eat caviar with homemade chive blinis. I guess we'd probably drink a bottle of Harrow and Hope Blanc de Noirs.

Speaker 2:

That was a very politic answer, Tom. Final one: what's your favourite or most impactful book, or recent book or piece of media? It could be a movie or something like that, or a TV show or podcast or whatever.

Speaker 1:

I mean, Star Wars obviously had a massive impact, as I think it did on everyone of my age, but I guess the one would be Small is Beautiful by E.F. Schumacher. I read that when I was a teenager, and although I'm probably not the strongest environmentalist on the planet, I think the whole idea of sustainability and how you stop things growing out of control has been something that I've thought a lot about throughout my career.

Speaker 2:

Tom, thank you so much. It's been a pleasure, as always, talking to you. I'll see you in a month or so. It's been fabulous.

Speaker 1:

Thank you, Henry.

Speaker 2:

Always so much fun talking to Tom. I've borrowed, and probably garbled, many of his lines in the past, and it'll be fascinating to see how his belief that AI in creative development is potentially "the enemy of great", to use his phrase, plays out. Whatever the case, it's certainly got to be healthy to have a dose of realism and cautious skepticism amid all the hype. Thanks once more to Insight Platforms for their support. I'm releasing the podcast every couple of weeks now. Next up, I'm planning to dive more into the world of Samsung with Rupesh Patel, and then, jumping onto a different tack, to look at the appetite for investment and acquisition in the sector with Tony Wolford of Green Square. Thank you for listening, and see you next time.

Guest intro
Tom's secret career as a punk
From software developer to launching 3G with T-Mobile
Opportunities & Challenges within Connected TV
Principles of merging data sets
Implications for privacy & media mix modelling
MarketCast's new approach to brand tracking
Speeding up research results/using AI effectively
Why we still need a human dimension
Quickfire round