Episode 47 of the Influencer Marketing Lab podcast where Ian Forrester of DAIVID talks effective influencer marketing measurement

More...

The Influencer Marketing Lab podcast is sponsored by Tagger, the data-driven influencer marketing platform and social listening tool.

Show notes

Welcome to episode 47 of the Influencer Marketing Lab - a weekly podcast tracking the growth spurts and growing pains of influencer marketing.

This podcast is sponsored by Tagger, the data-driven influencer marketing platform and social listening tool.

This week I'm in conversation with Ian Forrester, founder and CEO of DAIVID, an advertising research company building a ‘house of influencer insight’ proposition.

We discuss influencer marketing's WHAT metrics (impressions, reach, likes, shares, comments, site visits and sales) and then explore why these should be augmented with the WHY metrics of attention, emotions and memory creation, shifting data collection towards insight, learning and improved effectiveness.

Check out the Influencer Marketing Lab for full show notes and related useful links.

And I want to hear your feedback. Google Influencer Marketing Lab podcast and either drop me a line or leave a voicemail message directly from the landing page.

Speakpipe Influencer Marketing Lab

Ian Forrester biography

Ian is Founder and CEO of DAIVID, an advertising research company which uses a suite of advanced technologies, including facial coding, eye tracking, machine learning and computer vision, to help advertisers optimise the attention captured by, and emotional impact of, their creative.

Ian cut his teeth at L’Oréal, Nestlé, Tate & Lyle and Sony Pictures before joining Unruly in 2012. He was the first Insight hire for Unruly, going on to lead a global team that helped brands optimise video creation and distribution, and create several world-first insight products. These included ShareRank – a tool which forecasted virality of videos, EQ – a video testing technique focusing on emotions, and Cultural Connections - a product which predicted how videos would perform across countries. Ian created Cultural Connections in partnership with MediaCom and Hofstede Insights.

More recently, Ian brought his vision, innovation, analytical and data-storytelling skills to Whalar in the influencer marketing space. He is in the process of running world-first research projects and creating a unique ‘house of influencer insight’ positioning for the company by pushing the boundaries of research in influencer marketing.

Ian’s expertise is in high demand; he has been featured by BBC World, Sky News, Warc Admap, MRS Impact and MediaCom Blink. He has also been a judge for the WARC Media Awards and the IPA Effectiveness Awards.

Outside of the world of insight and advertising, Ian’s other love is pugs. It’s a constant source of frustration that there aren’t more pugs in advertising. He lives in London with his faithful pug Lola, who he describes as “the embodiment of everything that's good about the world - kindness, affection, silliness and love!”

Ian Forrester -- Influencer Marketing Lab podcast

Useful links

Transcript

Ian Forrester  03:24

NOTE: This transcript was produced using AI technology. It has been edited slightly.

Thank you Scott, it's a pleasure to be invited on to such an illustrious podcast.

Scott Guthrie  03:29

I don't know how illustrious it is, but it's kind of you. Let's get stuck in straightaway, shall we? We're here to talk about measurement and evaluation and metrics. So what metrics are influencer marketers using to measure their campaigns? And why in your opinion, are those metrics incomplete?

Ian Forrester  03:50

So, there are loads of metrics being used to measure influencer campaigns these days. You've got media metrics, which are metrics like reach, impressions and views from the platforms. You've got social metrics, of course: likes, shares, comments, engagements. You've got brand lift metrics as well, so that's control-versus-exposed shifts in awareness, consideration and purchase intent. And also commercial metrics.

So website visits, app downloads, sales, metrics like that. And it's not as if these metrics are bad in any way. It's just that they're a little bit incomplete, because all of these metrics are telling us what is happening, but not why. And until you understand the why, it's very hard to learn from what you've done in the past and improve going forward. So a simple example would be view-through rate as a media metric. View-through rate tells us the percentage of people who viewed a video who went on to complete it. And so we might have a bunch of data which tells us these videos have great view-through rate and these videos do not have great view-through rate.

So that is the what is happening: these videos are being viewed to the end, whereas these videos are not, which is obviously useful to know. But until you understand why, why was this video set being viewed to the end and the other not, you're not going to learn from that behaviour, you're not going to change going forward. And so that is what we focus on at DAIVID: bringing the why to the whats, which are already being measured.

Scott Guthrie  05:15

Okay, so the what metrics are, or at least they should be, the table stakes of campaign measurement. They're the impressions, the reach, the awareness, the consideration, you know, the likes, the shares, the comments. And there's nothing wrong with that. That's the beginning. That's not the end. The end should be around capturing the emotion. Is that what you're saying?

Ian Forrester  05:35

Yeah, so I'm not suggesting for a second that we should stop measuring any of those things. They're all useful, and they're all telling us something about the campaign and the impact that it's having. But until we understand why, we don't generate insight. We need to move beyond measurement, which is a box-checking exercise: my campaign did this thing. Really useful, totally necessary. But that doesn't generate any insight, because you don't understand why. When you understand why, you generate insight, which can then be used to set strategy going forward. And the why metrics that I'm talking about here come down to attention, emotions and the memory structures created. These are the set of why metrics which we focus on.

Scott Guthrie  06:21

Okay, so understanding how your campaign performed on these elements generates an insight, not just data. That insight leads to learning, and learning leads to improved effectiveness. That's, in essence, what the why metrics are. Okay, we're learning. Thank you very much so far. And so what should we be measuring? You mentioned them very briefly, but let's dig into this a little bit. How can we augment the what metrics? Tell us more about what these why metrics are.

Ian Forrester  06:52

We have spent a tonne of time and energy at DAIVID understanding why content is effective. And to do that, we've created this effectiveness process. We have done a meta-analysis within advertising research, so we've looked at studies from both academia and from industry to understand what the latest thinking is around why content is effective. We've gone super deep here; we've got a couple of academics on the team. So we've done an academic literature review, and continue to do so, because knowledge in this space is always evolving, so we need to keep tapping back into academia to see what those guys are finding.

So that's a continuous process, as is our review of research going on within the advertising industry as a whole. So we look at the IPA and the IAB and Nielsen and McKinsey and Ehrenberg-Bass and Unruly, and all these organisations who are doing really awesome, really interesting individual studies in advertising research. But then we take this helicopter view to see how all of the studies fit together. Because, just by its nature, it's very rare to get a research study which is super, super broad, simply because it's not normally financially viable to do so; there are data limitations, and so on.

So each individual study is useful in its own way. But it's only when you take the helicopter view, to see how they all fit together, that you can put together this effectiveness process. And that's what we've done. I'll walk you through how content is effective in the wild. So what have we learned from this meta-analysis? First of all, to be effective in the wild, content needs to capture attention. And of course, attention is the major buzzword in the industry right now; it's all anyone's talking about following Cannes. And absolutely right, because without attention the effectiveness process doesn't work. If people are not paying attention to your content, then you're dead in the water. It's absolutely critical. So attention is super, super important.

And it's a really fundamental part of what we do at DAIVID. However, attention is only the first hurdle that you have to overcome; it's definitely not the be-all and end-all. Attention gets you into the room. And once you've captured attention, you then need to do something with it. And by something, I mean evoke an intense, positive emotional response, because that emotional response creates a memory structure, which then drives action.

The problem with a lot of commercial content is that, from an emotional point of view, it's just about average. So you watch it and you think, oh yeah, that was okay. It's like a four or five out of 10. The issue with that, from the brand's point of view, is that two seconds later it's forgotten. So the memory structure is not created, and therefore the action is not driven. But when a brand can elevate its emotional response to an eight, nine or 10 out of 10, the viewer remembers the way the ad made them feel and attaches that feeling to the brand. And it's that feeling which then nudges that person to do something at a later date, which the brand is asking them to do. So let's take the example of someone in a grocery store, choosing between laundry detergents.

They look at one detergent, feel nothing. Look at another detergent and, in that moment, the emotion which has been attached to that brand returns, because the memory structure has been created. People look at that brand and feel that emotion. And that emotion, in that moment, nudges that person to purchase that brand over its competitor. And that's how this effectiveness process works: by delivering attention and emotions at scale, across the course of millions of impressions.

Scott Guthrie  10:20

So if I can sort of condense what you've articulated, you're saying that the effectiveness process can only begin if there's attention. But at the moment, we as an industry are pretty much looking at attention as the be-all and end-all. You're saying attention is crucial, but it's the starting step. Attention then leads to emotion. And if that content is imbued with high levels of emotion, that leads to higher levels of recall, subconsciously or otherwise, which leads to action.

Ian Forrester  10:54

Exactly right. Yeah.

Scott Guthrie  10:56

Absolutely. And DAIVID, are you just tracking that? Or are you helping brands understand how they can improve future campaigns? How does it work? Talk me through DAIVID's AI, and how it allows you to deliver attention and emotion data at scale.

Ian Forrester  11:14

The three core pillars of DAIVID are attention, emotions and AI. In terms of the attention and emotions, we are measuring them in the deepest, most granular way that we can, and our AI enables us to deliver that data at scale to our clients. How does that work? Well, we have a human training dataset to feed the AI. Within that training dataset, we're showing content to humans, and from those humans we are measuring attention and emotions. So this is the core, fundamental part of the AI: the data which we feed it is a really deep attention and emotion dataset.

Within that training dataset, we're getting content and showing it to people. We're asking those people to turn on their webcams as they're watching the content, which means we can film their faces as they're watching it, which allows us to capture certain metrics.

One, from an attention point of view, is their face turned towards the screen or not? Are they looking at the screen? You can see that when you're filming their face as they're watching the content. You can also see the point at which most people are tending to turn away. At what point is the drop-off for a particular piece of content? Do people get through the first five seconds and then drop off? Do they get nearly to the end? Do they get to the end, and so on? So that's one of our core attention metrics: understanding whether people are looking at the screen. The other thing we're measuring is where people are looking on the screen, and that's using eye tracking.

So again, by filming people's faces as they're watching content, you can see the direction of their eye gaze. And by calculating the angle of the eye gaze, you can essentially see that most people are tending to look at a particular point on the screen. That then gets visualised using a heat map, where the stronger the colour, the more people are looking at that particular point on the screen. So in terms of attention, we're capturing two things: are people looking at the screen, and where are they looking on the screen? In terms of emotion, we're measuring in two ways as well.
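As a rough illustration of the heat map idea Ian describes (a hypothetical sketch, not DAIVID's actual pipeline), estimated gaze points can be binned into a screen grid, with the most-populated cells becoming the hottest regions:

```python
from collections import Counter

def gaze_heatmap(gaze_points, grid_w=16, grid_h=9):
    """Bin normalised (x, y) gaze coordinates (0-1) into a screen grid.

    Each viewer's estimated gaze point lands in one cell; the per-cell
    counts drive the heat map colour (more viewers = stronger colour).
    """
    counts = Counter()
    for x, y in gaze_points:
        col = min(int(x * grid_w), grid_w - 1)
        row = min(int(y * grid_h), grid_h - 1)
        counts[(row, col)] += 1
    return counts

# Three viewers fixate near screen centre; one looks at the bottom-left.
points = [(0.51, 0.52), (0.50, 0.50), (0.52, 0.49), (0.10, 0.90)]
heat = gaze_heatmap(points)
hottest = max(heat, key=heat.get)  # the cell most viewers looked at
```

A production system would smooth these counts into a continuous colour overlay, but the aggregation logic is the same.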

One is with facial coding: filming people's faces as they're watching content and, in this case, picking up expressions which people are making during different parts of the video. There are six universal human expressions which are being picked up: smile, fear, disgust, anger, surprise and sadness. And the great thing about facial coding is that you can see where the peaks and valleys of emotional response are, because you can see where people are expressing. So you can see where most people are smiling in a video, for instance; that will be the emotional peak.

And when we visualise facial coding, you get this trace of response, which is really cool, allowing us to pick up the moments which are working or not working in a particular piece of content. That is facial coding's great strength. Where facial coding is not so good is in allowing us to understand which emotions are being evoked, because we're only picking up those six expressions, but there are loads more than six emotions. We put a tonne of time and energy into understanding which emotions we should be testing for.
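The trace of response can be pictured with a toy calculation (an illustrative sketch, not DAIVID's implementation): for each second of a video, compute the share of panel viewers showing a given expression, and the maximum of that series is the emotional peak:

```python
def expression_trace(frames):
    """frames: one list per second of video, each entry a bool per viewer
    (True = that viewer is smiling during that second).

    Returns the fraction of viewers smiling each second; the highest
    value marks the emotional peak of the content."""
    return [sum(viewers) / len(viewers) for viewers in frames]

# Toy four-second clip: smiles build to second 3, then fade.
frames = [
    [False, False, False, False],  # second 1: nobody smiling
    [True, False, False, False],   # second 2: 1 of 4 viewers
    [True, True, True, False],     # second 3: 3 of 4 viewers
    [True, False, False, False],   # second 4: 1 of 4 viewers
]
trace = expression_trace(frames)
peak_second = trace.index(max(trace)) + 1
```

The same per-second aggregation works for any of the six expressions, giving one trace per emotion.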

Again, we've gone deep into academia. We've built on the work that Unruly did over the years, where those guys tested for 18 psychological responses; we built on that work and brought in the latest thinking from academia. Long story short, we've arrived at 39 emotions that we test for, both positive and negative. So these are things like admiration, anxiety, embarrassment, nostalgia, pride, relief, romance, trust, warmth, and so on. The reason we've put so much time and energy into arriving at this list of emotions is because 90% of why content works or doesn't work can be explained by these 39 emotions. If content isn't working, it may be because it's boring people or confusing them; those are two emotions. Or more extreme negative emotions could be coming through: things like contempt, or disgust, or distrust, or horror, or guilt, or shame, or any one of these terrible things which were never intended to be evoked by the advertiser or the influencer. And yet they're coming through anyway. And we test content around the world on a daily basis.

And these emotions are coming through on a daily basis. As an advertiser, it's very useful to know if your content is evoking these emotions, because if it is, and you know about it, then you can do something about it.

On the flip side, content that is working is working because it's evoking positive emotions at an intense level. And that is gaining cut-through; it is lifting that particular piece of content above the morass of stuff which we're all bombarded with on a daily basis. It's making it stand out, because it's evoking that emotion intensely. And by intense, I mean eight, nine or 10 out of 10; it's got to be very, very intense to gain that cut-through. Also, the type of emotion is really, really important. Evoking an unusual emotion, which other players in your space are not evoking, can be a really great way to gain cut-through and stand out versus the pack. Because when you've got a different emotion, and you add intensity to that different emotion, then you've got a double-whammy effect, and great things happen.

Scott Guthrie  16:25

One alternative reason that the emotion isn't being induced by the brand deliberately is that they don't want the followers to be disgusted, I suppose. But at least it's identified.

To backtrack, Ian, there are three pillars, and we've talked about two. The first one is attention. That is where you turn on the webcam and you track that heads are pointed towards the screen and, more importantly, the head movement and where the viewer is looking on the screen. That sums up the first pillar? The second one we were talking about is emotions. You mentioned 39 emotions. How has that list been created? Who's created it? Or is it an amalgam of other people's lists?

Ian Forrester  17:09

That's a great question. So it is an amalgam of a bunch of individual studies within academia. Originally, we built on the original work from Unruly, which itself was based on work by Ehrenberg-Bass, and specifically Karen Nelson-Field. Back in the day, when we worked with Karen in 2012 to develop the science of sharing, we developed this emotional framework which contained 18 of what they called, and still do call, psychological responses. Essentially...

Scott Guthrie  17:36

She's from the University of Adelaide.

Ian Forrester  17:39

Yeah, she lives in Adelaide. She was with Ehrenberg-Bass, but she is no longer; she's now running Amplified Intelligence, which is a company focused on attention driven by media placements, doing really interesting work. But yeah, Karen kicked this off, and Unruly really took up the mantle and expanded it. And the emotional categorisation used by Unruly is great. I mean, look, I played a key role in developing that emotional categorisation. I used it for years; it generates loads of interesting insight. But frankly, the world has moved on since 2012, when that list was made, and academia has moved on. And so we wanted to bring the latest thinking from academia into the product here. I could talk about this all day, but I won't bore you with too much detail. Essentially, academics don't necessarily agree on what constitutes an emotion.

There's loads of grey area, loads of nuance. The language used to describe emotions differs from country to country. In Japan, for example, there are five different words for shame, because shame is a big part of the culture. So it's not necessarily an easy thing to arrive at a really strong list. But we've taken all of these academic thoughts and mashed them together into a list which we've tested independently: we've done our own factor analysis and cluster analysis to understand, within our data, what overlap there is among these emotions, and whether these emotions are, in our view, discrete.

So we've taken the theory from academics, and we've validated it with our own data and our own data science team to arrive at this list. Like I said, the point of doing all that was to be able to test for emotions in the deepest, most granular way that we can, because so much of advertising effectiveness comes down to the emotional response which people have when watching content. We're advertising to humans at the end of the day, and humans respond to everything emotionally. Normally that emotion is a bit muted; it might be between a four and a six.

You're just kind of bumbling along your day. Nothing's really pissed you off, nothing's really excited you; it's just kind of an average day, right, until you find a certain spike: someone cuts in front of you when you're in the car, or you get some great news, and you see these spikes. But normally you're kind of middling. Understanding that emotional response as it pertains to advertising, when the advertising can break you out of that rut between four and six, to make you remember something which will then influence your action later, that mechanism of action is critical to what we're doing at DAIVID, because that brings us to the why, essentially.

Scott Guthrie  22:36

You're now two thirds of the way through your pillars. We've done attention; you've now eloquently answered my questions about emotion. The third pillar is artificial intelligence. Is that right?

Ian Forrester  22:48

It is, yeah. So to feed an AI system, you need a training dataset, because essentially what AI is doing is taking an input and making a prediction based on that input. So in order to predict whether content is going to capture attention, which emotions it's going to evoke, and whether people are going to remember the brand, and to be able to make those predictions by audience, we need to feed the system two datasets.

One is the data I've just described: how are people responding to stuff. That's the attention, emotions and brand metrics which we're collecting as part of our performance dataset, or 'how are people responding to stuff' dataset. That's one side of the coin. The other side of the coin is what is happening within content to elicit that response. And to gather that data at scale, we're sending content to various computer vision APIs. These are services like Google Vision, Amazon Rekognition, Clarifai, and so on; there are a bunch of these services available these days. What these APIs allow us to do is categorise, on a frame-by-frame basis, both visually and aurally, what is happening within content. So you send a video to various of these APIs. You can't just tap into one and think you're done; you've got to tap into a bunch of them, bring those outputs in house and build on top of them to arrive at a really deep, frame-by-frame understanding of what is happening within that content.

Because when you've done that, you end up with two datasets: one dataset telling you what is happening in content, and the other dataset saying how people are responding to that content. The system then understands the connections between these two datasets. So I'll give you an example. Imagine a video with a scene where a bunch of people are walking through Piccadilly Circus. It's a busy day; there's a kind of crowd there. There's a girl who's wearing outlandish clothes, and there's traffic; there's a bus driving past. Just from that single scene, and just from a couple of frames, we can pick up all of those things that I just mentioned. So we're picking up the fact that there's a crowd present, there's traffic noise from the road, and there's a girl between the ages of 25 and 34 who is wearing bright colours. Now, all of these things become data points which we can correlate with outcomes. So maybe people are paying attention to the girl because of what she's wearing.

Or maybe people are looking at the crowd and feeling a little bit anxious, or perhaps a bit excited; that's their emotional response. The system is understanding the connections between these two datasets within our training data on a micro level. And then, on a macro level, all this training data is flowing into DAIVID. So we're making sense of the creative attributes which are present, what is happening in content, and the performance data, which is our facial coding, our eye tracking, our survey data, understanding the connections between these two datasets.
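One way to picture the link between the two datasets (a deliberately simplified sketch; DAIVID's real model is far more sophisticated) is a lookup keyed on creative attributes: each training row pairs the tags a computer vision API returned for a piece of content with the measured response, and a new asset borrows the response of the training items whose tags overlap most:

```python
def predict_attention(new_tags, training):
    """training: list of (tags, attention_score) pairs, where tags are
    the labels returned for that content and attention_score is the
    measured share of viewers who kept watching (0-1).

    Predict by averaging the scores of the most-overlapping examples."""
    scored = [(len(set(new_tags) & set(tags)), score)
              for tags, score in training]
    best = max(overlap for overlap, _ in scored)
    matches = [score for overlap, score in scored if overlap == best]
    return sum(matches) / len(matches)

training = [
    ({"crowd", "traffic", "bright-clothing"}, 0.82),
    ({"crowd", "night", "music"}, 0.61),
    ({"kitchen", "product-closeup"}, 0.34),
]
# New asset: a street scene with a crowd, a bus and bright clothing.
pred = predict_attention({"crowd", "bright-clothing", "bus"}, training)
```

The tag names and scores here are invented for illustration; the point is the shape of the data, not the values.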

So then we can take an asset from a client, and we can understand what's happening in that asset automatically, by sending it to various computer vision APIs, which essentially return a collection of stuff: this collection of stuff is happening in this video. And our system then says, okay, given that collection of stuff, we predict that this video is likely to capture attention within the first few seconds or not, the emotions it's likely to evoke, and whether people are likely to remember the brand or not. And we can cut these predictions by audience as well. So we can say an older guy will respond in this way versus a younger girl, who will respond in a different way. And so that is how our AI brings together all these datasets and allows us to predict attention, emotions and memory structure at scale.

Scott Guthrie  26:27

Well, thank you very much. That's certainly a full answer. I hesitate to ask the next question: what else are you working on? I mean, there are three fascinating pillars there, and I can see how they combine. They work very well from an influencer marketing perspective. How are you helping our industry at the moment?

Ian Forrester  26:44

Again, great question. I have a unique perspective on influencer marketing, given the fact that I worked at Whalar for three years, and I know the challenges which influencer agencies are facing when it comes to measurement, insight generation and setting strategy. And so, as part of the work we're doing at DAIVID, we're bringing this tech to bear on the issues which I know exist within influencer marketing. We've developed a number of products to help influencer agencies meet those challenges. The first, and most obvious, product is one called Evaluate, which is a creative testing tool we can use to test assets within a particular campaign and understand whether they're capturing attention and which emotions they're evoking.

How are they working for different audience members? When you run Evaluate on assets within a campaign, it's going to give you two things. One is proof points to take to the client and say: we know this content was effective, because it was capturing attention, because it was evoking these positive emotions, and people remembered the brand. So in addition to 'we reached 50,000 people', we can say, of these 50,000 people, 70% remembered the brand, 50% felt an intense positive emotion, maybe shifting brand perception, and so on. So it gives proof points. But it also, more importantly, generates insight, because you're learning from that content. If content is working, you're learning what was working about it.

And then you can repeat that success in the future. If it's not working, for whatever reason, or for a particular audience type, then you learn that as well. And again, you're learning for the future, which you can then use to set your strategy going forward. So it's a really key generator of insight, which then allows you to set strategy. Our second product is Audience. With Audience, we are optimising the attention captured and the emotions evoked by content, not by trying to change the content itself, but by using paid media to find the right audience for that particular asset. And the cool thing about influencer marketing is that it's very easy to target certain audiences using the tools which the platforms provide.

So you go on Facebook Ads Manager, and you've got all these targeting capabilities. Super, super interesting. But the question is: okay, we've got all this potential, who should we actually target to supercharge this campaign? Well, with Audience, we find the subset of the broad target audience who are responding particularly well to this asset. Any asset can have a broad target audience; let's call it 18-to-34-year-old males and females, for argument's sake. But that particular piece of creative is going to be working especially well for a subset of that group. It might be 18-to-24-year-old males who like golf, let's say. Audience identifies that audience for you, and then allows your paid media team to go out and target that group.
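The selection step can be sketched in a few lines (hypothetical segment labels and a single toy metric; the real product works on much richer response data): score each sub-segment of the broad target audience by how well it responded in testing, and hand the winner to the paid media team:

```python
def best_subsegment(responses):
    """responses: sub-segment label -> per-viewer completion flags
    (True = that viewer watched the asset to the end in testing).

    Returns the sub-segment with the highest completion rate, i.e. the
    group most worth targeting with paid media for this asset."""
    rates = {seg: sum(flags) / len(flags) for seg, flags in responses.items()}
    return max(rates, key=rates.get)

responses = {
    "18-24 male, likes golf": [True, True, True, False],   # 75% completion
    "18-24 female":           [True, False, False, False], # 25%
    "25-34 male":             [True, True, False, False],  # 50%
}
target = best_subsegment(responses)
```

In practice the ranking would blend several metrics (completion, emotional intensity, brand recall) rather than one, but the shape of the decision is the same.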

And when you do that, we see a huge, huge uptick in performance, because those are the people who are most likely to watch that content to the end, most likely to click through and visit the site and then go on and purchase, and most likely to have a perception shift as a result of being exposed, and so on. Because you're finding the right people that that particular asset is really singing to. So that's product two. Product three is Brand Strategy, where we're bringing our capabilities to bear on large datasets. Evaluate and Audience are awesome products for use on individual campaigns.

Whereas Brand Strategy enables us to unlock the insight which is sitting within large datasets. And again, this is really relevant for influencer campaigns, because just by their very nature, influencer campaigns are generating loads and loads of content, and loads and loads of data. But critically, the kind of data being generated right now, which agencies and brands are sitting on, is the what kind of data, not the why. Every influencer agency under the sun is sitting on loads of media metrics, like reach, impressions and engagements, and social metrics as well, like shares and comments, which is a really powerful dataset.

But it's only telling us what is happening, whereas DAIVID brings the why. So when we do a Brand Strategy, we look at not just an individual campaign, but tonnes of campaigns' data, and we draw out the insight which is lying latent within that dataset by adding the why to the what that we can see. So an example might be: let's say we have site visit data from a client. We know that a bunch of posts drove brilliant site visits, and a bunch of posts did not. That is the what. We then understand why, by testing that creative, the posts that did well and the ones that did not so well, and finding out what it was they were doing which was working well or not well.

And when we do that, we generate a bunch of insight, which can then be used to set strategy going forward, because we've learned from what we've done well and not so well, which then enables us to improve our performance going forward. So Brand Strategy is a much more strategic tool versus Evaluate and Audience, and it's really being used by brands who are planning for the long term, and by agencies who have strategic relationships with their brand partners.

Scott Guthrie  31:56

So Brand Strategy can be used as a sort of after-action review guide: you can look back and help inform the future by interrogating the what and imbuing it with the why.

Ian Forrester  32:08

100%, yeah. And that's often how Brand Strategy is used, actually, in six-monthly or yearly reviews, where we've got a body of content and we want to learn from it: what worked, what didn't work, and how are we going to move forward from here. A three-month tranche or a six-month tranche, those are chunky amounts of data which we're going to learn a lot from. Generally speaking, the more posts that you test within a Brand Strategy analysis, the more that you learn. But yeah, exactly right, as you described.

Scott Guthrie  32:39

So I'm going to try and sum up the last 25 minutes in 25 seconds. At the moment, we are capturing what metrics, and there's nothing wrong with that, but they paint half a picture. They tell us how many impressions a piece of content received; they tell us how many likes, how many comments and how many shares that piece of content received; and what metrics can look at things like site traffic, sales and site activity, who's doing what on that website. All good stuff. But what it doesn't tell you is the why. It doesn't tell you that attention leads to emotion, emotion leads to memory, and memory leads to action. Capturing that creates an insight, and the insight is more than the data, which is the what. Insight produces learning, and it's that learning which helps improve effectiveness. How did I do? Was that all right?

33:45

Perfect. Yeah 100%

Scott Guthrie  33:47

Well, on that winning streak, let's try and ease the conversation down towards a close. I always ask this question, or I usually ask this question: how do you keep up with this ever-changing landscape of the creator economy, or influencer marketing? What resources do you turn to to keep up to date?

Ian Forrester  34:08

I attend shows such as the Influencer Marketing Show, Influencer Marketing Hub Live [inaudible] and, you know, there are a bunch now, so I attend as many shows as possible. And I read Talking Influence as well; those guys do some really interesting articles. Of course, I listen to your podcast, Scott, whenever I can. Yeah, I'm following leading players in this space as well. Who would you classify as a leading practitioner in the space?

Whalar, of course; my old buddies are doing some super, super interesting work, particularly focused on creativity. But then you've got players like Open Influence, Influential, Captiv8, Takumi [inaudible], you know, the list goes on.

So I try and follow all these guys on social, relatively rarely actually, just to see what they're all doing and what's going on. It is harder to keep up with the influencer space because it's moving so quickly. And not actually working in it day to day, as I was when I was at Whalar, it's become a little bit more difficult. But I do try and consume some influencer content on a daily basis, as in content about the influencer marketing industry, just to keep abreast of what's going on.

Scott Guthrie  35:23

There's also a tremendous weekly newsletter called the Influencer Marketing Digest. It's excellent because I write it, full disclosure. And I write it on behalf of Fourth Floor, which is an influencer marketing agency. Well, Ian, where can listeners turn to find out more information about you, about DAIVID and the product? And I should say, I've always liked the name DAIVID because of its spelling: it's D-A-I-V-I-D. Where and how can listeners get in touch with you and find out more about you and your product line?

Ian Forrester  35:59

So our website is obviously a good place to go, which is simply DAIVID.co. You can also email me at Ian@daivid.co. And obviously we're on LinkedIn as well. I think you're going to share some links, right, in the notes of the podcast.

Scott Guthrie  36:17

Yeah, I'll be sure to include all of those links in the show notes accompanying today's podcast episode; just Google the Influencer Marketing Lab for further details. Ian Forrester, founder and CEO at DAIVID, thanks so much for your time and for your insights today. I really enjoyed our conversation.

36:36

Me too. Thanks for that Scott. Really appreciate it.

Scott Guthrie is a professional adviser within the influencer marketing industry. He is an event speaker, university guest lecturer, media commentator on influencer marketing and active blogger. He works with brands, agencies and platforms to achieve meaningful results from influencer marketing. That tells you something about him but it's not giving you a lot of detail, is it? So, read more here.



Ian Forrester biography


More recently, Ian brought his vision, innovation, analytical and data-storytelling skills to Whalar in the influencer marketing space. He is in the process of running world-first research projects and creating a unique ‘house of influencer insight’ positioning for the company by pushing the boundaries of research in influencer marketing.

Ian’s expertise is in high demand; he has been featured by BBC World, Sky News, Warc Admap, MRS Impact and MediaCom Blink. He has also been a judge for the WARC Media Awards and the IPA Effectiveness Awards.

Outside of the world of insight and advertising, Ian’s other love is pugs. It’s a constant source of frustration that there aren’t more pugs in advertising. He lives in London with his faithful pug Lola, who he describes as “the embodiment of everything that's good about the world - kindness, affection, silliness and love!”

Ian Forrester -- Influencer Marketing Lab podcast

Useful links

Transcript

Ian Forrester  03:24

NOTE: This transcript was produced using AI technology. It has been edited slightly.

Thank you Scott, it's a pleasure to be invited on to such an illustrious podcast.

Scott Guthrie  03:29

I don't know how illustrious it is, but it's kind of you. Let's get stuck in straightaway, shall we? We're here to talk about measurement and evaluation and metrics. So what metrics are influencer marketers using to measure their campaigns? And why in your opinion, are those metrics incomplete?

Ian Forrester  03:50

So, there are loads of metrics being used to measure influencer campaigns these days. You've got media metrics, which are metrics like reach, impressions and views from the platforms. You've got social metrics, of course: likes, shares, comments, engagements. You've got brand lift metrics as well, so that's control-versus-exposed shifts in awareness, consideration and purchase intent. And also commercial metrics.

So website visits, app downloads, sales, metrics like that. And it's not as if these metrics are bad in any way; it's just that they're a little bit incomplete, because all of these metrics are telling us what is happening, but not why. And until you understand the why, it's very hard to learn from what you've done in the past and improve going forward. So a simple example would be view-through rate as a media metric. View-through rate tells us the percentage of people who viewed a video who went on to complete it. And so we might have a bunch of data which tells us these videos have a great view-through rate and these videos do not have a great view-through rate.

So that is the what is happening: these videos are being viewed to the end, whereas these videos are not, which is obviously useful to know. But until you understand why, why was this video set being viewed to the end and the other not, you're not going to learn from that behaviour and you're not going to change going forward. And so that is what we focus on at DAIVID: bringing the why to the whats, which are already being measured.
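As a quick aside, the view-through-rate arithmetic described above is simple enough to sketch in a few lines (a hypothetical helper, not DAIVID's code):

```python
def view_through_rate(views_started, views_completed):
    """Percentage of viewers who watched a video through to the end."""
    if views_started == 0:
        return 0.0
    return 100.0 * views_completed / views_started

# Two videos with the same impressions can tell very different stories:
strong = view_through_rate(views_started=10_000, views_completed=7_200)  # 72.0
weak = view_through_rate(views_started=10_000, views_completed=1_900)    # 19.0
```

The metric itself only reports the gap between the two videos; it says nothing about why one held viewers and the other did not.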

Scott Guthrie  05:15

Okay, so the whats are, or at least they should be, the table stakes of campaign measurement. They're the impressions, the reach, the awareness, the consideration, you know, the likes, the shares, the comments. And there's nothing wrong with that; that's the beginning. But that's not the end. The end should be around capturing the emotion. Is that what you're saying?

Ian Forrester  05:35

Yeah, so I'm not suggesting for a second that we should stop measuring any of those things. They're all useful, and they're all telling us something about the campaign and the impact that it's having. But until we understand why, we don't generate insight. We need to move on from measurement, which is a box-checking exercise: my campaign did this thing. Really useful, totally necessary, but that doesn't generate any insight, because you don't understand why. When you understand why, you generate insight, which can then be used to set strategy going forward. And the why metrics that I'm talking about here come down to attention, emotions and memory structure creation. These are the set of why metrics which we focus on.

Scott Guthrie  06:21

Okay, so understanding how your campaign performed on these elements generates an insight, not just data. That insight leads to learning; learning leads to improved effectiveness. That's, in essence, what the why metrics are. Okay, we're learning. Thank you very much so far. So what should we be measuring? You mentioned it very briefly, but let's dig into this a little bit. How can we augment the what metrics? Tell us more about what these why metrics are.

Ian Forrester  06:52

We have spent a tonne of time and energy at DAIVID understanding why content is effective, and to do that we've created this effectiveness process. We have done a meta-analysis within advertising research: we've looked at studies from both academia and from industry to understand what the latest thinking is around why content is effective. So we've gone super deep here. We've got a couple of academics on the team, so we've done an academic literature review, and we continue to do so, because knowledge in this space is always evolving, so we need to keep tapping back into academia to see what those guys are finding.

So that's a continuous process, as is our review of research going on within the advertising industry as a whole. We look at the IPA and the IAB and Nielsen and McKinsey and Ehrenberg-Bass and Unruly, and all these organisations who are doing really awesome, really interesting individual studies in advertising research. But then we take this helicopter view to see how all of the studies fit together. Because it's very rare to get a research study which is super, super broad, simply because it's not normally financially viable to do so; there are data limitations, and so on.

So each individual study is useful in its own way, but it's only when you take the helicopter view to see how they all fit together that you can put together this effectiveness process. And that's what we've done. I'll walk you through how content is effective in the wild. So what have we learned from this meta-analysis? First of all, to be effective in the wild, content needs to capture attention. And of course, attention is the major buzzword in the industry right now; it's all anyone's talking about following Cannes. And absolutely right, because without attention, the effectiveness process doesn't work. If people are not paying attention to your content, then you're dead in the water. It's absolutely critical. So attention is super, super important.

And it's a really fundamental part of what we do at DAIVID. However, attention is only the first hurdle that you have to overcome; it's definitely not the be-all and end-all. Attention gets you into the room. And once you've captured attention, you then need to do something with it. And by something, I mean evoke an intense, positive emotional response, because that emotional response creates a memory structure, which then drives action.

The problem with a lot of commercial content is that, from an emotional point of view, it's just about average. So you watch it and you think, oh yeah, that was okay. It's like a four or five out of 10. The issue with that, from the brand's point of view, is that two seconds later it's forgotten. So the memory structure is not created, and therefore the action is not driven. But when a brand can elevate its emotional response to an eight, nine or ten out of 10, the viewer remembers the way the ad made them feel and attaches that feeling to the brand. And it's that feeling which then nudges that person to do something at a later date, which the brand is asking them to do. So let's take the example of someone in a grocery store, choosing between laundry detergents.

They look at one detergent and feel nothing. They look at another detergent and, in that moment, the emotion which has been attached to that brand returns, because the memory structure has been created. People look at that brand and feel that emotion, and that emotion in that moment nudges that person to purchase that brand over its competitor. And that's how this effectiveness process works: by delivering attention and emotions at scale, across the course of millions of impressions.

Scott Guthrie  10:20

So if I can sort of condense what you've articulated: you're saying that the effectiveness process can only begin if there's attention. But at the moment, as an industry, we're pretty much looking at attention as the be-all and end-all. You're saying attention is crucial, but it's the starting step. Attention then leads to emotion. And if that content is imbued with high levels of emotion, that leads to higher levels of recall, subconsciously or otherwise, which leads to action.

10:54

Exactly right. Yeah.

Scott Guthrie  10:56

Absolutely. DAIVID, are you just tracking that? Or are you helping brands understand how they can improve future campaigns? Talk me through DAIVID's AI, and how it allows you to deliver attention and emotion data at scale.

Ian Forrester  11:14

The three core pillars of DAIVID are attention, emotions and AI. In terms of the attention and emotions, we are measuring attention and emotions in the deepest, most granular way that we can, and our AI enables us to deliver that data at scale to our clients. How does that work? Well, we have a human training dataset to feed the AI. Within that training dataset, we're showing content to humans, and from those humans we are measuring attention and emotions. So this is the core, fundamental part of the AI: the data which we feed it is a really deep attention and emotion dataset.

Within that training dataset, we're getting content and we're showing it to people. We're asking those people to turn on their webcams as they're watching the content, which means we can then film their faces as they're watching it, which allows us to capture certain metrics.

One, from an attention point of view, is: is their face turned towards the screen or not? Are they looking at the screen? You can see that when you're filming their face as they're watching the content. You can also see the point at which most people are tending to turn away. So at what point is the drop-off for a particular piece of content? Do people get through the first five seconds and then drop off? Or do they get nearly to the end? Or do they get to the end, and so on. So that's one of our core attention metrics: understanding whether people are looking at the screen. The other thing we're measuring is where people are looking on the screen, and that's using eye tracking.
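The drop-off point Ian describes can be read off a retention curve. A toy sketch, with hypothetical watch times rather than real panel data:

```python
def retention_curve(watch_seconds, video_length):
    """Share of viewers still watching at each second, given how many
    seconds each viewer watched before turning away."""
    n = len(watch_seconds)
    return [sum(1 for w in watch_seconds if w >= t) / n
            for t in range(video_length + 1)]

# Five viewers of a 10-second video; most turn away after second 5:
curve = retention_curve([10, 6, 5, 5, 10], video_length=10)
```

The largest single-step fall in the curve marks where the content is losing its audience.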

So again, by filming people's faces as they're watching content, you can see the direction of their eye gaze. And by calculating the angle of the eye gaze, you can essentially see that most people are tending to look at a particular point on the screen. That then gets visualised using a heat map, where the stronger the colour, the more people are looking at that particular point on the screen. So in terms of attention, we're capturing two things: are people looking at the screen, and where are they looking on the screen. In terms of emotion, we're measuring in two ways as well.
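Conceptually, the heat map is an aggregation of many viewers' estimated gaze points. A minimal sketch, assuming gaze has already been resolved to normalised screen coordinates (illustrative only, not DAIVID's implementation):

```python
import numpy as np

def gaze_heatmap(fixations, grid=(18, 32)):
    """Aggregate per-viewer gaze points (x, y in [0, 1] screen coordinates)
    into a coarse grid; higher cells mean more viewers looked there."""
    heat = np.zeros(grid)
    for x, y in fixations:
        row = min(int(y * grid[0]), grid[0] - 1)
        col = min(int(x * grid[1]), grid[1] - 1)
        heat[row, col] += 1
    # Normalise so each cell holds its share of all fixations
    return heat / max(len(fixations), 1)

# Three viewers all fixating near the centre of the screen:
heat = gaze_heatmap([(0.52, 0.48), (0.50, 0.51), (0.49, 0.50)])
```

Rendering `heat` with any colour map gives the stronger-colour-equals-more-viewers visualisation described above.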

One is with facial coding: filming people's faces as they're watching content and, in this case, picking up expressions which people are making during different parts of the video. There are six universal human expressions which are being picked up: smile, fear, disgust, anger, surprise and sadness. And the great thing about facial coding is that you can see where the peaks and valleys of emotional response are, because you can see where people are expressing. So you can see where most people are smiling in a video, for instance; that will be the emotional peak.

And when we visualise facial coding, you get this trace of response, which is really cool, allowing us to pick up the moments which are working or not working in a particular piece of content. That is facial coding's great strength. Where facial coding is not so good is in allowing us to understand which emotions are being evoked, because we're only picking up those six expressions, but there are loads more than six emotions. We put a tonne of time and energy into understanding which emotions we should be testing for.
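The "trace of response" is essentially a time series of how much of the panel is expressing at each moment, and the emotional peak is its maximum. A toy illustration with made-up numbers:

```python
def emotional_peak(smile_trace):
    """Given the share of viewers smiling at each second of a video,
    return (second, share) for the strongest moment."""
    peak_second = max(range(len(smile_trace)), key=lambda t: smile_trace[t])
    return peak_second, smile_trace[peak_second]

# Share of the panel smiling, second by second (hypothetical):
trace = [0.05, 0.08, 0.10, 0.42, 0.61, 0.30, 0.12]
second, share = emotional_peak(trace)  # peak at second 4, 61% smiling
```

The same shape of analysis works for any of the six coded expressions, which is why the technique is good at locating moments but limited in naming emotions.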

Again, we've gone deep into academia. We've built on the work that Unruly did over the years, where those guys test for 18 psychological responses; we built on that work and brought in the latest thinking from academia. Long story short, we've arrived at 39 emotions that we test for, both positive and negative. These are things like admiration, anxiety, embarrassment, nostalgia, pride, relief, romance, trust, warmth and so on. The reason we've put so much time and energy into arriving at this list of emotions is because 90% of why content works or doesn't work can be explained by these 39 emotions. If content isn't working, it may be because it's boring people or confusing them; those are two emotions. Or more extreme negative emotions could be coming through: things like contempt, or disgust, or distrust, or horror, or guilt, or shame, or any one of these terrible things which were never intended to be evoked by the advertiser or the influencer. And yet they're coming through anyway. And we test content around the world on a daily basis.

And these emotions are coming through on a daily basis. As an advertiser, it's very useful to know if your content is evoking these emotions, because if it is, and you know about it, then you can do something about it.

On the flip side, when content is working, it's because it's evoking positive emotions at an intense level. And that is gaining cut-through; it is lifting that particular piece of content above the morass of stuff which we're all bombarded with on a daily basis. It's making it stand out, because it's evoking that emotion intensely. And by intense, I mean an eight, nine or ten out of 10; it's got to be very, very intense to gain that cut-through. Also, the type of emotion is really, really important. Evoking an unusual emotion which other players in your space are not evoking can be a really great way to gain cut-through and stand out versus the pack. Because when you've got a different emotion, and you add intensity to that different emotion, then you've got a double-whammy effect, and then great things happen.

Scott Guthrie  16:25

One alternative reason that the emotion isn't being induced by the brand is that they don't want the followers to be disgusted, I suppose. But at least it's identified.

To backtrack, Ian, there are three pillars, and we've talked about two. The first one is attention, and that is where you turn on the webcam and you actually track that heads are pointed towards the screen and, more importantly, where the viewer is looking on the screen. Does that sum up the first one? The second one we were talking about is emotions. You mentioned 39 emotions. How has that list been created? Who's created it? Or is it an amalgam of other people's lists?

Ian Forrester  17:09

That's a great question. So it is an amalgam of a bunch of individual studies within academia. Originally, we built on the original work from Unruly, which itself was based on work by Ehrenberg-Bass, and specifically Karen Nelson-Field. Back in the day, when we worked with Karen in 2012 to develop the science of sharing, we developed this emotional framework which contained 18 of what they called, and still do call, psychological responses, essentially.

Scott Guthrie  17:36

she's from the University of Adelaide.

Ian Forrester  17:39

Yeah, she lives in Adelaide. She was with Ehrenberg-Bass, but she is no longer; she's now running Amplified Intelligence, which is a company focused on attention driven by media placements, doing really interesting work. But yeah, Karen kicked this off, and Unruly really took up the mantle and expanded it. And the emotional categorisation used by Unruly is great. I mean, look, I played a key role in developing that emotional categorisation, I used it for years, and it generates loads of interesting insight. But frankly, the world has moved on since 2012, when that list was made, and academia has moved on. And so we wanted to bring the latest thinking from academia into the product here. I could talk about this all day, but I won't bore you with too much detail. Essentially, academics don't necessarily agree on what constitutes an emotion.

There's loads of grey area, loads of nuance. The language used to describe emotions differs from country to country; in Japan, for example, there are something like five different words for shame, because shame is a big part of their culture. So it's not necessarily an easy thing to arrive at a really strong list. But we've taken all of these academic thoughts and kind of mashed them together into a list which we have tested independently. We've done our own factor analysis and cluster analysis to understand, within our data, what overlap there is among these emotions; are these emotions, in our view, discrete?
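One simple way to probe whether candidate emotions are discrete, in the spirit of the factor and cluster analysis Ian mentions, is to flag emotion pairs whose panel ratings are almost perfectly correlated (the data and threshold here are hypothetical):

```python
import numpy as np

def overlapping_emotions(ratings, names, threshold=0.8):
    """Flag emotion pairs whose panel ratings correlate so highly that
    they may not be discrete. ratings: viewers x emotions matrix."""
    corr = np.corrcoef(ratings, rowvar=False)
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if corr[i, j] >= threshold:
                pairs.append((names[i], names[j]))
    return pairs

# Toy panel: "warmth" and "affection" move together, "disgust" does not.
ratings = np.array([[8, 7, 1],
                    [6, 6, 2],
                    [9, 8, 1],
                    [3, 4, 7]], dtype=float)
pairs = overlapping_emotions(ratings, ["warmth", "affection", "disgust"])
```

Highly correlated pairs would be candidates for merging into a single emotion in the final list.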

So we've taken the theory from academics and we've validated it with our own data and our own data science team to arrive at this list. But like I said, the point of doing all that was to be able to test for emotions in the deepest, most granular way that we can, because so much of advertising effectiveness comes down to the emotional response which people have when watching content. We're advertising to humans at the end of the day, and humans respond to everything emotionally. Normally, that emotion is kind of muted; it might be between a four and a six.

You're just kind of bumbling along your day, and nothing's really pissed you off, nothing's really excited you. It's just kind of an average day, right, until you hit a certain spike: someone cuts in front of you when you're in the car, or, you know, you get some great news. You see these spikes, but normally you're kind of middling. But when advertising can break you out of that range between four and six, it makes you remember something which will then influence your action later. Understanding that mechanism of action is critical to what we're doing at DAIVID, because that brings us to the why, essentially.

Scott Guthrie  22:36

You're now two thirds of the way through your pillars. We've done attention, you've now eloquently answered my questions about emotion. The third pillar is artificial intelligence. Is that right?

Ian Forrester  22:48

It is, yeah. So to feed an AI system you need a training dataset, because essentially what AI is doing is taking an input and making a prediction based on that input. So in order to predict whether content is going to capture attention, which emotions it's going to evoke, and whether people are going to remember the brand, and to be able to make those predictions by audience, we need to feed the system two datasets.

One is the data I've just described: how are people responding to stuff. That's the attention, emotions and brand metrics which we're collecting as part of our performance dataset, or 'how are people responding to stuff' dataset. That's one side of the coin. The other side of the coin is what is happening within content to elicit that response. And to gather that data at scale, we're sending content to various computer vision APIs. These are services like Google Vision, Amazon Rekognition, Clarifai and so on; there are a bunch of these services available these days. What these APIs allow us to do is categorise, on a frame-by-frame basis, both visually and aurally, what is happening within content. You can't just send a video to one of them and think you're done; you've got to tap into a bunch of them, bring those inputs in house, and build on top of them to arrive at a really deep frame-by-frame understanding of what is happening within that content.
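Since each vision service returns its own frame-level labels, a natural first step when bringing those inputs in house is merging them. A hypothetical sketch (the service outputs below are invented stand-ins, not real API responses):

```python
from collections import defaultdict

def merge_frame_labels(provider_outputs):
    """Combine frame-by-frame labels from several vision services,
    keeping each label's highest confidence per frame."""
    merged = defaultdict(dict)  # frame index -> {label: confidence}
    for labels_by_frame in provider_outputs:
        for frame, labels in labels_by_frame.items():
            for label, conf in labels.items():
                if conf > merged[frame].get(label, 0.0):
                    merged[frame][label] = conf
    return dict(merged)

# Invented outputs from two services for the same two frames:
service_a = {0: {"crowd": 0.91, "bus": 0.80}, 1: {"crowd": 0.88}}
service_b = {0: {"crowd": 0.85, "traffic": 0.77}, 1: {"bright clothing": 0.70}}
tags = merge_frame_labels([service_a, service_b])
```

Each service catches things the others miss, which is the point of tapping into several of them rather than relying on one.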

Because when you've done that, you end up with two datasets: one dataset which is telling you what is happening in content, and the other dataset saying how people are responding to that content. The system then understands the connections between these two datasets. So I'll give you an example. Imagine a video where we've got a scene with a bunch of people walking through Piccadilly Circus. It's a busy day, there's a kind of a crowd there, there's a girl who's wearing outlandish clothes, and there's traffic; there's a bus driving past. Just from that single scene, and just from a couple of frames, we can pick up all of those things that I just mentioned. So we're picking up the fact that there's a crowd present, there's traffic noise from the road, and there's a girl between the ages of 25 and 34 wearing bright colours. Now, all of these things become data points which we can correlate with outcomes. So maybe people are paying attention to the girl because of what she's wearing.

Or maybe people are looking at the crowd, and they're feeling a little bit anxious, or perhaps a bit excited; that's their emotional response. The system is understanding the connections between these two datasets within our training data on a micro level. And then on a macro level, all this training data is flowing into DAIVID. So we're making sense of the creative attributes which are present, what is happening in content, and the performance data, which is our facial coding, our eye tracking and our survey data, understanding the connections between these two datasets.
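Once the "what is in the content" dataset and the "how people responded" dataset are joined, predictions for a new asset come from the patterns linking the two. A deliberately naive sketch of that idea (nearest-neighbour-style averaging over shared attributes, not DAIVID's actual model):

```python
def predict_attention(train_rows, new_attributes):
    """Naive sketch: predict a new asset's attention score as the mean
    score of training assets sharing at least one creative attribute."""
    matches = [score for attrs, score in train_rows
               if attrs & new_attributes]  # any attribute overlap
    if not matches:
        return None  # no comparable assets seen yet
    return sum(matches) / len(matches)

# Hypothetical training data: creative attributes -> measured attention
train = [({"crowd", "traffic"}, 0.40),
         ({"bright clothing", "close-up"}, 0.70),
         ({"bright clothing", "crowd"}, 0.60)]
score = predict_attention(train, {"bright clothing"})  # mean of the two matches
```

A production system would learn weighted interactions between attributes rather than a flat average, but the flow of data is the same: attributes in, predicted response out.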

So then we can take an asset from a client and understand what's happening in that asset automatically, by sending it to various computer vision APIs, which essentially return a collection of stuff: this collection of stuff is happening in this video. And our system then says, okay, given that collection of stuff, we predict whether this video is likely to capture attention within the first few seconds or not, the emotions it's likely to evoke, and whether people are likely to remember the brand or not. And we can cut these predictions by audience as well. So we can say an older guy will respond in this way versus a younger girl, who will respond in a different way. And so that is how our AI brings together all these datasets and allows us to predict attention, emotions and memory structure at scale.

Scott Guthrie  26:27

Well, thank you very much. That's certainly a full answer. I hesitate to ask the next question: you know, what else are you working on? I mean, there are three fascinating pillars there, and I can see how they combine. They work very well from an influencer marketing perspective. How are you helping our industry at the moment?

Ian Forrester  26:44

Again, great question. I have a unique perspective on influencer marketing, given that I worked at Whalar for three years, and I know the challenges which influencer agencies are facing when it comes to measurement, insight generation and setting strategy. So as part of the work we're doing at DAIVID, we're bringing this tech to bear on the issues which I know to exist within influencer marketing, and we've developed a number of products to help influencer agencies meet those challenges. The first and most obvious product is one called Evaluate, which is a creative testing tool we can use to test assets within a particular campaign, and understand whether they're capturing attention and which emotions they're evoking.

And how they're working for different audience members. When you run Evaluate on assets within a campaign, it's going to give you two things. One is proof points to take to the client and say: we know this content was effective, because it was capturing attention, because it was evoking these positive emotions, and people remembered the brand. So in addition to 'we reached 50,000 people', we can say: of those 50,000 people, 70% remembered the brand, 50% felt an intense positive emotion, we're shifting brand perception, and so on. So that's proof points. But more importantly, it also generates insight, because you're learning from that content. If content is working, you're learning what was working about it.

And then you can repeat that success in the future. If it's not working, for whatever reason, or for a particular audience type, then you learn that as well, and again you're learning for the future, which you can then use to set your strategy going forward. So it's a really key generator of insight, which then allows you to set strategy. My second product is Audience. With Audience, we are optimising the attention captured and the emotions evoked by content, not by trying to change the content itself, but by using paid media to find the right audience for that particular asset. And the cool thing about influencer marketing is that it's very easy to target certain audiences using the tools which the platforms provide.

So you go on Facebook Ads Manager and you've got all these targeting capabilities. Super, super interesting. But the question is: okay, we've got all this potential; who should we actually target to supercharge this campaign? Well, with Audience, we find the subset of the broad target audience who are responding particularly well to this asset. Any asset can have a broad target audience; let's call it 18-to-34-year-old males and females, for argument's sake. But that particular piece of creative is going to be working especially well for a subset of that group. It might be 18-to-24-year-old males who like golf, let's say. Audience identifies that audience for you, and then allows your paid media team to go out and target that group.
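Finding the subset of a broad audience that responds best to an asset can be sketched as a simple group-by-and-rank over panel scores (the segments and numbers below are hypothetical):

```python
def best_segment(responses):
    """Pick the audience segment with the highest mean response to an asset.
    responses: {segment_name: [per-viewer scores]}"""
    means = {seg: sum(scores) / len(scores)
             for seg, scores in responses.items() if scores}
    return max(means, key=means.get)

# Hypothetical panel scores (0-10) for one asset, split by segment:
panel = {
    "M 18-24, golf": [8, 9, 7, 9],
    "M 25-34":       [5, 6, 4, 5],
    "F 18-24":       [6, 5, 7, 6],
}
target = best_segment(panel)
```

The winning segment name then maps onto the targeting options the ad platform already exposes.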

And when you do that, we see a huge, huge uptick in performance, because those are the people who are most likely to watch that content to the end, most likely to click through and visit the site and then go on and purchase, and most likely to have a perception shift as a result of being exposed, and so on. Because you're finding the right people that that particular asset is really speaking to. So that's product two. Product three is Brand Strategy, where we're bringing our capabilities to bear on large datasets. Evaluate and Audience are awesome products for use on individual campaigns.

Whereas Brand Strategy enables us to unlock the insight which is sitting within large datasets. And again, this is really relevant for influencer campaigns, because just by their very nature, influencer campaigns are generating loads and loads of content, and loads and loads of data. But critically, the kind of data being generated right now, which agencies and brands are sitting on, is the 'what' kind of data, not the 'why'. Every influencer agency under the sun is sitting on loads of media metrics, like reach, impressions and engagement stats, and social metrics as well, like shares and comments, which is a really powerful dataset.

But it's only telling us what is happening, whereas DAIVID brings the why. So when we do a Brand Strategy, we look at not just an individual campaign, but at tonnes of campaign data, and we draw out the insight which is lying latent within that dataset by adding the why to the what that we can see. An example might be: let's say we have site visit data from a client, so we know that a bunch of posts drove brilliant site visits, and a bunch of posts did not. That is the what. We then understand the why by testing that creative, the ones that did well and the ones that did not do so well, and finding out what it was they were doing which was working well or not.

And when we do that, we generate a bunch of insight which can then be used to set strategy going forward, because we've learned from what we've done well and not so well, which then enables us to improve our performance going forward. So Brand Strategy is a much more strategic tool versus Evaluate and Audience, and it's really being used by brands who are planning for the long term, and by agencies who have strategic relationships with their brand partners.

Scott Guthrie  31:56

So Brand Strategy can be used as a sort of after-action review guide: you can look back and inform the future by interrogating the what and imbuing it with the why?

Ian Forrester  32:08

100%, yeah. And that's often how Brand Strategy is used, actually, in six-monthly or yearly reviews, where we've got a body of content and we want to learn from it: what worked, what didn't work, and how are we going to move forward from here? A three-month tranche or a six-month tranche, those are chunky amounts of data which we're going to learn a lot from. Generally speaking, the more posts that you test with a Brand Strategy analysis, the more that you learn. So exactly right, as you described.

Scott Guthrie  32:39

So I'm going to try and sum up the last 25 minutes in 25 seconds. At the moment, we are capturing what metrics, and there's nothing wrong with that, but they paint half a picture. They tell us how many impressions a piece of content received; they tell us how many likes, how many comments and how many shares that piece of content receives. And what metrics can look at things like site traffic, ad sales and site activity: who's doing what on that website. All good stuff. But what it doesn't tell you is the why. It doesn't tell you that attention leads to emotion, emotion leads to memory, and memory leads to action. Capturing that creates an insight, and the insight is more than the data, which is the what. Insight produces learning, and it's that learning which helps improve effectiveness. How did I do? Was that alright?

33:45

Perfect. Yeah, 100%.

Scott Guthrie  33:47

Well, on that winning streak, let's try and ease the conversation down now towards a close. I always ask this question, or I usually ask this question: how do you keep up with this ever-changing landscape of the creator economy, or influencer marketing? What resources do you turn to to keep up to date?

Ian Forrester  34:08

I attend shows such as the Influencer Marketing Show and Influencer Marketing Hub Live, and, you know, there are a bunch now, so I attend as many shows as possible. And I read Talking Influence as well; those guys do some really interesting articles. Of course, I listen to your podcast, Scott, whenever I can. And I'm following leading players in this space as well. Who would I classify as a leading practitioner in the space? Whalar?

Of course, my old buddies are doing some super, super interesting work, particularly focused on creativity. But then you've got players like Open Influence, Influential, Captiv8, Takumi, you know, the list goes on.

So I try and follow all these guys on social, just to see what they're all doing and what's going on. It is harder to keep up with the influencer space because it's moving so quickly, and, not actually working in it day to day as I was when I was at Whalar, it's become a little bit more difficult. But I do try and consume some influencer content, as in content about the influencer marketing industry, on a daily basis, just to keep abreast of what's

Scott Guthrie  35:23

going on. There's also a tremendous weekly newsletter called the Influencer Marketing Digest. It's excellent because, full disclosure, I write it. And I write it on behalf of Fourth Floor, which is an influencer marketing agency. Well now, where can listeners turn to find out more information about you, and more about DAIVID and its products? And I should say, I've always liked the name DAIVID because of its spelling: D-A-I-V-I-D. Where and how can listeners get in touch with you and find out more about you and your product line?

Ian Forrester  35:59

So our website is obviously a good place to go, which is simply DAIVID.co. You can also email me at Ian@daivid.co. And obviously, we're on LinkedIn as well. I think you're going to share some links in the notes of the podcast, right?

Scott Guthrie  36:17

Yeah, I'll be sure to include all of those links in the show notes accompanying today's podcast episode; just Google the Influencer Marketing Lab for further details. Ian Forrester, founder and CEO at DAIVID, thanks so much for your time and for your insights today. I really enjoyed our conversation.

36:36

Me too. Thanks for that, Scott. Really appreciate it.