June 27, 2024

Lindsay Hong on the genesis of SmartAssets

“I’m optimistic about the future of AI and advertising.”

Part of the Stagwell Marketing Cloud, SmartAssets is a dynamic tool that uses AI to exhaustively tag your image assets and make concrete suggestions to optimize future performance—before you’ve wasted time and money on A/B testing.

I spoke with founder and CEO Lindsay Hong about where SmartAssets came from, its various use cases, and what the future holds.

What was the original idea behind SmartAssets, and how did it develop over time?

I’m originally from the Stagwell production agency Locaria, which has historically had a close relationship with the media agency Assembly, where we did a lot of localization for paid media content.

Locaria has a proposition called Performance Linguistics, where we use data science to identify how changing copy affects its performance.

At one point, I went to our data science team and said, “You know what we can do for copy? Can we do it for visual assets? Can we look at what’s in a picture or a video and pinpoint what’s driving the ad’s performance?”

The original idea was to bring Creative and Media together into a single data set that allows for more objective conversations about what does and doesn’t work in the world of content, copy, and translation.

I imagine you'd often have a lot of differing opinions there, without much hard data to back those opinions up.

There are a lot of subjective conversations. Sometimes when a campaign doesn’t work, Creative will say, “Well, the media strategy was clearly poor.” And Media will say, “Oh well, the creatives are rubbish.”

We almost called SmartAssets “Hug,” because we wanted to bring the whole value chain of marketing together into one big team.

The timing was serendipitous, given everything that was going on in generative AI when we were conceiving of this tool.

Computer vision was getting very mature. You could tag up assets in a way you never could before: create this big metadata set of all your assets, and then cross-reference that with performance data, at scale, to derive statistically significant insights.
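To make that concrete, here’s a rough sketch of the kind of cross-referencing described above: tag metadata joined to performance data, with a simple significance test. The tags, click-through figures, and tag_lift helper are illustrative assumptions, not SmartAssets’ actual pipeline.

```python
# Sketch: does the presence of a given tag correlate with better performance?
# The schema, tags, and CTR figures below are made up for illustration.
import pandas as pd
from scipy import stats

# One row per ad asset: its computer-vision tags and an observed metric.
assets = pd.DataFrame({
    "asset_id": [1, 2, 3, 4, 5, 6],
    "tags": [{"logo", "person"}, {"person"}, {"logo"},
             {"product", "logo"}, {"person", "product"}, {"product"}],
    "ctr": [0.031, 0.018, 0.027, 0.035, 0.021, 0.019],
})

def tag_lift(df: pd.DataFrame, tag: str, metric: str = "ctr"):
    """Compare a metric between assets that do and don't contain `tag`."""
    has_tag = df["tags"].apply(lambda t: tag in t)
    with_tag = df.loc[has_tag, metric]
    without_tag = df.loc[~has_tag, metric]
    _, p_value = stats.ttest_ind(with_tag, without_tag, equal_var=False)
    return with_tag.mean() - without_tag.mean(), p_value

lift, p = tag_lift(assets, "logo")
print(f"CTR lift when the logo is present: {lift:+.3f} (p = {p:.2f})")
```

At real scale, the same join runs across thousands of assets and many tags, which is where statistically significant patterns start to emerge.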

What other use cases have you turned up along the way?

There are a lot of benefits when it comes to improving workflow, and also to global quality assurance and standards.

And then there are more basic but important things, like alerting you that 40% of your assets don’t even have your logo on them. Or that the asset you’ve produced for Facebook has an aspect ratio that also works for Google's formats, so you could use it there and be successful.

Initially, the asset tagging features existed because you need the tagging to do the creative analysis of your images. 

But we discovered that the tagging alone was creating loads of value. Once you’ve tagged things, you can put a GPT-style search functionality on the front, and then find assets so much more easily than you can with a traditional production workflow.

This provides a much more powerful search and find functionality than you get in a traditional Digital Asset Management (DAM) system. We also have a relationship with Google, and we can offer you storage if you don’t have a DAM.
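As a rough illustration of that search layer, the sketch below ranks tagged assets against a plain-language query. The catalog and keyword-overlap scoring are stand-ins for illustration; a production system would use an embedding model or GPT-style matching rather than this toy scorer.

```python
# Sketch: natural-language "find me an asset" search over tag metadata.
# The catalog entries are hypothetical; keyword overlap stands in for the
# embedding- or GPT-style matching a real system would use.
catalog = {
    "asset_001": "woman holding product outdoors, logo top-right, blue colorway",
    "asset_002": "square product shot on white background, no logo",
    "asset_003": "holiday banner, red colorway, logo centered, vertical 9:16",
}

def keywords(text: str) -> set[str]:
    """Lowercase a string and split it into a set of keywords."""
    return set(text.lower().replace(",", " ").split())

def search(query: str, top_k: int = 2) -> list[str]:
    """Rank assets by how many query keywords appear in their tag text."""
    q = keywords(query)
    scores = {aid: len(q & keywords(desc)) for aid, desc in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

print(search("vertical holiday asset with the logo"))  # asset_003 ranks first
```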

Let’s talk a bit about how the creative analysis portion of SmartAssets works.

There’s so much uplift that can be gained from simple fixes to your own content. 

Overall, we’ve got three categories of measurement and analysis: brand, platform, and behavioral science.

On the brand side, SmartAssets is customizable around your brand: the logo, what design features are and aren’t allowed, the colorways, the overall brand guidelines. We also analyze against general branding best practices.

Then there’s a platform analysis. That could mean simply analyzing the asset to see if it meets the specs to go live. Here’s an example: Around 40% of the assets that the out-of-home team at one of our media agencies receives from a third party are in the wrong format.

It’s a very simple thing to fix, right? 

A lot of the content that brands are paying a lot of money to have created, or that they’re creating in-house and putting blood, sweat, and tears into, never gets seen because it’s in the wrong format. That’s an absolute travesty.

Talk me through the behavioral science component.

Whether something is memorable or not has a lot to do with physiology and brain chemistry. How your eye and your brain respond to a stimulus isn’t actually that different from market to market, or from person to person.

We have findings from research papers, and we code those findings into the algorithm: What has to happen for the brain to remember something?

There’s a sweet spot of emotional cadence within an advertising spot. Too much emotional turbulence makes people drop off, because it’s stressful. But not enough emotional change is boring. 

If you’re doing YouTube ads, which are very skippable, you want to draw people into the narrative and keep them hooked to the end, to your CTA. 

And so SmartAssets can analyze your video asset in terms of the amount of emotional turbulence and whether it’s enough, or too much, or just right.
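As a loose illustration of that idea, the sketch below checks whether per-scene emotion scores fall into a target band of variation. The scores, thresholds, and turbulence measure are assumptions made for the example, not SmartAssets’ actual model.

```python
# Sketch: is a video's emotional cadence too flat, too turbulent, or about right?
# Scene scores and thresholds are illustrative assumptions.
import numpy as np

# Hypothetical emotion-intensity score (0 to 1) per scene of a 30-second spot,
# e.g. the output of a facial-expression or audio-sentiment model.
scene_scores = np.array([0.2, 0.4, 0.35, 0.6, 0.5, 0.7, 0.65])

def cadence_check(scores: np.ndarray,
                  too_flat: float = 0.05,
                  too_turbulent: float = 0.30) -> str:
    """Use average scene-to-scene change as a rough proxy for emotional turbulence."""
    turbulence = np.abs(np.diff(scores)).mean()
    if turbulence < too_flat:
        return "too flat: not enough emotional change to hold attention"
    if turbulence > too_turbulent:
        return "too turbulent: viewers are likely to drop off"
    return "within the target band"

print(cadence_check(scene_scores))
```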

Once SmartAssets analyzes your creative, you can also make easy fixes or amendments to the images in-platform. What’s the benefit there?

Well, your Creative team probably created those downstream assets months ago for a Facebook ad, for example. They’re on to the next job; they don’t want to go back and reformat or tweak an older asset.

So SmartAssets gives the Media team the opportunity to make creative changes and revisions. We’re often talking about relatively short-lived social campaigns, though, and small assets. The risk overall is low.

It’s great in a scenario where speed-to-market is essential, like around the holidays, when you simply don’t have time to wait on Creative for minor revisions.

If SmartAssets is making suggestions for future assets based on past performance, does that create a scenario that’s averse to risk-taking or envelope-pushing?

There’s a narrative out there that AI is going to damage the quality of advertising. That there’s just going to be a flurry of low-quality images that are pumped out by the GenAI tools.

I feel much more optimistic about it, personally, because we’re using AI to understand what should be in the asset. So the quality of the assets we’re producing is going to go up. They won’t be spammy. The asset will always effectively communicate what the product or service is and why it meets your needs. You definitely can’t say that for all the ads you see today!

We can ensure that consumers are receiving proper information that drives them toward a purchase. It’s not about just having some random GenAI multi-colored picture with lollipops and unicorns in it, you know.

One issue might be how differently people define advertising “quality.” Is a quality ad one that’s artistic or aesthetic—or one that achieves the point of an ad in the first place, which is brand awareness and conversion?

What we mean by “effective” is a good question, and that’s also for us to discuss with the brand, to determine what their campaign is looking to do.

But there are some objective things, in terms of quality. If the resolution on the asset is poor, it might look terrible on your phone. If you take a square asset and stick it in an Instagram reel, you’ve got all this real estate at the top and bottom—it looks bad.

SmartAssets is built on AI, and AI is changing all the time. How is the platform set up to evolve as the tech evolves? What’s on the horizon?

I think we’ll see automated prompt engineering, and SmartAssets will be part of that. We are effectively creating a set of data-based rules that will write the prompt in a really effective way, automatically.
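To illustrate the direction of travel, here’s a minimal sketch of assembling a generation prompt from data-derived creative rules. The brand values, rules, and build_prompt helper are hypothetical, not SmartAssets’ automated prompt engineering.

```python
# Sketch: turn a brief plus performance-derived rules into a generation prompt.
# Everything below (brand values, rules, wording) is a made-up example.
BRAND = {"palette": "deep blue and white", "logo_position": "top-right corner"}

# Rules a creative-performance analysis might surface, ordered by measured lift.
LEARNED_RULES = [
    "show the product in use by a single person",
    "keep the background uncluttered",
    "leave clear space for the logo in the {logo_position}",
]

def build_prompt(brief: str, brand: dict, rules: list[str]) -> str:
    """Combine a short creative brief, brand settings, and learned rules."""
    rule_text = "; ".join(rule.format(**brand) for rule in rules)
    return (
        f"{brief}. Brand palette: {brand['palette']}. "
        f"Creative guidance from past performance: {rule_text}."
    )

print(build_prompt("Summer campaign hero image for a running shoe", BRAND, LEARNED_RULES))
```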

In terms of flexibility of the product, and how that changes as the tech does... Well, there are lots of generative models out there, right? And we’ve built SmartAssets in an agnostic way. There are tools out there that continuously monitor the quality and consistency of the output from various LLMs, and that’s really useful.

Eventually, everyone will have access to the same tools at the same price. The nuance is going to be how you prompt them, and which model you choose based on knowing what it’s specifically good for.

In this sense, SmartAssets works like middleware, pairing your specific need with the right model for that use case. And that’s all happening under the umbrella of a single platform.

How have you personally observed people’s attitudes about AI and marketing changing recently?

The creative agencies are getting warmer and warmer to it. 

They’re not afraid. They realize it has to be part of their suite, and that the value is the nuanced use of these tools—not whether or not we have them in the first place.

Interested in learning more? We’d love to show you how SmartAssets works.

Scott Indrisek

Scott Indrisek is the Senior Editorial Lead at Stagwell Marketing Cloud.
