How Yieldmo Elevates Creative Performance
with Dynamic Format Optimization and AI

Part 1 of 3


Yieldmo’s AI/ML System: Ingredients 1 and 2, Formats and Training Data

In our whitepaper, “Elevating Creative Performance with Dynamic Format Optimization and Artificial Intelligence (AI),” we identify five critical ingredients for building a machine learning (AI) platform that delivers the best format experience for your creative, every time it serves.

We learned that for every opportunity that exists to serve a creative to a user, there are many possible creative format experiences. The final form of the advertising creative should not be the same in every scenario; it should be a product of the multitude of factors that converge at the moment the user and the creative meet. This is a challenge that AI is well-positioned to solve. 

In this blog series, we’ll talk about how Yieldmo developed an AI/ML system that generates 45% additive KPI performance gains for customers, starting with ingredients 1 and 2: Formats and Training Data.


The first step in predicting which ad format is likely to perform best under different conditions is giving the ML models a source of feedback that is both prevalent and granular: training data. To accomplish this, Yieldmo developed a solution that collects and processes an enormous amount of data. To provide the dataset the machine learning models need to find meaningful patterns, Yieldmo embeds a piece of JavaScript in each ad impression that sends information to our servers, five times per second, about how each user is interacting with the ad. We couple this with data from the bid stream for a complete picture of the user’s experience with the ad.
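To make the mechanics concrete, here is a minimal sketch of what an in-ad sampler like this could look like. The class name, field names, and send function are illustrative assumptions, not Yieldmo’s actual tag or API; it simply shows the pattern of sampling interaction state five times per second and flushing batched samples to a collector.

```javascript
// Hypothetical sketch of an in-ad telemetry sampler. Names and payload
// shape are illustrative only, not Yieldmo's actual implementation.
class InteractionSampler {
  constructor(sendFn, intervalMs = 200) { // 200 ms => 5 samples per second
    this.sendFn = sendFn;     // e.g. a wrapper around navigator.sendBeacon
    this.intervalMs = intervalMs;
    this.buffer = [];
  }

  // Record one snapshot of the current interaction state.
  sample(state) {
    this.buffer.push({ ts: Date.now(), ...state });
  }

  // Flush buffered samples in a single payload to limit request overhead.
  // Returns the number of samples sent.
  flush() {
    if (this.buffer.length === 0) return 0;
    const count = this.buffer.length;
    this.sendFn(JSON.stringify(this.buffer));
    this.buffer = [];
    return count;
  }
}
```

In a real ad, `sample` would be driven by a timer and fed from scroll, touch, and device-orientation event listeners, with `flush` called periodically and on page unload (for example via `navigator.sendBeacon`, which is designed to deliver small payloads reliably as the page closes).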

Yieldmo’s 21+ creative formats enhance this process: they are designed to drive engagement, and they offer additional interactions, like expands and swipes, that gather even more of this valuable, predictive data.

The signals we utilize can broadly be divided into attention signals and environment signals.

  • Attention signals define how the user interacts with the ad (pixel seconds, scroll speed and direction, changes in screen angle and device movement, touches inside and outside of the format, ad expansions, etc.).
  • Environment signals define the context in which the user interacts with the ad (daypart, location, connection type, publisher, content, keywords, ad slot information, etc.).
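As a loose illustration of the two signal families side by side, a single training example might look something like the record below. Every field name and value here is hypothetical, chosen to echo the examples in the bullets above rather than to reflect Yieldmo’s actual schema.

```javascript
// Hypothetical training example combining both signal families.
// All field names and values are illustrative, not Yieldmo's schema.
const example = {
  attention: {
    pixelSeconds: 3.4,        // time-weighted viewable exposure
    scrollSpeed: 1.1,         // screens per second
    scrollDirection: "down",
    deviceTiltDelta: 4.2,     // degrees of screen-angle change
    touchesInFormat: 2,
    expansions: 1,
  },
  environment: {
    daypart: "evening",
    connectionType: "wifi",
    publisher: "example-news",
    adSlot: { width: 300, height: 250, position: "in-article" },
  },
  // The outcome a model might learn to predict from these signals.
  label: { engaged: true },
};
```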

 

Because these signals are so prevalent and valuable, machine learning models can use them not only to select the optimal format but also to predict other outcomes of interest to advertisers. All of these signals are inherently privacy-safe and don’t require identity, but they can be intersected with audience data or available IDs, whenever compliant and available, to improve predictions.

We process over 70 different signals from each ad, and over 200 terabytes of this data each day from every user interaction with every ad on our exchange, which in turn feed our machine learning models.

The right creative format amplifies the core underlying creative without changing it (or its purpose), creating a unique interactive experience. The result is marked improvement across a variety of key KPIs, including brand awareness and recall, attention, consideration, shoppability, site visits, and more.

By developing and deploying a suite of programmatic-friendly, high-impact creative formats, Yieldmo generates the valuable training data we need to make meaningful predictions.

That’s it for ingredients 1 and 2. 

In our next post, we’ll talk about ingredients 3-5: how we process and activate the data, and the models built from it, to drive results.
