Speed, scale, and mass targeting — it’s not hard to see why the lure of programmatic benefits has led digital advertising to become an industry defined by data. But, as Giovanni Strocchi, CEO of ADmantX, writes, such a one-track approach has its shortcomings.
“Without knowing the force of words, it is impossible to know more.” – Confucius
By focusing almost exclusively on data-driven automation, the industry has neglected two key areas: semantics and context. And this oversight has created multiple issues, including compromised brand safety, irrelevant ads, unreliable audience profiling, and poor budget planning. In short, consumer and brand faith in digital advertising is being jeopardised by its failure to recognise one fact: the complexity of language matters just as much in the online sphere as it does on paper.
As technological advances continuously increase our dependence on automation, it is vital that we go back to basics to understand the impact of linguistics. The force of words may be easily overlooked, but without an understanding of what they mean, even programmatic progress is limited.
What, then, is it that makes linguistics important and how can advertisers use it to transform their performance? Let’s take a deeper look.
Digital advertising is in bad shape
So far this year, the digital advertising industry hasn’t been cast in a very flattering light. First, Procter & Gamble’s (P&G) chief brand officer, Marc Pritchard, described the current media buying system as “murky at best”, while calling for greater transparency and more emphasis on ad quality. Then, an investigation by The Times revealed that ads for hundreds of major brands had appeared alongside hate speech and other salacious content, which many attributed to gaps in programmatic security – allowing ads to be placed in unsafe and inappropriate environments.
To cast digital advertising in a more favourable light – especially the portions of it that are traded programmatically – marketers need to take control of the way words are treated across the industry. And they must start by acknowledging problems with existing methods.
Overload and underload: contextual targeting is missing the mark
Most marketers know that ensuring ads end up in the right place requires a solid foundation of detailed insight. Yet current approaches to contextual targeting and keyword analysis take a broad approach to language assessment. Search engines, for example, use simplistic full-text systems whereby a ‘cleaned-up’ version of page text is created by considering words as simple character strings. But, while this might make searches run faster, it also strips out vital linguistic meaning and increases the risk that time will be wasted evaluating ‘useless’ words. Consequently, the information these tools produce is often not only incorrect, but also too little or too much – causing issues known as data ‘overload’ and ‘underload’.
Data overload is what happens when contextual targeting systems regard words as ‘strings’ or ‘tokens’, rather than exploring their meaning in a particular context, which makes it hard to classify and organise inputs and outputs effectively. This is problematic for marketers because the information these systems generate flows into ad servers, filling them with unreliable data that frequently leads to ad mismatches.
Data underload occurs when inefficiencies in classification prevent systems from generating actionable insight – they cannot distinguish a page’s main topics from minor points covered within the content, scoring all concepts equally or too low. Again, this is bad news for advertisers. Restricted insight increases the risk of ads reaching inappropriate audiences or being placed on the wrong site.
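The overload and underload problems above can be seen in a toy sketch. The function below does what simplistic full-text systems do: it treats words as character strings and counts keyword hits. The page text and keyword list are invented for illustration – note how the count for ‘jaguar’ mixes the car with the animal (overload), while nothing in the output says which concept the page is actually about (underload).

```python
import re
from collections import Counter

def naive_keyword_profile(page_text, keywords):
    """Token-based matching: every keyword hit counts the same,
    regardless of its meaning or how central it is to the page."""
    tokens = re.findall(r"[a-z']+", page_text.lower())
    counts = Counter(tokens)
    return {kw: counts[kw] for kw in keywords}

# A motoring review that mentions the animal only in passing.
page = ("The new Jaguar saloon is quick and refined. "
        "Its badge, a leaping jaguar, nods to the brand's heritage.")

print(naive_keyword_profile(page, ["jaguar", "saloon"]))
# {'jaguar': 2, 'saloon': 1}
```

Both senses of ‘jaguar’ inflate the same count, and the figures carry no sense of which topic dominates the page – exactly the unreliable data that flows into ad servers.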
Either way, the result is a series of potential disasters for marketers: compromised brand safety, irrelevant advertising, and budgets spent on media that does not yield a good return.
Averting disaster: unlocking the meaning of each word
To avoid the issues inadequate analysis can bring, marketers must rediscover the value of using the meaning of words on a page to ensure appropriate and successful ad placement. And the good news is, there’s a simple way of doing so. Developments in cognitive semantic technology, notably Natural Language Processing (NLP), have made it possible for artificial intelligence (AI) to read content as a human does. Using the information generated, marketers can then uncover the true meaning of words, in context, and the sentiment they express.
This is achieved through a more complex process of linguistic analysis that digs down into the many layers and ambiguities of every word on a page. Here’s the short version: semantic technology collects all structural and lexical aspects of content to identify the meaning of words, just as the traditional study of linguistics does on paper. The longer version is an assessment that involves five key stages:
1. Parser: the word forms, grammar, and syntax of each sentence are assessed – this includes analysing words independently of their written form to pick up on subtle changes in the way nouns and verbs are used.
2. Lexicon: the meaning of words is recognised and ambiguities are removed; so ‘Jaguar’ may be identified as an animal in one context and a car in another.
3. Memory: the human tendency to understand new concepts by comparing them to an internal database of previously encountered concepts is replicated.
4. Knowledge: real-world knowledge is used in place of human ‘common sense’ to establish the cultural context of content for different audiences.
5. Content representation: analysis output is represented in a concept map, with concepts represented independently of words to make the meaning behind them clear.
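The five stages above can be sketched in miniature. The stage names follow the article, but the lexicon, context cues, and scoring below are invented for illustration – this is a toy, not a description of any production semantic engine.

```python
import re

# Stage 4 stand-in: a tiny lexicon of senses with real-world cues.
LEXICON = {
    "jaguar": {
        "animal": {"jungle", "prey", "wildlife", "roared"},
        "car":    {"engine", "saloon", "drive", "brand"},
    },
}

def parse(text):
    """Stage 1 - parser: reduce text to its word forms."""
    return re.findall(r"[a-z']+", text.lower())

def disambiguate(word, context_words, memory):
    """Stages 2-4 - lexicon, memory, knowledge: pick the sense whose
    cues best overlap the surrounding words and past concepts."""
    senses = LEXICON.get(word)
    if not senses:
        return None
    scores = {sense: len(cues & (context_words | memory))
              for sense, cues in senses.items()}
    return max(scores, key=scores.get)

def concept_map(text):
    """Stage 5 - content representation: words mapped to concepts,
    independently of their written form."""
    memory = set()   # stage 3: concepts already seen on this page
    concepts = {}
    words = parse(text)
    for i, word in enumerate(words):
        context = set(words[max(0, i - 5):i + 6])
        sense = disambiguate(word, context, memory)
        if sense:
            concepts[word] = sense
            memory.add(sense)
    return concepts

print(concept_map("The jaguar roared at its prey in the jungle"))
# {'jaguar': 'animal'}
```

The same word resolves to ‘car’ when the surrounding words change – which is precisely what string-based matching cannot do.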
What can cognitive semantic analysis do for marketers?
Once marketers have got to grips with the power and meaning of words, an array of options opens up to them.
With a precise view of the concepts, and deep emotional analysis of any given page, they can achieve in-depth contextual targeting – placing ads beside content that complements and enhances their message. The insight this analysis produces can also be leveraged to improve personalisation through advanced first-party propensity profiling. By analysing the moods, tastes, and interests of consumers, marketers can build a behavioural base for machine learning techniques ‘trained’ on the positive results of real customer interactions, and develop individual profiles that indicate how audiences are likely to respond to certain messages.
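A minimal sketch of that profiling idea, under invented data: past interactions pair the concepts found on a page with whether the user engaged, and a per-concept propensity is learned from the positives. A real system would use a proper machine learning model; the averaging here just illustrates the shape of the approach.

```python
from collections import defaultdict

def train_propensity(interactions):
    """interactions: list of (concepts_on_page, engaged) pairs,
    where engaged is 1 for a positive interaction, 0 otherwise."""
    seen = defaultdict(int)
    engaged = defaultdict(int)
    for concepts, did_engage in interactions:
        for concept in concepts:
            seen[concept] += 1
            engaged[concept] += did_engage
    # Per-concept engagement rate, learned from real outcomes.
    return {c: engaged[c] / seen[c] for c in seen}

def score_page(model, concepts):
    """Predict responsiveness as the average learned propensity
    over the concepts appearing on a candidate page."""
    known = [model[c] for c in concepts if c in model]
    return sum(known) / len(known) if known else 0.0

history = [
    ({"motoring", "luxury"}, 1),
    ({"motoring", "finance"}, 1),
    ({"wildlife", "travel"}, 0),
]
model = train_propensity(history)
print(round(score_page(model, {"motoring", "luxury"}), 2))  # 1.0
```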
Last but not least, there’s brand safety. Through pre-bid analysis of pages at URL level, marketers can gain a comprehensive view of the topics, concepts, and emotions associated with a piece of content before ads are placed. This means they can make sure ads appear beside content that reflects well on their brand and steer clear of content that would be detrimental to it.
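In code, a pre-bid check of this kind might look like the sketch below. The page analysis (concepts plus a sentiment score) is assumed to come from a semantic engine; the unsafe-concept list and sentiment threshold are example policy values, not anyone’s real configuration.

```python
# Example policy: concepts a brand never wants to appear beside.
UNSAFE_CONCEPTS = {"hate speech", "violence", "adult"}

def safe_to_bid(page_analysis, min_sentiment=-0.2):
    """Return True only if the URL-level analysis shows no unsafe
    concept and the page's overall sentiment is not strongly
    negative; run before the bid is placed."""
    concepts = set(page_analysis["concepts"])
    if concepts & UNSAFE_CONCEPTS:
        return False
    return page_analysis["sentiment"] >= min_sentiment

page = {"url": "https://example.com/review",
        "concepts": {"motoring", "luxury"},
        "sentiment": 0.6}
print(safe_to_bid(page))  # True
```

The key design point is that the decision happens pre-bid: the budget is never committed to an environment the analysis has flagged.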
Data matters. It’s the essential ingredient for accurate campaign planning, targeting, and measurement – but its utility is limited if marketers don’t also take the importance of a word’s meaning into account. Without a full understanding of the complexities inherent in language, digital advertising risks losing both audience interest and its reputation. So, what’s in a word? An awful lot – and if marketers want to secure greater audience engagement and consistent brand safety, it’s time to get to grips with linguistics.