Nielsen’s David Kenny Explains Why Diversity in AI is Non-Negotiable

September 26, 2019 Anne Curtin

Gone are the days when marketing decisions were guided by intuition and experience. Today, marketers have access to more consumer and interaction data than ever before to inform their decision-making. Artificial intelligence (AI) is only increasing their ability to process and activate that data faster.

No one is a bigger proponent of AI than Nielsen CEO and Chief Diversity Officer David Kenny. But when it comes to measurement, he says the key is to make sure everyone is included.

In a recent interview with Bloomberg’s Shelly Banjo at the Bloomberg Global Business Forum, Kenny explains how AI is opening doors to innovation, and why it’s essential that both the data and the models reflect everyone.

Changing the Way Humans & Machines Work Together

Kenny is no stranger to AI. His decades of experience in big data and artificial intelligence go back to his time as CEO at The Weather Company. 

“When we decided to put the weather forecast on every mobile phone, we had to do forecasts for two billion people every 15 minutes. This had to be done by machine,” said Kenny.

But reducing the models’ error meant removing people from the process. “There used to be a 12% error rate between what we predicted and what actually happened when looking three days out. We got it down to 4% because the machines got smarter.”

But that also meant Kenny and his team had to stop letting meteorologists override the machine. “Seventy-five percent of the time, when the meteorologists [meddled in the machine], they made it worse. So half the error was taking humans out of overriding the machine,” continued Kenny.

Removing false data caused by human intervention allowed the machine to learn and make more accurate predictions. “It’s much better to have the algorithm work and improve on its own,” explained Kenny. “We’re finding this all the time in the way we do ratings and CPG advertising today. Letting the machines operate is just getting to better answers.”

The key to success, he says, is training humans to trust the machine – whether it’s forecasting the weather or putting a network schedule together. “Programmers think they know better what people are going to watch, that it’s all an art. Not true,” says Kenny. “The machines are actually better at predicting what people are going to watch and how the ads are going to show.” 

While some might worry that AI will supplant people, Kenny insists that’s not the case. Rather, it opens the door to more innovation. For instance, the majority of airline delays are caused by weather. With more accurate forecasts, airlines can be more innovative in how they use that data in their aviation systems.

The same thing is true in media planning and advertising. “[AI] is actually encouraging people to be far more creative. I honestly think jobs are going to be a lot more interesting when we let machines do the rote work and we’re able to spend all of our time on new innovation.” 

The Importance of Diversity in AI Models

Nielsen ratings are often referred to as “currency” in the media buying and selling process. For context, there’s a $7B TV advertising market in the US alone, all of which trades on Nielsen data. This data is also used to determine which shows are produced, where ad dollars are spent, what products are developed, where new stores are built, and more. 

Ensuring dollars are properly allocated to these efforts means Nielsen’s data must be true. And ensuring Nielsen’s data is true means counting everyone – a commitment Kenny has championed since taking on the role of chief diversity officer at Nielsen in addition to CEO.

“Fundamentally, the only way Nielsen becomes the trusted data set in media and consumer goods is by counting everybody, and I need to commit that to our shareholders, our board of directors, and to our clients. So, for me, there was no other place but to put diversity at the very top. You can’t have a true data set unless it counts everyone,” reiterated Kenny.

“This means counting every LGBTQ person, ensuring both men’s and women’s voices are represented, removing racial bias, and including both rural and urban areas in the model. The risk is that you don’t want AI to be an elite platform. You want it to be a human platform, and I think we have a great chance to bring all humans into the decision using this technology,” added Kenny.

He also expressed the importance of looking forward, not back, particularly as it relates to measurement and programming. “If [these efforts] are only based on what we’ve seen in the past, then you’ll probably end up with some gender bias, you’ll certainly end up with some racial bias, and you certainly won’t be inclusive of all sexual orientations.”

“So I think we really have to insist that the data be inclusive to make sure that the AI models are serving everybody. And quite honestly, it drives revenue growth. There’s good business in serving everybody versus deducing down to the population you’ve traditionally understood,” concluded Kenny.

>> Watch more of David Kenny’s interview at the Bloomberg Global Business Forum.
