CANNES — Data used by marketers is likely to be intrinsically biased, and needs its own dose of diversity, according to speakers at Weber Shandwick’s fringe panel session with AdAge this week.

Introducing the session, Weber Shandwick president Gail Heimann said around 45 sessions at the Cannes Lions Festival of Creativity were focused on data, but “while we believe it’s the foundation of everything we do, and assume data represents fully objective truth, the reality is that data may in fact be biased, sexist and even racist. If marketing depends on data, how can we protect the integrity of that data?”

Heimann was joined by IBM designer Adam Cutler, who wrote the firm’s Everyday Ethics for AI guidelines, which include a chapter on fairness and data bias. He said part of the problem was that the AI or machine learning used to gather and evaluate data for insights-led marketing was programmed by humans, who are intrinsically biased.

“I don’t know if it’s possible for us to be without bias, but what does it mean when we teach machines? Data is the fuel and insight is the destination, and we’re figuring this out in real time. If you’re not thinking about the ethical ramifications of what you’re doing at the idea stage, you’ll be trying to put toothpaste back in the tube,” he said.

Also on the panel was Bonin Bough, founder and chief growth officer of Bonin Ventures and a former Mondelez, PepsiCo and Kraft Foods executive, who said that data could do with its own dose of diversity: “Our decisions are shaped by our background, and the technology today is being shaped by human biases we naturally come to the table with.

“We talk a lot about diversity but it doesn’t matter if you are male, female, black or white, if you all do the same course at Harvard you’ll all think the same. One of the biggest challenges in organisational decision-making is diversity of thought.”

And Caroline Criado Perez, author of ‘Invisible Women: Exposing Data Bias in a World Designed for Men’, said biased data amplifies bias: “Images involving cooking, for example, are something like 33% more likely to include women, so algorithms are 60% more likely to label men in a kitchen as women.”

Criado Perez pointed out that because most data sets do not break out data on women, they are intrinsically biased: “Nearly every data set in the world, from economics to health, suffers from the bias of not having disaggregated data on women, so you are starting with information based on male bodies and male lifestyles. Data from women is excluded as being confounding and atypical, with the result that we’re not designing cars, workplaces or medication for humans, we’re designing for men.”

She added: “If you’re going to market a product, you need to make sure you have women on your team, and you need to base the stories about that product on sex-disaggregated data.”