The third future is hyper-personal

Category: Artificial Intelligence, Future PR, Public Relations, Society


The third future for public relations is all about you – and it’s personal. Hyper-personal, in fact. I suspect a lot of us have experienced that spooky moment when, after a random in-person conversation, ads and information connected with our chat start appearing in our feeds. It happens even though we have never searched for the subject in our lives; despite little or no interest on our part, ads for German castles, Icelandic pony trekking or tips on chicken farming just keep coming. We have data to thank for that, and over the years we have got used to the personalisation of brand and consumer offerings – Amazon, Netflix, Starbucks and others led the way, and for a lot of people that’s just fine. They are happy to hand over their data in return for a more personalised experience. But as we shift into the faster gear of hyper-personalisation, with that ‘personal experience’ coming at us in real time, it’s time to test the brakes before we find ourselves on a runaway train.

I’m hoping we can all agree that there is no such thing as ‘the public’. There are an infinite number of publics, and anyone who has ever created a stakeholder map will know that mapping is messy: our communities of interest are getting smaller and smaller, and the information we share, the stories we tell and the connections we make are becoming ever more personal. Nor is there any such thing as ‘the public interest’ – there is a multitude of interests that rise and fall, clash and diminish, much like the spheres in an old-school lava lamp (my lifelong analogy for community activity). Given this need for niche communication suited to smaller groups and individuals, it always surprises me that many organisations still take a shotgun approach, trying to be ‘everything, everywhere, all at once’ for everybody.

Different ways of using data give us different types of personalisation. ‘Ordinary’ personalisation is about the past, whereas hyper-personalisation is about the present. Hyper-personalisation combines real-time data, machine learning and advanced technologies such as predictive analytics to deliver individual experiences to customers. And we’re not just talking about nuts-and-bolts data like your IP address, activity and location. We’re now looking at emotion recognition layered over facial recognition, manifested most recently in Pizza Hut’s emotion-recognition system that will serve up the pizza best suited to your current mood. It’s an ‘in-your-face’ future we glimpsed many years ago in the film Minority Report and, unimaginable though it is, the reality is more sinister than the movie. Facial recognition is used by governments and organisations on a daily basis, and the line between hyper-personalisation and surveillance is blurred.

So what does this future-now hold for public relations practice? Our first call will be inside our organisations, guiding ethics and intent. Just because something is legal doesn’t make it ethical, and organisations need to be sure of their intent. A 2020 Deloitte report – Connecting with meaning: Hyper-personalizing the customer experience using data, analytics, and AI – states that, for the CMO, hyper-personalisation is there to drive profit and the technology will maximise revenue. I would argue that this is not good societal intent: legal, certainly, but questionable on the ethical front. The sourcing, use, retention and future deployment of live data need to be understood, and the intent behind them identified. This will help – at least in some small way – to preserve the relationship and maintain a mutually beneficial human connection between organisation and people; otherwise our people and communities simply become a cash cow, milked dry of data.

Hyper-personalisation also lands us on the shifting sands of misinformation. Targeted information conceived, seeded and based on scanned mood will undermine societal cohesion. As many countries enter election cycles, it is entirely probable (and I’m holding my breath waiting for it to happen) that this atomic mix of technologies will unleash mayhem in many parts of the world. Here in New Zealand, we’ve already had one political party using undeclared AI and thinking nothing of the skewed perspectives created and the untruths spread as a result. US Republicans created their dystopian AI-generated attack ad depicting another term with President Biden, but at least – unlike the New Zealand National Party – they declared it to be AI. Yet even with that declaration, it is unlikely that most people will realise such depictions are fictional; in the same way that so-called behavioural economics is used to manipulate many publics, they will simply believe what they see.

This pace of change suggests that in our third future of hyper-personalisation, public relations practitioners will become firefighters, spotting and extinguishing wildfires of disinformation and discord. A deeper understanding and management of AI–human relations will form part of our role. As organisations deploy generative models to develop and optimise ‘content’, that content will, in the end, speak only to other algorithms, leading to relationship breakdowns and humans slipping through the cracks of perceived communication. It is hard enough now to speak to a person: negotiating chatbots, ‘live’ online help desks (run by chatbots), possibly a digital human with a smiley face – and, if you are very, very lucky and prepared for a four-hour wait, you might, only might, get through to a human. The dark patterns used in websites for years, which make it harder to leave a site or complain, have been seamlessly integrated with AI-powered ‘personalised’ customer service provision (and I use the term ‘customer service’ loosely). Such methods may reduce costs, but they ultimately reduce customer numbers too.

How, then, do we avoid dangerous deployment? It could be that regulation will play a part, or that the control and training of AI is taken out of the hands of hugely powerful corporates like Apple, Meta, Google, OpenAI and others – the shadowy others we seldom see. Our societal models are based on profit and power, and corporates and media companies are tuned to make a profit – because that’s their job. For all the great talk of purpose, commercial organisations are looking to maximise their returns and governments are looking to hold on to power.

The movie Oppenheimer is about to be released, telling the story of the atomic bomb, first tested on 16 July 1945, just shy of 78 years ago. It was a technological development that changed the course of history, and we remain in its shadow today. The deployment, testing and use of generative AI – a technology that improves itself but has human bias and frailty at its core – is another such moment. Without brave navigation supported by good intent, the explosion of data and implosion of truth and reality will injure us all, and the opportunity for good – great good – will be lost.

Hope to see you tomorrow for our fourth future – immersion.