Misinformation, disinformation, scams, and questionable videos have been commonplace on social media for years. The Badger, like many, has become distrustful of content pushed to him by algorithms because it is often not what it appears or purports to be. Three typical examples of content that have fuelled the Badger’s distrust are as follows. The first is spectacular, obviously fake video of shipping and aircraft incidents that puts Hollywood movies to shame. The second is content from activist or political groups that criticise or parody others and promise a better future; such groups are unreliable and frequently blinkered, with short memories. The third is incessant clickbait; life’s too short to waste time clicking such links. Putting it diplomatically, you can tell by now that the Badger’s trust in what’s pushed to his social media feeds is not high.
AI, of course, is increasingly helping the producers of the content that has led to this erosion of trust. As this report from the University of Melbourne in Australia highlights, there’s a complex relationship between AI adoption and trust. It reports that while 66% of its survey respondents use AI regularly and believe in its benefits, less than half (46%) trust the AI they use. The Badger aligns with this finding: he’s an occasional user of AI, but he doesn’t trust it. This ‘trust gap’, as the report highlights, is a critical challenge for AI’s wider adoption.
Reflecting on this has led the Badger to two conclusions. The first is that, since anyone can create content with AI tools, the volume and sophistication of misinformation, disinformation, scams, and questionable video content in social media feeds will inevitably increase further. Soon the question to ask yourself about social media feeds will no longer be ‘what’s fake?’… but ‘what’s real?’ The second is that this trend, together with society’s huge energy bill for AI and AI’s unsustainably high stock market valuations, is widening rather than closing the Badger’s ‘trust gap’.
AI tools are here to stay, but as the report above points out, the biggest challenge for AI is trust. As the common adage goes, trust is the easiest thing in the world to lose and the hardest thing in the world to get back. At present, it doesn’t feel as if AI is winning the battle for our trust. The Badger’s current overall feeling about the question of trust is nicely summed up by this passage from J.K. Rowling’s book ‘Harry Potter and the Chamber of Secrets’: ‘Ginny!’ said Mr. Weasley, flabbergasted. ‘Haven’t I taught you anything? What have I always told you? Never trust anything that can think for itself if you can’t see where it keeps its brain?’ For the Badger, the last sentence of this passage, written over two decades ago, gets to the nub of the AI and trust issue…