My son and I were talking about the tragic death of the father of Liverpool’s goalkeeper, Alisson Becker. He drowned in a lake on their property in Brazil. Like most 13-year-olds, my son gets most of his news from TikTok, and he was shocked at the comments below the story: people being cruel and making jokes about a footballer’s father dying. This is the world of social media we are living in today. A world where football players are racially abused by their own “fans” for missing a penalty, where politicians are abused for whatever they do, and where hatred is allowed to flourish.
Then there was the protest on Grafton Street last week by anti-lockdown, anti-vaxxer groups. Many of the protesters belong to Facebook groups that peddle conspiracy theories, including one claiming that babies are being killed and harvested for “adrenochrome”, which is used to keep RTÉ presenters looking young, and that the corpses are buried under the new children’s hospital. These Facebook groups aren’t small; their memberships run into the tens of thousands.
Social media platforms in the US are protected from liability for what their users post on their sites. More and more people are getting their news from social media platforms that do not operate under the same reporting standards and restrictions as news organisations. Facebook has moderators who take down posts, but they can’t keep up. Yet when Facebook was in dispute with the Australian government over paying news publishers, it didn’t take the company long to remove news content entirely. Why can’t they take down the hatred that is spewed out on their platform all the time?
YouTube’s algorithm is designed to keep you watching so the company can sell advertising. It does this by recommending related videos, so if you start watching a QAnon video, for example, it will offer you more of them, leading you towards ever more extreme opinions. This is how people become radicalised. It won’t recommend videos telling you that QAnon is made up and based on lies; that isn’t going to keep you on YouTube, and it would hurt their revenue.
Which brings us to their place in ESG funds. ESG stands for Environmental, Social and Governance. Companies like Facebook and Google may be forward-looking with regard to the environment and corporate governance. They may pay their employees well and give them lots of perks. But what about what they are allowing on their sites, and what it is doing to people’s minds?
Social media platforms are being used to promote hatred, bigotry and racism. They have become meeting places where people with extreme views can find others who share them and gain a voice. Even if they are small in number, the power of social media allows them to amplify their views.
Should fund managers accept companies that hide behind Section 230 of the Communications Decency Act into an ESG fund? Should we believe them when they say they are doing all they can, when they are quick enough to act the moment something threatens their massive revenue streams?
With EU regulations set to place more emphasis on companies meeting ESG standards, isn’t it time that fund managers and large investment funds put pressure on these platforms to clean up their act?
What do you think? Drop me an email at firstname.lastname@example.org
08 March 2021