<div class="header-image"></div> <table class="table-header"> <thead> <tr> <th colspan="2"></th> </tr> </thead> <tbody> <tr> <td>2024-09-08</td> <td style="text-align: right;"><a href="About.md" class="internal-link">About</a></td> </tr> </tbody> </table>

# Social Winds and Social Media

![socialMediaStorm](../Blog/Assets/socialMediaStorm.jpeg)

With the recent [detainment of Telegram CEO Pavel Durov](../Blog/2024-08-30%20Telegram,%20Signal,%20and%20Maintaining%20Digital%20Privacy.md#Telegram,%20Signal,%20and%20Maintaining%20Digital%20Privacy), his release on 5 million euros bail, and revelations that "this year we are committed to turn moderation on Telegram from an area of criticism into one of praise," I thought I would get my thoughts out on appropriate moderation for social media platforms: what should be regulated by law, and what should be up to the platform owner.

There is much talk lately about the responsibility of social media platforms to moderate what appears on them. I, myself, [have previously blamed social media](../Blog/2024-07-02%20AdBusters,%20Occupy%20Wall%20Street,%20and%20the%20Great%20Culture%20War%20Swindle.md) for much of the unrest that has spilled out into our cities in the last few years. Many people are indeed out protesting things they know nothing about, or they have such an exaggerated sense of urgency and disgust about any given situation that they feel compelled to throw bricks through windows or leave antisemitic graffiti on statues in parks. It's a difficult situation because, on one hand, I do feel that the responsibility for these acts should rest squarely on the idiot committing them. How can social media platforms possibly assume responsibility for what every moron does who happens to have an account there?
On the other hand, though, I also feel that these idiots wouldn't be doing these things without social media algorithms pulling their marionette strings, influencing their beliefs and worldview with such twisted perspectives that we see huge crowds in organised protests *supporting Hamas*, for Christ's sake. It's a cognitive dissonance that I'm unable to resolve. That being said, I do have a few ideas that, taken together, I think will result in far fewer headaches and ruined statues.

First, there needs to be a crackdown on public protests. **Anything** illegal needs to be dealt with swiftly and harshly. Perhaps the minimum penalties for crimes should be doubled when committed during protests or public gatherings of over 50 people. But at the very least, police need to have more of a presence during such protests, and there should be *immediate* arrests for anyone breaking the law. Police need more leniency and leeway to break up mobs of unruly thugs. There must, once again, be consequences for the massive damage and turmoil caused by these demonstrations.

Secondly, there really *does* need to be some regulation regarding how social media sites are run; however, I don't think heavy-handed moderation is the answer. There need to be laws controlling, or prohibiting altogether, social media algorithms that are specifically designed to addict users to their platforms by feeding them whatever nonsense possible to keep them scrolling. This is the *true* source of rampant misinformation, and it is what tips the scales in favour of civil unrest. It distorts reality and gives the false impression that any given situation is far more dire than it actually is. These algorithms are used not just by social media platforms, but by search engines and video platforms as well. The law should be written to force companies to open-source the code for these algorithms to ensure transparency, and to put limits on their reach and manipulation of users' feeds.
Or ban them outright, beyond what each individual user does to craft his or her own algorithm (checking boxes for interests and keywords, for example).

As far as moderation of things like hate speech goes, that should be up to the platform owners. If a platform decides to allow the full gamut of speech, that should not be against any law, regardless of whether a particular user posts something that would break speech laws in his or her own country. Let the police look at the posts themselves and decide if any arrests need to be made. It shouldn't be the role of a social media platform to know the law in every country on earth and police users accordingly. If the police want to arrest people for Tweets, for example, that is a situation to be left between the police and the citizens of that country. And of course, anonymous users should understand that, if police require any cooperation, it will probably be given, and that protecting themselves is a personal responsibility.

With all that said, users should have the tools to moderate their own feeds however they see fit. They should be able to block or mute individual users, entire countries, and users with particular interests; maintain a list of words, phrases, even *emojis*, that they don't want to see in their feeds; and any number of things I haven't thought of. Users should have ultimate control over what they see and what they don't. It shouldn't be left to a cold, emotionless algorithm designed only to keep eyeballs glued to screens for as long as possible. Nor should it be left to overzealous moderators trying to sanitise the world for us all. Users should be responsible for their own feeds, and have the ability to curate them for themselves.

What I've written above would also require that users actually be responsible, which would put age limitations on participating in this grand social media experiment.
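To make the idea concrete, here is a minimal sketch of what user-controlled curation could look like. Everything here is hypothetical (the `Post` shape, the rule names); the point is only that filtering a feed by the user's own lists is simple, with no engagement algorithm required.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    # Hypothetical post structure, for illustration only.
    author: str
    country: str
    text: str

@dataclass
class FeedFilter:
    """User-maintained moderation rules: the user, not the platform, decides."""
    blocked_users: set = field(default_factory=set)
    blocked_countries: set = field(default_factory=set)
    blocked_terms: set = field(default_factory=set)  # words, phrases, even emojis

    def allows(self, post: Post) -> bool:
        if post.author in self.blocked_users:
            return False
        if post.country in self.blocked_countries:
            return False
        text = post.text.lower()
        return not any(term.lower() in text for term in self.blocked_terms)

def curate(posts, rules: FeedFilter):
    """Chronological feed: no engagement ranking, only the user's own rules."""
    return [p for p in posts if rules.allows(p)]
```

A design like this keeps the curation logic entirely client-side and inspectable, which is exactly the transparency the regulation argument above is asking for.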
Social media needs to be left to consenting adults, who can actually be responsible for their own feeds. This doesn't even get into the problem of porn being so rampant on these platforms, but I'm including that in my reasoning that kids should not have access to these sites.

I don't pretend to know how to enforce age limits while still allowing anonymity. There have been attempts at this from legislators, and the result is usually a privacy nightmare that only affects those who abide by the law. Even the EFF, who are supposed to be experts in this area, [oppose these bills as they arise](https://www.eff.org/document/eff-opposition-letter-californias-ab-3080) but offer no solution to the problem.

There are solutions, though none of them are perfect. One could be a third-party age verification system, similar to how PayPal works as a third-party financial middleman. PayPal allows users to make purchases on many sites without having to give out a credit card number to all of them. Similarly, an age verification site could collect enough government ID to prove who you are, eliminating the need to give the same information to a bunch of sites on which you'd rather stay anonymous. There are still privacy implications to this (the third-party verifier would have records of every adult site you visit), but there might also be ways to mitigate some of that.

Regardless of whether this is the solution, or something else is, it's time we all recognise that *something* needs to be done about it. This, honestly, should have been dealt with two decades ago, and it's a failure of us all that we neglected to place the necessary pressure on government agencies to force regulation in this area, and to force the market to come up with a better solution than clicking a button that says "yes, I'm 18."

Keeping porn out of the hands of minors was once just common sense, and any adult showing kids that sort of thing would be very suspect.
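As a sketch of one possible mitigation of that tracking concern (entirely hypothetical, not a description of any real system): the verifier could issue short-lived, site-agnostic "over 18" tokens after a one-time ID check, so sites never learn your identity and the token itself names no site. A real design would use public-key signatures so sites could verify tokens without contacting the verifier; the shared HMAC key below merely stands in for that.

```python
import hashlib
import hmac
import secrets
import time

# Hypothetical sketch: a third-party verifier checks government ID once,
# then hands out short-lived bearer tokens asserting only "over 18".
VERIFIER_KEY = secrets.token_bytes(32)  # in reality, a signing key held by the verifier

def issue_token(ttl_seconds: int = 3600) -> str:
    """Run by the verifier after the one-time ID check. The token carries
    no identity and no site name, only the age claim and an expiry."""
    expiry = str(int(time.time()) + ttl_seconds)
    nonce = secrets.token_hex(8)  # makes each token unique
    payload = f"over18|{expiry}|{nonce}"
    sig = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def site_accepts(token: str) -> bool:
    """Run by any adult site. With public-key signatures this check would
    work offline, so the verifier never learns where the token was spent."""
    payload, _, sig = token.rpartition("|")
    claim, expiry, _nonce = payload.split("|")
    expected = hmac.new(VERIFIER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return (claim == "over18"
            and int(expiry) > time.time()
            and hmac.compare_digest(sig, expected))
```

This is only one shape the idea could take; the essential property is that the site sees proof of age without identity, and the verifier sees identity without browsing habits.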
People would want to ensure that such a person was not left alone with kids, for obvious reasons. Today, kids of any age can look at any sort of porn any time they like, and it's just accepted as an inevitable fact of life now.

If we can't even force sites to verify age in order to access porn, how can we possibly expect social media sites to do it? Or force them to adopt user-controlled algorithms in place of algorithms designed only to keep users looking at advertising?

Social winds are unpredictable, but small gusts whip up into forceful gales with a speed and ferocity that was unheard of before the social media plague was unleashed. They change direction faster than anyone can keep up with, too: protests about Islamophobia turn into women's marches, then BLM riots, trans rights, the Russia/Ukraine war and the Israeli/Palestinian conflict, the anti-immigration riots in the UK, and pro-Hamas/anti-Jew nonsense on US and Canadian university campuses (many of these having come 'round on several occasions). The outrage shifts from one random topic to another. These storms can no longer be prevented, but hopefully it's not too late to keep them contained.