Apple Re-Platforming the Parler App Could Set A Dangerous Precedent
I’ve spent the last two weeks using the Parler app, and it is a hate-filled, vitriolic platform that condones racism under the guise of free speech and operates with utter impunity.
Parler exists for users who believe that their free speech is being suppressed by other social media platforms. These users gravitated to Parler in droves after the 2020 election to “speak their minds openly” about pro-Trump ideology, conspiracy theories around COVID and the integrity of the election, and white supremacy.
Following the January 6th Capitol insurrection, Apple and Google banned Parler from their app stores on the grounds that it did not have a sufficient content moderation policy and did nothing to curb the encouragement and glorification of violence on the day of the riots.
This move was applauded by some and resented by others. There was fierce backlash against the Silicon Valley companies for doing so, but they remained firm in their stance.
So why am I talking about Parler today? Two days ago, GOP Congressman Ken Buck made a premature announcement stating that Apple was allowing Parler back into the app store.
This isn’t strictly correct, however. The document that Congressman Buck shared states that Apple would be amenable to reinstating Parler only after it meets the same moderation standards all other apps must meet.
Since its removal from the App Store, Parler has been on a media blitz, drumming up support within the GOP by claiming that conservatives are being censored and insisting that it has regularly moderated violent and hateful content.
I’m here to say that this is utter bullshit.
Not only was their website’s “Community Guidelines” link broken as of a week ago, but their guidelines and moderation policy also look virtually unchanged from before the riots. The guidelines revolve around two major principles:
Don’t use Parler as a tool for crime or unlawful acts
Spam bots are not allowed
Here are some things that SHOULD have been mentioned in writing:
Child pornography and the sexualization of minors are not allowed
Racism is against community guidelines
Harassment and/or Doxxing will result in being blocked
Encouraging violence against public or private figures is ill-advised
This is not a marketplace for drugs
Self-harm and suicide-related content should not be glorified
It’s also important to understand the very specific language being used here: “Parler will not knowingly allow itself to be used as a tool for crime, civil torts, or other unlawful acts.”
The word to focus on is “knowingly”. If someone reports a piece of content, it becomes eligible for some form of moderation; however, it is not Parler’s responsibility to seek out unlawful content and proactively moderate UNTIL something has come to its immediate attention.
Putting the onus completely on users, especially a user base that seems to enjoy and perpetuate unlawful content, is a futile and counterproductive strategy. It is morally dubious at the very least, and it would be deeply troubling if Apple did not consider this lackadaisical policy a violation of its App Store guidelines.
But hey, maybe I’m wrong. Maybe the community has done a brilliant job moderating itself and users are actively reporting content. Every community has a mix of good and terrible individuals, and perhaps there is good within Parler to help balance or even mitigate the hate.
Don’t get your hopes up: my time on Parler surfaced some horrific trends in content, which I discovered within five minutes of digging around.
Huge trigger warning: there will be pictures of deeply hurtful, racist, antisemitic imagery. It is not my intention to further propagate these terrible posts, but rather to shed light on a platform that seems to value one life over another.
Parler is somewhat different from other social media platforms in that you can only search for users or hashtags. Even so, it took me less than five minutes to find Nazi ideology on the platform.
It’s important to remember that it’s not just the unmoderated racism that makes this app problematic; it’s the fact that Parler has NO software or human reviewers in place to detect or investigate child pornography hosted on the platform.
To be absolutely clear, automated systems for detecting child pornography are something that every mainstream social media platform has implemented end to end over the last ten years.
Weirdly enough, no company is legally obligated to scan its site for CSAM (child sexual abuse material), but one would think they have a moral and ethical imperative to stop the distribution of this heinous content.
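For readers unfamiliar with how this scanning works, the core idea is simple: hash every upload and compare it against a curated database of known material, such as the hash lists maintained by NCMEC. Real deployments use perceptual hashing (e.g., Microsoft’s PhotoDNA) so that resized or re-encoded copies still match; the sketch below is a deliberate simplification that substitutes an exact SHA-256 check and a placeholder hash list purely for illustration.

```python
import hashlib

# Placeholder hash list for illustration only. Real platforms match against
# curated databases of known-material hashes, and use perceptual hashes
# (e.g., PhotoDNA) rather than exact cryptographic digests.
KNOWN_BAD_HASHES = {
    # SHA-256 of the bytes b"test", standing in for a flagged file's hash
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_flagged(upload_bytes: bytes) -> bool:
    """Return True if the upload's SHA-256 digest matches a known hash."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# Every upload would be checked at ingest, before publication.
print(is_flagged(b"test"))   # True  (matches the sample hash above)
print(is_flagged(b"other"))  # False (no match)
```

The point is that even this naive version runs automatically on every upload, with no human needing to “look for” anything, which is exactly what makes the executive response quoted below so hollow.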
When Parler COO Jeffrey Wernick was asked about this, he responded, “I don’t look for that content, so why should I know it exists?”, which is just about the worst response an executive could give with regard to clamping down on child pornography on their platform.
I am a big believer in the First Amendment, and an app like Parler, as reprehensible as it is, should perhaps exist. But Apple, as a conduit through which people view content, has a responsibility to make sure that Parler’s basic moderation is on par with other social media platforms before re-platforming it. Not doing so could be disastrous, as more disinformation and conspiracy theories would be allowed to spread.
On a side note, it’s disheartening to see Republican leaders and business people claim censorship and lack of free speech when the speech that they don’t want censored is that of horrible racism, harassment, hate speech, bullying and doxxing.
Now, I’m not saying this sort of content doesn’t exist on Facebook, YouTube, Twitter, etc. The difference, however, is that these companies have teams of thousands of people tasked with proactive content moderation on human, algorithmic, and community fronts. No social platform is going to be perfect, but at least they’re working to combat the scourge rather than hiding behind their perception of “free speech”.
As of today, Apple hasn’t yet approved the Parler app. But I hope their App Store team does a much deeper dive than mine to understand the truly lax and lazy moderation system that’s been set up, and the impact that their approval could have on the App Store ecosystem and its users.
Till next time, thanks for reading!