Seduction via aubergine emoji arouses Facebook’s suspicion
According to reports, the social media giant has updated its community standards to forbid use of certain emojis for the purposes of sexual solicitation.
While emojis have been used in Japan since the mid-1990s, their incorporation into common mobile operating systems in the 2010s has made them an important part of digital culture around the world. Innocuous-looking emojis – which linguist Gretchen McCulloch has characterised as digital ‘emblem’ gestures rather than word substitutes – often take on second meanings with cultural, political or sexual significance.
The aubergine emoji is frequently used to indicate a penis, while the peach, licking and splash emojis are among those most commonly used as sexual euphemisms.
According to a report from the New York Post, Facebook acknowledged these double meanings in its July Community Standards update, stating that “contextually specific and commonly sexual emojis or emoji strings” can be considered a “suggestive element” in sexual solicitation. According to the guidelines, a user can be considered to have committed sexual solicitation if their post contains an implicit or indirect “offer or ask” for sex or sexual content (e.g. nude photos) alongside suggestive elements. This offence can result in the user’s post being removed and their account being flagged or terminated.
The updated Community Standards came into effect in September.
Facebook told the Post: “[A post] will only be removed from Facebook and Instagram if it contains a sexual emoji alongside an implicit or indirect ask for nude imagery, sex, or sexual partners, or sex chat conversations. We aren’t taking action on simply the emojis.”
Facebook and Instagram will also no longer allow users to post links to pornography, or to post nude photos which use emojis to cover genitals, bottoms and nipples.
Facebook subsidiary WhatsApp is likely to remain a haven for emoji-facilitated smut, however, as its end-to-end encryption prevents messages from being read and filtered.
As is the case with much ‘borderline’ and euphemistic content, Facebook may face an uphill battle to enforce this new community standard. The company uses a combination of machine-learning algorithms and human moderators to keep inappropriate content off Facebook’s platforms.
Content which is not explicitly violent, racist, sexual or dangerous, but which most internet-savvy people can recognise as inappropriate (e.g. using euphemisms to joke about the Holocaust, or using emojis to indicate sexual acts), is difficult to detect and remove algorithmically, particularly given the rapid and decentralised way in which euphemisms fall in and out of use or change meaning.