You may never have heard of Omegle, or perhaps you have. It is an online platform that allows people to meet and talk to total strangers through video links. Participants are randomly paired, but there is also a form of self-selection: users can indicate shared interests. Once people are paired, they can stay and chat or move on to another partner.

The site has a reputation for being raunchy, for attracting those with a penchant for kinky behaviour and, unfortunately, for serving as a place where sex predators meet children online for “grooming”. The site has the standard disclaimers on its home page, but there is no age verification process and apparently very little content moderation.

The BBC did an investigative report on Omegle and came up with some disturbing findings. When its reporter entered a certain keyword reflecting supposed interests, the reporter was more frequently paired with people engaging in explicit activity. The BBC didn’t reveal the keyword, which apparently has since been dropped from the service, but judging by the type of material the reporter was exposed to, my guess is that it may have started with a w and rhymed with banker. User beware.

Omegle has been in the news recently because one of its users, at the time (2014) an 11-year-old girl, is suing the platform in a product liability case. She was randomly paired by Omegle with a man in his late thirties who went on to sexually abuse her online for several years. The perpetrator, a Canadian resident, has been convicted by a Canadian court in a criminal prosecution. However, the plaintiff alleges that Omegle is also responsible for the outcome, based on product liability arising from defects in design, defects in warning, negligence in design, negligence in warning and instruction, facilitation of sex trafficking, sex trafficking of children, human trafficking and negligent misrepresentation. That is quite a list.

Not surprisingly, Omegle trotted out (among other defences) the infamous Section 230 defence, arguing that it was not liable for user-generated content on its site. That defence was recently dismissed by a district court in Portland, OR. The ruling was based on the platform’s design rather than its content, with the presiding judge writing that “Omegle could have satisfied its alleged obligation by designing its product differently—for example, by designing a product so that it did not match minors and adults.” Section 230 immunity did not protect Omegle. That is good news for those concerned about Section 230 over-reach and the abuse of this provision by internet intermediaries.

Many readers will be familiar with the origins of Section 230 and its use over the years by internet platforms to shield themselves from any civil liability arising from content on their services. That shield has held notwithstanding obvious abuses and the fact that many platforms encourage controversial and marginally legal (or in some cases blatantly illegal) content in order to attract viewers, and thus ad revenues. Tech companies love it and have defended it tooth and nail. It has been under close scrutiny in the US Congress from both Republicans and Democrats.

Republicans got engaged when Donald Trump threatened to revoke the legislation because Twitter had the temerity to fact-check some of his more outrageous tweets. There are many good reasons for revising Section 230, but this was not one of them. In 2018, sex trafficking was carved out of the legislation, removing the liability immunity for platforms that promote sex trafficking and prostitution, largely because of the role that platforms like Backpage played in the sexual exploitation of minors. That legislation, known as FOSTA/SESTA (the Fight Online Sex Trafficking Act, the House version, and the Stop Enabling Sex Traffickers Act, the Senate version), has been controversial because it has seldom been used to prosecute offenders and has been criticized by sex workers as having removed a legitimate forum for communication. Nonetheless, it has helped to stop online advertising related to child sex trafficking.

The Democrats, particularly in the person of Senator Ron Wyden (D-OR), one of the original authors of Section 230, are concerned that the balance in the legislation, which was supposed to be (as described by Wyden) both a sword and a shield, has tilted to the point that it is now nothing more than a shield. The sword was supposed to allow the platforms to undertake necessary content moderation without being sued. Instead, it is the shield, the immunity that holds even when platforms avoid any content moderation, that has come to prevail. In an interview back in 2018 with The Verge, Wyden said:

“…what was clear during the 2016 election and succeeding events surrounding Facebook, is that technology companies used one part of what we envisioned, the shield, but really sat on their hands with respect to the sword, and wouldn’t police their platforms…. The industry better start using the sword, part of the two-part package, or else it isn’t going to be in their hands.”

So, in short, Section 230 is under attack from both sides, although neither can decide exactly what action to take. In the current Congress, about 20 bills have been introduced that would amend Section 230 in one way or another, although none seems to have enough impetus to pass. But the courts may be making more progress, at least if the Omegle decision stands. Denying platforms immunity when they ignore obvious abuses or, as in this case, design their platforms to facilitate abuses is hopefully one more nail in the coffin of Section 230 as it exists today.

© Hugh Stephens, 2022. All Rights Reserved.

This post originally appeared on the Hugh Stephens Blog and is reproduced with permission and thanks.