Grindr enjoys total immunity from liability for the harm suffered by a gay Manhattanite whose ex-boyfriend created fake Grindr profiles in his name that led more than a thousand people to contact him at home and at work for “fetishistic sex, bondage, role playing, and rape fantasies,” a unanimous federal appeals panel has ruled.
The March 27 ruling by a three-judge panel of the New York-based Second Circuit Court of Appeals, in a case brought by Matthew Herrick of Manhattan, expressed full agreement with District Judge Valerie Caproni’s January 2018 ruling against him.
Unlike Caproni, the appellate panel, consisting of Circuit Judges Dennis Jacobs, Reena Raggi, and Raymond J. Lohier, Jr., omitted from its brief “summary order” the details of some of the dire consequences Herrick suffered. The panel’s ruling does not have “precedential effect” but is consistent with other court decisions that have noted that the Communications Decency Act’s Section 230 states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
The Communications Decency Act was intended to crack down on Internet pornography by requiring service providers, among other things, to enable parental controls over what minors can access online. Freeing interactive online sites from liability for what users post, however, relieved Grindr of liability for the fake profiles Herrick’s ex created.
According to Caproni’s ruling, Herrick alleges that among the many men who responded to the fake profiles, some showed up at his home or workplace and “physically assaulted or threatened Plaintiff and his friends and co-workers.”
Herrick had achieved some success when he first filed suit in a New York State court, getting a judge to grant a temporary restraining order requiring Grindr to disable the fake profiles.
Grindr, however, immediately removed the litigation to federal court and moved to dismiss it, citing Section 230.
There is nothing in the Communications Decency Act that would prevent Herrick from suing his ex-boyfriend, but Grindr is essentially immune from liability for the harm caused by the fake profiles.
When the case was removed to federal court, Herrick’s attorneys amended the original complaint, which he had filed on his own in state court, in order to allege a variety of legal theories seeking to get around the Section 230 immunity issue, but to no avail. The court found that all of Herrick’s claims arose out of “information provided by another information content provider” — that is, his ex-boyfriend — so all of them fell within the broad sphere of Section 230. The provision has been liberally interpreted by federal courts to avoid imposing extremely burdensome censorship obligations on operators of “interactive computer services,” which include “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server.”
As Caproni found in her earlier decision, courts have ruled that social networking sites like Facebook fall within the statute’s definition of an “interactive computer service” and are therefore shielded by Section 230.
Trying to get around this, Herrick’s lawyers argued that Grindr provides a defective product and misrepresents its site’s safety for users, but the court concluded that Grindr’s Terms of Service published on its site provide adequate warnings. The panel noted Caproni’s finding that those Terms of Service “do not represent that Grindr will remove illicit content or take action against users who provide such content, and the Terms of Service specifically disclaim any obligation or responsibility to monitor user content.”
In any event, the court found, Herrick deactivated his Grindr account when he met his ex-boyfriend in 2015, so he “could have suffered the exact same harassment if he had never seen the Terms of Service or created a Grindr account.”
Quoting a decision by the San Francisco-based Ninth Circuit Court of Appeals, the appeals panel wrote that under Section 230 an interactive computer service “will not be held responsible unless it assisted in the development of what made the content unlawful” and cannot be held liable for providing “neutral assistance” in the form of tools and functionality available equally to bad actors and the app’s intended users.