A number of scholars have argued that the internet can be a very effective medium for disseminating educational information about sexual health and sexuality and for fostering sex-positive attitudes in children and adolescents.37 While people of any age can reap these benefits, younger people are more likely to use the internet for information about sex and sexuality, and LGBTQIA+ youths are more likely still to do so.38 Keep in mind that this is precisely the age group meant to have its internet traffic censored by filters like SafeSearch. Overblocking frequently leads to the censorship of sex education materials. Take, for example, the case of “Bedsider,” an online campaign launched in 2012 by the National Campaign to Prevent Teen and Unplanned Pregnancy. The campaign adopted a hipper tone to appeal to young people, a common strategy in newer sex education social media campaigns. As Susan Gilbert, codirector of the National Coalition for Sexual Health, explains, “We have to make healthy behaviors desirable by using creative, humorous, and positive appeals.”39
Bedsider made use of these standard social media strategies to try to entice teens to engage in safer sex practices. For instance, Bedsider tweeted, “98 percent of women have used birth control. Not one of them? Maybe it’s time to upgrade your sex life.”40 In response, Twitter banned Bedsider from promoting its tweets for violating Twitter’s ad policy, which prohibits the promotion of or linking to adult content and sexual products or services. A Twitter account strategist noted that the problem would persist as long as Bedsider’s website continued to host the article “Condom Love: Find Out How Amazing Safer Sex Can Be.”41 Even though the article was focused on encouraging young people to engage in safe sex, the Twitter account strategist told Bedsider, “It still paints sex in a recreational/positive light versus being neutral and dry.”42 In 2017, Facebook banned advertisements from the National Campaign to Prevent Teen and Unplanned Pregnancy that promoted regular health checkups. The campaign’s modern, catchy slogan, “You’re so sexy when you’re well,” was deemed profane or vulgar language. Similarly, journalist Sarah Lacy’s advertisements for her book The Uterus Is a Feature, Not a Bug were rejected for containing the word “uterus.”43
In response, Lawrence Swiader, Bedsider’s director, told the Atlantic, “We need to be able to talk about sex in a real way: that it’s fun, funny, sexy, awkward . . . all the things that the entertainment industry gets so well. How can we possibly compete with all of the not-so-healthy messages about sex if we have to speak like doctors and show stale pictures of people who look like they’re shopping for insurance?”44 This is not an isolated incident. The Keep A Breast Foundation, a youth-based organization that promotes breast cancer awareness and educates young people about their health, was banned from using Google AdWords because of its slogan, “I Love Boobies.”45 Both of these instances constitute a staggering reiteration of the Supreme Court’s early bias in enforcing obscenity doctrine against LGBTQIA+ and sex education materials but not against Playboy magazine; Playboy has been allowed to advertise its content through its Twitter account and has even posted photos of bare breasts. These bans have very real material consequences. In the most recent systematic study I could locate, from 2013, the Pew Research Center found that 59 percent of Americans had turned to the internet for health information in the past year, with 77 percent of them starting at a search engine like Google, Bing, or Yahoo.46
The appeals process for banned content is too complex, time-consuming, and expensive for nonprofit organizations to engage in successfully. For instance, in 2014, the sex education organizations Spark and YTH (Youth+Tech+Health) had four of their sex education videos removed from YouTube. The organizations repeatedly contacted YouTube and filed two official appeals through the online process, all to no avail. They were only able to get their videos reinstated after hiring a lawyer who happened to have gone to law school with a senior lawyer in YouTube’s policy department. As Swiader noted, “While some organizations have had success getting content through after initial rejection, the process of winning that minor victory is tireless. Many smaller organizations just don’t have the bandwidth to fight for each individual piece of content.”47 This has been a huge hindrance to online sex education campaigns: at least forty sex education content creators have had their YouTube videos demonetized, their channels deprioritized in search results, and their accounts shadow banned, including channels like Watts the Safe Word, Come Curious, Bria and Chrissy, and Evie Lupine.48
This overblocking is not confined to sex education, though. Entire identity categories have been subject to overblocking, even when their online content is not sexually explicit. Take, for example, Google’s understanding of the term bisexual. From 2009 to 2012, Google only understood “bisexual” as a query for mainstream heteroporn. While the effects of this oversight and Google’s slowness to address it were quite harmful, it is easy to understand how its algorithms would have come to such a conclusion. In mainstream porn, the term “bisexual” is popularly appropriated heteronormatively to signal scenes with women willing to engage in group sex with other women—for example, male-female-female (MFF) threesomes.49 The term “bisexual” is thus hugely popular in mainstream heteroporn, and mainstream heteroporn comprises a large percentage of internet pornography (if not of the web in its entirety). As such, the term “bisexual” actually is more likely to indicate pornography than not. And while it is a flagship term in the LGBTQIA+ marquee, bisexuals often speak of feeling underrepresented or even marginalized in LGBTQIA+ discourse. With the term often collapsed into its container initialism, one can see how this usage would have been less salient to the content filter’s machine learning protocols. The result was that Google added “bisexual” to a list of banned search terms, any of which could cause a website to be deprioritized in search rankings if they appeared on the site. Because of this, for three years, all bisexual organizations and community resources were either deprioritized in Google Search results or completely censored.50
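To make the mechanism concrete, consider a minimal sketch, in Python, of how a blocklist-based deprioritization filter of the kind described above might work. The term list, function name, and scoring here are entirely hypothetical and do not reflect Google’s actual implementation; the point is simply that a single ambiguous term on a banned list suppresses an entire site regardless of context.

```python
# Hypothetical sketch of blocklist-based deprioritization.
# Neither the terms nor the scoring reflect Google's actual system.

BANNED_TERMS = {"bisexual"}  # an ambiguous identity term treated as a porn signal


def safesearch_score(page_text: str, base_rank: float) -> float:
    """Return an adjusted ranking score for a page under a naive filter.

    If any banned term appears anywhere on the page, the page is heavily
    deprioritized, regardless of whether it is pornographic or a
    community health resource.
    """
    words = {w.strip(".,!?\"'").lower() for w in page_text.split()}
    if words & BANNED_TERMS:
        return base_rank * 0.01  # effectively buried in search results
    return base_rank


# A bisexual community resource page is penalized exactly like porn:
resource = "Support and resources for the bisexual community."
print(safesearch_score(resource, base_rank=100.0))  # -> 1.0
```

The sketch illustrates the core problem: the filter has no notion of context, so an identity term that is statistically common in pornography condemns every page that uses it, including the community resources described above.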
I can find no comprehensive studies of the effects of Google’s post-2012 changes to its algorithms to disallow the censorship of bisexual organizations and community resources. There are also no comprehensive studies of other such terms that have been designated as exclusively pornographic, though “gigolo” and “swinger” went through similar classifications between 2007 and 2015.51 Without such studies, it is hard to determine how many online LGBTQIA+ resources are still being prevented from reaching their intended audiences by Google’s SafeSearch features. These sorts of resources are particularly difficult to deal with within Google’s regulatory framework, as the line between explicit content and educational or identity-forming content is hazy. In communities that look to the performativity of sex, sexuality, and/or sex acts for their communal identity formation, visuals and discourse that might be considered explicit in other contexts take on a new valence. Here “prurient” interest can be tethered to sexual education and individuation. “Hard-core” pornography is used—in particular by adolescents—for educational purposes.52 There is some evidence of a correlation between prurience—in this case, masturbation to online materials—and seeking information about sex and sexuality online. While, not surprisingly, masturbation also correlates with viewing these materials more favorably, more interestingly it also correlates with people reporting being less disturbed by sexual material.53
The internet is well suited to offering a safe space to experiment with one’s sexuality with few negative repercussions—people can “try on” and “test out” sexualities and practice coming out—and to building communities for people with marginalized sexual identities.54 As Nicola Döring notes, “The Internet can ameliorate social isolation, facilitate social networking, strengthen self-acceptance and self-identity, help to communicate practical information, and encourage political activism.”55 While the internet offers very promising opportunities for LGBTQIA+ individuation and community building, its heteronormative content moderation practices work to circumvent those opportunities. As Attwood, Smith, and Barker note, “Young people appear to be using their encounters with pornography as part of their reflections upon their readiness for sex, what they might like to engage in, with whom, how and what might be ethical considerations for themselves and prospective partners.”56 As such, we need a much more robust conversation about what constitutes pornography, in which contexts, when it is actually in the best interests of children and adolescents to censor it, and how. This conversation needs to better reflect LGBTQIA+ and sex-positive voices. To facilitate it, we need a more robust, longer-duration dataset that tracks online censorship, particularly of LGBTQIA+ resources, so that we can better understand just what content is being considered “explicit.” Additionally, we need to collect more information on how people (adolescents in particular) use the web for sexual education, experimentation, individuation, and community building.57 Without this basic information, it is very difficult to provide a well-founded critique of content moderation or to advocate for precise interventions into how content moderation algorithms ought to be altered to better suit LGBTQIA+ communities online.
FOSTA-SESTA and the Financial Incentive to Overblock
As I noted briefly above, in March of 2018, the US Senate passed the Stop Enabling Sex Traffickers Act (SESTA) and the tacked-on Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) by a vote of ninety-seven to two. Collectively known as FOSTA-SESTA, these acts work to close off Section 230 of the CDA of 1996, which for two decades had shielded internet providers and content hosts from legal culpability for obscenity and for prostitution that may at times have been facilitated by their services. Under the pretense of protecting women from sex trafficking and cracking down on child sexual exploitation, FOSTA rolled back the protections of Section 230 and instituted a new, very ambiguous definition of the content and services that can be considered to facilitate sex trafficking and prostitution. For instance, under FOSTA, sex work and sex trafficking are the same thing, and content hosts and service providers can be held liable for “knowingly assisting, supporting, or facilitating” sex work in any way.58 Congresswoman Ann Wagner, a key sponsor of the bill, has also explicitly conflated consensual sex work and sex trafficking in a speech on the House floor.59
FOSTA is a clear mark of heteronormative bias in the congressional agenda, or at least a pandering to it. As Violet Blue notes, “Lawmakers did not fact-check the bill’s claims, research the religious neocons behind it, nor did they listen to constituents.”60 FOSTA was opposed by everyone from the ACLU to the Department of Justice.61 The Electronic Frontier Foundation (EFF) published dozens of articles condemning the act, as did law professors, anti-trafficking groups, and sex worker organizations.62 Large internet companies like Amazon, Google, Facebook, Microsoft, and Twitter also collectively opposed the act through the Internet Association in August of 2017.63 By November, however, these companies had changed their minds—ostensibly after unspecified revisions to the legislation—and were thanking the very senators before whom they were testifying in Congress about having facilitated Russian interference in the 2016 US presidential election.64 Tech journalists writing in both conservative and liberal news forums attributed this shift to Facebook’s breaking ranks and championing the legislation in the wake of its series of scandals, ranging from Cambridge Analytica to Russian bots spreading pro-Trump propaganda and hate speech.65
FOSTA was immediately claimed as a significant victory by NCOSE, which wrote shortly after its passage, “This is as great a moment in the fight to free our country from sexual exploitation, as the Emancipation Proclamation was in ending the scourge of slavery.”66 It is unclear exactly how much credit can reasonably be attributed to NCOSE for FOSTA. As has been noted, between 2016 and 2018, NCOSE was able to get its language into the official Republican Party platform and saw a number of states pass resolutions declaring pornography a public health crisis. How much of this grassroots organizing made its way into the US House of Representatives and Senate is also unclear, but the timeline of FOSTA overlaps significantly with this movement, and it is safe to assume that it is part of a growing consensus among legislators that pornography needs stronger regulation, though under the familiar guise of protecting children. What is clear is that the discursive conventions of NCOSE have become mainstream. Their so-called intersectional approach to sexual exploitation, which considers pornography and human sex slavery to be different only in degree rather than in kind, has been taken up on both sides of the aisle and was echoed by the Internet Association and by Facebook executives, Sheryl Sandberg in particular. This is a particularly dangerous conservative anti-sex apparatus. It can mobilize the ambiguity of its blurred definition of sexual exploitation—encompassing everything from soft-core pornography to sex slavery—to attack any and all forms of sexual expression from the unassailable rhetorical ground of protecting children from being sexually abused and exploited on the dark web. And further, as has repeatedly been the case in the past, such apparatuses often reach a détente with the untamable flow of erotic expression in which only the most industrialized, corporate, and heteronormative versions of pornography are able to persist.
Ron Wyden, the only senator besides Rand Paul to vote against FOSTA, noted that rather than preventing sex trafficking or helping child victims of abuse, the law would primarily create “an enormous chilling effect on speech in America.”67 We can already see that this is precisely the case. The new law incentivizes law enforcement to focus on intermediaries that facilitate prostitution rather than on sex traffickers themselves. It thus shifts focus away from real criminals, and in shuttering these intermediaries, it cuts law enforcement off from essential tools previously used to locate and rescue victims. It similarly cuts law enforcement off from easily tracked evidence that could be used in criminal cases against sex traffickers. This is why the bill was opposed almost universally by anti-trafficking groups and sex work organizations.68 Chapter 4 will dig deeper into the impact that FOSTA has had on the finances and everyday lives of sex workers and adult entertainers, with a particular focus on those offering LGBTQIA+ services and content. Here it is worth exploring how a number of platforms responded to the shift in regulatory policy. Most major ISPs and internet platforms clamped down on sexual expression, ramped up their content moderation practices, and ended up overblocking more content than ever. In particular, we will look at the overblocking imposed by Apple through its App Store, which serves as a gatekeeper for all iPhone users globally, and by Google across its platforms, both of which have engaged in heteronormative overblocking in the wake of FOSTA-SESTA.
Overblocking in Apple’s App Store
The Apple App Store was set up to function as a sort of moral policing mechanism for mobile content. Steve Jobs famously said, “We do believe we have a moral responsibility to keep porn off the iPhone,” and he further noted that “folks who want porn can buy an Android phone.”69 This anti-sex sentiment has been literally codified in both the community standards for the App Store and the algorithmic procedures for policing iOS content. These policies have been claimed as a victory by NCOSE, which had been putting pressure on Apple for years.70 This sentiment also permeated the iPhone’s firmware at one point: researchers discovered in 2013 that the following words had been intentionally excluded from the iPhone’s dictionary, and thus also from autocorrect and autocomplete: abortion, abort, rape, arouse, virginity, cuckold, deflower, homoerotic, pornography, and prostitute.71 In essence, the system was hardwired to be blind to these terms and thus to inhibit conversations mediated by iOS about abortion, rape, virginity, sex, homosexuality, pornography, and prostitution. The industry describes lists like these as kill lists, and many text input technologies, including Android and the Swype keyboard, contain them as well.72
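A kill list of this kind can be implemented as little more than a set-membership check in the suggestion pipeline. The following is a minimal sketch in Python with hypothetical function and variable names, not Apple’s actual firmware code; the word list echoes the terms researchers reported missing from the iPhone’s dictionary.

```python
# Hypothetical sketch of a "kill list" in a text-suggestion pipeline.
# The list echoes terms researchers found missing from the iPhone
# dictionary; the code itself is illustrative, not Apple's implementation.

KILL_LIST = {
    "abortion", "abort", "rape", "arouse", "virginity",
    "cuckold", "deflower", "homoerotic", "pornography", "prostitute",
}


def filter_suggestions(candidates: list[str]) -> list[str]:
    """Drop any autocorrect/autocomplete candidate on the kill list.

    The effect: the keyboard simply never completes or corrects toward
    these words, making them invisible to the suggestion system.
    """
    return [word for word in candidates if word.lower() not in KILL_LIST]


print(filter_suggestions(["aborting", "abortion", "absorbing"]))
# -> ['aborting', 'absorbing']  ("abortion" is silently dropped)
```

The banned words are never surfaced and never corrected toward, which is precisely what makes the censorship so difficult for users to notice: the system does not refuse the word, it simply behaves as if the word does not exist.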
This was not new for Apple: in 2011, it was discovered that Siri could not answer simple questions about where people might go to get birth control or to receive an abortion. In the latter instance, Siri would instead direct iPhone users to antiabortion clinics.73 In 2016, researchers in New Zealand found that Siri produced either no answer or answers from disreputable sources for 36 percent of the fifty sexual health–related questions they asked. In particular, Siri failed to produce visual illustrations, misinterpreted “STI” as a stock market quote, and, when asked about menopause, pulled up the Wikipedia page for Menopause the Musical, which was then running in Las Vegas.74 These findings were in line with previous research demonstrating that Siri trivialized many important inquiries about mental health, interpersonal violence, and physical health.75 These discrepancies between voice search and desktop search have a disproportionate impact on communities that more frequently access the internet through these or similar vocal interfaces, including people with visual impairments, people with lower literacy rates, and people whose only internet-enabled device is their phone. Nor are these problems exclusive to Apple. In 2016, none of the top virtual assistants—Siri, Google Now, and S Voice—could understand questions about what to do if you are raped or being abused in a relationship.76 As Jillian York, then director of international freedom of expression at the EFF, told The Daily Beast, “I hate to say it, but I don’t think this should surprise anyone. Apple is one of the most censorious companies out there.”77 Apple’s commitment to an anti-sex and anti-pornography regime of censorship is particularly important because so many technology companies, and platforms in particular, require the intermediation of the App Store to interact with iPhone users.
