Everything in moderation: How students shape the College’s online communities

(Devika Goel/The Williams Record)

Editor’s note: This article contains mentions of suicide and self-harm in the section titled, ‘This could be my friend.’

The internet of the early 2000s was worlds apart from the internet of today. To get online, most Americans had to sit through the burbling and screeching of a dial-up modem. The phenomenon of “internet culture” could still be disentangled from plain old culture. And as the world was trying to catch up with the explosive growth of this technology, so too were students at the College.

Filling a niche currently occupied by Facebook and GroupMe, listservs maintained by Williams Students Online (WSO) were the main medium for digital discourse on campus. But for students who preferred anonymity and informality, WSO also ran online forums — which, in the fall of 2002, became a center of controversy.

“A thread entitled ‘Gays Suck’ prompted the Queer Student Union to print out the thread and post it in Baxter Hall, inviting responses with paper and pen,” reads the Willipedia page for WSO. “Abuse escalated at the end of October, at which time there were several pornographic images, violent threats, and racist, sexist, and homophobic posts to be found in the forums. On October 30, the forums were removed.”

“At that time, much of the internet was still learning to cope with trolls, spammers, and other malicious people, and we were as naive as other developers at the time,” Shimon Rura ’03, who was a member of WSO, recounted in an email to the Record. The following spring and summer, Rura would play a leading role in revamping the forums. In addition to quality-of-life additions like a message board for students seeking rides, a key feature of the updated site was the institution of an authentication process for users. Posters on the forums could no longer produce hateful content from behind a veil of anonymity.

Still, the questions raised by this outpouring of hateful speech had not become any less significant — and nearly two decades later, despite all of the aesthetic and cultural changes that have since occurred online, students at the College continue to explore them. What rules, regulations, and measures can best strike a balance between the safety of users and the robustness of discourse? Who should be in charge? And how far should they go?

 

‘I’m just here to make sure that no one’s putting racist stuff on the page’

For most of her time at the College, Onyeka Obi ’21 has been a moderator of the pre-eminent meme community on campus: the “Williams Memes for Sun-Dappled Tweens” Facebook group. Working alongside six other administrators and moderators, she helps manage a community of over 3,000 students and alums. Her philosophy as an administrator is relatively straightforward: Other than hateful or otherwise harmful content, pretty much anything goes.

“I’m just here to make sure that no one’s putting racist stuff on the page,” Obi said. “Bad posts are allowed because I don’t want to yuck anyone’s yum.”

This has not always been the case across the group’s nearly four-year lifespan. Record coverage from November 2018 quoted one of the group’s original administrators as saying, “We admins liked to exercise a sort of ironic dictatorship-like approach to content moderation; we’d delete things if we didn’t like them and had all sorts of arbitrary rules.”

“It was very much like a gaggle of white men and their friends running this page and being the arbiters of comedy on campus,” Obi said of the meme group’s original moderation team. “And I think that definitely rubbed people the wrong way. Because it’s like, who are you to determine what’s funny and what’s not?”

Obi emphasized that discourse within the group, while varied, still has its boundaries. “As I’ve existed on this campus — as a Black woman and a low-income woman, as multiple marginalized identities — I know that there is just literally no way in which you can operate without rules of some sort,” she said.

One such norm is that the group is meant to be a safe space for College students. Obi recalled that at one point in 2020, a student informed her that an alum had reported one meme to the College for speaking harshly of an administrator, an action she found frustrating. “I’m not interested in punitive consequences for jokes that are obviously not intended to harm anyone,” she said.

The meme group certainly has its serious side. Students have used memes to discuss changes in College policy — most recently its new visual identity — and posts from the group have been archived by Special Collections. Obi said that she has been pleased to see more posts criticizing the College’s administration and making fun of the absurdity of college life in a pandemic. But these developments have occurred organically: There was no overt effort by moderators to incentivize the creation of such content.

“I just want it to be a place where people can post dumb things, get dumb responses, and feel really good about something that they made and are sharing with the world,” Obi said.

 

‘This could be my friend’

Moderators on the Unmasked app, an anonymous forum for College students oriented specifically toward mental well-being, take a different approach. From the prompt that greets users at the top of their feeds — “How are you really feeling today?” — to the deliberate choice to hide the number of likes that posts and comments receive, the app is designed to foster healthy and productive conversations about mental health.

An essential component of this is proactive moderation. Moderators reserve the right to delete comments that are unhelpful — “Somebody’s asking for advice, and somebody comments ‘lol.’ And then we just delete that,” Unmasked co-president Claudia Iannelli ’23 said. They even respond to posts if no one else has commented in order to promote engagement and make the original poster feel heard.

Williams Unmasked co-presidents Iannelli, Cailin Stollar ’21.5, and Katie Nath ’23, who together lead a team of ten student moderators, emphasized the importance of “productivity.”  

“Our general goal is having productive conversations,” Stollar said. “And I think with everything being anonymous, you can have a lot more productive conversations [on Unmasked] than on Facebook or something.”

“We’ve had to jump in a couple of times, like with back-and-forth arguments — people being like, ‘I’m right, you’re wrong,’” Iannelli added, referencing a recent argument in which students debated the degree to which it was acceptable to bend the College’s COVID-19 rules. “We literally commented one time, ‘Please stop this back-and-forth conversation, it’s not productive.’”

The app’s inherent anonymity allows users to confess or share things that they might be reluctant to attach their names to. Though heavier or potentially triggering subject matter comes with content warnings, student moderators of the app have a responsibility to keep tabs on such posts. They have not always found it easy.

“Sometimes when somebody talks about self harm or possibly suicide, and the post gets flagged, so we see it immediately, we reach out to them to make sure they’re okay — and we’re lucky in that every time we’ve reached out to somebody, they’ve said that they’re okay,” Stollar said. “And they’re not in immediate danger of hurting themselves. But there’s a little voice in the back of my head that’s like, ‘What if they’re not telling the truth? Like, what if they really might hurt themselves?’”

“Especially since you know it’s a Williams student, and you’re like, ‘This could be my friend,’” Iannelli added.

The difficulty and intensity of moderation vary over time, too. All three co-presidents identified the November 2020 election as a particularly stressful period on the app, due both to students’ anxiety about the result and to the political arguments that cropped up in the forum.

“What we try to do is make sure that no matter what your opinion is or how you think about something, you can share it on this app, and as long as you’re not directly antagonizing another person, you can share how you feel,” Nath said.

 

‘I felt like I had a responsibility to be there for people’

The experiences of student moderators at the College are bound to differ based on the size, composition, medium, and purpose of their respective communities. But in being tasked with maintaining a public forum for their peers, they share common ground.

Samir Ahmed ’24 is a moderator of the “Ephs Gang ’24” Discord, which boasts approximately 170 members. The Discord platform, which combines messaging, video, and voice chat in one space, affords moderators a great deal of flexibility in configuring the user experience, and the first-year students in “Ephs Gang” have set up numerous channels, chat bots, and quality-of-life improvements. The long list of channels includes names such as “campus-pictures,” “course-schedule,” and “vent.”

Ahmed expressed happiness at how the Discord has developed. Not only does it provide remote students with a way to connect with others in their class, he said, but the group has also avoided any especially bad situations or incidents. Still, Ahmed told the Record, some moderation-related concerns are evergreen.

“It’s still in the back of your mind all the time that you want to never slip up, never want to end up hurting someone or making the atmosphere worse than it should be,” he said. “So I guess that added sense of responsibility, or that added sense that I really want to make this successful, was kind of draining at times. But it was never overbearing.”

Ahmed added that his position as a moderator motivated him to stay active in the chat as a peer. “I felt like I had a responsibility to be there for people, just be positive and be welcoming,” he said.

 

‘There’s always the potential for destructive attention-seeking behavior’

The COVID-19 pandemic has undoubtedly magnified the role of online communities in campus life. “Because everyone’s so digital now, this is the main way people communicate across campus instead of meeting in person,” Obi said. 

Ahmed similarly said that he and his fellow moderators took into account the fact that aside from online platforms like Discord, first years’ opportunities to socialize and meet new people were especially limited in the fall semester.

Yet, even as the role of the moderator has become more significant, not much has changed about the underlying forces at play. Student moderators still consider how to improve the cultures and content of their forums, and still worry about the consequences of harmful posts. In helping to shape conversations and create communities, they take on a leadership role among peers that is simultaneously powerful and understated. 

In the Record’s email exchange with Rura, who has worked as a software product manager and engineer since graduating from the College, much of what he said sounded like it could have come from any of the present-day student moderators — and the experiences which informed it could have been recounted by any of them.

“Across all communities — online and in-person — there’s always the potential for destructive attention-seeking behavior,” he wrote. “I don’t think this is unique to social media at all, because this challenge is fundamental to any society and manifests in all sorts of social norms … We’ve just had to evolve different controls in online media.”

A Discord may not look the same as a listserv; Facebook is far more than just any forum. But the continuity of the challenges that these students have faced, across platforms and across time, cannot be ignored. A techno-pessimist might claim that such continuity underscores the problematic aspects of social media. Others, including Rura, see it as a sign of something inescapably and innately human.