Parker Programmers Peek Behind the Tech Curtain

Where do your Facebook posts come from? Who decides which tweet you see at the top of your timeline when you log in to Twitter? While some may be content to leave these questions unanswered, Parker’s faculty understand the value of exploring these ideas, which have a tremendous impact on our lives and on the future of technology as a whole. To that end, Middle and Upper School Technology and Innovation teacher Seth Bacon and Upper School Computer Science teacher Brianna Ifft oversaw a full day of conversation between Upper School Programming I students and professional software engineer Emily Chao.

Chao graduated from the University of Illinois at Urbana-Champaign with a B.S. in computer science before landing a job at Twitter, where they worked on a team focused on the “health of the conversation on Twitter.” Chao spent most of their time “working on detections and remediations that support user trust and safety and prevent harassment, abuse and manipulation in online discourse on the platform.” Leaving the tech giant behind, Chao then became the 40th employee at Recidiviz, where they work on tools and data platforms that help state Departments of Corrections better understand their populations and assist prisoners in advocating for themselves, with the goal of moving the U.S. criminal justice system toward decarceration. This experience has led Chao to care deeply about tech policy, collective trust and safety in online discourse, and ethical and fair ways to use Machine Learning/Artificial Intelligence technologies.

In addressing the current state of tech policy and how we arrived at this point, Chao explained that the early Internet consisted mostly of online message boards and private servers. Then came the Telecommunications Act of 1996, which operated on the “bookshop” precedent: just as the owner of an independent bookshop cannot be expected to read everything they sell, platforms were not held legally responsible for monitoring everything posted on them. However, as more people obtain their news from the Internet, governments worldwide are becoming increasingly concerned about this lack of legal responsibility, given the large role tech companies play in regulating and facilitating this information.

As content grew past the point of proper monitoring, companies recognized the need to add “operations, practices and solutions” to maintain their customers’ trust. And this was not limited to social media; financial websites, for example, need to foster a sense of safety for people who use credit or debit cards online. In response to this need for “Trust and Safety,” companies have implemented Terms of Service (TOS) policies. However, because advancements in technology are outstripping the legal system, companies, rather than governments, are setting the precedents for enforcing TOS.

Chao explained that companies use a threefold set of tools to monitor content:
  • Features for users, such as “block” or “mute” or being able to report “I don’t like this”
  • Content moderators, who use policies, investigatory tools and rules engines to put certain offenders in remediation
  • Machine Learning models, which detect violating content and help rank reports for moderators to review. Engineers must first train the models: a company like Twitter would train a model on tweets that humans had labeled “abusive” or “not abusive,” deploy the model so that every new tweet is assigned a toxicity score and, finally, take a set of actions based on that score
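The label–train–score–act loop described in that last bullet can be sketched in a toy Python example. Everything here is hypothetical: the word-frequency “model,” the thresholds and the function names are vastly simpler than any real production system, but the flow (learn from human-labeled examples, score new content, act on the score) mirrors the workflow Chao outlined.

```python
# Toy sketch of a moderation pipeline: train on human labels,
# score new posts, take an action based on the score.
# All names and numbers are illustrative, not any company's real system.

def train_model(labeled_posts):
    """Learn, for each word, what fraction of its uses were in abusive posts."""
    abusive_counts, total_counts = {}, {}
    for text, is_abusive in labeled_posts:
        for word in text.lower().split():
            total_counts[word] = total_counts.get(word, 0) + 1
            if is_abusive:
                abusive_counts[word] = abusive_counts.get(word, 0) + 1
    return {w: abusive_counts.get(w, 0) / n for w, n in total_counts.items()}

def toxicity_score(model, text):
    """Average the per-word scores; words never seen count as neutral (0.5)."""
    words = text.lower().split()
    return sum(model.get(w, 0.5) for w in words) / len(words)

def moderate(model, text, remove_at=0.8, review_at=0.5):
    """Map a score to one of the actions a platform might take."""
    score = toxicity_score(model, text)
    if score >= remove_at:
        return "remove"
    if score >= review_at:
        return "send to human moderator"
    return "allow"

# Tiny hand-labeled training set (the human-labeling step from the talk).
labeled = [
    ("you are awful and stupid", True),
    ("awful weather today", False),
    ("have a great day", False),
    ("you are stupid", True),
]
model = train_model(labeled)
print(moderate(model, "you are awful"))    # prints "remove" (score ≈ 0.83)
print(moderate(model, "have a great day")) # prints "allow"
```

Even this toy version shows why the approach is fragile: “awful” appears in both an abusive and an innocent post, so its score is ambiguous, and any bias in the human labels flows straight into the model.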
Although it is perhaps a necessary evil, Chao was very open with students about this third method’s faults. First, despite the vast amount of content, the volume of toxic content is relatively small, “almost like finding a needle in a haystack.” Next, deciding whether a piece of content is abusive is a very nuanced task that requires human input, a process that can be tedious and traumatizing for moderators. Further, sometimes these models can perpetuate discrimination when factors like systemic racism directly affect the data and make their way into the model. Finally, the engineers and others fighting to implement Trust and Safety sometimes encounter resistance from the company itself since, at the end of the day, they make more money when they show more content.

Chao ended the presentation by speaking about changes they would like to see, beginning with more government oversight of the largest tech companies, which means more tech-minded and “tech-adjacent people” helping governments make decisions that aren’t based on a profit model. Next, they hope more tech workers will organize and begin movements, such as those highlighted by the Engelberg Center on Innovation Law and Policy at the NYU School of Law. Finally, computer scientists need to bring more education to the general public.

After Chao finished speaking, the Programming I students asked numerous questions on a range of topics, from “How effective are the features users can use in determining what content you see?” to “How can we support tech workers who are beginning to advocate for themselves?”

Ifft commented, “I’m encouraged by my students’ takeaways. Many of them shared that this talk will change the way they think about their privacy (or lack thereof) on the Internet, and I think they’re still mulling over the idea that they have to give up a bit of privacy to be kept safe. Overall, I’m glad they were challenged by Emily’s presentation and that it has them thinking; that was our goal going into this, and it seems it was accomplished.” Bacon added, “In a year when misinformation and disinformation played a central role in current events and national politics, this talk helped students wrap their minds around these nuanced issues and learn about the major players involved. Emily drew a line from their sophomore year Programming class to their current career, which students could connect to their personal experience.”

While Parker students are learning the fundamentals of programming, we applaud Bacon and Ifft for providing access to an expert who has succeeded in this field.
