The title of the research article is "Unsupervised Discovery of Gendered Language through Latent-Variable Modeling." We think it might be more aptly titled, "How Computers Figured Out What Women Already Knew."
You can read a quick summary of what the computers did here, along with a list of the words they identified--and the value judgments ascribed to them (positive or negative).
The words describing women offer us little or no agency. Further, "unmarried" appears as a negative word describing a woman, while "unarmed" appears as a negative word describing a man. We're considering writing a whole separate piece just on those two words and their value judgments.
Here's how the process worked:
“The algorithms work to identify patterns, and whenever one is observed, it is perceived that something is ‘true.’ If any of these patterns refer to biased language, the result will also be biased. The systems adopt, so to speak, the language that we people use, and thus, our gender stereotypes and prejudices,” says computer scientist and assistant professor Isabelle Augenstein.
Augenstein gives an example of how this affects our daily lives--and economic prospects: “If the language we use to describe men and women differs in employee recommendations, for example, it will influence who is offered a job when companies use IT systems to sort through job applications.”
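The mechanism Augenstein describes, where patterns counted from biased text get treated as "true," can be sketched with a toy example. To be clear: the tiny corpus and counting function below are invented for illustration and are not the paper's actual latent-variable model; they only show how skew in the input text reappears as skew in what a system "learns."

```python
# Toy sketch: co-occurrence counting over a (deliberately) biased corpus.
# This is NOT the paper's method -- just an illustration of how pattern
# extraction inherits the bias already present in the text.

from collections import defaultdict

# Hypothetical corpus; the gender skew is baked into the sentences.
corpus = [
    "he is a brilliant engineer",
    "she is a beautiful assistant",
    "he is a strong leader",
    "she is a lovely nurse",
    "he is a decisive manager",
]

def cooccurring_words(sentences, pronoun):
    """Count content words appearing in sentences with the given pronoun."""
    counts = defaultdict(int)
    stopwords = {"he", "she", "is", "a"}
    for sentence in sentences:
        words = sentence.split()
        if pronoun in words:
            for w in words:
                if w not in stopwords:
                    counts[w] += 1
    return dict(counts)

he_words = cooccurring_words(corpus, "he")
she_words = cooccurring_words(corpus, "she")

# The learned "pattern" simply mirrors the corpus: competence words
# cluster around "he", appearance words around "she".
print(sorted(he_words))   # e.g. brilliant, decisive, engineer, ...
print(sorted(she_words))  # e.g. assistant, beautiful, lovely, ...
```

Scale this up from five sentences to billions, and you get the effect described above: a hiring system trained on such text would associate men and women with different qualities simply because the training data did.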
Today's reminder that words matter.