Under the guidance of their faculty mentor Pauline Wade (center) and teaching assistant Sai Harini (second from right), the team of students worked to help bring awareness to and address the long-standing problem of unconscious bias in syllabi. From left, Seungjeh Lee, Christopher Mei, Khunnapat Reanpongnam, Wade, Annette Morales, Harini and Brandon Gonzales. | Image: Courtesy of Pauline Wade

Because unconscious bias can occur unintentionally or without conscious realization, most people are not even aware when it happens. Without that awareness, it becomes hard to prevent. For their capstone design project, a team of five Texas A&M University computer science and engineering students built a software system to detect unconscious bias in course syllabi while also helping bring awareness to this long-standing problem.

The team includes Brandon Gonzales, Seungjeh Lee, Christopher Mei, Annette Morales and Khunnapat Reanpongnam.  

Using machine-learning models that can identify the presence of unconscious bias in syllabi, the students aimed to understand how language can impact students from underrepresented minority groups. One way bias can appear in a class syllabus is through language suggesting that students should have prior knowledge of a program or skill, even though the class has no prerequisite. Another is using a singular gendered pronoun, such as “he,” to refer to all students.
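The article doesn’t describe the team’s detection logic, but a minimal rule-based sketch of flagging those two cues might look like the following; the phrase lists are illustrative assumptions, not the team’s actual criteria.

```python
import re

# Illustrative cue lists -- the team's actual criteria are not published.
ASSUMED_KNOWLEDGE = [
    r"you should already (?:know|be familiar with)",
    r"students are expected to have experience with",
    r"assumes (?:prior|previous) (?:knowledge|experience)",
]
GENDERED_PRONOUNS = [r"\bhe\b", r"\bhis\b", r"\bhim\b"]

def flag_bias_cues(syllabus_text: str) -> list[tuple[str, str]]:
    """Return (category, matched text) pairs for each cue found."""
    flags = []
    lowered = syllabus_text.lower()
    for pattern in ASSUMED_KNOWLEDGE:
        for match in re.finditer(pattern, lowered):
            flags.append(("assumed prior knowledge", match.group()))
    for pattern in GENDERED_PRONOUNS:
        for match in re.finditer(pattern, lowered):
            flags.append(("gendered singular pronoun", match.group()))
    return flags

print(flag_bias_cues(
    "Each student should submit his homework. "
    "This course assumes prior experience with Python."
))
```

Rules like these only catch surface patterns, which is one reason the team turned to machine-learning models trained on labeled examples.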

“Unconscious bias comes in many forms, but we are trying to solve the issue that there could be language in a class syllabus directed at certain people that makes them feel like they’re being discriminated against and shouldn’t take a particular class,” said Mei. “It can lead them to feel like they aren’t prepared, so then they drop the class.”

The students worked with Microsoft to develop the system and had the unique opportunity to be mentored by former computer science and engineering student Josue Martinez-Montero ’13, who serves as director for product strategy and customer engagement and fraud protection.  

“Coming back to the capstone program, this time as an industry mentor, was a fantastic experience,” Martinez-Montero said. “I had a great time working with the senior design team on a project that has industry-wide implications and considerable impact. The program at Texas A&M gave me real-world experience and a chance to work on projects that go beyond what you can experience in a purely academic setting.”

The students said Martinez-Montero’s presentation on the topic motivated and inspired them to take on the challenge of bias in academia.

“I knew our mentor because he was a recruiter at the career fair my freshman year,” Lee said. “He explained to us how important the topic of unconscious bias is in the tech industry right now and how everyone’s trying to tackle it. I thought it would be cool for us to understand what’s going on.”

Academic settings are not immune to the effects of bias, and unconscious bias in syllabi can cause several issues, including making students feel intimidated or alienated in their classes.

“My initial understanding of unconscious bias was limited to thinking it was only done toward ethnic or racial groups. But there are many types of bias,” said Gonzales. “Our mentor pointed out why a syllabus shouldn’t have any negative sentiment. You’re paying to attend university and these classes; it shouldn’t discourage you for any reason.”

To test their hypothesis, the team manually combed through old syllabi to identify whether the words or phrases they came across were biased. This task proved to be a bigger challenge than they initially expected.

“Gathering the data to train the model was one of the hardest parts of this project,” said Morales. “We had to make sure we weren’t feeding any of our own biases into the model while classifying different words because then that would make any data the model puts out biased as well.”
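The article doesn’t say how the team checked their labels, but a common safeguard against feeding annotator bias into training data is to have several people label the same items independently and measure their agreement. A minimal sketch using Cohen’s kappa from scikit-learn, with hypothetical labels:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical labels from two independent annotators for the same
# eight phrases: 1 = biased, 0 = neutral.
annotator_a = [1, 0, 1, 1, 0, 0, 1, 0]
annotator_b = [1, 0, 0, 1, 0, 0, 1, 1]

# Cohen's kappa corrects raw agreement for chance. Values near 1.0
# indicate consistent labeling; low values suggest the labels reflect
# individual judgment more than the text itself.
kappa = cohen_kappa_score(annotator_a, annotator_b)
print(f"Inter-annotator agreement (kappa): {kappa:.2f}")
```

Disagreements surfaced this way can be discussed and resolved before the phrases are used for training.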

Using that labeled dataset, the team used supervised learning to train their system to recognize keywords and phrases they had classified as highly biased. When a new syllabus is fed into the system, an algorithm evaluates it and assigns a score representing how biased the syllabus is determined to be.
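The team’s exact model and scoring algorithm aren’t described, but a minimal sketch of such a supervised pipeline, assuming a TF-IDF text representation with logistic regression and hypothetical training phrases, could look like this:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training phrases labeled 1 (biased) or 0 (neutral).
phrases = [
    "you should already know how to program in C++",
    "each student must bring his own laptop",
    "no prior experience is required for this course",
    "all students are welcome regardless of background",
]
labels = [1, 1, 0, 0]

# Vectorize the text and fit a classifier in one pipeline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(phrases, labels)

# The predicted probability of the "biased" class serves as a simple
# 0-to-1 bias score for a new syllabus excerpt.
new_syllabus = "students should already know basic data structures"
score = model.predict_proba([new_syllabus])[0][1]
print(f"Bias score: {score:.2f}")
```

In practice a real system would train on far more examples and would likely score a syllabus section by section rather than as a single string.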

Overall, the team said the learning experience the capstone program provides will help prepare them for their future careers.

“Working as a team toward a common goal, you have to learn from others,” Mei said. “Coding and basic stuff like that was my practice, and I learned a lot from this class that I didn’t know before going in, especially with machine learning. Gaining experiences that I never had before was something I really enjoyed.”

The collaboration with Microsoft was made possible by the department’s industry capstone program, where companies can sponsor a team of students to work on a solution to a multi-disciplinary computing problem.