Algorithmic Hiring Bias

While applying for summer internships, I noticed that many companies required me to complete a "game" at the screening stage to determine whether I got an interview. I quickly became curious how a game could judge whether I deserved a role, and I discovered that companies use hiring algorithms to compare applicants' performance on these games to that of current employees. This instantly worried me as someone who is a minority in the careers I am pursuing: if companies are comparing me to a group of people with different identities and experiences, how can they judge me fairly with these "games"?

Therefore, I decided to research algorithmic hiring bias with a focus on gender, to understand why companies use these algorithms, the bias within them, and the computational priorities behind them. This research culminated in a presentation addressing these questions and proposing potential engineering-based solutions to the problems with these models.

Slide Presentation

I created this slide deck to support my scripted presentation for individuals who are familiar with algorithmic hiring but do not know much about bias in this process.
I focused on visuals so the audience would concentrate on my spoken words. Instead of just using bullet points that mirrored my speaking points, I used colors, diagrams, and pictures to make the presentation more engaging. I also used black slides so that listeners would not be distracted by the screen when it did not supplement my presentation.

The presentation above stemmed from my research-based argument paper, Bias in Algorithmic Hiring, which dives deeper than the presentation into case studies and details on the topic.

Integrating De-Biasing Methods In Coursework

During my research, I discussed this project with my linear algebra professor, Stephen Boyd. In this discussion, I told him how topics from the course like constrained least squares and multi-objective least squares could help de-bias algorithmic hiring processes. We also discussed how we rarely mention societally positive applications of linear algebra, such as de-biasing, in the course.

We agreed that students should be exposed to de-biasing in the course so they can see how the concepts they are learning can directly help de-bias algorithmic processes, so we wrote a course problem on algorithmic hiring. Here, the student audience was key: by embedding an example of socially impactful math directly into a standard curriculum, I could spark broader interest and reflection among aspiring engineers and data scientists. Integrating real-world bias concerns, and practical ways to mitigate them, offers a more concrete and relevant learning experience, bridging linear algebra with ethical and societal applications.
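To make the multi-objective least squares idea concrete, here is a minimal sketch, not the actual course problem, using entirely synthetic data. A linear screening score is fit to past outcomes while a second objective penalizes the gap in mean predicted score between two demographic groups; the weight lambda trades off fit against fairness, and the combined problem is solved by stacking the fairness term as an extra least-squares row:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic screening data (hypothetical): rows are applicants,
# columns are game-derived features; g marks group membership.
n, p = 200, 5
A = rng.normal(size=(n, p))
g = rng.integers(0, 2, size=n)           # 0/1 demographic group
A[g == 1] += 0.5                         # inject a group-correlated feature shift
b = A @ np.array([1.0, -0.5, 0.3, 0.0, 0.2]) + 0.1 * rng.normal(size=n)

# d @ x is the gap in mean predicted score between the two groups.
d = A[g == 1].mean(axis=0) - A[g == 0].mean(axis=0)

def fit(lam):
    """Multi-objective least squares:
    minimize ||A x - b||^2 + lam * (d @ x)^2,
    solved by appending sqrt(lam) * d as one extra row of A."""
    A_aug = np.vstack([A, np.sqrt(lam) * d])
    b_aug = np.concatenate([b, [0.0]])
    x, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
    return x

for lam in [0.0, 10.0, 1e4]:
    x = fit(lam)
    print(f"lambda={lam:>8}: score gap={abs(d @ x):.4f}, "
          f"fit error={np.linalg.norm(A @ x - b):.3f}")
```

Sweeping lambda traces out the trade-off: as lambda grows, the between-group score gap shrinks toward zero while the fitting error rises only slightly. The constrained least squares variant discussed in the course corresponds to the limit where the gap is forced to be exactly zero.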

Creating a professional slide deck and a course problem helped me grow significantly as a communicator, showing me how each audience requires a unique modality, level of detail, and rhetorical emphasis. Looking ahead, I plan to continue weaving real-world societal issues into core engineering coursework during my graduate studies, especially by crafting more problems that link data science with ethics and social impact. By investigating algorithmic hiring bias, I gained deeper insight into equity in hiring, discovering how math and machine learning can both perpetuate and combat discrimination. This exploration involved analyzing historical trends, designing and presenting potential solutions, and ultimately demonstrating how a linear algebra-based approach can mitigate bias in screening algorithms.
