
We Know AI is Biased; This Design Approach May Help Fix It

Bias in artificial intelligence (AI) and machine learning programs is well established. Researchers from North Carolina State University and Pennsylvania State University are now proposing that software developers incorporate the concept of “feminist design thinking” into their development process as a way of improving equity – particularly in the development of software used in the hiring process.

“There seem to be countless stories of ways that bias in AI is manifesting itself, and there are many thought pieces out there on what contributes to this bias,” says Fay Payton, a professor of information systems/technology and University Faculty Scholar at NC State. “Our goal here was to put forward guidelines that can be used to develop workable solutions to algorithmic bias against women, African American and Latinx professionals in the IT workforce.

“Too many existing hiring algorithms incorporate de facto identity markers that exclude qualified candidates because of their gender, race, ethnicity, age and so on,” says Payton, who is co-lead author of a paper on the work. “We are simply looking for equity – that job candidates be able to participate in the hiring process on an equal footing.”

Payton and her collaborators argue that an approach called feminist design thinking could serve as a valuable framework for developing software that reduces algorithmic bias in a meaningful way. In this context, the application of feminist design thinking would mean incorporating the idea of equity into the design of the algorithm itself.
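The paper focuses on design process rather than code, but as a minimal, hypothetical sketch of what building equity checks into a hiring pipeline might look like, developers could audit a model's recommendations for disparate selection rates across demographic groups. The example below uses the widely cited “four-fifths” rule of thumb; the group labels and audit data are invented for illustration and are not drawn from the study.

```python
# Illustrative sketch only: a simple adverse-impact audit of a hiring model's
# recommendations, comparing selection rates across demographic groups against
# the "four-fifths" rule of thumb. Group labels and data are hypothetical.
from collections import defaultdict

def selection_rates(candidates):
    """candidates: list of (group, selected) pairs, where selected is True/False."""
    totals = defaultdict(int)
    chosen = defaultdict(int)
    for group, was_selected in candidates:
        totals[group] += 1
        if was_selected:
            chosen[group] += 1
    return {g: chosen[g] / totals[g] for g in totals}

def adverse_impact(rates, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` times the
    highest group's rate (the four-fifths rule)."""
    best = max(rates.values())
    return {g: r / best for g, r in rates.items() if r / best < threshold}

# Hypothetical audit data: (demographic group, model recommended interview?)
audit = [("group_a", True), ("group_a", True), ("group_a", False),
         ("group_b", True), ("group_b", False), ("group_b", False)]

rates = selection_rates(audit)
print("Selection rates:", rates)
print("Groups below four-fifths threshold:", adverse_impact(rates))
```

A check like this is only one narrow, downstream safeguard; the framework the authors describe asks designers to consider equity across gender, race and ethnicity throughout the design process, not just in after-the-fact audits.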

“Compounding the effects of algorithmic bias is the historical underrepresentation of women, Black and Latinx software engineers, who could otherwise provide novel insights into equitable design approaches based on their lived experiences,” says Lynette Yarger, co-lead author of the paper and an associate professor of information sciences and technology at Penn State.

“Essentially, this approach would mean developing algorithms that value inclusion and equity across gender, race and ethnicity,” Payton says. “The practical application of this is the development and implementation of a process for creating algorithms in which designers are considering an audience that includes women, that includes Black people, that includes Latinx people. Essentially, developers of all backgrounds would be called on to actively consider – and value – people who are different from themselves.

“To be clear, this is not just about doing something because it is morally correct. But we know that women, African Americans and Latinx people are underrepresented in IT fields. And there is ample evidence that a diverse, inclusive workforce improves a company’s bottom line,” Payton says. “If you can do the right thing and improve your profit margin, why wouldn’t you?”

The paper, “Algorithmic equity in the hiring of underrepresented IT job candidates,” is published in the journal Online Information Review. The paper was co-authored by Bikalpa Neupane of Penn State.
