Journal article
Likelihood of Questioning AI-Based Recommendations Due to Perceived Racial/Gender Bias
IEEE Transactions on Technology and Society, Volume: 3, Issue: 1, Pages: 41 - 45
Swansea University Author: Denis Dennehy
PDF | Version of Record
This work is licensed under a Creative Commons Attribution 4.0 License
Download (766.59KB)
DOI (Published version): 10.1109/tts.2021.3120303
Abstract
Advances in artificial intelligence (AI) are giving rise to a multitude of AI-embedded technologies that are increasingly impacting all aspects of modern society. Yet, there is a paucity of rigorous research that advances understanding of when, and which type of, individuals are more likely to question AI-based recommendations due to perceived racial and gender bias. This study, which is part of a larger research stream, contributes to knowledge by using a scenario-based survey issued to a sample of 387 U.S. participants. The findings suggest that, considering perceived racial and gender bias, human resource (HR) recruitment and financial product/service procurement scenarios exhibit a higher questioning likelihood, while the healthcare scenario presents the lowest questioning likelihood. Furthermore, in the context of this study, U.S. participants tend to be more likely to question AI-based recommendations due to perceived racial bias than due to perceived gender bias.
Published in: IEEE Transactions on Technology and Society
ISSN: 2637-6415
Published: Institute of Electrical and Electronics Engineers (IEEE), 2022
Online Access: Check full text
URI: https://cronfa.swan.ac.uk/Record/cronfa59910
Authors: Carlos M. Parra (ORCID: 0000-0001-6029-4512), Manjul Gupta, Denis Dennehy (ORCID: 0000-0001-9931-762X)
Published online: 16 March 2022
Department: School of Management - Business Management, Faculty of Humanities and Social Sciences, Swansea University