This article reports how 18 UK and Canadian population health artificial intelligence researchers in Higher Education Institutions perceive the use of artificial intelligence systems in their research, and how this compares with their perceptions of the media portrayal of artificial intelligence systems. This is triangulated with a small scoping analysis of how UK and Canadian news articles portray artificial intelligence systems associated with health research and care. Interviewees had concerns about what they perceived as sensationalist reporting of artificial intelligence systems, a finding reflected in the media analysis. In line with Pickersgill's concept of 'epistemic modesty', they considered that artificial intelligence systems were better understood as non-exceptionalist methodological tools that are uncertain and unexciting. Adopting 'epistemic modesty' was sometimes hindered by the stakeholders to whom the research is disseminated, who may be less interested in hearing about the uncertainties of scientific practice, which has implications for both research and policy.

Original publication

DOI

10.1177/0963662520965490

Type

Journal article

Journal

Public Understanding of Science

Publication Date

02/2021

Volume

30

Pages

196 - 211