DOI: 10.1126/science.abe9195
Achieving fairness in medical devices
Achuta Kadambi
2021-04-02
Journal: Science
Year: 2021
Abstract: The hardware or software that operates medical devices can be biased. A biased device is one that operates in a manner that disadvantages certain demographic groups and contributes to health inequity. Reducing bias is one way to increase fairness in the operation of a medical device. Initiatives to promote fairness are growing rapidly in a range of technical disciplines, but this growth has not been rapid enough in medical engineering. Although computer science companies have terminated lucrative but biased facial recognition systems, biased medical devices continue to be sold as commercial products. It is important to address bias in medical devices now. This can be achieved by studying where and how bias arises; that understanding can then inform mitigation strategies.

Bias in medical devices can be divided into three broad forms (see the figure). A medical device can exhibit physical bias, where its physical principles are biased against certain demographics. Once data are collected, computational bias, which pertains to the distribution, processing, and computation of the data used to operate a device, must be considered. Subsequent implementation in clinical settings can lead to interpretation bias, where clinical staff or other users may interpret device outputs differently based on demographics.

The physical working principle of a medical device is biased when it exhibits an undesirable performance variation across demographic groups. An example of physical bias occurs in optical biosensors that use light to monitor vital signs. A pulse oximeter uses two colors of light (one in the near-infrared and the other in the visible range) to measure blood oxygenation. Through pulse oximetry, it is possible to detect occult hypoxemia, low levels of arterial oxygen saturation that are not detectable from symptoms. However, a recent study found that Black patients had about three times the frequency of occult hypoxemia that was not detected by pulse oximetry (1). Dark skin tones respond differently to these wavelengths of light, particularly visible light. Because hypoxemia relates to mortality, such a biased medical device could lead to disparate mortality outcomes for Black and dark-skinned patients.

Physical bias is not restricted to skin color. For example, the mechanical design of implants for hip replacement exhibits a potentially troubling gender disparity. The three-dimensional models used to design hip-joint implants sometimes do not account for the distinct bone structure of female hips (2). This can lead to alignment issues and relatively poor outcomes for affected females. This problem was one motivation for the development of gender-specific implants.

Fortunately, physical challenges can also be addressed through unexpected technical innovation, as in the example of the remote plethysmograph. This device measures heart rate through visual changes in skin color. Because visual cues are biased, researchers developed an alternative approach that uses motion cues to estimate heart rate. Because these motions are visible on the surface of the skin, the technique is less biased by subsurface melanin content (3). With the goal of promoting fairness, an exciting technical direction of studying motion cues instead of color cues has been advanced.

[Figure: Measuring fairness. Fairness can be quantified based on ϵ-bias; fairness is maximized when ϵ = 0, achieving a state of 0-bias. GRAPHIC: N. DESAI/SCIENCE]
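The two-wavelength principle behind pulse oximetry can be made concrete with a short sketch. The Python example below is a minimal illustration, not the algorithm of any particular device: it applies the standard ratio-of-ratios calculation to synthetic red and infrared photoplethysmography signals, and the linear calibration at the end is a commonly quoted approximation, whereas commercial oximeters rely on empirically derived lookup tables.

```python
import numpy as np

def spo2_ratio_of_ratios(red_ppg, ir_ppg):
    """Estimate SpO2 (%) from red and infrared photoplethysmography signals.

    Textbook ratio-of-ratios method: the pulsatile (AC) component at each
    wavelength is normalized by its baseline (DC) component, and the ratio of
    those two quantities is mapped to a saturation value.  The linear map
    below (SpO2 ~ 110 - 25*R) is a commonly quoted approximation; commercial
    oximeters use empirically calibrated lookup tables instead.
    """
    red_ac = red_ppg.max() - red_ppg.min()   # pulsatile amplitude, red
    red_dc = red_ppg.mean()                  # baseline intensity, red
    ir_ac = ir_ppg.max() - ir_ppg.min()      # pulsatile amplitude, infrared
    ir_dc = ir_ppg.mean()                    # baseline intensity, infrared

    r = (red_ac / red_dc) / (ir_ac / ir_dc)  # ratio of ratios
    return 110.0 - 25.0 * r                  # approximate linear calibration

# Toy signals: a baseline plus a small cardiac pulsation at each wavelength.
t = np.linspace(0, 10, 1000)
red = 1.00 + 0.020 * np.sin(2 * np.pi * 1.2 * t)
ir = 1.00 + 0.035 * np.sin(2 * np.pi * 1.2 * t)
print(f"Estimated SpO2: {spo2_ratio_of_ratios(red, ir):.1f}%")

# Skin-tone-dependent absorption of the visible (red) wavelength changes the
# measured ratio R, which is one route by which the disparity reported in (1)
# can arise.
```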
Computational workflows are becoming more tightly coupled with devices, which increases the number of entry points where computational bias can enter medical technologies. One aspect of computational bias is dataset bias. Consider the following example from x-ray imaging: Diagnostic algorithms can learn patterns from x-ray imaging datasets of thoracic conditions. However, these imaging datasets often contain a surprising imbalance in which females are underrepresented. For example, despite having sample sizes of more than 100,000 images, frequently used chest x-ray databases are ∼60% male and ∼40% female (4). This imbalance worsens the quality of diagnosis for female patients. A solution is to ensure that datasets are balanced. Somewhat unexpectedly, balancing the gender representation to 50% female boosts diagnostic performance not only for females but also for males (4).

Despite best efforts, demographic balancing of a dataset might not be possible. This could be because some conditions present more often in one sex than in the other. In such cases, where balancing a dataset is truly infeasible, transfer learning can be used as a step toward a longer-term solution (5). Transfer learning could repurpose design parameters from task A (based on a balanced dataset) to task B (with an unbalanced dataset). In the future, it might be possible to balance a dataset using a human digital twin. These are computational models that can be programmed to reflect a desired race, sex, or morphological trait.

Another form of computational bias is algorithm bias, where the mathematics of data processing disadvantages certain groups. Software algorithms can now process video streams to detect the spontaneous blink rate of a human subject, which is helpful in diagnosing a variety of neurological disorders, including Parkinson's disease (6) and Tourette syndrome (7). Unfortunately, traditional image-processing systems have particular difficulty detecting blinks in Asian individuals (8). The use of such poorly designed and biased algorithms (9) could produce or exacerbate health disparities between racial groups.

Interpretation bias occurs when a medical device is subject to biased inference of its readings. An example of a misinterpreted medical device is the spirometer, which measures lung capacity. The interpretation of spirometry data creates unfairness because certain ethnic groups, such as Black and Asian people, are assumed to have lower lung capacity than white people: 15% lower for Black people and about 5% lower for Asian people. This assumption is based on earlier studies that may have incorrectly estimated innate lung capacity (10). Unfortunately, these “correction factors,” based on questionable assumptions, are applied to the interpretation of spirometer data. For example, before “correction,” a Black person's lung capacity might be measured to be lower than that of a white person. After “correction” to a smaller baseline lung capacity, treatment plans would prioritize the white person, because a Black person is expected to have lower lung capacity, and so their measured capacity must fall much further below the white person's before the reduction is considered a priority.

[Figure: Bias in medical devices. A device can be biased if its design disadvantages certain groups on the basis of their physical attributes, such as skin color; for example, pulse oximeters detect changes in light passed through skin and are less effective in people with dark skin. Computational techniques are biased if training datasets are not representative of the population. Interpretation of results may be biased across demographic groups, for example, through the use of “correction factors.” CREDIT: N. DESAI/SCIENCE]
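A short worked example may help make the “correction” arithmetic above concrete. In the sketch below, the 15% and roughly 5% reduction factors are the ones quoted above; the predicted capacity, the measured reading, and the 80%-of-predicted flagging threshold are hypothetical numbers chosen purely for illustration.

```python
# How a race-based "correction factor" changes the interpretation of an
# identical spirometry reading.  Only the 15% (Black) and ~5% (Asian)
# reduction factors come from the text; everything else is illustrative.

PREDICTED_CAPACITY_L = 5.0   # hypothetical race-neutral predicted capacity (liters)
CORRECTION = {"white": 1.00, "Black": 0.85, "Asian": 0.95}  # factors described above
FLAG_THRESHOLD = 0.80        # flag as reduced below 80% of predicted (illustrative)

def interpret(measured_l: float, race: str) -> str:
    """Report whether a measurement is flagged as reduced under race 'correction'."""
    corrected_predicted = PREDICTED_CAPACITY_L * CORRECTION[race]
    fraction = measured_l / corrected_predicted
    verdict = "flagged as reduced" if fraction < FLAG_THRESHOLD else "treated as normal"
    return (f"{race}: measured {measured_l:.1f} L is {100 * fraction:.0f}% of the "
            f"'corrected' predicted value -> {verdict}")

# The same reading, 3.7 L, for two patients:
for race in ("white", "Black"):
    print(interpret(3.7, race))
# white: measured 3.7 L is 74% of the 'corrected' predicted value -> flagged as reduced
# Black: measured 3.7 L is 87% of the 'corrected' predicted value -> treated as normal
# The identical reading is prioritized for the white patient but not for the Black patient.
```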
However well intentioned, errors in “correction” for race (or sex) can disadvantage the very groups the correction seeks to protect. In the spirometer example, the device designers conflated a racial group's healthy lung capacity with its average lung capacity. This assumption does not account for socioeconomic distinctions across race: Individuals who live near motorways exhibit reduced lung capacity, and these individuals are often from disadvantaged ethnic groups. The spirometer is just one of several examples of systemic racism in medicine (11).

If our society desires fair medical devices, it must reward a fair approach to innovation. It is inspiring to observe the speed at which the artificial intelligence (AI) community has recognized fairness in its endeavors. Journals can encourage authors to address the societal implications of their technologies and to include a “broader impacts” statement that is considered in peer review. This has already been introduced at an AI journal to encourage consideration of the diversity of potential users of their software (12). Fairness research in AI is increasingly garnering scholarly acclaim. For example, a seminal report on bias in face recognition found that darker-skinned females are misclassified at rates of up to 34.7%, whereas the maximum error rate for lighter-skinned males is only 0.8% (13). In response to fairness concerns, action is being taken. For example, Amazon Inc. has recently banned the use of its facial-recognition products by police until bias concerns can be resolved. There is still a long way to go in addressing bias in AI, but some of the lessons learned can be repurposed for medical devices.

A “fairness” statement for the evaluation of medical-device studies could use the three categories of bias as a rubric: physical bias, computational bias, and interpretation bias. A medical-device study does not need to be perfectly unbiased to be reported. Indeed, it may not always be possible to remove all sources of bias. For example, an oximeter reliant on an optical sensor is likely to remain biased against dark skin (1). The fairness statement can consist of technical explanations of how attempts to mitigate bias failed and can suggest technical compensations for disadvantaged groups (e.g., collecting additional data points for dark-skinned people). This is consistent with the introduction of “positive biases,” where race-aware and gender-aware methodologies are explicitly designed to counteract negative bias (14).

Additionally, the inclusion of fairness metrics in medical-device studies could be considered. Choosing the right fairness metric for an algorithm is a quantitatively challenging computer science exercise (15) and can be abstracted here as “ϵ-bias,” where ϵ quantifies the degree of bias across subgroups. For example, 0-bias would be seen as perfectly fair. Achieving 0-bias on its own is trivial: Simply return a measurement that is consistently useless across demographics. The problem is to maximize performance and minimize ϵ-bias.
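The sketch below is one possible reading of the ϵ-bias abstraction above, not a metric prescribed by the cited references: ϵ is taken as the largest gap in a performance measure across subgroups, and a hypothetical device-configuration parameter is swept to trace the performance-versus-ϵ-bias trade-off that the Pareto analysis discussed next would display. The per-group accuracy curves are synthetic placeholders.

```python
import numpy as np

def epsilon_bias(scores_by_group: dict) -> float:
    """epsilon-bias: largest performance gap across subgroups (0 = perfectly fair)."""
    values = list(scores_by_group.values())
    return max(values) - min(values)

def evaluate(config: float) -> dict:
    """Hypothetical per-group accuracy of a device at a given configuration.

    Purely synthetic curves: increasing `config` favors group A and slightly
    penalizes group B, so overall accuracy and the group gap do not move in
    lockstep across the range.
    """
    return {"group_A": 0.70 + 0.25 * config,
            "group_B": 0.82 - 0.10 * config}

# Sweep configurations and record (mean performance, epsilon-bias) pairs.
points = []
for config in np.linspace(0.0, 1.0, 11):
    scores = evaluate(config)
    points.append((float(np.mean(list(scores.values()))), epsilon_bias(scores), config))

# A configuration is Pareto-efficient if no other configuration achieves both
# strictly higher mean performance and strictly lower epsilon-bias.
pareto = [p for p in points
          if not any(q[0] > p[0] and q[1] < p[1] for q in points)]

for perf, eps, cfg in sorted(pareto, key=lambda p: p[2]):
    print(f"config={cfg:.1f}  mean accuracy={perf:.3f}  epsilon-bias={eps:.3f}")

# Degenerate case noted in the text: a device that returns the same useless
# constant for everyone achieves epsilon-bias = 0, which is why performance
# and epsilon-bias must be examined jointly (e.g., on a Pareto curve).
```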
Maximizing performance while minimizing ϵ-bias may present a Pareto trade-off, where the two objectives are at odds with each other. A Pareto curve can quantitatively display how changing the device configuration shifts the balance between performance and fairness (see the graph). Such analyses might be a useful inclusion in medical-device studies.

Achieving fairness in medical devices is a key piece of the puzzle, but a piece nonetheless. Even if one manages to engineer a fair medical device, it could be used by a clinical provider who holds conscious or subconscious biases. And even a medical device that is fair from an engineering perspective might be inaccessible to a range of demographic groups for socioeconomic reasons. Several open questions remain. What is an acceptable trade-off between device performance and fairness? How can biases that are not easy to predict, or not easy to observe at scale, be dealt with? Race and sex are also part of human biology. How can positive biases be properly encoded into medical-device design? Diversity and inclusion have gained increasing attention, and the era of fair medical devices is only just beginning.

References:
1. M. W. Sjoding et al., N. Engl. J. Med. 383, 2477 (2020).
2. C. W. Hartman et al., Semin. Arthroplasty 20, 62 (2009).
3. G. Balakrishnan, F. Durand, J. Guttag, in Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition (IEEE Computer Society, 2013), pp. 3430–3437.
4. A. J. Larrazabal, N. Nieto, V. Peterson, D. H. Milone, E. Ferrante, Proc. Natl. Acad. Sci. U.S.A. 117, 12592 (2020).
5. S. Jabbour et al., in Proceedings of the Fifth Machine Learning for Healthcare Conference, F. Doshi-Velez et al., Eds. (Proceedings of Machine Learning Research, 2020), pp. 750–782.
6. R. Sandyk, Int. J. Neurosci. 51, 99 (1990).
7. C. N. Karson et al., J. Nerv. Ment. Dis. 173, 566 (1985).
8. J. Zou, L. Schiebinger, Nature 559, 324 (2018).
9. Z. Obermeyer et al., Science 366, 447 (2019).
10. L. Braun, Breathing Race into the Machine: The Surprising Career of the Spirometer from Plantation to Genetics (Univ. of Minnesota Press, 2014).
11. A. H. Wingfield, Science 369, 351 (2020).
12. B. Hecht et al., “It's time to do something: Mitigating the negative impacts of computing through a change to the peer review process,” ACM Future of Computing Blog, 29 March 2018.
13. J. Buolamwini, T. Gebru, in Proceedings of the Conference on Fairness, Accountability and Transparency, S. A. Friedler, C. Wilson, Eds. (Proceedings of Machine Learning Research, 2018), pp. 77–91.
14. D. Cirillo et al., npj Digit. Med. 3, 81 (2020).
15. J. Kleinberg, S. Mullainathan, M. Raghavan, in Proceedings of the Eighth Innovations in Theoretical Computer Science Conference, C. H. Papadimitriou, Ed. (Schloss Dagstuhl, 2017), pp. 43:1–43:23.
Acknowledgments: I thank P. Chari, L. Jalilian, K. Kabra, M. Savary, M. Majmudar, and the Engineering 87 class at UCLA for constructive feedback. I am supported by a National Science Foundation CAREER grant (IIS-2046737), a Google Faculty Award, and a Sony Imaging Young Faculty Award.
Domain: Climate Change; Resources and Environment
Document type: Journal article
Identifier: http://119.78.100.173/C666/handle/2XK7JSWQ/321124
Collections: Climate Change; Resource and Environmental Science
Recommended citation:
GB/T 7714: Achuta Kadambi. Achieving fairness in medical devices[J]. Science, 2021.
APA: Achuta Kadambi. (2021). Achieving fairness in medical devices. Science.
MLA: Achuta Kadambi. "Achieving fairness in medical devices". Science (2021).