{"id":591571,"date":"2019-06-03T00:00:00","date_gmt":"2019-06-03T07:00:00","guid":{"rendered":"https:\/\/new-cm-edgedigital.pages.dev\/en-us\/research\/?post_type=msr-research-item&#038;p=591571"},"modified":"2019-06-07T08:13:50","modified_gmt":"2019-06-07T15:13:50","slug":"dissecting-racial-bias-in-an-algorithm-that-guides-health-decisions-for-millions","status":"publish","type":"msr-video","link":"https:\/\/new-cm-edgedigital.pages.dev\/en-us\/research\/video\/dissecting-racial-bias-in-an-algorithm-that-guides-health-decisions-for-millions\/","title":{"rendered":"Dissecting Racial Bias in an Algorithm that Guides Health Decisions for Millions"},"content":{"rendered":"<p>For millions of patients across the US, hospitals use commercial risk scores to target those needing extra help with complex health needs. We examine a widely used commercial algorithm for racial bias. Thanks to a unique dataset, we also study the algorithm\u2019s construction, gaining a rare window into the mechanisms of bias. We find significant racial bias: at the same risk score, blacks are considerably sicker than whites. Removing bias would double the number of high-risk blacks auto-identified for extra help, from 17.7% to 46.5%. We isolate the problem to the algorithm\u2019s objective function: it predicts costs, and since blacks incur lower costs than whites conditional on health, accurate cost predictions produce racially biased health predictions. We find suggestive evidence of a \u201cproblem formulation error\u201d: as algorithmic prediction is in a nascent stage, convenient choices of proxy labels to predict (in this case, cost) can inadvertently produce biases at scale.<\/p>\n<p><a href=\"https:\/\/new-cm-edgedigital.pages.dev\/en-us\/research\/wp-content\/uploads\/2019\/06\/Dissecting-Racial-Bias-in-an-Algorithm-that-Guides-Health-Decisions-for-Millions-slides.pdf\">[SLIDES]<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>For millions of patients across the US, hospitals use commercial risk scores to target those needing extra help with complex health needs. We examine a widely used commercial algorithm for racial bias. Thanks to a unique dataset, we also study the algorithm\u2019s construction, gaining a rare window into the mechanisms of bias. 
Video: https://youtu.be/y6eo0FZIqjk
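To make the "problem formulation error" concrete, here is a minimal simulation sketch of the mechanism the abstract describes, under invented assumptions (all distributions, the 0.6 cost-access factor, and the cutoffs are hypothetical illustration, not the paper's data or model): a score that accurately predicts cost still ranks one group as lower-risk when that group incurs lower cost conditional on the same health.

```python
# Hypothetical illustration of cost-as-proxy-for-health bias.
# Two groups have identical health-need distributions, but group B
# incurs ~40% less cost at the same health level. A score that
# accurately predicts cost then (1) is miscalibrated in health and
# (2) under-selects group B when targeting the highest scores.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Same distribution of true health need (e.g., chronic conditions) in both groups.
group = rng.integers(0, 2, n)          # 0 = group A, 1 = group B
health = rng.poisson(lam=2.0, size=n)  # higher = sicker

# Cost rises with health, but group B generates less cost conditional on
# health (a stand-in for unequal access/utilization; factor is invented).
access = np.where(group == 1, 0.6, 1.0)
cost = access * (500.0 * health + rng.gamma(2.0, 500.0, n))

# "Risk score" = expected cost given (health, group): an accurate cost model.
score = access * (500.0 * health + 1000.0)

# (1) At the same risk score, group B is sicker than group A.
band = score > np.quantile(score, 0.90)
for g, name in [(0, "A"), (1, "B")]:
    m = band & (group == g)
    print(f"top-decile score, group {name}: mean health = {health[m].mean():.2f}")

# (2) Targeting the top 3% by cost score under-selects group B relative
#     to targeting the top 3% by true health need.
k = int(0.03 * n)
by_score = np.argsort(-score)[:k]
by_health = np.argsort(-(health + rng.uniform(0, 1e-6, n)))[:k]  # jitter breaks ties
print("share of group B, top 3% by cost score :", group[by_score].mean().round(3))
print("share of group B, top 3% by health need:", group[by_health].mean().round(3))
```

Running this sketch shows both findings in miniature: within the top score decile, group B's mean health need exceeds group A's, and switching the target from predicted cost to health need roughly doubles group B's share of the auto-identified high-risk pool, qualitatively echoing the paper's 17.7% to 46.5% result.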