EQUAL PROTECTION UNDER ALGORITHMS: A NEW STATISTICAL AND LEGAL FRAMEWORK

Bibliographic Details
Published in Michigan law review Vol. 119; no. 2; pp. 291 - 395
Main Authors Yang, Crystal S., Dobbie, Will
Format Journal Article
Language English
Published Ann Arbor Michigan Law Review Association 01.11.2020
Subjects
Online Access Get full text
ISSN 0026-2234
1939-8557
DOI10.36644/mlr.119.2.equal


Abstract In this Article, we provide a new statistical and legal framework to understand the legality and fairness of predictive algorithms under the Equal Protection Clause. We begin by reviewing the main legal concerns regarding the use of protected characteristics such as race and the correlates of protected characteristics such as criminal history. The use of race and nonrace correlates in predictive algorithms generates direct and proxy effects of race, respectively, that can lead to racial disparities that many view as unwarranted and discriminatory. These effects have led to the mainstream legal consensus that the use of race and nonrace correlates in predictive algorithms is both problematic and potentially unconstitutional under the Equal Protection Clause. This mainstream position is also reflected in practice, with all commonly used predictive algorithms excluding race and many excluding nonrace correlates such as employment and education. Next, we challenge the mainstream legal position that the use of a protected characteristic always violates the Equal Protection Clause. We develop a statistical framework that formalizes exactly how the direct and proxy effects of race can lead to algorithmic predictions that disadvantage minorities relative to nonminorities. While an overly formalistic solution requires exclusion of race and all potential nonrace correlates, we show that this type of algorithm is unlikely to work in practice because nearly all algorithmic inputs are correlated with race. We then show that there are two simple statistical solutions that can eliminate the direct and proxy effects of race, and which are implementable even when all inputs are correlated with race. We argue that our proposed algorithms uphold the principles of the equal protection doctrine because they ensure that individuals are not treated differently on the basis of membership in a protected class, in stark contrast to commonly used algorithms that unfairly disadvantage minorities despite the exclusion of race. We conclude by empirically testing our proposed algorithms in the context of the New York City pretrial system. We show that nearly all commonly used algorithms violate certain principles underlying the Equal Protection Clause by including variables that are correlated with race, generating substantial proxy effects that unfairly disadvantage Black individuals relative to white individuals. Both of our proposed algorithms substantially reduce the number of Black defendants detained compared to commonly used algorithms by eliminating these proxy effects. These findings suggest a fundamental rethinking of the equal protection doctrine as it applies to predictive algorithms and the folly of relying on commonly used algorithms.
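
To make the abstract's mechanism concrete, the following is a minimal, hypothetical simulation in Python. It is not the Article's data or its proposed estimators; the linear model, coefficients, and variable names are assumptions chosen only to show how a nonrace input can absorb race's predictive content when race is omitted from estimation (a proxy effect), and how estimating with race but holding the race term fixed at prediction removes that effect.

# Hypothetical illustration of direct vs. proxy effects of race in a linear
# risk score. Simulated data only; not the Article's estimators or results.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

race = rng.binomial(1, 0.4, n)                 # simulated group indicator (1 = minority)
x = 0.8 * race + rng.normal(size=n)            # nonrace input correlated with race
y = 0.5 * x + 0.6 * race + rng.normal(size=n)  # outcome depends on x and on race

# (1) "Race-blind" score: regress y on x alone. Because race is omitted, the
#     coefficient on x also picks up race's predictive content (proxy effect).
X_blind = np.column_stack([np.ones(n), x])
b_blind = np.linalg.lstsq(X_blind, y, rcond=None)[0]
score_blind = X_blind @ b_blind

# (2) Estimate WITH race so the coefficient on x is purged of race's influence,
#     then predict with the race term held at a common value for everyone,
#     removing both the direct effect and the proxy effect.
X_full = np.column_stack([np.ones(n), x, race])
b_full = np.linalg.lstsq(X_full, y, rcond=None)[0]
score_neutral = np.column_stack([np.ones(n), x, np.zeros(n)]) @ b_full

gap = lambda s: s[race == 1].mean() - s[race == 0].mean()
print(f"between-group score gap, race-blind:       {gap(score_blind):.3f}")    # ~0.48
print(f"between-group score gap, race-neutralized: {gap(score_neutral):.3f}")  # ~0.40

Under these assumptions, the race-blind score shows a larger between-group gap because the coefficient on x is inflated by x's correlation with race; estimating with race included and then neutralizing the race term removes that inflation while leaving the genuine contribution of x intact.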
Audience Professional
Academic
Author Yang, Crystal S.
Dobbie, Will
Author_xml – sequence: 1
  givenname: Crystal S.
  surname: Yang
  fullname: Yang, Crystal S.
– sequence: 2
  givenname: Will
  surname: Dobbie
  fullname: Dobbie, Will
CitedBy_id crossref_primary_10_1257_pol_20220620
crossref_primary_10_1111_phis_12208
crossref_primary_10_1257_aer_20201653
crossref_primary_10_2139_ssrn_3935520
crossref_primary_10_1002_pam_22527
crossref_primary_10_18601_01234366_48_06
crossref_primary_10_1038_s43588_023_00485_4
crossref_primary_10_1080_14719037_2022_2160488
crossref_primary_10_2139_ssrn_4602450
crossref_primary_10_2139_ssrn_4825988
crossref_primary_10_2139_ssrn_3956251
crossref_primary_10_1007_s00146_022_01577_x
crossref_primary_10_1016_j_chb_2023_108006
crossref_primary_10_1145_3686901
ContentType Journal Article
Copyright COPYRIGHT 2020 Michigan Law Review Association
Copyright Michigan Law Review Association Nov 2020
Copyright_xml – notice: COPYRIGHT 2020 Michigan Law Review Association
– notice: Copyright Michigan Law Review Association Nov 2020
DBID AAYXX
CITATION
ILT
3V.
4U-
7WY
7WZ
7X7
7XB
87Z
88E
8AO
8FI
8FJ
8FK
8FL
8G5
ABUWG
AFKRA
AZQEC
BENPR
BEZIV
CCPQU
DWQXO
FRNLG
FYUFA
F~G
GHDGH
GNUQQ
GUQSH
K60
K6~
K9.
L.-
L.0
M0C
M0S
M1P
M2O
MBDVC
PHGZM
PHGZT
PJZUB
PKEHL
PPXIY
PQBIZ
PQBZA
PQEST
PQQKQ
PQUKI
Q9U
S0X
ADTOC
UNPAY
DOI 10.36644/mlr.119.2.equal
DatabaseName CrossRef
Gale OneFile: LegalTrac
ProQuest Central (Corporate)
University Readers
ABI/INFORM Collection
ABI/INFORM Global (PDF only)
Health & Medical Collection
ProQuest Central (purchase pre-March 2016)
ABI/INFORM Collection
Medical Database (Alumni Edition)
ProQuest Pharma Collection
ProQuest Hospital Collection
Hospital Premium Collection (Alumni Edition)
ProQuest Central (Alumni) (purchase pre-March 2016)
ABI/INFORM Collection (Alumni Edition)
Research Library
ProQuest Central (Alumni)
ProQuest Central
ProQuest Central Essentials
ProQuest Central
Business Premium Collection
ProQuest One Community College
ProQuest Central Korea
Business Premium Collection (Alumni)
Health Research Premium Collection
ABI/INFORM Global (Corporate)
Health Research Premium Collection (Alumni)
ProQuest Central Student
Research Library Prep
ProQuest Business Collection (Alumni Edition)
ProQuest Business Collection
ProQuest Health & Medical Complete (Alumni)
ABI/INFORM Professional Advanced
ABI/INFORM Professional Standard
ABI/INFORM Global
Health & Medical Collection (Alumni Edition)
Medical Database
Research Library
Research Library (Corporate)
ProQuest Central Premium
ProQuest One Academic
ProQuest Health & Medical Research Collection
ProQuest One Academic Middle East (New)
ProQuest One Health & Nursing
ProQuest One Business (OCUL)
ProQuest One Business (Alumni)
ProQuest One Academic Eastern Edition (DO NOT USE)
ProQuest One Academic
ProQuest One Academic UKI Edition
ProQuest Central Basic
SIRS Editorial
Unpaywall for CDI: Periodical Content
Unpaywall
DatabaseTitle CrossRef
ABI/INFORM Global (Corporate)
ProQuest Business Collection (Alumni Edition)
ProQuest One Business
University Readers
Research Library Prep
ProQuest Central Student
ProQuest One Academic Middle East (New)
ProQuest Central Essentials
SIRS Editorial
ProQuest Health & Medical Complete (Alumni)
ProQuest Central (Alumni Edition)
ProQuest One Community College
ProQuest One Health & Nursing
Research Library (Alumni Edition)
ProQuest Pharma Collection
ABI/INFORM Complete
ProQuest Central
ABI/INFORM Professional Advanced
ProQuest Health & Medical Research Collection
Health Research Premium Collection
Health and Medicine Complete (Alumni Edition)
ABI/INFORM Professional Standard
ProQuest Central Korea
Health & Medical Research Collection
ProQuest Research Library
ProQuest Central (New)
ProQuest Medical Library (Alumni)
ABI/INFORM Complete (Alumni Edition)
Business Premium Collection
ABI/INFORM Global
ABI/INFORM Global (Alumni Edition)
ProQuest Central Basic
ProQuest One Academic Eastern Edition
ProQuest Hospital Collection
Health Research Premium Collection (Alumni)
ProQuest Business Collection
ProQuest Hospital Collection (Alumni)
ProQuest Health & Medical Complete
ProQuest Medical Library
ProQuest One Academic UKI Edition
ProQuest One Business (Alumni)
ProQuest One Academic
ProQuest One Academic (New)
ProQuest Central (Alumni)
Business Premium Collection (Alumni)
DatabaseTitleList ABI/INFORM Global (Corporate)
CrossRef


Database_xml – sequence: 1
  dbid: UNPAY
  name: Unpaywall
  url: https://proxy.k.utb.cz/login?url=https://unpaywall.org/
  sourceTypes: Open Access Repository
– sequence: 2
  dbid: BENPR
  name: ProQuest Central
  url: http://www.proquest.com/pqcentral?accountid=15518
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Law
EISSN 1939-8557
EndPage 395
ExternalDocumentID 10.36644/mlr.119.2.equal
A643564220
10_36644_mlr_119_2_equal
10.3316/agispt.20201125040124
45386446
Genre Articles
GeographicLocations UNITED STATES
GeographicLocations_xml – name: UNITED STATES
GroupedDBID ---
-ET
.4L
.CB
0ZK
123
1XV
2-G
29M
2QL
5.J
6DY
7WY
7X7
88E
8AO
8FI
8FJ
8FL
8G5
8OO
8R4
8R5
8VB
96U
AACLI
AAFWJ
ABACO
ABBHK
ABDBF
ABFRF
ABLWH
ABUWG
ABVAB
ABXSQ
ACBMB
ACHQT
ACIHN
ACMJI
ACUHS
ADBBV
ADCHZ
ADEPB
ADEYR
ADMHG
ADNFJ
ADUOI
AEAQA
AEFWE
AEGZQ
AEMOZ
AEUPB
AFACB
AFAZI
AFKRA
AFXCU
AGCSZ
AGQRV
AHEHV
AHQJS
AKNUK
AKVCP
AL2
ALIPV
ALMA_UNASSIGNED_HOLDINGS
AY0
AZQEC
B0M
BENPR
BEZIV
BHRNT
BKOMP
BPHCQ
BVXVI
CCPQU
CS3
DO4
DU5
DWQXO
EAP
EAS
EBC
EBD
EBE
EBO
EBR
EBS
EBU
ECR
EHL
EIS
EJD
EKAWT
EMH
EMK
EPL
ESX
F5P
F8P
FAS
FIL
FJW
FM.
FRNLG
FRS
FYUFA
GCQ
GENNL
GNUQQ
GROUPED_ABI_INFORM_RESEARCH
GUQSH
HCSNT
HISYW
HLR
HMCUK
HOCAJ
IAO
IBB
ICJ
IEA
ILT
INH
INR
IOF
IPB
IPSME
ITC
JAAYA
JAV
JBMMH
JBZCM
JENOY
JHFFW
JKQEH
JLEZI
JLXEF
JPL
JST
K1G
K60
K6~
KGA
LBL
LMKDQ
LO7
LU7
LXB
LXHRH
LXL
LXN
LXO
LXY
M0C
M1P
M2O
NXXTH
OK1
P2P
PHGZM
PHGZT
PQBIZ
PQBZA
PQQKQ
PROAC
PSQYO
PV9
Q.-
Q2X
QF4
QN5
QN7
QWB
RHO
RWL
RXW
RZL
S0X
SA0
TAA
TAC
TAE
TAF
TAI
TH9
TQQ
TQW
TR2
TWJ
UFL
UKHRP
UNMZH
UXK
UXR
VGZHO
VKN
W2G
WE1
WH7
X6Y
XFL
XPM
ZL0
ZRF
ZRR
~8M
~X8
~ZZ
PJZUB
PPXIY
PUEGO
XRM
AAAZS
AAYXX
ABAWQ
ACHJO
ADULT
AFQQW
CITATION
GOZPB
GRPMH
HGD
HVGLF
KQ8
M86
MVM
TAG
TAH
WEY
YQR
ZY4
3V.
4U-
7XB
8FK
K9.
L.-
L.0
MBDVC
PKEHL
PQEST
PQUKI
Q9U
ADTOC
UNPAY
ID FETCH-LOGICAL-c546t-e5732b8a0b4eb96a4be0b1f380ed8bf8b27b8437d69f6f2d107ada7f54b2b7fc3
IEDL.DBID BENPR
ISSN 0026-2234
1939-8557
IngestDate Wed Oct 01 16:54:58 EDT 2025
Fri Oct 03 11:11:15 EDT 2025
Mon Oct 20 22:13:03 EDT 2025
Thu Jun 12 23:36:31 EDT 2025
Mon Oct 20 16:25:50 EDT 2025
Wed Oct 01 04:41:44 EDT 2025
Thu Apr 24 22:53:45 EDT 2025
Wed Sep 24 03:20:43 EDT 2025
Thu Jun 19 21:31:36 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 2
Language English
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c546t-e5732b8a0b4eb96a4be0b1f380ed8bf8b27b8437d69f6f2d107ada7f54b2b7fc3
Notes MICHIGAN LAW REVIEW, Vol. 119, No. 2, Nov 2020, 291-395
Informit, Melbourne (Vic)
ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
OpenAccessLink https://proxy.k.utb.cz/login?url=https://repository.law.umich.edu/cgi/viewcontent.cgi?article=6911&context=mlr
PQID 2467633484
PQPubID 36597
PageCount 105
ParticipantIDs rmit_agispt_search_informit_org_doi_10_3316_agispt_20201125040124
unpaywall_primary_10_36644_mlr_119_2_equal
crossref_primary_10_36644_mlr_119_2_equal
gale_infotracgeneralonefile_A643564220
gale_infotracmisc_A643564220
proquest_journals_2467633484
gale_infotracacademiconefile_A643564220
crossref_citationtrail_10_36644_mlr_119_2_equal
jstor_primary_10_2307_45386446
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2020-11-01
PublicationDateYYYYMMDD 2020-11-01
PublicationDate_xml – month: 11
  year: 2020
  text: 2020-11-01
  day: 01
PublicationDecade 2020
PublicationPlace Ann Arbor
PublicationPlace_xml – name: Ann Arbor
PublicationTitle Michigan law review
PublicationYear 2020
Publisher Michigan Law Review Association
Publisher_xml – name: Michigan Law Review Association
SSID ssj0001531
Score 2.434607
Snippet In this Article, we provide a new statistical and legal framework to understand the legality and fairness of predictive algorithms under the Equal Protection...
In this article we provide a new statistical and legal framework to understand the legality and fairness of predictive algorithms under the Equal Protection...
In this Article, we provide a new statistical and legal framework to understand the legality and fairness of predictive algorithms under the Equal Protection...
SourceID unpaywall
proquest
gale
crossref
rmit
jstor
SourceType Open Access Repository
Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 291
SubjectTerms Algorithms
Consensus (Social sciences)
Constitutional law
Credit scoring
Criminal procedure
Criminology
Demographics
Employment
Equal protection
Influence
Laws, regulations and rules
Methods
Minorities
Minority & ethnic groups
Proxies
Race
Race discrimination
Racial differences
Remedies
Risk assessment
Social conditions
Socioeconomic factors
SummonAdditionalLinks – databaseName: Unpaywall
  dbid: UNPAY
  priority: 102
  providerName: Unpaywall
Title EQUAL PROTECTION UNDER ALGORITHMS: A NEW STATISTICAL AND LEGAL FRAMEWORK
URI https://www.jstor.org/stable/45386446
http://search.informit.org/doi/10.3316/agispt.20201125040124
https://www.proquest.com/docview/2467633484
https://repository.law.umich.edu/cgi/viewcontent.cgi?article=6911&context=mlr
UnpaywallVersion publishedVersion
Volume 119
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVEBS
  databaseName: EBSCOhost Academic Search Ultimate
  customDbUrl: https://search.ebscohost.com/login.aspx?authtype=ip,shib&custid=s3936755&profile=ehost&defaultdb=asn
  eissn: 1939-8557
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0001531
  issn: 1939-8557
  databaseCode: ABDBF
  dateStart: 19960601
  isFulltext: true
  titleUrlDefault: https://search.ebscohost.com/direct.asp?db=asn
  providerName: EBSCOhost
– providerCode: PRVPQU
  databaseName: Health & Medical Collection
  customDbUrl:
  eissn: 1939-8557
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0001531
  issn: 1939-8557
  databaseCode: 7X7
  dateStart: 19970601
  isFulltext: true
  titleUrlDefault: https://search.proquest.com/healthcomplete
  providerName: ProQuest
– providerCode: PRVPQU
  databaseName: ProQuest Central
  customDbUrl: http://www.proquest.com/pqcentral?accountid=15518
  eissn: 1939-8557
  dateEnd: 99991231
  omitProxy: true
  ssIdentifier: ssj0001531
  issn: 1939-8557
  databaseCode: BENPR
  dateStart: 19970601
  isFulltext: true
  titleUrlDefault: https://www.proquest.com/central
  providerName: ProQuest
linkProvider ProQuest
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=EQUAL+PROTECTION+UNDER+ALGORITHMS%3A+A+NEW+STATISTICAL+AND+LEGAL+FRAMEWORK&rft.jtitle=Michigan+law+review&rft.au=Yang%2C+Crystal+S&rft.au=Dobbie%2C+Will&rft.date=2020-11-01&rft.pub=Michigan+Law+Review+Association&rft.issn=0026-2234&rft.volume=119&rft.issue=2&rft.spage=291&rft_id=info:doi/10.36644%2Fmlr.119.2.equal&rft.externalDBID=ILT&rft.externalDocID=A643564220