CSETv0 Taxonomy Classifications
Problem Nature
Specification
Physical System
Software only
Level of Autonomy
Medium
Nature of End User
Amateur
Public Sector Deployment
No
Data Inputs
Standardized university admission form, Previous admission and rejection decisions
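The Data Inputs entry captures the technical core of the incident: the screening program was calibrated against previous admission and rejection decisions, so it inherited whatever bias those decisions contained. A minimal, purely illustrative sketch follows; the scores, penalty, and threshold logic are invented assumptions, not details of the actual St George's program.

```python
# Hypothetical illustration only: a screening rule "learned" from biased
# historical decisions reproduces the bias. All names, data, and the model
# itself are assumptions, not details of the St George's program.
import random

random.seed(0)

def historical_decision(exam_score, minority):
    """Biased past practice: minority applicants are effectively penalised."""
    return 1 if exam_score - (15 if minority else 0) >= 60 else 0

# Synthetic stand-in for "previous admission and rejection decisions".
history = [
    (score, minority, historical_decision(score, minority))
    for score, minority in (
        (random.uniform(40, 100), random.random() < 0.3) for _ in range(5000)
    )
]

# "Learn" an admission threshold per group from the historical labels:
# the learned thresholds simply absorb the discriminatory penalty.
def learned_threshold(group_flag):
    return min(score for score, m, admitted in history
               if m == group_flag and admitted == 1)

print("learned threshold, non-minority:", round(learned_threshold(False), 1))  # about 60
print("learned threshold, minority:    ", round(learned_threshold(True), 1))   # about 75
```

The learned minority threshold absorbs the penalty baked into the historical labels, which mirrors how a program trained on earlier human decisions can come to encode the biases behind them.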
CSETv1 Taxonomy Classifications
Incident Number
43
Notes (special interest intangible harm)
The Commission for Racial Equality found St. George's Hospital Medical School guilty of discrimination against women and members of ethnic minorities.
Special Interest Intangible Harm
Yes
Date of Incident Year
1979
CSETv1_Annotator-1 Taxonomy Classifications
Incident Number
43
Notes (special interest intangible harm)
The Commission for Racial Equality found St. George's Hospital Medical School guilty of discrimination against women and members of ethnic minorities.
Special Interest Intangible Harm
Yes
Date of Incident Year
1979
CSETv1_Annotator-2 Taxonomy Classifications
Incident Number
43
Special Interest Intangible Harm
Yes
Date of Incident Year
1979
Estimated Date
No
Multiple AI Interaction
No
Embedded
No
Incident Reports
A Blot on the Profession
Discrimination in medicine against women and members of ethnic minorities has long been suspected, but it has now been proved. St George's Hospital Medical School has been found guilty by the Commission for Racial E…
As AI spreads, this will become an increasingly important and controversial issue:
For one British university, what began as a time-saving exercise ended in disgrace when a computer model set up to streamline its admissions process exposed …
Professor Margaret Boden, an AI and cognitive science researcher, took the time to speak to me in 2010 about computers, AI, morality and the future. One of the stories she told me comes back to me every now and then, most recently by Micros…
Companies and governments need to pay attention to the unconscious and institutional biases that seep into their algorithms, argues cybersecurity expert Megan Garcia. Distorted data can skew results in web searches, home loan decisions, or …
Similar Incidents
AI Beauty Judge Did Not Like Dark Skin
Sexist and Racist Google Adsense Advertisements