This story was originally published on HackerNoon at: https://hackernoon.com/why-is-it-so-hard-to-learn-basic-facts-about-government-algorithms.
It took six years, from the algorithm’s deployment in 2017 until “Inside the Suspicion Machine” was published, for the public to get a full picture of how it worked.
Check more stories related to society at: https://hackernoon.com/c/society. You can also check exclusive content about #society, #algorithms, #fraud-detection-algorithm, #journalism, #usa-government, #fast-enterprises, #midas, #the-markup, and more.
This story was written by: @TheMarkup. Learn more about this writer by checking @TheMarkup's about page, and for more stories, please visit hackernoon.com.
The investigation showed that the algorithm, built for Rotterdam by the consultancy Accenture, discriminated on the basis of ethnicity and gender. Most impressively, it demonstrated in exacting detail how and why the algorithm behaved the way it did. (Congrats to the Lighthouse/Wired team, including Dhruv Mehrotra, who readers may recall helped us investigate crime prediction algorithms in 2021.) Cities around the world, and quite a few U.S. states, are using similar algorithms built by private companies to flag citizens for benefits fraud. We know very little about how they work, and not for lack of trying.