#Weapons_of_Math_Destruction - Cathy_O'Neil - #Review

There is no mathematics here, though, despite the title; it is all statistics and the apprehension that surrounds Big Data as applied through AI or ML. I agree with almost all of the author's views, but no algorithm actually appears anywhere, as the title might suggest. Instead, O'Neil takes us on a tour of a collection of business and public-policy malpractices, arguing that the solution is to "encode values into our algorithms." That prescription may well be useful to some. With some real substance inside, I think it would be an excellent book for someone with skills and an interest in social justice to take to an interview with a Big Data firm.

Still, I am inclined to think this book is best targeted at thoughtful high schoolers and college-aged students who are planning their careers and have a penchant for mathematical and computer modeling. From it, one can gauge the effects of:

# Targeted advertising, especially the way it lets predatory advertisers find vulnerable targets.

# Predictive policing, with equality before the law replaced by an algorithm that sends different degrees of law enforcement into different communities.

# Automated algorithms sorting and rejecting job applications, with the indirect consequence of discriminating against whole classes of people.

# Poorly thought-out algorithms for evaluating teachers, sometimes driving excellent teachers from their jobs.

# Algorithms that score credit and determine access to mortgages and insurance, often with the effect of making sure that those deemed losers stay that way.

Some of these are broken down chapter by chapter:

After teaching math at Barnard College for several years, O'Neil left academia for a new "laboratory": the global economy. As a "quant" (quantitative analyst) for D.E. Shaw, a major hedge fund, she was amazed by how the operations she and her team performed each day translated into "trillions of dollars sloshing" between accounts. But in the fall of 2008 everything changed, and the financial crisis brought the economy to a halt.

The Big Data economy emerged as mathematicians began calculating human potential (as students, workers, lovers, criminals), but its math-powered algorithms were encoded with their creators' human prejudices and biases. Despite deepening the global wealth divide, Big Data seemed unimpeachable. O'Neil calls these harmful models "Weapons of Math Destruction." While baseball statisticians also use complex, game-defining models, those models are transparent, offering everyone access to the statistics that rule the sport.

Human beings carry models in their heads all day. As an example, O'Neil uses the "informal model" of how she decides what to cook for her large family each night. She has data (each person's likes and dislikes), and she has new information concerning that data all the time: fluctuating grocery prices, changing tastes, and anomalies like special meals for special occasions.
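
To make the idea concrete, here is a minimal sketch, in Python, of what an "informal model" like O'Neil's dinner planner might look like if written down; the family members, preferences, prices, and weights are all invented for illustration, not taken from the book:

```python
# A toy version of an "informal model" for planning dinner.
# Every name, preference, and weight here is invented for illustration;
# the point is that any model encodes the modeler's data and priorities.

# Historical "data": each person's likes (+1) and dislikes (-1).
preferences = {
    "pasta":    {"kid_1": 1, "kid_2": 1, "adult": -1},
    "salmon":   {"kid_1": -1, "kid_2": 0, "adult": 1},
    "stir_fry": {"kid_1": 0, "kid_2": 1, "adult": 1},
}

# New information the model takes in all the time: fluctuating prices.
price_today = {"pasta": 3.0, "salmon": 12.0, "stir_fry": 7.0}

def score(meal, budget_weight=0.5):
    """Trade family approval off against cost; the weight is a value judgment."""
    approval = sum(preferences[meal].values())
    return approval - budget_weight * price_today[meal]

best = max(preferences, key=score)
print(best)  # the "optimal" dinner, given these encoded values
```

Even in this toy, `budget_weight` is a value judgment: change it and the "best" dinner changes, which is exactly O'Neil's point that models embed their makers' priorities.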

Prosecutors in Harris County are three times more likely to seek the death penalty for Black defendants. And sentences imposed on Black men are about 20% longer than those imposed on white men for similar crimes, even though Black Americans make up only about 13% of the U.S. population yet roughly 40% of its prison population. These are the kinds of disparities that get baked into sentencing and recidivism models. The three elements of a WMD, according to O'Neil, are opacity, scale, and damage.

To illustrate the scale of a Weapon of Math Destruction (WMD), O'Neil asks readers to imagine the "caveman diet" becoming a mandatory national standard for all 330 million Americans, creating a distorted economic climate. This large-scale distortion is what happened to higher education when U.S. News & World Report began, in 1983, ranking approximately 1,800 colleges based initially on opinion surveys. The rankings, which attempted to statistically measure "educational excellence," created a "rat race" in U.S. academia.

Big tech platforms like Google and Facebook allow for-profit universities to segment and target vulnerable populations with ads based on A/B testing, similar to massive credit card mailings. These ads often use fake job postings or misleading promises (like routes to food stamps or Medicaid) to gather user information and send it to recruiters. Separately, after cuts to its police force, the police chief of Reading, Pennsylvania invested in 2013 in PredPol, a Big Data crime-prediction software. This system used historical data to forecast crime hotspots, and within a year, burglaries in vulnerable areas were down by over 20 percent.
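
The book's worry about systems like PredPol is the feedback loop: patrols are sent where past incidents were recorded, and patrols themselves generate new records. A hedged simulation of that dynamic (not PredPol's actual algorithm; the rates and district names are made up) might look like this:

```python
import random

# A sketch of the predictive-policing feedback loop O'Neil describes:
# patrols go where past incidents were recorded, and patrols themselves
# generate new records, reinforcing the "hotspot".

random.seed(0)
true_crime_rate = {"district_a": 0.10, "district_b": 0.10}  # identical by construction
recorded = {"district_a": 5, "district_b": 1}  # district_a starts with more history

for year in range(10):
    # Send most patrols to the district with the most recorded incidents.
    watched = max(recorded, key=recorded.get)
    for district, rate in true_crime_rate.items():
        patrols = 80 if district == watched else 20
        # A crime only enters the data if a patrol is present to record it.
        recorded[district] += sum(random.random() < rate for _ in range(patrols))

print(recorded)  # district_a ends up with far more "data", despite equal crime
```

With identical underlying crime rates, the district that starts with more records attracts more patrols and therefore accumulates ever more records: the model appears to confirm itself.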

While human programs like stop-and-frisk (a program in which NYPD officers were given the go-ahead to stop, search, and frisk anyone who seemed suspicious, anywhere, at any time) have created friction and danger in vulnerable communities, mathematical models now dominate law enforcement. The author suggests that justice-system data scientists must understand the realities inside prisons (such as solitary confinement, rape, and malnutrition) and study how improvements, such as better food, sunlight, and educational programs, could affect recidivism rates. Similarly, Human Resources departments rely on automated processes to screen resumes, forcing jobseekers to adapt their writing to the algorithms' prioritized buzzwords. The overall issue is that WMDs often serve unfair objectives instead of helping people.
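
As a toy illustration of the resume-screening problem, here is a sketch of naive buzzword scoring; the keywords, weights, and threshold are hypothetical and not drawn from any real applicant-tracking system:

```python
# A minimal sketch of the kind of buzzword screening the chapter describes.
# The keyword list, weights, and threshold are invented for illustration.

KEYWORDS = {"python": 3, "agile": 2, "stakeholder": 2, "synergy": 1}
THRESHOLD = 5

def screen(resume_text: str) -> bool:
    """Accept only resumes whose buzzword score clears the cutoff."""
    words = resume_text.lower().split()
    score = sum(weight * words.count(kw) for kw, weight in KEYWORDS.items())
    return score >= THRESHOLD

# A strong candidate who didn't write to the algorithm is filtered out,
# which is exactly the adaptation pressure the chapter describes.
print(screen("Led a team that shipped a compiler in Rust"))        # False
print(screen("Agile Python stakeholder synergy synergy synergy"))  # True
```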

American workers have coined a new term: "clopening," a mash-up of closing and opening that refers to an employee working late to close one night and then coming in early, sometimes just a few hours later, to open the shop the next morning. U.S. government data shows that over two-thirds of food-service workers and over half of retail workers learn of scheduling changes with less than a week's notice.

Modern-day scheduling technology is rooted in an applied-mathematics discipline called "operations research" (OR). Mathematicians first used OR to help farmers plan crop plantings; during World War II, it helped the U.S. and British militaries optimize their resources. After the war, OR moved into manufacturing and supply-chain logistics, and it now underpins huge companies like Amazon, FedEx, and UPS. But these models exploit workers, bending their lives to unfair schedules. Optimization programs are everywhere now, and they have contributed to the creation of what O'Neil calls a "captive workforce."
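
A tiny sketch can show how a pure cost-minimizing scheduler produces "clopenings" as a matter of course; the demand forecast, shift length, and worker names below are invented, and real OR systems are far more sophisticated:

```python
# A toy staff scheduler in the OR spirit the paragraph describes: cover
# each forecast demand slot with the least-used worker, minimizing paid
# hours. Demand numbers and names are invented for illustration.

forecast = {("mon", "close"): 1, ("tue", "open"): 1, ("tue", "close"): 1}
hours_worked = {"ana": 0, "bo": 0}

schedule = {}
for slot in sorted(forecast):
    # Pure cost minimization: pick the least-used worker, ignoring rest time.
    worker = min(hours_worked, key=hours_worked.get)
    schedule[slot] = worker
    hours_worked[worker] += 4  # assume a 4-hour shift

print(schedule)
# Nothing stops the model from giving the same person Monday close and
# Tuesday open, i.e. a "clopening"; the human cost is simply not a variable.
```

Because rest time between shifts never appears in the objective, the model happily hands one worker a closing shift followed by the next morning's opening shift.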

Local bankers, who knew their neighbors and their backgrounds, once controlled lending with informed human judgment. Now companies like Neustar use metrics like location and internet history to assign "e-scores," creating destructive and biased feedback loops that move society away from fairness. Instead of slowing down to allow for greater human oversight, the tech world is doubling down on predictive models. Facebook, for example, has patented a credit-rating system based on social networks that can unfairly privilege one person (say, a white, well-connected college graduate with no credit history) while penalizing a hardworking one (say, a Black or Latino housecleaner) whose social network may include unemployed or incarcerated friends.
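
To see why proxy-based scoring is so corrosive, consider this hypothetical e-score sketch; the features, ZIP codes, and weights are invented, and this is not Neustar's actual method:

```python
# A hedged sketch of a proxy-driven "e-score". The point is that ZIP code
# and browsing history stand in for race and class rather than for the
# individual's actual record. All values below are invented.

def e_score(zip_code, browsing):
    """Score a visitor using proxies rather than their actual behavior."""
    score = 500.0
    # Neighborhood average stands in for the individual: a biased proxy.
    if zip_code in {"10451", "60624"}:   # hypothetical "low-value" ZIP codes
        score -= 150
    # Browsing history as another crude proxy for income.
    if "payday-loans" in browsing:
        score -= 100
    if "luxury-travel" in browsing:
        score += 50
    return score

# Two people with identical behavior diverge purely on where they live,
# which is how the feedback loop against fairness gets started.
print(e_score("10451", ["news"]))  # 350.0
print(e_score("90210", ["news"]))  # 500.0
```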

In 1896, a German-born statistician named Frederick Hoffman, who worked for the Prudential Life Insurance Company, created a WMD. He published a 330-page report claiming that the lives of Black Americans were so precarious that "the entire race was uninsurable." (Hoffman was also among the first to correlate smoking with cancer.) Like many other WMDs, however, his analysis was statistically flawed, racist, and unfortunately widespread.

In 2015, Swift Transportation, the largest U.S. trucking company, installed dual-facing cameras in long-haul trucks to reduce accidents and insurance costs. This also allowed it to gather data to optimize profits and compare drivers. Now insurance companies offer ordinary drivers discounts if they share driving data via a small in-car telematics unit. Similarly, companies like Michelin and CVS have implemented wellness programs that charge employees extra for failing to meet goals (like glucose or cholesterol targets) or for refusing to report their health data. Finally, the author notes that posting a petition for tougher WMD regulations on Facebook means the site's algorithm controls who sees it, based on its existing data about one's friends.

During the 2010 and 2012 U.S. elections, Facebook ran experiments to hone a tool called the "voter megaphone," which let people spread the word that they had voted; Facebook encouraged over 61 million American users to get out and vote by leveraging peer pressure. Because the profits of companies like Facebook, Google, Apple, Microsoft, Amazon, and Verizon depend heavily on government policy, these companies spend a lot of money lobbying and donating to the political system. Now they can also influence Americans' political behavior and, as a result, the shape of American government.

Rayid Ghani, one of the data scientists on Obama's 2012 presidential campaign, had previously worked on projects for a consulting firm that analyzed grocery stores' consumer data. That information was used to create customized shopping plans for many kinds of shoppers: coupon-clippers, brand loyalists, foodies, and so on. Now Ghani was trying to see whether similar calculations would work on swing voters; the shoppers who switched brands to save a few cents often behaved like swing voters.

     The main difference between the WMDs of the present and the prejudiced human error of the past is simple: humans can evolve, learn, and adapt. But automated systems are stuck in time—engineers have to change them as society progresses. So essentially, “Big Data processes codify the past” rather than inventing the future. Only humans have the “moral imagination” needed to create a better world. 


 
