The Protestant Community

William

When Algorithms Discriminate

Recommended Posts

Claire Cain Miller

 

The online world is shaped by forces beyond our control, determining the stories we read on Facebook, the people we meet on OkCupid and the search results we see on Google. Big data is used to make decisions about health care, employment, housing, education and policing.

 

But can computer programs be discriminatory?

 

There is a widespread belief that software and algorithms that rely on data are objective. But software is not free of human influence. Algorithms are written and maintained by people, and machine learning algorithms adjust what they do based on people’s behavior. As a result, say researchers in computer science, ethics and law, algorithms can reinforce human prejudices.
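As a sketch of how this happens, consider a toy ad-serving "learner" (purely hypothetical, not any real system): it estimates click rates per group from logged behavior, so whatever skew exists in the log becomes the rule it applies going forward.

```python
# Minimal sketch: a learner that absorbs bias from historical data.
# No line of this code mentions a preference, yet the output does.
from collections import defaultdict

def train_click_model(history):
    """Estimate P(click | group) from logged (group, clicked) pairs."""
    clicks = defaultdict(int)
    shown = defaultdict(int)
    for group, clicked in history:
        shown[group] += 1
        clicks[group] += clicked
    return {g: clicks[g] / shown[g] for g in shown}

# Hypothetical biased log: men historically saw and clicked the ad more.
history = ([("men", 1)] * 60 + [("men", 0)] * 40 +
           [("women", 1)] * 10 + [("women", 0)] * 90)

rates = train_click_model(history)
# The model "learns" to prefer showing the ad to men, with no
# explicit rule about gender anywhere in the code.
print(max(rates, key=rates.get))  # men
```

The point is not the arithmetic but the feedback loop: the system's future behavior is just the past behavior, replayed.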

 

Google’s online advertising system, for instance, showed an ad for high-income jobs to men much more often than it showed the ad to women, a new study by Carnegie Mellon University researchers found.

 

Research from Harvard University found that ads for arrest records were significantly more likely to show up on searches for distinctively black names or a historically black fraternity. The Federal Trade Commission said advertisers are able to target people who live in low-income neighborhoods with high-interest loans.

 

Research from the University of Washington found that a Google Images search for “C.E.O.” produced 11 percent women, even though 27 percent of United States chief executives are women. (On a recent search, the first picture of a woman to appear, on the second page, was the C.E.O. Barbie doll.) The study also found that image search results shifted 7 percent of viewers’ subsequent opinions about how many men or women worked in a field.

 

“The amoral status of an algorithm does not negate its effects on society,” wrote the authors of the Google advertising study, Amit Datta and Anupam Datta of Carnegie Mellon and Michael Carl Tschantz of the International Computer Science Institute.

 

Algorithms, which are a series of instructions written by programmers, are often described as a black box; it is hard to know why websites produce certain results. Often, algorithms and online results simply reflect people’s attitudes and behavior. Machine learning algorithms learn and evolve based on what people do online. The autocomplete feature on Google and Bing is an example. A recent Google search for “Are transgender,” for instance, suggested, “Are transgenders going to hell.”
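Autocomplete illustrates the mechanism in miniature. A toy version (real systems add ranking, personalization, and filtering layers on top) just surfaces the most frequent completions in the query log, prejudiced or not:

```python
# Toy autocomplete: suggestions are simply the most common logged
# queries that start with the typed prefix, so whatever people
# type, however prejudiced, is what gets suggested back.
from collections import Counter

def suggest(prefix, query_log, k=1):
    matches = Counter(q for q in query_log if q.startswith(prefix))
    return [q for q, _ in matches.most_common(k)]

# Hypothetical query log (neutral example for illustration).
log = ["are tomatoes fruit", "are tomatoes fruit", "are tomatoes vegetables"]
print(suggest("are tomatoes", log))  # ['are tomatoes fruit']
```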

 

“Even if they are not designed with the intent of discriminating against those groups, if they reproduce social preferences even in a completely rational way, they also reproduce those forms of discrimination,” said David Oppenheimer, who teaches discrimination law at the University of California, Berkeley.

 

But there are laws that prohibit discrimination against certain groups, despite any biases people might have. Take the example of Google ads for high-paying jobs showing up for men and not women. Targeting ads is legal. Discriminating on the basis of gender is not.

 

The Carnegie Mellon researchers who did that study built a tool to simulate Google users that started with no search history and then visited employment websites. Later, on a third-party news site, Google showed an ad for a career coaching service advertising “$200k+” executive positions 1,852 times to men and 318 times to women.
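Taking the reported counts at face value, a quick calculation shows the size of the gap:

```python
# Disparity implied by the study's reported impression counts.
men_impressions = 1852
women_impressions = 318
ratio = men_impressions / women_impressions
print(round(ratio, 1))  # 5.8 -- nearly six impressions for men per one for women
```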

 

The reason for the difference is unclear. It could have been that the advertiser requested that the ads be targeted toward men, or that the algorithm determined that men were more likely to click on the ads.

 

Google declined to say how the ad showed up, but said in a statement, “Advertisers can choose to target the audience they want to reach, and we have policies that guide the type of interest-based ads that are allowed.”

 

Anupam Datta, one of the researchers, said, “Given the big gender pay gap we’ve had between males and females, this type of targeting helps to perpetuate it.”

 

It would be impossible for humans to oversee every decision an algorithm makes. But companies can regularly run simulations to test the results of their algorithms. Mr. Datta suggested that algorithms “be designed from scratch to be aware of values and not discriminate.”
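A minimal sketch of such a simulation-based check, reusing the study's impression counts as stand-in data (the threshold and per-group profile counts here are invented for illustration; a real audit would use fresh simulated profiles and a proper statistical test):

```python
# Sketch of a recurring fairness check: compare how often an ad is
# served to two simulated groups and flag it if the rates diverge
# beyond a chosen threshold.

def audit(served_a, total_a, served_b, total_b, max_ratio=1.5):
    """Return True if serving rates between two groups differ too much."""
    rate_a = served_a / total_a
    rate_b = served_b / total_b
    hi, lo = max(rate_a, rate_b), min(rate_a, rate_b)
    if lo == 0:
        return hi > 0  # one group never sees the ad at all
    return hi / lo > max_ratio

# Hypothetical audit run: 10,000 simulated profiles per group.
flagged = audit(served_a=1852, total_a=10000, served_b=318, total_b=10000)
print(flagged)  # True
```

A check like this cannot explain why the disparity exists, but it can catch it early enough to investigate, which is the point of running it regularly.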

 

“The question of determining which kinds of biases we don’t want to tolerate is a policy one,” said Deirdre Mulligan, who studies these issues at the University of California, Berkeley School of Information. “It requires a lot of care and thinking about the ways we compose these technical systems.”

 

Silicon Valley, however, is known for pushing out new products without necessarily considering the societal or ethical implications. “There’s a huge rush to innovate,” Ms. Mulligan said, “a desire to release early and often — and then do cleanup.”


Algorithms don't discriminate. But, we live in a society that very studiously ignores the differences in people that lead to different outcomes. Instead, we accuse innocent people of discrimination. These algorithms don't know your race, sex, or perversions. They only know the websites you go to, and they serve the ads accordingly, without discrimination.

 

What our fascist overlords are demanding, aside from stomping on our personal freedoms, is that these algorithms do discriminate.

 

Google’s online advertising system, for instance, showed an ad for high-income jobs to men much more often than it showed the ad to women, a new study by Carnegie Mellon University researchers found.

 

And yet, the advertising system doesn't know whether any user is a man or a woman. It just knows whether you're spending your time on Facebook or the Wall Street Journal.

 


Since people write the algorithms, and people discriminate, one ought not be surprised when the algorithms they write do what their authors think ought to be done: they discriminate.


Kind of funny, as to your point about accusing people of discrimination: some people act like the autocomplete feature of a search engine, parroting a preprogrammed response without thought, but only for the sake of argument. :D

 

You'd be surprised by the resulting demographics from a good analytics program. Some websites use them; not only do they show age and gender, but I assume tracking your clicks will also reveal your interests. As a matter of fact, Facebook recently announced it will be tracking how long a person is engaged in a thread, how much of a movie they watched, and so on. Privacy is always a concern. From a personal standpoint, or bias, this kind of information is valuable. As a webmaster, I like to know who is viewing, where they came from, the terms or phrases typed into the search engine, and so on. These things help in determining the best direction to focus on. To note, though, no such in-depth tracking is currently implemented on this site, besides Google Analytics, which is made available to me.

 

See: Demographics and Interests

 

God bless,

William


Maybe you should demand that Google open up their code so that you can prove your prejudice about the programmers is correct by identifying the sexist, racist, and, your great concern, homophobic lines of code. The only bigotry that is pervasive at Google is Liberal bigotry.

 

if x = male then goto 'display high paying job'

 

 



 


One thing is certain: your post is full of all sorts of prejudice. Had it been a program in Google's search engine, Google would be shockingly bad. One can only hope that none of their programmers are writing stuff like your post.


 


How do you suppose one website ranks better than another in the Google Search Engine?

 

God bless,

William

 

 


 


You appear schizophrenic. Back in post 3 you asserted that Google's software discriminates because it was written by the KKK (or something to that effect). Now, you're saying if Google's software discriminated, it would be shockingly bad and that you hope their programmers aren't writing discriminating code.

 

