
Science-Technology

Title: AI gaydar can accurately determine sexuality from a photo
Source: Beta News
URL Source: https://betanews.com/2017/09/08/ai-gaydar-sexuality-prediction/
Published: Sep 8, 2017
Author: Mark Wycislik-Wilson
Post Date: 2017-09-08 11:12:03 by Deckard
Keywords: None
Views: 164
Comments: 3

Facial detection technology is usually used to identify individuals for the purposes of crime prevention, or as a biometric security method. But a paper published by Stanford University -- entitled simply "Deep neural networks are more accurate than humans at detecting sexual orientation from facial images" -- shows that it could also be used to determine people's sexuality.

Using AI and deep neural networks, algorithms have been shown to have a far better gaydar than people. Working with a sample of more than 35,000 photographs, the system correctly determined whether individuals were gay or straight with staggering accuracy: 81 percent for men and 74 percent for women. While the results are impressive on one hand, they also raise ethical concerns.

The ability of the system to correctly identify gay and straight men and women is significantly better than the normal human prediction rate. The human gaydar was shown to yield a success rate of around 61 percent for men and 54 percent for women. The research by Michal Kosinski and Yilun Wang shows that AI is able to pick up on subtle differences in facial structure between gay and straight people.

The report, published in the Journal of Personality and Social Psychology and also made publicly available on the Open Science Framework, used publicly available images from a dating website -- the first of the ethical issues some may spot with the research.

Using a piece of software called VGG-Face, the photographs were scanned and reduced to numerical facial features, which an algorithm then used to predict each person's sexuality; the prediction was then compared with the sexual orientation declared on the dating website (a rough sketch of such a pipeline appears after the quoted passage below). The Economist explains:

When shown one photo each of a gay and straight man, both chosen at random, the model distinguished between them correctly 81 percent of the time. When shown five photos of each man, it attributed sexuality correctly 91 percent of the time. The model performed worse with women, telling gay and straight apart with 71 percent accuracy after looking at one photo, and 83 percent accuracy after five. In both cases the level of performance far outstrips human ability to make this distinction. Using the same images, people could tell gay from straight 61 percent of the time for men, and 54 percent of the time for women. This aligns with research which suggests humans can determine sexuality from faces at only just better than chance.
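For readers wondering what a pipeline like this looks like in practice, below is a minimal sketch in Python. It assumes the face descriptors (for example, VGG-Face embeddings) have already been extracted and saved alongside the self-reported labels; the file names, the train/test split, and the use of plain logistic regression are illustrative assumptions, not details taken from the study.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    # Hypothetical inputs: one row of precomputed face-descriptor features per photo
    # (e.g., VGG-Face embeddings) and a 0/1 label taken from the dating profiles.
    X = np.load("face_embeddings.npy")
    y = np.load("self_reported_labels.npy")

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0)

    # A simple linear classifier on top of the embeddings; the study's exact
    # preprocessing and model choices may differ.
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_train, y_train)

    # ROC AUC is the probability that a randomly chosen positive example is ranked
    # above a randomly chosen negative one, i.e., the "shown one photo of each"
    # pairwise accuracy described in the quoted passage.
    scores = clf.predict_proba(X_test)[:, 1]
    print("pairwise (AUC-style) accuracy:", roc_auc_score(y_test, scores))

Viewed this way, the improvement from five photos is also plausible: averaging several scores per person reduces noise before the two people are compared.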

This sounds like an impressive rate of accuracy, but the system is flawed. The high success rate is achieved only when comparing a pair of images in which one of the two men is known to be gay. In "real world" tests, where the ratio of gay to straight people is much lower, the accuracy dropped dramatically. The system was, however, able to select with 90 percent accuracy the 10 people it was most confident were gay.
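To see why a strong pairwise result does not survive contact with realistic base rates, here is a short back-of-the-envelope calculation. Every number in it is an assumption chosen for illustration, not a figure from the paper.

    # Illustrative base-rate arithmetic; all numbers are assumptions, not the study's.
    prevalence  = 0.07   # assumed share of gay men in the scanned population
    sensitivity = 0.90   # assumed true-positive rate of the classifier
    specificity = 0.90   # assumed true-negative rate of the classifier

    population = 100_000
    gay = population * prevalence
    straight = population - gay

    true_positives = gay * sensitivity
    false_positives = straight * (1 - specificity)

    precision = true_positives / (true_positives + false_positives)
    print(f"flagged as gay: {true_positives + false_positives:.0f}")
    print(f"share of flagged people who are actually gay: {precision:.0%}")

With these assumed numbers, roughly 15,600 people are flagged and only about 40 percent of them are actually gay, even though the classifier is "right" 90 percent of the time on any individual photo. That base-rate effect is what lies behind the dramatic drop in accuracy described above.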

While the research showed that it is possible to use characteristics such as nose shape, forehead size and jaw length (features the paper describes as "gender atypical") to determine sexuality with some accuracy, the authors issue a warning:

Importantly, we would like to warn our readers against misinterpreting or overinterpreting this study's findings. First, the fact that the faces of gay men and lesbians are, on average, gender atypical does not imply that all gay men are more feminine than all heterosexual men, or that there are no gay men with extremely masculine facial features (and vice versa in the case of lesbians). The differences in femininity observed in this study were subtle, spread across many facial features, and apparent only when examining averaged images of many faces. Second, our results in no way indicate that sexual orientation can be determined from faces by humans.

Then there is the concern that such technology could be used nefariously. This is something the authors are aware of, and touch on in the paper:

Some people may wonder if such findings should be made public lest they inspire the very application that we are warning against. We share this concern. However, as the governments and companies seem to be already deploying face-based classifiers aimed at detecting intimate traits (Chin & Lin, 2017; Lubin, 2016), there is an urgent need for making policymakers, the general public, and gay communities aware of the risks that they might be facing already.

Delaying or abandoning the publication of these findings could deprive individuals of the chance to take preventive measures and policymakers the ability to introduce legislation to protect people. Moreover, this work does not offer any advantage to those who may be developing or deploying classification algorithms, apart from emphasizing the ethical implications of their work.

We used widely available off-the-shelf tools, publicly available data, and methods well known to computer vision practitioners. We did not create a privacy-invading tool, but rather showed that basic and widely used methods pose serious privacy threats. We hope that our findings will inform the public and policymakers, and inspire them to design technologies and write policies that reduce the risks faced by homosexual communities across the world.

The authors also point out:

The results reported in this paper were shared, in advance, with several leading international LGBTQ organizations.

You can read the full study over on the Open Science Framework website.


#1. To: Deckard (#0)

The best thing about the AI computer system is that it is never incorrect. If it says you are gay, you are gay. If it says your bank account reads -0-, you will bounce checks too.

Exercising rights is only radical to two people, Tyrants and Slaves. Which are YOU? Our ignorance has driven us into slavery and we do not recognize it.

jeremiad  posted on  2017-09-08  13:01:43 ET


#2. To: Deckard (#0)

Let's see if it works compared to common sense: feed it Lindsay Graham and Reince Priebus.

Hank Rearden  posted on  2017-09-08  15:40:49 ET


#3. To: Deckard (#0)

Facial detection technology is usually used to identify individuals for the purposes of crime prevention, or as a biometric security method. But a paper published by Stanford University -- entitled simply "Deep neural networks are more accurate than humans at detecting sexual orientation from facial images" -- shows that it could also be used to determine people's sexuality.

Don't blame me, folks, I'm just the reporter.

In the entire history of the world, the only nations that had to build walls to keep their own citizens from leaving were those with leftist governments.

sneakypete  posted on  2017-09-09  10:34:30 ET

