25 Comments

There is a clear and simple distinction between the use cases you are worried about — Clearview-style "who is this person?" — and the use cases you want to preserve for the general public — "a face recognition feature that only works within a single user’s photo library." It's the difference between identifying someone you don't know from a photo and identifying a photo of someone you already know.

This distinction leads to a technical basis for legislation. Facial recognition is safer when *the user brings the dataset* of faces to match against, but very dangerous when *the app brings the dataset*. So a face-recognition law could prohibit recognition against an app-supplied database except by law enforcement with a warrant, but allow apps to perform recognition against a user-supplied dataset (perhaps limited to some reasonable number of images or distinct people the app can identify). This would allow companies to create sophisticated models that are good at face recognition in general, but which do not have specific identified faces embedded in them. Instead, the models could be fine-tuned on an individual user's photos. Apple, for example, does face recognition on-device. (https://machinelearning.apple.com/research/recognizing-people-photos)
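To make the distinction concrete, here is a minimal sketch of the "user brings the dataset" model: the app only ever compares a query faceprint against embeddings the user enrolled, with a cap on distinct identities. The names, threshold, and cap here are illustrative assumptions, not part of any real system or proposed law.

```python
# Sketch of "bring your own dataset" face matching. Assumes embeddings are
# vectors produced by some on-device face model; we normalize them and use
# cosine similarity. MAX_GALLERY stands in for a hypothetical legal cap.
import numpy as np

MAX_GALLERY = 100  # hypothetical cap on distinct identities

def enroll(gallery: dict, name: str, embedding: np.ndarray) -> None:
    """Add a user-supplied face embedding, enforcing the gallery cap."""
    if name not in gallery and len(gallery) >= MAX_GALLERY:
        raise ValueError("gallery size limit reached")
    gallery[name] = embedding / np.linalg.norm(embedding)

def identify(gallery: dict, query: np.ndarray, threshold: float = 0.8):
    """Return the best-matching enrolled name, or None if nothing is close."""
    q = query / np.linalg.norm(query)
    best_name, best_score = None, threshold
    for name, emb in gallery.items():
        score = float(np.dot(q, emb))  # cosine similarity of unit vectors
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

The key property is that `identify` can only ever answer with a name the user themselves enrolled; a stranger's faceprint simply returns None.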

author

Makes sense to me!


"Bring your own dataset" doesn't address the stalking and spousal abuse scenarios, which are arguably the ones we should be most worried about. Stalkers and abusers presumably have ready access to tagged images of family members and/or celebrities they wish to target.

author

There are two possible concerns with stalkers where "bring your own data" helps. One is a scenario where the stalker doesn't know the target's name (say they met the target in a bar and she refused to give her name or number). In this scenario, the stalker has the one snapshot he took of her in the bar, but he isn't able to match the faceprint to anyone else's photos, and hence it doesn't help him identify her.

The other scenario is one where a woman leaves an abusive husband or boyfriend and tries to cut off all contact—perhaps including moving, changing jobs, etc. Such a woman would probably try to keep her own online presence to a minimum, but her face could be captured in the background of photos taken by others, which could give the abusive partner clues about what neighborhood she lives in, what stores she frequents, etc.


Pardon me if I'm missing something, but in those scenarios what are you using the app for?

Like, if I supply a photo of my ex-wife to a facial recognition app, this will now enable me to... recognize my ex-wife? I don't need a facial recognition app to recognize celebrities; that's the whole point of celebrities!


Same reason, presumably, that law enforcement wants to recognize known criminals: to search social media for clues about where they are and who they're hanging out with.


Right, but if you already know the person, you can just search for them on social media. You don't need a face recognition tool as an intermediate step.


What if they aren't on social media [any longer]? What if they are trying to hide from you?


Just on your earlier point re captchas: the writing has been on the wall for some time (existing algorithms can already do quite a good job on captchas, not to mention services that simply farm out the task to human workers).

Google has released v3 of their widely used reCAPTCHA service, and notably, it does not actually use captchas. It takes the algorithms they've built to determine whether someone is likely to be a bot from their browser session context (or something like that), which previously were used to decide whether a captcha was necessary, and makes that risk score alone into the entire product. If you clear the score threshold you get in without ever seeing a captcha; if you don't, it simply won't let you in at all.

author

Very good point, thank you!


You write...

"Lately I’ve been frustrated with the vagueness of the “AI safety” debate. A lot of people want stricter oversight of AI. But we’re far from a consensus about the nature of the threat—to say nothing of what to do about it."

All the talk about "stricter oversight" is mostly delusional. US and EU law have no jurisdiction over 90% of humanity. As your story makes clear, those with few scruples are going to take the AI ball and run with it. Some of the world's largest nations are ruled by psychopaths with very few, if any, scruples.

Talk of AI governance is mostly a PR stunt designed to present developers as being responsible, and politicians as being in control, neither of which are really very true.

Sep 28, 2023·edited Sep 28, 2023

A Face Search Engine Anyone Can Use Is Alarmingly Accurate

PimEyes is a paid service that finds photos of a person from across the internet, including some the person may not want exposed. “We’re just a tool provider,” its owner said.

May 26, 2022

https://www.nytimes.com/2022/05/26/technology/pimeyes-facial-recognition-search.html

.

Top PimEyes Free Alternatives – 12 Secure Image Search Tools

https://rigorousthemes.com/blog/best-pimeyes-alternatives/

.

"You can run but you can't hide" in today's world. Nothing is truly private.


That NYT article you linked is written by the same person who inspired (?) this Substack post: "I’ve been thinking about facial recognition a lot recently because I just listened to Your Face Belongs to Us, an excellent new book from New York Times reporter Kashmir Hill."

Pimeyes shows the facial recognition cat is out of the bag. It's not going back in.

Sep 27, 2023·edited Sep 27, 2023

How do I say this? Until the biases go away, which they probably never will, facial recognition software is too dangerous to life and liberty. I'm not just speaking about the law enforcement aspect; giving the general public the ability to ID someone can lead to nefarious encounters. I'm usually for technological advancement, but dare I say all facial recognition software should be scrapped and outlawed. It's already abused to the point that law enforcement can pick you up on the street based solely on facial recognition, and the various systems are extremely flawed, especially for people who look like me.

Don't trust it one bit, which is weird coming from me.


https://apnews.com/article/facial-recognition-banned-new-york-schools-ddd35e004254d316beabf70453b1a6a2

This is getting out of hand, straight-up bonkers. This isn't safety.

Sep 27, 2023·edited Sep 27, 2023

"For example, think of a young woman who meets a stranger at a bar ... in the future, the man might be able to pull out his phone, snap a photo, and upload it to a facial recognition app. That might enable him to show up uninvited at her home or workplace the next day."

While I think this is a very legitimate concern, I think there is a flip side to it. Imagine the scenario where a guy shows up at a young woman's workplace repeatedly and stares at her, makes her feel uncomfortable, but never reveals who he is. It seems like members of the general public, like her, would appreciate being able to identify people who might pose a threat to them. You could even rig up a system that automatically flags people who have certain criminal histories if you encounter them on the street, making it easier for potential victims to escape or prepare to defend themselves.

author

I think in that scenario it would probably make sense for the young woman to involve the police with or without facial recognition. So I think we'd want a legal framework where the woman could take a photo of the man, bring that to the police station, and ask the police to warn him to leave her alone. Not only does the woman not need access to the facial recognition tool in this case, the police might not even need to tell her the man's name. Instead the police could contact the man directly and ask him to stay away.


Oh yeah... another bit of paper, similar to other interventions delivered on paper that don't work either. Any woman caught up in DFV knows that! Including the families of those women and children now deceased through DFV!

author

I'm not saying the system is perfect by any means. I just don't understand how facial recognition is helpful without involving the authorities. What is a woman going to do with information about a stalker's identity besides report the person to the police?


Sorry to bust the bubble you live in, but the police and civil justice hardly do anything any longer. Here in CA, they don't enforce traffic laws and rarely arrest people for stealing. Criminals get citations and sent on their way.

Our local paper regularly lists police blotter reports where people with multiple offenses and even warrants just get a citation and are sent on their way. On Nextdoor, people regularly relate their concerns about the lack of police enforcement.

A month or so ago, San Mateo, CA police stopped a car and found 200 lbs of meth. The car was stolen and was picked up on license plate readers. That's 200 pounds of METH! The two people got a citation and were let go. It wasn't clear from the story if they also let them keep the stolen car.

Look at the thievery in SF or druggies shooting up in plain view on the sidewalk while the police walk on by. No, the police are not going to help, outside of something truly serious, like a murder.

author

Not everyone lives in California, you know.


But 40 million people do. And there are a lot of other states in a similar pickle, where calling the police for anything is like yelling into a black hole.


We need facial recognition. Just look at how much safer China is.


Yeah right!!!! Good point!!!


There is no privacy any longer. Get over it. It is too late to cram Pandora back into her box.
