
Police Want to Treat Your Data Privacy Like Garbage. The Courts Shouldn't Let Them.

Imagine this: You lost your phone, or had it stolen. Would you be comfortable with a police officer who picked it up rummaging through the phone's contents without any authorization or oversight, on the theory that you had abandoned it? We'll hazard a guess: hell no, and for good reason.

Our cell phones and similar digital devices open a window into our entire lives, from messages we send in confidence to friends and family, to intimate photographs, to financial records, to comprehensive information about our movements, habits, and beliefs. Some of this information is intensely private in its own right; in combination, it can disclose virtually everything about a modern cell phone user.

If it seems like common sense that law enforcement shouldn't have unfettered access to this information whenever it finds a phone left unattended, you'll be troubled by an argument that government lawyers are advancing in a pending case before the Ninth Circuit Court of Appeals, *United States v. Hunt*. In *Hunt*, the government claims it does not need a warrant to search a phone that it deems abandoned by its owner because, in ditching the phone, the owner loses any reasonable expectation of privacy in all its contents. As a basis for this claim, the government cites an exception to the Fourth Amendment's warrant requirement that applies to searches of abandoned property. But that rule was developed years ago in the context of property that is categorically different, and far less revealing, than the reams of diverse and highly sensitive information that law enforcement can access by searching our digital devices.

The Supreme Court [has cautioned against](https://www.supremecourt.gov/opinions/17pdf/16-402_h315.pdf#page=24) uncritically extending pre-digital doctrines to modern technologies, like cell phones, that gather in one place so many of the privacies of life.
In a [friend-of-the-court brief](https://www.aclu.org/documents/ninth-circuit-cell-phone-abandonment-amicus-hunt) in *Hunt*, the ACLU and our coalition partners urge the Ninth Circuit to heed this call and hold that even if the physical device may properly be considered abandoned, the myriad records that reside on a cell phone remain subject to full constitutional protection. Police should have to get a warrant before searching the data on a phone they find separated from its owner.

## Cases about abandoned property are a poor fit for digital-age privacy

As the Supreme Court [recognized](https://supreme.justia.com/cases/federal/us/573/13-132/case.pdf) more than 10 years ago, when the storage capacity of the median cell phone was far smaller than it is today, advances in digital technology threaten to erode our privacy against government intrusion if courts apply to the troves of information on a cell phone the same rules they would use to analyze a search of a cigarette pack. In a case called *Riley v. California*, the Supreme Court held that even though police may warrantlessly search items in a suspect's pockets during arrest to prevent the destruction of evidence or protect the arresting officers from danger, a warrantless inspection of the information on an arrestee's phone went too far. Why? Because phones, "[w]ith all they may contain and all they may reveal," are different.

Here too, the information on a cell phone is qualitatively and quantitatively unlike the items that underpin precedents permitting warrantless searches of abandoned property.
The most recent of those precedents was decided in 1988, long before cell phones became a "[pervasive and insistent part of daily life](https://supreme.justia.com/cases/federal/us/573/13-132/case.pdf)." In case you're keeping score, 1988 was the year Motorola debuted its first "bag phone," [an early transportable telephone the size of a briefcase](https://www.thehenryford.org/collections-and-research/digital-collections/artifact/162235#slide=gs-212075) that had to be lugged around with a separate battery and transceiver. In that 1988 case, the Supreme Court held that people lose their legal privacy in items, like curbside trash, that they knowingly and voluntarily leave out for any member of the public to see. But when you fail to reclaim a lost or abandoned phone, do you knowingly and voluntarily renounce all of your data, too? Our brief argues that the Ninth Circuit should not use the reasoning that has historically applied to [garbage left out for collection](https://tile.loc.gov/storage-services/service/ll/usrep/usrep486/usrep486035/usrep486035.pdf) and [items discarded in a hotel wastepaper basket](https://tile.loc.gov/storage-services/service/ll/usrep/usrep362/usrep362217/usrep362217.pdf#page=24) after check-out to impute to a cell phone's owner an intent to give up all the revealing information on their device, just because it was left behind.

## Cell phones contain vast amounts of diverse and revealing information, unlike other categories of objects

The immense storage capacity of modern cell phones allows people to carry in their palm a volume and variety of private information that is genuinely unprecedented in cases concerning searches of abandoned property. Our cell phones provide access to information comparable in quantity and breadth to what police might glean from a thorough search of a house.
Unlike a house, though, a cell phone is relatively easy to lose. You carry it with you almost all the time. It can fall between seat cushions or slip out of a loose pocket. You might leave it at the check-out desk after making a purchase or forget it on the bus as you hasten to make your stop. Even if you eventually give up looking for the device, thereby "abandoning" it, this doesn't evince any subjective intent to relinquish to whoever might pick it up all the information the phone can store or access through the internet.

## Cloud backups mean that the data on a phone often isn't lost even when the device goes missing

An additional reason that the privacy of the information on a cell phone shouldn't hinge on a person's ongoing possession of their device is that you can still access and control much of the data on your phone independently of the device itself. While modern cell phones store extraordinary and growing amounts of data locally, much of this information also resides on remote servers: think of the untold messages, contacts, notes, and images you may have backed up to iCloud or its equivalents. If you have access to a computer or tablet, all of this information remains yours to view, edit, and delete whether or not your phone is handy. Trade in your cell phone, and you can seamlessly download this information onto a new device, reviewing voicemail messages and carrying on existing text conversations without interruption. In this sense, a cell phone is more properly analogized to a house key than a house: something we use to access vast amounts of information that is largely stored elsewhere. It would be absurd to suggest that a person intends to open up their house to unrestrained police searches whenever they drop their house key.
Yet this is essentially the position the government in *Hunt* argued, successfully, in the trial court: because the defendant discarded his phone, any piece of information stored on that phone was fair game, regardless of whether it was backed up.

The Ninth Circuit has an opportunity in *Hunt* to correct the trial court's error and clarify that the rule governing police searches of the information on a lost or abandoned cell phone does not defy common-sense intuitions about what information we mean to give up when we lose track of our devices. The information on your cell phone is highly private and revealing. If the police want authority to review it, the Constitution requires of them something simple: get a warrant.

States Dust Off Obscure Anti-Mask Laws to Target Pro-Palestine Protesters

Arcane laws banning people from wearing masks in public are now being used to target people who wear face coverings while peacefully protesting Israel's war in Gaza. That's a big problem.

In the 1940s and '50s, many U.S. states passed anti-mask laws in response to the Ku Klux Klan, whose members often hid their identities as they terrorized their victims. These laws were not enacted to protect those victims, but because political leaders wanted to defend segregation as part of a "modern South" and felt that the Klan's violent racism was making them look bad.

Now these laws are being used across the country to try to clamp down on disfavored groups and movements, raising questions about selective prosecution. Just this month, Ohio Attorney General Dave Yost [sent a letter](https://www.latimes.com/world-nation/story/2024-05-08/masked-student-protesters-could-face-felony-charges-under-anti-kkk-law-ohio-attorney-general-warns) to the state's 14 public universities warning that protesters could be charged with a felony under the state's little-used anti-mask law, which carries penalties of six to 18 months in prison. An Ohio legal expert, Rob Barnhart, observed that he had [never heard](https://www.wosu.org/politics-government/2024-05-07/protesters-could-face-felony-charge-if-arrested-while-wearing-a-mask-under-obscure-ohio-law) of the state's law being applied before, even to bank robbers wearing masks.
While Yost framed his letter as "proactive guidance," Barnhart countered: "I find it really hard to believe that this is some public service announcement to students to be aware of a 70-year-old law that nobody uses."

Ohio officials aren't the only ones who seem to be selectively enforcing anti-mask laws against student protesters. Administrators at the University of North Carolina [have warned](https://chapelboro.com/news/unc/unc-asks-pro-palestine-protesters-to-stop-wearing-masks-citing-1953-anti-kkk-law) protesters that wearing masks violates the state's anti-mask law and "runs counter to our campus norms and is a violation of UNC policy." Students arrested during a protest at the University of Florida were [charged with](https://www.sun-sentinel.com/2024/04/29/police-make-first-arrests-in-florida-of-pro-palestinian-protesters-at-two-university-campuses/), among other things, wearing masks in public. At the University of Texas at Austin, Gov. Greg Abbott and university officials called in state troopers to [violently](https://www.texastribune.org/2024/04/29/university-texas-pro-palestinian-protest-arrest/) break up pro-Palestinian protests after the school [rescinded permission](https://www.houstonchronicle.com/politics/texas/article/ut-austin-police-protest-arrests-19422645.php) for a rally on the grounds that protesters had a "declared intent to violate our policies and rules." One of the rules the administrators cited was a university ban on wearing face masks "to obstruct law enforcement."

At a time when both public and private actors are increasingly turning to invasive surveillance technologies to identify protesters, mask-wearing is an important way to safeguard our right to speak out on issues of public concern. While the ACLU has raised concerns for decades about how anti-mask laws are wielded, we are especially worried about the risk they pose to our constitutional freedoms in the digital age.

In particular, the emergence of face recognition technology has changed what it means to appear in public.
Increasingly omnipresent cameras and corrosive technology products such as [Clearview AI](https://www.nytimes.com/interactive/2021/03/18/magazine/facial-recognition-clearview-ai.html) allow police to easily identify people. So, too, can private parties. The push to normalize face recognition by security agencies threatens to turn our faces into the functional equivalent of license plates; anti-mask laws, in effect, require us to display those "plates" anytime we are in public. Humans are not cars.

Of course, mask-wearing is not just about privacy. It can also be an expressive act, a religious practice, a political statement, or a public-health measure. The ACLU has chronicled the [mask-wearing debate](https://www.aclu.org/news/free-speech/americas-mask-bans-in-the-age-of-face-recognition-surveillance) for years. As recently as 2019, anti-mask laws were used against [Occupy Wall Street](https://www.theatlantic.com/national/archive/2011/09/nypd-arresting-wall-street-protesters-wearing-masks/337706/) protesters, [anti-racism](https://www.ajc.com/news/state--regional/white-nationalist-richard-spencer-riles-auburn-campus-three-arrested/5HeaD0TCfvfNI7DuXDUciJ/) [protesters](https://wtvr.com/2017/09/19/mask-in-public-court-hearing/), and [police violence](https://wbhm.org/feature/2019/experts-alabamas-mask-law-is-outdated/) protesters. The coronavirus pandemic temporarily scrambled the mask-wearing debate, making a mask both a protective and a [political](https://apnews.com/article/virus-outbreak-donald-trump-ap-top-news-politics-health-7dce310db6e85b31d735e81d0af6769c) act.

Today, one question that remains is whether and how the authorities distinguish between those who wear a mask to protect their identities and those who wear one to protect themselves against disease. That ambiguity opens up even more space for discretionary and selective enforcement.
In North Carolina, the state Senate is currently considering an anti-protest bill that would remove the exception for wearing a mask for health purposes altogether and would add a sentencing enhancement for committing a crime while wearing a mask.

For those speaking out in support of the Palestinian people, being recognized in a crowd can have severe consequences for their personal and professional security. During the Gaza protests, pro-Israel activists and organizations have posted the faces and personal information of pro-Palestine activists to intimidate them, get them fired, or otherwise shame them for their views. These doxing attempts have intensified, with viral videos showing counterprotesters demanding that pro-Palestinian protesters remove their masks at rallies. Professionally, employers have [terminated workers](https://www.thecut.com/2023/10/israel-hamas-war-job-loss-social-media.html) for their comments about Israel and Palestine, and CEOs have [demanded](https://finance.yahoo.com/news/bill-ackman-wants-harvard-name-104621975.html) that universities give them the names of protesters in order to blacklist them from jobs.

While wearing a mask can make it harder to identify a person, it's important for protesters to know that it's not always effective. Masks haven't stopped the [Chinese government](https://www.nytimes.com/2022/12/02/business/china-protests-surveillance.html) or [Google](https://www.cbsnews.com/sanfrancisco/news/google-workers-fired-after-protesting-israeli-contract-file-complaint-labor-regulators/), for example, from identifying protesters and taking action against them.
Technologies that can be used to identify masked protesters range from [Bluetooth and WiFi signals](https://www.notus.org/technology/war-zone-surveillance-border-us), to historical cell phone location data, to constitutionally dubious devices called [IMSI catchers](https://www.aclu.org/news/privacy-technology/police-citing-terrorism-buy-stingrays-used-only), which pretend to be a cell tower and prompt nearby phones to reply with an identifying ping of their own. We may also see the development of [video analytics](https://www.aclu.org/publications/dawn-robot-surveillance) technologies that use gait recognition or body-proportion measurements. During Covid, face recognition also got [much](https://www.bbc.com/news/technology-56517033) [better](https://www.zdnet.com/article/facial-recognition-now-algorithms-can-see-through-face-masks/) at identifying people wearing partial face masks.

Protecting people's freedom to wear masks can have costs. It can make it harder to identify people who commit crimes, whether they are bank robbers, muggers, or members of the "[violent mob](https://www.latimes.com/california/story/2024-05-07/a-ucla-timeline-from-peaceful-encampment-to-violent-attacks-aftermath)" that attacked a peaceful protest encampment at UCLA. Like all freedoms, the freedom to wear a mask can be abused. But that does not justify taking it away from those protesting peacefully, especially in today's surveillance environment.

Anti-mask laws undoubtedly have a significant chilling effect on some protesters' willingness to show up for causes they believe in. The bravery of those who do show up to support a highly controversial cause in the current surveillance landscape is admirable, but Americans shouldn't have to be brave to exercise their right to protest.
Until privacy protections catch up with technology, officials and policymakers should do all they can to make it possible for less-brave people to show up and protest. That includes refusing to use anti-mask laws to target peaceful protesters.

Police Say a Simple Warning Will Prevent Face Recognition Wrongful Arrests. That's Just Not True.

Face recognition technology in the hands of police is dangerous. Police departments across the country frequently use the technology to try to identify images of unknown suspects by comparing them to large photo databases, but it often fails to generate a correct match. And numerous [studies](https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/) have shown that face recognition technology misidentifies Black people and other people of color at higher rates than white people. To date, there have been at least seven wrongful arrests we know of in the United States due to police reliance on incorrect face recognition results, and those are just the known cases. In nearly every one of those instances, [the](https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html) [person](https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html) [wrongfully](https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence/) [arrested](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html) [was](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html) [Black](https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/).

Supporters of police use of face recognition technology often portray these failures as unfortunate mistakes that are unlikely to recur. Yet they keep coming. Last year, six Detroit police officers showed up at the doorstep of an [eight-months-pregnant woman](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html) and wrongfully arrested her in front of her children for a carjacking she could not plausibly have committed.
A month later, the prosecutor dismissed the case against her.

Police departments should be doing everything in their power to avoid wrongful arrests, which can turn people's lives upside down and result in lost work, inability to care for children, and other serious harms. So what's behind these repeated failures? As the ACLU explained in a [recent submission](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) to the federal government, there are multiple ways in which police use of face recognition technology goes wrong. Perhaps most glaring is that the most widely adopted police policy designed to avoid false arrests in this context *simply does not work*. Records from the wrongful arrest cases demonstrate why.

It has become standard practice among police departments and the companies making this technology to warn officers that a result from a face recognition search does not constitute a positive identification of a suspect, and that additional investigation is necessary to develop the probable cause needed to obtain an arrest warrant.
For example, the International Association of Chiefs of Police [cautions](https://www.theiacp.org/sites/default/files/2019-10/IJIS_IACP%20WP_LEITTF_Facial%20Recognition%20UseCasesRpt_20190322.pdf) that a face recognition search result is "a strong clue, and nothing more, which must then be corroborated against other facts and investigative findings before a person can be determined to be the subject whose identity is being sought." The Detroit Police Department's face recognition technology [policy](https://detroitmi.gov/sites/detroitmi.localhost/files/2020-10/307.5%20Facial%20Recognition.pdf), adopted in September 2019, similarly states that a face recognition search result is only "an investigative lead and IS NOT TO BE CONSIDERED A POSITIVE IDENTIFICATION OF ANY SUBJECT. Any possible connection or involvement of any subject to the investigation must be determined through further investigation and investigative resources."

Police departments across the country, from [Los Angeles County](https://lacris.org/LACRIS%20Facial%20Recognition%20Policy%20v_2019.pdf) to the [Indiana State Police](https://www.in.gov/iifc/files/Indiana_Intelligence_Fusion_Center_Face_Recognition_Policy.pdf) to the U.S. [Department of Homeland Security](https://www.dhs.gov/sites/default/files/2023-09/23_0913_mgmt_026-11-use-face-recognition-face-capture-technologies.pdf), provide similar warnings. However ubiquitous, these warnings have failed to prevent harm.

We've seen police treat a face recognition result as a positive identification, ignoring or not understanding the warnings that face recognition technology is simply not reliable enough to provide one.

In Louisiana, for example, police relied solely on an incorrect face recognition search result from Clearview AI as purported probable cause for an arrest warrant.
The officers did this even though the law enforcement agency had signed a contract with the face recognition company acknowledging that officers "must conduct further research in order to verify identities or other data generated by the [Clearview] system." That overreliance led to [Randal Quran Reid](https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html), a Georgia resident who had never even been to Louisiana, being wrongfully arrested for a crime he couldn't have committed and held in jail for nearly a week.

In an [Indiana investigation](https://www.courierpress.com/story/news/local/2023/10/19/evansville-police-using-clearview-ai-facial-recognition-to-make-arrests/70963350007/), police similarly obtained an arrest warrant based only upon an assertion that the detective "viewed the footage and utilized the Clearview AI software to positively identify the female suspect." No additional confirmatory investigation was conducted.

But even when police do conduct additional investigative steps, those steps often *exacerbate and compound* the unreliability of face recognition searches. This is a particular problem when police move directly from a face recognition result to a witness identification procedure, such as a photographic lineup.

Face recognition technology is designed to generate a list of faces that are *similar* to the suspect's image but often will not actually be a match. When police think they have a match, they frequently ask a witness who saw the suspect to view a photo lineup consisting of the image derived from the face recognition search plus five "filler" photos of other people. Photo lineups have long been known to carry a high risk of misidentification. The addition of face recognition-generated images only makes it worse.
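At bottom, the one-to-many search described above is a nearest-neighbor lookup over face embeddings: it ranks enrolled photos by similarity to the probe image and returns the closest ones, whether or not the true suspect is enrolled at all. A minimal sketch of that ranking step, using made-up names and toy three-dimensional vectors (real systems use high-dimensional embeddings from a trained network):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy gallery of enrolled faces (names and vectors are hypothetical).
gallery = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.4, 0.8, 0.3],
    "person_c": [0.2, 0.3, 0.9],
}

def top_candidates(probe, gallery, k=3):
    """Rank gallery faces by similarity to the probe image's embedding.

    Note: this always produces k "leads," even if the true suspect
    is not enrolled in the gallery at all.
    """
    ranked = sorted(gallery.items(),
                    key=lambda item: cosine_similarity(probe, item[1]),
                    reverse=True)
    return ranked[:k]

# The probe is the unknown suspect; suppose they are NOT in the gallery.
probe = [0.85, 0.2, 0.1]
for name, vec in top_candidates(probe, gallery):
    print(name, round(cosine_similarity(probe, vec), 3))
```

The ranking never abstains: the most similar enrolled face comes back as the top "lead" regardless of whether it is actually the suspect, which is why the policies quoted above insist, however ineffectively, that a result is an investigative clue rather than an identification.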
Because the face recognition-generated image is likely to appear more similar to the suspect than the filler photos, there is a [heightened chance](https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence/) that a witness will [mistakenly choose](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4101826) that image out of the lineup, even though it is not a true match.

This problem has contributed to known cases of wrongful arrest, including the arrests of [Porcha Woodruff](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html), [Michael Oliver](https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/), and [Robert Williams](https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html) by Detroit police (the ACLU represents Mr. Williams in a [wrongful arrest lawsuit](https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway)). In these cases, police obtained an arrest warrant based solely on the combination of a false match from face recognition technology and a false identification from a witness viewing a photo lineup constructed around the face recognition lead and five filler photos. Each of the witnesses chose the face recognition-derived false match instead of concluding that the suspect did not, in fact, appear in the lineup.

A lawsuit filed earlier this year in Texas alleges that a similar series of failures led to the wrongful arrest of [Harvey Eugene Murphy Jr.](https://www.theguardian.com/technology/2024/jan/22/sunglass-hut-facial-recognition-wrongful-arrest-lawsuit?ref=upstract.com) by Houston police.
And in New Jersey, police wrongfully arrested [Nijeer Parks](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html) in 2019 after face recognition technology incorrectly flagged him as a likely match to a shoplifting suspect. An officer who had seen the suspect (before he fled) viewed the face recognition result and said he thought it matched his memory of the suspect's face.

After the Detroit Police Department's third wrongful arrest from face recognition technology became public last year, Detroit's chief of police [acknowledged](https://www.facebook.com/CityofDetroit/videos/287218473992047) the problem of erroneous face recognition results tainting subsequent witness identifications. He explained that by moving straight from a face recognition result to a lineup, "it is possible to taint the photo lineup by presenting a person who looks most like the suspect" but is not in fact the suspect. The department's policy, which merely tells police to conduct "further investigation," had not stopped them from engaging in this bad practice.

Because police have repeatedly proved unable or unwilling to follow face recognition searches with adequate independent investigation, police access to the technology must be strictly curtailed, and the best way to do this is through strong [bans](https://www.aclu.org/sites/default/files/field_document/02.16.2021_coalition_letter_requesting_federal_moratorium_on_facial_recognition.pdf). More than 20 jurisdictions across the country, from Boston to Pittsburgh to San Francisco, have done just that, barring police from using this dangerous technology.

Boilerplate warnings have proven ineffective.
Whether these warnings fail because of human a href=https://www.nytimes.com/2020/06/09/technology/facial-recognition-software.htmlcognitive bias/a toward trusting computer outputs, poor police training, incentives to quickly close cases, implicit racism, lack of consequences, the fallibility of witness identifications, or other factors, we don’t know. But if the experience of known wrongful arrests teaches us anything, it is that such warnings are woefully inadequate to protect against abuse./p

How to Protect Consumer Privacy and Free Speech

pTechnology is a necessity of modern life. People of all ages rely on it for everything from accessing information and connecting with others, to paying for goods, using transportation, getting work done, and speaking out about issues of the day. Without adequate privacy protections, technology can be co-opted, not only by the government but also by businesses, to surveil us online and intrude on our private lives, with grave consequences for our rights./p pThere is sometimes a misconception that shielding our personal information from this kind of misuse will violate the First Amendment rights of corporations who stand to profit from collecting, analyzing, and sharing that information. But we don’t have to sacrifice robust privacy protection to uphold anyone’s right to free speech. In fact, when done right, strong privacy protections reinforce speech rights. They create spaces where people have the confidence to exercise their First Amendment rights to candidly communicate with friends, seek out advice and community, indulge curiosity, and anonymously speak or access information./p pAt the same time, simply calling something a “privacy law” doesn’t make it so. Take the California Age Appropriate Design Code Act (CAADCA), a law currently under review by the Ninth Circuit in iNetChoice v. Bonta/i. As the ACLU and the ACLU of Northern California argued in a a href=https://www.aclu.org/cases/netchoice-llc-v-bonta?document=Amici-Curiae-Brief-of-the-ACLU-%26-ACLU-of-Northern-Californiafriend-of-the-court brief/a, this law improperly includes content restrictions on online speech and is unconstitutional. Laws can and should be crafted to protect both privacy and free speech rights. It is critical that legislatures and courts get the balance right when it comes to a law that implicates our ability to control our personal information and to speak and access content online./p pConsumer privacy matters.
With disturbing frequency, businesses use technology to siphon troves of personal information from us – learning things about our health, our family situation, our financial status, our location, our age, and even our beliefs. Not only can they paint intimate portraits of our lives but, armed with this information, they can raise or lower prices depending on our demographics, make discriminatory predictions about a href=https://www.wired.com/story/argentina-algorithms-pregnancy-prediction/health outcomes/a, improperly deny a href=https://www.hud.gov/sites/dfiles/Main/documents/HUD_v_Facebook.pdfhousing/a or a href=https://www.cnn.com/2023/06/12/tech/facebook-job-ads-gender-discrimination-asequals-intl-cmd/index.htmljobs/a, a href=https://www.propublica.org/article/health-insurers-are-vacuuming-up-details-about-you-and-it-could-raise-your-rateshike insurance rates/a, and flood people of color and low-income people with a href=https://www.nytimes.com/2011/09/08/opinion/fair-lending-and-accountability.htmlads for predatory loans/a./p pAll this nefarious behavior holds serious consequences for our financial stability, our health, our quality of life, and our civil rights, including our First Amendment rights. Better consumer privacy gives advocates, activists, whistleblowers, dissidents, authors, artists, and others the confidence to speak out. Only when people are free from the fear that what they’re doing online is being monitored and shared can they feel free to enjoy the full extent of their rights to read, investigate, discuss, and be inspired by whatever they want./p pYet in recent years, tech companies have argued that consumer privacy protections limit their First Amendment rights to collect, use, and share people’s personal information. These arguments are often faulty.
Just because someone buys a product or signs up for a service doesn’t give the company providing that good or service the First Amendment right to share or use the personal information it collects from that person however it wants./p pTo the contrary, laws that require data minimization and high privacy settings by default are good policy and can easily pass First Amendment muster. Arguments to the contrary not only misunderstand the First Amendment; they’d actually weaken its protections./p pLaws that suppress protected speech in order to stop children from accessing certain types of content often hurt speech and privacy rights for all. That’s why First Amendment challenges to laws a href=https://www.aclu.org/news/free-speech/arkansas-wants-to-unconstitutionally-card-people-before-they-use-social-mediathat limit what we can see online/a typically succeed. The Supreme Court has made it clear time and again that the government cannot regulate speech solely to stop children from seeing ideas or images that a legislative body believes to be unsuitable.
Nor can it limit adults’ access to speech in the name of shielding children from certain content./p div class=mp-md wp-link div class=wp-link__img-wrapper a href=https://www.aclu.org/news/free-speech/arkansas-wants-to-unconstitutionally-card-people-before-they-use-social-media target=_blank tabindex=-1 img width=1200 height=628 src=https://www.aclu.org/wp-content/uploads/2024/03/493ead8cd079d73577ec75d5436e8b10.jpg class=attachment-original size-original alt= decoding=async loading=lazy srcset=https://www.aclu.org/wp-content/uploads/2024/03/493ead8cd079d73577ec75d5436e8b10.jpg 1200w, https://www.aclu.org/wp-content/uploads/2024/03/493ead8cd079d73577ec75d5436e8b10-768x402.jpg 768w, https://www.aclu.org/wp-content/uploads/2024/03/493ead8cd079d73577ec75d5436e8b10-400x209.jpg 400w, https://www.aclu.org/wp-content/uploads/2024/03/493ead8cd079d73577ec75d5436e8b10-600x314.jpg 600w, https://www.aclu.org/wp-content/uploads/2024/03/493ead8cd079d73577ec75d5436e8b10-800x419.jpg 800w, https://www.aclu.org/wp-content/uploads/2024/03/493ead8cd079d73577ec75d5436e8b10-1000x523.jpg 1000w sizes=(max-width: 1200px) 100vw, 1200px / /a /div div class=wp-link__title a href=https://www.aclu.org/news/free-speech/arkansas-wants-to-unconstitutionally-card-people-before-they-use-social-media target=_blank Arkansas Wants to Unconstitutionally “Card” People Before They Use Social Media /a /div div class=wp-link__description a href=https://www.aclu.org/news/free-speech/arkansas-wants-to-unconstitutionally-card-people-before-they-use-social-media target=_blank tabindex=-1 p class=is-size-7-mobile is-size-6-tabletThe state’s Social Media Safety Act stifles freedom of expression online and violates the First Amendment./p /a /div div class=wp-link__source p-4 px-6-tablet a href=https://www.aclu.org/news/free-speech/arkansas-wants-to-unconstitutionally-card-people-before-they-use-social-media target=_blank tabindex=-1 p class=is-size-7Source: American Civil Liberties Union/p /a /div /div pThe CAADCA 
is unconstitutional for these reasons, despite the legislature’s understandable concerns about the privacy, wellbeing, and safety of children. The law was drafted so broadly that it actually would have hurt children. It could have prevented young people and adults from accessing things like online mental health resources; support communities related to school shootings and suicide prevention; and reporting about war, the climate crisis, and gun violence. It also could have interfered with students’ attempts to express political or religious speech, or to send and receive personal messages about deaths in the family, rejection from a college, or a breakup. Paradoxically, the law would have exposed everyone’s information to greater privacy risks by encouraging companies to gather and analyze user data for age estimation purposes./p pWhile we believe that the CAADCA burdens free speech and should be struck down, it is important that the court not issue a ruling that forecloses a path that other privacy laws could take to protect privacy without violating the First Amendment. We need privacy and free speech, too, especially in the digital age./p

Communities Should Reject Surveillance Products Whose Makers Won't Allow Them to be Independently Evaluated

pAmerican communities are being confronted by a lot of new police technology these days, much of which involves surveillance or otherwise raises the question: “Are we as a community comfortable with our police deploying this new technology?” A critical question when addressing such concerns is: “Does it even work, and if so, how well?” It’s hard for communities, their political leaders, and their police departments to know what to buy if they don’t know what works and to what degree./p pOne thing I’ve learned from following new law enforcement technology for over 20 years is that there is an awful lot of snake oil out there. When a new capability arrives on the scene—whether it’s a href=https://www.aclu.org/wp-content/uploads/publications/drawing_blank.pdfface recognition/a, a href=https://www.aclu.org/blog/privacy-technology/surveillance-technologies/experts-say-emotion-recognition-lacks-scientific/emotion recognition/a, a href=https://www.aclu.org/wp-content/uploads/publications/061819-robot_surveillance.pdfvideo analytics/a, or “a href=https://www.aclu.org/news/privacy-technology/chicago-police-heat-list-renews-old-fears-aboutbig data/a” pattern analysis—some companies will rush to promote the technology long before it is good enough for deployment, which sometimes a href=https://www.aclu.org/blog/privacy-technology/surveillance-technologies/experts-say-emotion-recognition-lacks-scientific/never happens/a. That may be even more true today in the age of artificial intelligence.
“AI” is a term that often amounts to no more than trendy marketing jargon./p div class=mp-md wp-link div class=wp-link__img-wrapper a href=https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technology target=_blank tabindex=-1 img width=1200 height=628 src=https://www.aclu.org/wp-content/uploads/2024/03/a573aa109804db74bfef11f8a6f475e7.jpg class=attachment-original size-original alt= decoding=async loading=lazy srcset=https://www.aclu.org/wp-content/uploads/2024/03/a573aa109804db74bfef11f8a6f475e7.jpg 1200w, https://www.aclu.org/wp-content/uploads/2024/03/a573aa109804db74bfef11f8a6f475e7-768x402.jpg 768w, https://www.aclu.org/wp-content/uploads/2024/03/a573aa109804db74bfef11f8a6f475e7-400x209.jpg 400w, https://www.aclu.org/wp-content/uploads/2024/03/a573aa109804db74bfef11f8a6f475e7-600x314.jpg 600w, https://www.aclu.org/wp-content/uploads/2024/03/a573aa109804db74bfef11f8a6f475e7-800x419.jpg 800w, https://www.aclu.org/wp-content/uploads/2024/03/a573aa109804db74bfef11f8a6f475e7-1000x523.jpg 1000w sizes=(max-width: 1200px) 100vw, 1200px / /a /div div class=wp-link__title a href=https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technology target=_blank Six Questions to Ask Before Accepting a Surveillance Technology /a /div div class=wp-link__description a href=https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technology target=_blank tabindex=-1 p class=is-size-7-mobile is-size-6-tabletCommunity members, policymakers, and political leaders can make better decisions about new technology by asking these questions./p /a /div div class=wp-link__source p-4 px-6-tablet a href=https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technology target=_blank tabindex=-1 p class=is-size-7Source: American Civil Liberties Union/p /a /div /div pGiven all this, communities and city councils should not 
adopt new technology that has not been subject to testing and evaluation by an independent, disinterested party. That’s true for all types of technology, but doubly so for technologies that have the potential to change the balance of power between the government and the governed, like surveillance equipment. After all, there’s no reason to get a href=https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technologywrapped up in big debates/a about privacy, security, and government power if the tech doesn’t even work./p pOne example of a company refusing to allow independent review of its product is the license plate recognition company Flock, which is pushing those surveillance devices into many American communities and tying them into a centralized national network. (We wrote more about this company in a 2022 a href=https://www.aclu.org/publications/fast-growing-company-flock-building-new-ai-driven-mass-surveillance-systemwhite paper/a.) Flock has steadfastly refused to allow the a href=https://www.aclu.org/news/privacy-technology/are-gun-detectors-the-answer-to-mass-shootingsindependent/a security technology reporting and testing outlet a href=https://ipvm.com/IPVM/a to obtain one of its license plate readers for testing, though IPVM has tested all of Flock’s major competitors. 
That doesn’t stop Flock from a href=https://ipvm.com/reports/flock-lpr-city-sued?code=lfgsdfasd543453boasting/a that “Flock Safety technology is best-in-class, consistently performing above other vendors.” Claims like these are puzzling and laughable when the company doesn’t appear to have enough confidence in its product to let IPVM test it./p div class=mp-md wp-link div class=wp-link__img-wrapper a href=https://www.aclu.org/news/privacy-technology/experts-say-emotion-recognition-lacks-scientific target=_blank tabindex=-1 img width=1160 height=768 src=https://www.aclu.org/wp-content/uploads/2024/03/f0cab632e1da8a25e9a54ba8019ef74e.jpg class=attachment-original size-original alt= decoding=async loading=lazy srcset=https://www.aclu.org/wp-content/uploads/2024/03/f0cab632e1da8a25e9a54ba8019ef74e.jpg 1160w, https://www.aclu.org/wp-content/uploads/2024/03/f0cab632e1da8a25e9a54ba8019ef74e-768x508.jpg 768w, https://www.aclu.org/wp-content/uploads/2024/03/f0cab632e1da8a25e9a54ba8019ef74e-400x265.jpg 400w, https://www.aclu.org/wp-content/uploads/2024/03/f0cab632e1da8a25e9a54ba8019ef74e-600x397.jpg 600w, https://www.aclu.org/wp-content/uploads/2024/03/f0cab632e1da8a25e9a54ba8019ef74e-800x530.jpg 800w, https://www.aclu.org/wp-content/uploads/2024/03/f0cab632e1da8a25e9a54ba8019ef74e-1000x662.jpg 1000w sizes=(max-width: 1160px) 100vw, 1160px / /a /div div class=wp-link__title a href=https://www.aclu.org/news/privacy-technology/experts-say-emotion-recognition-lacks-scientific target=_blank Experts Say 'Emotion Recognition' Lacks Scientific Foundation /a /div div class=wp-link__description a href=https://www.aclu.org/news/privacy-technology/experts-say-emotion-recognition-lacks-scientific target=_blank tabindex=-1 p class=is-size-7-mobile is-size-6-tablet/p /a /div div class=wp-link__source p-4 px-6-tablet a href=https://www.aclu.org/news/privacy-technology/experts-say-emotion-recognition-lacks-scientific target=_blank tabindex=-1 p class=is-size-7Source: American Civil 
Liberties Union/p /a /div /div pCommunities considering installing Flock cameras should take note. That is especially the case when errors by Flock and other companies’ license plate readers can lead to innocent drivers finding themselves with their a href=https://ipvm.com/reports/flock-lpr-city-sued?code=lfgsdfasd543453hands behind their heads/a, facing jittery police pointing guns at them. Such errors can also expose police departments and cities to lawsuits./p pEven worse is when a company pretends that its product has been subject to independent review when it hasn’t. The metal detector company Evolv, which sells — wait for it — emAI/em metal detectors, submitted its technology to testing by a supposedly independent lab operated by the University of Southern Mississippi, and publicly touted the results of the tests. But a href=https://ipvm.com/reports/bbc-evolvIPVM/a and the a href=https://www.bbc.com/news/technology-63476769BBC/a reported that the lab, the National Center for Spectator Sports Safety and Security (a href=https://ncs4.usm.edu/NCS4/a), had colluded with Evolv to manipulate the report and hide negative findings about the effectiveness of the company’s product. Like Flock, Evolv refuses to allow IPVM to obtain one of its units for testing. (We wrote about Evolv and its product a href=https://www.aclu.org/news/privacy-technology/are-gun-detectors-the-answer-to-mass-shootingshere/a.)/p pOne of the reasons these companies can prevent a tough, independent reviewer such as IPVM from obtaining their equipment is their subscription and/or cloud-based architecture. “Most companies in the industry still operate on the more traditional model of having open systems,” IPVM Government Research Director Conor Healy told me. “But there’s a rise in demand for cloud-based surveillance, where people can store things in cloud, access them on their phone, see the cameras. 
Cloud-based surveillance by definition involves central control by the company that’s providing the cloud services.” Cloud-based architectures can a href=https://www.aclu.org/news/civil-liberties/major-hack-of-camera-company-offers-four-key-lessons-on-surveillanceworsen the privacy risks/a created by a surveillance system. Another consequence of this centralized control is that it increases the company’s ability to control who can carry out an independent review./p pWe’re living in an era where a lot of new technology is emerging, with many companies trying to be the first to put it on the market. As Healy told me, “We see a lot of claims of AI, all the time. At this point, almost every product I see out there that gets launched has some component of AI.” But like other technologies before them, these products often come in immature, inaccurate, or outright deceptive forms, relying on little more than the use of “AI” as a buzzword./p pIt’s vital for independent reviewers to contribute to our ongoing local and national conversations about new surveillance and other police technologies. It’s unclear why a company that has faith in its product would try to block independent review, and buyers should be wary of companies that do./p