April 17, 2026 A Bilingual Newspaper

Study Highlights Risks of Facial Recognition Technologies – The Brasilians

Smile! Your face is being filmed, classified, compared, and identified, mainly by public security agencies and most of the time without your knowledge. That is the finding of a study by the Federal Public Defender’s Office (DPU), in partnership with the Center for Studies on Security and Citizenship (CESeC), an academic institution linked to Candido Mendes University in Rio de Janeiro.

The report, Mapping of Biometric Surveillance, points out that Brazil became a vast field of digital surveillance after hosting the 2014 World Cup, when so-called Facial Recognition Technologies (TRFs) found fertile ground to spread, thanks in part to the promise of facilitating the identification of criminals and the location of missing persons.

“Facial recognition has been widely incorporated by public agencies in Brazil, in a process that began with the hosting of mega-events in the country – especially the 2014 FIFA World Cup and the 2016 Olympic Games,” argue the federal public defenders from the DPU and CESeC members, referring to the sophisticated and expensive facial recognition cameras that are increasingly present in the urban landscape.

According to the researchers, there are at least 376 active facial recognition projects in Brazil as of April this year. Together, these projects can potentially monitor nearly 83 million people, equivalent to 40% of the Brazilian population. They have already mobilized at least R$ 160 million in public investments, a value calculated based on information provided by 23 of the 27 federative units to the study coordinators. The four states that did not respond to the survey are Amazonas, Maranhão, Paraíba, and Sergipe.

“Despite this alarming scenario, regulatory solutions are lagging behind,” argue the DPU and CESeC researchers, emphasizing the pressing need for laws to regulate the use of digital surveillance systems, particularly facial recognition cameras, in Brazil.

In addition, according to the experts, the absence of external control mechanisms, uniform technical-operational standards, and transparency in the implementation of the systems is a cause for concern. This lack of oversight increases the chances of serious errors, privacy violations, discrimination, and misuse of public resources, highlighting the need for stricter controls.

Errors

In another study, CESeC mapped 24 cases, between 2019 and April 2025, in which failures in facial recognition systems were identified. These failures can lead to mistaken identifications, as in the case of 23-year-old personal trainer João Antônio Trindade Bastos.

In April 2024, military police removed Bastos from the stands of Estádio Lourival Batista in Aracaju (SE) during the Sergipano Championship final. They took the young man to a room, where they frisked him roughly. Only after checking the documents of Bastos, who had to answer several questions to prove he was who he said he was, did the police reveal that the facial recognition system installed at the stadium had confused him with a fugitive.

Outraged, Bastos used social media to vent his anger over the injustice he suffered. The repercussion of the case led the Sergipe government to suspend the technology’s use by the police, which, according to reports from the time, had already used the system to detain more than ten people.

Bastos is Black, like most people misidentified by surveillance and facial recognition systems in Brazil and other countries. According to the DPU and CESeC report, there are indications that 70% of police forces worldwide have access to some TRF and that 60% of countries have facial recognition in airports. In Brazil, “more than half of police approaches using facial recognition resulted in mistaken identifications, highlighting the risk of unjust arrests.”

“Concerns about the use of these technologies are not unfounded,” the experts warn, citing international studies showing that, in some cases, the systems’ error rates are “disproportionately high for certain population groups, being ten to 100 times higher for Black, Indigenous, and Asian people compared to white individuals.” This finding led the European Parliament to warn, in 2021, that “[t]echnical inaccuracies of Artificial Intelligence [AI] systems designed for remote biometric identification of individuals can lead to biased results and have discriminatory effects.”

Legislation

When addressing “institutional and regulatory challenges,” the researchers point out that in December 2024, the Senate approved Bill No. 2338/2023, which seeks to regulate the use of artificial intelligence, including biometric systems in public security. To become law, the proposal must still be approved by the Chamber of Deputies, which last month created a special commission to debate the issue.

In addition, for the DPU and CESeC researchers, although the bill proposes to prohibit the use of remote and real-time biometric identification systems in public spaces, the text approved by the Senate provides so many exceptions that, in practice, it functions “as a broad authorization for the implementation” of these systems.

“The categories of permissions [in the approved text] include criminal investigations, flagrante crimes, searches for missing persons, and recapture of fugitives, situations that cover a considerable spectrum of public security activities. Considering the history of abuses and the lack of effective control mechanisms, this openness to use maintains the possibility of a surveillance state and rights violations.”

Recommendations

The researchers emphasize the urgency of a “qualified public debate,” with active participation from civil society, academics, representatives of public oversight agencies, and international organizations. They highlight that this debate is crucial to shaping the future of facial recognition technology in Brazil and urge the public to take part in it.

They also recommend what they classify as “urgent measures,” such as the approval of a specific national law to regulate the technology’s use; standardization of protocols that respect due process of law; and the conduct of independent and regular audits.

The experts also point to the need for public agencies to provide greater transparency about the contracts and databases used, ensuring public access to clear information on facial recognition systems as well as training for public agents dealing with the issue. They further suggest requiring prior judicial authorization for the use in investigations of information obtained through TRFs, setting deadlines for the storage of biometric data, and strengthening control over private companies operating these systems.

“We hope these findings can not only guide and support the processing of Bill 2338 in the Chamber of Deputies, but also serve as an alert for regulatory and control bodies to pay attention to what is happening in Brazil. The report highlights racial bias in the technology’s use, problems of public resource diversion, and lack of transparency in its implementation,” says CESeC general coordinator Pablo Nunes.

Source: Agência Brasil
