April 17, 2026 A Bilingual Newspaper

New York, US
Risks of Facial Recognition Technologies – The Brasilians

Smile! Your face is not only being filmed, but also classified, compared, and identified, mainly by public security agencies – most of the time without your knowledge. That is what a study by the Federal Public Defender’s Office (DPU) shows, conducted in partnership with the Center for Studies on Security and Citizenship (CESeC), an academic institution linked to Candido Mendes University in Rio de Janeiro.

The report Mapeando a Vigilância Biométrica (Mapping Biometric Surveillance) reveals that, after hosting the 2014 World Cup, Brazil became a vast field of digital surveillance in which so-called Facial Recognition Technologies (TRFs, from the Portuguese acronym) found fertile ground to spread, thanks in part to the promise of making it easier to identify criminals and locate missing persons.

“Facial recognition has been widely incorporated by public agencies in Brazil, in a process that began with the mega-events held in the country – especially the 2014 FIFA World Cup and the 2016 Olympic Games,” state the federal public defenders from the DPU and CESeC members, referring to the sophisticated and expensive facial recognition cameras, increasingly present in the urban landscape.

According to the researchers, as of April this year there were at least 376 active facial recognition projects in Brazil. Together, these initiatives have the potential to monitor nearly 83 million people, equivalent to about 40% of the Brazilian population, and they have already mobilized at least R$ 160 million in public investment – an amount calculated from information provided by 23 of the 27 federative units to the study’s coordinators. Amazonas, Maranhão, Paraíba, and Sergipe did not respond to the survey, which was conducted between July and December 2024.

“Despite this entire scenario, regulatory solutions are lagging behind,” state the DPU and CESeC researchers, noting that Brazil still has no laws regulating the use of digital surveillance systems, particularly facial recognition cameras.

In addition, for the experts, there is a lack of external control mechanisms, uniform technical-operational standards, and transparency in the implementation of the systems. This increases the chances of serious errors, privacy violations, discrimination, and misuse of public resources.

Errors

In another survey, CESeC mapped 24 cases that occurred between 2019 and April 2025, in which it claims to have identified failures in facial recognition systems. The most well-known is that of personal trainer João Antônio Trindade Bastos, 23 years old.

In April 2024, military police officers removed Bastos from the stands of Estádio Lourival Batista in Aracaju (SE) during the final match of the Campeonato Sergipano. They took the young man to a room, where they frisked him roughly. Only after checking all of Bastos’s documents, and making him answer several questions to prove he was who he said he was, did the officers reveal that the facial recognition system installed in the stadium had mistaken him for a fugitive.

Indignant, Bastos used social media to vent about the injustice he had suffered. The repercussions of the case led the Sergipe government to suspend the Military Police’s use of the technology – which, according to news reports at the time, had already been used to detain more than ten people.

Bastos is Black, like most people flagged by surveillance and facial recognition systems in Brazil and other countries. According to the DPU and CESeC report, there are indications that 70% of police forces worldwide have access to some type of TRF and that 60% of countries use facial recognition in airports. In Brazil, “more than half of police approaches motivated by facial recognition resulted in mistaken identifications, evidencing the risk of undue arrests”.

“Concerns about the use of these technologies are not unfounded,” the experts warn, citing international studies according to which, in some cases, error rates of the systems are “disproportionately high for certain population groups, being ten to 100 times higher for indigenous and Asian people compared to white individuals.” This finding led the European Parliament, in 2021, to warn that “[the] technical inaccuracies of Artificial Intelligence [AI] systems, designed for remote biometric identification of individual persons, can lead to biased results and have discriminatory effects.”

Legislation

When addressing “institutional and normative challenges,” the researchers recall that in December 2024, the Senate approved Bill No. 2338/2023, which seeks to regulate the use of artificial intelligence, including biometric systems in public security. To become law, the proposal must be approved by the Chamber of Deputies, which last month created a special commission to debate the issue.

The DPU and CESeC researchers also note that, although the bill proposes banning remote, real-time biometric identification systems in public spaces, the text approved by the Senate provides so many exceptions that, in practice, it functions “as a broad authorization for the implementation” of these systems.

“The categories of permissions [in the approved text] include criminal investigations, flagrante delicto, search for missing persons, and recapture of fugitives, situations that cover a considerable spectrum of public security activities. Considering the history of abuses and the lack of effective control mechanisms, this openness for use ends up maintaining the possibility of a surveillance state and rights violations.”

Recommendations

The researchers conclude by stressing the urgency of a “qualified public debate,” with active participation from civil society, academics, representatives of public oversight agencies, and international organizations.

They also recommend what they classify as “urgent measures,” such as the approval of a specific national law to regulate the use of the technology; the standardization of protocols that respect due process of law; and the conduct of independent and regular audits.

The experts also point to the need for public agencies to be more transparent about the contracts and databases they use, ensuring public access to clear information about facial recognition systems, and to train the public agents who handle the issue. They further suggest requiring prior judicial authorization for the use of information obtained through TRFs in investigations, as well as time limits on biometric data storage and stronger oversight of the private companies that operate these systems.

“We hope these findings can not only guide the progress of Bill 2338 in the Chamber of Deputies, but also serve as an alert for regulatory and control agencies to pay attention to what is happening in Brazil. The report highlights both racial biases in the use of the technology and issues of misuse of public resources and lack of transparency in its implementation,” states CESeC’s general coordinator, Pablo Nunes, in a note.

Source: Agência Brasil

