
Smile! Your face is being filmed, classified, compared, and identified, mainly by public security agencies, most of the time without your knowledge. Research by the Public Defender’s Office of the Union (DPU) in partnership with the Center for Security and Citizenship Studies (CESeC), an academic institution linked to Candido Mendes University in Rio de Janeiro, shows this.

The recently released report Mapping Biometric Surveillance points out that Brazil became a vast field of digital surveillance after hosting the 2014 World Cup, when the so-called Facial Recognition Technologies (TRFs) found fertile ground to spread, thanks in part to the promise of facilitating the identification of criminals and the location of missing persons.

“Facial recognition has been widely incorporated by public agencies in Brazil, in a process that began with the holding of mega-events in the country – especially the Football World Cup in 2014 and the Olympic Games in 2016”, argue the federal public defenders of the DPU and members of CESeC, referring to the sophisticated and expensive facial recognition cameras, increasingly present in the urban landscape.

According to the researchers, there were at least 376 active facial recognition projects in Brazil as of April this year. Together, these projects can potentially monitor almost 83 million people, equivalent to 40% of the Brazilian population. They have already moved at least R$160 million in public investments, a figure calculated from the information that 23 of the 27 federative units provided to those responsible for the study. The four states that did not respond to the survey are Amazonas, Maranhão, Paraíba, and Sergipe.

“Despite this alarming scenario, regulatory solutions are lagging,” argue researchers from the DPU and CESeC, emphasizing the pressing need for laws to regulate the use of digital surveillance systems, particularly facial recognition cameras, in Brazil.

Furthermore, according to the experts, the absence of external control mechanisms, uniform technical-operational standards, and transparency in implementing the systems is a cause for concern. This lack of oversight increases the chances of serious errors, privacy violations, discrimination, and misuse of public resources, underscoring the need for more stringent control.

Errors

In another survey, CESeC mapped 24 cases between 2019 and April 2025 in which it identified flaws in facial recognition systems. As happened to 23-year-old personal trainer João Antônio Trindade Bastos, these flaws can lead to mistaken identifications.

In April 2024, military police removed Bastos from the stands of the Lourival Batista Stadium, in Aracaju (SE), during the final match of the Sergipano Championship. They took the young man to a room, where they searched him roughly. Only after checking Bastos’s documents, who had to answer several questions to prove he was who he said he was, did the police officers reveal that the facial recognition system implemented in the stadium had mistaken him for a fugitive.

Outraged, Bastos took to social media to denounce the injustice he had suffered. The repercussions of the case led the government of Sergipe to suspend the police's use of the technology, which, according to news reports at the time, had already been used to detain more than ten people.

Bastos is Black, like most people mistakenly identified by surveillance and facial recognition systems in Brazil and other countries. According to the report by the DPU and CESeC, there are indications that 70% of the world's police forces have access to some form of TRF and that 60% of countries use facial recognition in airports. In Brazil, "more than half of police stops using facial recognition resulted in mistaken identifications, highlighting the risk of wrongful arrests."

“Concerns about the use of these technologies are not unfounded,” warn the experts, citing international research that, in some cases, the error rates of the systems are “disproportionately high for certain population groups, being ten to 100 times higher for black, indigenous and Asian people compared to white individuals.” This finding prompted the European Parliament to warn in 2021 that “[t]he technical inaccuracies of Artificial Intelligence [AI] systems designed for remote biometric identification of individuals may lead to biased results and have discriminatory effects.”

Legislation

When addressing "institutional and regulatory challenges," the researchers point out that, in December 2024, the Senate approved Bill No. 2338/2023, which seeks to regulate the use of artificial intelligence, including biometric systems in public safety. To become law, the proposal must still be approved by the Chamber of Deputies, which created a special committee last month to debate the issue.

Moreover, the researchers at the DPU and CESeC argue that, although the bill proposes to prohibit the use of remote, real-time biometric identification systems in public spaces, the text approved by the Senate provides so many exceptions that, in practice, it functions "as a broad authorization for the implementation" of these systems.

“The categories of permissions [in the approved text] include criminal investigations, flagrant crimes, searches for missing persons, and recapture of fugitives, situations that cover a considerable spectrum of public security activities. Considering the history of abuses and the lack of effective control mechanisms, this openness to use maintains the possibility of a surveillance state and violation of rights.”

Recommendations

The researchers stress the urgency of a "qualified public debate", with the active participation of civil society, members of academia, and representatives of public oversight bodies and international organizations. They emphasize that this debate is crucial for shaping the future of facial recognition technology in Brazil and urge the public to participate actively.

They also recommend what they classify as "urgent measures", such as the approval of a specific national law to regulate the use of the technology; the standardization of protocols that respect due process of law; and the performance of independent, regular audits.

The experts also point to the need for public agencies to bring greater transparency to the contracts and databases used, ensuring the population's access to clear information about facial recognition systems, and to train the public agents who deal with the issue. They further suggest requiring prior judicial authorization for the use of information obtained through TRFs in investigations, setting a time limit for storing biometric data, and strengthening control over the private companies that operate these systems.

“We hope that these findings can not only guide and support the processing of Bill 2338 in the Chamber of Deputies but also serve as a warning for regulatory and control bodies to be aware of what is happening in Brazil. The report highlights racial bias in the use of technology, problems of misuse of public resources, and lack of transparency in its implementation,” says CESeC general coordinator Pablo Nunes.

Source: Agência Brasil
