(The Real Agenda) Public spaces are under constant surveillance.
This hypervigilance is not always obvious as we walk through the streets, but it exists: beyond cameras, even wastebaskets and telephone booths extract data that can be used to identify people.
Often without any clear purpose, citizens' faces, habits and actions are consolidated in databases, and we have no control over the information the system collects about us.
For example, police activity around social demonstrations is organized not during the act itself but a posteriori, by analyzing the data collected at the protests: license plates, unique phone identifiers or social network analysis.
With these tools it is possible to determine who gets arrested and who does not, which changes the way people participate and conditions citizens' involvement in certain acts.
The terrorist threat seems a valid justification for governments to invest in technologies that make citizens feel more secure, but it is not.
In countries such as China, police use facial recognition glasses to identify suspects or verify identities. In Brazil, operations centers collect information on everything from transportation to sanitation, health and emergency services.
This hypervigilance is not confined to public spaces.
Every time we buy a smart device we expose ourselves to being recorded: information about our tastes and preferences is collected and profiles are built about us.
Instruction manuals have long warned us to be cautious about the conversations we hold in front of these devices.
Is all this surveillance helping to prevent crimes?
Algorithms cannot predict what will happen; they can only work from the data they receive and the mechanisms used to evaluate it.
Since "algorithms and technology cannot predict what will happen", it is only possible to "make an estimate for the future".
A future that balances security and data collection is one in which technology serves humanity's real social problems, and in which we do not develop technologies uncritically, without knowing why we develop them or what is done with our data.
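The distinction between predicting and estimating can be made concrete with a toy sketch. The function name, the data and the averaging method below are all hypothetical illustrations, not anything a real surveillance system is known to use:

```python
# Illustrative only: an algorithm cannot "predict" the future, but it can
# extrapolate an estimate from the data it has already received.

def estimate_next(counts: list) -> float:
    """Estimate the next value as the mean of past observations (naive)."""
    return sum(counts) / len(counts)

# Hypothetical historical data: incidents observed over four weeks.
weekly_incidents = [4, 6, 5, 5]

print(estimate_next(weekly_incidents))  # 5.0 -- an estimate, not a prediction
```

The output is only as good as the data fed in, which is exactly why the quality and provenance of collected data matter.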
A Spy in the Room
When we get home, we turn on the television and flip to a series or some other content of interest; dinner time arrives and we turn off the television without really being aware of what has been happening in the background.
The television has followed every step, recording which channels were selected and for how long (including streaming platforms), and combining all this information with personal sociological data.
Where does all this data end up? Most likely in the hands of companies that sell this information, which is worth gold to advertisers.
One of these companies is Samba TV, which monitors more than 30 million televisions in countries around the globe, accumulating user data in bulk.
Although largely unknown to the public, this company obtains detailed data on the habits of television users.
The issue is not so much whether collecting personal data is legal – after all, the user must accept the conditions – but how clearly users understand what they are accepting.
Consumer associations point to a confusing acceptance process that appears on televisions at first start-up: the user must accept a text that seems to offer nothing but advantages.
By accepting, you can "interact with favorite programs and connect the devices to receive special offers". But there is also fine print.
What kind of information is collected and what happens to it?
The consumer associations complain that to learn these details one must go online (the television provides no direct access to the text) and read documents of between 4,000 and 6,500 words.
A wall of legalese that few will read, apart from specialists in the field or those obsessed with their privacy.
These services use a system called Automatic Content Recognition (ACR), which tracks what appears on the screen at all times and how the user interacts with that content.
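To make the mechanism concrete, here is a minimal sketch of how ACR-style matching might work. Everything here is an assumption for illustration: real ACR systems use robust, proprietary audio/video fingerprints, not the exact frame hashes used below, and the function names and sample data are invented:

```python
import hashlib

def fingerprint(frame_pixels: bytes) -> str:
    """Reduce a captured frame to a compact fingerprint (here, a truncated SHA-256 hash)."""
    return hashlib.sha256(frame_pixels).hexdigest()[:16]

# Hypothetical database mapping fingerprints to known content,
# of the kind an ACR vendor would maintain server-side.
known_content = {
    fingerprint(b"frame-data-episode-1"): "Series X, episode 1",
    fingerprint(b"frame-data-ad-42"): "Advertisement #42",
}

def identify(frame_pixels: bytes) -> str:
    """Fingerprint a captured frame and look it up; return 'unknown' if unmatched."""
    return known_content.get(fingerprint(frame_pixels), "unknown")

print(identify(b"frame-data-ad-42"))   # matches the ad fingerprint
print(identify(b"frame-never-seen"))   # no match -> "unknown"
```

The point of the sketch is that the television never needs to "understand" the content: it only needs to send compact fingerprints of whatever is on screen to a server that holds the reference database.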
The purpose seems legitimate: knowing each user's habits makes it possible to suggest content matched to their tastes within an extensive and growing grid of programs.
But the problem lies not only in the accumulation of viewing-habit data, but also in a platform feature that links this information to the user's mobile devices, so that the data collected extends beyond the television set itself.
Can you give up this service and continue enjoying the television without major problems? Consumer Reports published a series of recommendations at the beginning of the year for deactivating such services.
In theory, it is enough to decline the privacy conditions during setup. However, platform updates may force the user through the process again, and on certain sets some users have reported problems with their equipment after refusing the privacy conditions.
Why do manufacturers include such invasive privacy services on their TVs?
Beyond the obvious added value of content recommendations, services such as Samba TV pay brands to come pre-installed on their sets, and with shrinking margins on hardware sales, the offer seems too tempting to refuse.