Recommendations to Protect Your Voice from Cloning

The SSPC issues advice on how to avoid falling victim to fraud through AI voice cloning: stay calm and always verify the source.



The use of Artificial Intelligence (AI) applications and the exchange of voice messages have become increasingly common. However, this practice carries security risks, as criminals can use voice cloning to commit fraud. For this reason, the Secretariat of Security and Citizen Protection (SSPC), through its General Directorate of Service Management, Cybersecurity, and Technological Development, has issued recommendations to prevent individuals' voices from being cloned for criminal purposes.

One concern is that current technology can clone a person's voice with great precision, imitating not only tone and intonation but also the pronunciation of individual words. According to a survey conducted by a technology security company, approximately 1 in 10 people has received fraudulent AI-generated voice messages, and 77% of those targeted have fallen victim to fraud in this manner.

This practice appeals to cybercriminals, who can obtain a person's contact information and then simulate an emergency to request money from that person's acquaintances. It is important to stay alert to these criminals' behavioral patterns, such as the use of alarming phrases and urgent tones intended to cause distress in the victim.

In light of this situation, the SSPC recommends the following measures to reduce the risk of falling victim to these crimes:

- Stay calm when faced with an emergency request.
- Verify the information through other means.
- Share information about these types of scams with family members.
- Activate additional security measures, such as two-step verification.

In conclusion, it is essential to stay informed about the risks of using your voice in digital environments and to take preventive measures against fraud and identity theft carried out through AI voice cloning.