Between Technological Progress and Cyber Threats

This article looks at the phenomenon of deepfakes: technologies that use artificial intelligence to create realistic fake images, audio and video.

Jun 16, 2024 - 12:58
Photo: orda.kz

Deepfakes have become a serious problem in today's world due to the rapid development of artificial intelligence technologies. These technologies make it possible to create strikingly realistic fake photos, audio and videos that can be used for a variety of fraudulent purposes.

One example is the use of the faces and voices of well-known figures, such as the President of Kazakhstan, Kasym-Jomart Tokayev, to create fake videos intended to deceive. In such videos, Tokayev might appear to endorse financial projects or promise compensation to victims of fraudulent schemes. These videos can be used in phishing attacks, misleading users and swindling them out of money through fake investment platforms or websites.

Beyond political figures, deepfakes also target personalities from business, finance and other fields. Fake pornographic images and videos created using the faces of well-known women are another serious threat, as they can lead to virtual sexual exploitation and privacy violations.

To combat deepfakes, it is important to develop technologies that detect and recognise counterfeit material and to raise public awareness of the risks. Legislative measures are also needed to prevent and penalise the use of deepfakes for unlawful purposes.
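As a rough illustration of what such detection technology can look like, below is a minimal sketch of an image-based classifier: a standard convolutional backbone with a binary real/fake head. This is an assumption for illustration only, not the approach used by any particular detection system; the model is untrained here, the function name `fake_probability` and the file `example.jpg` are hypothetical, and a real detector would need to be trained on a large labelled dataset of authentic and manipulated media.

```python
# Illustrative sketch only: a CNN backbone (ResNet-18) with a 2-class head
# for "authentic" vs. "manipulated" images. The weights of the new head are
# untrained, so the output is meaningless until the model is fine-tuned on
# labelled real/fake data.

import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet-style preprocessing expected by the backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Pretrained ResNet-18 with its final layer replaced by a binary head
# (class 0 = authentic, class 1 = manipulated).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Linear(model.fc.in_features, 2)
model.eval()

def fake_probability(image_path: str) -> float:
    """Return the model's estimated probability that an image is manipulated."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
        probs = torch.softmax(logits, dim=1)
    return probs[0, 1].item()

if __name__ == "__main__":
    # "example.jpg" is a placeholder path for demonstration purposes.
    print(f"P(manipulated) = {fake_probability('example.jpg'):.3f}")
```

Production detectors are considerably more involved, often combining face-specific features, frame-by-frame video analysis and audio checks, but the basic idea of training a classifier to separate genuine from synthetic material is the same.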

Thus, while deepfakes may have positive applications in digital art and commerce, their potential negative effects on personal safety and trust must be considered and addressed proactively.

