COMPARATIVE ANALYSIS OF FOREIGN PRACTICES IN THE LEGAL REGULATION OF DEEPFAKES: BALANCING INNOVATION, HUMAN RIGHTS PROTECTION, AND NATIONAL SECURITY
DOI: https://doi.org/10.52026/2788-5291_2026_81_1_263

Keywords: deepfakes, legal regulation, artificial intelligence, synthetic content labeling, national security

Abstract
This article provides a comparative analysis of models of deepfake regulation in the United States, the European Union, China, and the Republic of Korea, with the aim of identifying optimal approaches for developing an effective legal regulation policy in Kazakhstan. Drawing on regulatory documents, doctrinal sources, and statistical data, the article examines the principal threats associated with the dissemination of synthetic media content: interference in democratic processes, reputational risks, financial fraud, and violations of privacy and personal data protection. Particular attention is paid to the fragmented state-level solutions in the United States, the European model of preventive transparency (the GDPR, the AI Act, and the Digital Services Act), the technology-focused norms characteristic of Asia and most pronounced in China, and the criminal law protections of the Republic of Korea. The comparison identifies key elements of effective regulation, including mandatory labeling of synthetic content, technical and organizational requirements for AI developers, transparent moderation mechanisms, expanded rights for data subjects, and differentiated responsibilities for providers. Proposals are formulated for the development of Kazakhstani legislation in light of the adoption of the Law of the Republic of Kazakhstan "On Artificial Intelligence" (2025). Clarifying moderation mechanisms, enshrining special rights for data subjects, and specifying platform responsibilities would align Kazakhstani regulation of synthetic media content with best international practice, harmoniously combining innovation, human rights protection, and guarantees of national security.