Deep synthesis technology has recently become a regulatory target in China. On 28 January 2022, the Cyberspace Administration of China released the draft Provisions on the Administration of Deep Synthesis of Internet Information Services (the draft), which remained open for public comment until 28 February 2022. The explanatory notes accompanying the draft emphasised the urgent need to set "bottom lines" and "red lines" that would allow all stakeholders to understand the legal boundaries.
Deep synthesis technology can generate images, text, videos and audio content by using artificial intelligence. This technology has most famously been used to generate videos of former US President Barack Obama giving speeches he never delivered. The Chinese government has taken the view that:
deep synthesis . . . is also used by some criminals to produce, copy, publish, and disseminate illegal information, slander and degrade others' reputation, honour, impersonate others' identities to commit fraud and other illegal acts, [which can] not only damage the vital interests of the people, but even endanger national security and social stability.
The new draft might impact the following entities, among others, if it comes into effect:
- companies that provide photo manipulation apps;
- livestreaming and video conferencing tools that support visual enhancements;
- virtual reality massively multiplayer online role-playing games; and
- websites employing chatbots.
The draft targets online deep synthesis services, technical support for deep synthesis services, service users and service providers within China. The draft defines deep synthesis as "technology that uses a generation synthesis algorithm represented by deep learning and virtual reality to produce text, image, audio, video, virtual scene and other information". The definition is open-ended and is followed by a list of illustrative examples. Examples worthy of note include:
Face generation, face replacement, character attribute editing, face manipulation, gesture manipulation, and other technologies for generating or editing biometric features such as faces in images and video content . . . . Technologies for generating or editing virtual scenes such as 3D reconstruction.
These particular examples seem to cover important components of virtual reality, or the so-called "metaverse".
Under the draft, deep synthesis service providers and service users are obliged to:
abide by laws and regulations, respect social morality and ethics, adhere to the correct political direction, public opinion orientation, and value orientation, and promote deep synthesis services to be upward and good.
Deep synthesis service providers must:
- comply with information security obligations;
- establish management systems for processes such as:
- algorithm reviews;
- user registration and identity verification;
- automated and manual content reviews (input and output);
- data security;
- personal information protection;
- protection of minors;
- reporting of illegal activity;
- complaints handling; and
- education and training of practitioners; and
- put in place technical and organisational safeguard measures, compatible with technological development and new applications, to deal with illegal and harmful information and with bad actors, among other things.
The management of algorithms has recently been a focal point for regulators in China. The draft carries on this trend by stating that deep synthesis service providers must regularly review, evaluate and verify their algorithms. At the same time, those who provide tools – such as models and templates that involve biometric information or may involve national security or social and public interests – must conduct security self-assessments to prevent information security risks.
Personal information protection has gained prominence in China in recent years following several scandals that led to increasing public awareness of how personal information can be used (and abused) by some businesses. The draft proposes to build on the existing legal framework for personal information protection by pre-emptively targeting some of the applications of deep synthesis technologies. The draft requires that:
Where a deep synthesis service provider provides significant editing functions for biometric information such as face and human voice, it shall prompt the deep synthesis service user to notify and obtain the individual consent of the subject of the personal information being edited, unless otherwise provided by laws and administrative regulations.
Obtaining consent might be problematic because:
- deep synthesis technology allows for realistic human likenesses to be artificially generated, in which case service users might be asked to obtain the consent of a fictional person; and
- unless some form of verification is conducted, there is no way for a deep synthesis service provider to be sure that it has the consent of a personal information subject.
Deep synthesis services are required to add logos to the content they generate and keep generation logs to allow for content identification and source tracking.
Deep synthesis technologies may not be used to engage in various types of illegal activity, such as endangering national security, undermining social stability, disrupting social order, creating pornography, and infringing the rights and interests of others.
Other prohibitions relate to the Turing test. Deep synthesis services must prominently mark synthetic content as such, and unmarked content must be withdrawn immediately. Such marking would effectively prevent any deep synthesis service from passing the Turing test. This prohibition is hugely significant for protecting personal autonomy, combating crime and maintaining the perception of reality. It would be surprising if this prohibition were not included in the promulgated version of the draft, or if other jurisdictions did not follow suit.
Violations of the draft attract penalties ranging from 10,000 yuan to 100,000 yuan, as well as business suspension. Other punishments may also be issued where other laws, administrative regulations or departmental rules are violated.
For further information on this topic please contact Samuel Yang or Chris Fung at AnJie Law Firm by telephone (+86 10 8567 5988) or email ([email protected] or [email protected]). The AnJie Law Firm website can be accessed at www.anjielaw.com.