The Ministry of Digital Development, Communications and Mass Media of the Russian Federation proposes to tighten the requirements for the circulation of citizens' data
Oleg Ivanov, Deputy Minister of Digital Development, stated that he had signed the document and was going to submit it to the government. He made the announcement at a meeting of the working group on improving legislation in the field of personal data and the accessibility of digital technology.
The draft regulates the handling of depersonalized data. For example, the operator will no longer be able to use additional information that helps determine whether personal data belongs to a particular subject. It will also be prohibited to transfer to third parties, alongside depersonalized data, any information that would allow a particular person to be identified. Reversing depersonalization will be prohibited, except in cases where it is necessary to protect a person's life or health.
Ivanov said the new requirements for handling depersonalized data are prompted by the fact that technology already exists both to depersonalize data and to identify a specific person from it: "Depersonalization and anonymization are different concepts. De-anonymization is reversible."
The use of depersonalized personal data without consent will only be allowed for research and statistical purposes, clarified the Ministry of Digital Development, Communications, and Mass Media.
Members of the working group believe that the draft, in its current wording, does not solve the problem, leading instead to a tightening of the legal regime. They claim it does not create additional opportunities for the processing of depersonalized data, which is widely used by businesses, particularly to target advertising.
“When an array of depersonalized data is transferred to a third party, processing it for purposes other than those for which it was collected is impossible, because that requires new consent from the citizen. Yet there is no way to request such consent, because the data are depersonalized,” explains Ekaterina Batmanova of the press service of ANO Digital Economy. She believes the bill contains excessive requirements for identifying the subject of personal data: while its previous version allowed consent to the processing of personal data without specifying a name (using an e-mail address or phone number), the current version requires the user to provide a name to receive certain services.
“If these requirements are adopted, citizens will not use, say, Russian dating services, since they will not want to enter their full name there. As a result, the amendments to the law will not strengthen protection but, on the contrary, will lead to an outflow of our citizens’ data abroad,” Batmanova said.
Global practice offers a number of depersonalization methods, and combinations of them, that allow for secure data protection, says Anna Serebryanikova, President of the Big Data Association: “The level of depersonalization is estimated by a special model based on the risk of repeated identification. If the data can still be used to identify its owner, such depersonalization methods have a low coefficient: from 0 to 0.4. If, however, it is impossible to identify the data owner without significant effort after depersonalization, the coefficient is higher: from 0.5 to 0.8.” If the identity of the subject of personal data can be established only through extremely time-consuming and expensive procedures, the data are considered depersonalized, and the methods by which they were “cleared” of personal elements receive a coefficient between 0.8 and 1, the maximum rating for the degree of depersonalization, Serebryanikova adds.
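The banded scale Serebryanikova describes can be sketched as a simple classifier. This is a minimal illustration only: the function name and the exact treatment of band edges are assumptions, since the article quotes the ranges (0–0.4, 0.5–0.8, 0.8–1) without specifying a formal model.

```python
def depersonalization_grade(risk_coefficient: float) -> str:
    """Map a depersonalization method's coefficient to the bands
    quoted in the article (an illustrative sketch, not an official model).

    A higher coefficient means re-identifying the data owner is harder.
    """
    if not 0.0 <= risk_coefficient <= 1.0:
        raise ValueError("coefficient must lie in [0, 1]")
    if risk_coefficient < 0.5:
        # 0 to 0.4: the data can still be used to identify its owner
        return "low"
    if risk_coefficient < 0.8:
        # 0.5 to 0.8: identification requires significant effort
        return "medium"
    # 0.8 to 1: identification only via extremely time-consuming and
    # expensive procedures; the data are considered depersonalized
    return "high"
```

For example, a method scored at 0.3 would fall in the low band, while one scored at 0.9 would receive the maximum "depersonalized" rating.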
The bill proposes changes to federal law that would largely eliminate the difference in legal relations arising from the collection, processing, and use of personal and depersonalized data, says Yury Fedyukin, Managing Partner of the law firm Enterprise Legal Solutions. “Equating depersonalized data with personal data means that the operator will be required to take all measures to protect the former, just as with regular personal data,” agrees Anastasia Fedorova, leading expert of the Information Security department at the IT company CROC.
Fedyukin warns that equating depersonalized data with personal data creates new grounds for holding businesses liable for violating the laws governing data collection, storage, processing, and distribution: “Formally, this means that any attempt to use depersonalized data, i.e. an array containing information about many individuals, will be heavily penalized if even one of them has not authorized its use for a particular purpose or has withdrawn consent.” This will make the work of most companies whose business in one way or another involves big data, as well as of government agencies, an order of magnitude more difficult, Fedyukin concludes.
The concept of obtaining consent to depersonalization is inefficient and will lead to the suspension of many projects involving big data processing. After all, the purpose of erecting an additional artificial barrier is unclear when consent to the processing of personal data has already been obtained, argues Anna Aibasheva of VimpelCom: “The proposed regulation will significantly limit the development of the digital economy in Russia and the introduction of new services and technologies, and will significantly slow the transition to a new technological paradigm.”