Automated disclosure is a species of automated decision-making governed by data privacy laws. Article 22 of the General Data Protection Regulation (GDPR) addresses this, and a similar provision appears in Section 16 of the Data Privacy Act (DPA).
Profiling is one type of automated processing of personal data; it occurs when such data is used to evaluate or analyze a person.
Automated disclosure may be operationalized under the DPA through data sharing carried out by an automated decision of a personal information processor. A recent issue concerning the automated disclosure of personal information arose from a press release issued by the National Privacy Commission (NPC). The NPC reported that several online lending apps contain a “shaming provision”: in case of default, the app automatically sends the borrower’s outstanding balance to the persons in his contact list. This resulted in several complaints being lodged with the NPC.
Under the GDPR
Article 22 of the GDPR provides that “[t]he data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” Mike Hintze of the University of Washington School of Law argues that the GDPR’s provisions on automated decision-making have a narrow scope grounded in fairness. Article 22 does not restrict all automated decision-making; it targets only those decisions that significantly affect a data subject’s legal rights or other critical aspects of his or her life. Hintze argues that an automatic disclosure of personal data would not be covered by Article 22, because several layers of other prior or subsequent decisions would intervene before any significant legal effect reaches the data subject.
Following the above discussion, Article 22 would apply only if the automated processing were the direct cause of the significant legal effects on the data subject. Applied to the automated disclosure made by lending apps, no harm would be attributed to the disclosure itself: any harm to the data subjects would arise from the acts of those who received the data, rather than from the automated disclosure.
Under the DPA
As stated earlier, Section 16 of the Data Privacy Act enumerates the rights of data subjects, among them the right to be informed of automated processes whereby the data will, or is likely to, be made the sole basis for any decision that significantly affects or will affect the data subject. The Implementing Rules and Regulations of the DPA expound on this right by requiring that the data subject be informed if his or her data will be used in automated decision-making. Related rights are the right to object to any automated processing of personal information and the right to be notified should the data processor adopt an automated process at a later date.
Based on the press release from the NPC, the lending apps in question requested access to certain information on the borrower’s smartphone, including his contacts. The apps gave no notice that the purpose of such access was to later send those contacts messages about the borrower’s non-payment. Under the DPA, the lenders may therefore be found guilty of a violation, since specific rights of the borrowers in relation to automated decision-making were disregarded. Furthermore, the DPA requires only a significant effect on the data subject, a lower threshold than what the GDPR requires for Article 22 to apply. Here, the complainants before the NPC allege such a significant effect: the emotional stress and embarrassment caused by the disclosure made to their contacts.