HFN Technology & Regulation Client Update
April 2017

Dear Clients and Friends,

We are pleased to introduce you to our April edition of the Technology & Regulation Client Update, which includes a variety of industry and regulatory developments in the fields of technology compliance, digital advertising, privacy protection and content. In this update you can read about:

The Massachusetts Attorney General's settlement with an advertising company prohibiting geofencing around healthcare facilities;
The new consumer choice tools for setting preferences about digital advertising data collection and use, presented by the NAI and the DAA;
The FTC's enforcement action concerning inadequate disclosures by influencers and marketers;
The OCR's HIPAA enforcement action regarding impermissible disclosure of unsecured electronic protected health information;
A new US court ruling which determined that avatar calls are prerecorded messages under the Telemarketing Sales Rule;
The UK ICO's new guidance on profiling and automated decision-making, as well as on big data, artificial intelligence, machine learning and data protection under the impending EU GDPR regime; and
The European Commission's proposal for a pilot project on blockchain in order to improve its regulation.

Kind regards,
Ariel Yosefi, Partner
Co-Head - Technology & Regulation Department
Herzog Fox & Neeman

If you have an important regulatory or industry compliance update you would like to share with the industry, please let us know.

Digital Advertising Company Prohibited from Geofencing near Healthcare Facilities
TOPICS: Digital Advertising Compliance, Geofencing, Massachusetts Attorney General, United States

Earlier this month, the Massachusetts Attorney General’s office announced that it had reached a settlement with a digital advertising company, Copley Advertising, LLC, following allegations that it was using mobile geofencing technology to target ads to women entering reproductive health facilities.
The settlement prohibits the company from using this technology at or near Massachusetts healthcare facilities to infer the health status, medical condition, or medical treatment of any individual.

In its advertising campaign, the company set mobile geofences at or near reproductive health centers and methadone clinics in various cities. When a consumer entered the geofenced area near these locations, the company tagged the consumer’s device ID and served advertisements to the consumer’s device for up to 30 days. The advertisements included text such as "Pregnancy Help", "You Have Choices", and "You’re Not Alone" that, if clicked, took the consumer to a webpage with information about abortion alternatives as well as access to a live web chat with a "pregnancy support specialist". The advertising company has denied engaging in geofencing campaigns around reproductive health clinics in Massachusetts, even though it has the technical ability to do so.

The settlement, reached through an Assurance of Discontinuance filed at the beginning of the month in the Suffolk Superior Court, resolves allegations that the company's practices would violate Massachusetts consumer protection laws by tracking a consumer’s physical location near or within medical facilities, disclosing that location to third-party advertisers, and targeting the consumer with potentially unwanted advertising based on inferences about his or her private, sensitive, and intimate medical or physical condition, all without obtaining the consumer’s consent.

This settlement is an important reminder for digital advertising companies to consider the privacy implications of targeted advertising, whether in geofencing or other digital marketing strategies, and how privacy and wider consumer protection laws might apply.
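The geofencing mechanism described above – testing whether a device's reported coordinates fall inside a circular fence around a location – can be sketched as a simple point-in-radius check. The coordinates, radius and function names below are illustrative assumptions for explanation only, not details taken from the settlement.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def in_geofence(device_lat, device_lon, fence_lat, fence_lon, radius_m):
    """True if the device's reported position falls inside the circular fence."""
    return haversine_m(device_lat, device_lon, fence_lat, fence_lon) <= radius_m

# Hypothetical 100 m fence around an arbitrary point (coordinates are illustrative)
fence_lat, fence_lon, radius_m = 42.3601, -71.0589, 100
print(in_geofence(42.3605, -71.0589, fence_lat, fence_lon, radius_m))  # device ~45 m away -> True
```

Once a device trips such a check, an ad platform can tag its advertising identifier and retarget it – which is precisely the practice the Assurance of Discontinuance restricts around healthcare facilities.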
It is expected that other federal and state regulators in the US, including other Attorneys General, might start to intervene in this area (see our previous Client Update, which reported on the US Federal Trade Commission's ("FTC") serious enforcement action against mobile ad network InMobi, which involved a $950,000 fine and a 20-year compliance monitoring plan).

The NAI and the DAA Unveiled New Versions of Consumer Choice Tools
TOPICS: Adtech Industry Compliance, Control and Choice, Digital Advertising, Interest-Based Advertising, The Network Advertising Initiative, The Digital Advertising Alliance

The Network Advertising Initiative ("NAI") and the Digital Advertising Alliance ("DAA") have recently jointly launched new versions (in beta) of their consumer choice tools for setting preferences about digital advertising data collection and use. Improvements to the NAI tool and the DAA tool include, inter alia, an enhanced user experience; the ability for companies to easily disclose to consumers their use of both cookie-based and non-cookie technologies for digital interest-based advertising ("IBA"); and controls for users to opt out of such use. In particular, the tools offer a significantly improved consumer experience, including a simplified, mobile-responsive interface; a reduced need to modify browser settings for successful opt-outs; and a real-time status check which reports the use of both cookie-based and non-cookie technologies.

Moreover, the improvements in these tools provide increased transparency into emerging data practices, regardless of technology, and offer additional options for companies to comply with existing DAA standards concerning the use of non-cookie technologies. The new versions of the tools are the result of an ongoing collaboration between the two industry-leading self-regulatory organizations for digital advertising, and an important step forward in providing consumer notice and choice.
The FTC Warns Influencers and Brands to Clearly Reveal their Relationship
TOPICS: Influencer Marketing, Online Advertising and Marketing, Instagram, Federal Trade Commission, United States

After reviewing a great number of Instagram posts by celebrities, athletes, and other influencers, the FTC has recently announced that it has sent out more than 90 letters reminding influencers and marketers that influencers should clearly and conspicuously disclose their relationships to brands when promoting or endorsing products through social media (in this regard, you can also see our special Client Update titled "Influencer Marketing: Rules of Engagement"). These letters mark the first time that FTC staff has reached out directly to educate social media influencers.

In addition to providing background information on how and when marketers and influencers should reveal a "material connection" in an advertisement (in this regard, see the FTC’s Endorsement Guides, which apply to both marketers and endorsers), the letters each addressed one point specific to Instagram posts: users viewing those posts on mobile devices usually see only the first three lines of a longer post unless they click "more", which many might not do. The staff’s letters clarified to the recipients that when making endorsements on Instagram, they should reveal any material connection above the "more" button.

Additionally, the letters noted that when multiple tags, hashtags, or links are used, users might simply skip over them, particularly when they appear at the end of a long post. Accordingly, a disclosure placed in such a string is not likely to be conspicuous. Some of the letters also addressed disclosures that are not sufficiently clear, indicating that many users will not understand a disclosure like "#sp", "Thanks [Brand]", or "#partner" in an Instagram post to mean that the post is sponsored.
In addition to publishing its endorsement guide titled "The FTC’s Endorsement Guides: What People Are Asking" in May 2015 (see our related Client Update), the FTC has previously addressed the necessity for endorsers to adequately reveal connections to brands through law enforcement actions and the staff’s business education efforts (as we reported in our March 2016 Client Update regarding the Lord & Taylor and Machinima cases, our July 2016 Client Update concerning the Warner Bros. Home Entertainment case, and our August 2016 Client Update about the FTC's regulatory scrutiny of this issue).

Finally, in its blog post, the FTC recommended three main steps that influencers and marketers should take to ensure the effectiveness of disclosures on Instagram: keep your disclosures unambiguous; make your disclosures hard to miss; and avoid #HardtoRead #BuriedDisclosures #inStringofHashtags #SkippedByReaders.

We would be happy to provide further advice and recommendations concerning the required steps to ensure compliance with the applicable obligations and their scope.

The OCR Settled a HIPAA Enforcement Action for Impermissible Disclosure of Unsecured ePHI
TOPICS: Electronic Protected Health Information, Privacy, Health Insurance Portability and Accountability Act, Department of Health and Human Services, Office for Civil Rights, United States

The US Department of Health and Human Services ("HHS") Office for Civil Rights ("OCR") has recently announced a Health Insurance Portability and Accountability Act ("HIPAA") settlement based on the impermissible disclosure of unsecured electronic protected health information ("ePHI") (see also our previous Client Update, which reported on the OCR's first HIPAA enforcement action for untimely reporting of a breach of unsecured PHI). In January 2012, CardioNet reported to the OCR that a workforce member’s laptop, which contained the ePHI of 1,391 individuals, had been stolen from a parked vehicle outside the employee’s home.
The OCR’s investigation into the impermissible disclosure concluded that the company had insufficient risk analysis and risk management processes in place at the time of the theft. In addition, the company's policies and procedures implementing the standards of the HIPAA Security Rule were in draft form and had not been implemented. Moreover, the company was unable to produce any final policies or procedures concerning the implementation of safeguards for ePHI, including those for mobile devices. CardioNet has agreed to settle potential non-compliance with the HIPAA Privacy and Security Rules by paying $2.5 million and implementing a corrective action plan. This settlement is the first involving a wireless health services provider, as CardioNet provides remote mobile monitoring of, and rapid response to, patients at risk for cardiac arrhythmias.

Finally, the HHS has compiled information and tips intended to help protect and secure health information when using mobile devices. Recommended steps from the HHS in this regard include the following: use a password or other user authentication; install and enable encryption; install and activate remote wiping or remote disabling; disable, and do not install or use, file sharing applications; install and enable a firewall; install and enable security software; keep your security software up to date; research mobile applications (apps) before downloading; maintain physical control of the device; use adequate security to send or receive health information over public Wi-Fi networks; and delete all stored health information before discarding or reusing the mobile device.
US District Court Upheld the FTC Staff Opinion that Avatar Calls are Subject to Prerecorded Message Regulations
TOPICS: Robocall Telemarketing, Soundboard Technology, Telemarketing Sales Rule, Federal Trade Commission, District of Columbia District Court, United States

The US District Court for the District of Columbia has recently upheld an opinion letter released by the FTC staff in November 2016 which expanded robocalling limitations to telemarketing calls that use so-called soundboard technology or "avatars" (see also our previous Client Update, which reported on the FTC's enforcement action against two massive illegal robocall telemarketing operations). This technology typically allows a live agent to communicate with a call recipient by playing recorded audio snippets rather than using his or her own live voice.

In September 2009, the FTC staff had asserted in another opinion letter that avatar calls were not considered prerecorded messages under the Telemarketing Sales Rule ("TSR"). Nevertheless, in 2016 the FTC reversed its position on this issue and revoked its former opinion letter. Accordingly, the FTC's current position is that outbound telemarketing calls which utilize avatars are subject to the TSR’s prerecorded call provisions. Consequently, companies that use soundboard technology will need to obtain prior written consent and to comply, inter alia, with the prerecorded message requirements under the TSR, which will take effect as from 12 May 2017.

We would be happy to provide further advice and recommendations concerning the required steps to ensure compliance with the applicable obligations and their scope.
The UK ICO Published an Updated Paper on Big Data, Artificial Intelligence, Machine Learning and Data Protection
TOPICS: Big Data, Artificial Intelligence, Machine Learning, General Data Protection Regulation, Information Commissioner’s Office, United Kingdom, European Union

The UK Information Commissioner’s Office ("ICO") has recently released updated guidance on big data, artificial intelligence, machine learning and data protection. This guidance provides helpful emphasis on accountability, transparency and how to evidence compliance with the General Data Protection Regulation ("GDPR"), which is due to apply as from 25 May 2018. For further details and recommendations published by us on the GDPR, see our related special Client Update titled "How to prepare to the new EU General Data Protection Regulation", as well as our recent GDPR Compliance Playbook – a practical guide which highlights the most important actions that organizations should take in preparing to comply with the new regime.

In its discussion paper, the ICO provides, among other things, six key recommendations which it believes will assist organizations in achieving compliance in the big data area:

Anonymization: organizations should carefully consider whether the big data analytics to be undertaken actually require the processing of personal data. Often, this will not be the case; in such circumstances organizations should use appropriate techniques to anonymize the personal data in their dataset(s) before analysis;

Privacy Notices: organizations must be transparent about their processing of personal data by using a combination of innovative approaches in order to provide meaningful privacy notices at appropriate stages throughout a big data project.
This may include the use of icons, just-in-time notifications and layered privacy notices;

Privacy Impact Assessments: organizations should embed a privacy impact assessment framework into their big data processing activities to help identify privacy risks and assess the necessity and proportionality of a given project. The privacy impact assessment should involve input from all relevant parties, including data analysts, compliance officers, board members and the public;

Privacy by Design: organizations should adopt a privacy by design approach in the development and application of their big data analytics. This should include implementing technical and organizational measures to address matters including data security, data minimization and data segregation;

Ethical Approaches: organizations are encouraged to develop ethical principles to help reinforce key data protection principles. Employees in smaller organizations should use these principles as a reference point when working on big data projects, while larger organizations should create ethics boards to help scrutinize projects and assess complex issues arising from big data analytics; and

Algorithmic Transparency: the paper recommends that organizations implement innovative techniques to develop auditable machine learning algorithms. Internal and external audits should be undertaken with a view to explaining the rationale behind algorithmic decisions and checking for bias, discrimination and errors.

According to the ICO, the paper does not mark the end of its work on big data, and it intends to continue this work through several current and planned activities in the field. Moreover, the ICO stresses that in the area of big data, artificial intelligence and machine learning it will continue to exercise its powers in response to breaches of data protection legislation, including, inter alia, through enforcement notices and monetary penalty notices.
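To make the "checking for bias" element of an algorithmic audit concrete, one simple technique is to compare selection rates between demographic groups – the so-called disparate impact ratio, with the "four-fifths" threshold borrowed from US employment practice as a common rule of thumb. The ICO paper does not prescribe this particular method; the sketch below, including the group labels and threshold, is our own illustrative assumption.

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs; returns rate per group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, picked in decisions:
        totals[group] += 1
        if picked:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions, protected, reference):
    """Protected group's selection rate divided by the reference group's.
    Ratios well below 1.0 (e.g. under the 0.8 rule of thumb) flag the
    outcome for closer human review; they do not by themselves prove bias."""
    rates = selection_rates(decisions)
    return rates[protected] / rates[reference]

# Hypothetical audit log of (group label, algorithmic decision) pairs
audit = ([("A", True)] * 40 + [("A", False)] * 60 +
         [("B", True)] * 20 + [("B", False)] * 80)
print(disparate_impact_ratio(audit, protected="B", reference="A"))  # 0.2 / 0.4 = 0.5
```

A regular internal audit of this kind, with results documented alongside the rationale for individual decisions, is one way to evidence the accountability the ICO emphasizes.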
Accordingly, companies should ensure that they consider the recommendations which the ICO sets out in its paper. Finally, this discussion paper precedes both the ICO’s own GDPR guidance on profiling and the EU Article 29 Working Party's ("WP29") guidance on consent and profiling (which will supplement its previous guidance on Data Portability, Data Protection Officers and the One Stop Shop), both of which are expected to be released in the first half of 2017 (see our previous related Client Update).

The UK ICO Requested Feedback on GDPR Profiling and Automated Decision-Making Issues
TOPICS: Profiling, Automated Decision-Making, General Data Protection Regulation, Information Commissioner’s Office, United Kingdom, European Union

The ICO has recently issued a discussion paper, which requests feedback by 28 April 2017, on the GDPR rules on profiling and automated decision-making. According to the ICO, the paper does not set out guidance but rather constitutes “initial thoughts” on this topic under the GDPR, which introduces stricter provisions to protect individuals from this type of data processing. The ICO’s paper highlights the key areas of profiling which it considers need further consideration, and poses ten questions, including the following:

How will you ensure that the profiling you carry out is fair, not discriminatory, and does not have an unjustified impact on individuals’ rights?

How will you ensure that the information you use for profiling is relevant, accurate and kept for no longer than necessary? What controls and safeguards do you consider you will need to introduce, internally and externally, to satisfy these particular requirements?

Have you considered what your legal basis would be for carrying out profiling on personal data? How would you demonstrate, for example, that profiling is necessary to achieve a particular business objective?

How do you mitigate the risk of identifying special category personal data from your profiling activities?
How will you ensure that any ‘new’ special category data is processed lawfully in line with the GDPR requirements?

How do you propose handling the requirement to provide relevant and timely fair processing information, including "meaningful" information on the logic involved in profiling and automated decision-making? What, if any, challenges do you foresee?

If someone objects to profiling, what factors do you consider would constitute "compelling legitimate grounds" for the profiling to override the "interests, rights and freedoms" of the individual? and

Will your organization be affected by the GDPR provisions on profiling involving children’s personal data? If so, how?

The ICO's paper emphasizes that the GDPR will require organizations to reevaluate their approach to obtaining consent and to using consent as a legal basis for data processing. We would be happy to provide further advice and recommendations concerning the required steps in order to ensure compliance with the applicable obligations and their scope. On 25 May 2017 – just one year before the GDPR begins to apply – we will be hosting a special workshop at HFN in which we will discuss the practical aspects of GDPR compliance, including with respect to the new consent requirements.

The European Commission's Proposed Pilot to Improve Blockchain Regulation
TOPICS: Blockchain, FinTech, Regulation, European Commission, European Union

The European Commission has recently proposed a pilot project on blockchain with the intention of improving its regulation (see also our previous Client Update, which reported on the US Financial Industry Regulatory Authority's new report on the use and potential implications of blockchain in the securities industry).
In a draft communication addressed to the European Parliament, the Council, the European Central Bank, the European Economic and Social Committee and the Committee of the Regions, the European Commission stated that the pilot project would focus in particular on improving knowledge and creating awareness of the technology among European regulators. According to the draft communication, the European Commission has also launched an internal FinTech Task Force, which involves all relevant services working on financial regulation, technology, data and competition, to ensure that its assessment reflects the multi-disciplinary approach required by FinTech developments. Additionally, the European Commission is launching a public consultation in order to receive input from stakeholders to further develop its policy approach towards technological innovation in financial services.