Eagle Alpha Legal Wrap - June 2022

Eagle Alpha rounds up some of the most relevant legal and compliance articles surrounding the alternative data space over the past month.   

By Mikheil Shengelia

US

The SEC proposed rules that would require investment advisers and investment companies to provide standardized ESG disclosures to their investors and to the Commission itself. For example, registered funds would have to explain the role ESG factors play in their investment strategy in the prospectus. ESG-focused funds would have to provide additional detail, such as how ESG factors are incorporated into investment decisions and into proxy voting or company engagement. These proposed disclosures are similar to the EU regulatory reforms that came into effect last year. You can access the article here.

Connecticut became the fifth US state with a comprehensive consumer data privacy law as Governor Ned Lamont signed Senate Bill 6 into law. An Act Concerning Personal Data Privacy and Online Monitoring is modeled after the Colorado Privacy Act and will also go into effect on July 1st, 2023. One notable difference is greater children’s data privacy rights. The scope of Connecticut’s law is also narrower than Colorado’s, which covers any revenue realized from data sales. It is also important to note that personal data processed solely for payment transactions is excluded. You can access the article here.

The SEC filed a complaint in what can be considered the largest fraud since Madoff, accusing Germany’s Allianz SE of manipulating risk reports related to the firm’s Structured Alpha group of funds. Allianz’s US asset management unit also pleaded guilty, and the group will pay over 6 billion USD in fines and restitution. In one example detailed by the SEC, senior Allianz executives altered figures by directly changing digits, reducing the losses in an IDS stress-test output from –42.15% to –4.15%. You can access the article here.

Twitter agreed to settle with the Department of Justice and the Federal Trade Commission over data misuse. The company will pay 150 million USD to resolve allegations that data collected for account security purposes was actually used to send targeted ads to Twitter users. The government alleged that this practice went on from 2013 to 2019, with Twitter collecting and misusing the data of over 140 million people. Twitter is accused not only of violating the FTC Order and Section 5(a) of the FTC Act but also of misrepresenting its compliance with the EU-US and Swiss-US Privacy Shield Frameworks. You can access the article here.

Peter Green of Schulte Roth & Zabel offered his perspective on the Twitter settlement:

This fact pattern is critical to the analysis of alternative data provenance. Permission from the owner/creator of the data (the user of the app) for the app developer to use the data for other purposes, for example, to sell to hedge funds for financial analysis, is necessary for hedge funds to be comfortable that their use of the data does not violate US securities laws. If app developers are collecting data from users and then selling it to hedge funds without permission from app users, securities laws could be implicated.

UK

The Information Commissioner's Office fined Clearview AI Inc more than 7.5 million GBP for collecting and using images of people in the UK. The facial recognition company amassed over 20 billion images of people’s faces, along with data from social media websites and other publicly available sources, in pursuit of its goal of creating a global database. However, data from UK residents was collected without their consent, with John Edwards, UK Information Commissioner, explaining: “The company not only enables identification of those people but effectively monitors their behavior and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice”. You can access the article here.

Europe

The EU Digital Services Act is yet to receive its final vote, but it promises greater transparency around recommender algorithms. The principle of “algorithmic accountability” will require companies to open up and let regulators look under the hood at how user activity and personal information are used to generate recommendations. Furthermore, targeted ads will be banned from using certain types of sensitive information, such as sexual orientation, religion, and ethnicity. You can access the article here.

Europe’s new law will force secretive apps like TikTok to open up, with the European Commission’s goal being the reduction of online harms such as harassment, as well as potential influence on elections and other aspects of society. Margrethe Vestager, Executive Vice President of the European Commission, explained: “We ensure that platforms are held accountable for the risks their services can pose to society and citizens”. You can access the article here.

The European Commission published its proposal for the European Health Data Space (EHDS), the goal of which is to promote the safe exchange of patients’ data and give citizens control over their health data. It is worth noting that the EHDS will also encourage access to and use of health data for research, policy-making, and regulation. You can access the article here.

The European Data Protection Board (EDPB) announced the adoption of new guidelines on how to calculate administrative fines under the GDPR. The EDPB also introduced guidelines on the use of facial recognition technology in law enforcement. Both sets of guidelines are subject to public consultation until June 27th, 2022. You can access the article here.

China

Hong Kong’s Office of the Privacy Commissioner for Personal Data (PCPD) issued recommended model contractual clauses for cross-border transfers of personal data. The clauses cover situations where personal data is transferred outside Hong Kong or where the transfer takes place between entities outside Hong Kong that are controlled by a Hong Kong data user. Data users are also encouraged to adopt good data ethics by being transparent about their data processing activities; for example, data subjects should be notified of proposed cross-border transfers. You can access the article here.

Industry Commentary

Michalsons prepared a checklist for using alternative data lawfully: “1) Classify the alternative data into different categories of data, like personal data, special personal data, consumer credit data, health data or account numbers; 2) Determine what laws apply to each category of data, like data protection law or credit law; 3) Determine what the regulatory requirements are of that law and comply with them; 4) If there is no law that regulates how you can process a category of data, go for it”. You can access the full commentary here.
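
The checklist above is essentially a classify-then-lookup workflow. As a purely illustrative aid, the Python sketch below shows one way such a review could be organized in practice; the DatasetReview class, the category names, and the law mappings are hypothetical placeholders rather than legal guidance.

```python
# Minimal, illustrative sketch of the Michalsons checklist as a workflow.
# All category names and law mappings are hypothetical examples, not legal advice.

from dataclasses import dataclass, field

# Step 2 of the checklist: map each data category to laws that may apply.
CATEGORY_TO_LAWS = {
    "personal_data": ["data protection law (e.g. GDPR, state privacy acts)"],
    "special_personal_data": ["data protection law (heightened requirements)"],
    "consumer_credit_data": ["credit law"],
    "health_data": ["data protection law", "health data regulation"],
    "account_numbers": ["data protection law", "financial services rules"],
}

@dataclass
class DatasetReview:
    """Holds the classification of one dataset (step 1 of the checklist)."""
    name: str
    categories: list[str] = field(default_factory=list)

    def applicable_laws(self) -> dict[str, list[str]]:
        """Steps 2-4: for each classified category, list the laws to review.
        An empty list means no specific law was identified for that category."""
        return {c: CATEGORY_TO_LAWS.get(c, []) for c in self.categories}

if __name__ == "__main__":
    review = DatasetReview(
        name="example_transaction_panel",  # hypothetical dataset
        categories=["personal_data", "account_numbers"],
    )
    for category, laws in review.applicable_laws().items():
        if laws:
            print(f"{category}: review requirements under {', '.join(laws)}")
        else:
            print(f"{category}: no specific law identified")
```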

Derek E. Brink, Vice President & Research Fellow at Aberdeen Strategy & Research, on Twitter’s open-source algorithm: “The idea that algorithms should be open and transparent has been considered best practice for nearly 140 years. It’s called Kerckhoffs’ Principle, which holds that trying to keep the algorithms secret — which many refer to as “security by obscurity” — is the wrong approach to maintaining security. Instead, the algorithms themselves should be public knowledge”. You can access the full commentary here.
