Baidu Is Using Its Own Data to Measure China's Economy
Baidu Inc. has started using its own data to gauge China's economy, with the goal of constructing indicators that improve on those based on official government figures.
Gene Ekster, CFA, is a member of Eagle Alpha’s advisory board. He works with asset management firms and data providers in a consulting capacity, helping them integrate alternative data into the investment process.
Eagle Alpha has spent over two years refining proprietary indices built using a Principal Component Analysis (PCA) approach.
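To illustrate the general idea behind a PCA-based index (this is a generic sketch on synthetic data, not Eagle Alpha's proprietary methodology), the first principal component of several standardized data series can serve as a single composite index:

```python
import numpy as np

# Hypothetical example: combine several data series into one composite
# index via the first principal component. All data here is synthetic.
rng = np.random.default_rng(0)
series = rng.normal(size=(60, 4))  # 60 periods, 4 hypothetical series

# Standardize each series (zero mean, unit variance)
standardized = (series - series.mean(axis=0)) / series.std(axis=0)

# Eigen-decomposition of the covariance matrix
cov = np.cov(standardized, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# The first principal component is the eigenvector with the largest
# eigenvalue; projecting onto it yields one index value per period.
weights = eigvecs[:, -1]
index = standardized @ weights
print(index.shape)  # (60,)
```

The first component captures the largest share of common variation across the input series, which is why it is a natural candidate for a summary index.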
Tonia Ouellette Klausner, a partner with Wilson Sonsini Goodrich & Rosati and a member of its Internet Law and Strategy Group, explained how buyside firms use web crawling software and how the associated legal risks can be mitigated.
Eagle Alpha gathers alternative data sources to analyse the Auto Industry in several ways; some of these are described in our case study.
Daniel Goldberg is a member of Eagle Alpha’s advisory board. He ran TangentDS, a company that turned email receipts into actionable insights. Prior to that, Mr. Goldberg was an analyst at Hunter Global and Bear Stearns.
There is a wide variety of data relating to the Auto Industry, and not all of it is available through conventional channels.
Eagle Alpha published a white paper detailing case law regarding web crawling and emerging best practices and organized web crawling events in NYC (16 March 2016) and London (7 April 2016).
Eagle Alpha has released a new white paper, written by our advisory board member Gene Ekster, CFA, on the topic of web crawling and the associated compliance risks. The paper is split into five sections:
This article presents Foursquare’s new tool, Place Insights, which draws on all of the company’s foot traffic data. As we noted in our recent article “How Personal Data is Being Collected,” such data can be a powerful analytics tool.
Automated crawling of websites (also often referred to as “scraping” or “web-harvesting”) is ubiquitous. Internet search engines use “web crawler” programs (sometimes called “bots” or “spiders”) to automatically copy third-party websites in order to help people easily find sites of interest.
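The core of what a crawler does can be sketched in a few lines: fetch a page, extract its hyperlinks, and queue them for the next visit. The snippet below shows only the link-extraction step using Python's standard library (the HTML string is a made-up example; production crawlers also honour robots.txt, rate-limit requests, and deduplicate URLs):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

# Minimal sketch of a crawler's link-extraction step: parse a fetched
# page's HTML and collect the hyperlinks a spider would visit next.
class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical page content for illustration
html = '<a href="/about">About</a> <a href="https://example.org/">Home</a>'
parser = LinkExtractor("https://example.com/")
parser.feed(html)
print(parser.links)
# ['https://example.com/about', 'https://example.org/']
```

A search engine's spider repeats this fetch-parse-queue loop across the web, which is how it builds the copies of third-party sites described above.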