How Macro Funds and CTAs Approach Alternative Data

Background

Macro funds and Commodity Trading Advisors (CTAs) have evolved rapidly, adapting to changes in financial markets and leveraging new sources of information to gain a competitive edge. In 2022, macro funds enjoyed some of their strongest performance in years, with the HFRI 500 Macro Index gaining 14.2% for the year. The Citadel Wellington fund was among the top performers, along with Brevan Howard, Rokos Capital Management, Haidar Jupiter, and Contrarian Macro Fund.

The use of alternative data by these funds has also grown over the years, as reflected in our latest buyside surveys. COVID-19, government lockdowns, and supply chain disruption posed significant challenges to company and economic forecasting models, which drove higher demand for inflation, pricing, and supply chain insights. While similar in their alternative data use cases, macro funds usually trade across multiple asset classes, including equities, fixed income, currencies, and commodities. CTAs, on the other hand, focus on trading in the futures markets, including commodities, currencies, and interest rates.

Alternative data sources such as satellite imagery, credit card transaction data, and weather data can provide insights into consumer behavior, economic activity, and other factors affecting global asset prices. For example, satellite imagery can provide insights into crop yields, energy consumption, and retail foot traffic, which can inform trading decisions in commodity markets. Similarly, social media sentiment analysis can provide real-time insights into public sentiment and consumer behavior.

Macro funds and CTAs can use these new sources to augment their market analysis, modeling, predictive analytics, and trade execution. Winton, the UK-based CTA, is considered a pioneer in implementing a statistical and mathematical approach to research and trading, with senior management referring to the firm as a “scientific research organization which happens to have a highly profitable computer attached.” Winton is not the only fund relying on alternative data sources for a holistic approach to macro investing; see Section 4 for a breakdown of funds using alternative data for macro investing.

Structure of the Paper

This paper provides a breakdown of key alternative data developments important for investors and pulls together publicly available information from desk research, Eagle Alpha’s proprietary articles and webinars, and interviews with industry experts. We have grouped our insights into four key sections:

  • The evolution of macro and CTA funds
  • Team and workflow
  • Challenges and solutions
  • Alternative data case studies

Section 1: The Evolution of Macro and CTA Funds

The origins of macro and CTA funds can be traced back to the 1960s and 1970s when pioneers like George Soros and Paul Tudor Jones began using macroeconomic analysis and trend-following strategies to generate returns. These early funds focused on trading in currencies, commodities, and interest-rate futures. In the 1980s and 1990s, macro and CTA funds expanded their strategies and asset classes and began trading in equities, fixed income, and options markets, broadening their scope beyond the traditional futures markets. This diversification allowed them to capture opportunities across various asset classes and mitigate risks through portfolio diversification.

The 2000s witnessed a rapid increase in the use of technology and quantitative models in macro and CTA funds. Advances in computing power, data availability, and algorithmic trading techniques enabled funds to process vast amounts of data, implement complex trading strategies, and execute trades more efficiently. This led to the rise of systematic and algorithmic trading approaches for macro funds and CTAs.

Macro funds can therefore be classified in a variety of ways, including by style (discretionary or systematic), type of analytical input (fundamental or technical), and investment time horizon (short-term or long-term). Some managers apply a bottom-up micro-analysis in specific markets or regions. Figure 1 below shows an example of return decomposition from Morgan Stanley’s overview of global macro strategies. The managers highlighted seek to exploit various inefficiencies across different markets:

  • Investor behavioral biases
  • Reduced consensus opinion on market directionality
  • Monetary policy errors causing disequilibria in the financial markets
  • Non-economic investment activities (central bank currency interventions, sovereign debt rating downgrades, political events)
  • Divergent global business cycles
Figure 1: Return Decomposition (Source: Morgan Stanley)

In 2017, it was reported that Tudor Investment Corp. invested in CargoMetrics Technologies, a data analytics company using shipping and satellite data, and Numerai, an AI-run crowdsourced hedge fund. Figure 2 below shows the results of Eagle Alpha’s vendor survey indicating that macro and multi-strategy firms are consistently trialing and purchasing data from alternative data providers.

Figure 2: Macro Funds Increasingly Trial Alternative Datasets (Source: Eagle Alpha’s Survey)

Section 2: Team and Workflow

Identifying, hiring, and retaining the right talent is essential for macro and CTA funds. According to Kenneth Tropin, Founder and Chairman of Graham Capital Management, it is important to consider the following factors: 1) a PM’s track record based on a consistent and replicable strategy; 2) a well-defined trading methodology; 3) a detailed and well-constructed view of market dynamics; 4) low correlation of returns to various market indices or instruments.

In early 2015, Bridgewater created a team of programmers specializing in analytics and artificial intelligence, called the Systematized Intelligence Lab. This lab built a tool that incorporates peer-to-peer ratings into an organized ranking system showing employees’ strengths and weaknesses. Another app, called “The Contract”, asks staff to set goals they want to achieve and then tracks how well they follow through. Bridgewater is known to be very secretive about what it is working on internally, but we understand that a lot of testing around big data and alternative data happens in its Innovation Lab.

According to the Wall Street Journal, in 2016 Bridgewater was developing a management engine called PriOS (also referred to as “The Book of the Future”) that aims to control three-quarters of all management decisions within five years. The kinds of decisions PriOS could make include finding the right staff for particular job openings and ranking opposing perspectives from multiple team members when there is a disagreement. Figure 3 below shows how numerous data points are collected internally and fed into a machine of sorts to derive logical decisions.

Figure 3: Ray Dalio’s PriOS Management System (Source: Wall Street Journal)

Acadian’s Multi-Asset Absolute Return Strategy (MAARS) exemplifies the firm’s investment philosophy. It is a systematic global macro strategy that aims to generate absolute returns in any market condition. MAARS employs a return forecasting model built on more than 300 signals that fall into two categories: macro/cross-asset themes and asset-specific themes.

These signals are constructed from alternative datasets and are market-specific. When looking at commodities, for example, the team employs datasets that are useful in predicting supply/demand balances and trade flows, including crack spreads and inventory information as well as sea surface temperatures, frost risk, and pipeline flows. The figure below shows different contributions of the signal themes to performance across various asset classes.

Figure 4: MAARS Return Contributions – By Factor Theme and Asset Class (Source: Acadian Asset Management)
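To make the structure of such a model more concrete, the sketch below shows one way theme-level signals could be standardized and blended into a per-asset return forecast. It is a minimal illustration only: the signal names, theme weights, and synthetic data are our own assumptions, not Acadian’s methodology.

```python
# Illustrative sketch (not Acadian's actual model): blending theme-level signals
# into a per-asset return forecast. Signal names and weights are hypothetical.
import numpy as np
import pandas as pd

def zscore(series: pd.Series, window: int = 252) -> pd.Series:
    """Rolling z-score so signals on different scales become comparable."""
    mean = series.rolling(window).mean()
    std = series.rolling(window).std()
    return (series - mean) / std

# Hypothetical daily inputs for a single commodity market (e.g. crude oil).
dates = pd.date_range("2020-01-01", periods=500, freq="B")
rng = np.random.default_rng(0)
signals = pd.DataFrame(
    {
        "crack_spread": rng.normal(size=len(dates)).cumsum(),      # asset-specific theme
        "inventory_change": rng.normal(size=len(dates)),           # asset-specific theme
        "sea_surface_temp_anom": rng.normal(size=len(dates)),      # asset-specific theme
        "global_growth_nowcast": rng.normal(size=len(dates)),      # macro/cross-asset theme
    },
    index=dates,
)

# Group signals into the two broad theme buckets described in the text.
themes = {
    "asset_specific": ["crack_spread", "inventory_change", "sea_surface_temp_anom"],
    "macro_cross_asset": ["global_growth_nowcast"],
}
theme_weights = {"asset_specific": 0.6, "macro_cross_asset": 0.4}  # assumed weights

standardized = signals.apply(zscore)
theme_scores = pd.DataFrame(
    {name: standardized[cols].mean(axis=1) for name, cols in themes.items()}
)

# Final forecast: weighted combination of theme scores, clipped to limit outliers.
forecast = sum(theme_weights[t] * theme_scores[t] for t in themes).clip(-3, 3)
print(forecast.dropna().tail())
```

In a production setting the weights would be estimated rather than fixed, and each theme would contain many more signals, but the decomposition into theme buckets mirrors the attribution shown in Figure 4.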

Winton, the UK-based CTA, attributes its success to a substantial research and investment management group and implements many strategies across thousands of securities. In 2016, David Harding stated that there were over 100 people on the firm’s data team. In early 2016, Winton opened a data science center in San Francisco, chosen for its proximity to the Big Tech hub of Silicon Valley. The center started with six scientists working on proprietary datasets and exploring new applications for Winton’s expertise in pattern recognition and statistical inference, and Harding planned to grow the team to around 40 scientists as the project progressed.

Quoniam, a Frankfurt-based asset manager, decided to migrate all its research to the cloud in 2021, and over 850 SAS scripts were translated to Python. Andre Fröhlich, Head of Research Technology at Quoniam, commented: “At some point, a mountain of scripts and small programs builds up and you have to ask yourself the question of whether to keep doing what you’re doing or tackle important technology shifts. We’ve chosen the latter.” Markus Ebner, Head of Multi-Asset Strategies, noted that Quoniam benefits from data vendors doing the preliminary work. He highlighted two data vendors, RavenPack and GDELT, that help the firm monitor news data: “The data is usually much more up-to-date than conventional research data, so we can capture market sentiment almost immediately.”

Section 3: Challenges and Solutions

Alternative data typically comes in various formats, structures, and timeframes, making it challenging to integrate with existing data systems. It may require significant effort to transform and normalize the data for compatibility with internal databases and analytical tools. For example, Acadian’s Director of Stock Selection Research Wesley Chan reported in 2017 how challenging it was to navigate the space as the company hired about two dozen researchers tasked with developing new signals from alternative datasets: “You’re going to have to investigate 90 different things to get 10 that are good. A lot of people who aren’t used to those odds will walk away in disappointment, thinking the whole thing is a failure—it’s going to be a lot of waste”.

In our report, The First Steps in Alternative Data for the Buyside, we note that the most common mistakes firms make when implementing a new alternative data strategy are attempting to do it all in-house and over-hiring. Outsourcing is a powerful way to scale up a data initiative: much of the required technology and infrastructure can be outsourced, as can many of the steps in dataset evaluation.

The process may involve developing data pipelines, data cleaning techniques, and data normalization procedures. Dataset prioritization is another key part of the puzzle at Acadian, as reported by VP of Research Katherine Glass-Hardenbergh. The breadth of coverage that alternative data vendors offer is particularly important, since coverage of 20-30 names is not enough for the company’s systematic models.

It is also important to have a process for managing any outsourced functions. More sophisticated users increasingly subscribe to raw datasets, bringing the aggregation methodology and signal construction in-house, whereas previously they might have relied on the data vendor for aggregations and signals. Acadian also highlighted this preference for raw datasets, as the firm wants to form unique research ideas; vendors that provide dashboards and distilled signals are seen as less valuable because the alpha may erode more quickly. However, Acadian prefers to leave data collection, curation, and mapping to data vendors rather than performing those functions in-house.

Quoniam’s alternative data journey started back in 2012 and was not initially successful, according to Markus Ebner, Head of Multi-Asset Strategies at the firm. In a June 2022 podcast, Markus discussed how the research team at Quoniam first started working with Twitter data by creating curated lists of people working in the finance industry. The firm tried aggregating sentiment daily to forecast country markets in the wake of the European debt crisis.

The sentiment analysis of Twitter data did not produce a successful forecasting model, but Quoniam gained substantial experience in natural language processing and redirected its efforts toward analyzing unstructured news articles. Quoniam now works with several data vendors providing different sets of news data to run a strategy based purely on unstructured data.
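To illustrate the kind of daily sentiment aggregation Quoniam described, the sketch below scores tweets from a curated list with a crude lexicon and averages the scores per day; the resulting series would then be tested as a predictor of country index returns. The lexicon, the sample tweets, and the scoring rule are our own simplifications, not Quoniam’s pipeline.

```python
# Minimal sketch of daily sentiment aggregation from a curated Twitter list.
# Lexicon, tweets, and scoring are illustrative assumptions only.
from collections import namedtuple
from datetime import date
import statistics

Tweet = namedtuple("Tweet", ["day", "text"])

POSITIVE = {"rally", "growth", "strong", "improve", "bullish"}
NEGATIVE = {"default", "crisis", "weak", "recession", "bearish"}

def tweet_score(text: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def daily_sentiment(tweets):
    """Average tweet scores per calendar day (the curated-list aggregation)."""
    by_day = {}
    for t in tweets:
        by_day.setdefault(t.day, []).append(tweet_score(t.text))
    return {d: statistics.mean(scores) for d, scores in sorted(by_day.items())}

# Hypothetical tweets from a curated list of finance accounts.
sample = [
    Tweet(date(2012, 6, 1), "Spanish bond yields spike, crisis deepens"),
    Tweet(date(2012, 6, 1), "ECB support could spark a rally"),
    Tweet(date(2012, 6, 2), "Weak PMI prints point to recession"),
]
print(daily_sentiment(sample))
# The daily series would then be regressed against country index returns.
```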

Section 4: Alternative Data Case Studies

Acadian

Signal diversity is essential for Acadian’s MAARS strategy. The company highlights that many signals used in this strategy are derived from alternative datasets. For example, return forecasts for commodities “employ an array of data sources used to capture predictive information related to supply/demand balances and trade flows, ranging from crack spreads and sector-specific inventory details to more esoteric data, such as sea surface temperatures, frost risk, pipeline flows, and ethanol-to-gasoline ratios”.

As sustainable investing and ESG became a focus for Acadian with the appointment of Andy Moniz, the company started using alternative datasets to get additional insights. Greenwashing is one of the areas Acadian concentrates on, using natural language processing and machine learning. Moniz explained: “I became frustrated with what the ESG data providers were doing. A lot of the time they’re just relying on yes or no answers to questions like ‘Does this company have a human rights policy?’ That data is extremely stale and backward-looking”.

Therefore, Moniz and his team decided to collect and analyze data on their own instead of relying on ESG scores and ratings. Acadian’s algorithm collects and analyzes transcripts from shareholder meetings, conferences, and analyst calls to identify signs of vagueness and evasiveness. One example highlighted by Moniz was Albert Chao, CEO of Westlake Chemical Corp, who couldn’t properly explain his firm’s decarbonization strategy. Other examples include the Russian miner Evraz and the Polish coal-fired electricity company Ze Pak.

Figure 5: Acadian’s AI Tool Designed to Identify Greenwashers (Source: Acadian Asset Management)
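As a rough illustration of the transcript screening described above, the sketch below flags answers that rely heavily on hedging language. The phrase list, scoring rule, and threshold are our own simplifications for demonstration; Acadian’s actual NLP and machine learning models are not public.

```python
# Illustrative sketch of flagging vague or evasive language in call transcripts.
# Hedge-phrase list and threshold are assumptions, not Acadian's methodology.
import re

HEDGE_PATTERNS = [
    r"\bwe are looking into\b",
    r"\bat some point\b",
    r"\bvarious initiatives\b",
    r"\bcommitted to sustainability\b",
    r"\bover time\b",
    r"\bexploring options\b",
]

def vagueness_score(answer: str) -> float:
    """Share of sentences that contain at least one hedging phrase."""
    sentences = [s for s in re.split(r"[.!?]", answer) if s.strip()]
    if not sentences:
        return 0.0
    hedged = sum(
        any(re.search(p, s, flags=re.IGNORECASE) for p in HEDGE_PATTERNS)
        for s in sentences
    )
    return hedged / len(sentences)

answer = (
    "We are committed to sustainability. We are looking into various initiatives "
    "and will provide more detail at some point."
)
if vagueness_score(answer) > 0.5:  # assumed threshold
    print("Flag for analyst review: possible greenwashing language")
```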

More information on Acadian can be found on our fund deep dive here.

AQR

AQR, or Applied Quantitative Research, is a global asset manager based in Greenwich, Connecticut. The firm has just under 1,000 employees across 9 locations – Boston, Greenwich, Chicago, Los Angeles, London, Hong Kong, and Australia – with its most recent offices opening in Munich in 2019 and an engineering center in Bangalore, India, in 2018.

AQR’s approach rests on three core principles: it is fundamental, systematically applied, and thoughtfully designed. AQR frequently publishes reports detailing the data sets behind many of its proprietary research papers. Although the returns come from hypothetical portfolios drawn from academic articles, the library is a hugely insightful resource for the quantitative investing community; you can access it here. AQR also provides resources on the use of machine learning for asset management and how it improves quantitative investing.

The firm’s financial data repository system, which stores, manages, and analyzes the data, became too costly to run on-premise. As of October 2021, AQR had moved a third of its production workloads to the cloud, along with a number of tools. Moving data to the cloud has saved the firm 20% in data storage costs, according to AQR’s co-CTO, Steve Mock: “Total cost savings will come closer to 30% when the move is complete”. AWS is used for infrastructure work, while Azure is used for more specific internal workloads. AQR expects 60% of its compute volume to be on the public cloud by mid-2022, reducing on-premise tech stack workloads to around 10%-15%.

To do this efficiently, AQR created sets of rules to govern how the engineering organization would move off physical data centers and onto the cloud, with Steve Mock stating "The first couple of years were us learning and discovering, and really making sure that we were creating a safe and secure infrastructure to move our firm to.”

When the firm first began working on public-cloud projects, it built out a specific cloud-engineering team. Some 5% of the total engineering organization are cloud engineers, a number that is expected to eventually double as AQR's cloud footprint grows.

AQR’s cloud processes involve sourcing and cross-referencing data from multiple vendors and maintaining daily security history going back decades, as accuracy is paramount for the back-tests and models to be effective. AQR has invested in building a financial instrument data repository on AWS, with Steve Mock saying, “there are a few vendor products out there in this space but none of them meet the needs of a firm like AQR."

More information on AQR can be found on our fund deep dive here.

Brevan Howard

Brevan Howard Asset Management is a global alternative investment management firm specializing in macro strategies. The firm was founded in 2002 by Alan Howard, a prominent hedge fund manager, and works primarily with institutional investors, including pension funds, sovereign wealth funds, and endowments. Brevan Howard currently employs over 1,000 team members, including more than 150 portfolio managers. The firm was also one of the top performers in 2022, with its biggest funds gaining up to 28% for the year.

The firm's macro strategies aim to profit from changes in interest rates, exchange rates, and the overall economic environment. By taking positions in a wide range of financial instruments, including currencies, interest rate derivatives, and government bonds, Brevan Howard seeks to generate returns that are not directly tied to individual companies or industries.

The firm employs a top-down approach, which means it focuses on identifying major global economic themes and then constructs investment portfolios to capitalize on those themes. The investment decisions are based on a combination of quantitative models, fundamental analysis, and expert judgment. The team believes in the value of contrarian thought with intellectual humility and flexibility considered essential to long-term success.

It is also interesting to note that SIGTech spun out of Brevan Howard in 2019 to create a quant platform and a “data refinery” for asset managers. SIGTech’s CEO, Bin Ren, previously served as CIO of the Systematic Investment Group at Brevan Howard and developed the platform as the firm’s in-house system over a period of eight years.

According to Bin Ren, investment managers are being overwhelmed by the exponential growth in data availability: “Raw data itself is a bit like crude oil. Without being processed in a refinery, it’s practically useless. Investment managers now are under tremendous pressure from things like fee compression; they need to justify their fees by showing they are doing a good job and generating alpha and beating the market consistently. So, there is a strong demand for a “data refinery” among investment managers. That’s the role SIGTech is playing.”

Bridgewater

According to Business Insider, Bridgewater starts with government data but has found alternative data to be very useful. In 2020, Matt Karasz, a macro research lead at the firm, stated that alternative data is being used more and more as it helps them “understand macro dynamics much more granularly and quickly than traditional government data sources would." During the Covid-19 crisis in particular, the company relied on alternative data to monitor large shifts in global trends, including consumers moving their purchasing from brick-and-mortar stores to online channels. He stated that alternative data "has been particularly useful as we navigated the uncertainty over the last couple of months."

In March 2021, Greg Jensen, Co-CIO, said that the firm was aiming to win big from the macro turnaround, using alternative data to map the rollout of fiscal stimulus, which Jensen says is by nature harder to track than monetary tools. In 2020, Bridgewater’s Pure Alpha fund recorded its greatest loss, attributed to being caught off guard by the speed of global lockdowns and the fund’s inability to adapt its investing engine fast enough.

More information on Bridgewater can be found on our fund deep dive here.

King Street

King Street's investment approach primarily focuses on fundamental analysis and value investing in distressed and undervalued debt and equity securities. The firm seeks to generate returns by identifying opportunities in companies facing financial distress, restructurings, or other special situations. King Street employs a combination of intensive research, financial analysis, and active involvement in the restructuring process to capitalize on these opportunities.

King Street manages two global evergreen funds focused on long/short multi-strategy credit and event-driven investments. The funds aim to produce long-term risk-adjusted returns throughout diverse market environments, while also seeking capital preservation and minimized volatility. The hedge fund also manages a European fund that “generally invests alongside the Flagship Funds in European situations.”

In our webinar with King Street’s General Counsel Jessica Singer, she discussed the company’s separate, dedicated data science team, which is a subset of the investment team and is charged with sourcing data providers and generating ideas on how to use alternative data sources. One team lead on the data science team is responsible for funnelling data vendors to the compliance team for review.

There is no separate initial review; the team generally goes straight to formal diligence. If the data science team sends something early on, before it has decided that it really wants the data, the compliance team looks for major red flags: for example, how established the vendor is, whether it provides highly exclusive and expensive “blockbuster” information, and whether the data could lead to headlines or regulatory scrutiny.

Under King Street’s policy, the data science team is not allowed to receive sample data until the compliance team clears the vendor. The team does hold introductory calls and review vendor pitches to decide whether to move forward, and that is when Jessica and her team get involved.

Quoniam

Quoniam uses data from the Global Database of Events, Language, and Tone (GDELT), a research collaboration that parses global news articles and identifies themes, emotions, locations, and other factors. The firm’s methodology involves filtering the GDELT data, training a bidirectional LSTM neural network on 1,000 articles manually tagged for relevance to industrial production or CPI, and aggregating the results on a monthly basis. Figure 6 below details four models constructed using the GDELT data and the benchmarks.

Figure 6: Overview of the Components of the Models and Benchmarks (Source: Quoniam Asset Management)
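The sketch below outlines the pipeline described above: train a bidirectional LSTM relevance classifier on roughly 1,000 tagged articles, score the filtered news feed, and aggregate the relevant articles into a monthly sentiment factor. The architecture sizes, tokenizer settings, and synthetic data are assumptions for illustration, not Quoniam’s actual configuration.

```python
# Sketch of the described pipeline: BiLSTM relevance classifier + monthly aggregation.
# Sizes and synthetic data are assumptions, not Quoniam's setup.
import numpy as np
import pandas as pd
import tensorflow as tf

VOCAB_SIZE, MAX_LEN = 20_000, 200  # assumed tokenizer settings

# 1) Relevance classifier (e.g. "is this article about CPI?").
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Placeholder for ~1,000 manually tagged, tokenized articles.
X_train = np.random.randint(1, VOCAB_SIZE, size=(1000, MAX_LEN))
y_train = np.random.randint(0, 2, size=(1000,))
model.fit(X_train, y_train, epochs=3, batch_size=32, validation_split=0.2, verbose=0)

# 2) Score the filtered GDELT feed and aggregate to a monthly sentiment factor.
articles = pd.DataFrame({
    "date": pd.to_datetime(
        np.random.choice(pd.date_range("2021-01-01", "2021-12-31"), 500)
    ),
    "tone": np.random.normal(size=500),  # GDELT-style tone score
})
tokens = np.random.randint(1, VOCAB_SIZE, size=(500, MAX_LEN))
articles["relevance"] = model.predict(tokens, verbose=0).ravel()

monthly_factor = (
    articles.loc[articles["relevance"] > 0.5]   # keep articles deemed relevant
    .set_index("date")["tone"]
    .resample("M").mean()                       # monthly aggregation
)
print(monthly_factor.head())
```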

The results in Figure 7 below show that unstructured data improves macroeconomic forecasts: mean squared errors for the models are mostly lower than those of benchmarks BM1, BM2, and BM3, so the GDELT sentiment factors outperform the benchmarks for both the CPI and industrial production models. In addition, at least one sentiment factor was found to be statistically significant in nineteen out of twenty models.

Figure 7: Consumer Price Index (left) and Industrial Production (right) Forecast Results (Source: Quoniam Asset Management)
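For readers less familiar with the evaluation metric, the toy comparison below shows how mean squared forecast errors are computed and compared against benchmark models. The numbers are made up; only the comparison logic mirrors the description of Figure 7.

```python
# Toy MSE comparison between a sentiment-augmented model and benchmarks.
# All figures are invented for illustration.
import numpy as np

actual = np.array([0.3, 0.5, 0.2, -0.1, 0.4])          # e.g. monthly CPI changes
forecasts = {
    "GDELT-augmented": np.array([0.28, 0.47, 0.25, -0.05, 0.38]),
    "BM1":             np.array([0.35, 0.40, 0.10, 0.05, 0.30]),
    "BM2":             np.array([0.20, 0.55, 0.30, -0.20, 0.50]),
    "BM3":             np.array([0.40, 0.45, 0.15, 0.00, 0.35]),
}

mse = {name: float(np.mean((f - actual) ** 2)) for name, f in forecasts.items()}
for name, err in sorted(mse.items(), key=lambda kv: kv[1]):
    print(f"{name:16s} MSE = {err:.4f}")
```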

More information on Quoniam can be found on our fund deep dive here.

Winton

Winton, the UK-based commodity trading advisor (CTA), was launched in 1997 by David Harding with $1.6 million, following his departure from AHL (now Man AHL), the futures manager he co-founded. Both firms ran futures strategies that used data analysis and modeling to follow trends in global futures markets such as commodities and bonds. Winton is considered a pioneer in implementing a statistical and mathematical approach to research and trading, with senior management referring to the firm as a “scientific research organization which happens to have a highly profitable computer attached.”

In 2016, David Harding stated that he was interested in creating a database of career histories. From this database, Winton could pose a handful of research questions to assess companies, such as whether gender diversity among senior management has an impact, how compensation relates to performance, and whether companies benefit from splitting the CEO and chair roles. Winton has also used temperature data from the UK’s Hadley Centre and the annual average concentration of atmospheric carbon dioxide measured at the National Oceanic and Atmospheric Administration monitoring station on Mauna Loa, Hawaii.

In a 2017 presentation by Carlyon, a research and consulting firm for financial services, the firm pointed out that Winton’s researchers trawl “through credit card and social network data to forecast sales, analyzing rail car networks to measure the pulse of the US economy or…weather data to forecast crop yields.” Harding has also stated, “Data interests me because a lot of it is incorrect, and once we are able to put it together, we are able to look at a range of different research premises that can impact how we allocate capital.”

Winton’s data-focused arm, Hivemind, which spun out in 2018, managed around three-quarters of Winton’s research projects. In 2018, Hivemind joined an Eagle Alpha presentation discussing the sharing of ground truth data. Hivemind explained how tasks are collected from the user, either through Hivemind Studio or the user’s own system, and delivered via API. The platform uses a combination of automation and human input, as shown in Figure 8; the human input can be internal, external, or outsourced through platforms such as Amazon’s Mechanical Turk. The end goal of Hivemind’s framework is to gather a range of samples for every data task and to arrive at a consensus for the project.

Figure 8: An Example Flow in Hivemind (Source: Hivemind)
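A minimal sketch of the consensus step described above is shown below: several answers per data task (automated and human) are collected, and a label is accepted only once a qualified majority agrees. The task structure, the example labels, and the two-thirds threshold are our own assumptions, not Hivemind’s specification.

```python
# Minimal consensus sketch: accept a label only when enough annotators agree.
# Task structure and threshold are illustrative assumptions.
from collections import Counter
from typing import Optional

def consensus(answers: list[str], threshold: float = 2 / 3) -> Optional[str]:
    """Return the majority answer if its share meets the threshold, else None."""
    if not answers:
        return None
    label, count = Counter(answers).most_common(1)[0]
    return label if count / len(answers) >= threshold else None

# Hypothetical task: classify a scraped company filing's reporting currency.
task_answers = {
    "doc_001": ["USD", "USD", "USD"],   # automation plus two human annotators agree
    "doc_002": ["EUR", "GBP", "EUR"],   # 2 of 3 agree -> accepted
    "doc_003": ["JPY", "USD", "EUR"],   # no consensus -> gather more samples
}

for task_id, answers in task_answers.items():
    result = consensus(answers)
    print(task_id, "->", result if result else "needs more annotations")
```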

In 2022, it was announced that Winton was expanding its crypto operations and hiring a quant researcher. The job posting indicated that the firm was seeking someone with general quant experience in systematic trading as well as Python coding experience. The entry of hedge funds like Winton into the crypto space signals the growing involvement of institutional investors.

More information on Winton can be found on our fund deep dive here.

Conclusion

Macro and CTA funds are increasingly adopting alternative datasets and experimenting with various strategies. We have highlighted case studies and the associated challenges, overcoming which requires a combination of technical expertise, robust data governance frameworks, and a thoughtful approach to privacy and ethical considerations. By addressing these challenges effectively, firms can harness the power of alternative data to gain valuable insights and make more informed decisions.