Generative AI Through a Fundamental-Research Lens

The following is an excerpt from Investment Manager Harding Loevner’s second-quarter report for their Global Equity strategy.

Anyone who has interacted with popular AI models—asked them about the mysteries of life and the cosmos or created convincing Van Gogh replicas using AI-enabled image generators—can sense that we may be in the midst of a technological revolution. That prospect has consumed equity markets lately, with seven US tech-related stocks responsible for most of the market appreciation in the second quarter.

As investors in high-quality, growing businesses, we have always tried to position this portfolio to benefit from secular trends, the kind that transcend economic cycles and are driven by fundamental changes in key areas such as tech. Still, it is incredibly difficult for anyone to predict how such trends will unfold; the vicissitudes of cryptocurrency are a sobering reminder of this. Furthermore, as seen with the rise of the internet and, later, mobile connectivity, technology is merely a platform; it is the applications of the technology that eventually determine many of the winners and losers. In the case of generative AI, some of the future applications may not yet be conceivable, although many companies, even outside the tech field, are now pondering the possibilities.

ChatGPT, the chatbot that helped spark the market’s AI enthusiasm, is an important innovation because it can digest large amounts of text (hence the term large language model), communicate in natural (human) language, and generate sophisticated responses. Based on the Transformer architecture, which Google researchers introduced in 2017, ChatGPT demonstrates advances in AI that open the door to a wider set of business uses. But while natural language models recently produced epiphanies among lay chief executives and investors regarding AI, some tech companies were already investing in such capabilities and are being rewarded for that foresight. NVIDIA has been the biggest beneficiary this year in terms of its stock run and projected revenue gains; however, our other holdings, such as Adobe, Microsoft, Salesforce, ServiceNow, Synopsys, and TSMC, also appear among the possible beneficiaries. More companies—including, perhaps, some not yet in existence—will certainly join the ranks over time.

While it is still early, it’s evident that these companies see generative AI as transformative to their businesses and something upon which they can build new revenue models. Additionally, they are turning to AI to boost internal productivity, enhance existing customer offerings, and improve the quality and efficiency of customer interactions.

Most notably, Microsoft was able to gain an immediate leadership position in generative AI by making a US$10 billion investment in OpenAI, the company behind ChatGPT, earlier this year. Microsoft’s Bing search engine has since integrated ChatGPT with its web index data—a collection so large that it is rivaled by the dataset of only one other business in the world, Alphabet’s Google. Data are the feedstock of AI models, and an AI-enhanced search engine trained on so much data may attract more users to Bing, allowing Microsoft to sell more ads on the service. Microsoft is also adding generative AI to other products, including the Azure cloud service, enabling business customers who use Azure to easily integrate OpenAI models to glean more insights from their data and automate functions such as certain IT tasks. These added capabilities should motivate more businesses to migrate their data to the cloud and make Azure more competitive with Amazon.com’s AWS and Google Cloud.

The beneficiaries of demand for generative AI aren’t limited to traditional IT-sector companies. Data centers used to train AI models consume up to ten times more power than typical data centers, and therefore need more-powerful electrical equipment and backup power. Given the amount of heat they generate, new liquid-cooling solutions will be needed as well. This creates an opportunity for Schneider Electric, which has been developing innovative data-center equipment solutions for many years.

In the meantime, NVIDIA has emerged as the unrivaled global leader in providing the technologies at the center of the AI arms race. Due to an explosion of demand related to generative AI and LLMs from across its customer base, NVIDIA projects that data-center revenue for its fiscal second quarter ending in July will surge to US$11 billion. Not only is that more than double last quarter’s total, but the forecast also shattered the average analyst estimate that called for about US$7 billion.

Our investments in NVIDIA and Schneider reflect how we are thinking through the many unknowns and approaching portfolio structure in this environment. Through our fundamental framework, we can appreciate the broad excitement for AI, but we also remain conscious of valuations and thoughtful about diversification, recognizing that it is unlikely anyone can predict today who the biggest long-term winners will be.

By Jingyi Li, Co-Lead Portfolio Manager of the Pengana Harding Loevner International Fund and Pengana International Equities Limited (ASX:PIA)
