Moby's AI Outlook for 2024

With our top pick for 2024 out of the way, we also want to give you a sector-wide strategy for 2024. Like last year, we're hunting for long-term trends that are going to define 2024 regardless of market conditions.

This time last year, we talked about automation and reshoring themes as the economy recovered from COVID lockdowns and global trade began the long process of diversifying away from complete reliance on China. Those picks have largely panned out and will continue to do so in 2024. 

But we want to make sure we're also bringing you an understanding of new sectors we're watching as our economy evolves. 

And we're beginning to see the market start a subtle shift toward rewarding AI deployment over AI development. 

The AI boom was largely kicked off by ChatGPT, and the last year has been all about developing large language models that can provide real utility in the real world. Investors have largely been rewarded with the results so far -- especially in stocks that peripherally enable AI development, like our strategies in Datadog and MongoDB.

But a lot of those gains have come more from the promise of future returns than anything else. And that's why 2024 is going to be the year where big AI bets like Microsoft's massive investment in OpenAI and Alphabet's LLM development need to start generating real returns in the real world. 

If you read our recent reporting on Dell, you've already gotten a taste of this shift as major companies move to develop an architecture that will allow LLMs and other AI models to run in the real world. We're going to see PCs, laptops, and smartphones all with some level of AI infrastructure built into their physical chipsets. 

This is going to produce an entirely new slate of winners and losers within the tech space as some AI implementations work and others fall flat. We've already seen big bets like Humane's ridiculous AI Pin try to take a crack at this market. While consumer-facing products like that need a lot more development before they get wider adoption, there are ways AI will generate real revenue in the real world of 2024 -- but most of those are going to be more behind-the-scenes. 

So, let's take the lid off the AI race and try to understand the architecture that's going to drive the next wave of AI development and investment, and along the way, we'll show you our top 5 favorite stocks that stand to benefit the most from this coming wave of bull sentiment. 👇

Why Deploy AI at All? 

One key area of groundwork we need to lay here is the actual cost-benefit argument for deploying AI like this.

When we talk about deploying AI, we specifically mean purpose-built AI models running directly either in company data centers at the enterprise level or providing services for consumers at the device level. Think AI optimizing battery usage or a virtual assistant that is run directly from your laptop. 

But really, what is the point of building that kind of infrastructure? Every other service we use runs just fine on rented cloud servers. Why would GenAI need dedicated investment to deploy these models outside the cloud? 

For us, a key here is 'under the hood' applications for AI models. Some of the biggest wins for AI as a concept came this year from behind-the-scenes applications like Meta using AI and machine learning to boost their recommendation engine and advertising algorithm. That boosted revenue, margins, and engagement on Meta's Instagram Reels product to an incredible degree.

Uber is another great example: they used internal AI and ML resources to finally optimize their rideshare algorithm enough to make it profitable and Uber's primary revenue driver. 

It makes sense that Uber and Meta would keep their AI deployed onsite for proprietary reasons -- but why everyone else?

Here are a few of the top reasons: 

  • Latency is actually a huge issue. For a lot of the immediate use cases, AI models are being deployed simply to optimize an expensive process in the real world, so the model needs to be running at the device level to actually make things more efficient. For instance, this is why Nvidia's partnership with various car manufacturers is only going to broaden, as increasingly advanced driver-assistance systems require more and more processing power to run. 

  • Pairing specialized models with specialized chips is so much cheaper. General-purpose AI juggles so many parameters at once to handle open-ended queries that it burns through an astounding amount of energy on each one. Based on data from Q2, a single ChatGPT query is about 8x as expensive to run as a Google search. Specializing at the device level can cut those costs drastically, as designers can fine-tune components to the needs of a model. 

  • Personalization and security are key to actual AI services. For enterprises, running AI models on proprietary data can't happen in the cloud. At the same time, if any major manufacturer is going to release a true AI assistant, then for the sake of individual privacy, that model would have to run at the device level. Think of Apple revolutionizing the next wave of iPhones with an LLM-powered upgrade to Siri. Apple can't make a service like that work with their privacy guarantees if requests are getting processed at the cloud level. 

There are a lot more super technical pressures pushing companies to deploy AI outside of the cloud, but those three help paint roughly 90% of the picture.
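To make the cost argument above concrete, here's a rough back-of-envelope sketch in Python. Only the ~8x cost multiple comes from the discussion above; the baseline energy-per-search figure and the electricity price are illustrative assumptions, not measured values:

```python
# Rough cost comparison between a general-purpose LLM query and a
# conventional search query. Only the ~8x multiple comes from the text;
# the baseline energy figure and electricity price are assumptions.

SEARCH_WH_PER_QUERY = 0.3   # assumed energy per search query (Wh)
LLM_COST_MULTIPLE = 8       # from the text: ~8x the cost of a search
PRICE_PER_KWH_USD = 0.10    # assumed electricity price

def annual_energy_cost_usd(queries_per_day: float, wh_per_query: float) -> float:
    """Annual electricity cost for a given daily query volume."""
    kwh_per_year = queries_per_day * 365 * wh_per_query / 1000
    return kwh_per_year * PRICE_PER_KWH_USD

llm_wh = SEARCH_WH_PER_QUERY * LLM_COST_MULTIPLE  # 2.4 Wh per query

search_cost = annual_energy_cost_usd(1_000_000, SEARCH_WH_PER_QUERY)
llm_cost = annual_energy_cost_usd(1_000_000, llm_wh)

print(f"search: ${search_cost:,.0f}/yr, LLM: ${llm_cost:,.0f}/yr")
```

Whatever the exact inputs, the gap scales linearly with query volume, which is why shaving per-query cost through specialization matters so much at deployment scale.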

A few rate cuts in 2024 won't even be close to enough to bring back the cheap money days we had pre-pandemic. The year of efficiency is going to continue into 2024, and firms of all sizes are salivating at the opportunity to increase operational efficiency the way that Meta and Uber already have. So, we're going to see companies generate a lot of real revenue in the next 12 months on the promise of deploying these AI systems at the device level. 

The next major question is, what narratives within this theme stand to benefit the most? 

Let's list a few of the most important sectors we're watching as 2024 kicks off:

The Main AI Deployment Narratives: 

Competition is already fierce in the AI chipset space as tech providers try to grab a slice of the massive demand Nvidia has managed to generate. We can tackle AI deployment in a lot of different ways, but for now, AI infrastructure is going to depend on physical chips, power management, and new memory components.

Let's break down how these will play out and who can win. 

  • AI-enabled chips are going to be a key differentiator at minimum. We're already seeing some forms of AI work in the background to optimize smartphone battery life and produce better smartphone photography, but this architecture is going to proliferate a lot in 2024. Some names like Qualcomm are already a little overbought as they're seen as a potential universal producer of AI chips for smartphones. We need to look a little broader for 2024, as we really don't know if we're going to see enough demand return to the smartphone sector. That's why we see AMD (more on that below) as a key beneficiary here. 

    • We're going to see a majority of the development (and therefore a majority of the revenue) on the enterprise side of this market -- and AMD's products are better suited for running AI models specifically. It takes a lot more power and infrastructure to train large language models, but once companies finish their development, running AI models depends on efficiency. AMD's products come in at a price point that makes them the more broadly attractive option for less souped-up data centers that are focused on running AI services. AMD's AI chips can also see outsized adoption among manufacturers looking to add Dell-style NPUs to their PCs and laptops. 

  • AI models will need better power optimization infrastructure. It's no secret that running models like ChatGPT takes an extraordinary amount of energy. While specialization does lead to cost savings, folks building these devices will need to shave off as much power consumption as possible at the chip level. Lots of component manufacturers are purpose-building components that are optimized for machine learning and AI work at the device level. One really key manufacturer here is STMicroelectronics, a Swiss-based firm that is a little ahead of the game after going whole-hog on developing extremely efficient, low-power microcontrollers. The original angle here centered on needing a super low-powered component for Internet of Things (IoT) devices supercharged with machine learning (think an internet-connected remote thermostat that can adjust its temperature on the fly based on a whole host of inputs). This positions STM perfectly as an AI deployment leader, as their components are ideal for smartphones, AI-enabled IoT devices, and anything else that needs to run an AI model without burning through limited power resources. 

  • High-powered memory is critical for the latency needs of AI models. For model deployment in the wild, power isn't as important as latency. These systems need to run as fast as possible to be effective and actually useful to consumers and businesses. You might remember from our Arista Networks strategy how we discussed Nvidia buying Mellanox just so they could own InfiniBand, the low-latency alternative to Ethernet for AI data centers. At the device level, RAM takes on a lot of the burden of speeding up AI operations. So, a new standard of souped-up memory chips is going to become increasingly necessary. But making a single pick here is actually really hard, as there are a lot of solid memory players out there. We're filtering out any major manufacturer based in China or Taiwan just to reduce any potential regional risk in 2024, as there are still doubts about China's recovery and the Chinese government's ability to tank any stock price at any time with a single regulatory filing. So, for us, the single strongest play in the memory business would be Samsung Electronics as they make moves toward developing AI-capable smartphones and memory chips. However, for most of our members, investing directly in stocks listed on South Korean exchanges is too complicated to make buying a single stock worth it. Therefore, the most even-handed play here is a position in the iShares MSCI South Korea ETF. Just shy of a third of this ETF consists of two really strong memory players, Samsung and SK Hynix. The ETF also includes some of our favorite niche plays in the South Korean economy, like the communications company Naver and social media powerhouse Kakao. 

  • What models are actually going to be powering these services? With the physical infrastructure basically established, let's turn to the actual models powering these services. Who wins here? Frankly, we think this answer is as obvious as it is boring: this is Alphabet's market to lose. The launch of the Gemini model and Alphabet's modular approach to developing different models for different services set them up as a strong B2B player in this space, one that's developer-friendly and agile enough to adapt its services throughout 2024. 2023 was Microsoft's year to own the AI development conversation, while 2024 is Alphabet's year to own the AI deployment narrative. 

  • Is there a dark horse that can win this entire conversation? Technically, yes. There's one company that owns every aspect of this market and could start winning the whole game in 2024: Apple. Honestly, Apple really needs to offer something in the AI deployment conversation if they're going to continue being the world's most valuable company this year. After a few years of barely iterative iPhone models, their release cycle in 2024 needs to be massive in order to reignite consumer demand. The wild thing here is that Apple has multiple angles they can take to seize control of the AI conversation. One obvious angle is an LLM-powered upgrade to Siri that makes it a device-level AI assistant. Having Siri+ (or whatever you want to call it) rely on device architecture and not the operating system would instantly make next year's iPhone model a must-have. Apple has enough devices in the wild generating enough data that it's perfectly capable of developing something truly revolutionary in the AI assistant space. But AI can also help make the launch of Vision Pro more distinctive. One critical area where AI is already providing a lot of utility is video games and computer graphics. If there are AI capabilities built into Vision Pro that can be improved after its first launch in Q1, Apple can easily start making big promises about AI's capabilities for supercharging visual computing in the next stages of Vision Pro. However, with no announcements from Apple about their AI ambitions, it's hard to make a clear pick here. 

Wrapping this up:

We're going to be heavily focused on this sector throughout 2024, as this is a space that can turn on a dime.

A lot of the gains we'll see in 2024 will be based more on the 2025-2026 outlook.

We don't anticipate seeing a major AI app hitting the market in 2024, but we're not counting out early 2025. 2024 is going to be about understanding what works and what doesn't in the deployment space.

Any company touching this space is going to have an outsized lift in the first phases here, but by the end of 2024, we should start seeing the winners and losers emerge.

Our picks in the above narratives reflect the companies we believe will become dominant in the space. But we'll be updating you frequently as the year moves on. 

Granular Picks for the AI Deployment Narrative:

1) AMD will emerge as a strong winner in chips for running inference workloads. AMD ($AMD) has gotten some great lift ever since Microsoft and Meta signed up to use their chips for training models.

However, as the industry coheres, AMD is well-positioned at a price point to provide chipsets for enterprises who want to run AI models in-house on data centers and some niche services where AI models are deployed at the PC level. 

Price Target: $165 (18% upside)

Current Price: $140

Target Date: Q4 2024

Stock: Advanced Micro Devices, Inc. ($AMD)

Rating: Overweight

Risk/Reward: Medium-High / Medium-High
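For transparency, the upside figures in these price blocks are just the ratio of the price target to the current price. A quick (hypothetical) helper shows the math using the AMD numbers above:

```python
def upside_pct(target: float, current: float) -> float:
    """Percent upside implied by a price target versus the current price."""
    return (target / current - 1) * 100

# AMD figures from the block above: $165 target vs. $140 current
print(round(upside_pct(165, 140)))  # rounds to 18
```

The same ratio applies to every pick below, so you can re-derive each stated upside as prices move.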

2) STMicroelectronics is a critical provider of power-saving components. STMicroelectronics ($STM) is a massive leader in the microcontroller business with a lot of different routes to generating outsized growth in 2024 as more businesses experiment with deploying AI as efficiently as possible. 

Price Target: $55 (10% upside)

Current Price: $50

Target Date: Q4 2024

Stock: STMicroelectronics NV ($STM)

Rating: Overweight

Risk/Reward: Medium-High / High

3) Betting on the entire South Korean tech sector is the best way to win the memory market. Samsung and a few other companies will have a strong opportunity to be the providers of low-latency memory as more and more smartphone manufacturers add AI at the device level. Samsung also has a solid opportunity to differentiate itself even further by beating Apple to AI-enabled smartphones that truly iterate on a stale smartphone market. However, most folks should simply opt for the iShares MSCI South Korea ETF ($EWY) for efficient access to Samsung as well as SK Hynix. 

Price Target: $71 (11% upside)

Current Price: $64

Target Date: Q4 2024

Stock: iShares MSCI South Korea ETF ($EWY)

Rating: Overweight

Risk/Reward: Medium / Medium

4) Alphabet is the true operating system for an AI-enabled world. This really is Alphabet's ($GOOG) game to lose at this point. Alphabet is the most underbought Magnificent 7 stock, and their Gemini model is going to need a little time in H1 to demonstrate the diverse reach of its value prop. However, with Microsoft needing massive earnings to justify their current price and no real announcement from Apple, Alphabet can really change the conversation this year after badly fumbling the start of the race.

Alphabet also has the ability to surprise the market with an AI-enabled Pixel phone or better AI architecture for Android devices across the board. Android isn't unseating the iPhone anytime soon, but could this be the year when Android devices really make a case for the platform? Please note that our Alphabet price target is unchanged from our most recent update to our strategy from August.

Price Target: $169 (20% upside)

Current Price: $141

Target Date: Q3 2024

Stock: Alphabet ($GOOG)

Rating: Overweight

Risk/Reward: Medium-High / Medium-High

5) Apple can win the whole game with a few product launches. Basically, there isn't a company on earth with the potential Apple has to completely own the AI deployment future; we just need to see their actual roadmap after the Vision Pro gets released. Once again, we are not updating our price target from our strategy published in July. 

Price Target: $226 (16% upside)

Current Price: $194

Target Date: Q4 2024

Stock: Apple Inc ($AAPL)

Rating: Overweight

Risk/Reward: Medium-High / Medium-High