
Revolutionizing AI - The Powerhouse GPUs Fueling the AI Breakthrough


The Importance of AI and How it Will Change Our Lives

Artificial intelligence (AI) is a concept that has been around for the better part of a century, but widespread use and adoption have only arrived recently. Many individuals understand the broad idea of AI and some of its implications, but the speed at which this technology will scale will come as a shock to most. In a broad sense, AI is software that enables machines, computers, and more to perform tasks autonomously by reading and analyzing vast amounts of data simultaneously, recognizing patterns in that data, and executing tasks. The ability of AI to ingest, process, and act on data dramatically faster than humans can is why this technology is so revolutionary and transformational.

What sets AI apart from the other machines and software we have today is its ability to continuously learn in an unsupervised manner, teaching itself how to make adjustments based on past experience or by learning new patterns from data. A recent example of how the autonomous features of AI software will scale exponentially comes from Eureka, an AI program developed by NVIDIA Research that has taught robotic hands to perform complex tasks through reinforcement learning. The program uses GPT-4 to autonomously write the reward code that trains robots to open drawers and cabinets, toss and catch balls, use scissors, and more. It then gathers the results of these training runs and prompts GPT-4 to improve the code for greater efficiency. This is just one of many examples of how revolutionary this technology will be.

To get an idea of the addressable size of the AI market and how much it could contribute to economic growth, Figure 1 outlines the estimated AI market size and its growth through 2030. These estimates anticipate a CAGR of 36.6% by 2025 and a total boost of 14% to the world’s GDP by 2030.

Figure 1

Source: toolsforhumans.ai


LLMs, Chatbots, Generative AI

LLMs, chatbots, and generative AI are buzzwords that many investors have heard before, and here we want to dig into what they mean and why they matter. Starting with Large Language Models (LLMs): these are models built on deep learning architectures that leverage the vast amount of text available on the internet. They ingest large quantities of that data and generate content that is readable and understandable by humans, whether writing, drawing, or code. LLMs learn in a self-supervised manner, meaning they teach themselves grammar, language, and general knowledge by predicting missing or next words in text, with no human labeling required. ChatGPT is an example of an LLM: it was trained on existing data from the internet and produces content that humans can understand (text, code, drawings).
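
To make the idea of self-supervised learning concrete, here is a minimal sketch of a toy "next-word predictor" in Python. The tiny training sentence, the bigram-counting approach, and the predict_next helper are our own illustrative simplifications, not how ChatGPT is actually built; real LLMs learn from trillions of words with billions of parameters, but the principle is the same: the training labels come from the text itself.

```python
# Toy illustration of self-supervised learning: the model learns to predict the
# next word from raw text alone, with no human labels. The text and the
# bigram-counting "model" below are made up purely for demonstration.
from collections import Counter, defaultdict

text = "the cat sat on the mat the cat ate the fish".split()

# "Training": count which word tends to follow which. The labels are just the
# next words in the text itself.
next_word_counts = defaultdict(Counter)
for current, nxt in zip(text, text[1:]):
    next_word_counts[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen following `word` during training."""
    return next_word_counts[word].most_common(1)[0][0]

print(predict_next("the"))   # -> "cat" (the most common word after "the" in the training text)
```

No one had to label this data; the next word in the text is its own training signal, which is what lets LLMs learn directly from the raw internet at scale.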

ChatGPT is an example of a chatbot that is also an LLM, but not all LLMs are chatbots. Chatbots typically use LLMs to enhance their language comprehension capabilities. Examples of LLMs that are not chatbots include code generation models, language translation models, and models for analyzing medical texts or legal documents. Generative AI is a term that has been growing increasingly popular, and in simple terms it refers to any type of AI that generates original content. These models are mostly built on LLMs; they can accept audio, imagery, or text as inputs, and can produce code, 3D models, and video, act as voice assistants, and serve many other applications.


How AI Has Come to the Forefront through a Technological Breakthrough

Many individuals have been hearing about AI applications for years, but until ChatGPT arrived, most had never actually used one. It seems as though this technology became mass adopted and readily available almost overnight, and that can be traced back to a technological breakthrough in 2017: the ‘transformer’.

In 2017, researchers at Google released a paper called “Attention Is All You Need”, which introduced the transformer architecture. Before this paper, AI language programs relied on models (such as recurrent neural networks) that read text one word at a time, which created computational bottlenecks and limited how far they could scale. The transformer model was revolutionary in that, instead of having to read and comprehend a sentence word by word in sequential order, an AI program can read all of the words in a sentence at the same time, find relationships and patterns between them, and produce the necessary output. This is where the ‘attention’ aspect comes into play: AI programs use the transformer architecture to determine which words in a sentence to pay attention to. This breakthrough rapidly advanced the development of AI programs, since they became more scalable and could be trained on far larger datasets, and it is a main reason why we have ChatGPT today. The flip side of the transformer architecture is that it requires heavy computation, because the AI program analyzes all of the words in a prompt at the same time. This is where the importance of GPUs comes into play: GPUs can analyze many pieces of data at the same time (in parallel), making them the ideal hardware for AI.
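
For readers who want to see the idea in code, below is a minimal sketch of the scaled dot-product attention calculation at the heart of the transformer, written in Python with NumPy. The sentence length, vector sizes, and the reuse of one matrix for queries, keys, and values are simplifications made for illustration; production models learn separate projection matrices and run many attention “heads” at once.

```python
# Minimal sketch of scaled dot-product self-attention, the core operation from
# "Attention Is All You Need" (2017). Sizes are made up; for brevity the same
# matrix X stands in for queries, keys, and values.
import numpy as np

def self_attention(X):
    """X has one row per word (token). Every word attends to every other word
    in a handful of matrix operations, rather than one word at a time."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                      # similarity of every word to every other word
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax: how much "attention" each word pays
    return weights @ X                                 # each word becomes a weighted mix of all words

# Toy example: a 4-word sentence, each word represented by an 8-dimensional vector.
sentence = np.random.rand(4, 8)
print(self_attention(sentence).shape)                  # (4, 8): the whole sentence is processed at once
```

The key point for this report is in the last lines: the entire sentence is handled in a few matrix operations, which is exactly the kind of work GPUs are built to run in parallel.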

In Figure 2 we outline how AI models built on the transformer architecture can scale much better than prior models, and how this has advanced the size and effectiveness of AI.

Figure 2

Source: Nvidia.com


The Importance of GPUs

General-purpose chips, such as Central Processing Units (CPUs), cannot perform the same parallelized analysis that Graphics Processing Units (GPUs) can, and this capability is vital to today’s AI software. As we discussed in the earlier section on the transformer model, which requires simultaneous analysis of data, GPUs are designed to perform many calculations at the same time. This is where NVIDIA has benefited significantly. NVIDIA has been making GPUs for the video game industry for decades; however, its A100 and H100 flagship AI GPUs have recently been massively adopted for training and running AI software. Several years ago, OpenAI, the company behind ChatGPT, partnered with Microsoft to build AI infrastructure on the Azure cloud using thousands of NVIDIA’s GPUs to train its AI models. For reference, each of these AI-specific GPUs from NVIDIA currently costs around $10,000. There are two main reasons an AI program needs GPUs: the first is training, the initial setup of the software, and the second is inference, the ongoing use of the software. In aggregate, inference (the ongoing use of AI software) is no less compute-intensive than training (the initial setup), and thus an investor can quickly see how ongoing demand for GPUs will increase as these platforms scale up.
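
To illustrate why GPUs fit this workload, the sketch below (Python with NumPy, with made-up shapes and sizes) contrasts a word-by-word loop with a single batched matrix operation that produces the same answer. This is a conceptual illustration rather than NVIDIA-specific code; the point is that the batched form exposes all of the work at once, which GPU hardware can spread across thousands of cores.

```python
# Why GPUs suit transformer workloads: the same model weights are applied to
# every token, so the work can be written as one large matrix multiply instead
# of a word-by-word loop. All shapes below are invented for illustration.
import numpy as np

tokens = np.random.rand(512, 1024)     # 512 words in a prompt, 1024 numbers per word
weights = np.random.rand(1024, 1024)   # one layer of (made-up) model weights

# Sequential style: process one word at a time, as older models effectively required.
sequential = np.vstack([token @ weights for token in tokens])

# Parallel style: one batched operation over the whole prompt, the form GPUs excel at.
parallel = tokens @ weights

assert np.allclose(sequential, parallel)   # same answer, but the batched form parallelizes
```

Both forms give the same result; the difference is that the batched version presents the entire prompt’s computation at once, so a GPU can execute it in parallel rather than one word at a time.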

With this backdrop of high demand for GPUs in mind, NVIDIA’s staggering quarterly results are easier to explain. In Figure 3 we show the substantial rise in the company’s sales and profits, along with expectations for continued growth, almost entirely due to increased demand for its high-powered AI GPUs.

Figure 3

Source: Quartr


The AI Tech Stack

While the GPUs manufactured and sold by NVIDIA and other companies such as AMD are vital to training and the ongoing use of generative AI, data centres are needed to provide high-performing, secure, and stable environments for these GPUs, the cloud infrastructure they run on, and the software applications. NVIDIA has a near-monopoly here: most cloud companies offer NVIDIA GPUs as their cloud GPUs for AI tasks, and NVIDIA is estimated to hold more than 90% of the data centre GPU market. Demand for data centre GPUs is estimated to be growing at an annual rate of 23.5%, a rate that is expected to be maintained until the end of the decade. The tech stack required for the training and ongoing use of AI is therefore a data centre environment with high-bandwidth connectivity, high-powered GPUs, cloud infrastructure, and the AI software itself.

The Future of AI Resources, Size

Throughout this report we have outlined the capabilities of AI systems, a few main types of AI models (LLMs, generative AI), the reasons for the recent technological advances, and why GPUs are critical to the training and use of AI. We have little doubt that this technology will not only be revolutionary but will also scale exponentially. This exponential growth will benefit many companies beyond NVIDIA; however, we believe NVIDIA is leading the charge in this new paradigm. We believe the mega-cap tech stocks will be some of the main beneficiaries and leaders in AI software and hardware, along with smaller cloud companies, data centres, and GPU manufacturers. In Figure 4, we show how AI systems are growing exponentially larger over time, and we do not expect this growth to slow down anytime soon. The chart is on a log scale, which means that as AI systems move up the chart, they use exponentially more parameters.

Figure 4


It is still early days for the AI theme, and investors who feel they may have ‘missed out’ should consider how recent the breakthroughs in this technology are, how long the AI revolution is likely to run, and the exponentially growing compute required to operate and scale these systems. We believe there are many avenues for growth in the coming years.

Canadian AI-Related Stocks

Celestica (CLS)

Celestica (CLS) is a Canadian electronics manufacturing services (EMS) company that designs, builds, and services hardware. It has two main operating segments: Connectivity and Cloud Solutions (CCS) and Advanced Technology Solutions (ATS). The CCS segment provides networking switches, server-related hardware, and cloud-communications infrastructure, whereas the ATS segment serves the industrial, aerospace, and health-tech industries. CLS was largely stagnant following the dot-com bust, with muted sales growth, but amid the recent AI revolution it has been benefiting from hyperscaler and AI infrastructure demand.

CLS is a $44.6 billion company that has delivered roughly 20% sales growth over the past couple of years, alongside a rapidly expanding profit margin. Its free cash flows have grown with that margin expansion, and just before it began benefiting from AI data center demand it traded at a forward earnings multiple below 10X; it now trades at 44X. Analyst estimates call for continued margin expansion in the coming years, and estimate revisions are trending higher. We continue to like CLS as one of the primary ways to gain exposure to the AI theme through a Canadian stock; it has transformed from a period of stagnation into a leader in the EMS industry, capitalizing on the growing demand for AI and data center infrastructure.

Electrovaya (ELVA)

ELVA started out as a manufacturer of lithium-ion batteries targeting the electric vehicle (EV) industry. Its batteries are recognized for their safety, longevity, and low cost of ownership. The company is expected to enjoy significant long-term growth in demand for its battery solutions, supported by multiple tailwinds such as hyperscale data centers, robotics, and potential in “physical AI,” as well as continued expansion into other industries like e-commerce and aircraft, all of which require tremendous amounts of energy and energy storage capacity. For instance, the company estimates that demand from hyperscale data centers will grow at around a 28% CAGR over the next few years.

In addition, the company is currently at an inflection point in terms of financial results, with two consecutive quarters of positive EPS, as trailing twelve-month revenue has surpassed the breakeven point of $50 million for the first time since going public. The company also reported strong momentum in orders from OEM partners and announced plans for expansion to address growing demand. Though ELVA is not a pure AI play, the company’s exposure to the robotics industry ties into the broader adoption and development of AI. We think ELVA’s growth story has become more interesting in recent years due to significant demand growth from adjacent industries, along with improving financial results.

Shopify (SHOP)

SHOP is widely recognized as one of the largest and best operators in the e-commerce industry. The company started out as an online platform that helped merchants establish an online presence, and over the years it has gradually evolved into a full e-commerce ecosystem by adding features such as payments, financing, and logistics. In recent years, SHOP has been working to integrate AI into most aspects of its business. For example, the company recently announced a partnership with OpenAI, the maker of ChatGPT, through which SHOP’s merchants will be able to complete sales directly and seamlessly within ChatGPT conversations, without any links or redirects.

Going forward, the company plans to continue enhancing product discovery with real-time features like pricing, inventory, and images, making millions of products instantly discoverable in a format that AI can understand. The CEO’s vision is to help SHOP’s merchants sell wherever AI conversations happen, so that when someone asks ChatGPT for recommendations, they can access inventory from SHOP’s merchants for immediate purchase. SHOP has been one of the best-performing stocks in Canada since going public, and we believe the company still has a long runway to grow and compound, given the almost “unbounded” addressable market of global e-commerce.


*Authors, directors, partners and/or officers of 5i Research hold a financial or other interest in NVDA, MSFT at the time of publishing. The i2i Fund does not hold a financial or other interest in the above companies at the time of publishing.