Why AI Chips Could Be the End of the Chip Industry
The recent boom in artificial intelligence has drawn widespread attention to the chip industry. Today, we will delve into this eye-catching topic: the demand that AI places on chips, the limitations of Moore’s Law, and the possibility that AI chip demand may become the terminator of the chip industry.
Computing Power Reaches a Bottleneck

First, let’s review the development of artificial intelligence. 2022 was hailed as the “first year of AI”, marking the official arrival of the AI era. Artificial intelligence depends on ever-greater computing power, from the early CPU to the later GPU, TPU and NPU, and this has continuously driven the development of the chip industry. On the back of AI’s rapid rise, Nvidia’s share price has skyrocketed, at one point making it the second-largest company in the world by market value.
- CPU stands for Central Processing Unit
- GPU stands for Graphics Processing Unit
- TPU is Google’s Tensor Processing Unit
- NPU stands for Neural Network Processing Unit

Limitations
However, as technology continues to develop, the limitations of Moore’s Law have gradually been exposed. Moore’s Law has been a driving force for technological innovation and social change since the late 20th century; the big US technology companies known as FAGA (Facebook, Amazon, Google and Apple) built their business models on Moore’s insight. But as the number of transistors grows, rising power consumption erodes efficiency. Nvidia CEO Jensen Huang and others have long argued that Moore’s Law is dead, and Moore himself acknowledged its physical limits in a 2005 interview.

So, can Moore’s Law really hold forever?
Process technology is what turns Moore’s Law into reality. Through the early shrinks from 3 µm down to 0.13 µm, Dennard scaling held. Beyond 0.13 µm, however, Dennard scaling gradually broke down. The main reason is that as transistors become smaller and more numerous within the same die area, leakage current in the channel heats the transistors and drives up the chip’s power consumption, breaking the simple linear relationship between transistor count, computing speed and energy consumption.
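To make this concrete, here is a minimal back-of-the-envelope sketch (not from the original article) based on the standard dynamic-power relation P ≈ αCV²f. The shrink factor and the capacitance, voltage and frequency values are illustrative assumptions; the point is simply that once supply voltage stops scaling, each new shrink packs in more transistors and the heat per square millimetre climbs.

```python
# Illustrative sketch of Dennard scaling breaking down; all values are in
# arbitrary units chosen so that only the ratios matter.

def power_density(transistors_per_mm2, capacitance, voltage, frequency, activity=0.1):
    """Relative dynamic power per mm^2: density * (alpha * C * V^2 * f)."""
    return transistors_per_mm2 * activity * capacitance * voltage ** 2 * frequency

S = 1.4  # hypothetical shrink factor: linear dimensions scale by 1/S per node

baseline = power_density(1.0, 1.0, 1.0, 1.0)

# Ideal Dennard scaling: C and V shrink by 1/S, frequency rises by S,
# and S^2 more transistors fit per mm^2 -> power density stays flat.
dennard = power_density(S**2, 1 / S, 1 / S, S)

# Post-Dennard reality: leakage prevents further voltage reduction, so V
# stays flat while transistor count still grows -> power density rises ~S^2.
post_dennard = power_density(S**2, 1 / S, 1.0, S)

print(f"ideal Dennard vs baseline : {dennard / baseline:.2f}x")      # ~1.00x
print(f"post-Dennard vs baseline  : {post_dennard / baseline:.2f}x")  # ~1.96x
```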
Manufacturers
This reflects the current situation: CPU manufacturers can no longer improve computing power simply by raising clock frequency and core count, because total power consumption rises so sharply that only a fraction of the cores can run at full speed at any given time.
The industry has put a great deal of research into extending the life of Moore’s Law. However, these efforts cannot change the fundamental problem: heat generation leads to reduced efficiency.

AI Power Demand is Infinite
In 2023, Gordon Moore, the author of Moore’s Law, passed away, which many saw as the end of an era for the chip industry. As technology and AI chip demand develop, we may need to give up our dependence on chips and seek new technological paths.
These insurmountable physical limits have created a bottleneck in humanity’s technological progress. To escape this bottleneck, abandoning the path of chips is the inevitable outcome.
So, the AI chip industry is already a sunset industry.
Beyond changes in technology itself, cooperation and innovation across industry boundaries are also worth watching. For example, OpenAI CEO Sam Altman helped take a nuclear energy company public to support the development of artificial intelligence. This kind of cross-industry cooperation may be one of the new directions for future technological development.
Data Centers
When we consider the impact of artificial intelligence on the chip industry, we also need to consider the needs of data centers. Training AI models requires massive data processing and storage, which has driven up demand for high-performance chips and high-speed networks. Cloud computing giants such as Google, Amazon, and Microsoft have invested in new data center architectures to meet the needs of artificial intelligence. Google’s TPU chips are now part of the cloud computing market, and on many machine learning tasks they outperform general-purpose CPUs and GPUs.
Edge Computing
In addition, AI has also driven the development of edge computing. Edge computing moves data processing and analysis closer to the data source to reduce latency and improve efficiency. This has led to an increase in demand for low-power, high-performance chips to meet the requirements of running complex AI algorithms on edge devices. As a result, chip manufacturers and technology companies are working hard to develop new chip architectures.
AI Hardware
In the past, GPUs were the workhorse for training deep learning models, but as AI develops, chips customized for specific tasks have begun to appear. For example, Google’s first-generation TPU was designed specifically to accelerate neural network inference, while Intel’s Nervana chips targeted deep learning training. This customized hardware design has significantly improved the performance of AI applications and driven innovation in the chip industry.
In general, the rapid development of artificial intelligence has had a huge impact on the chip industry. As technology advances, we can foresee that AI’s requirements for chip design, manufacturing and application will keep rising. Chip manufacturers and technology companies will need to keep innovating to meet the needs of the AI era, which will bring the chip industry new opportunities and challenges alike.
AI chip demand is turning AI itself into the terminator of chips. At the heart of this trend is AI’s enormous appetite for electricity. For example, training a model like the one behind ChatGPT is estimated to consume around 1,300 megawatt-hours of electricity. That number may mean little on its own, but with that much electricity you could stream online video continuously for about 185 years, which highlights the scale of AI’s energy requirements.
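As a rough sanity check on that comparison, here is a short sketch (my own arithmetic, not the article’s). The energy used per hour of video streaming is an assumption on my part, and published estimates vary widely, so treat the result as an order-of-magnitude illustration only.

```python
# Rough sanity check of the "185 years of video" comparison.
# The kWh-per-streaming-hour figure is an assumption, not a number from the article.
TRAINING_ENERGY_MWH = 1_300
KWH_PER_STREAMING_HOUR = 0.8  # assumed; published estimates range widely

total_kwh = TRAINING_ENERGY_MWH * 1_000
streaming_hours = total_kwh / KWH_PER_STREAMING_HOUR
streaming_years = streaming_hours / (24 * 365)

print(f"{total_kwh:,.0f} kWh is roughly {streaming_years:.0f} years of continuous streaming")
# With the 0.8 kWh/hour assumption this comes out near the 185-year figure quoted above.
```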

Consumption
With the rapid development of AI, the electricity consumed by artificial intelligence data centers in the United States already accounts for about 2.5% of the country’s total electricity consumption, a shocking figure that is enough to power the whole of New York City. From 2020 to 2022, global data center electricity consumption rose from roughly 200-250 TWh to about 460 TWh, around 2% of global electricity use. More worrying still, consumption is growing at 25% to 33% per year and is expected to reach around 1,000 TWh by 2026, roughly the annual electricity consumption of Japan.
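To see whether those numbers hang together, here is a small sketch (my own arithmetic, not the article’s) compounding the quoted 25-33% annual growth from the 2022 baseline of 460 TWh out to 2026.

```python
# Compound the quoted growth rates from the 2022 baseline to 2026.
baseline_twh = 460          # global data centre consumption in 2022, as quoted above
years = 2026 - 2022

for growth in (0.25, 0.33):
    projected = baseline_twh * (1 + growth) ** years
    print(f"{growth:.0%} annual growth -> about {projected:,.0f} TWh in 2026")

# 25% growth gives roughly 1,120 TWh and 33% gives roughly 1,440 TWh, so the
# ~1,000 TWh projection is, if anything, conservative relative to the quoted range.
```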
Cooling System
Beyond the IT equipment itself, the cooling systems that keep these devices from overheating also draw a great deal of electricity. If the IT equipment’s power draw is taken as 1, the cooling system adds roughly another 0.4, for a total of 1.4. In other words, a data center’s total consumption is not just its IT load; it also includes the substantial energy spent keeping the equipment at normal operating temperature.
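In data-center terms, that 1 : 0.4 split corresponds to a Power Usage Effectiveness (PUE) of roughly 1.4, i.e. total facility power divided by IT power. A minimal sketch, with a hypothetical 100 MW IT load, shows how the overhead scales:

```python
# Total facility power for a given IT load, assuming the ~1.4 ratio described above.
def facility_power_mw(it_load_mw: float, pue: float = 1.4) -> float:
    """Total data-centre draw = IT load * PUE (Power Usage Effectiveness)."""
    return it_load_mw * pue

it_load = 100.0  # hypothetical 100 MW of IT equipment
total = facility_power_mw(it_load)
overhead = total - it_load

print(f"IT load: {it_load:.0f} MW, cooling and other overhead: {overhead:.0f} MW, total: {total:.0f} MW")
# -> IT load: 100 MW, cooling and other overhead: 40 MW, total: 140 MW
```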

As a result, major companies around the world are locked in a fierce battle for electricity. This is one of the reasons Sam Altman helped the nuclear power company Oklo go public. Traditional electricity supply can no longer keep up with AI’s enormous demand, and nuclear power is expected to become a viable solution. But nuclear plants take time to build, and a shortage of electricity is one of the biggest constraints on AI’s development. By helping nuclear power companies go public, then, the hope is to expand power infrastructure to meet AI’s growing energy needs.
What are the features of a chip terminator?

Bottleneck
As AI chip demand grows, or more precisely, as AI’s demand for computing power grows, existing chips can no longer keep up, so chips have in effect become an obstacle to AI’s development. In the past, Moore’s Law drove continuous improvement in chip performance, but over time the law has become increasingly difficult to sustain, and process technology at the nanometre scale is approaching its limits. The bottleneck facing modern chips is that they can no longer advance at the pace Moore’s Law promised, which is pushing chips toward gradual obsolescence.
Development
To solve this problem, we need new technologies to replace existing chips. In this process, it helps to distinguish three layers: the theoretical layer, the technical layer and the application layer. Moore’s Law belongs to the theoretical layer, providing the theoretical basis for the chip industry’s development; nanoscale process technology represents the technical layer, sustaining Moore’s Law by continuously improving the manufacturing process.
However, with Moore’s Law no longer able to advance and process technology at its limit, the application layer of chips is also approaching its end. At this stage, the chip industry has begun to wind down. Every industry passes through the theoretical, technical and application layers as it develops; this staged progression is inevitable.


Because of this, we cannot judge by surface appearances alone. Today’s hot chip makers, such as Nvidia, may simply be squeezing as much profit as possible out of existing technology before it is replaced entirely. Take Intel as an example: as a leader in the semiconductor industry, it keeps transferring and selling process technology and patents. This is not because Intel is foolish, but because it recognizes that semiconductor technology has hit a bottleneck and that it must turn to new technology fields.
Transformation
In this transformation, solving the power and heat dissipation problems becomes especially important. Existing chips generate enormous heat under high-performance computing loads, so large amounts of electricity are needed to cool them, which both wastes resources and caps further gains in chip performance. Any future “chip terminator” must therefore solve these problems, cutting power consumption and improving heat dissipation through new materials, structures or techniques.

In short, although the terminator of chips has not yet appeared, the direction of technological development seems inevitable. Just as computers once went from bulky and heavy to small and efficient, AI will undergo a similar transformation, and the process may be even faster. We therefore need to watch scientific and technological developments closely and prepare for the innovations to come.
How should you invest?
First of all, I want to tell you that with the arrival of the fourth industrial revolution, led by artificial intelligence, big data, the Internet of Things and cloud computing, an explosion of technological progress will bring an explosion of wealth. Everything is accelerating, so we must choose the right direction when investing. Invest in the wrong direction and you will not make money; you may even suffer a major loss.

So how should we invest? Ai Financial’s investment strategy has always included a list of “don’ts”, and the fifth is “don’t invest in sunset industries”. So, based on today’s lecture, let me ask everyone: is the chip industry a sunrise industry or a sunset industry?
Through today’s lecture, I believe everyone now sees that artificial intelligence is a sunrise industry, while chips are a sunset industry. When investing, we should not put money into sunset industries, and the chip industry belongs to that category. Since the chip industry is not investable, Nvidia stock is not a suitable purchase.
Buying individual stocks is really buying probabilities, and the end result of speculation is usually not profit; it may well be a loss. The better approach is to buy public principal-guaranteed funds. Because such funds hold a basket of stocks, even if one stock fails to rise, gains in the others mean we do not lose money.
Moreover, over the past two years our funds have fully captured the rise of NVDA (Nvidia), which is the advantage of public funds. The best way to invest, therefore, is to use investment loans to buy public principal-guaranteed funds, so that you do not miss the rise in stocks while keeping a balanced asset allocation.