
Memory chip giants flock to a new track

Date: 2022-11-10 10:57:50

When "storage" and "AI" come up together, many people will say that storage matters to AI: AI's development rests on massive amounts of data, which places extremely high demands on data processing and calls for more memory to hold more data. It is true that high-performance storage lets AI technology run at full power. But AI matters to storage just as much. AI keeps pushing storage forward, and the reason comes down to processing-in-memory (PIM).


PIM is a new computing architecture that breaks with the traditional von Neumann architecture. By organically combining storage and computation and performing calculations directly in the storage cells, it largely eliminates the overhead of data movement, easing the "memory wall" and "power wall" problems that traditional chips hit when running AI algorithms. This can improve the efficiency of AI computing by tens or even hundreds of times while lowering cost.
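To make the arithmetic behind that claim concrete, here is a toy energy model, a minimal sketch in which the per-byte and per-operation costs are illustrative assumptions, not measured figures for any real chip:

```python
def von_neumann_energy(n_bytes, n_ops,
                       bus_pj_per_byte=20.0,  # off-chip data movement (assumed)
                       op_pj=1.0):            # one arithmetic op (assumed)
    """Energy when every operand must cross the memory bus."""
    return n_bytes * bus_pj_per_byte + n_ops * op_pj

def pim_energy(n_bytes, n_ops,
               local_pj_per_byte=1.0,         # in-array access (assumed)
               op_pj=1.5):                    # PIM logic, slightly less efficient
    """Energy when computation happens next to the storage cells."""
    return n_bytes * local_pj_per_byte + n_ops * op_pj

# A memory-bound kernel: stream 1 MB, one op per byte. Data movement
# dominates the von Neumann case, so eliminating it is the big win.
n = 1_000_000
ratio = von_neumann_energy(n, n) / pim_energy(n, n)
print(f"energy ratio: {ratio:.1f}x")  # prints: energy ratio: 8.4x
```

The point of the sketch is that for memory-bound workloads the ratio is set almost entirely by the data-movement term, which is exactly the term PIM removes.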


Although the basic concept of in-memory computing was proposed as early as the 1970s, it has only become a focus of attention in recent years: the explosion of computing power and data has made the memory wall increasingly acute, and further gains in computing power depend on solving it. In-memory computing is one of the most direct routes to high energy efficiency, low power consumption, and low cost.


Back in 2019, Micron CEO Sanjay Mehrotra pointed out that the long-established computing architecture was not suited to future trends, and that in the long run it would be best for computation to happen in memory. Another Micron technology executive at the time was likewise convinced that converging computing and memory was necessary to improve performance efficiency and reduce latency, and was willing to put in the effort to get there. Many industry insiders believe that in the future memory may be not just a storage device but also an accelerator, or take on other functions such as stronger ECC.


So, facing a future of exploding data volumes, how can vendors meet the memory bandwidth challenge and deliver better PIM to relieve the workloads behind AI drivers such as HPC, training, and inference? As the technology evolves, more and more storage vendors are joining the AI race...


Rushing into AI: invest first


Investment and acquisition are the quickest ways to get hold of a new technology. Storage vendors kicked off their AI investment boom around 2018, which was also a wild year for AI: Google Duplex answered phone calls in place of humans, the EU released its draft AI ethics guidelines, OpenAI's 5v5 DOTA team "OpenAI Five" beat humans again, and the world's first "AI synthetic news anchor" made its debut... All of this made people feel that the AI era, which had existed only in fiction, was really arriving.


In June 2018, storage leader Samsung announced a new fund focused on AI technology and startups, the "Samsung NEXT Q Fund", which would provide seed and Series A funding for startups solving AI problems and using AI to solve computer science problems. The announcement specifically named areas including imitation learning, scene understanding, learned problem solving, and human-computer interaction.


By August 2018, Samsung Group had announced that it would invest more than $22 billion in AI, auto parts, and other fields over the following three years, with most of the investment carried by Samsung Electronics. In August 2021, Samsung again announced that it would invest 240 trillion won (about $205.5 billion) in biopharmaceuticals, artificial intelligence, semiconductors, robotics, and other fields over the next three years. From $22 billion in 2018 to $205.5 billion in 2021, roughly a tenfold increase: although AI is not the only area in which Samsung is ramping up investment, it is clearly a growth area the company has set its sights on.


Shortly after Samsung announced its new fund in 2018, Micron announced that it was investing $100 million in artificial intelligence and machine learning startups through Micron Ventures. Reports at the time noted that investing in startups would not only help accelerate the development of AI, but also indirectly drive demand for next-generation memory such as DRAM, NAND, and 3D XPoint.


Perhaps investment alone was no longer enough, or perhaps Micron had further recognized the importance of in-memory computing: in 2019, Micron Technology outright acquired FWDNXT, an AI hardware and software startup, which made a big splash in the industry at the time. Micron believed that FWDNXT's technology, used together with Micron's memory chips, would equip the company to explore the deep learning AI solutions needed for data analytics, particularly in IoT and edge computing.


For its part, Micron said the FWDNXT acquisition would bring not more competition with Intel, Nvidia, and others, but on the contrary more cooperation. In Micron's view, no one can out-compete those players in the data center; a storage vendor wanting a piece of the pie should instead provide them more help, so research in edge computing is where Micron will find the most efficiency and economies of scale.


The data center is a shared supporting technology for the Internet, cloud computing, artificial intelligence, and more. CAICT's Data Center White Paper 2022 reports that the global data center market exceeded $67.9 billion in 2021, with revenue expected to reach $74.6 billion in 2022. Storage and AI are both indispensable parts of the data center. On the one hand, data in any data center ultimately lands on storage devices; on the other, AI can help data centers improve energy efficiency and thus cut costs, and can optimize operations and maintenance, using predictive analytics to help distribute workloads. When these "left and right arms" of the data center come together, the results should be powerful, and this may well have been part of Micron's thinking in acquiring FWDNXT.


In 2019, SK Hynix also joined the AI investment war, and aggressively so. The year began with SK Hynix's investment in Horizon; in September 2020, SK Hynix announced an investment in Gauss Labs, a company aiming to lead semiconductor manufacturing innovation through industrial AI solutions; and in January 2022, SK Hynix, together with SK Telecom and SK Square, announced a joint effort to establish an AI semiconductor company, SAPEON, in the U.S., with SK Hynix holding 25% of the shares, further expanding SK Hynix's business in NAND flash and AI.


Technology achievements built on spending power


On the investment front, the storage majors have spared no expense on AI, and the heavy spending has delivered: one after another, they are unveiling new technologies.


Samsung Electronics moved first, developing HBM-PIM (also known as Aquabolt-XL) in February 2021, a high-bandwidth memory that combines memory and AI processing to boost large-scale workloads in data centers, high-performance computing (HPC) systems, and AI-enabled mobile applications. HBM-PIM reportedly brings processing power directly to where data is stored by placing a DRAM-optimized AI engine inside each memory bank, enabling parallel processing and minimizing data movement. According to Samsung, when applied to its existing HBM2 Aquabolt solution, the new architecture delivers more than twice the system performance while significantly cutting energy consumption.
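To illustrate the bank-level idea in the abstract (a hypothetical sketch, not Samsung's actual design; the bank count and vector sizes are invented), consider a dot product whose weights are partitioned across banks, so that each bank's local engine reduces its own slice and only small partial sums, rather than raw operands, cross the memory bus:

```python
import random

N_BANKS = 16
FEATURES = 1024            # must divide evenly across banks in this sketch
CHUNK = FEATURES // N_BANKS

random.seed(0)
weights = [random.uniform(-1.0, 1.0) for _ in range(FEATURES)]  # stored in memory
x = [random.uniform(-1.0, 1.0) for _ in range(FEATURES)]        # broadcast input

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

# Each bank's local engine reduces its own slice of the weights, so only
# N_BANKS scalars travel to the host instead of all FEATURES weight values.
partials = [dot(weights[b * CHUNK:(b + 1) * CHUNK],
                x[b * CHUNK:(b + 1) * CHUNK]) for b in range(N_BANKS)]
result = sum(partials)

values_moved_pim = N_BANKS          # one partial sum per bank
values_moved_classic = FEATURES     # every weight fetched by the host
print(f"bus traffic cut {values_moved_classic // values_moved_pim}x")
```

The final sum over partials matches the ordinary dot product exactly; the banks differ only in where the multiply-accumulate happens, which is the essence of the PIM trade.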


Nam Sung Kim, senior vice president of DRAM products and technology at Samsung Electronics, said that as standardization of the technology progresses, applications will expand further to HBM3 for next-generation supercomputers and AI, to mobile memory for on-device AI, and to memory modules for data centers. The latest news is that Samsung has completed the software standardization required to run its latest memory solution, HBM-PIM, and plans to launch it this month.


In February, SK Hynix announced GDDR6-AiM, a next-generation memory chip using PIM, and said it would launch a product combining GDDR6-AiM with AI chips in partnership with SAPEON, the aforementioned AI chip company founded by SK Hynix, SK Telecom, and SK Square in the U.S. SK Hynix claims that pairing GDDR6-AiM with a CPU or GPU, rather than using a typical DRAM chip, can increase computation speed by up to 16 times, making it suitable for machine learning, high-performance computing, and big data computing and storage.



GDDR6-AiM chip with SK Hynix PIM technology

Source: kedglobal


Following the FWDNXT acquisition, Micron introduced a powerful new set of high-performance hardware and software tools for deep learning: a comprehensive AI development platform that integrates compute, memory, tools, and software, providing a building block for exploring memory-optimized solutions for AI workloads through an easy-to-use software stack with broad model support and deployment flexibility. Micron's Deep Learning Accelerator (DLA) technology is said to support a wide range of machine learning frameworks and neural networks, quickly processing massive amounts of data through simple interfaces, underpinned by the FWDNXT AI inference engine, which couples memory and compute more tightly for higher performance and lower power consumption.


Unlike the three storage majors above, Kioxia is focusing on combining SSDs with AI, developing memory-centric AI technology. Kioxia recently built an image classification system based on Memory-Centric AI, an artificial intelligence approach that exploits large-capacity storage and uses neural networks to classify images. Going forward, Kioxia plans to extend Memory-Centric AI from image classification to other fields and to promote research and development of AI technologies that use mass storage.



Image classification using high-capacity storage

Source: businesswire


For its part, Kioxia points out that while conventional AI trains models on large amounts of data, Memory-Centric AI performs tasks by searching and referring to the knowledge it has accumulated, so the more new knowledge and memories it stores, the more capable the AI becomes. More importantly, this approach lets the AI keep growing indefinitely while cutting both the amount of computation required and power consumption.


The key question for this technology is where the accumulated data lives, and that is where flash memory comes in: it can store information with very little power. But it also puts pressure on flash speed, since access that is too slow would hamper the AI's decision-making. With access speeds rising and both hardware and AI technology maturing, Kioxia believes now is the right time to propose memory-centric AI.
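The search-and-refer behavior described above can be sketched as a simple retrieval-based classifier. This is a minimal illustration of the general idea, not Kioxia's implementation; the `MemoryClassifier` class and the toy data are invented for the example. "Learning" is just writing labeled vectors to storage, and classification is a nearest-neighbor lookup, so the system improves by remembering more rather than by retraining:

```python
import math

class MemoryClassifier:
    """Classify by looking up the nearest stored example (1-NN search)."""

    def __init__(self):
        self.memory = []                      # (feature_vector, label) pairs

    def remember(self, features, label):
        # "Training" is just writing to storage; no weights are updated.
        self.memory.append((features, label))

    def classify(self, features):
        # Search accumulated knowledge for the closest match.
        _, label = min(self.memory,
                       key=lambda item: math.dist(item[0], features))
        return label

clf = MemoryClassifier()
clf.remember([0.0, 0.0], "cat")
clf.remember([1.0, 1.0], "dog")
print(clf.classify([0.9, 0.8]))   # nearest stored example is "dog"

# Accumulating a new memory changes behavior without any retraining.
clf.remember([0.9, 0.8], "fox")
print(clf.classify([0.9, 0.8]))   # now "fox"
```

Note how the compute per query is a search over stored data, which is why the approach leans on large, cheap, low-power storage rather than on arithmetic throughput.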


Final thoughts


AI has been a "jack of all trades" since the concept was introduced: 5G needs AI, the metaverse needs AI, autonomous driving needs AI, even EDA needs AI. Now AI has worked its way into memory chips, and the major storage vendors are showing off their full arsenals. Who will come out on top, we shall see.

