Samsung Receives AMD’s Certification For MI300 A.I. Chips – Report

This is not investment advice. The author has no position in any of the stocks mentioned. Wccftech.com has a disclosure and ethics policy.

After a report yesterday claimed that Korean semiconductor manufacturer Samsung was looking to invest in new equipment and technologies for its memory semiconductors, research firm TrendForce now claims that the company will catch up with rivals SK Hynix and Micron on next-generation memory products destined for NVIDIA Corporation's chips.

According to TrendForce’s report, Samsung has secured its entry into NVIDIA’s supply chain for the H200 AI GPU and has also received certification from AMD for its HBM3 products. NVIDIA’s H200 will rely on Samsung’s HBM3e memory, while AMD’s MI300 artificial intelligence accelerator will grow the Korean firm’s share of the A.I. chip memory market.

Samsung Set To Compete With SK Hynix & Micron With Next Gen A.I. Memory Chips, Says New Report

According to TrendForce, Samsung, SK Hynix and Micron have all sent HBM3e samples to NVIDIA ahead of the H200’s launch later this year. NVIDIA’s current estimates place the H200’s availability in the next quarter. SK Hynix and Micron sent their first samples to NVIDIA in the third quarter of 2023, followed by Samsung a quarter later, and mass production of all these products will start this year.

Following sample submission, chip designers like NVIDIA validate the products to certify them for inclusion in a GPU. Today’s report shares that SK Hynix and Micron have received validation from NVIDIA, allowing them to proceed smoothly toward mass production of their HBM3e memory chips ahead of NVIDIA’s second-quarter release.

Samsung has yet to complete its HBM3e validation, and TrendForce believes that the firm will finish this step by the end of the current quarter. Its mass production timeline is later than those of Micron and SK Hynix, with Micron entering mass production first and SK Hynix following soon after.

TrendForce’s estimate of the current state of the HBM3e A.I. memory market places Samsung at an advantage for large-capacity products. Image: TrendForce

However, a key advantage for Samsung, one absent from the text of TrendForce’s report, can be inferred from its accompanying graphics. The research firm’s HBM3e roadmap for NVIDIA’s A.I. chips shows that while Samsung will be the last to enter mass production for the 24GB HBM3e memory modules used in NVIDIA’s H200 GPUs, it might be the first to enter mass production for the beefier 36GB products. For A.I. chips, more onboard memory capacity translates into faster performance, since the processor can keep more data close at hand instead of fetching it from slower system memory.

Additionally, while it might have missed out on the first wave of A.I. memory chip demand in the form of HBM3, TrendForce believes that Samsung has deepened its partnership with AMD. As opposed to NVIDIA, which limits itself to designing and selling GPUs and Arm-based chips, AMD enjoys the advantage of providing a complete A.I. product stack.

The firm’s star A.I. product is its MI300 accelerator, a central part of CEO Lisa Su’s multi-billion-dollar total addressable market (TAM) estimates for A.I. products. As per TrendForce, another key win for Samsung is its certification by AMD: the research firm believes that Samsung’s HBM3 memory has been certified for use in the MI300 accelerators, giving Samsung a foothold in a market that rival SK Hynix has dominated until now.
