South Korea’s SK hynix Inc., the world’s top supplier of high-bandwidth memory chips powering Nvidia’s AI accelerators, unveiled plans this week to launch a U.S.-based AI solutions firm with a $10 billion commitment. Tentatively named AI Company or AI Co., the entity will emerge from restructuring its California subsidiary Solidigm, an enterprise solid-state drive maker born from a $9 billion acquisition of Intel’s NAND business in 2021. This move positions SK hynix to centralize SK Group’s AI strategies amid surging demand for memory in data centers.
The announcement, detailed in a PR Newswire release, comes as SK hynix reported record 2025 results: annual sales of 97.1 trillion won ($70.4 billion) and operating profit of 47.2 trillion won, surpassing Samsung Electronics for the first time among Korean listed firms. Fourth-quarter operating profit hit 19.1 trillion won on 32.8 trillion won in revenue, fueled by AI memory shortages, as noted by The Korea Herald.
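As a quick sanity check, the won-to-dollar conversion in the reported figures implies an exchange rate of roughly 1,380 won per dollar; the rate itself is inferred here, not stated in the release:

```python
# Full-year figures as reported in the article
sales_krw = 97.1e12   # 97.1 trillion won
sales_usd = 70.4e9    # $70.4 billion

# Implied KRW/USD exchange rate behind the conversion
implied_rate = sales_krw / sales_usd
print(round(implied_rate))  # 1379 won per dollar
```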
“The planned establishment of AI Co. is aimed at securing opportunities in the emerging AI era,” SK hynix stated, pledging to “proactively seize opportunities in the upcoming AI era and deliver exceptional value to its partners in AI.” AI Co. will invest in U.S. innovators, forging synergies with SK affiliates like SK Telecom and SK Square, while leveraging HBM leadership—chips essential for overcoming AI data bottlenecks.
Solidigm’s Pivot to AI Frontier
Under the restructuring, the Solidigm entity will be renamed AI Co., with its SSD operations transferred to a newly created Solidigm Inc. to preserve the brand, per the CNBC report on the announcement. The change transforms a storage-focused unit into an AI hub managing roughly 10 trillion won ($6.92 billion) in overseas AI assets, including a stake in Bill Gates-backed TerraPower, a small modular reactor firm vital for AI power needs, according to BusinessKorea and Reuters.
Earlier media speculation in Maeil Business prompted a regulatory filing in which SK hynix confirmed it was reviewing AI investment options. The firm aims to become a “key partner in the AI data center ecosystem,” accelerating global AI through U.S.-Korea ties. No firm timeline was set, but the official name will follow later in 2026.
SK hynix’s HBM dominance—over 50% market share through 2026, per Goldman Sachs—underpins this expansion. The company mass-produces HBM3E and HBM4, showcasing a 16-layer, 48GB HBM4 at CES 2026, where Justin Kim, President of AI Infra, emphasized customer collaborations for ecosystem value.
Record Profits Fuel Aggressive Bets
AI-driven gains doubled operating profit year over year, with HBM revenue more than doubling annually. SK hynix outpaced expectations, achieving a 58% Q4 operating margin rivaling TSMC’s. This windfall funds not just AI Co. but parallel investments: a $3.87 billion advanced packaging fab in Indiana for HBM production starting in 2028, and a 19 trillion won ($13 billion) P&T7 plant in Cheongju, Korea, operational by late 2027.
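The 58% margin follows directly from the fourth-quarter figures reported above, as a quick calculation shows:

```python
# Q4 figures reported earlier in the article, in trillions of won
q4_operating_profit = 19.1
q4_revenue = 32.8

# Operating margin = operating profit / revenue
margin = q4_operating_profit / q4_revenue
print(f"{margin:.1%}")  # 58.2%
```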
The Indiana site, in West Lafayette near Purdue University, targets next-generation HBM for the AI GPUs that train models such as ChatGPT. Cheongju’s M15X fab will begin ramping 1c DRAM for HBM4 next month, addressing “tremendous” AI demand, per CEO Sungsoo Ryu in comments to Reuters. These facilities form a triad with Icheon, bolstering supply resilience.
X posts echoed the buzz, with users noting U.S. investments amid Trump tariff pressures, linking to Reuters on South Korea’s concessions. Finaxus highlighted SK hynix’s ($SKM) semiconductor push via Yahoo Finance.
Geopolitical Chess in AI Supply Chains
U.S. investments align with Trump administration priorities, following threats of tariffs unless foreign chipmakers build domestically. President Trump signaled flexibility toward South Korea on Tuesday following tariff talks. AI Co. sidesteps domestic capital rules by focusing on foreign assets like TerraPower, which has been revalued amid surging AI data center power demand.
Competitors are scrambling: Samsung plans to expand HBM capacity by 50% in 2026, and Micron is eyeing a New York megafab. SK hynix’s strategy integrates memory with ecosystem plays, from Nvidia partnerships—including an SK Group “AI factory” using CUDA-X for HBM development—to server modules like SOCAMM2.
Morgan Stanley raised its 2026 earnings forecast by 56%, citing HBM pricing that stays tight into 2026 on Chinese demand for Nvidia H200s. Bank of America dubs it a “supercycle,” naming SK hynix its top pick with DRAM revenue up 51%.
Broader Ecosystem Synergies Emerge
AI Co. will deploy the $10 billion via capital calls, scouting U.S. firms for partnerships that enhance SK’s portfolio. This includes power infrastructure via TerraPower (a $250 million SK stake since 2022) and telecom via SKT. Big tech’s global AI race demands high-end memory, positioning SK hynix at its center.
At CES 2026, SK hynix demoed AI system zones visualizing custom HBM, alongside LPDDR6 and CuD prototypes. The company eyes server DDR5 and eSSD growth, with NAND sales rebounding on AI storage.
X chatter from @DJone01 and @tenet_research tied the unit to Nvidia supply chains, underscoring the roughly $6.9 billion in assets it will manage through the long-term AI boom.
Charting AI Memory Dominance
Analysts project HBM3E will account for two-thirds of 2026 shipments, with HBM4 ramping via M15X. UBS notes SK hynix as the first HBM3E supplier for Google’s TPUs. BofA forecasts average selling price increases of 33% for DRAM and 26% for NAND.
SK hynix’s U.S. foothold via AI Co. and Indiana fab challenges TSMC’s packaging lead, offering full-stack HBM solutions. This $10 billion bet, atop domestic mega-investments, cements its pivot from memory vendor to AI enabler, as global demand outstrips supply through 2028.


WebProNews is an iEntry Publication