OpenAI, Samsung and SK to start Korea data centre build
- OpenAI, Samsung and SK begin Korea AI data centre construction in March.
- The build highlights rising demand for local AI infrastructure.
OpenAI, Samsung Electronics and SK Hynix are set to begin construction of new data centres in South Korea in March, government officials said this week, marking a major step in the expansion of advanced computing infrastructure on the Korean Peninsula.
The agreement, confirmed by Science Minister Bae Kyung-hoon at a parliamentary hearing in Seoul, comes as the three partners move ahead on a project that was first revealed last year. Under plans announced in October, the group intends to build two data centres with an initial combined capacity of about 20 megawatts, aimed at supporting artificial intelligence and related computing needs in the region.
This development shows how infrastructure for AI and cloud workloads is being constructed beyond traditional cloud hubs like the United States and Europe. South Korea has been positioning itself as a hub for advanced computing and semiconductor activity, and this venture signals how global AI players and local technology firms are collaborating to build the physical systems that power next-generation services.
OpenAI partnership drives Korea data centre build
Details shared this week at the parliamentary hearing confirmed earlier reports about joint ventures involving OpenAI, Samsung and SK Hynix, the country’s major memory-chip maker. The partnerships envision large-scale facilities that can meet the demands of AI models and data-intensive workloads — operations that require sustained computing power and reliable power and cooling infrastructure.
While many big cloud players focus on markets with established enterprise demand, this project ties local industry leaders with an AI developer that has rapidly grown its infrastructure footprint worldwide. For Samsung and SK Hynix, whose memory chips are used in many of the systems that support AI compute, the build could reinforce their role in the global supply chain.
In South Korea, data centre construction and operations have long been linked with the country’s strength in semiconductors and electronics manufacturing. Now, as demand for AI models and cloud workloads grows, local firms are also looking to fill gaps in regional capacity. That means not only building physical space for servers but ensuring there is enough power capacity, connectivity and cooling to support 24/7 operations.
What this means for Korea’s AI infrastructure
The planned centres are not simply storage facilities. They are meant to provide the backbone for computing that supports AI services, from training large models to serving real-time applications. Models that power tools like advanced language systems or analytics engines rely on specialised chips and networking that work best when supported by local infrastructure with low latency and high reliability.
An initial capacity of 20 megawatts is modest compared with the largest hyperscale facilities, which can exceed 100 megawatts, but it is still significant — especially for an AI-focused deployment where power is a major cost driver. It reflects South Korea’s measured, phased approach to building computing capacity while aligning it with industrial partners and long-term plans.
An official familiar with government policy told reporters that the move aims to support a broader tech ecosystem in Asia, where proximity to users and data sources can improve performance and compliance for enterprises and public-sector users. While the details about how the centres will be used have not been fully disclosed, local media reported that they will serve a mix of research, enterprise and cloud computing workloads.
Strategic impact of OpenAI’s Korea data centre build
For OpenAI, expanding physical infrastructure into South Korea is part of its global strategy to ensure computing capacity is available close to critical markets. The company’s cloud footprint has grown rapidly in recent years, with data centre plans in multiple regions meant to reduce latency and serve regional demand without depending entirely on third-party cloud providers. This approach resembles how major cloud firms build their networks of facilities, but tailored for AI applications where compute demands often outstrip traditional cloud use cases.
Local partnerships with Samsung and SK Hynix also reflect another trend: tie-ups between AI developers and hardware suppliers. Samsung is one of the world’s largest producers of memory and storage components used in AI infrastructure, while SK Hynix produces a range of DRAM and high-bandwidth memory (HBM) components that help support large AI workloads. By working together on physical facilities, these companies can align supply chains, construction plans and technical requirements more closely than if they were operating independently.
It also positions South Korea as a potential regional hub for compute infrastructure, as other economies seek to reduce reliance on distant cloud regions. Enterprises, especially those in regulated sectors such as finance, healthcare and manufacturing, often prefer locating workloads closer to home for compliance reasons or because they want more predictable performance. Projects like this one help create that option.
Challenges and outlook
Building data centres capable of supporting AI means confronting some familiar challenges: securing reliable power, managing heat dissipation, and balancing the cost of construction with the revenue potential of the services hosted there. While joint ventures can help distribute risk and investment burden, the partners must still coordinate on technical standards, operational models and long-term demand forecasting.
Another factor is workforce. Operating advanced data centres requires skilled technicians, engineers and network specialists. South Korea’s strong electronics and semiconductor base gives it an advantage in cultivating such talent, but competition for skilled workers remains tight globally.
Still, beginning construction as early as March signals a sense of urgency. Whether these facilities will be fully operational by late 2026 or into 2027 will depend on how quickly the partners can move from ground-breaking to deployment — a process that often takes many months in complex sites. But the timing suggests that South Korea and its partners see value in moving ahead without delay.
As AI adoption grows worldwide, having local infrastructure can be a differentiator for markets that want more control over where data and computing happen. For South Korea, hosting facilities tied to both local industry and global AI demand could prove to be a strong foundation for future services and enterprise cloud use cases.