
The Cloud Beyond The Clouds: Startups Firm Up Orbital AI Data Centre Plans

As AI adoption accelerates, the bottleneck is no longer model innovation alone — it is infrastructure. Building terrestrial data centres takes up to 24 months, requiring land acquisition, power substations, cooling systems, fibre connectivity and regulatory approvals. 

This is the context in which cloud services firm NeevCloud and spacetech startup Agnikul are proposing an orbital AI data centre architecture, aka data centres in space. It’s an idea that was also floated by Elon Musk’s SpaceX, though it has drawn plenty of criticism, and even ridicule, for being unfeasible.

But, according to both NeevCloud and Agnikul, the idea is not as far-fetched as it sounds. The two companies signed a memorandum of understanding (MoU) in February to deploy inference infrastructure in low Earth orbit (LEO), roughly 300 kilometres above the planet.

Even in the larger scheme of spacetech, this is one of those moonshots that could have global ramifications. So, what is all the hype about?

Why Orbital Data Centres?

The global data centre market is entering a phase of accelerated expansion. It is projected to grow at a 14% CAGR through 2030, reaching nearly 200 GW of installed capacity, driven largely by AI and cloud demand, according to a report by JLL.

The US leads with close to 40% of global capacity. India, meanwhile, is emerging as one of the fastest-growing markets. As per KPMG’s December 2025 findings, India’s data centre capacity is expected to cross 2 GW by 2026, up from just over 1 GW currently.

Yet, as data centres become foundational to the modern digital economy, their resource intensity is drawing scrutiny. Hyperscale facilities can span 200 to more than 1,000 acres and consume upwards of 100 MW of power. AI-focused campuses could demand as much as 1 GW. 

Today, data centres account for roughly 1.5%–3% of global electricity use, a share set to rise sharply as AI demand scales. Even mid-sized facilities often require 10-50 acres, while large campuses can span 1,000 acres.

Faced with mounting land and energy pressures, a handful of companies are exploring this unconventional alternative — orbital data centres. Designed to operate in LEO, these space-based facilities aim to process satellite data and run AI workloads using solar power and the natural vacuum of space for thermal management, potentially reducing terrestrial land and grid dependence.

Globally, companies such as SpaceX, Axiom Space, and Starcloud have outlined early-stage plans to experiment with orbital computing platforms by the 2030s. 

Closer to home, the recently announced partnership aims at combining NeevCloud’s deep tech expertise and sovereign AI architecture with Agnikul’s orbital launch capabilities. The initiative will place high-performance AI inference nodes directly into LEO, potentially enabling secure, low-latency intelligence for critical industries worldwide.

“Inference is 10X the demand once training is done,” says NeevCloud CEO Narendra Sen, drawing a clear line around the purpose of the planned orbital data centre: the infrastructure will be used for inference alone, not AI training. To be sure, AI inference is the operational phase where a trained model applies learned patterns to new, unseen data to make decisions in real time.

The argument is that inference workloads, especially for defence systems, unmanned vehicles, border surveillance and remote healthcare, require ultra-low latency and high resilience. Orbital nodes, the company claims, could deliver sub-10 millisecond response times by reducing the physical distance between compute and end users. This is in contrast with the general round-trip latency of 100-200 milliseconds for terrestrial data centres. 
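On propagation alone, the sub-10 millisecond claim is at least physically plausible. A minimal back-of-envelope sketch, assuming the 300 km altitude mentioned in the MoU and speed-of-light signal travel (real links add slant range, routing and processing overhead, which this ignores):

```python
# Back-of-envelope: round-trip signal time to a node in low Earth orbit,
# assuming a 300 km altitude and vacuum speed-of-light propagation.
C = 299_792.458  # speed of light in km/s

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation time in milliseconds over a one-way distance."""
    return 2 * distance_km / C * 1000

leo_overhead = round_trip_ms(300)    # node directly overhead: ~2 ms
distant_dc = round_trip_ms(7_500)    # e.g. a data centre a continent away

print(f"LEO node overhead: {leo_overhead:.1f} ms round trip")
print(f"7,500 km terrestrial path: {distant_dc:.1f} ms round trip (propagation only)")
```

The 7,500 km figure is purely illustrative; in practice, terrestrial latency comes less from distance than from routing hops and queueing, which is precisely the overhead the orbital pitch claims to sidestep.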

Turning Rockets Into Infrastructure

The technological backbone of this proposal lies with Agnikul. Traditionally, a rocket’s upper stage carries a satellite into orbit, releases it, and is then left to decay and burn up in the atmosphere. Agnikul has developed a way to extend the life of this upper stage, converting it into a functional space platform.

“After ejecting the satellite, that system is usually debris,” explains Agnikul’s CEO Srinath Ravichandran. “We figured we could extend its life and host payloads like a data centre module.”

Instead of launching a fully independent satellite, NeevCloud’s compute modules would ride on this extended upper-stage platform. The system would generate solar power in orbit and integrate AI chips directly within the satellite architecture. 

“You cannot deploy traditional vertical server racks in space. Instead, the chips are integrated directly into the satellite architecture. Initial deployments are expected to consume around 10-15 kilowatts of power, with the potential to scale to 50 kilowatts, 200 kilowatts, or higher in future iterations,” Sen says.

Thermal management and radiation shielding are central engineering challenges. While space offers naturally low ambient temperatures, heat dissipation in a vacuum requires careful design. Radiation-hardened chips, likely custom ASIC-based inference processors rather than general-purpose GPUs, would be used to ensure durability over a projected four- to five-year operational lifespan, adds NeevCloud’s Sen. 

The broader implication is strategic. For rocket companies like Agnikul, orbital data centres represent a new customer class beyond Earth observation and communications satellites. 

“We are addressing a new group of customers who want to put infrastructure in space,” Ravichandran notes, adding that the Chennai-headquartered startup has already received preliminary queries from other players.

Notably, Bengaluru-based spacetech company GalaxEye is likely to launch its AI-powered ‘Mission Drishti’ satellite by March-end this year, with the company exploring the feasibility of setting up an orbital data centre using an NVIDIA Jetson Orin GPU.

Cofounder and CEO Suyash Singh tells us that Mission Drishti has been in development for nearly five years. It is an Earth observation satellite scheduled for launch within the next two to three months. At its core, the mission focuses on imaging capabilities — integrating cameras, radar, avionics and onboard control systems.

“In parallel, a computational unit has been integrated directly onto the satellite alongside its core electronics. This onboard system serves as a test bed to assess the feasibility of deploying compute infrastructure in orbit. The team plans to run select AI workloads in space to evaluate energy use and processing constraints. It will also help study overall system performance in an orbital environment. These insights will guide the development of a future constellation designed as a full-scale orbital data centre,” he added. Singh, however, did not share a fixed timeline for a focused orbital data centre mission.

Latency, Cost And Constellation Economics

NeevCloud’s proposal envisions a constellation of up to 600 satellites over three years. Redundancy would be built into the network: if one node fails, workloads shift to another. Maintenance, however, will not involve repairs in orbit. Instead, failed modules would be de-orbited and replaced, a model closer to hardware refresh cycles than field servicing. 

“On-orbit maintenance is neither practical nor cost-effective. Repairing a satellite in space would likely cost more than launching a replacement. Instead, the plan is to de-orbit any failed or outdated module in LEO and replace it with a new one. The satellite will be intentionally withdrawn and safely de-orbited at the end of its lifecycle,” said Sen.

Cost remains an open question. Initial deployments may be 10–20% more expensive than terrestrial data centres. However, NeevCloud’s Sen argues that for critical workloads, such as defence or confidential inference environments, customers prioritise latency, security and autonomy over marginal cost differences.

He added that the pilot mission is expected to cost up to ₹10 Cr, which will be financed by NeevCloud’s parent company, Rackbank Datacenters, an Indore-headquartered data centre firm cofounded by Sen in 2013.

The company raised $16.5 Mn in a seed funding round in March 2025. NeevCloud, on the other hand, was founded in 2024 and has a revenue of less than $1 Mn. Sen said that the company will be looking to raise funds this year to take its R&D ahead. 

Utopia Or Future Reality?

Not everyone is optimistic about putting data centre infrastructure in space. OpenAI cofounder and CEO Sam Altman, during his recent visit to India, called the concept of space data centres ‘ridiculous’.

“It will make sense some day. The very rough math of launch costs relative to the cost of power we can do on Earth, to say nothing of how you’re going to fix a broken GPU in space, and they do break a lot still unfortunately…we are not there yet,” he was quoted as saying by The Information. 

It may be noted that Altman had considered buying SpaceX competitor Stoke Space, tinkering with the idea of space data centres, according to a December 2025 Wall Street Journal article.

Launch economics remain the biggest hurdle for orbital data centres, opined Lt. Gen. AK Bhatt (Retd.), director general of the Indian Space Association (ISpA).

“A true data-centre-scale facility would require substantial mass, making it commercially unviable at current launch costs. Only if fully reusable rockets — as pursued by SpaceX — deliver dramatic cost reductions, potentially on the order of 100x, would large-scale orbital computing become economically realistic rather than niche,” he added. 
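The scale of that 100x argument can be sketched with purely hypothetical numbers. The payload mass and per-kilogram launch cost below are illustrative placeholders, not figures quoted by Bhatt or either company:

```python
# Illustrative only: how launch cost per kilogram shapes orbital data centre
# economics. All figures are hypothetical placeholders, not quoted costs.
def launch_cost_usd(payload_kg: float, cost_per_kg: float) -> float:
    """Total launch cost for a payload at a given cost per kilogram."""
    return payload_kg * cost_per_kg

PAYLOAD_KG = 2_000                        # assumed mass of one inference module
CURRENT_PER_KG = 5_000                    # rough order of magnitude for LEO today
REUSABLE_PER_KG = CURRENT_PER_KG / 100    # the "100x" reduction Bhatt describes

today = launch_cost_usd(PAYLOAD_KG, CURRENT_PER_KG)
future = launch_cost_usd(PAYLOAD_KG, REUSABLE_PER_KG)
print(f"Launch cost today:  ${today:,.0f}")
print(f"After a 100x drop:  ${future:,.0f}")
```

Under these assumed numbers, launch alone moves from the ten-million-dollar range per module to the hundred-thousand-dollar range, which is the difference between a niche demonstrator and infrastructure that could plausibly compete with terrestrial builds.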

He added that while space offers a naturally cold environment, its vacuum means there is no air or convection to carry heat away; waste heat can only be radiated, so active thermal systems are still essential, making the idea that cooling is easy in orbit only a half-truth.
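The Stefan-Boltzmann law gives a rough sense of what radiative-only cooling demands. A sketch, assuming the article’s initial 15 kW power figure, a radiator running at 350 K and an emissivity of 0.9 (real designs must also reject heat absorbed from the Sun and Earth, which this ignores):

```python
# Minimum radiator area to reject waste heat in vacuum, where radiation is
# the only heat path. Assumed: 15 kW payload, 350 K radiator, emissivity 0.9.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 * K^4)

def radiator_area_m2(power_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    """Minimum radiating area needed to reject power_w at a given temperature."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

area = radiator_area_m2(15_000, 350)
print(f"~{area:.1f} m^2 of radiator to reject 15 kW at 350 K")
```

Roughly 20 square metres of radiator for the smallest planned deployment illustrates why scaling to 200 kW or beyond is a genuine engineering problem rather than a free benefit of the cold of space.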

The post The Cloud Beyond The Clouds: Startups Firm Up Orbital AI Data Centre Plans appeared first on Inc42 Media.

