Indian AI compute startup Turiyam AI has entered into a strategic partnership with NTT Global Data Centers to deploy and scale its AI inference servers within NTT’s data center facilities in India.
The collaboration is aimed at supporting enterprises that are moving from training artificial intelligence models to deploying them in production environments, where real-time inference performance, energy efficiency and security become critical.
Under the agreement, Turiyam AI will host its specialized inference hardware in NTT’s data centers, enabling customers to access low-latency AI compute infrastructure within a secure and scalable environment. The company said its server architecture is optimized specifically for inference workloads — the stage at which AI models process live data and generate outputs — rather than for large-scale model training.
As India’s AI ecosystem expands, demand for localized and sovereign compute infrastructure has increased, particularly among enterprises seeking to deploy AI in regulated or latency-sensitive use cases. Hosting inference infrastructure within domestic data centers is seen as a way to address data residency requirements and performance constraints.
NTT Global Data Centers, part of NTT Group, operates a global portfolio of data center facilities and has been expanding its footprint in India to support hyperscale and enterprise customers.
Turiyam AI said the deployment will focus on delivering high throughput with lower power consumption compared with general-purpose compute systems. The partnership also aligns with NTT’s renewable energy and sustainability initiatives, as enterprises increasingly weigh the environmental impact of AI operations.
Alok Bajpai, Managing Director for India at NTT Global Data Centers, said the collaboration would support real-time AI deployment needs across industries. Turiyam AI founder and CEO Sanchayan Sinha said hosting its infrastructure in enterprise-grade facilities would ensure uptime, performance and reliability for mission-critical workloads.
The partnership reflects a broader shift in the AI infrastructure market, where companies are building dedicated inference capacity as organizations scale AI applications beyond experimentation into full-scale deployment.