Why the AI Autocrats Must Be Challenged to Do Better

If we’ve learned anything from the Age of AI, it’s that the industry is grappling with significant power challenges. These challenges are both literal, as in meeting the voracious energy demands of AI data centers, and figurative, as in the concentration of AI wealth in a few hands driven by narrow commercial interests rather than broader societal benefit.

The AI Power Paradox: High Costs, Concentrated Control

For AI to be successful and benefit humanity, it must become ubiquitous. To become ubiquitous, it must be both economically and environmentally sustainable. That’s not the path we’re headed down now. The obsessive battle for bigger and faster AI is driven more by short-term performance gains and market dominance than by what’s best for sustainable and affordable AI.

The race to build ever-more-powerful AI systems is accelerating, but it comes at a steep environmental cost. Cutting-edge AI chips, like Nvidia’s H100 (up to 700 watts), already consume significant amounts of energy. This trend is expected to continue, with industry insiders predicting that Nvidia’s next-generation Blackwell architecture could push power consumption per chip well into the kilowatt range, potentially exceeding 1,200 watts. With industry leaders anticipating millions of these chips being deployed in data centers worldwide, the energy demands of AI are poised to skyrocket.

The Environmental Cost of the AI Arms Race

Let’s put that in an everyday context. At 1,200 watts, a single next-generation chip draws about 1.2 kW, roughly the average continuous power draw of an entire home. A single 120 kW Nvidia rack (essentially 100 of those power-hungry chips) therefore needs enough electricity to power roughly 100 homes, a medium-sized neighborhood. Now imagine hundreds or thousands of those racks packed into a large data center.
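
For readers who want to check that arithmetic, here is a back-of-the-envelope sketch in Python. The per-chip and per-rack figures come from the projections above; the roughly 1.2 kW average household draw is an assumption based on typical U.S. residential consumption, not an exact industry number.

```python
# Back-of-the-envelope arithmetic for rack-level AI power draw.
# Assumption: an average home draws ~1.2 kW continuously (about 10,500 kWh/year).

watts_per_chip = 1_200        # projected next-generation accelerator, in watts
chips_per_rack = 100          # a fully populated 120 kW rack
avg_home_draw_kw = 1.2        # assumed average continuous draw of one home

rack_power_kw = watts_per_chip * chips_per_rack / 1_000
homes_equivalent = rack_power_kw / avg_home_draw_kw

print(f"One rack draws about {rack_power_kw:.0f} kW")             # ~120 kW
print(f"Roughly the power used by {homes_equivalent:.0f} homes")  # ~100 homes
```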

This trajectory is concerning, given the energy constraints many communities face. Data center experts predict that the United States will need 18 to 30 gigawatts of new capacity over the next five to seven years, which has companies scrambling to find ways to handle that surge. Meanwhile, my industry just keeps creating more power-hungry generative AI applications that consume energy far beyond what’s theoretically necessary for the application or what’s feasible for most businesses, let alone desirable for the planet.

Balancing Security and Accessibility: Hybrid Data Center Solutions

This AI autocracy and “arms race,” obsessed with raw speed and power, ignores the practical needs of real-world data centers – namely, the kind of affordable solutions that lower market barriers for the 75 percent of U.S. organizations that have not adopted AI. And let’s face it: as more AI regulation rolls out around privacy, security and environmental protection, more organizations will demand a hybrid data center approach, keeping their most precious, private and sensitive data safe in highly protected on-site environments, away from both AI systems and the cyberattacks of late. Whether it’s healthcare records, financial data, national defense secrets, or election integrity, the future of enterprise AI demands a balance between on-site security and cloud agility.

This is a significant systemic challenge, and one that requires hyper-collaboration over hyper-competition. By focusing overwhelmingly on GPUs and other AI accelerator chips, and on raw capability, speed and performance metrics, we are giving too little consideration to the affordable and sustainable infrastructure that governments and businesses need in order to adopt AI. It’s like building a spaceship with nowhere to launch it, or putting a Lamborghini on a country road.

Democratizing AI: Industry Collaboration

While it’s heartening that governments are starting to consider regulation – ensuring that AI benefits everyone, not just the elite – our industry needs more than government rules.

For example, the UK is leveraging AI to strengthen law enforcement, improving data sharing between agencies to support AI-driven crime prediction and prevention. The focus is on transparency, accountability and fairness in the use of AI for policing, ensuring public trust and adherence to human rights, with tools like facial recognition and predictive policing aiding crime detection and management.

In highly regulated industries like biotech and healthcare, notable collaborations include Johnson & Johnson MedTech and Nvidia working together to enhance AI for surgical procedures. Their collaboration aims to develop real-time, AI-driven analysis and decision-making capabilities in the operating room, leveraging Nvidia’s AI platforms for scalable, secure, and efficient deployment of AI applications in healthcare settings.

Meanwhile, in Germany, Merck has formed strategic alliances with Exscientia and BenevolentAI to advance AI-driven drug discovery. They are harnessing AI to accelerate the development of new drug candidates, particularly in oncology, neurology, and immunology. The goal is to improve the success rate and speed of drug development through AI’s powerful design and discovery capabilities.

The first step is to reduce the costs of deploying AI for businesses beyond Big Pharma and Big Tech, particularly in the AI inference phase: when businesses install and run a trained AI model like ChatGPT, Llama 3 or Claude in a real data center, day in and day out. Recent estimates suggest that the cost to develop the largest of these next-generation systems could be around $1 billion, with inference costs potentially 8 to 10 times higher.
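
To make the inference phase concrete, here is a minimal sketch of what serving a single query against an already-trained model looks like in practice, using the open-source Hugging Face Transformers library. The model name and prompt are illustrative assumptions (gated checkpoints may require license approval); the point is that every such query, repeated across a business’s daily workload, is what drives inference cost and power draw.

```python
# A minimal sketch of the AI inference phase: load an already-trained model
# once, then answer incoming queries with it. Model name and prompt are
# illustrative; any locally hosted open model could stand in.
from transformers import pipeline

# Loading the model is a one-time startup cost; keeping it resident in
# accelerator memory is what the data center hardware is sized for.
generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # assumed checkpoint name
)

# Each production query repeats this step, potentially millions of times a
# day, and every call draws real power from the rack serving it.
result = generator(
    "Summarize today's customer support tickets in one paragraph.",
    max_new_tokens=128,
)
print(result[0]["generated_text"])
```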

The soaring cost of implementing AI in daily production keeps many companies, the “have-nots,” from fully adopting it. A recent survey found that only one in four companies has successfully launched AI initiatives in the past 12 months, and that 42% of companies have yet to see a significant benefit from their generative AI initiatives.

To truly democratize AI and make it ubiquitous (meaning widespread business adoption), our AI industry must shift focus. Instead of a race for the biggest and fastest models and AI chips, we need more collaborative efforts to improve affordability, reduce power consumption, and open the AI market so its full, positive potential is shared more broadly. A systemic change like this would lift all boats, making AI more profitable for everyone while delivering tremendous consumer benefit.

There are promising signs that slashing the costs of AI is feasible, lowering the financial barrier to bolster large-scale national and global AI initiatives. My company, NeuReality, is collaborating with Qualcomm to achieve up to 90% cost reduction and 15 times better energy efficiency for AI applications across text, language, sound and images – the basic building blocks of AI. These are the models behind industry buzzwords like computer vision, conversational AI, speech recognition, natural language processing, generative AI and large language models. By collaborating with more software and service providers, we can keep customizing AI in practice to bring performance up and costs down.

In fact, we’ve managed to decrease the cost and power per AI query compared to traditional CPU-centric infrastructure upon which all AI accelerator chips, including Nvidia GPUs, rely today. Our NR1-S AI Inference Appliance began shipping over the summer with Qualcomm Cloud AI 100 Ultra accelerators paired with NR1 NAPUs. The result is an alternative NeuReality architecture that replaces the traditional CPU in AI data centers – the biggest bottleneck in AI data processing today. That evolutionary change is profound and highly necessary.

Beyond Hype: Building an Economically and Environmentally Sustainable AI Future

Let’s move beyond the AI hype and get serious about addressing our systemic challenges. The hard work lies ahead at the system level, requiring our entire AI industry to work with, not against, each other. By focusing on affordability, sustainability and accessibility, we can create an AI industry and a broader customer base that benefit society in bigger ways. That means offering sustainable infrastructure choices without AI wealth concentrated in the hands of a few, the so-called Big 7.

The future of AI depends on our collective efforts today. By prioritizing energy efficiency and accessibility, we can avert a future dominated by power-hungry AI infrastructure and an AI oligarchy focused on raw performance at the expense of widespread benefit. Simultaneously, we must address the unsustainable energy consumption that hinders AI’s potential to revolutionize public safety, healthcare, and customer service.

In doing so, we create a powerful AI investment and profitability cycle fueled by widespread innovation.

Who’s with us?