- AiNexaVerse News
Nvidia CEO Backs Trump’s Plan to Ease AI Chip Export Restrictions
Hello AI Lovers!
Today’s Topics Are:
- Nvidia CEO Backs Trump’s Plan to Ease AI Chip Export Restrictions
- AI’s Hidden Power Strain: The Energy Crisis Behind the Chatbots
Nvidia CEO Backs Trump’s Plan to Ease AI Chip Export Restrictions

Quick Summary:
Nvidia CEO Jensen Huang has praised President Donald Trump’s decision to scrap Biden-era AI chip export controls, calling them a failure that cost U.S. companies billions and accelerated China’s push for tech self-sufficiency.
Key Points:
Jensen Huang criticizes the Biden administration’s AI export curbs as “fundamentally flawed.”
U.S. restrictions reduced Nvidia’s China market share from 95% to 50%.
Trump proposes a new global licensing regime to replace tiered export bans.
China’s AI market could still be worth $50 billion, with intense local competition.
Nvidia is adapting its chip designs to comply with evolving regulations.
The Story:
Speaking at Computex in Taipei, Nvidia CEO Jensen Huang openly criticized U.S. AI chip export restrictions to China, calling them a strategic misstep. The Biden administration's three-tiered system blocked advanced chip sales to China entirely, aiming to slow its AI progress. However, Huang argued these rules failed, prompting Chinese firms like Huawei to develop local alternatives and pushing China to build an independent semiconductor supply chain.
Huang welcomed Trump’s plan to replace the system with a government-to-government global licensing framework, suggesting it better reflects global AI realities. Trump’s approach, Huang said, recognizes that U.S. companies are no longer the sole players in advanced computing.
Due to U.S. policies, Nvidia’s once-dominant China market share dropped dramatically. Huang estimated China’s AI market could be worth $50 billion in 2026, but acknowledged fierce competition from local players who are eager to keep foreign firms out.
Despite the export challenges, Nvidia is adapting. It’s redesigning its Blackwell chip to meet compliance standards by adjusting its memory speed and capacity. The company recently reported a $5.5 billion hit from restrictions on its H20 chip and expects around $15 billion in lost revenue overall.
Conclusion:
Nvidia’s leadership is urging a reset in U.S. AI trade policy, highlighting that blanket bans are driving innovation away from American firms. As AI becomes a global race, the call is growing for smarter, more cooperative regulation.
AI’s Hidden Power Strain: The Energy Crisis Behind the Chatbots

Quick Summary:
AI might seem lightweight when you send a text prompt or generate an image, but its energy cost is mounting rapidly. A new MIT Technology Review investigation reveals how underestimated and poorly tracked the AI industry’s energy footprint really is—and what that means for our climate and infrastructure.
Key Points:
AI queries seem minor but add up across billions of uses.
AI-driven data centers are rapidly increasing energy demand and emissions.
Inference (using AI) now consumes 80–90% of AI’s computing power.
Tech giants are investing billions in energy-hungry infrastructure.
Governments and utilities lack transparent data for energy planning.
The Story:
AI has quickly become integrated into our daily lives—from chatting with bots to generating media—but powering these systems requires massive energy resources. A new report by MIT Technology Review shows that the industry’s energy usage is much larger than it appears. Each AI query may use a small amount of energy, but with billions of interactions each day and AI being built into nearly every online service, the total impact is staggering.
Training models like OpenAI’s GPT-4 can use over 50 gigawatt-hours of electricity—enough to power San Francisco for days. Yet training is only the start. The real energy sink is inference: running the models for user queries. Inference now accounts for up to 90% of AI’s computing load. Data centers supporting these operations already consume 4.4% of all U.S. electricity, and projections show that by 2028, AI could consume as much power as 22% of U.S. households.
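The scale argument here comes down to simple multiplication: a tiny per-query cost, repeated billions of times a day, quickly rivals the one-off cost of training. A minimal sketch, using an illustrative 0.3 Wh per text query and one billion queries per day (both placeholder assumptions for the exercise, not figures from the report):

```python
# Back-of-envelope sketch: how small per-query costs add up.
# The per-query energy and daily query count below are illustrative
# assumptions, not measured values from the MIT Technology Review report.

WH_PER_TEXT_QUERY = 0.3   # assumed watt-hours per chatbot query
QUERIES_PER_DAY = 1e9     # assumed industry-wide daily query volume

daily_wh = WH_PER_TEXT_QUERY * QUERIES_PER_DAY
annual_gwh = daily_wh * 365 / 1e9  # watt-hours -> gigawatt-hours per year

TRAINING_GWH = 50  # rough GPT-4-scale training estimate cited above

print(f"Annual inference energy: {annual_gwh:.1f} GWh")
print(f"Ratio to one training run: {annual_gwh / TRAINING_GWH:.1f}x")
```

Even under these assumptions, a year of inference outweighs a GPT-4-scale training run within months, which is why the report attributes up to 90% of AI’s computing load to inference rather than training.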
Conclusion:
AI’s energy toll is accelerating with little oversight or transparency. As Big Tech races to expand AI infrastructure, utility companies and governments are being left in the dark. Without better planning, we risk an energy future shaped by AI—one that’s costly, carbon-heavy, and dangerously unaccountable.
That was it for this week’s news. We hope it was informative and insightful as always!
We will be starting something special within a few months.
We will tell you more soon!
But for now, please refer us to other people who would enjoy our content!
This will help us out big time!
Did You Like The News?