Tag: AI Chips

  • Engineers Develop a Thin Film That Makes AI Chips Faster and More Efficient

    University of Houston engineers developed a breakthrough thin-film material that boosts AI speed while cutting energy use.

    Published in ACS Nano, the study introduces a 2D dielectric thin film that replaces heat-generating chip components, reducing energy use and AI-related heat.

    “AI has driven our energy needs sky-high,” said Alamgir Karim, Dow Chair and Welch Foundation Professor in the William A. Brookshire Department of Chemical and Biomolecular Engineering at UH.

    Massive Cooling Systems Drive Up Power Use in AI Data Centers

    He noted that AI data centers rely on power-hungry cooling systems to keep thousands of servers fast and extend chip lifespans.

    To reduce power consumption while boosting performance, Karim and his former Ph.D. student, Maninderjeet Singh, used Nobel Prize–winning organic framework materials to create these dielectric films.

    “These next-generation materials could greatly boost AI and electronic device performance,” said Singh, who developed them at UH with Professor Shaffer and Ph.D. student Schroeder.

    Dielectric materials differ in how much electrical energy they store. High-permittivity (high-k) dielectrics hold more charge and therefore generate more heat, while low-k materials store less and stay cooler. Karim focused on lightweight, carbon-based low-k dielectrics that speed signal transmission and reduce delays.

    “Low-k materials insulate circuits, allowing faster, cooler, and more efficient chip performance,” Karim explained.
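    As a back-of-envelope illustration (standard interconnect physics, not a calculation from the study), the signal delay of an on-chip wire scales with the product of its resistance and capacitance, and the capacitance scales with the dielectric constant k of the surrounding insulator:

    \tau \approx RC, \qquad C = k \, \varepsilon_0 \, \frac{A}{d}

    so a lower-k film reduces both the RC delay and the energy dissipated per switching event, which scales as \tfrac{1}{2} C V^2.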

    New Carbon-Based Thin Films Engineered and Analyzed for Next-Generation Low-k Electronics

    The researchers made porous, sheet-like carbon films and, with student Saurabh Tiwary, analyzed their electronic properties for low-k devices.

    Karim and Singh found that integrating low-k 2D films into chips could greatly reduce AI data centers’ power use, thanks to their ultralow dielectric constant, high breakdown strength, and strong thermal stability.

    Shaffer and Schroeder used synthetic interfacial polymerization to form strong, layered crystalline films from molecules in two immiscible liquids. This approach was pioneered by 2025 Chemistry Nobel laureate Omar M. Yaghi of UC Berkeley and his collaborators.


    Read the original article on: Tech Xplore

    Read more: Samsung Introduces its First Special Edition Triple-Fold Smartphone

  • Qualcomm Set to Challenge Nvidia With Its Own Line of AI Chips

    On October 27, Qualcomm unveiled a new line of artificial intelligence chips aimed at challenging market leader Nvidia, as competition intensifies to profit from the rapid expansion of AI data centers.
    Image Credits: Pixabay

    If successful, Qualcomm — the San Diego-based technology giant — could secure a foothold in the AI data center market as customers seek alternatives to Nvidia, which currently dominates nearly 90% of the sector.

    Qualcomm Debuts AI200 and AI250 Chips for Data Centers

    Qualcomm plans to commercially release the first chip in the lineup, the AI200, in 2026, with the AI250 to follow in 2027. Following the announcement of its entry into the data center market, Qualcomm’s stock jumped 20%.

    The company intends to offer specialized AI server racks equipped with dozens of its chips for data center installations, while also selling standalone AI chips that enterprises can purchase and integrate into their existing servers.

    Demand for AI inference chips has surged alongside broader adoption and emerging applications, prompting major players like Amazon, Google, and Microsoft to develop their own in-house AI processors.

    McKinsey Sees $7 Trillion Data Center Boom as Qualcomm Diversifies

    McKinsey estimates that data centers will see nearly $7 trillion in capital expenditures by 2030.

    “Qualcomm’s push to expand beyond smartphones and enter this market is a logical move,” said Austin Lyons, analyst and founder of the semiconductor publication Chipstrat. “It’s a smart shift in direction—targeting data centers rather than just consumer products.”

    In September, OpenAI struck a $10 billion agreement with Broadcom to co-design custom AI chips and also invested in AMD, committing to purchase its MI450 AI processors.

    Qualcomm has also secured Humain, a Saudi Arabian AI firm backed by the country’s sovereign wealth fund, as the first customer for its new chip lineup. These chips are set to be deployed in Humain’s data centers in 2026.

    Humain Chooses Groq; G42 Builds UAE–US AI Campus

    Humain, which plans to launch a $10 billion venture fund, selected another California-based chipmaker, Groq, in May to provide inference chips for its data centers.

    Meanwhile, G42, an Abu Dhabi–backed AI holding company with a stake in U.S. chipmaker Cerebras Systems, will develop the 5-gigawatt UAE–US AI campus announced during President Trump’s May visit.

    The Gulf nations have become increasingly influential in the AI landscape, following Trump’s White House rollback of Biden-era chip export controls and the negotiation of multi-billion-dollar U.S. chip supply deals to support their growing AI ambitions.


    Read the original article on: Tech Xplore

    Read more: Swiss Smart Socks Help Diabetics Regain Sensation in Their Feet

  • Malaysia Will Require Official Approval to Trade AI Chips Made in the United States

    Malaysia is stepping up its efforts to support the U.S. in restricting the flow of advanced AI chips to China.

    On Monday, the Malaysian Ministry of Investment, Trade and Industry introduced new rules limiting the export of U.S.-origin AI chips. Under the new policy, individuals and companies must provide Malaysian authorities with at least 30 days’ notice before exporting or transshipping these chips. The measures take effect immediately.
    Image Credits:BlackJack3D / Getty Images

    “Malaysia remains committed to preventing any efforts to bypass export controls or participate in illegal trade,” the Ministry stated in a press release, adding that individuals or companies found violating the Strategic Trade Act 2010 or related regulations will face severe legal consequences.

    Reports of alleged smuggling of U.S. AI chips into China have surfaced repeatedly in recent months.

    Allegations of Extreme Smuggling Tactics to Bypass Chip Export Controls

    In an April blog post, Anthropic alleged that China had already established advanced smuggling operations. The post described extreme methods used to transport the chips, including concealing them inside prosthetic baby bumps and hiding GPUs in shipments of live lobsters.

    In the post, Anthropic argued for stricter U.S. export controls on AI chips to curb smuggling; the U.S. government is expected to introduce additional restrictions soon.

    According to a Bloomberg report last week, the Trump administration is considering tightening export rules on AI chips, particularly those from companies like Nvidia, for countries such as Malaysia and Thailand, aiming to block China from obtaining the chips through alternative routes. However, the government has not officially announced anything yet.

    Meanwhile, the U.S. Department of Commerce is developing its own set of broader AI chip export regulations, following the formal withdrawal of the Biden administration’s AI Diffusion rules in May.


    Read the original article on: TechCrunch

    Read more: New Magnetic Phenomenon May Pave the Way for Ultrafast Memory Chips

  • NVIDIA and AMD Are Expected to Roll Out New AI Chips in China That Align With U.S. Export Limits

    To adhere to U.S. export restrictions on advanced semiconductor technology to China, NVIDIA and AMD are preparing to launch new GPUs tailored for AI applications in the Chinese market, according to Taiwanese tech outlet Digitimes, which cited supply chain insiders.
    Image Credits: Pixabay

    NVIDIA and AMD Prepare AI GPUs Tailored for China’s Market

    NVIDIA is reportedly preparing to launch a scaled-down version of its AI GPU, known by the code name “B20,” specifically designed to meet the export restrictions and cater to the Chinese market’s demand for AI processing power. This streamlined GPU delivers solid AI performance while complying with U.S. regulations, enabling NVIDIA to maintain its presence in China’s growing AI sector despite export limitations.

    Meanwhile, AMD targets the same market by introducing its new Radeon AI PRO R9700 workstation GPU, designed to efficiently handle AI workloads in professional environments. This GPU aims to provide robust performance for AI applications, particularly for Chinese enterprises and developers who require high-powered yet accessible AI computing solutions.

    NVIDIA and AMD Navigate Regulations to Tap Growing Market

    Both companies plan to start selling these specialized AI chips in China as early as July, signaling a strategic move to adapt their product lines to regulatory constraints while continuing to capitalize on the expanding AI technology market in China. This approach not only ensures compliance but also enables NVIDIA and AMD to support the AI ecosystem within China by offering products designed specifically for its unique regulatory and commercial environment.

    Additionally, Reuters recently reported that NVIDIA is developing a lower-cost AI chip based on its Blackwell architecture for the Chinese market, with an anticipated price range of $6,500 to $8,000—significantly cheaper than its H20 GPUs, which currently sell for $10,000 to $12,000.

    On Wednesday, NVIDIA reported a $4.5 billion charge in the first quarter due to licensing restrictions that hindered its ability to sell the H20 AI chip to Chinese firms. Additionally, the company was unable to ship another $2.5 billion worth of H20 chips during the quarter because of these limitations. Looking ahead, NVIDIA projected that these licensing requirements could reduce its second-quarter revenue by $8 billion.


    Read the original article on: TechCrunch

    Read more: AI Chip Designed for Decentralized Operation Without Relying on the Cloud