Science

Scientists use quantum machine learning to create semiconductors

By user · July 29, 2025 · 4 min read

Microchips power almost every modern device, from mobile phones and laptops to refrigerators. But behind the scenes, making them is a complicated process. Now, researchers say they have found a way to harness quantum computing to make it simpler.

Australian scientists have developed a quantum machine learning technique, a blend of artificial intelligence (AI) and quantum computing principles, that could change the way microchips are created.

They outlined their findings in a new study published June 23 in the journal Advanced Science. In it, the researchers demonstrated for the first time how quantum machine learning algorithms can dramatically improve the challenging process of modeling electrical resistance in a chip.

Quantum machine learning is a hybrid approach that combines classical data with quantum computing methods. In classical computing, data is stored in bits encoded as 0 or 1. Quantum computers use qubits, which, thanks to principles like superposition and entanglement, can exist in multiple states at the same time.

This allows quantum computing systems to handle complex mathematical relationships much faster than classical systems, and their capacity for parallelism scales exponentially as more qubits are added.
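That exponential scaling can be seen in a toy classical simulation: each added qubit doubles the number of amplitudes needed to describe the joint state. A minimal NumPy sketch (a simulation for illustration, not real quantum hardware):

```python
import numpy as np

# A single qubit in equal superposition of |0> and |1>.
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Combining qubits takes a tensor (Kronecker) product, so the
# state vector doubles in length with every qubit added:
state = np.array([1.0])
for n in range(1, 6):
    state = np.kron(state, plus)
    print(n, state.size)  # state.size doubles: 2, 4, 8, 16, 32
```

Five qubits already require 32 amplitudes; fifty would require about 10^15, which is why even modest quantum systems can represent relationships that strain classical machines.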

Quantum machine learning takes classical data and encodes it into quantum states. A quantum computer can then reveal patterns in the data that classical systems find difficult to detect, after which a classical system takes over to interpret or apply the results.
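As a concrete illustration of this encode-then-compare idea (an illustrative sketch, not the method from the study), a feature can be angle-encoded into a single-qubit state, and the squared overlap between two encoded states acts as a "quantum kernel" similarity score that a classical model can consume:

```python
import numpy as np

def encode(x: float) -> np.ndarray:
    """Angle-encode a scalar feature as the single-qubit state
    cos(x/2)|0> + sin(x/2)|1> (one common encoding choice)."""
    return np.array([np.cos(x / 2.0), np.sin(x / 2.0)])

def quantum_kernel(x: float, y: float) -> float:
    """Similarity between two encoded states: the squared overlap
    |<phi(x)|phi(y)>|^2, which real hardware estimates by sampling."""
    return float((encode(x) @ encode(y)) ** 2)

# Identical inputs overlap fully; inputs pi apart are orthogonal.
print(quantum_kernel(0.3, 0.3))    # 1.0 (up to floating point)
print(quantum_kernel(0.0, np.pi))  # ~0.0
```

In practice the classical stage feeds such kernel values into an ordinary learner, which is the "takes over and interprets" step described above.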

Related: The ‘Quantum AI’ algorithm is already surpassing the fastest supercomputers, according to research

Inside the chip manufacturing process

Semiconductor manufacturing is a complex, multi-stage process that demands painstaking accuracy at every step. Even the smallest inconsistencies can cause a chip to fail.

It begins with engraving hundreds of microscopic layers onto a silicon wafer, a thin, circular slice of silicon that forms the foundation of the chip.

Deposition builds up thin films of material on the wafer. Photoresist coating applies a light-sensitive material that allows for accurate patterning, the process of creating the small, complex shapes that define the chip's circuits.

In lithography, light transfers these patterns onto the surface of the wafer. Etching removes selected areas of material, cutting out the circuit structure. Ion implantation adjusts the electrical properties of each layer by embedding charged particles. Finally, the chip is packaged, meaning it is encased and connected so it can be integrated into a device.

That’s where the principles of quantum computing come in. In this study, the researchers focused on modeling ohmic contact resistance, a particularly challenging step in chipmaking. This is a measure of how easily current flows between the metal and semiconductor layers of a chip; the lower it is, the faster and more energy-efficient the chip's performance.

This step occurs after the material has been layered and patterned onto the wafer, and it plays an important role in determining how well the finished chip works. But modeling it accurately has long been a problem.

Engineers typically rely on classical machine learning algorithms, which learn patterns from data and make predictions, for this type of calculation. These work well with large, clean datasets, but semiconductor experiments often produce small, noisy datasets with nonlinear patterns. To address this, the researchers turned to quantum machine learning.

A new kind of algorithm

The team used data from 159 experimental samples of gallium nitride high-electron-mobility transistors (GaN HEMTs), a semiconductor device known for its speed and efficiency that is commonly used in electronics and 5G devices.

First, they identified which manufacturing variables had the greatest impact on ohmic contact resistance and narrowed the dataset to the most relevant inputs. They then developed a new machine learning architecture called the Quantum Kernel-Aligned Regressor (QKAR).

QKAR converts classical data into quantum states, allowing a quantum system to identify complex relationships within the data. Classical algorithms then learn from these insights to create predictive models that guide chip manufacturing. The team tested the model on five new samples that were not included in the training data.
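The study's implementation is not published here, but the general pattern it describes, a quantum-style kernel feeding a classical regressor, can be sketched in NumPy. Everything below is an illustrative assumption rather than the paper's architecture: the angle-encoding feature map, the synthetic stand-in data, and the ridge regularization are all choices made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_map(x: np.ndarray) -> np.ndarray:
    """Angle-encode each feature into a qubit state and take the tensor
    product (one qubit per feature). An assumed encoding, not the paper's."""
    state = np.array([1.0])
    for angle in x:
        qubit = np.array([np.cos(angle / 2), np.sin(angle / 2)])
        state = np.kron(state, qubit)
    return state

def kernel_matrix(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Quantum-style kernel: squared overlap between encoded states."""
    FA = np.array([feature_map(a) for a in A])
    FB = np.array([feature_map(b) for b in B])
    return (FA @ FB.T) ** 2

# Tiny synthetic stand-in for process variables -> contact resistance.
X_train = rng.uniform(0, np.pi, size=(30, 3))
y_train = np.sin(X_train).sum(axis=1) + 0.05 * rng.normal(size=30)

# Classical stage: kernel ridge regression on the quantum kernel.
K = kernel_matrix(X_train, X_train)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(K)), y_train)

# Predict on held-out samples, mirroring the paper's train/test split idea.
X_test = rng.uniform(0, np.pi, size=(5, 3))
y_pred = kernel_matrix(X_test, X_train) @ alpha
y_true = np.sin(X_test).sum(axis=1)
print(np.abs(y_pred - y_true).mean())  # mean absolute error on held-out data
```

The split of labor matches the description above: the "quantum" part supplies a similarity matrix between encoded states, and a purely classical solver turns that matrix into a predictive model.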

The new model was tested on these samples against seven leading classical models, including deep learning and gradient-boosting methods, and outperformed all of them. QKAR achieved significantly better results than the traditional models (0.338 ohms per millimeter), though some comparative figures were not included in the study.

Importantly, the algorithm is designed to be compatible with real quantum hardware, meaning it could be deployed as quantum machines become more reliable.

“These findings demonstrate the possibility of [quantum machine learning],” the scientists wrote in the study, adding that the method could be quickly applied to real-world chip production, especially as quantum hardware continues to evolve.

