Microchips power almost every modern device, from mobile phones and laptops to refrigerators. But behind the scenes, making them is a complicated process, and researchers say they have found a way to harness quantum computing to make it simpler.
Australian scientists have developed a quantum machine learning technique, a blend of artificial intelligence (AI) and quantum computing principles, that could change the way microchips are made.
They outlined their findings in a new study published June 23 in the journal Advanced Science. In it, the researchers demonstrated for the first time how quantum machine learning algorithms can dramatically improve the challenging process of modeling electrical resistance in a chip.
Quantum machine learning is a hybrid approach that combines classical data with quantum computing methods. In classical computing, data is stored in bits encoded as 0 or 1. Quantum computers instead use qubits, which, thanks to principles like superposition and entanglement, can exist in multiple states at the same time.
This allows quantum computing systems to handle complex mathematical relationships much faster than classical systems, with their processing power scaling exponentially as more qubits are added.
Quantum machine learning takes classical data and encodes it into quantum states. The quantum computer can then reveal patterns in the data that classical systems find difficult to detect, before a classical system takes over to interpret or apply the results.
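To see that scaling concretely, here is a minimal Python sketch (not from the study) that simulates a superposition state with NumPy and shows how the number of amplitudes needed to describe the system grows as 2^n with the number of qubits:

```python
import numpy as np

# A single qubit state is a length-2 complex vector; |0> and |1> are the basis states.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Superposition: an equal mix of |0> and |1> (the state a Hadamard gate produces).
plus = (ket0 + ket1) / np.sqrt(2)

# n qubits live in a 2**n-dimensional space, built with tensor (Kronecker) products.
n = 10
state = plus
for _ in range(n - 1):
    state = np.kron(state, plus)

print(len(state))            # 1024 amplitudes for just 10 qubits
print(np.abs(state[0]) ** 2) # probability of measuring all zeros: (1/2)**10
```

Ten qubits already take 1,024 numbers to describe classically, which is part of what gives quantum hardware its edge on certain problems.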
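As an illustration of that encode-then-compare idea, the toy Python sketch below angle-encodes two made-up process recipes into simulated quantum states and uses their overlap (fidelity) as a similarity score. The encoding, the numbers and the recipe interpretation are assumptions for illustration, not the circuit used in the paper:

```python
import numpy as np

def angle_encode(features):
    """Encode classical features into a simulated quantum state via angle encoding:
    each feature sets the rotation of one qubit, and the qubits are combined
    with a tensor product. A toy simulation, not the study's circuit."""
    state = np.array([1.0], dtype=complex)
    for x in features:
        qubit = np.array([np.cos(x / 2), np.sin(x / 2)], dtype=complex)  # RY rotation of |0>
        state = np.kron(state, qubit)
    return state

# Two hypothetical process recipes, each described by three normalized parameters.
recipe_a = angle_encode([0.3, 1.1, 0.7])
recipe_b = angle_encode([0.4, 1.0, 0.8])

# The fidelity |<a|b>|^2 acts as a similarity score -- the quantity a quantum
# kernel method hands back to a classical learner.
similarity = np.abs(np.vdot(recipe_a, recipe_b)) ** 2
print(similarity)
```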
Inside the chip manufacturing process
Semiconductor manufacturing is a complex, multi-stage process that demands painstaking precision, with each step performed exactly; even the smallest inconsistency can cause the chip to fail.
The process begins by building up hundreds of microscopic layers on a silicon wafer, a thin circular slice of silicon that forms the foundation of the chip.
Deposition lays a thin film of material onto the wafer. A photoresist coating then applies a light-sensitive material used to create precise patterns: the tiny, complex shapes that define the chip's circuits.
In lithography, light transfers these patterns onto the wafer's surface. Etching removes selected areas of material to carve out the circuit structure. Ion implantation adjusts the electrical properties of each layer by embedding charged particles. Finally, the chip is packaged, meaning it is enclosed and connected so it can be integrated into a device.
That's where the principles of quantum computing come in. In this study, the researchers focused on modeling ohmic contact resistance, a particularly challenging step in chipmaking. Ohmic contact resistance is a measure of how easily electrical current flows between the chip's metal and semiconductor layers; the lower it is, the faster and more energy-efficient the chip.
This step occurs after the material has been layered and patterned onto the wafer, and it plays an important role in determining how well the finished chip performs. But modeling it accurately has long been a problem.
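To see why the number matters, here is a back-of-the-envelope Python calculation with invented values (not figures from the study), using the convention that ohmic contact resistance for transistors like these is quoted in ohm-millimeters and divided by the contact width to get the resistance of a specific device:

```python
# Illustrative numbers only -- not values from the study.
contact_resistance_ohm_mm = 0.5   # hypothetical ohmic contact resistance
contact_width_mm = 0.1            # hypothetical device width (100 micrometers)
current_a = 0.05                  # hypothetical drive current, 50 mA

resistance_ohm = contact_resistance_ohm_mm / contact_width_mm   # 5 ohms for this contact
voltage_drop_v = current_a * resistance_ohm                     # voltage wasted at the contact
power_loss_w = current_a ** 2 * resistance_ohm                  # heat dissipated at the contact

print(voltage_drop_v, power_loss_w)  # 0.25 V and 12.5 mW lost; halve the resistance and both halve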
For this type of calculation, engineers typically rely on classical machine learning algorithms that learn patterns from data and make predictions. These work well with large, clean datasets, but semiconductor experiments often produce small, noisy datasets with nonlinear patterns. To address this, the researchers turned to quantum machine learning.
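The sketch below illustrates that limitation with scikit-learn and synthetic data standing in for fabrication measurements; the dataset, noise levels and model choice are assumptions for illustration only:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

def make_data(n_samples, noise, seed):
    """Synthetic stand-in for fabrication data: a nonlinear response plus measurement noise."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(0, 1, size=(n_samples, 4))
    y = np.sin(3 * X[:, 0]) * X[:, 1] + 0.3 * X[:, 2] ** 2 + rng.normal(0, noise, n_samples)
    return X, y

X_test, y_test = make_data(500, 0.0, seed=99)   # clean hold-out set for scoring

# The same classical model, trained once on plentiful clean data and once on a
# small, noisy dataset of the kind semiconductor experiments tend to produce.
for n, noise in [(5000, 0.01), (150, 0.10)]:
    X_train, y_train = make_data(n, noise, seed=n)
    model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{n} samples, noise {noise}: MAE = {mae:.3f}")
```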
A new kind of algorithm
The team used data from 159 experimental samples of gallium nitride high-electron-mobility transistors (GaN HEMTs), a type of semiconductor device known for its speed and efficiency that is commonly used in electronics and 5G equipment.
First, they identified which manufacturing variables had the greatest impact on ohmic contact resistance and narrowed the dataset down to the most relevant inputs. They then developed a new machine learning architecture called the Quantum Kernel-Aligned Regressor (QKAR).
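The study's exact selection method is not spelled out here, but a common classical way to rank manufacturing variables by influence looks roughly like the scikit-learn sketch below; the variable names and data are hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical process variables; the names are illustrative, not taken from the paper.
feature_names = ["anneal_temp", "metal_thickness", "recess_depth",
                 "doping_dose", "anneal_time", "pressure"]

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(159, len(feature_names)))
y = 2.0 * X[:, 0] + 0.8 * X[:, 2] + rng.normal(0, 0.1, 159)  # two variables dominate

# Rank variables by importance, then keep only the strongest ones as model inputs.
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
ranking = sorted(zip(forest.feature_importances_, feature_names), reverse=True)
for importance, name in ranking:
    print(f"{name}: {importance:.2f}")
```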
QKAR converts classical data into quantum states, allowing the quantum system to identify complex relationships within the data. Classical algorithms then learn from these insights to create predictive models that can guide chip manufacturing. The team tested the model on five new samples that were not included in the training data.
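The QKAR architecture itself is not reproduced here, but the general quantum-kernel-plus-classical-regressor pattern it follows can be sketched as below, with a simulated quantum kernel feeding a classical kernel ridge regressor; all data, encodings and hyperparameters are illustrative assumptions rather than the study's setup:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def encode(x):
    """Angle-encode a feature vector into a simulated quantum state (toy version)."""
    state = np.array([1.0], dtype=complex)
    for value in x:
        state = np.kron(state, np.array([np.cos(value / 2), np.sin(value / 2)], dtype=complex))
    return state

def quantum_kernel(A, B):
    """Kernel matrix of state fidelities |<a|b>|^2 between two sets of samples."""
    states_a = [encode(a) for a in A]
    states_b = [encode(b) for b in B]
    return np.array([[np.abs(np.vdot(sa, sb)) ** 2 for sb in states_b] for sa in states_a])

# Synthetic stand-in for process parameters and measured contact resistance.
rng = np.random.default_rng(2)
X_train = rng.uniform(0, np.pi, size=(154, 4))
y_train = 0.4 + 0.3 * np.sin(X_train[:, 0]) * X_train[:, 1] + rng.normal(0, 0.02, 154)
X_new = rng.uniform(0, np.pi, size=(5, 4))   # five unseen samples, as in the study

# Quantum part: build the kernel. Classical part: learn from it and predict.
model = KernelRidge(kernel="precomputed", alpha=1e-3)
model.fit(quantum_kernel(X_train, X_train), y_train)
predictions = model.predict(quantum_kernel(X_new, X_train))
print(predictions)
```

In practice the kernel would be evaluated on quantum hardware or a circuit simulator rather than with plain NumPy; the classical regression step stays the same.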
On those samples, the new model was benchmarked against seven leading classical models, including deep learning and gradient boosting methods, and it outperformed all of them. QKAR achieved significantly better results than the traditional models (0.338 ohms per millimeter), although some of the comparative figures were not included in the study.
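A benchmark of that kind, scoring several models on the same handful of held-out samples by error, looks roughly like the sketch below; the baselines and data here are stand-ins, not the seven models or the figures from the paper:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic data again; none of the study's datasets or results are reproduced here.
rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(159, 4))
y = 0.4 + np.sin(3 * X[:, 0]) * X[:, 1] + rng.normal(0, 0.05, 159)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=5, random_state=0)

# Score each baseline on the same five held-out samples, as a benchmark would.
baselines = {
    "linear": LinearRegression(),
    "random_forest": RandomForestRegressor(random_state=0),
    "gradient_boosting": GradientBoostingRegressor(random_state=0),
}
for name, model in baselines.items():
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name}: MAE = {mae:.3f}")
```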
Importantly, QKAR is designed to be compatible with real quantum hardware, meaning it could be deployed as quantum machines become more reliable.
"These findings demonstrate the possibility of [quantum machine learning]," the scientists wrote in the study, adding that the method could be quickly applied to real-world chip production, especially as quantum hardware continues to evolve.