Machine learning tagged posts

AI Reveals Unsuspected Math Underlying Search for Exoplanets

This infographic explains the light curve astronomers detect when viewing a microlensing event, and the signature of an exoplanet: an additional uptick in brightness when the exoplanet lenses the background star. (Image Credit: NASA / ESA / K. Sahu / STScI)
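The light curve described in the caption follows the standard single-lens (Paczynski) magnification formula, A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)), where u is the source-lens separation in Einstein radii. A minimal sketch of such a curve (not from the post; the parameter names and values are illustrative):

```python
import numpy as np

def magnification(u):
    """Paczynski point-lens magnification for impact parameter u
    (source-lens separation in Einstein radii)."""
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

def light_curve(t, t0, tE, u0):
    """Brightness amplification vs. time t for a single-lens event:
    t0 = time of closest approach, tE = Einstein crossing time,
    u0 = minimum impact parameter (all values illustrative)."""
    u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
    return magnification(u)

t = np.linspace(-30, 30, 7)              # days relative to the peak
A = light_curve(t, t0=0.0, tE=10.0, u0=0.1)
print(A)  # rises to a peak near t = 0, then falls back toward 1
```

An exoplanet orbiting the lens star would add a brief extra spike on top of this smooth curve, which is the signature the infographic illustrates.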

Artificial intelligence (AI) algorithms trained on real astronomical observations now outperform astronomers in sifting through massive amounts of data to find new exploding stars, identify new types of galaxies and detect the mergers of massive stars, accelerating the rate of new discovery in the world’s oldest science.

But machine learning, a form of AI, can reveal something deeper, University of California, Berkeley, astronomers found: unsuspected connections hidden in the complex mathematics arising from general relativity—...

Read More

Harnessing Noise in Optical Computing for AI

An illustration of the UW ECE-led research team’s integrated optical computing chip and “handwritten” numbers it generated. The chip contains an artificial neural network that can learn how to write like a human in its own distinct style. This optical computing system uses “noise” (stray photons from lasers and thermal background radiation) to augment its creative capabilities. The system is also approximately 10 times faster than comparable conventional digital computers and more energy efficient, helping to put AI and machine learning on a path toward environmental sustainability. Illustration by Changming Wu
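The role noise plays here can be illustrated classically: generative networks map random latent vectors to outputs, so randomness is precisely what produces variety in what they generate. A toy numerical sketch (purely illustrative; the fixed linear "decoder" stands in for the trained network, with physical noise playing the role of the latent randomness):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "decoder": a fixed random linear map from a 16-dim latent vector
# to an 8x8 "image". Stands in for a trained generative network.
W = rng.standard_normal((64, 16))

def generate(noise_scale=1.0):
    """Draw a latent noise vector and decode it into an 8x8 array."""
    z = noise_scale * rng.standard_normal(16)
    return (W @ z).reshape(8, 8)

a, b = generate(), generate()
print(np.allclose(a, b))                           # False: noise diversifies outputs
print(np.allclose(generate(0.0), generate(0.0)))   # True: no noise, no variety
```

With the noise turned off, every call collapses to the same output; with it on, each sample is distinct, which is the sense in which noise augments a generator's "creativity."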

A research team has developed an optical computing system for AI and machine learning that not only mitigates the noise inherent to optical computing but actually uses...

Read More

Machine Learning for Morphable Materials

New platform can program the transformation of 2D stretchable surfaces into specific 3D shapes. Flat materials that can morph into three-dimensional shapes have potential applications in architecture, medicine, robotics, space travel, and much more. But programming these shape changes requires complex and time-consuming computations.

Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a platform that uses machine learning to program the transformation of 2D stretchable surfaces into specific 3D shapes.
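The inverse-design idea can be sketched with a toy linear stand-in: learn a map from a desired 3D shape back to the 2D stretch pattern that produces it. This is purely illustrative (the forward operator, the data sizes, and the least-squares "model" are assumptions for the sketch, not the SEAS method):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical forward model: a fixed linear operator taking a flattened
# 2D pattern of local stretches (16 values) to out-of-plane heights at
# 25 sample points on the resulting 3D surface.
F = rng.standard_normal((25, 16))

def forward(stretch):
    return F @ stretch

# "Training data": random stretch patterns and the shapes they produce.
X = rng.standard_normal((200, 16))   # stretch patterns
Y = X @ F.T                          # resulting height fields

# Learn the inverse design map (shape -> stretch) by least squares,
# standing in for the machine-learning model described in the post.
M, *_ = np.linalg.lstsq(Y, X, rcond=None)

target = forward(rng.standard_normal(16))   # a realizable target shape
design = target @ M                         # predicted stretch pattern
print(np.allclose(forward(design), target, atol=1e-6))  # design reproduces the shape
```

The appeal of learning the inverse map is speed at design time: once trained, producing a stretch pattern for a new target shape is a single cheap evaluation rather than a fresh optimization.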

“While machine learning methods have been classically employed for image recognition and language processing, they have also recently emerged as powerful tools to solve mechanics problems,” said Katia Bertoldi, the Wil...

Read More

Machine Learning Models Quantum Devices

Quantum reservoir computing. B and F represent the input and output states, respectively, of a quantum system. E is an auxiliary system necessary to pass the sequence of input states B to the quantum reservoir S. S can then be read to emulate F without disrupting the system. ©2021 Tran et al.

A novel algorithm allows for efficient and accurate verification of quantum devices. Technologies that take advantage of novel quantum mechanical behaviors are likely to become commonplace in the near future. These may include devices that use quantum information as input and output data, which require careful verification due to inherent uncertainties. Verification is more challenging if the device is time-dependent, that is, when its output depends on past inputs...

Read More