Posts tagged: artificial intelligence (AI)

Graphene Key for Novel Hardware Security

A team of Penn State researchers has developed a new hardware security device that takes advantage of microstructure variations to generate secure keys.
Image: Jennifer McCann/Penn State

As more private data is stored and shared digitally, researchers are exploring new ways to protect data against attacks from bad actors. Current silicon technology exploits microscopic differences between computing components to create secure keys, but artificial intelligence (AI) techniques can be used to predict these keys and gain access to data. Now, Penn State researchers have designed a way to make these encryption keys harder to crack.
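The Penn State device is physical hardware, but the general idea of turning device-to-device variation into a secret key can be sketched in a few lines of Python. Everything below (the noise model, the threshold, the hashing step) is an illustrative assumption, not the Penn State design:

```python
# Illustrative sketch only: tiny, device-specific variations (here faked with
# random noise) are quantized into bits and hashed into a reproducible key.
# The noise model, threshold, and key-derivation step are hypothetical.
import hashlib
import numpy as np

def derive_key(raw_measurements, threshold):
    """Quantize analog device readings into bits, then hash them into a key."""
    bits = (np.asarray(raw_measurements) > threshold).astype(np.uint8)
    return hashlib.sha256(bits.tobytes()).hexdigest()

# Simulated per-device variation: each "device" yields a slightly different
# response pattern, so each one produces a different key.
rng = np.random.default_rng(seed=0)
device_response = rng.normal(loc=1.0, scale=0.1, size=256)  # stand-in for physical variation
key = derive_key(device_response, threshold=1.0)
print(key[:16], "...")
```

The weakness the article describes is that if the variation pattern is too regular, a machine-learning model can learn to predict the bits; the reported graphene approach aims to make that prediction much harder.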

Led by Saptarshi Das, assistant professor of engineering science and mechanics, the researchers used graphene — a layer of carbon one atom thick — to develop a no...

Read More

World’s Fastest Optical Neuromorphic Processor

Dr Xingyuan (Mike) Xu with the integrated optical microcomb chip, which forms the core part of the optical neuromorphic processor.

An international team of researchers led by Swinburne University of Technology has demonstrated the world’s fastest and most powerful optical neuromorphic processor for artificial intelligence (AI), which operates at more than 10 trillion operations per second (10 TeraOPS) and is capable of processing ultra-large-scale data.
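To put that throughput in perspective, here is a rough back-of-the-envelope calculation (not from the paper); the layer size and operation counting convention are arbitrary assumptions:

```python
# Illustrative arithmetic: what 10 trillion operations per second could mean
# for a single fully connected neural-network layer.
inputs, outputs = 1024, 1024
macs_per_layer = inputs * outputs       # multiply-accumulates for one forward pass
ops_per_layer = 2 * macs_per_layer      # count multiply and add separately
processor_ops_per_s = 10e12             # 10 TeraOPS, as reported

layers_per_second = processor_ops_per_s / ops_per_layer
print(f"{layers_per_second:,.0f} layer evaluations per second")  # roughly 4.8 million
```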

The result, published in the journal Nature, represents an enormous leap forward for neural networks and neuromorphic processing in general.

Artificial neural networks, a key form of AI, can ‘learn’ and perform complex operations with wide applications to computer vision, natural language processing, facial recognition, speech...

Read More

Early Bird Uses 10 Times Less Energy to Train Deep Neural Networks

Rice University’s Early Bird method for training deep neural networks finds key connectivity patterns early in training, reducing the computations and carbon footprint for the increasingly popular form of artificial intelligence known as deep learning. (Graphic courtesy of Y. Lin/Rice University)

Novel training method could shrink the carbon footprint for greener deep learning. Engineers have found a way to train deep neural networks for a fraction of the energy required today. Their Early Bird method finds key network connectivity patterns early in training, reducing the computation and carbon footprint required for deep learning.
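As a rough illustration of the general idea (not the exact Rice procedure), the sketch below tracks how a pruning mask changes between training epochs and stops once it stabilizes; the magnitude-based mask criterion, the thresholds, and the toy "training" updates are all assumptions made for demonstration:

```python
# Simplified sketch of the early-bird idea: once the set of connections that
# would survive pruning stops changing between epochs, the key connectivity
# pattern has emerged, and the rest of training can run on the smaller network.
import numpy as np

def pruning_mask(weights, keep_ratio=0.5):
    """Mark the largest-magnitude weights as 'kept'."""
    cutoff = np.quantile(np.abs(weights), 1.0 - keep_ratio)
    return np.abs(weights) >= cutoff

def mask_distance(mask_a, mask_b):
    """Fraction of positions where the two masks disagree."""
    return np.mean(mask_a != mask_b)

rng = np.random.default_rng(1)
weights = rng.normal(size=(64, 64))
prev_mask = pruning_mask(weights)
for epoch in range(1, 20):
    weights += 0.1 / epoch * rng.normal(size=weights.shape)  # stand-in for SGD updates
    mask = pruning_mask(weights)
    if mask_distance(mask, prev_mask) < 0.01:
        print(f"Mask stabilized at epoch {epoch}: prune now and finish training cheaply")
        break
    prev_mask = mask
```

The reported savings come from spending only the first few epochs on the full network and the remaining epochs on the much smaller pruned one.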

Early Bird is an energy-efficient method for training deep neural networks (DNNs), the form of artificial intelligence (AI) behind self-driving cars, inte...

Read More