artificial intelligence (AI) tagged posts

Teaching Photonic Chips to ‘Learn’

Chip used in the research
"Silicon Photonic Architecture for Training Deep Neural Networks with Direct Feedback Alignment," Optica

A multi-institution research team has developed an optical chip that can train machine learning hardware.

Machine learning applications have skyrocketed to $165 billion annually, according to a recent McKinsey report. But before a machine can perform intelligent tasks, such as recognizing the details of an image, it must be trained. Training modern artificial intelligence (AI) systems such as Tesla's Autopilot costs several million dollars in electric power consumption and requires supercomputer-like infrastructure. This surging AI "appetite" leaves an ever-widening gap between computer hardware and demand for AI...
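For readers curious about the training rule named in the paper's title, the snippet below is a minimal NumPy sketch of direct feedback alignment on a toy two-layer network: instead of backpropagating errors through the transpose of the forward weights, each hidden layer receives the output error through a fixed random feedback matrix. The layer sizes, data and learning rate here are made up for illustration; the published work implements the idea in silicon photonics, not in NumPy.

```python
# Minimal sketch of direct feedback alignment (DFA) on a toy two-layer network.
# Illustrative only: dimensions, data and learning rate are placeholders,
# not the photonic implementation described in the Optica paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions and data
n_in, n_hid, n_out, n_samples = 8, 16, 4, 256
X = rng.standard_normal((n_samples, n_in))
labels = rng.integers(0, n_out, n_samples)
Y = np.eye(n_out)[labels]                        # one-hot targets

# Forward weights (trained) and a fixed random feedback matrix (never trained)
W1 = rng.standard_normal((n_in, n_hid)) * 0.1
W2 = rng.standard_normal((n_hid, n_out)) * 0.1
B  = rng.standard_normal((n_out, n_hid)) * 0.1   # replaces W2.T used by backprop

lr = 0.05
for epoch in range(200):
    # Forward pass
    h = np.tanh(X @ W1)                          # hidden activations
    y = h @ W2                                   # linear readout
    e = y - Y                                    # output error

    # DFA step: project the output error through the fixed random matrix B
    # instead of through W2.T, so no symmetric weight transport is needed.
    dh = (e @ B) * (1.0 - h ** 2)                # tanh derivative

    W2 -= lr * h.T @ e / n_samples
    W1 -= lr * X.T @ dh / n_samples

print("final MSE:", float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)))
```

The appeal for hardware is that the feedback matrix B never changes and does not need to mirror the forward weights, which removes the symmetric "weight transport" that makes standard backpropagation awkward to realize physically.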

Read More

Researchers Develop New Strategies to Teach Computers to Learn Like Humans Do

Graphics of the generative-replay setup (top left panel) and the scheme for training an artificial neural network (ANN) with generative replay (top right panel), along with the normalized electrical-current accuracy for the conventional model (bottom left panel) and the brain-inspired replay (BIR) model (bottom right panel). Credit: SUTD

As demonstrated by breakthroughs in various fields of artificial intelligence (AI), such as image processing, smart health care, self-driving vehicles and smart cities, this is undoubtedly the golden period of deep learning. In the next decade or so, AI and computing systems will eventually be equipped with the ability to learn and think the way humans do: to process a continuous flow of information and interact with the real world.

However, current AI models suffer from a per...
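The figure caption above mentions training the network with generative replay, the brain-inspired idea of rehearsing synthetic samples of earlier tasks while learning new ones so that old knowledge is not overwritten. The sketch below is only a toy illustration of that loop: a per-class Gaussian stands in for the generative model and a softmax classifier stands in for the ANN, neither of which matches the SUTD hardware setup.

```python
# Toy sketch of generative replay for continual learning (illustrative only).
import numpy as np

rng = np.random.default_rng(1)

def make_task(n_classes, offset, n=200, dim=2):
    """Toy task: Gaussian blobs, class ids shifted by `offset`."""
    X, y = [], []
    for c in range(n_classes):
        X.append(rng.standard_normal((n, dim)) + 4.0 * rng.standard_normal(dim))
        y.append(np.full(n, offset + c))
    return np.vstack(X), np.concatenate(y)

class ReplayLearner:
    def __init__(self, dim, n_total_classes, lr=0.1):
        self.W = np.zeros((dim, n_total_classes))
        self.b = np.zeros(n_total_classes)
        self.lr = lr
        self.generators = {}          # class id -> (mean, cov) "generative model"

    def _sgd(self, X, y, epochs=100):
        for _ in range(epochs):
            logits = X @ self.W + self.b
            p = np.exp(logits - logits.max(1, keepdims=True))
            p /= p.sum(1, keepdims=True)
            p[np.arange(len(y)), y] -= 1.0        # softmax cross-entropy gradient
            self.W -= self.lr * X.T @ p / len(y)
            self.b -= self.lr * p.mean(0)

    def learn_task(self, X, y, replay_per_class=200):
        # Replay: sample synthetic data for previously learned classes
        # and mix it with the new task's data before training.
        Xs, ys = [X], [y]
        for c, (mu, cov) in self.generators.items():
            Xs.append(rng.multivariate_normal(mu, cov, replay_per_class))
            ys.append(np.full(replay_per_class, c))
        self._sgd(np.vstack(Xs), np.concatenate(ys))
        # Update the generative model with the classes just seen.
        for c in np.unique(y):
            Xc = X[y == c]
            self.generators[int(c)] = (Xc.mean(0), np.cov(Xc.T))

    def accuracy(self, X, y):
        return float(np.mean((X @ self.W + self.b).argmax(1) == y))

learner = ReplayLearner(dim=2, n_total_classes=4)
X1, y1 = make_task(2, offset=0)
X2, y2 = make_task(2, offset=2)
learner.learn_task(X1, y1)
learner.learn_task(X2, y2)            # earlier classes are rehearsed via replay
print("task 1 accuracy after task 2:", learner.accuracy(X1, y1))
```

Without the replayed samples, training on the second task alone would tend to overwrite the weights that encode the first task, which is the catastrophic-forgetting problem this line of work targets.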

Read More

Significant Energy Savings Using Neuromorphic Hardware

One of Intel’s Nahuku boards, each of which contains eight to 32 Intel Loihi neuromorphic chips. © Tim Herman/Intel Corporation

For the first time, TU Graz's Institute of Theoretical Computer Science and Intel Labs have demonstrated experimentally that a large neural network can process sequences such as sentences while consuming 4x to 16x less energy on neuromorphic hardware than on non-neuromorphic hardware. The new research is based on Intel Labs' Loihi neuromorphic research chip, which draws on insights from neuroscience to create chips that function similarly to neurons in the biological brain.
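The energy advantage comes from the event-driven way neuromorphic chips compute: neurons exchange sparse spikes and only do work when a spike arrives. The snippet below is a toy NumPy model of a leaky integrate-and-fire (LIF) neuron layer, the kind of unit such chips implement in silicon; it is purely illustrative and does not use Intel's Loihi software stack or reproduce the TU Graz network.

```python
# Toy leaky integrate-and-fire (LIF) neuron layer, illustrative NumPy model only.
import numpy as np

rng = np.random.default_rng(2)

n_in, n_neurons, T = 20, 10, 100     # input size, layer size, time steps
W = rng.standard_normal((n_in, n_neurons)) * 0.5

v = np.zeros(n_neurons)              # membrane potentials
decay, threshold = 0.9, 1.0
spike_counts = np.zeros(n_neurons)

for t in range(T):
    # Sparse binary input spikes; computation happens only where spikes occur,
    # which is where event-driven neuromorphic hardware saves energy.
    in_spikes = (rng.random(n_in) < 0.1).astype(float)
    v = decay * v + in_spikes @ W    # leak plus weighted input current
    out_spikes = v >= threshold      # fire when the threshold is crossed
    v[out_spikes] = 0.0              # reset fired neurons
    spike_counts += out_spikes

print("spikes per neuron over", T, "steps:", spike_counts)
```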

The research was funded by The Human Brain Project (HBP), one of the largest research projects in the world with more than 500 scientists and engineers across Europe studying the human bra...

Read More

Observation, Simulation, and AI Join Forces to Reveal a Clear Universe

Artist's visualization of this research: using AI-driven data analysis to peel back the noise and find the actual shape of the Universe. (Credit: The Institute of Statistical Mathematics)

Japanese astronomers have developed a new artificial intelligence (AI) technique to remove noise in astronomical data caused by random variations in galaxy shapes. After extensive training and testing on large mock datasets created by supercomputer simulations, they applied this new tool to actual data from Japan's Subaru Telescope and found that the mass distribution derived using this method is consistent with the currently accepted models of the Universe. This is a powerful new tool for analyzing big data from current and planned astronomy surveys.
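The workflow is essentially "train on simulations, apply to observations": a model learns to map noisy mock maps to their clean counterparts, and is then run on real survey data. The sketch below illustrates that workflow with a simple linear least-squares denoiser on tiny one-dimensional "maps"; it is a stand-in for the deep generative model the astronomers applied to Subaru Telescope lensing data, not a reconstruction of their method.

```python
# "Train on simulations, apply to observations" workflow, illustrative only:
# a linear least-squares denoiser on tiny 1D "maps" stands in for the real model.
import numpy as np

rng = np.random.default_rng(3)

n_pixels, n_mocks = 64, 500

def mock_signal():
    """Smooth random 'mass map': a sum of a few low-frequency sine waves."""
    x = np.linspace(0, 2 * np.pi, n_pixels)
    return sum(rng.standard_normal() * np.sin((k + 1) * x) for k in range(3))

# Build mock training data: clean simulated maps plus shape-noise-like noise.
clean = np.stack([mock_signal() for _ in range(n_mocks)])
noisy = clean + 0.5 * rng.standard_normal(clean.shape)

# Fit a linear denoiser D (noisy -> clean) by least squares on the mocks.
D, *_ = np.linalg.lstsq(noisy, clean, rcond=None)

# "Observation": a new noisy map the denoiser has never seen.
true_map = mock_signal()
observed = true_map + 0.5 * rng.standard_normal(n_pixels)
denoised = observed @ D

def rms(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

print("RMS error before denoising:", rms(observed, true_map))
print("RMS error after denoising: ", rms(denoised, true_map))
```

Because the denoiser is fitted only on simulated data, checking that its output on real data matches accepted models of the Universe, as the team did, is the key validation step.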

Wide-area survey data can be used to study...

Read More