Artificial Intelligence (AI) Tagged Posts

Cyber defense innovation could significantly boost 5G network security

Proposed FedLLMGuard Architecture. Credit: University of Portsmouth

A Ph.D. student working with the University of Portsmouth’s Artificial Intelligence and Data Center has created a framework for building tighter security into 5G wireless communications.

With its greater network capacity and ability to rapidly transmit huge amounts of information from one device to another, 5G is a critical component of intelligent systems and services—including those for health care and financial services.

However, the dynamic nature of 5G networks, the high volumes of data shared, and the ever-changing types of information transmitted mean that these networks are extremely vulnerable to cyber threats and face an increasing risk of attack.

Hadiseh Rezaei, a Ph.D...

Read More

Teaching Photonic Chips to ‘Learn’

Chip used in the research. Credit: Silicon Photonic Architecture for Training Deep Neural Networks with Direct Feedback Alignment, OPTICA

A multi-institution research team has developed an optical chip that can train machine learning hardware.

Machine learning applications have skyrocketed to $165 billion annually, according to a recent report from McKinsey. But before a machine can perform intelligent tasks such as recognizing the details of an image, it must be trained. Training modern-day artificial intelligence (AI) systems like Tesla’s autopilot costs several million dollars in electric power consumption and requires supercomputer-like infrastructure. This surging AI “appetite” leaves an ever-widening gap between computer hardware and demand for AI...
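
The paper’s title names direct feedback alignment (DFA), a training scheme that sidesteps layer-by-layer backpropagation: each hidden layer receives the output error through a fixed random feedback matrix, a structure that maps naturally onto optical hardware. A minimal NumPy sketch of the idea follows; the network sizes, toy data and learning rate are illustrative assumptions, not the paper’s setup.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out, lr = 8, 16, 4, 0.1

    W1 = rng.normal(0, 0.5, (n_in, n_hid))   # trained forward weights
    W2 = rng.normal(0, 0.5, (n_hid, n_out))
    B1 = rng.normal(0, 0.5, (n_out, n_hid))  # fixed random feedback matrix

    x = rng.normal(size=(32, n_in))          # toy input batch
    y = rng.normal(size=(32, n_out))         # toy regression targets

    for step in range(100):
        h = np.tanh(x @ W1)                  # forward pass
        y_hat = h @ W2
        e = y_hat - y                        # output error

        # DFA: project the error straight back through the fixed matrix B1
        # instead of through W2.T, so no symmetric weight transport is needed.
        dh = (e @ B1) * (1.0 - h ** 2)       # tanh derivative

        W2 -= lr * h.T @ e / len(x)
        W1 -= lr * x.T @ dh / len(x)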

Read More

Researchers Develop New Strategies to Teach Computers to Learn Like Humans Do

Graphics of the generative-replay setup (top left panel) and the scheme for training an artificial neural network (ANN) with generative replay (top right panel). Normalized electrical current accuracy for the conventional model (bottom left panel) and the brain-inspired replay (BIR) model (bottom right panel). Credit: SUTD

As demonstrated by breakthroughs in various fields of artificial intelligence (AI), such as image processing, smart health care, self-driving vehicles and smart cities, this is undoubtedly the golden period of deep learning. In the next decade or so, AI and computing systems will be equipped with the ability to learn and think the way humans do—to process a continuous flow of information and interact with the real world.

However, current AI models suffer from a per...
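
The figure caption above refers to training an ANN with generative replay: while learning a new task, the network also trains on samples drawn from a generative model of earlier tasks, which counters forgetting. A minimal runnable sketch of the idea, with a toy logistic-regression model and a per-feature Gaussian “generator” standing in for SUTD’s actual setup:

    import numpy as np

    rng = np.random.default_rng(1)

    class GaussianGenerator:
        """Per-feature Gaussian fit to old-task inputs; replays pseudo-samples."""
        def fit(self, x):
            self.mu, self.sigma = x.mean(0), x.std(0) + 1e-6
        def sample(self, n):
            return rng.normal(self.mu, self.sigma, size=(n, len(self.mu)))

    def train(w, x, y, lr=0.1, steps=200):
        for _ in range(steps):
            p = 1 / (1 + np.exp(-(x @ w)))   # logistic regression
            w -= lr * x.T @ (p - y) / len(x)
        return w

    # Two toy binary tasks learned in sequence.
    xa = rng.normal([2, 0], 1, (200, 2)); ya = (xa[:, 0] > 2).astype(float)
    xb = rng.normal([-2, 0], 1, (200, 2)); yb = (xb[:, 1] > 0).astype(float)

    w = train(np.zeros(2), xa, ya)           # learn task A
    gen = GaussianGenerator(); gen.fit(xa)   # model task A's inputs

    # Generative replay: mix real task-B data with replayed pseudo-A data,
    # labeled by the model's own current predictions (a distillation step),
    # so learning B does not erase what was learned on A.
    replay_x = gen.sample(len(xb))
    replay_y = (1 / (1 + np.exp(-(replay_x @ w))) > 0.5).astype(float)
    w = train(w, np.concatenate([xb, replay_x]),
              np.concatenate([yb, replay_y]))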

Read More

Significant Energy Savings Using Neuromorphic Hardware

One of Intel’s Nahuku boards, each of which contains eight to 32 Intel Loihi neuromorphic chips. © Tim Herman/Intel Corporation

For the first time, TU Graz’s Institute of Theoretical Computer Science and Intel Labs have demonstrated experimentally that a large neural network running on neuromorphic hardware can process sequences such as sentences while consuming 4 to 16 times less energy than on non-neuromorphic hardware. The new research is based on Intel Labs’ Loihi neuromorphic research chip, which draws on insights from neuroscience to create chips that function similarly to the biological brain.
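
Chips like Loihi compute with spiking neurons: a unit integrates weighted input, leaks toward its resting state, and performs work only when it emits a discrete spike, which is where the energy savings of event-driven hardware come from. A minimal sketch of leaky integrate-and-fire (LIF) dynamics; all constants are illustrative, not Loihi’s.

    import numpy as np

    rng = np.random.default_rng(2)

    n_steps, decay, threshold = 100, 0.9, 1.0
    input_current = rng.uniform(0, 0.3, n_steps)  # toy input drive

    v = 0.0          # membrane potential
    spikes = []

    for t in range(n_steps):
        v = decay * v + input_current[t]     # leak toward rest, integrate input
        if v >= threshold:                   # threshold crossing: emit a spike...
            spikes.append(t)
            v = 0.0                          # ...and reset the membrane

    print(f"{len(spikes)} spikes in {n_steps} steps; "
          "activity is sparse and event-driven")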

The research was funded by the Human Brain Project (HBP), one of the largest research projects in the world, with more than 500 scientists and engineers across Europe studying the human bra...

Read More