Joakim Eriksson
Head of Unit
24 August 2023
How can we get AI models to learn locally? That is a research question the Centre for Applied AI at RISE wants to answer in a multi-year research project. The goal is more efficient, safer and more sustainable AI solutions for the smart, small devices of the future, found in every home.
The number of connected smart devices in the world (IoT devices; the Internet of Things) keeps growing, and they already outnumber the world's population. They range from small devices, such as smart cameras and sensors, to larger ones such as laptops, all of which need to be connected. In many cases artificial intelligence is involved, and where the AI models should run varies: locally on the device or in the cloud. The Centre for Applied AI at RISE conducts research in this area.
How does the connection work?
There are several types of networks used for communication between connected devices; LoRaWAN and 5G are two examples. LoRaWAN suits cases that need very long range, such as rural areas or a city-wide network, but only modest data rates. 5G suits shorter distances and higher data rates, such as in the car and at home. So far, a large proportion of IoT devices have used special proprietary protocols, but in the future we need solutions that work for all types of devices. The aim is also to use the standardised Internet protocols that already carry communication to and from the cloud locally, on small devices. This would avoid breaks in the communication chain and provide security all the way from the cloud down to the smallest device.
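As a rough illustration of what "standard Internet protocols all the way to the device" can mean in practice, the sketch below sends a single sensor reading as a small UDP datagram straight to a cloud endpoint. The endpoint address, the port and the JSON payload are illustrative assumptions, not a RISE specification.

import json
import socket

# Hypothetical cloud collector; port 5683 is the default CoAP (UDP) port.
CLOUD_ENDPOINT = ("iot.example.org", 5683)

# A single reading, encoded compactly enough for a constrained link.
reading = {"device": "sensor-42", "temp_c": 21.3}
payload = json.dumps(reading).encode("utf-8")

# Plain UDP/IP, the same protocol family used in the cloud, with no
# proprietary gateway translating the traffic along the way.
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.sendto(payload, CLOUD_ENDPOINT)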
How does AI work on small gadgets?
The microprocessors used in IoT devices have been severely resource-constrained, but development is progressing rapidly: for the same price you now get tens of times more memory and computing power than ten years ago. Simple AI accelerators are also starting to be integrated into the chips of IoT devices, making it possible to run more advanced AI functions such as image recognition and sound analysis directly on the device. With AI capabilities on the device, information can be handled without sending it to the cloud. This reduces communication and energy consumption, and can also eliminate legal issues that arise when collected data is transmitted and stored centrally.
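As a sketch of what local inference can look like, the following runs a pre-trained model directly on the device using the lightweight TensorFlow Lite interpreter and only acts when the model is confident. The model file name, input shape and confidence threshold are assumptions for illustration.

import numpy as np
from tflite_runtime.interpreter import Interpreter  # small runtime for constrained devices

# keyword_model.tflite is a hypothetical, already-converted sound-analysis model.
interpreter = Interpreter(model_path="keyword_model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Stand-in for audio features captured locally on the device.
features = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], features)
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])

# Only report an event upstream when the model is confident enough,
# instead of streaming raw audio to the cloud.
if scores.max() > 0.8:
    print("Detected sound class:", int(scores.argmax()))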
- "We can take cameras with built-in AI models as an example," says Joakim Eriksson, an expert in IoT and research leader in AI platforms at the Centre for Applied AI at RISE. "If you boost the AI capacity and image processing locally in the camera's microprocessor, you don't need to send as much image material to the cloud. Using AI, you could 'teach' the camera to send only the relevant images," he explains.
Optimisation benefits efficiency, regulatory requirements and cybersecurity
RISE is therefore researching local adaptation in IoT devices, and how AI models can be trained in the cloud more efficiently, with less computational capacity. By trimming the models so they do not need to know everything, they are optimised for their task and require less memory and computing capacity. To be concrete: a coffee maker does not need to know the Swedish kings. The researchers simply want each individual device to become a specialist that only handles relevant data.
There are several ways to shrink AI models. One is to reduce the resolution of the so-called weights (known as quantisation). Another is to remove entire layers or connections that are unnecessary for the task (known as pruning). In this way, smart devices become smarter in an efficient and resource-saving way, which is desirable in all types of sensors where it is useful to process data locally.
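A toy sketch of the two techniques, in plain NumPy so the mechanics are visible; real toolchains do this with calibration and fine-tuning, and the layer size and pruning ratio here are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=(128, 128)).astype(np.float32)  # one layer's weights

# Quantisation: store the weights as 8-bit integers plus a single scale factor,
# roughly a quarter of the memory of 32-bit floats.
scale = np.abs(weights).max() / 127.0
w_int8 = np.round(weights / scale).astype(np.int8)
w_dequantised = w_int8.astype(np.float32) * scale  # what inference actually uses

# Pruning: zero out the connections that contribute least to the output
# (here, the 70 % smallest weights; the ratio is an arbitrary example).
threshold = np.quantile(np.abs(weights), 0.7)
w_pruned = np.where(np.abs(weights) > threshold, weights, 0.0)

print("mean quantisation error:", float(np.abs(weights - w_dequantised).mean()))
print("weights kept after pruning:", int((w_pruned != 0).sum()), "of", weights.size)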
- "A sensor that measures vibrations on a bridge collects enormous amounts of data that have to be sent away for analysis. Doing the rough analysis locally and only sending the necessary data would be more resource efficient," Joakim explains.
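A hedged sketch of that idea: do the rough analysis on the sensor and forward only a small summary when the vibration level is worth a closer look. The sample rate, alarm threshold and send function are illustrative assumptions.

import numpy as np

SAMPLE_RATE_HZ = 1000        # assumed accelerometer sample rate
RMS_ALARM_LEVEL = 0.5        # hypothetical level above which data is forwarded

def summarise(window: np.ndarray) -> dict:
    # Reduce one window of raw samples to a few numbers.
    spectrum = np.abs(np.fft.rfft(window))
    peak_bin = int(spectrum[1:].argmax()) + 1  # skip the DC component
    return {
        "rms": float(np.sqrt(np.mean(window ** 2))),
        "peak_hz": peak_bin * SAMPLE_RATE_HZ / len(window),
    }

def process(window: np.ndarray, send) -> None:
    summary = summarise(window)
    # Instead of streaming every raw sample, send a tiny summary,
    # and only when the vibration actually looks interesting.
    if summary["rms"] > RMS_ALARM_LEVEL:
        send(summary)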
Another aspect is cyber security. When data is processed locally and not sent to a third party or to another country, the risk of security breaches is reduced.
Applied research
From autumn 2023, the Centre for Applied AI at RISE will have a PhD student in place who will continue the research track on how AI models can learn locally, with a focus on application areas in industry: the possibilities in robotic lawnmowers, smart cameras, door sensors and more.
Do you see an application area and want to collaborate with RISE?
Please contact us!