
Regulation of AI – is it stifling innovation?

The EU's new AI law imposes requirements on companies developing AI solutions. The aim is to give EU citizens confidence in the technology. But what are the rules, and what do they mean for companies working on new, innovative AI systems?

These are exciting times. The development of AI has been compared to the leap forward we saw during the Industrial Revolution.  

"It has moved up at least one gear with the realisation that AI can create so much value. Companies are investing huge amounts in developing the underlying technology," says Sverker Janson, senior researcher and head of the Centre for Applied AI at RISE.  

Value-added AI creates willingness to invest 

He sees two main trends in artificial intelligence today. One is where AI creates so much value that companies are willing to invest heavily.

"Good examples of this are the pharmaceutical and telecoms industries. There, the use of AI is seen as absolutely critical to the core business.

The second is where AI tools are cheap and readily available, such as the various language models or other generative AI tools.

"A language model such as ChatGPT can be used interactively as an assistant, or in a more specialised way, such as in different types of customer service. All major companies and organisations are exploring such solutions."

But with all the possibilities, there are also concerns that the technology could be used by forces that want to destabilise. The World Economic Forum's Global Risk Report 2024 lists AI-generated misinformation and disinformation as one of the biggest risks in both the short and long term.

This is one of the reasons why the EU has adopted the AI Act, a regulation that aims to reduce the risks of AI tools by regulating the development of AI systems. The AI Act entered into force in August 2024 and will be phased in over the next few years.

"The AI Act introduces a definition of what constitutes an AI system for legal purposes. It also requires certain types of AI products to be CE marked," says Susanne Stenberg, senior researcher and legal expert at RISE.

Regulation introduced with citizens in mind 

The regulation was introduced out of concern for EU citizens.

"As citizens, we need to be able to trust AI as a technology. You don't know how your toaster works. But you trust the technology and buy it without knowing everything about the manufacturing process behind it. The EU wants to achieve the same thing with the AI Act: namely product safety," says Håkan Burden, senior researcher at RISE, who works with Susanne Stenberg on issues related to the EU's attempts to regulate digitalisation.

"The purpose of the AI law is to protect health, safety and fundamental rights. This means that not all AI tools are covered by the regulation.

Håkan Burden explains:

"If you have a system today that uses AI-based technology to guide people to the right aisle in your store, you are more or less unaffected.

Susanne Stenberg continues:

"But if it's an AI tool that decides in which order to send the police, ambulance or fire brigade and to which location, then it has to be CE marked."  

The AI Act does not replace other regulations, such as the GDPR, data protection rules, intellectual property rights and so on. It complements them and aims to clarify who is responsible if, for example, an AI tool has contributed to discrimination in a recruitment process.  

New regulations will increase the requirements and costs for companies involved in the development of such products. This raises the question of whether regulating AI will slow down innovation.  

"A new company developing a new innovative solution may not be able to spend half a million on organising certification," says Janson. 


Additional cost – not necessarily a disadvantage 

Additional costs can be a barrier. But neither Susanne Stenberg nor Håkan Burden sees this as a problem from a societal perspective.

"We have to demand that products meet a certain minimum standard, whether it's a saw blade or software," says Susanne Stenberg.

"The AI Act may mean that some small or medium-sized companies choose to develop a chatbot for a department store instead of developing an AI system for grading, because the requirements for the latter require you to have a quality management system in your own organisation," says Håkan Burden, and continues:

"On the other hand, if you're not prepared to do internal quality management work, maybe you shouldn't be grading students in a school situation."

The AI Act could make it more difficult for smaller companies to enter certain markets. To counter such a development, Håkan Burden cites several examples of how the EU supports innovative SMEs.

"There is a huge innovation programme within the EU where a lot of test and experimentation facilities (TEFs) are being set up, and RISE is active in most of them, so that SMEs can get access to the expertise they need to get CE marking on their products."  

Organisational capacity and skills to avoid bottlenecks 

"It requires maturity and investment from a number of different actors in society, and the AI Act shows that this need comes at a cost," says Susanne Stenberg.

"Companies need to build up organisational capacity, and public authorities need AI expertise. We also need someone who can issue certificates so that we can avoid the bottlenecks that can prevent us from getting the AI tools that industry needs."

"We also need research - and not just into generative AI. There are many different directions and a great need for applied research.

RISE has a role to play in all of this.  

"We now have a three-year window in which small and medium-sized companies can come to us and do research," says Håkan Burden.

Susanne Stenberg continues:

"RISE as a player can help in different ways. It could be a company that wants to develop a system and take it further in a research project, or it could be decision-makers in the public sector who need help in understanding an AI system that they are about to buy."

"It could also be about what kind of AI an organisation needs. Just because there's a hype around a language model at the moment doesn't mean it's the right one for your business or organisation."

Contact person

Susanne Stenberg

Senior Researcher / Legal Expert

+46 73 398 73 41


Contact person

Sverker Janson

Head of Unit

+46 70 544 33 54
