
Medtech Expert Says EU’s AI Act Could Face An Uphill Political Battle

Cincinnati, OH – If the EU is unable to pass its hotly anticipated artificial intelligence/machine learning (AI/ML) law by year’s end, the legislation may face an uphill battle due to conflicting political interests, Brandon O’Shea said at the AI Summit held by the AFDO/RAPS Healthcare Products Collaborative.

O’Shea, senior regulatory affairs manager at GE Healthcare, noted that a lot has happened in the past year regarding AI/ML regulation in the EU, US and China. “‘Noisy’ is the way I would describe it,” he said.

The confusion stems from the fact that many different people, with many different views, are working within each of those regulatory regimes to develop AI/ML regulations that do not just cover the medtech industry but also touch on concerns such as labor, education, gaming, surveillance, and even how the technology will affect job security. This, in turn, he added, is drawing many more policymaking stakeholders from outside the medtech industry into the process.

O’Shea said the most active and lively AI/ML regulatory regime right now is in Europe, where the European Commission (EC), the Council of the European Union, and the European Parliament are in trilogue discussions to create a comprehensive legal framework under the AI Act, first proposed in early 2021. The proposed law uses a tiered, risk-based approach to regulating AI/ML products across industries.

Despite significant progress, O’Shea noted that the legislation stalled on 10 November when EU parliamentarians decided to end negotiations on the AI Act prematurely, leaving the framework’s future in question just as the EU enters an election year.

“It is unclear at this moment whether the framework we’ve been looking at the past several years will actually come through,” he said.

O’Shea warned that as the EU goes into election season this spring, different political interests may want to insert their own ideas into the proposed legislation – changes that could be the proposal’s undoing. He added that if the legislation is renegotiated, it could come out better or worse, but ultimately he would prefer to keep the legislation he is already familiar with.

“That is Europe. It is noisy, and it is unknown,” said O’Shea. “They are debating that, and we’ll see how that shakes out by next year.”

Despite the uncertainty about the AI Act, he noted that notified bodies are already basing decisions that shape the European medtech sector on the legislation.

By comparison, O’Shea noted that the US is in a very different political state. He said that AI/ML policy has been tough for the medtech industry in the US, even though there has been some progress due to provisions in the 2022 Food and Drug Omnibus Reform Act (FDORA) addressing topics such as predetermined change control plans (PCCPs).

“While there are many, many different considerations and voices and topics going through the policy space right now, the [US Food and Drug Administration (FDA)] has a job to do, and they want to show that [they are] the right agency to do this job,” O’Shea said.

He also talked about how China’s National Medical Products Administration (NMPA) has been regulating AI/ML medical devices and noted the agency is extremely fast to put out new guidances.

“When they build a guidance, it’s like building a giant skyscraper overnight,” he said. “They build in isolated spaces, and they build it all the way to the top, and it’s fully furnished.”

O’Shea noted that NMPA isn’t slowing down; in July alone, the agency published four new AI/ML software guidances. Unlike the EU, where policymakers are trying to create a level playing field for everyone in the AI/ML space, he said China has been addressing specific issues related to the technology and publishing new guidelines so quickly that it is hard for medtech companies to figure out what they need to do to comply.

“You have to pay very, very close attention to what product you are working with and working with the region to do the best you can to get through this efficient process and convince them that you’re meeting the needs of this framework that they’ve outlined or convince them… that you do not have AI in your product,” said O’Shea.

Pat Baird, regulatory head of global software standards at Philips, told Focus that the FDA is probably ahead of Europe and China in applying AI/ML regulations. He is concerned about the volatility surrounding where the AI Act stands in Europe and thinks the regulatory requirements set by NMPA are likely too high.

“Of course, I’m biased because this is what I’m used to and grew up with, but I do [think FDA is ahead] because PCCP is so innovative in concept,” said Baird. Unlike China’s approach to regulating AI/ML, he noted that the FDA asks stakeholders to think ahead about the implications of AI/ML products and make a long-term plan.

Source: RAPS
