
Gauging the right amount of government regulation of AI

Government AI regulation is needed, AI experts and policymakers concur. But there's no clear consensus on how much regulation by the government is necessary.

WASHINGTON, D.C. -- As organizations begin moving AI technologies out of research and testing and into deployment, consumers, technologists, policymakers and businesses likewise have started to understand just how much AI is changing the world and that some kind of government regulation of AI is necessary.

Already, AI is dramatically boosting productivity, helping connect people in new ways and improving healthcare. When used wrongly or carelessly, AI can also do the opposite: it can cut jobs and even has the potential to kill.

A powerful tool

Like any powerful force, AI requires rules and regulations for its development and use to prevent unnecessary harm, according to many in the scientific community. Just how much regulation is needed, especially government regulation of AI, remains open to debate.

Most AI experts and policymakers agree that at least a simple framework of regulatory policies is needed soon, as computing power increases steadily, AI and data science startups pop up almost daily, and the amount of data organizations collect on people grows exponentially.

"We're dealing with something that has great possibilities, as well as serious [implications]," said Michael Dukakis, former governor of Massachusetts, during a panel discussion at the AI World Government conference here.

Governments leading the way

Many national governments have already put in place guidelines, if sometimes vague ones, about how data should and shouldn't be collected and used.


Some regulatory rules also govern whether AI should be explainable; today, many AI algorithms run in a black box, with their inner workings considered proprietary technology and sealed off from the public. The U.S. recently updated its guidelines on data and AI, and Europe recently marked the one-year anniversary of GDPR.

Many private organizations have moved to set internal guidelines and regulations for AI and have made such rules public in the hope that other companies will adopt or adapt them. The sheer number of guidelines that various private groups have established indicates the wide array of viewpoints about private and government regulation of AI.

"Government has to be involved," Dukakis said, advocating for government regulation of AI.

"The United States has to play a major, constructive role in bringing the international community together," he said. Countries around the world must come together for meaningful debates and discussions, eventually leading to potential international government regulation of AI, he said.

Or governments should step back

Bob Gourley, CTO and co-founder of consulting firm OODA LLC, agreed that governments should be involved but said their power and scope should be limited.

"Let's move faster with the technology. Let's be ready for job displacement. It's a real concern, but not an instantaneous concern," Gourley said during the panel discussion.

Regulations, he argued, would slow technological growth, although he noted AI should not be deployed without being adequately tested and without adhering to a security framework.

During other panel discussions at the conference, several speakers argued that governments should take their lead from the private sector.

AI World Government panel with former Massachusetts governor Michael Dukakis.

Organizations should focus on creating transparent and explainable AI models first before governments concentrate on regulation, said Michael Nelson, a former professor at Georgetown University.

Lack of explainable or transparent AI has long been a problem, with consumers and organizations arguing that AI providers need to do more to make the inner workings of algorithms easier to see and understand.

Nelson also argued that too much government regulation of AI could stifle competition, which, he said, is a core part of innovation.

Lord Tim Clement-Jones, former chairman of the United Kingdom's House of Lords Select Committee on Artificial Intelligence, agreed that regulation should be minimized, but noted that it can be positive.

Governments, he said, should start working now on AI guidelines and regulations.

Guidelines like the GDPR have been effective, he said, and have laid the foundation for more focused government regulation of AI.
