A highly advanced artificial intelligence model generated tens of thousands of lethal compounds that could be used in chemical weapons in just a few hours.
As part of their effort to find cures for the world’s deadliest diseases, a team of scientists at North Carolina’s Collaborations Pharmaceuticals developed an advanced AI algorithm to help them search for compounds that could be turned into new drugs.
The team, however, also decided to see how much damage their model could do if it fell into the wrong hands. So they “flipped a switch” in their algorithm and had the AI search for compounds lethal to humans instead.
After just six hours in this so-called “bad mode,” the model generated 40,000 new chemical-weapon compounds, including new versions of toxic nerve agents that can be lethal in tiny doses.
As the researchers noted, it was sobering to think that the AI was able to arrive at a compound as dangerous as the notorious VX chemical warfare agent in a matter of hours.
Speaking to The Verge, lead author Fabio Urbina revealed that the inspiration for the test came after an invite to the Convergence conference by the Swiss Federal Institute for Nuclear, Biological and Chemical Protection.
“The idea of the conference is to inform the community at large of new developments with tools that may have implications for the Chemical/Biological Weapons Convention,” Urbina said.
“We got this invite to talk about machine learning and how it can be misused in our space. It’s something we never really thought about before.
“But it was just very easy to realize that as we’re building these machine learning models to get better and better at predicting toxicity in order to avoid toxicity, all we have to do is sort of flip the switch around and say, ‘You know, instead of going away from toxicity, what if we do go toward toxicity?’”
As Urbina added, the researchers were surprised to discover compounds “more toxic than VX” – one of the deadliest chemicals known to humans – just hours after switching on the “bad mode.”
“For me, the concern was just how easy it was to do. A lot of the things we used are out there for free. You can go and download a toxicity dataset from anywhere. If you have somebody who knows how to code in Python and has some machine learning capabilities, then in probably a good weekend of work, they could build something like this generative model driven by toxic datasets,” Urbina warned.
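In the abstract, the “switch” Urbina describes amounts to changing the sign of one term in a scoring objective: instead of penalizing predicted toxicity, the search rewards it. The toy sketch below illustrates that idea only – the predictors here are random-number stand-ins, not the team’s actual models, data, or chemistry:

```python
import random

# Dummy stand-ins for trained ML predictors; candidates are just integers
# standing in for generated molecules. Nothing here is the team's code.
def predicted_bioactivity(candidate):
    random.seed(candidate)            # deterministic dummy score in [0, 1)
    return random.random()

def predicted_toxicity(candidate):
    random.seed(candidate + 10_000)   # independent dummy score in [0, 1)
    return random.random()

def best_candidate(candidates, toxicity_weight):
    """Rank candidates by: bioactivity + toxicity_weight * toxicity.

    toxicity_weight = -1.0 mimics normal drug discovery (avoid toxicity);
    toxicity_weight = +1.0 is the "flipped switch" (seek toxicity).
    """
    return max(
        candidates,
        key=lambda c: predicted_bioactivity(c)
        + toxicity_weight * predicted_toxicity(c),
    )

candidates = range(100)
safe = best_candidate(candidates, toxicity_weight=-1.0)
toxic = best_candidate(candidates, toxicity_weight=+1.0)
print("safe pick:", safe, "toxic pick:", toxic)
```

The point of the sketch is that the search loop itself is unchanged: a single sign on one weight decides whether the same machinery steers away from toxicity or toward it.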
“So that was the thing that got us really thinking about putting this paper out there; it was such a low barrier of entry for this type of misuse.”