18 Countries Unite for AI Security Standards

Eighteen countries, including the United States, United Kingdom, Australia, Canada, France, Germany, Israel, Italy, Japan, New Zealand, Nigeria, Norway, South Korea, and Singapore, have jointly published a set of guidelines aimed at bolstering the security of AI models. Released on Nov. 26, the 20-page document urges AI companies to prioritize cybersecurity and adopt "secure by design" practices. The initiative, championed by U.S. Secretary of Homeland Security Alejandro Mayorkas, underscores the critical role of cybersecurity in the rapidly evolving field of AI.

The document offers broad recommendations, such as maintaining strict control over AI infrastructure and continuously monitoring models for tampering, and it stresses the importance of cybersecurity training for staff. However, it does not address more contentious issues in AI, such as the regulation of image-generating models and deepfakes, or the data-collection practices used to train models, which have led to copyright infringement lawsuits against several AI firms.

Mayorkas described this as a pivotal moment in AI development, calling AI a profoundly impactful technology for which cybersecurity is essential to ensuring safety and reliability.

These guidelines are part of a broader trend of governmental engagement in AI regulation. Earlier in the month, the United Kingdom hosted an AI Safety Summit focused on agreements for AI development. Meanwhile, the European Union is finalizing its AI Act to regulate the sector, and U.S. President Joe Biden issued an executive order in October establishing AI safety and security standards. Both the EU and U.S. initiatives have faced resistance from the AI industry over concerns that they could stifle innovation.

Major AI firms like OpenAI, Microsoft, Google, Anthropic, and Scale AI have also played a role in shaping these new security guidelines.
