NIST unveils new open source platform for AI safety assessments

The freely downloadable tool, called Dioptra, is designed to help artificial intelligence developers understand certain data risks unique to AI models, and to help them "mitigate those risks while supporting innovation," says NIST's director.

Nearly a year after the Biden Administration issued its executive order on Safe, Secure, and Trustworthy Development of AI, the National Institute of Standards and Technology has released a new open source tool to help test the safety and security of AI and machine learning models.

Read on healthcareitnews.com




© 2024 Netspective Foundation, Inc. All Rights Reserved.
