NIST unveils new open source platform for AI safety assessments
The freely downloadable tool, called Dioptra, is designed to help artificial intelligence developers understand some of the unique data risks posed by AI models, and to help them "mitigate those risks while supporting innovation," says NIST's director.
Nearly a year after the Biden Administration issued its executive order on the Safe, Secure, and Trustworthy Development of AI, the National Institute of Standards and Technology has released a new open source tool to help test the safety and security of AI and machine learning models.
Read on healthcareitnews.com