For more than three years, an IEEE Standards Association working group has been refining a draft standard for procuring artificial intelligence and automated decision systems, IEEE 3119-2025. It is intended to help procurement teams identify and manage risks in high-risk domains. Such systems are used by government entities involved in education, health, employment, and many other public-sector areas. Last year the working group partnered with a European Union agency to evaluate the draft standard's components and to gather information about users' needs and their views on the standard's value.
At the time, the standard included five processes to help users develop their solicitations and to identify, mitigate, and monitor harms commonly associated with high-risk AI systems.
Those processes were problem definition, vendor evaluation, solution evaluation, contract negotiation, and contract monitoring.
The EU agency's feedback led the working group to reconsider the processes and the sequence of several activities. The final draft now includes an additional process: solicitation preparation, which comes right after the problem definition process. The working group believes the added process addresses the challenges organizations experience in preparing AI-specific solicitations, such as the need to add clear and robust data requirements and to incorporate questions about the maturity of vendor AI governance.
The EU agency also emphasized that it is essential to include solicitation preparation, which gives procurement teams additional opportunities to adapt their solicitations with technical requirements and questions regarding responsible AI system choices. Leaving room for adjustments is especially relevant when AI acquisitions occur within emerging and changing regulatory environments.
Gisele Waters
IEEE 3119's place in the standards ecosystem
Currently there are several internationally accepted standards for AI management, AI ethics, and general software acquisition. Those from the IEEE Standards Association and the International Organization for Standardization target AI design, use, and life-cycle management.
Until now, there has been no internationally accepted, consensus-based standard that focuses on the procurement of AI tools and offers operational guidance for responsibly purchasing high-risk AI systems that serve the public interest.
The IEEE 3119 standard addresses that gap. Unlike the AI management standard ISO 42001 and other certifications related to generic AI oversight and risk governance, IEEE's new standard offers a risk-based, operational approach to help government agencies adapt traditional procurement practices.
Governments have an important role to play in the responsible deployment of AI. However, market dynamics and unequal AI expertise between industry and government can be barriers that discourage success.
One of the standard's core goals is to better inform procurement leaders about what they are buying before they make high-risk AI purchases. IEEE 3119 defines high-risk AI systems as those that make, or are a substantial factor in making, consequential decisions that could have significant impacts on people, groups, or society. The definition is similar to the one used in Colorado's 2024 AI Act, the first U.S. state-level law to comprehensively address high-risk systems.
The standard's processes, however, complement ISO 42001 in several ways. The relationship between the two is illustrated below.
International standards, often characterized as soft law, are used to shape AI development and encourage international cooperation on its governance.
Hard laws for AI, or legally binding rules and obligations, are a work in progress around the world. In the United States, a patchwork of state laws governs different aspects of AI, and the approach to national AI regulation is fragmented, with different federal agencies implementing their own guidelines.
Europe has led the way by passing the European Union's AI Act, which began governing AI systems based on their risk levels when it went into effect last year.
But the world lacks hard laws for AI with a global scope.
The IEEE 3119-2025 standard does align with existing hard laws. Because of its focus on procurement, the standard supports the high-risk provisions outlined in Chapter III of the EU AI Act and in Colorado's AI Act. The standard also conforms to the proposed Texas HB 1709 legislation, which would mandate reporting on the use of AI systems by certain business entities and state agencies.
Because most AI systems used in the public sector are procured rather than built in-house, IEEE 3119 applies to commercial AI products and services that don't require substantial modifications or customizations.
The standard's target audience
The standard is intended for:
- Mid-level procurement professionals and interdisciplinary team members with a moderate level of AI governance and AI system knowledge.
- Public- and private-sector procurement professionals who serve as coordinators or buyers, or hold equivalent roles, within their entities.
- Non-procurement managers and supervisors who are either responsible for procurement or oversee staff who provide procurement functions.
- Professionals employed by governing entities involved with public education, utilities, transportation, and other publicly funded services, who either work on or manage procurement and are interested in adapting purchasing processes for AI tools.
- AI vendors seeking to understand new transparency and disclosure requirements for their high-risk commercial products and solutions.
Training program in the works
The IEEE Standards Association has partnered with the AI Procurement Lab to offer the IEEE Responsible AI Procurement Training program. The course covers how to apply the standard's core processes and how to adapt existing practices for the procurement of high-risk AI.
The standard includes more than 26 tools and rubrics across the six processes, and the training program explains how to use many of them. For example, the training includes instructions on how to conduct a risk-appetite analysis, how to apply the vendor evaluation scoring guide to analyze AI vendor claims, and how to create an AI procurement "risk register" tied to identified use-case risks and their potential mitigations. The training session is now available for purchase.
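To make the risk-register idea concrete, here is a minimal sketch of how a procurement team might track use-case risks and mitigations. This is an illustration only: the field names, the 1-to-5 severity and likelihood scales, and the severity-times-likelihood scoring are common risk-matrix conventions assumed for demonstration, not the actual rubrics defined in IEEE 3119.

```python
from dataclasses import dataclass, field

@dataclass
class RiskEntry:
    """One identified risk, tied to a specific AI use case (illustrative)."""
    use_case: str    # the AI use case the risk belongs to
    risk: str        # the identified risk
    severity: int    # assumed scale: 1 (low) to 5 (high)
    likelihood: int  # assumed scale: 1 (low) to 5 (high)
    mitigation: str  # planned mitigation

    @property
    def score(self) -> int:
        # Simple severity x likelihood scoring, a common risk-matrix convention
        return self.severity * self.likelihood

@dataclass
class RiskRegister:
    """Collects risk entries and orders them for review (illustrative)."""
    entries: list = field(default_factory=list)

    def add(self, entry: RiskEntry) -> None:
        self.entries.append(entry)

    def prioritized(self) -> list:
        # Highest-scoring risks first, so reviewers see them at the top
        return sorted(self.entries, key=lambda e: e.score, reverse=True)

# Hypothetical example entries for a public-sector acquisition
register = RiskRegister()
register.add(RiskEntry("benefits eligibility screening",
                       "biased denial rates across groups", 5, 3,
                       "require disaggregated accuracy reporting from vendor"))
register.add(RiskEntry("document triage",
                       "misrouted applications delay service", 2, 4,
                       "human review of low-confidence classifications"))

top = register.prioritized()[0]
print(f"Highest-priority risk: {top.risk} (score {top.score})")
```

A spreadsheet serves the same purpose in practice; the point is that each risk stays linked to its use case and its mitigation so the register can feed contract negotiation and monitoring.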
It's still early days for AI integration. Decision makers don't yet have much experience purchasing AI for high-risk domains or mitigating those risks. The IEEE 3119-2025 standard aims to help agencies build and strengthen their AI risk mitigation muscles.