It’s good that Washington legislators are drafting laws to address harm caused by artificial intelligence systems.
But more are needed to show policy leadership befitting a state playing a major role in developing and powering AI and the new era of technology it’s promising.
What’s missing are proposals to protect news outlets and other creators of digital media from being ripped off by AI companies training their systems and producing synthetic content based on the work of others.
Publishers, actors, authors and others have sued to protect their intellectual property, and at least one other state has strengthened laws to protect its music industry from digital thievery.
All 50 states introduced AI legislation last year, including 38 that passed around 100 AI bills, according to the National Conference of State Legislatures.
A handful of bills now being debated in Olympia are largely derived from other states’ work.
I suggest Washington also create a version of the ELVIS Act that Tennessee passed in 2024 to protect its music industry, and broaden it to also protect news organizations.
Washington Sen. Maria Cantwell, in partnership with Tennessee Sen. Marsha Blackburn, is doing that nationally.
Their COPIED Act would protect not just musicians and songwriters but actors and journalists who are seeing their work taken and monetized by AI companies.
Cantwell told me last year that “you really want to rein in these companies who basically are cannibalizing the content, and now could take the content in a major way with AI.”
Washington’s Legislature could add protections for copyrighted material to one of the AI bills on the table or via a new, stand-alone bill.
Tennessee’s law prevents unauthorized use of people’s names and likenesses. Its protections also apply to material owned by artists’ heirs.
Its ELVIS Act extended those protections to voices and strengthened copyright protection. The name stands for Ensuring Likeness, Voice and Image Security, with a nod to the late King of Rock ’n’ Roll.
The federal COPIED Act requires transparency, including federal standards “to identify if content has been generated or manipulated by AI, as well as where content originated,” Cantwell and Blackburn said in their announcement.
COPIED stands for “Content Origin Protection and Integrity from Edited and Deepfaked Media.”
Their bill would prohibit unauthorized use of watermarked content. It would enable content owners, including musicians and journalists, to protect their work and get paid for its use by AI providers.
It authorizes enforcement by the Federal Trade Commission and state attorneys general, plus lawsuits by artists, news outlets and others whose content is used without permission.
Washington doesn’t have Nashville, but it has a strong music industry that needs support and protection.
Washington also has more than 100 at-risk small businesses providing essential, local journalism. They desperately need policymakers’ help to get fairly compensated by tech companies profiting from their work.
Protect them all, legislators, with a version of the ELVIS Act that includes the broader scope and journalism protections of the Cantwell bill.
Gov. Bob Ferguson was receptive to this idea when I pitched it during a Jan. 9 interview in Olympia.
“I’d be very interested in that,” he said.
“I’ve spoken a lot about this at conferences … trying to maximize the benefits of AI and limit the harm, whether it’s minors and chatbots or artists and news media and their use of it,” he said, “so (I’m) supportive of that and I’ll be curious if anyone in Olympia this year is proposing something similar.”
At Ferguson’s request, the state House and Senate are considering bills regulating AI companion chatbots, which have been linked to teen suicides.
Senate Bill 5984 and House Bill 2225 would require chatbot developers to “implement and publicly disclose protocols to detect and respond to self-harm or suicidal ideation, including referrals to crisis resources,” as described in Ferguson’s announcement.
I’m less enthusiastic about House Bill 2157, which aims to prevent discrimination by AI companies.
HB 2157 sounds good on the surface. It’s intended to protect people from discriminatory decisions made by AI-powered systems, such as rejections for loans and insurance.
But HB 2157 doesn’t enable people to appeal decisions made by AI systems, unless they initiate a costly lawsuit, and it doesn’t have enforcement provisions like similar bills elsewhere. I wonder if it would actually protect banks and insurance companies; they’re exempted if they follow a list of best practices and industry standards.
HB 2157 is also a complex hairball that could mire legislators and derail progress on other work in their short 2026 session.
I say that based on numerous objections raised during a hearing last Wednesday and what happened in other states.
Colorado led with a similar AI discrimination policy in 2024. But it prompted big debates during a special session last year and was delayed. Virginia legislators passed a similar bill in 2025 only to see it vetoed.
Another proposal in Olympia, House Bill 1170, would require large AI systems to provide an “AI detection tool” to users and add disclosures identifying AI content.
At the hearing Wednesday, a reasonable concern was raised about HB 1170 infringing on the First Amendment by inserting government messages into noncommercial speech.
The COPIED Act has the same transparency goal. But instead of mandating labels, it calls for “guidelines and standards for content provenance information, watermarking and synthetic content detection.”
Cantwell and Blackburn first proposed this in 2024 and it hasn’t advanced since they reintroduced it last April. Perhaps Washington, where many important technology standards have been developed, could move this forward in the AI era.
While it’s lagged on the COPIED Act, Congress fought hard, with strong bipartisanship, to protect states’ right to regulate AI companies. It rejected efforts by President Donald Trump and cronies to sideline states and bar them from passing AI laws.
Tech giants pitch a fit about any regulation. They’ll still be fine. But the public won’t be, if it loses agency, its ability to discern reality and authentic sources of news and media, all to a flood of opaquely operated supercomputers.
So keep at it, legislators. Put constituents first and strengthen protection for material created in Washington that’s being scraped and regurgitated to enrich trillion-dollar AI companies.
Washington can protect innovation, consumers and content creators at the same time, by drawing on Cantwell’s proposal and Tennessee’s policy.
But instead of Elvis, how about memorializing Aberdeen native Kurt Cobain with the KURT Act: Kill Unauthorized Replication and Taking.

