
California recently signed some legislation into law. It’s a start. Congress is working on the “No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act.” Still in development in the House, but it has bipartisan support.

Call your congressperson, ask them to co-sponsor and/or vote for it.

https://www.cbsnews.com/losangeles/news/california-bills-pro...

https://salazar.house.gov/media/press-releases/salazar-intro...

https://files.constantcontact.com/1849eea4801/695cfd71-1d24-...



No doubt it’s bipartisan!

Politicians’ careers live and die in the fickle Court of Public Opinion. They’re probably the cohort most susceptible to AI fakes.

One of the rare times, it seems, that politicians’ incentives are aligned with the populace. (Yes, I could have left that last part out.)


I'm sure that impersonation of Taylor Swift really scared the representatives. A billionaire trying to copy the likeness of another billionaire for votes. I'm sure even some staunch conservatives realize how badly that can end.


Seems silly. What if I train my model on somebody who sounds like a somebody?


Whether it is a copy or a soundalike does not matter. There are prior legal cases against using impersonators in commercials that would apply here. It happened to Tom Waits twice, and both times he sued and won: https://www.youtube.com/watch?v=H6y1kc8Equk

The point is that Geerling is doing Youtube videos in the same field as the products this company is using his voice for. That makes it appear as if Geerling would be endorsing their products. If his voice had been used for, say, a nature documentary then people wouldn't have made the connection that it would have been him.


I imagine it would then depend on the intent and how that voice was presented.

If you got Bob, who sounds awfully like Mr Very Famous Guy, to record vocals that you then use to train your AI, and used that vocal clone to sell your nutritional yeast extract as though it’s Mr Very Famous Guy selling it, that would likely be a problem.

If you used the vocal clone to sell it, but said something like “oh hey it’s Bob here lots of people tell me how much I sound like Mr Very Famous Guy but I’m not him” then Mr Famous might have a case for his name being used without permission, but probably not the vocal clone.

But it’s all so new and there’s no precedent.

Given the lawyers are all busy working out whether using copyright protected books and music to train generative AI is legal or not - and have good arguments on both sides - it’s all a bit unclear how stuff like this will work out in the end.


This Ain't Very Famous Guy: An AI Parody


You go to court and see what happens.


Oh so it's like patents


But why isn't it like copyright?

Actually that's easy: it wouldn't be profitable, yet.


Then you'd be able to prove that you actually did that, for one.


That's what happened with Scarlett Johansson's voice in the OpenAI thing, right? Or at least I think it was a claim at the time.


She wrote a strongly worded tweet and people outraged for 0.03 ms and moved on to the next thing.


They moved on because OpenAI immediately removed the voice.


They removed a voice. The one that was removed didn’t sound much like her, but I guess it was the closest.


clearly her, her friends, and a lot of fans disagreed. and OpenAI didn't stake their defense on "it didn't sound much like her", so I'm inclined to believe it was the closest.


I mean, it’s the same as trademark disputes; legal standards will slowly be cobbled together from statutes, regulations, and random judges setting precedent. “Confusion in the marketplace” seems like a potentially relevant term — accidentally producing a product similar to an existing person’s voice is one thing, but publishing it in a manner and/or context that makes it seem like that person recorded the lines is something else entirely.

Anyway, given how the election is shaking out on Twitter, I have a feeling political usage will spark legislation and precedent far before commercial usage does. But that’s just a plain guess


Or any other copyright, for that matter. What if you copied a CC licensed sound-alike knockoff of a pop song, but the owner of the original song's master thought it sounded more like the original? This is just a new expression of an old problem.


And the flip side. What if somebody who sounds like the person you trained on accuses you of stealing their voice? Assuming malice from similarity is going to sometimes lead to wrong results.


If your model was trained on data from somebody who sounds like somebody else, but users prompt it with "somebody else" rather than "somebody" to generate derivatives, then you know your answer :)


I actually like this: it might be the first case of a "support the underdog" phenomenon. Suddenly some voice actor who sounds a lot like the already rich popular one, can get a bunch of gigs by barely talking a bit into an AI and selling their likeness. (doesn't really extrapolate into the future properly, but still not many examples of this)


That’s what courts are for, and sounds like you’d have a defensible case.


They’re stifling creativity with these anti-AI bills! “No AI unauthorized duplications”… these regulations are going to hold this country back while others advance. Marc Andreessen is very much against this government overreach


You joke, but couldn't I just pay a Chinese company to train a model for me that sounds like a certain person and use it with impunity?

Sounding like someone certainly isn't illegal and a foreign company isn't going to respond to a lawsuit that aims to determine the source of their training data.

You'd just have to avoid pretending to be that person in a deceptive way (parody is legal).


It's going to be an interesting First Amendment question.


Might as well make photoshopping and manual audio tweaking / impersonation illegal, since it's the same ballpark, just less effort.


A bit like breaking a door is "the same ballpark" as unlocking it with a key. Or paying with legit currency instead of counterfeit.

Sometimes all you want is the effect, other times it's important that you're accurately representing effort or accounting for other human considerations.


Might as well make forging signatures and identity theft legal. Who is the government to say which squiggles I may and may not write?

Society is about compromises and balancing different needs against each other. Sometimes we go one way, other times we go the other way; there is no one principle that solves every situation.


no one demands a moratorium on pens and pencils because someone might write illegal squiggles with them. we simply accept the fact that 0.0001% of squiggles will be illegal, harmful, hateful, unethical, unsafe, etc, so we can use the pen and pencil tech to write the other 99.9999% of squiggles.

the current lobbying and legislating efforts seek to outlaw pens and pencils produced by anyone but a handful of US corporations, who only let you use their pens and pencils if you let them look over your shoulder while you write your squiggles.


yes, and if it became, say, 10-20% squiggles we'd have a different conversation. Those 0.0001% will be fought in court. I think it's obvious which industry I'm referring to.


So currently, if I Photoshop a picture of Scarlett Johansson into an ad for my hot dog stand, that would be unambiguously no problem? Nobody's talking about making AI illegal, but some people seem to think it's a get-out-of-jail-free card for copyright violations, and it's just not.


But I think if you photoshopped someone into an advertising image in a way that made them appear to endorse your product you would probably very quickly be hit with (and lose?) a lawsuit, right?

So I’m increasingly of the opinion that it’s not the tool that needs to be regulated, but the use of the output.

Clone voices? Fine. Clone voices for deceptive or commercial purposes without the person’s consent? Not fine.

But then how do you prove it, what is deceptive, what is non-consenting voice cloning, yadda yadda.

I imagine we will shortly see a raft of YouTubers adding “do not clone my voice” notices to their channels like the Facebook “by posting this notice you remove all rights for Facebook to steal the copyright in your photos” spam posts that were doing the rounds at one point.


>you would probably very quickly be hit (and lose?) a lawsuit right?

A lawsuit from a private individual or business entity is very different from the federal, state, or municipal government attempting to silence you; the latter is prohibited by the First Amendment.

I find it appalling that this needs to be spelled out.


I wasn’t replying to the comment about the First Amendment, I was responding to a (what I assume was a somewhat satirical) comment about making photoshop illegal.

Fully aware of the difference between a civil lawsuit vs government lawsuit though so you can pop your italics back in the box and rest easy!


We need better libel and slander enforcement. Treat realistic media as a truth claim that is subject to libel laws.


I do think deepfakes or anything done to intentionally mislead/impersonate should be illegal, yes. Intent is 90% of the law.


The copyright hell carries on, it looks like.



