I’d be curious to hear your and/or your guest’s views on how to deal with the possibility of wide-scale consumer noncompliance or indifference toward any restrictions placed on training data and the models created from it, particularly as consumer computer hardware continues to advance and allows more and more complex models to be built by individuals and small groups. Basically, I’m asking: what happens if there’s a use case where rights holders say “no” and the general public says “we’re doing it anyway”? The “sue end users” model worked very poorly for the music industry in the 2000s, and I think there’s a chance that an overly restrictive copyright regime for AI would be rendered similarly unenforceable.
@jennanderson-miller6128 · 4 months ago
This is a good question, and one that is ultimately best answered by the law. Since the tools are so sophisticated and we cannot police everything everywhere, people will inevitably infringe, as they always have. However, most legal systems tend to pursue offenders of scale, i.e., commercial users profiting from the infringement. I think laws are critical and effective for dealing with businesses that have built models using unethical practices. This is also why we need laws to be established, so they can catch up with current practices, which are unregulated. I wish there were better answers, but right now it is all being shaped in real time, with progress happening too quickly.
@pokepress · 4 months ago
@@jennanderson-miller6128 I appreciate your response. I probably should have prefaced this by saying that I’m not entirely convinced the use of copyrighted training data merits compensation; it depends somewhat on the purpose of the resulting model. My question stems from the possibility that the general public will also see it that way and will gravitate toward whichever service meets their needs, regardless of how the data was sourced. Some of this will depend on how court cases are argued and decided, how entrenched the AI suppliers are, and so on, so it’s hard to predict, but I think it’s a legitimate possibility that needs to be considered.