In theory, yes, it is impossible to prevent entirely. Laws and licensing may help, but proving that your work was actually used can be very difficult. In practice, however, there are technical measures that can deter certain uses. For example, Glaze (https://glaze.cs.uchicago.edu/) applies small adversarial perturbations to images that make their style harder for many common text-to-image models to mimic. Nothing is foolproof, though, and newer models can render such protections obsolete.
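As a rough illustration of the general idea only (this is not Glaze's actual algorithm; the file names and the noise budget `epsilon` below are made up for the example), style-cloaking tools add a small, nearly imperceptible perturbation to an image so that the features a text-to-image model extracts no longer match the artist's real style. The sketch below simply adds bounded random noise as a stand-in for the carefully optimized perturbation a real tool would compute:

```python
# Illustrative sketch only: NOT Glaze's method. A real cloaking tool optimizes
# the perturbation against a surrogate feature extractor; here we just add
# bounded random noise to show what an imperceptible pixel-level change looks like.
import numpy as np
from PIL import Image

def add_bounded_perturbation(path_in: str, path_out: str, epsilon: int = 4) -> None:
    """Add per-channel noise bounded by +/-epsilon (hypothetical example values)."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    cloaked = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(cloaked).save(path_out)

# Hypothetical file names for the example
add_bounded_perturbation("artwork.png", "artwork_cloaked.png")
```

Random noise alone offers little real protection; the point of tools like Glaze is that the perturbation is optimized specifically to mislead the style representations these models rely on, while staying nearly invisible to humans.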