AI Stable Diffusion, are these images too close to the real thing?

haha, yes, if you read between the lines of a lot of our laws, they amount to 'if you do this, get a lawyer to argue it for you'. ← and they're not even arguing points of law, merely convincing a jury.

Our system, being based on the British one, is founded on a conservative (Catholic) tradition which has since been modified by emotive public opinion and knee-jerk reactions. When a district court judge (roughly the middle tier of the court hierarchy) makes a ruling that actually works, there's uproar, because Australia wants to jail its way out of the problem.

Here’s the interesting kicker: in Australia the age of consent is 16, but for registered child sex offenders it becomes 18. Like I said, laws created by emotive reaction.


Recently I learned that a number of users have gotten permanent bans from some forums because they created AI images trained on pictures of real children, Mary-Kate and Ashley Olsen being one example. It raises the question: if a famous child actor has hundreds of photos of themselves out on the internet, none of them pornographic in any way, are they fair game for people to use in training their AI-generated art or lolicon?

I agree overall that it’s twisting something into a use it was never intended for. The average person who has pictures of themselves as a minor, or of their children, posted on social media should not have those images subverted for such things.