6 comments

  • rlpb 9 hours ago

    It was always possible to create these kinds of images; the catch was that you needed to be (or find) a highly skilled yet unscrupulous artist, which didn't happen in practice. Now AI has commoditised this. It's not going to go away.

    Regulating services can only go so far because as long as general purpose computing exists people will eventually be able to perform all of these abuses locally.

    I think the only solution that will work in practice is to go after the abusers based on their intent. Going after technology providers is never going to work because the technology is fundamentally general purpose. Wherever the line is drawn, abusers will always be able to take the technology and specialise it for abuse locally.

    Edit: to be clear, I can't think of a legitimate use for this service and it sounds like their behaviour is abusive and they should be shut down. But that won't stop the abuse because sooner or later abusers won't need a service to carry out this abuse. They'll be able to use a generic tool to do it locally instead.

    • Frieren 8 hours ago

      > It was always possible to create these kinds of images. Only you needed to be (or find) a highly skilled yet unscrupulous artist to do it, which didn't happen in practice.

      Because that human being would have been sent to trial and probably prison. Why is nobody going to prison now?

      > Going after technology providers is never going to work because the technology is fundamentally general purpose.

      A tech guy telling the public that "going after technology providers is never going to work" seems very biased. I would propose the opposite: send to prison all these CEOs who create tech that harms people, especially minors. They are getting the profits; they should be paying the price too.

    • aiiizzz 7 hours ago

      I think that once it's an everyday occurrence, it won't be a problem anymore. But right now it still is, because people don't know these images are so easily faked.

    • Avshalom 8 hours ago

      the "technology provider" here is a company that makes an app to make nudes of anybody without their consent. That's not actually very general purpose.

    • ndsipa_pomu 9 hours ago

      This is equivalent to unscrupulous artists blatantly advertising their services. It's difficult to even think of a non-abusive reason to provide or use this kind of service.

  • ndsipa_pomu 10 hours ago

    I can't see how this kind of service can be legal at all if it can produce CSAM images. People have been prosecuted for producing drawings of underage children, which arguably don't actually harm anyone, whereas this app can drastically harm people, young and old.