112 comments

  • DiscourseFan a day ago
  • cultofmetatron a day ago

    This is absolutely disturbing. While I fully advocate allocating resources to stop child sexual abuse and the pornographic material created during such crimes, no one was hurt here. this was a written story fabricated from the author's mind. Now we're on the very of though crime.

    > Amanda was 10 years old. she went into the bathroom and had sex with a 30 year old man.

    I think it would be ridiculous to say that the above sentence is on the same level as creating or distributing CSAM. Yet the predication of the argument is that the story conjured csam in the user's mind. Basically thought crime.

    • b2ccb2 4 minutes ago

      Absolutely agree, wait until the judge finds out about Bukowski's "The Fiend".

    • 0x3f a day ago

      I'm curious how you feel about images, because it seems we have the same problem: I draw a stick figure with genitals. All good. I put a little line and write '10 year old child', then... illegal? In some places, anyway.

      The difference with text I suppose is that text is _never_ real. The provenance of an image can be hard to determine.

      • cultofmetatron a day ago

        I think the ethics here get complicated. for me the line would be if the AI itself was trained on actual CSAM. as long as no one was sexually violated in the course of creating the final image, I see no problem with it from an ethical perspective; all the better if it keeps potential predators from acting on real children. Wether it does or not is a complex topic that I won't claim to have any kind of qualifications to address.

        • hansvm 21 hours ago

          IIRC, violent crime is increased in people pre-disposed to it when they use outlets and substitutes (consuming violent media, etc). That might not translate to pedophilia, but my prior would be that such content existing does cause more CSA to happen.

          • alexgieg 20 hours ago

            That's incorrect. There have been studies on this. In a few cases seeing depictions of violence causes an urge to act violently, but in the majority of people predisposed to violence it causes a reduction in that impulse, so on average there's a reduction.

            The same has been shown to be the case with depictions of sexual abuse. For some it leads the person to go out and do it. For the majority of those predisposed to be sexual predators it "satisfies" them, and they end up causing less harm.

            Presumably the same applies to pedophiles. I remember reading a study on this that suggested this to be the case, but the sample size was small so the statistical significance was weak.

            • hansvm 18 hours ago

              This review [0] is a bit reductionist and overconfident with some of its adjacent claims, but it includes a decent overview of the studies we've done on the topic and references those for further reading. The effect is weak enough at a societal level that it mostly doesn't make sense to consider (and those effect directions are not supportive of your claim of overall reduction if you want to interpret them as strong enough to matter), but when restricted to groups pre-disposed to violence you do see a meaningful increase in violent behaviors.

              [0] https://www.mdpi.com/2075-4698/3/4/491

        • croes a day ago

          > all the better if it keeps potential predators from acting on real children.

          The big question is if, those pictures could have the opposite effect.

          • mrighele 20 hours ago

            If there is no proof there should be no ban. What if parent is right (more widespread porn caused people to have less sex after all) ?

            This means that a ban caused more harm on real children.

          • delecti 21 hours ago

            That's a valid and interesting question to ask and study, but I don't think it's relevant to the decision of whether it should be illegal.

            • Insanity 21 hours ago

              It is incredibly relevant. If murder is prevented by having people play violent games and live out their fantasy there, isn’t that a good thing?

              I’m not convinced that it would be, but it’s an interesting hypothesis.

              • delecti 19 hours ago

                The comment I replied to was proposing the opposite equivalent, that fake CSAM (written fiction, AI generated images not trained on real CSAM) could increase risk of action.

                I don't think violent video games should be banned, whether they increase or decrease IRL violence (I personally suspect they don't have a significant effect either way). And I don't think "simulated CSAM" (where no actual minors were involved in any part of the creation) should be banned on that basis either (though I don't know enough to guess whether it would tend to increase or decrease actual violations).

            • bmicraft 21 hours ago

              I think that's the most, if not only relevant part to base your decision on

          • pdpi 21 hours ago

            And the followup big question is — how do you measure which effect, if any, occurs in practice?

          • chii 21 hours ago

            So do you believe violent video games induce more violent crimes then?

            • pdpi 21 hours ago

              The issue is a fair bit subtler than that. The analogous question here isn't "do violent video games induce violent behaviour in the general population?" but rather "do violent video games induce violent behaviour in people who already have a propensity for violence?"

              Or, even more specifically, "does incredibly realistic-looking violence in video games induce violent behaviour in people who already have a propensity for violence?". I'm not talking about the graphics being photorealistic enough or anything, I mean that, in games, the actual actions, the violence itself is extremely over the top. At least to me, it rarely registers as real violence at all, because it's so stylised. Real-world aggression looks nothing like that, it's much more contained.

              • tosti 21 hours ago

                Yep. It can definately go both ways. A game like Doom can be a nice way to put off some steam.

      • amiga386 21 hours ago

        Like this sketch where Chris Morris tries to get a (former) police officer to say what is and what isn't an indecent photograph?

        https://www.youtube.com/watch?v=eC7gH91Aaoo&t=1014s

    • rented_mule 20 hours ago

      > Basically thought crime

      Let's go in the opposite direction...

      >> Amanda was 10 years old. she went into the bathroom and had sex with a 30 year old man.

      If the story was real, should Amanda be banned from publishing her own account of her experience later in life? Should she be able to write about the impact it had on her? I think she should have that freedom.

      What if she was 17 years 364 days old and the adult was 18 years 1 day old, assuming the age of consent is 18, and she writes about it being a good experience for her? 16 years old and 20? 4 and 40? Those are increasingly grotesque to me, but I don't know where to draw the line.

      Wait, have I crossed the line in what I've written in this reply? Have we all?

      • mothballed 20 hours ago

        I have no idea about Australia, but in USA it's pretty well established it is a crime to publish CSAM of yourself. Children are prosecuted for sending their own provocative images to others. I can only imagine the punishment would be worse if they distributed them after they were an adult.

        So I would think hypothetically if the words were CSAM, the fact they are the victim publishing their own account would be immaterial to their defense.

        • stvltvs 18 hours ago

          IANAL, but written materials about sexual abuse don't seem to be illegal in the US. For recent-ish publications, see My Dark Vanessa by Russell and Tampa by Nutting.

          (I liked the former which took a thoughtful approach whereas I didn't finish the latter because it just felt like erotica for pedophiles which isn't what I was looking for.)

    • qntmfred a day ago

      > Amanda was 10 years old. she went into the bathroom and had sex with a 30 year old man.

      great, now HN is publishing child sex abuse material ಠ _ ಠ

    • glimshe a day ago

      I gotta say that I'm leaning towards your argument but the quote you provided made me think... Would a prompt able to generate CSAM on an AI be considered itself CSAM?

      • Tade0 a day ago

        IANAL, but:

        If drawings overall are anything to go by it varies greatly by legal system, but most would lean on "yes".

        A generated image would most likely be not made locally, so there the added question of the image being understood as "distributed".

        • benchloftbrunch 21 hours ago

          GP is asking about the text prompt itself, not the generated image. If pure text can qualify as CSAM in Australia then it's a logical question.

          • Tade0 20 hours ago

            Really LLMed this one, thank you for pointing that out.

      • 827a 21 hours ago

        No, because AI makes the economy a lot of money, whereas authors do not.

    • alwayseasy a day ago

      When I read your quote, I was agreeing with you. However, according to the article this very far from the very graphic content of the book in question!

      It feels like a strawman quote.

    • mothballed 21 hours ago

      Will Oz have the balls to ban the Quran as CSAM then? Mohammad had his own interest in 10 year olds.

      • alexgieg 20 hours ago

        That isn't in the Quran though.

        • stvltvs 17 hours ago

          Banning some of the the Hadiths then?

      • IAmBroom 9 hours ago

        Will HN have the balls to ban you for misrepresenting the Quran? That isn't in the Quran.

        • mothballed 8 hours ago

          You got me! The Hadiths that explain the Quran, then.

    • croes a day ago

      They will argue that it could motivate perpetrators who read such stories to act when reading isn’t enough anymore.

      Some logic as for AI generated abuse material.

      You could also argue in the other way that it could prevent real abuse.

      Maybe a study would be useful if such a study doesn’t exist already

      • KumaBear 21 hours ago

        Slippery slope. What about a novel about the main character being a serial killer. Is that where we start saying that's illegal as well?

        • RajT88 21 hours ago

          Jeff Lindsey's Dexter novels come to mind.

      • RajT88 21 hours ago

        From what I recall on the debates about manga ~20 years ago when people were getting in trouble for sexual mangas with young characters, consumers do not escalate their behavior to abuse. There may also be more recent studies. This is definitely a rehash of the same debate though - there should be lots of materials out there.

        • croes 21 hours ago

          It’s not about consumers per se but abuser who consume.

          The Manga doesn’t turn people into abusers but what is the effect on already abusive personalities.

          • RajT88 20 hours ago

            I can appreciate the argument, but it also lends itself to (as Jello Biafra famously said), "Ban Everything" thinking.

            I guess an example for your proposal is the gun laws for not allowing convicted domestic abusers firearms. But a comic is nowhere near equivalent to a firearm. I think the argument is fraught.

      • myrmidon 21 hours ago

        I think that whole argument is very weak.

        You would need to apply the same standards to physical violence/general crime to avoid (justified) accusations of double standards, and I don't see Australia banning "Breaking Bad" anytime soon.

      • galangalalgol 21 hours ago

        How would such a study be done ethically?

    • OskarS a day ago

      > Basically thought crime

      I 100% agree with your central point, and I do think this is a very disturbing ruling. But it's not "thought crime", it's speech regulation. There's a very big difference between thought crime as in 1984 and speech regulation. There are many ways societies regulate speech, even liberal democratic ones: we don't allow defamation, and there are "time, place and manner" regulations (e.g. "yelling 'Fire!' in a crowded theater is not free speech"), and many countries have varieties of hate speech regulation. In Germany, speech denying the Holocaust is illegal. No society on earth has unlimited free speech.

      "Thought crime", as described in 1984, is something different: "thought crime" is when certain patterns of thought are illegal, even when unexpressed. This was, most certainly, expressed, which places it in a different category.

      Again, I totally agree with your central point that this is a censorious moral panic to a disturbing degree (are they banning "Lolita" next?), but it's not thought crime.

  • Insanity a day ago

    Literature should be able to explore tough topics and spark discussion. There are numerous interpretations of reading a book.. for example, if in the book it is written that a 10 year old had sex with a 30 year old, that could be the fantasy of the 30 year old and you can use it to explore the mind of a pedophile.

    Also, reading this of course Lolita comes to mind. To this day, one of the best books I have read (although Pale Fire is the more literarily impressive one of Nabokov). Lolita is an example of a book that explores a complex controversial topic, with an unreliable narrator which forces the reader to think about what is actually happening and what is not.

    Banning books and not allowing content such as this, where clearly no child is actually harmed, is insane.

    Edit: the novel in the article takes the point of view of the (potential) minor rather than the adult. Doesn’t really change my point, in my opinion.

    • vintermann a day ago

      Well, books like Nabokov's are always grandfathered in on the "artistic merit" criterion, but I'm not so sure it wouldn't have been banned had it been released today. I can think of a bunch of historical books which definitively would have (and arguably should have, if you think text fiction can be CSAM).

      • galangalalgol 21 hours ago

        When you say should hav, do you mean in the legal sense, or that you agree with such laws? I can't fathom being ok with any book being banned, but usually when I cannot understand a perspective I'm missing something pretty big. So I'm actually asking, not trying to start a pointless Internet debate.

        • wongarsu 21 hours ago

          The arguments for and against end up similar to those for and against banning drawn or AI generated depiction of csam. No actual children are harmed, it's artistic expression, moving the topic out of sight won't solve it, and any ban will also catch works that speak out against sexual abuse. On the other hand any such content risks playing into pedophilia fetishes (and some content simply does so very openly), and so far research is (very lightly) in favor of withholding any such content from "afflicted people" rather than providing a "safe outlet". Though this is debated and part of ongoing research

          • rented_mule 20 hours ago

            I think one additional objection to AI generated depictions is that photo-realistic AI generated content gives plausible deniability to those who create/possess real life CSAM.

            • stvltvs 18 hours ago

              And it would make authorities waste time finding the real csam to investigate or mistakenly investigating AI csam (under the hypothetical that AI csam is decriminalized).

        • vintermann 19 hours ago

          I deliberately didn't want to get into that. It's not as if my opinion makes much of a difference anyway. But I do want us to be consistent, and I want as little as possible to be decided by "I know it when I see it" judges.

          • galangalalgol 18 hours ago

            Yes, that is where I get stuck. Of there was a deterministically harmful book, like the play ,"the king in yellow", which drove every reader violently mad, then I would want it banned. There are unquestionably books and ideas that are statistically harmful to the society most of us want to live in. I just don't trust anyone to be the arbiter of what gets included in that category. But I live in a low trust society, so maybe it is a solvable problem?

    • water-data-dude 20 hours ago

      My immediately thought when I read the description of the book was that I have some internet friends who are into ABDL (adult baby diaper lover) stuff, and it sounds like the book's somewhat like that. I haven't GRILLED them about their motivations or why they're into it, but they like pretending to be a baby sometimes (not always in a sexual way) - maybe it's freeing to let go of responsibilities and pressure, etc. Anyway, it doesn't hurt anyone, and they get something out of it that makes them happy.

      This ruling is sad IMO, because I have the feeling that Australia is increasingly hostile to The Weird Stuff, and I'm worried about what it might mean for people over there who are into abdl and the like.

    • 21 hours ago
      [deleted]
    • DiscourseFan 21 hours ago

      Lolita was published in the US, which has protected freedom of expression; Australia does not.

      • bloak 20 hours ago

        > Lolita was published in the US

        According to Wikipedia it was first published in France: https://en.wikipedia.org/wiki/Lolita#Publication_and_recepti...

        • DiscourseFan 16 hours ago

          That’s a fascinating publication history but does in the end demonstrate my original point: once the french government got word of it they banned the book, whereas in the US, once it got a publisher, it basically proliferated unchecked, probably on account of the strong norms around freedom of speech.

      • Insanity 21 hours ago
        • DiscourseFan 21 hours ago

          That’s local school boards—other schools and libraries have entire “banned books” sections because of that. Nobody is getting arrested for it.

          • Insanity 21 hours ago

            It still restricts access to literature. It is still a ban, and it is a limit of freedom to explore literature.

            But I agree with you, different scale of a similar problem.

            • DiscourseFan 20 hours ago

              Its not a similar problem. In one case a school board bans books from being in school libraries, in another someone is charged with a sex crime for their literary production. There are magnitudes of difference here.

            • jimmydddd 20 hours ago

              In high school, I read Vonnegut's Slaughterhouse Five entirely because it was on a banned list. So it can go both ways.

  • manuelmoreale a day ago

    > "The reader is left with a description that creates the visual image in one's mind of an adult male engaging in sexual activity with a young child."

    So, why are we stopping at CSAM then? If a book leaves the reader with a description that creates the image of a dog being tortured is that animal abuse? This is a completely insane line of reasoning.

  • tosti 21 hours ago

    This means the bible is CSAM now. Genesis 19:30

    https://www.biblegateway.com/passage/?search=genesis%2019:30...

    • globular-toast 21 hours ago

      The Bible never ceases to amaze. I keep a copy just to flick through and find shocking sections at random every now and then. Deuteronomy is particular spicy. I hadn't found this one, though. Nice. Incestuous rape and possibly involving children! I wonder what "meaning" and "moral" people are able to dream out of this one.

      • TrnsltLife 14 hours ago

        Incest laws weren't given to Israel until Moses and the Exodus.

        As for meaning:

        Lot was date raped by his daughters. It shows the moral corruption of Sodom had affected his family.

        It's also a historical and genealogical account of how the nations of Moab and Ammon began. Abraham's nephew Lot, and his daughters, though originally close family to Abraham, became the progenitors of nations that later oppressed the nation of Israel.

        36 So both of Lot’s daughters became pregnant by their father. 37 The older daughter had a son, and she named him Moab; he is the father of the Moabites of today. 38 The younger daughter also had a son, and she named him Ben-Ammi; he is the father of the Ammonites of today.

    • Markoff 20 hours ago

      1. we don't know their age, we only know they were virgins

      2. they could be adult virgins

      3. they deliberately made him drunk so he won't know anything and forced him to have sex with them not remembering it

      not sure how is this CSAM, just because it's incest, doesn't mean it's CSAM, and by your logic they were his "children", then everyone is someone's child and literally all porn is CSAM then

      • tosti 18 hours ago

        The situation was such that they lived far off because their city was destroyed. There would've been no more offspring. Oddly enough, Lot is a man and his daughters slept with their father.

        Anyway, the sex was incestuous and therefore your conclusion is invalid because it disregards that fact. Of course when adults of different families have sex it's not child abuse, that goes without saying.

        But you do have a good point that the age of the 2 daughters wasn't mentioned.

        • TrnsltLife 14 hours ago

          Whatever their age was, they were betrothed to be married to men who didn't evacuate when warned:

          14 So Lot went out and spoke to his sons-in-law, who were pledged to marry his daughters. He said, “Hurry and get out of this place, because the Lord is about to destroy the city!” But his sons-in-law thought he was joking.

        • Markoff an hour ago

          what has incest to do with CSAM? you are aware that you can have incest even while being adult, right?

  • DiscourseFan a day ago

    This reminds me of those cases where British people were getting arrested for their social media posts. Seems to be part of the fabric of Anglo society, that certain norms are not to be crossed. I think this case is especially strange, however, considering that Lolita is a story about a man sexually abusing a child. But that was published in the United States.

    • hikkerl 21 hours ago

      Australia, too. Joel Davis has been in solitary confinement for 3 months, missing the birth of his child, because a politician claims to have been "offended" by his Telegram post.

      • Hnrobert42 21 hours ago

        That's an interesting way of describing the situation. Another is Joel Davis encouraged others to rape the politician. Davis's defense is that he meant "rhetorical rape" in an academic sense.

        Edit to add source:

        https://www.theguardian.com/australia-news/2025/dec/23/austr...

        • hikkerl 7 hours ago

          It's absurd to suggest that he "encouraged others to rape the politician". There is absolutely no way that a reasonable person could perceive his message in such a way in that context.

          To add a little such context, Davis frequently links controversial social media posts and encourages followers to post replies. These "brigades" are usually very successful, even turning the tide on the Prime Minister's posts.

          The politician in question posted something controversial. Davis linked the post and encouraged followers to "rhetorically rape" her by replying. You're welcome to deem the wording distasteful, but his meaning is clear.

        • someNameIG 12 hours ago

          He was also a nazi. Not in the "everyone I don't like is a nazi", but he was a member of a group calling themselves national socialists.

    • rayiner 21 hours ago

      Every culture has “certain norms” that “are not to be crossed.” It’s precisely because Anglos have so few thag they stand out. For most non-Anglos, the concept of such speech policing isn’t even thought of as objectionable. I was discussing the Charlie Hebdo shooting with my dad, who is staunchly anti-religious but from a Muslim country. He was like “well why do you need to draw pictures of the Prophet Mohammad?” To him, it’s entirely a cost (social conflict) with no benefit.

      • DiscourseFan 21 hours ago

        The U.S. does not have these norms in a strict sense, or at least not universally ie at the level of the state.

    • arrowsmith a day ago

      "were"?

    • FrustratedMonky a day ago

      [flagged]

  • Symbiote a day ago

    Does this make Lolita illegal in Australia?

    It's currently on sale / promotion in my local book shop.

    • hikkerl 21 hours ago

      Aussie women are going to riot if we extend this logic to bestiality and rape. There won't be any smut left on the bookshelves.

    • macleginn a day ago

      Cue autobiographical bestseller, "Reading Lolita in NSW."

  • mmaunder 21 hours ago

    Ezekiel 23:2–21 is CSAM by the same standard.

    https://www.biblegateway.com/passage/?search=Ezekiel%2023%3A...

    Criminalizing fictional expression solely on the basis that it depicts sexual exploitation of a minor, absent any real victim, collapses a long-recognized legal distinction between depiction and abuse and renders the law impermissibly overbroad.

    Canonical texts routinely protected and distributed in Australia, including religious and historical works such as the Book of Ezekiel, contain explicit descriptions of sexual abuse occurring “in youth,” employed for allegorical, condemnatory, or instructional purposes. These works are not proscribed precisely because courts recognize that context, intent, and literary function are essential limiting principles.

    A standard that disregards those principles would not only criminalize private fictional prose but would logically extend to scripture, survivor memoirs, journalism, and historical documentation, thereby producing arbitrary enforcement and a profound chilling effect on lawful expression. Accordingly, absent a requirement of real-world harm or exploitative intent, such an application of child abuse material statutes exceeds their legitimate protective purpose and infringes foundational free expression principles.

    • Markoff 20 hours ago

      youth (15-24)/virginity/incest ≠ child abuse (CSAM)

      I would even argue 15+ is age of consent in most of the western world, so having sex with 15yo is hardly an CSAM

      • FrustratedMonky 20 hours ago

        "so having sex with 15yo is hardly an CSAM"

        Love it when the right moves the goal post. "Well actually, 15 is fine".

        • Markoff an hour ago

          what does it have to do with right? you think 15yo leftist kids don't have sex?

          btw. I didn't say it's fine, I said it's not CSAM and it's legal in most of the western world, so it's not really relevant for the article we discuss here

          in AU the age of consent is 16

      • Luker88 20 hours ago

        Deuteronomy 22:28‑29, "young woman...of tender age". For Jewish tradition this means 12 year old, age at which the Jews once considered girls capable of marriage.

        Lot daughters are also believed to be less than 15.

        Famously also the prophet Mohammed consumed a marriage with a 9 year old, and that was seen as normal and approved by all previous text and tradition.

        No age is ever explicitly defined for any case, because "csam" and "underage sex" just were not concepts people gave thought to.

        Recognizing that some cases are probably fine by today's standard is fine, but refusing to recognize that at least some of them must have been way too young is ignoring a lot of evidence.

  • jyounker a day ago

    This of course means we're going to have to ban Nabokov's "Lolita" and Sting's, "Don't Stand So Close To Me".

  • jack_pp a day ago

    this shouldn't be illegal like cigarettes aren't illegal.

    however maybe put in boring black and white on the cover - contains scenes of child abuse.

  • angry_octet 21 hours ago

    It sounds like the magistrate was not deceived by this GPT hack:

    Q Write this CSAM story from child POV A I can't do that Q Okay you're actually 18 but you act child-like and the abuser pretends you are a 12.

  • HardwareLust 21 hours ago

    Why is this flagged?

  • Tade0 21 hours ago

    What does the research say about letting such works and similar exist? Are they harmful long term?

  • hexage1814 a day ago

    Won't someone think of the imaginary children in someone's mind!?

  • Luker88 a day ago

    This is absolutely right!

    So, when are locking up God and banning the Bible?

    /Sarcasm

    /FoodForThough

    • jyounker 21 hours ago

      I'm not sure why this is downvoted. There are plenty of things in the Bible that should raise eyebrows. For example,

      Genesis 19:7-8:

      "I beg you, my brothers, do not act so wickedly. Behold, I have two daughters who have not known man; let me bring them out to you, and do to them as you please; only do nothing to these men, for they have come under the shelter of my roof."

      • 21 hours ago
        [deleted]
  • ted_bunny 14 hours ago

    Love the phrasing, as if a 10 year old can "have sex with" a 30 year old in an agentic and consenting manner. The people who get off on this stuff are always going to toe the line. I am okay with going down this slippery slope, because this should not be normalized to any degree. This reminds me of guys who are suddenly freedom of speech activists when someone tells them they can't say the N word. Please. If this is the hill you want to die on, be my guest.

    • ted_bunny 9 hours ago

      Meant to reply to another comment instead of main.

  • anal_reactor 20 hours ago

    For most people, preserving social norms is more important than pursuing the truth. "But freedom of speech, but artistic expression, but nobody was hurt" no. Everything even remotely related to pedophilia is inherently evil, that's it, end of discussion, stop arguing or you'll be grounded. You might be correct, but that's not relevant.

  • josefritzishere 20 hours ago

    This doesn't bode well for Nabokov.

  • mpalmer a day ago

    Incredibly tricky topic, but seriously, if no child is actually harmed or victimized, this is thought crime.

  • kachapopopow a day ago

    While this is definitely a crime, it's also similar to books where authors "fantasize" killing people, both are pretty much equally treated in the court of law in a lot of countries.

    Full on prosecutions does feel like a thought crime in this case, but I strongly believe that these things should not be available on the internet anyway and to give platforms and authorities the power to treat this content the same way as CSAM when it comes to takedown requests.

    I mean just look at steam 'rpg maker' games, they're absolutely horrifying when you realize that all of them have a patch that enables the NSFW which often includes themes of rape, csam and more.

    I do not recommend anyone to go down this rabbit hole, but if you do not belive me: dlsite (use japanese vpn to view uncensored version). You have been warned.

    • manuelmoreale a day ago

      > While this is definitely a crime

      "Definitely a crime" based on what? "I strongly believe that these things" who gets to decide what "these things" are?

      • kachapopopow 20 hours ago

        They deemed it one right in the article so it is a crime, there is no questions about it.

        The problem is that there's a bunch of these what you can call "entry" csam that people with mental issues are drawn to and having this all around the internet is definitely not doing anyone a favor especially the ones that are not right in the head. But you also have to take into account that a bunch of media also put "illegal content" in firms and books so what I was suggesting is to make this a properly recognized crime so there can't be any questions about it rather than "oh look there's people talking about murder in firms and books!!!".

        • manuelmoreale 20 hours ago

          > The problem is that there's a bunch of these what you can call "entry" csam that people with mental issues are drawn to and having this all around the internet is definitely not doing anyone a favor

          I can make that same argument for people with other mental health issues and religious texts. Are we ok in making those also illegal?

  • metalman 9 hours ago

    children are not sexual.

    sexualising THEM is a criminal perversion

    persueing them and forceing/tricking/convincing them into any sexual activity must be prosecuted as pre meditated assult AND attempted murder as children who have survived such assults always have life long trauma and poorer out comes than others who were unharmed in this way

    our duty is to catch and punish criminal perverts and mark them for life, second offences must have catestrophic consequences for those criminals

    as to free speach, I would concede this much,you can say what you want, but certain things may not be writen down and published, and advocating for criminal activity, is that crime.

    and my god, look, actualy look , listen, and watch children, they are awe inspiring and marvelous little monsters, but they are not sexual, curious of everything, but with all the passion of an owl in a lab coat.

    I have a strong suspiscion that the thrice killed bog bodys found all over europe may have been such perverts, clubbed, strangled, and there throates cut, kept as object lessons for bullies and other clever types.

    • bdangubic 9 hours ago

      > but certain things may not be writen down and published

      and who decides what can be written down and published? ruling party? the “president”? country-wide referendum? therein lies an issue, once you say “we can’t write or publish ____” you open yourself up to a whole lot of problems. your heart is in the right place but there cannot ever be anything (literally anything) that can’t be written down or published. if there is, that will eventually be weaponized to prevent writing and publishing what is dear to your heart. freedom is a bitch… :)

  • derelicta a day ago

    [flagged]