Maybe Jeff Bezos will write an article about him and editorialize about “personal liberty”. I have to keep posting this because every day another MAGA lover, religious bigot, or other pretend-upstanding community member is indicted or arrested for heinous acts against women and children.
Geez, two million? Good riddance. Great job everyone!
Goddam what an obvious fucking name. If you wrote a procedural cop show where the child traffickers ran a site called KidFlix, you’d be laughed out of the building for being so on-the-nose.
During the investigation, Europol’s analysts from the European Cybercrime Centre (EC3) provided intensive operational support to national authorities by analysing thousands of videos.
I don’t know how you can do this job and not get sick, because looking away is not an option.
You do get sick, and I would be most surprised if they didn’t allow people to look away and take breaks/get support as needed.
Most emergency line operators and similar kinds of inspectors get that kind of support, so it would be odd if these analysts did not.
Yes, my wife used to work in the ER. She still tells the same stories over and over again 15 years later, because the memories of the horrible shit she saw don’t go away.
Indeed, but in my country the psychological support is even mandatory. Furthermore, I know there have been pilots using ML to go through the videos. When the system detects explicit material, an officer has to confirm it; that prevents them from going through it all day, every day, for each video. I think Microsoft has also been working on a database of hashes that law enforcement provides, to automatically detect material that has already been identified. All in all, a gruesome job, but fortunately technology is alleviating the harshest activities bit by bit.
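For anyone curious, the hash-database idea can be sketched roughly like this. Big caveat: this is a simplified illustration only. Real systems such as Microsoft’s PhotoDNA use perceptual hashes that survive re-encoding and resizing, whereas the plain SHA-256 below only catches byte-identical files, and the `KNOWN_HASHES` set here is a made-up placeholder, not any real law-enforcement list.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list provided by law enforcement (placeholder value).
# A cryptographic hash like SHA-256 only matches byte-identical files;
# production systems use perceptual hashing for that reason.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Hash a file in 1 MiB chunks so large videos don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_for_review(path: Path) -> bool:
    """True if the file matches a known hash; per the pilots described
    above, a human officer still has to confirm any match."""
    return sha256_of(path) in KNOWN_HASHES
```

The point of the design is exactly what the comment describes: the machine does the bulk screening, and humans only see the flagged fraction.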
And this is for law enforcement level of personnel. Meta and friends just outsource content moderation to low-wage countries and let the poors deal with the PTSD themselves.
Let’s hope that’s what AI can help with, instead of techbrocracy
Kidflix sounds like a feature on Nickelodeon. The world is disgusting.
Or a Netflix for children/video editing app for primary schoolers in the early 2000s/late 1900s.
And it didn’t even require sacrificing encryption huh!
Basically the only reason I read the article was to find out whether they needed a “backdoor” in encryption. Guess they don’t need it, like everyone with a little bit of IT knowledge always told them.
“See, we caught these guys without doing it, think of how many more we could catch if we did! Like all the terrorists America has caught by violating their privacy. …Maybe some day they will.”
They also seized 72,000 illegal videos from the site and personal information of its users, resulting in arrests of 1,400 suspects around the world.
Wow
1,393 suspects identified
79 suspects arrested
Over 3,000 electronic devices seized
39 children protected
Imagine if humans evolved enough to self-solve the problem of liking this shit.
Massive congratulations to Europol and its partners in taking this shit down and putting these perverts away. However, they shouldn’t rest on their laurels. The objective now is to ensure that the distribution of this disgusting material is stopped outright and that no further children are harmed.
The objective now is to ensure that the distribution of this disgusting material is stopped outright and that no further children are harmed.
Sure, it’ll only cost you every bit of your privacy as governments outlaw and eliminate any means for people to communicate without the eye of Big Brother watching.
Every anti-privacy measure that governments put forward is always like "We need to be able to track your location in real time, read all of your text messages and see every picture that your phone ever takes so that we can catch the .001% of people who are child predators. Look at how scary they are!
Why are you arguing against these anti-pedophile laws?! You don’t support child sex predators do you?!"
Then you end up like the USA.
USA ain’t got shit on the Eastern Bloc when it comes to sex trafficking.
If that’s the actual splash screen that pops up when you try to access it (no, I’m not going to go to it and check, I don’t want to be on a new and exciting list) then kudos to the person who put that together. Shit goes hard. So do all the agency logos.
Feds have been stepping up their seized website banner game lately. The one for Genesis Market was pretty cool too.
Here’s a reminder that you can submit photos of your hotel room to law enforcement, to assist in tracking down CSAM producers. The vast majority of sex trafficking media is produced in hotels. So being able to match furniture, bedspreads, carpet patterns, wallpaper, curtains, etc in the background to a specific hotel helps investigators narrow down when and where it was produced.
Nice to know. Thanks.
Wouldn’t this be so much better if we got hoteliers on board instead of individuals?
They’re only concerned with the room fees.
Don’t see, don’t tell.
Sadly.
I worked in customer service a long time. No one was trained on how to be law enforcement and no one was paid enough to be entrusted with public safety beyond the common sense everyday people have about these things. I reported every instance of child abuse I’ve seen, and that’s maybe 4 times in two decades. I have no problem with training and reporting, but you have to accept that the service staff aren’t going to police hotels.
Thank you for posting this.
Every now and again I am reminded of my sentiment that the introduction of “media” onto the Internet is a net harm. Maybe 256 dithered color photos like you’d see in Encarta 95 and that’s the maximum extent of what should be allowed. There’s just so much abuse from this kind of shit… despicable.
With that logic, I might as well throw away my computer and phone and go full Uncle Ted.
It is very easy to feel disillusioned with the world, but it is important to remember that there are still good people all around willing to fight the good fight. And it is also important to remember that technology is not inherently bad; it is a neutral tool that people can use for either good or bad purposes.
Raping kids has unfortunately been a thing since long before the internet. You could legally bang a 13 year old right up to the 1800s and in some places you still can.
As recently as the 1980s people would openly advocate for it to be legal, and remove the age of consent altogether. They’d get it in magazines from countries where it was still legal.
I suspect it’s far less prevalent now than it’s ever been. It’s now pretty much universally seen as unacceptable, which is a good start.
The youngest Playboy model, Eva Ionesco, was only 12 years old at the time of the photo shoot, and that was back in the late 1970s… It ended up being used as evidence against Eva’s mother (who was also the photographer), and she ended up losing custody of Eva as a result. The mother had started taking erotic photos (ugh) of Eva when she was only like 5 or 6 years old, under the guise of “art”. It wasn’t until the Playboy shoot that authorities started digging into the mother’s portfolio.
But also worth noting that the mother still holds copyright over the photos, and has refused to remove/redact/recall photos at Eva’s request. The police have confiscated hundreds of photos for being blatant CSAM, but the mother has been uncooperative in a full recall. Eva has sued the mother numerous times to try and get the copyright turned over, which would allow her to initiate the recall instead.
Let’s get rid of the printing press because it can be used for smut. /s
great pointless strawman. nice contribution.
It’s not a strawman if they repeat your own logic back at you. You just had a shit take.
It’s satire of your suggestion that we hold back progress but I guess it went over your head.
I think it just shows all the hideousness of humanity, and all its glory, in a way that we have never confronted before. It shatters the illusion that humanity has grown out of its barbaric ways.
1.8 million users and they only caught 1000?
79
79 arrested, but it seems they found the identity of a thousand or so.
I imagine it’s easier to catch uploaders than viewers.
It’s also probably more impactful to go for the big “power producers” simultaneously and quickly before word gets out and people start locking things down.
It also likely gives you the best $ spent/children protected rate, because you know the producers have children they are abusing which may or may not be the case for a viewer.
Yeah, I don’t suspect they went after any viewers, only uploaders.
Hopefully, they went after the pimps and financiers.
Wow, with such a daring name as well. Fucking disgusting.
I once saw a list of defederated Lemmy instances. In most cases, and I mean like 95% of them, the reason for the defederation was pretty much in the instance name. CP everywhere. Humanity is a freaking mistake.
I regularly see people on Lemmy advocating for pedophilia, at LEAST every 3 months, as a popular, upvoted stance. You can see me arguing with them in my history.
Gross, probably the reason they got banned from Reddit for the same thing: promoting or soliciting CSAM.
I don’t really think Reddit minds that, actually, given that u/spez was the lead mod of r/jailbait until he got caught and hid who the mods were.
Gratefully, I’ve not experienced this.
And I don’t want to know where they are.
What the fuck, I’ve never run into anyone like that. Don’t even wanna know where you see this regularly.
I’m sorry, but classifying that as advocating for pedophilia is crazy. All they said is they don’t know about studies regarding it, so they said “probably” instead of making a definitive statement. You took that word and ran with it; your response is extremely over the top and hostile to someone who didn’t advocate for what you’re saying they advocated for.
It’s none of my business what you do with your time here but if I were you I’d be more cool headed about this because this is giving qanon.
They literally investigated specific time frames of their voyeurism kink in medieval times extensively, wrote several paragraphs in favor of having children watch adults have sex, but couldn’t be bothered to do the most basic research showing that sex abuse is harmful to children.
Even then, a common bit you’ll hear from people actually defending pedophilia is that the damage caused is a result of how society reacts to it or the way it’s done because of the taboo against it rather than something inherent to the act itself, which would be even harder to do research on than researching pedophilia outside a criminal context already is to begin with. For starters, you’d need to find some culture that openly engaged in adult sex with children in some social context and was willing to be examined to see if the same (or different or any) damages show themselves.
And that’s before you get into the question of defining where exactly you draw the age line before it “counts” as child sexual abuse, which doesn’t have a single, coherent answer. The US alone has at least three different answers to how old someone has to be before having sex with them is not illegal based on their age alone (16-18, with 16 being the most common), with many states having exceptions that go lower (exceptions for partners who are close “enough” in age are pretty common). For example, in my state the age of consent is 16, with an exception if the parties are less than 4 years apart in age. In California, by comparison, if two 17-year-olds have sex they’ve both committed a misdemeanor unless they are married.
none of this applies to the comment they cited as an example of defending pedophilia.
I feel like what he’s trying to say is that it shouldn’t be the end of the world if a kid sees a sex scene in a movie; like, it should be OK for them to know it exists. But the way he phrases it is questionable at best.
When I was a kid I was forced to leave the room when any intimate scenes were in a movie and I honestly do feel like it fucked with my perception of sex a bit. Like it’s this taboo thing that should be hidden away and never discussed.
You’re just seeing “survivor’s bias” (as nasty as that sounds in this case) not a general representation.
No, there really are a lot of pedophiles on Lemmy
There are a lot of pedophiles everywhere.
Unfortunately, the really rich ones get away with it for a long time, and few get away with it forever.
epstein files is never going to be fully released, too many rich and powerful people
idk, I’m at like 7 months and the only time I was able to ctrl-F “pedo” in your history was when you were talking about Trump (which, fair) and then again about 4chan pedophiles (which, again).
From a month ago: https://lemmy.world/post/26165823/15375845
as said before, that person was not advocating for anything. he made a qualified statement, which you answered to with examples of kids in cults and flipped out calling him all kinds of nasty things.
yeah that’s on me I only searched “pedo”
I’ve been on Lemmy for years (even when .ml was the only instance) and hadn’t seen anything of the sort, though I don’t go digging for it either. I doubt that Lemmy is any worse than Facebook or Telegram when it comes to this.
Is it not encouraging that it is ostracised and removed from normal spaces? There are horrible parts of everything in nature; life is good despite those people and because of the rest combatting their shittiness.
On average, around 3.5 new videos were uploaded to the platform every hour, many of which were previously unknown to law enforcement.
Absolutely sick and vile. I hope they honeypotted the site and that the arrests keep coming.
I just got ill
1.8m users, how the hell did they run that website for 3 years?
Bribes, most likely. Until the wrong people (the Right People) became aware of it.
It says “this hidden site”, meaning it was a site on the dark web. It probably took them a while to figure out where the site was located so they could shut it down.
it says “this hidden site”, meaning it was a site on the dark web.
Not just on the dark web (which technically is anything not indexed by search engines): hidden sites are specifically a Tor thing (though Freenet/Hyphanet has something similar, it’s called something else). Usually a Tor hidden site has a URL that ends in .onion, and the Tor protocol has a structure for routing .onion addresses.
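For the curious: current (v3) .onion hostnames are 56 base32 characters encoding a 32-byte ed25519 public key, a 2-byte checksum, and a 1-byte version. A quick shape check (my own sketch, checking the format only, not validating the checksum) could look like:

```python
import re

# v3 onion addresses: 56 chars from the base32 alphabet (a-z, 2-7),
# then ".onion". This matches the shape only; a real validator would
# also decode the base32 and verify the embedded checksum and version.
V3_ONION = re.compile(r"^[a-z2-7]{56}\.onion$")

def looks_like_onion(host: str) -> bool:
    return bool(V3_ONION.match(host.lower()))
```

Addresses that fit this pattern never resolve through normal DNS; Tor routes them internally via its hidden-service protocol.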
That’s unfortunately (not really sure) probably the fault of Germany’s approach to this. It usually doesn’t take these websites down but tries to find the guys behind them and arrest them. The argument is: they will just use a backup and start a “KidFlix 2” or something like that. Some investigations show that this is not the case and that deleting is very effective. Also, the German approach completely ignores the victims’ side. They have to deal with old men masturbating to them getting raped online. Very disturbing…
And yet there are cases like Kim Dotcom, Snowden, Manning, Assange…
I used to work in netsec and unfortunately government still sucks at hiring security experts everywhere.
That being said, hiring here is extremely hard - you need to find someone with below-market salary expectations willing to work on such an ugly subject. Very few people can do that. I do believe money fixes this, though. Just pay people more; I’m sure every European citizen wouldn’t mind a 0.1% tax increase for a more effective investigation force.
They probably make double/triple in the private sector; I doubt the government can match that salary. FB probably paid even more, before they started using AI to sniff out CP.
Discovery of this kind of thing is as old as civilization.
Someone runs their mouth, or you catch someone with incriminating evidence on them. Then you lean on them to tell you where to go.
This feels like one of those things where couch critics aren’t qualified. There’s a pretty strong history of three letter agencies using this strategy successfully in other organized crime industries.
Like I stated earlier, someone was caught red-handed, and snitched to get a lesser sentence.
I think you are mixing up two different aspects of this and of similar past cases. In the past there was often a problem with takedowns of such sites, because German prosecutors did not regard themselves as being in charge of takedowns if the servers were somewhere overseas. Their main focus was to get the admins and users of those sites and put them in jail.
In this specific case they were observing this platform (together with prosecutors from other countries in an orchestrated operation) to gather as much data as possible about the structure, the payment flows, the admins and the users of this before moving into action and getting them arrested. The site was taken down meanwhile.
If you blow up and delete such a darknet service immediately upon discovery, you may get rid of it (temporarily), but you might not catch many of the people behind it.
Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material and in exchange it leads to fewer future victims it might be worth the trade-off but it is certainly not an easy choice to make.
Well, some pedophiles have argued that AI-generated child porn should be allowed, so that no real humans are harmed and exploited.
I’m conflicted on that. Naturally, I’m disgusted, and repulsed. I AM NOT ADVOCATING IT.
But if no real child is harmed…
I don’t want to think about it, anymore.
Understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight up sadistic fucks and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.
I feel the same way. I’ve seen the argument that it’s analogous to violence in videogames, but it’s pretty disingenuous since people typically play videogames to have fun and for escapism, whereas with CSAM the person seeking it out is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.
Somehow I doubt allowing it actually meaningfully helps the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.
Issue is, AI is often trained on real children, sometimes even real CSAM (allegedly), which makes the “no real children were harmed” part not necessarily 100% true.
Also since AI can generate photorealistic imagery, it also muddies the water for the real thing.
That is still CP, and distributing CP still harms children. Eventually they want to move on to the real thing, as porn stops satisfying them.
It doesn’t though.
The most effective way to shut these forums down is to register bot accounts scraping links to the clearnet direct-download sites hosting the material and then reporting every single one.
If everything posted to these forums is deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted than letting it stay up for years to catch a handful of site admins.
Frankly, I couldn’t care less about punishing the people hosting these sites. It’s an endless game of cat and mouse and will never be fast enough to meaningfully slow down the spread of CSAM.
Also, these sites don’t produce CSAM themselves. They just spread it - most of the CSAM exists already and isn’t made specifically for distribution.
Who said anything about punishing the people hosting the sites. I was talking about punishing the people uploading and producing the content. The ones doing the part that is orders of magnitude worse than anything else about this.
I’d be surprised if many “producers” are caught. From what I have heard, most uploads on those sites are reuploads because it’s magnitudes easier.
Of the 1,400 people caught, I’d say maybe 10 were site administrators and the rest passive “consumers” who didn’t use Tor. I wouldn’t put my hopes up too much that anyone who was caught ever committed child abuse themselves.
I mean, 1400 identified out of 1.8 million really isn’t a whole lot to begin with.
If most are reuploads anyway that kills the whole argument that deleting things works though.