• 0 Posts
  • 7 Comments
Joined 5 months ago
Cake day: March 31st, 2025

  • mfed1122@discuss.tchncs.de to Science Memes@mander.xyz · “do what you love”
    30 upvotes / 1 downvote · edited 19 hours ago

    This is PURE speculation, but I feel like this could be caused by the fact that the only people who feel comfortable getting a philosophy degree are wealthy, connected people. I know a lot of people from my high school who have stereotypical “be poor forever” degrees and are doing great - but if you knew them in high school, you’d know that they had millionaire parents. All the poor kids went for safer degrees because they knew they’d need money.

    To be clear: I love philosophy and think it is very valuable. But sadly it seems like something that only privileged people or the very passionate take a risk on.


  • Yeah, you’re absolutely right and I agree. So then do we have to resign ourselves to the situation being an eternal back-and-forth of just developing random new challenges every time the scrapers adapt to them? Like antibiotics and resistant bacteria? Maybe that is the way it is. And honestly that’s what I suspect. But Anubis feels so clever and so close to something that would work. The concept of making it about a cost that adds up, so that it intrinsically only affects massive processes significantly, is really smart…since it’s not about coming up with a challenge a computer can’t complete, but just a challenge that makes it economically not worth it to complete. But it’s disappointing to see that, at least with the current wait times, it doesn’t seem like it will cost enough to dissuade scrapers. And worse, the cost is so low that it seems like making it significant to the scrapers would require really insufferable wait times for users.
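
    To make the “cost that adds up” idea concrete, here’s a purely illustrative back-of-the-envelope sketch. Every number in it (solve time, request volume, CPU price) is a made-up assumption, not taken from the article:

    ```typescript
    // Purely illustrative; all numbers are made-up assumptions, not measurements.
    const solveSeconds = 2;            // assumed proof-of-work time per challenge
    const requestsPerDay = 10_000_000; // assumed scraper request volume
    const dollarsPerCpuHour = 0.05;    // assumed cloud CPU price

    const cpuHoursPerDay = (solveSeconds * requestsPerDay) / 3600; // ≈ 5,556
    const dollarsPerDay = cpuHoursPerDay * dollarsPerCpuHour;      // ≈ $278

    console.log(`${cpuHoursPerDay.toFixed(0)} CPU-hours/day, ~$${dollarsPerDay.toFixed(2)}/day`);
    // Pocket change for a large scraping operation, but those same 2 seconds are
    // a noticeable wait for every individual human visitor.
    ```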


  • By negligence, I meant that the cost is negligible to the companies running scrapers, not that the solution itself is negligent. I should have said “negligibility” of Anubis, sorry - that was poor clarity on my part.

    But I do think that the cost of it is indeed negligible, as the article shows. It doesn’t really matter whether the author is biased or not; their analysis of the costs seems reasonable. I would need a counter-argument against that analysis to think they were wrong. The fact that they’re biased isn’t enough to discount the quantification they attempted to bring to the debate.

    Also, I don’t think there’s any hypocrisy in me saying I’ve only thought about other solutions here and there - I’m not maintaining an anti-scraping library. And there have already been indications that scrapers are just accepting the cost of Anubis on Codeberg, right? So I’m not trying to say I’m some sort of tech genius who has the right idea here, but from what Codeberg was saying, and from the numbers in this article, it sure looks like Anubis isn’t the right idea. I am indeed only having fun with my suggestions, not making whole libraries out of them and pronouncing them to be solutions. I personally haven’t seen evidence that Anubis is so clearly working? As the author points out, it seems like it’s only working right now because of how new it is, but if scrapers want to go through it, they easily can - which puts us in a sort of antibiotic/resistant-bacteria eternal war of attrition. And of course that is the case with many things in computing as well. So I guess my open wondering is just about whether there’s ever any way to develop a countermeasure that the scrapers won’t find “worth it” to force through?

    Edit for tone clarity: I don’t want to be antagonistic, rude, or hurtful in any way. Just trying to have a discussion and understand this situation. Perhaps I was arrogant; if so, I apologize. It was also not my intent, fwiw. Also, thanks for helping me understand why I was getting downvoted. I intended my post to just be constructive spitballing about what I see as the eventual, inevitable weakness in Anubis. I think it’s a great project and it’s great that people are getting use out of it even temporarily, and of course the devs deserve lots of respect for making the thing. But as much as I wish I could like it and believe it will solve the problem, I still don’t think it will.


  • Yeah, well-written stuff. I think Anubis will come and go. This beautifully demonstrates and, best of all, quantifies the ~~negligence~~ negligible cost to scrapers of Anubis.

    It’s very interesting to try to think of what would work, even conceptually. Some sort of purely client-side captcha type of thing perhaps. I keep thinking about it in half-assed ways for minutes at a time.

    Maybe something that scrambles the characters of the site according to some random “offset” of some sort, e.g. maybe randomly selecting a modulus size and an offset to cycle them, or even just a good ol’ cipher (rough sketch of the idea below). And the “captcha” consists of a slider that adjusts the offset. You as the viewer know it’s solved when the text becomes something sensible - so there’s no need for the client code to store a readable key that could be used to auto-undo the scrambling. You could maybe even have some values of the slider randomly chosen to produce English text, in case the scrapers got smart enough to check for legibility (not sure how to hide which slider positions would be these red herring ones, though) - which could maybe be enough to trick the scraper into picking up junk text sometimes.
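
    Here’s a rough sketch of that offset idea, purely as an illustration of the concept - not from any existing library, and the names (rotate, onSliderChange, the alphabet choice) are hypothetical:

    ```typescript
    // Hypothetical sketch of the slider-offset scramble; names and alphabet are illustrative.
    const ALPHABET = "abcdefghijklmnopqrstuvwxyz";

    // Cyclically shift every letter by `offset`; leave everything else untouched.
    function rotate(text: string, offset: number): string {
      return [...text]
        .map((ch) => {
          const i = ALPHABET.indexOf(ch.toLowerCase());
          if (i === -1) return ch; // spaces, punctuation, digits pass through
          const shifted = ALPHABET[(i + offset + ALPHABET.length) % ALPHABET.length];
          return ch === ch.toLowerCase() ? shifted : shifted.toUpperCase();
        })
        .join("");
    }

    // Server side: pick a secret offset and ship only the scrambled text.
    const secretOffset = 1 + Math.floor(Math.random() * (ALPHABET.length - 1));
    const scrambled = rotate("The quick brown fox jumps over the lazy dog", secretOffset);

    // Client side: the slider just tries offsets; a human stops when it reads as real text.
    // No readable key is shipped, so a scraper has to judge legibility for itself.
    function onSliderChange(sliderValue: number): string {
      return rotate(scrambled, -sliderValue); // readable when sliderValue === secretOffset
    }
    ```

    The point of the design is that the page never contains the plaintext or the offset, only the scrambled text - the “key” exists only in the human’s judgment of when the output looks like language.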


  • “I Transformed Into An Invisible Tiger And Began Killing Billionaires” - A cat-and-mouse Death Note style detective anime about a college freshman communist tech nerd who miraculously gains the ability to transform into an invisible tiger and uses his leet Linux hacking skills to track down billionaire targets. At first the supernatural crimes are easy to carry out, then billionaires start to develop bunkers and defense systems against the mysterious threat. But when the killer discovers that a prodigy medical genius has acquired the same power, and intends to unveil their research on it to earn a Nobel Prize - that’s when things really start to get complicated.

    DM me for my Venmo info for any royalties thank you


  • This is a great example in support of something I often think about. We see our consciousness as “me” and as “the thing in charge” of the body, but really it’s more of an ancillary subprocess that the body runs for its own benefit. It’s just a special subprocess that does its job best when it mistakenly thinks of itself as being the boss of the body.