

Admin on the slrpnk.net Lemmy instance.
He/Him or whatever you feel like.
XMPP: povoq@slrpnk.net
Avatar is an image of a baby octopus.


If 99.9% of PCs were solely made to steal your credit card info, then yes.


I have high hopes for GNU Taler in that regard, as it is in theory super easy to include in any website and makes tipping small sums very feasible.
But in reality it is bogged down by bureaucratic hurdles on the banking side, and I am starting to lose a bit of hope due to perpetual delays, even after some banks promised to support it as part of an EU grant via NLnet.
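To illustrate how small the integration work would be: here is a rough sketch of creating a tip order against a Taler merchant backend. The URL and token below are made up, and the exact order fields may differ from the current merchant API, so treat this as the general shape rather than copy-paste code:

```typescript
// Sketch only: BACKEND, TOKEN and the order fields are assumptions,
// check the Taler merchant backend docs for the real API details.
const BACKEND = "https://backend.example.com"; // hypothetical merchant backend
const TOKEN = "secret-token:example";          // hypothetical access token

async function createTipOrder(): Promise<string> {
  const res = await fetch(`${BACKEND}/private/orders`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${TOKEN}`,
      "Content-Type": "application/json",
    },
    // An order is little more than an amount and a human-readable summary,
    // which is what makes tiny tips feasible in the first place.
    body: JSON.stringify({
      order: { amount: "EUR:0.50", summary: "Tip" },
    }),
  });
  if (!res.ok) throw new Error(`backend returned ${res.status}`);
  const { order_id } = await res.json();
  // Sending the visitor to this page is what triggers their wallet to pay.
  return `${BACKEND}/orders/${order_id}`;
}
```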


Open minded to being scammed? No thanks.


Lol, wat? I have not seen Anubis even once in front of a static page. You are either making shit up or don’t understand what a static site is 🤦


Well… you found your problem then. It is neither my problem, nor a problem of apartments in general 🤷


Release where? And why?
Also: looks too well fed for a stray. Are you sure it doesn’t belong to a neighbour?
DSub2000 is also fairly nice for Android.


Many apartments are owned by the inhabitants or are cooperatively managed.


That is a silly assumption, like why would you assume the worst possible setup? And it would be much easier to talk to the person managing the apartment internet than to deal with some AI chatbot that pretends to be support at some shitty ISP.


Obviously I don’t think you need Anubis for a static site. And if that is what your admin experience is limited to, then you have a strong case of Dunning-Kruger.


No one is disputing that in theory (!) Anubis offers very little protection against an adversary that specifically tries to circumvent it, but we are dealing with an elephant-in-the-porcelain-shop kind of situation. The AI companies simply don’t care if they kill off small independently hosted web-applications with their scraping, and Anubis is the mouse that is currently sufficient to make them back off.
And no, forced site reloads are extremely disruptive for web-applications and often cause a lot of extra load through re-authentication etc. It is not as easy as you make it sound.


You clearly don’t know what you are talking about.


Yeah, German universities have special direct internet access via the “Hochschulnetz” (university network). We had some pretty fancy 5 GHz directional WiFi links spanning several km to connect to it, but it was fairly slow (shared 10 Mbit/s), which made it impractical for most private internet use.


It would already help if apartment buildings had an internal network with a single connection point, but I can tell you, as someone who worked on this as a volunteer for student dormitories back in the day, that ISPs are extremely hostile to the idea.


If you check for a GPU (not generally a bad idea) you will have the same people that currently complain about JS complain about this breaking their anti-fingerprinting browser addons.
But no, you obviously can’t spoof PoW, that’s the entire point of it. Whether you do the calculation in JavaScript or not doesn’t really matter for it to work.
In its current shape Anubis has zero impact on usability for 99% of the site visitors, not so with meta refresh.


And how do you actually check for working JS in a way that can’t be easily spoofed? Hint: PoW is a good way to do that.
Meta refresh is a downgrade in usability for everyone but a tiny minority that has disabled JS.
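To make that concrete, here is a minimal hash-based PoW sketch. This is not Anubis’ actual code, and the names and difficulty scheme are mine, but the principle is the same: the client has to burn real CPU to find a valid nonce, so there is nothing to spoof:

```typescript
// Minimal PoW sketch: the server issues a random challenge and a
// difficulty; the client must find a nonce such that
// SHA-256(challenge + nonce) starts with `difficulty` zero hex digits.
import { createHash } from "node:crypto";

function sha256Hex(input: string): string {
  return createHash("sha256").update(input).digest("hex");
}

// Client side: brute force. There is no shortcut, which is exactly why
// "working JS" proven this way can't be faked, the work must be done.
function solve(challenge: string, difficulty: number): number {
  const prefix = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    if (sha256Hex(challenge + nonce).startsWith(prefix)) return nonce;
  }
}

// Server side: verification costs a single hash.
function verify(challenge: string, nonce: number, difficulty: number): boolean {
  return sha256Hex(challenge + nonce).startsWith("0".repeat(difficulty));
}

const challenge = "random-server-issued-string";
const nonce = solve(challenge, 4); // 16^4 ≈ 65k hashes on average; raise for real use
console.log(verify(challenge, nonce, 4)); // true
```

Note the asymmetry: the client does thousands of hashes, the server does one to check. That is why this stays cheap for the operator while still costing every scraper real compute.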


You are arguing a strawman. Anubis works because most AI scrapers (currently) don’t want to spend extra on running headless Chromium, and because it slightly incentivises AI scrapers to correctly identify themselves as such.
Most of the AI scraping is frankly just shoddy code written by careless people that don’t want to DDoS the independent web, but can’t be bothered to actually fix that on their side.


AI scraping is a massive issue for specific types of websites, such as git forges, wikis and to a lesser extent Lemmy etc., that rely on complex database operations that cannot be easily cached. Unless you massively overprovision your infrastructure, these web-applications grind to a halt because the scrapers constantly max out the available CPU power.
The vast majority of the critical commenters here seem to talk from a point of total ignorance about this, or assume operators of such web-applications have time to hypervigilantly monitor and manually block AI scrapers (which do their best to circumvent more basic blocks). The realistic options for such operators right now are: Anubis (or similar), Cloudflare, or shutting down their servers. Of these, Anubis is clearly the least bad option.


It kinda sucks how AI scrapers make websites inaccessible to everyone 🙄


https://movim.eu/ can do that AFAIK, but for now the A/V calls don’t go through an SFU distribution server (coming soonish), so it will not scale to many participants. But if you only want to stream to a few people (max. 5 or so, depending a bit on your and their internet speeds) it should work.