• 0 Posts
  • 13 Comments
Joined 1 year ago
Cake day: July 3rd, 2023


  • Okay that’s fine, but when websites are effectively writing

    if (user_agent_string != chromium)
        refuse_service();
    

    It doesn’t really matter how good compatibility is. I’ve had websites go from nothing but a “Firefox is not supported, please use Chrome” splash screen to working just fine with Firefox by simply spoofing the user agent to Chrome. Maybe some feature was broken, but I was able to do what I needed. More often than not they just aren’t testing it and don’t want to support other browsers.
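    The kind of gate being described here is trivial string matching, which is exactly why spoofing defeats it. A minimal sketch (the function and UA strings are illustrative, not any real site's code):

    ```python
    # Naive user-agent sniffing of the sort described above: anything whose
    # UA string doesn't contain "Chrome" gets the splash screen.
    CHROME_TOKEN = "Chrome"

    def gate(user_agent: str) -> str:
        if CHROME_TOKEN not in user_agent:
            return "Firefox is not supported, please use Chrome"
        return "welcome"

    firefox_ua = (
        "Mozilla/5.0 (X11; Linux x86_64; rv:120.0) "
        "Gecko/20100101 Firefox/120.0"
    )
    # A spoofed UA: the same Firefox build reporting a Chrome-style string.
    spoofed_ua = (
        "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
        "(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
    )

    print(gate(firefox_ua))  # blocked by the splash screen
    print(gate(spoofed_ua))  # waved through, same browser underneath
    ```

    Nothing about the browser's actual capabilities changed between the two calls, which is the point: the check tests the label, not the engine.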

    The more insidious side of this is that websites will require and attempt to enforce Chrome as adblocking gets increasingly impossible on them, because it aligns with their interests. It’s so important for the future of the web that we resist this change, but I think it’s too late.

    The world wide web is quickly turning into the dark alley of the internet that nobody is willing to walk down.



  • Yeah this is a hard one to navigate and it’s the only thing I’ve ever found that challenges my philosophy on the freedom of information.

    The archive itself isn’t causing the abuse, but CSAM is a record of abuse and we restrict the distribution not because distribution or possession of it is inherently abusive, but because the creation of it was, and we don’t want to support an incentive structure for the creation of more abuse.

    i.e. we don’t want more pedos abusing more kids with the intention of archival/distribution. So the archive itself isn’t the abuse, but the incentive to archive could be.

    There are also a lot of questions that come up about the ethics of CSAM in general that I don't think we're ready to confront. It's a hard topic all around, and nobody wants to seriously address it beyond virtue signalling about how bad it is.

    I could potentially see a scenario where the archival could be beneficial to society similar to the FBI hash libraries Apple uses to scan iCloud for CSAM. If we throw genAI at this stuff to learn about it, we may be able to identify locations, abusers and victims to track them down and save people. But it would necessitate the existence of the data to train on.
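    The hash libraries mentioned above boil down to lookup against a set of fingerprints of known material. A minimal sketch with made-up data; real deployments use perceptual hashes (robust to re-encoding and cropping) rather than plain SHA-256, and NeuralHash/PhotoDNA-style systems are far more involved than this:

    ```python
    import hashlib

    # Hypothetical hash list of known flagged content (illustrative bytes only).
    known_hashes = {
        hashlib.sha256(b"example-flagged-content").hexdigest(),
    }

    def is_flagged(data: bytes) -> bool:
        # Exact-match lookup: fingerprint the file, check the list.
        return hashlib.sha256(data).hexdigest() in known_hashes

    print(is_flagged(b"example-flagged-content"))  # True
    print(is_flagged(b"benign-photo"))             # False
    ```

    The notable property is that the scanner only needs the hash list, not the underlying material, which is what makes distributing the detection capability tractable without distributing the data itself.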

    I could also see potential for using CSAM itself for psychotherapy. Imagine a sci-fi future where pedos are effectively cured by using AI trained on CSAM to expose them to increasingly mature imagery, allowing their attraction to mature with it. We won't really know if something like that is possible if we delete everything. It seems awfully short-sighted to me to delete data, no matter how perverse, because it could have legitimate positive applications that we haven't conceived of yet. So to that end, I do hope some three-letter agencies maintain their restricted archives of data for future applications that could benefit humanity.

    All said, I absolutely agree that the potential of creating incentives for abusers to abuse is a major issue with immutable archival, and it's definitely something we need to figure out before such an archive actually exists. So thank you for the thought experiment.




  • Ok, but my gen 1 Chromecast stopped working years ago, and it was sluggish, unreliable to cast to and virtually unusable long before that. My Chromecast Ultra has steadily declined in usability too, the whole Chromecast UX has fallen off a cliff the past few years. It used to be fucking magic and just worked with everything. Now my Chromecast with Google TV only really works satisfactorily when I bother to update smarttube and pair it with the code.

    The whole platform is trash these days compared to the original gen 1 experience. I went from living in the future, only using my phone to watch TV 10 years ago, to slowly migrating back to a fucking remote control like it’s 1980 again. Say what you want, but the promise that gen 1 Chromecast delivered on has been dead and buried for a while now.