• 0 Posts
  • 710 Comments
Joined 9 months ago
Cake day: February 18th, 2024

  • They’re struggling because they’re not learning, and not learning how to learn.

    LLM outputs aren’t reliable. Relying on one for your research is the exact opposite of the process required to make good decisions.

    The prerequisite to making a good decision is learning the information relevant to the decision; you then use that information to determine your options and the likely outcomes of each path. Internalizing the problem space is fundamental to the process: you need to actually understand the space you’re deciding about in order to decide well. The effort is the point.

  • It sets an absolutely obscene precedent: that a government can globally restrict information. Even terrible global actors like Russia and China haven’t succeeded at that.

    Yes, that precedent is 1000 orders of magnitude more harmful than India losing access. Which it won’t, because the entirety of Wikipedia is open source and would be mirrored in the country instantly; but even if it did, the harm would come nowhere near the harm of the precedent this sets.



  • Not only can they, trivially. They unconditionally must.

    If, forced to choose between censoring content globally for an authoritarian government and shutting down in that country, they genuinely consider censoring content globally, they can never be a reputable organization again. Open, fact-based information is their entire reason for existing.