• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • I’m starting to wonder if it’s LLMs. An AGI is something we would be incredibly cautious around, and it’s really no more likely to be psychopathic than any other living thing, the vast majority of which are not. LLMs, on the other hand, are pushed into every role techbros can shove them into while having less understanding of what they do than a housefly; the potential for damage is immense if someone decides to put one in charge of something important like infrastructure or weaponry.




  • Sometimes I wonder how America would do if it was ever invaded. They have a huge army and incredible power projection, but since WW2 the US has had zero experience in any war where they had to face the consequences for anything that happens. How well does the doctrine work when shock and awe is blowing up your own infrastructure and you don’t have a pristine manufacturing base to fall back on?







  • CheeseNoodle@lemmy.world to Science Memes@mander.xyz · gottem · 2 months ago

    The placebo effect also works even if you know it’s a placebo… Though sometimes I wonder if that information is itself some kind of benevolent false information campaign, creating a second layer of placebo effect around placebos: a placebo effect of the placebo effect working even if you know about it, as it were.







  • Nah, the dark forest doesn’t really work: if turning on a light (so to speak) makes you a target, then a muzzle flash is even worse. It takes a lot of energy to kill a planet however you do it, and that’s going to tell everyone where the shooter is.
    And no, you can’t use an asteroid, because all the matter in the universe couldn’t make a computer powerful enough to make it hit over a reasonable distance, and getting to our solar system to use one of the ones here is just as energetic as firing a projectile.