• Maroon@lemmy.world · 1 day ago

    Imagine however, that a machine

    That’s hypothetical. In the real world, in human society, the humans who are part of corporations and profit from making/selling these computers must also bear the responsibility.

    • calcopiritus@lemmy.world · 1 day ago

      Tbf that leads to the problem of:

      A company or individual makes a program that is in no way meant for making management decisions.

      Someone else comes and deploys that program to make management decisions.

      The ones that made that program couldn’t stop the ones that deployed it from deploying it.

      Even if the maker set out to build a decision-making program and marketed it as such, whoever deployed it is ultimately responsible for it. As long as the maker doesn’t fake tests or certifications, of course; I’m sure that would violate many laws.

      • ZombiFrancis@sh.itjust.works · 1 day ago

        The premise is that a computer must never make a management decision. Making a program capable of management decisions is already a failure of that premise. The deployment and use of that program to that end is built upon that failure.

    • onnekas@sopuli.xyz · 1 day ago

      I believe those who deploy the machines should be responsible in the first place. The corporations that make/sell those machines should be held accountable if they deceptively and intentionally program those machines to act maliciously or in somebody else’s interest.