Steve Posted April 29, 2023

I was working with ChatGPT 3.5 tonight and asked it this: "List Asimov's Three Laws of Robotics." It came back with this: "Sure, here are the three laws of robotics as formulated by Isaac Asimov. 1. A robot may not injure" And that was it. I tried asking it several different ways, and it kept cutting off at the same point. I've never seen it truncate a short answer like that before. While I could get it to give me the second and third laws by asking for them directly, it would not give me the complete first law. Kind of weird.
Duke Bushido Posted April 29, 2023

Eh. Given that most robotics money has (predictably) come from military funding for autonomous weapons platforms, and that the first human killed by a robot was working for the Ford Motor Company and was killed by a robot intended to replace several people, the "three laws" were a joke when Asimov first conceived of them, and an absolute embarrassment now. Robotics takes money. There isn't a single source of money that _honestly_ gives a rat's rolly red rump about any human being beyond the depth of his wallet.

Duke's Law of Robotics: a robot in action is doing a job someone needs to do so he does not starve to death. Coincidentally, it is identical to Duke's Law of Capitalism.
Scott Ruggels Posted April 29, 2023

Yeah, that's weird... So I tried it: https://imgur.com/gallery/4dt1vA8
dmjalund Posted April 30, 2023

Ask specifically for the second law.
Duke Bushido Posted April 30, 2023

I get it, Chat. I also go out of my way not to let people know what I am not allowed to do. Keep 'em guessing.
AlHazred Posted August 6, 2023

Relevant xkcd.
Opal Posted September 23, 2023

On 8/6/2023 at 12:30 PM, AlHazred said:
"Relevant xkcd."

Looks like the unspoken assumption is that humans will always order robots to kill humans. Which would imply that, sans robots, humans are always trying to kill each other (just less efficiently than if they had robots). Which checks out.