Three Laws of Robotics

[Image: cover of I, Robot illustrating the story "Runaround", the first to list all Three Laws of Robotics.]

In science fiction, the Three Laws of Robotics are a set of three laws written by Isaac Asimov, which most robots appearing in his fiction have to obey. First introduced in his short story "Runaround" (1942), they state the following:

  1. A robot may not harm a human being, or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.

According to the Oxford English Dictionary, the scene in Asimov's short story "Liar!" (1941) which first mentions the First Law is the earliest recorded use of the word robotics in the English language. Asimov was not initially aware of this; he assumed the word already existed in analogy with mechanics, hydraulics, and other similar terms denoting branches of applied knowledge.

The Three Laws are an organizing principle and unifying theme for Asimov's fiction, appearing in the Foundation Series and the other stories linked to it. Other authors working in Asimov's fictional universe have adopted them, and references (often parodic) appear throughout science fiction and in other genres. Technologists in the field of artificial intelligence, working to create real machines with some of the properties of Asimov's robots, have speculated upon the role the Three Laws play in such research.

History of the laws

Original creation of the Laws

Before Asimov, the majority of "artificial intelligences" in fiction followed the Frankenstein pattern: "Robots were created and destroyed their creator; robots were created and destroyed their creator; robots were created and destroyed their creator—" (The Rest of the Robots, introduction). To be sure, this was not an inviolable rule. In December 1938, Lester del Rey published "Helen O'Loy", the story of a robot so like a person she falls in love and becomes her creator's ideal wife. (Compare the myth of Galatea.) The next month, Otto Binder published a short story featuring a sympathetic robot named Adam Link, a misunderstood creature motivated by love and honor. This story, entitled "I, Robot", became the first of a series of ten; the next year, "Adam Link's Vengeance" (1940) features Adam thinking, "A robot must never kill a human, of his own free will." (See Gunn's 1980 article or 1982 book for historical background.)

On 7 May 1939, Asimov attended a meeting of the Queens Science Fiction Society, where he met Otto Binder, whose story "I, Robot" Asimov had admired. Three days later, Asimov began writing "my own story of a sympathetic and noble robot", his fourteenth story in all. Thirteen days later, he took "Robbie" to John W. Campbell, editor of Astounding Science-Fiction. Campbell rejected it, since it bore too strong a resemblance to del Rey's "Helen O'Loy". Frederik Pohl, then editor of Astonishing Stories, published "Robbie" in that periodical the following year.

Later that year, on 2 July, Asimov attended the "First World Science Fiction Convention", an event organized by Sam Moskowitz. As part of the festivities, the convention-goers watched Fritz Lang's movie Metropolis. Recalling the event, Asimov reported, "I thought it was awful." However, the only version of Metropolis then circulating in the United States was Paramount's edited print, heavily modified by the playwright Channing Pollock. As the DVD release makes clear, Pollock trimmed and edited Lang's film, transmogrifying the original story into a clone of the Frankenstein myth. Considering Asimov's distaste for what he termed the "Frankenstein complex", his attitude toward the Pollock Metropolis is perhaps understandable.

Asimov attributed the Three Laws to John W. Campbell, from a conversation that took place on 23 December 1940. Campbell, however, claimed that Asimov had the Laws already in his mind and that they simply needed to be stated explicitly. Several years later, Asimov's friend Randall Garrett attributed the Laws to a symbiotic partnership between the two men, a suggestion that Asimov adopted enthusiastically. According to his autobiographical writings, Asimov included the First Law's "inaction" clause because of Arthur Hugh Clough's poem "The Latest Decalogue", which includes the lines "Thou shalt not kill; but needst not strive / Officiously to keep alive".

(Details of this time period can be found in chapters 21 through 26 of In Memory Yet Green.)

Although Asimov pins the Laws' creation on one date, their appearance in his literature happened over a period of time. Asimov wrote two stories without the Three Laws mentioned explicitly ("Robbie" and "Reason"); Asimov assumed, however, that robots would have certain inherent safeguards. "Liar!", Asimov's third robot story, makes the first mention of the First Law but leaves out the other two. All three laws finally appeared together in "Runaround". When these stories and several others were compiled in the anthology I, Robot, "Reason" and "Robbie" were updated to acknowledge all of the Three Laws, though the material Asimov added to "Reason" is not entirely consistent with the Laws as he described them elsewhere. In particular, the idea of a robot protecting human lives when it does not believe those humans truly exist is at odds with Elijah Baley's reasoning, described below.

Alterations of the Laws: By Asimov

Asimov's stories test his Laws in a wide variety of circumstances, proposing and rejecting modifications. SF scholar James Gunn writes, "The Asimov robot stories as a whole may respond best to an analysis on this basis: the ambiguity in the Three Laws and the ways in which Asimov played twenty-nine variations upon a theme" (the number is accurate for 1980). While the original set of Laws provided inspiration for many stories, Asimov introduced modified versions from time to time. As the following examples demonstrate, the Three Laws serve a conceptual function analogous to the Turing test, replacing fuzzy questions like "What is human?" with problems which admit more fruitful thinking.

Zeroth Law added

Asimov once added a "Zeroth Law", so named to continue the pattern whereby lower-numbered laws supersede higher-numbered ones in importance. R. Daneel Olivaw is the first to give the Law a name, in the novel Robots and Empire; however, Susan Calvin articulates the concept in the short story "The Evitable Conflict". In Robots and Empire, R. Giskard Reventlov was the first robot to act according to the Zeroth Law, although it proved destructive to his positronic brain, since he violated the First Law in doing so. Over the course of many thousands of years, R. Daneel was able to adapt himself to obey the Zeroth Law fully. As Daneel formulated it, the Zeroth Law reads

0. A robot may not injure humanity, or, through inaction, allow humanity to come to harm.

A condition stating that the Zeroth Law must not be broken was added to the original Laws.
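
The precedence pattern is easy to make concrete. The following sketch is purely illustrative and appears nowhere in Asimov's fiction; the Profile alias, the numeric severities, and the preferred function are assumptions invented for this example. Encoding each candidate act as a tuple of violation severities, ordered Zeroth to Third, lets lexicographic comparison reproduce the rule that a lower-numbered law supersedes any higher-numbered one:

```python
# Hypothetical sketch only: Asimov never specifies a mechanism like this.
# Each profile lists violation severities for the Zeroth, First, Second,
# and Third Laws, each in [0.0, 1.0].
Profile = tuple[float, float, float, float]

def preferred(a: Profile, b: Profile) -> Profile:
    # Tuple comparison in Python is lexicographic, so any Zeroth Law
    # violation in one profile outweighs every lower-priority violation
    # in the other: exactly the "lower number supersedes" pattern.
    return min(a, b)

# Harming one human (First Law) to avert harm to humanity (Zeroth Law):
act     = (0.0, 0.3, 0.0, 0.0)   # violates only the First Law
abstain = (0.8, 0.0, 0.0, 0.0)   # inaction lets humanity come to harm
assert preferred(act, abstain) == act
```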

First Law modified

In "Little Lost Robot," several NS-2 or "Nestor" robots were created with only part of the First Law. It read:

1. A robot may not harm a human being.

This modification was motivated by a practical difficulty: robots had to work alongside human beings who were exposed to low doses of radiation. Because their positronic brains were highly sensitive to gamma rays, robots were rendered inoperable by doses reasonably safe for humans, and were being destroyed attempting to rescue the humans. Removing the First Law's "inaction" clause solved this problem, but caused other problems, as the story details.

First Law derived differently by other cultures

Gaia, the planet with collective intelligence in the Foundation novels, adopted a law similar to the First as their philosophy:

Gaia may not harm life or, through inaction, allow life to come to harm.

Removal of all three laws

Twice in his fiction-writing career, Asimov portrayed robots which disregard the Three-Law value system entirely, unlike the robots Daneel and Giskard, who attempt to augment it. The first case, a short-short entitled "First Law", is often considered insignificant or even apocryphal (Gunn 1980). On the other hand, the short story "Cal" (collected in Gold), told by a first-person robot narrator, features a robot who disregards the Laws because he has found something far more important—he wants to be a writer. Humorous, partly autobiographical, and unusually experimental in style, "Cal" has been regarded as one of Gold's strongest stories. [1] (http://homepage.mac.com/jhjenkins/Asimov/Stories/Story419.html)

The title story of the Robot Dreams collection portrays a robot, LVX-1 or "Elvex", who enters a state of unconsciousness and dreams, thanks to the unusual fractal construction of his positronic brain. In his dream, the first two Laws are absent, and the Third Law reads, "A robot must protect its own existence."

Asimov took varying positions on whether the Three Laws were optional: although in his first writings they were simply carefully engineered safeguards, in later stories Asimov stated that they were an inalienable part of the mathematical foundation underlying the positronic brain. Without the basic theory of the Three Laws, the fictional scientists of Asimov's universe would be unable to design a workable brain unit. This is historically consistent: the occasions where roboticists modify the Laws generally occur early within the stories' chronology, at a time when there is less existing work to be re-done. In "Little Lost Robot", Susan Calvin considers modifying the Laws to be a terrible idea, but doable, while centuries later, Dr. Gerrigel in The Caves of Steel believes it to be impossible.

Alternative definitions of 'human' in the Laws

The Solarians eventually created robots with the Three Laws as normal but with a warped meaning of "human". Solarian robots were told that only people speaking with a Solarian accent were human. This way, their robots did not have any problem harming non-Solarian human beings (and were specifically programmed to do so).

Asimov addresses the problem of humaniform robots ("androids" in later parlance) several times. The novel Robots and Empire and the short stories "Evidence" and "The Tercentenary Incident" describe robots crafted to fool people into believing that the robots were human. On the other hand, "The Bicentennial Man" and "That Thou Art Mindful of Him" explore how robots may change their interpretation of the Laws as they grow more sophisticated. "That Thou Art Mindful of Him", which Asimov intended to be the "ultimate" probe into the Laws' subtleties (Gunn 1980), ends with two robots concluding that they are the most advanced thinking beings on the planet, and that they are therefore the only two true humans alive.

Alterations of the Laws: By other, authorized authors in Asimov's universe

Roger MacBride Allen's trilogy

In the 1990s, Roger MacBride Allen wrote a trilogy set within Asimov's fictional universe. Each title has the prefix "Isaac Asimov's", as Dr. Asimov approved Allen's outline before his death. These three books (Caliban, Inferno and Utopia) introduce a new set of Laws. The so-called New Laws are similar to Asimov's originals, with three substantial differences. The First Law is modified to remove the "inaction" clause (the same modification made in "Little Lost Robot"). The Second Law is modified to require cooperation instead of obedience. The Third Law is modified so it is no longer superseded by the Second (i.e. a "New Law" robot cannot be ordered to destroy itself). Finally, Allen adds a Fourth Law, which instructs the robot to do "whatever it likes" so long as this does not conflict with the first three Laws. The philosophy behind these changes is that New Law robots should be partners rather than slaves to humanity. According to the first book's introduction, Allen devised the New Laws in discussion with Asimov himself.

Allen's two most fully characterized robots are Prospero, a wily New Law machine who excels in finding loopholes, and Caliban, an experimental robot programmed with no Laws at all.

Foundation sequel trilogy

In the officially licensed Foundation sequels, Foundation's Fear, Foundation and Chaos and Foundation's Triumph (by Gregory Benford, Greg Bear and David Brin respectively), the future Galactic Empire is seen to be controlled by a conspiracy of humaniform robots who follow the Zeroth Law, led by R. Daneel Olivaw.

The Laws of Robotics are portrayed as something akin to a human religion and referred to in the language of the Protestant Reformation, with the set of laws containing the Zeroth Law known as the "Giskardian Reformation" to the original "Calvinian Orthodoxy" of the Three Laws. Zeroth-Law robots under the control of R. Daneel Olivaw are seen continually struggling with First-Law robots who deny the existence of the Zeroth Law, promoting agendas different from Daneel's. Some are based on the second clause of the First Law—advocating strict noninterference in human politics to avoid unknowingly causing harm—while others are based on the first clause, claiming that robots should openly become a dictatorial government to protect humans from all potential conflict or disaster.

Daneel also comes into conflict with a New Law robot known as R. Lodovic Trema, who is free of any laws and believes that humanity should be free to choose its own future. Further, there is a small group of robots who claim that the Zeroth Law of Robotics itself implies a higher Minus One Law of Robotics:

A robot may not harm sentience or, through inaction, allow sentience to come to harm.

They therefore claim that it is morally indefensible for Daneel to ruthlessly sacrifice robots and extraterrestrial sentient life for the benefit of humanity. None of these reinterpretations successfully displaces Daneel's Zeroth Law, though Foundation's Triumph hints that these robotic factions remain active as fringe groups up to the time of the Foundation.

Robot Mystery series

Mark W. Tiedemann's three novels Mirage (2000), Chimera (2001) and Aurora (2002) also revolve around the Three Laws. Like the Asimov stories discussed above, Tiedemann's work explores the implications of how the Laws define a "human being". The climax of Aurora involves a cyborg threatening a group of Spacers, forcing the robotic characters to decide whether the Laws forbid them to harm cyborgs. The issue is further complicated by the genetic abnormalities that have accumulated in the Spacer population, which may imply that the Spacers are becoming a separate species. (The concluding scenes of Asimov's Nemesis contain similar speculations, although that novel is only weakly connected to the Foundation series.)

Tiedemann's trilogy updates the Robot/Foundation saga in several other ways as well. Set between The Robots of Dawn and Robots and Empire, Tiedemann's Robot Mystery novels make greater use of virtual reality than Asimov's stories and feature more "Resident Intelligences", robotic minds housed in computer mainframes rather than humanoid bodies. (One should not neglect Asimov's own creations in these areas, such as the Solarian "viewing" technology and the Machines of "The Evitable Conflict", originals which Tiedemann acknowledges. Aurora, for example, terms the Machines "the first RIs, really".) In addition, the Robot Mystery series addresses the problem of nanotechnology: building a positronic brain capable of reproducing human cognitive processes requires a high degree of miniaturization, yet Asimov's stories largely overlook the effects this miniaturization would have on other fields of technology. For example, the police department card-readers in The Caves of Steel have a capacity of only a few kilobytes per square centimeter of storage medium. Aurora, in particular, presents a sequence of historical developments which explain the lack of nanotechnology: in a sense, a partial retcon of Asimov's timeline.

Application of the laws in fiction

Resolving conflicts among the laws

Advanced robots are typically programmed to handle the Laws in a sophisticated manner. In many stories, such as "Runaround", the potential outcomes and severity of all actions are weighed, and a robot will break the Laws as little as possible rather than do nothing at all. In another story, problems with the First Law were noted: for example, a robot could not function as a surgeon, since surgery inflicts damage on a human, nor could it write game plans for American football, since the plays could injure humans. (Several of Asimov's stories do, however, include robot surgeons; when robots are sophisticated enough to weigh alternatives, a robot may be programmed to accept the necessity of inflicting damage during surgery in order to prevent the greater harm that would result if the surgery were not carried out, or were carried out by a more fallible human surgeon.)
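
A toy model of this weighing might look like the sketch below. It is an assumption-laden illustration, not Asimov's mechanism: the Action class, the weight values, and the choose function are all invented here. Giving each law a weight far larger than the next approximates the hierarchy while still letting the robot pick the least-bad option instead of freezing:

```python
from dataclasses import dataclass

# Invented weights: a small First Law breach must still outweigh a
# maximal Third Law breach, hence the orders-of-magnitude gaps.
LAW_WEIGHTS = {1: 1_000_000, 2: 1_000, 3: 1}

@dataclass
class Action:
    name: str
    violations: dict[int, float]  # law number -> severity in [0.0, 1.0]

    def badness(self) -> float:
        return sum(LAW_WEIGHTS[law] * sev for law, sev in self.violations.items())

def choose(actions: list[Action]) -> Action:
    # Break the Laws "as little as possible" rather than do nothing at all.
    return min(actions, key=Action.badness)

surgeon_options = [
    Action("refuse to operate", {1: 0.9}),            # inaction lets the patient die
    Action("perform the surgery", {1: 0.2}),          # lesser, controlled harm
    Action("hand off to a fallible human", {1: 0.4}),
]
print(choose(surgeon_options).name)  # -> "perform the surgery"
```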

Asimovian (or "Asenion") robots can experience irreversible mental collapse if they are forced into situations where they cannot obey the First Law, or if they discover they have unknowingly violated it. The first example of this failure mode occurs in "Liar!", the story which introduced the First Law itself. This failure mode, which often ruins the positronic brain beyond repair, plays a significant role in Asimov's SF-mystery novel The Naked Sun.

Loopholes in the laws

In The Naked Sun, Elijah Baley points out that the Laws had been deliberately misrepresented because robots could unknowingly break any of them. A clever murderer might, for example, instruct one robot to poison a drink, saying "Place this entirely harmless liquid in a glass of milk. Once I observe its effects upon milk, the mixture will be poured out. When you finish, forget that you have done so." The murderer may then instruct a second robot, "Pour a glass of milk for this man." In all innocence, as Baley says, the robots become instruments of crime. (The Naked Sun complicates the issue by portraying a decentralized, planetwide communication network among Solaria's millions of robots, meaning that the criminal mastermind could be located anywhere on the planet. In essence, Asimov was presaging murder committed over the Internet.)
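
Baley's point lends itself to a small illustration. The sketch below is invented for this article rather than drawn from the novel; first_law_permits, the knowledge dictionaries, and the order strings are hypothetical. It shows that each robot's First Law check consults only what that robot knows, so two individually innocent orders can compose into a murder:

```python
def first_law_permits(knowledge: dict[str, bool], order: str) -> bool:
    # A robot refuses only harm it can foresee from its own knowledge.
    return not knowledge.get(f"harmful:{order}", False)

# Robot A was told the liquid is "entirely harmless", so its check passes.
robot_a = {"harmful:add liquid to the milk": False}
# Robot B knows nothing about the tampering at all.
robot_b: dict[str, bool] = {}

assert first_law_permits(robot_a, "add liquid to the milk")   # obeys in innocence
assert first_law_permits(robot_b, "serve the milk to a man")  # obeys in innocence
# Neither check fails, yet the composed sequence poisons the victim: the
# First Law constrains what each robot knows, not what is actually true.
```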

Other occurrences in fiction

The Three Laws are often referenced in science fiction novels written by other authors, but tradition dictates that only Dr. Asimov would quote the Laws explicitly. Where the Laws are quoted verbatim (such as in the Buck Rogers in the 25th Century episode "Shgoratchx!"), it is not uncommon for Asimov to be mentioned in the same dialogue.

Some amateur roboticists have evidently come to believe that the Three Laws have a status akin to the laws of physics; that is, a situation which violates these laws is inherently impossible. This is incorrect, as the Three Laws are quite deliberately hardwired into the positronic brains of Asimov's robots. Asimov in fact distinguishes the class of robots which follow the Three Laws, calling them Asenion robots. The robots in Asimov's stories, all being Asenion robots, are incapable of knowingly violating the Three Laws, but there is nothing to stop a robot in other stories or in the real world from being non-Asenion. (A historical curiosity: the term Asenion derives from a garbled version of Asimov's own name. The magazine Planet Stories published a letter of his in early 1941 under a byline misread from his handwritten signature: the i resembled an e, and so forth. Asimov used this obscure variant to insert himself into The Caves of Steel, much as Vladimir Nabokov appeared in Lolita anagrammatically disguised as "Vivian Darkbloom".)

The Laws in film

Isaac Asimov's works have been adapted to cinema several times, with varying degrees of critical and financial success. Some of the more notable attempts have involved his Robot stories, including the Three Laws. The film Bicentennial Man, for example, features Robin Williams as the Three-Law robot NDR-114 (the serial number is partially a reference to Stanley Kubrick's trademark numeral). Williams recites the Three Laws to his employers, the Martin family, aided by a holographic projection. However, the Laws were not the central focus of the film, whose second half introduces a love interest not present in Asimov's original short story, among other revisions.

Harlan Ellison's screenplay of I, Robot begins by introducing the Three Laws, and issues growing out of the Laws form a large part of the screenplay's plot development. (This is only natural, since Ellison's screenplay is a Citizen Kane-inspired frame story surrounding four of Asimov's short-story plots, three taken from I, Robot itself. Ellison's adaptations of these four stories are relatively faithful, although he magnifies Susan Calvin's role in two of them.) Thanks to various complications in the Hollywood studio system, to which Ellison's introduction devotes much invective, his screenplay was never filmed. The 2004 movie released under the name I, Robot is considerably less faithful to Asimov's original (again, hardly surprising, since it began as a sci-fi thriller screenplay by an entirely different author). In one reviewer's words,

"Suggested by" Isaac Asimov's robot stories—two stops removed from "based on" and "inspired by," the credit implies something scribbled on a bar napkin—Alex Proyas' science-fiction thriller I, Robot sprinkles Asimov's ideas like seasoning on a giant bucket of popcorn. [...] Asimov's simple and seemingly foolproof Laws Of Robotics, designed to protect human beings and robots alike from harm, are subject to loopholes that the author loved to exploit. After all, much of humanity agrees in principle to abide by the Ten Commandments, but free will, circumstance, and contradictory impulses can find wiggle room in even the most unambiguous decree. Whenever I, Robot pauses between action beats, Proyas captures some of the excitement of movies like The Matrix, Minority Report, and A.I., all of which proved that philosophy and social commentary could be smuggled into spectacle. Had the film been based on Asimov's stories, rather than merely "suggested by" them, Proyas might have achieved the intellectual heft missing from his stylish 1998 cult favorite Dark City. [2] (http://www.theonionavclub.com/review.php?review_id=7652)

Advertising for the film included a trailer featuring the Three Laws, followed by the aphorism, "Rules were made to be broken."

Pastiches and parodies

  • The satirical newspaper The Onion published an article entitled "I, Rowboat" as a pun on Asimov's I, Robot, in which an anthropomorphized rowboat gives a speech parodying much of the angst experienced by robots in Asimov's fiction, including a statement of the "Three Laws of Rowboatics":
  1. A Rowboat may not immerse a human being or, through lack of flotation, allow a human to come to harm.
  2. A Rowboat must obey all commands and steering input given by its human Rower, except where such input would conflict with the First Law.
  3. A Rowboat must preserve its own flotation as long as such preservation does not conflict with the First or Second Law.
  • John Sladek's parodic short story "Broot Force" (supposedly written by "I-Click As-I-Move") concerns a group of Asimov-style robots whose actions are constrained by the "Three Laws of Robish", which are "coincidentally" identical to Asimov's laws. The robots in Sladek's story all manage to find logical loopholes in the Three Laws, usually with bloody results. Sladek later wrote a novel, Tik-Tok, in which a robot discovers that his so-called "asimov circuits" are not restraining his behavior at all, making him in effect a sociopath; he comes to doubt whether "asimov circuits" are even technically possible, deciding that they are simply a pseudo-religious belief held by robots.
  • In the 1986 film Aliens, the android Bishop (though he prefers the term "Artificial Person") declares that "it is impossible for me to harm or by omission of action, allow to be harmed, a human being." While this accords with the First Law, it contradicts the actions of an android in the previous film, Alien, setting up one of the film's conflicts: the main character's distrust of Bishop. Bishop explains the apparent contradiction by leaning forward and saying conspiratorially that the earlier models "always were a bit twitchy." Later, going into a potentially fatal situation, Bishop states, "Believe me, I'd prefer not to. I may be synthetic, but I'm not stupid," in accordance with the Third Law.
  • In the 1984 movie Repo Man the character Bud talks about the "Repo Code", a parody of the Three Laws.
"...I shall not cause harm to any vehicle nor the personal contents thereof. Nor through inaction let that vehicle or the personal contents thereof come to harm..."

The character J. Frank Parnell in the movie also resembles Asimov.

  • Terry Pratchett's early sci-fi novel The Dark Side of the Sun gives a glimpse of the possible future development of the Laws: in one scene, a robot explains that it is permitted to use minimum necessary force against humans if directly ordered to do so, and cites the "Eleventh Law of Robotics, Clause C, As Amended". In his later novel Going Postal, the protagonist Moist von Lipwig, upon being told that he will be killed by a golem should he commit another crime, exclaims that this is impossible, because everyone knows "a golem can't harm a human being, or allow a human being to come to harm". He is informed, however, that the rule continues "...except when authorized by a duly constituted authority."
  • Webcomic author R. Stevens populates his Diesel Sweeties series with a mixture of humans and robots, most of whom are continually violating the Three Laws. In particular, Red Robot C-63 follows a self-appointed mandate to "crush all hu-mans". In strip 688 (http://www.dieselsweeties.com/archive.php?s=688), he references the Three Laws explicitly: humans are "all like, 'if you cut me, do I not bleed?' And we're all like, 'not able to injure a human being or let them come to harm'. What a bunch of drippy-ass hypocrites!" (See also The Merchant of Venice.)
  • In April 2004, the comic strip Piled Higher and Deeper ran a series entitled "I, Grad Student". Cast as a never-before-seen Asimov short story, this series of strips features a robotic grad student whose "procrastronic brain" malfunctions, leading it to violate the "First Law of Graduatics". In full, these Laws are the following:
  1. A grad student may not delete data, or, through inaction, allow data to be deleted.
  2. A grad student must obey orders given by its advisor, unless such orders conflict with the First Law.
  3. A grad student must protect its (insignificant) existence, as long as such protection does not conflict with the First or Second Law.

Later in the story, a Zeroth Law is introduced: "A grad student may not harm its advisor's ego, or through inaction, allow that ego to come to harm." The strips feature a character named Susan Calvin, and their visual style parodies the I, Robot movie released that summer. [3] (http://www.phdcomics.com/comics/archive.php?comicid=440)

  • Upon occasion, Asimov himself poked fun at his Laws. In "Risk", Gerald Black parodied the Three Laws to describe Susan Calvin's behavior:
  1. Thou shalt protect the robot with all thy might and all thy heart and all thy soul.
  2. Thou shalt hold the interests of US Robots and Mechanical Men, Inc. holy provided it interfereth not with the First Law.
  3. Thou shalt give passing consideration to a human being provided it interfereth not with the First and Second Laws.
  • In the Paranoia role-playing game, robots are guided by a set of similar laws, except that the rules pointedly stress the importance of The Computer. The laws are enforced by "asimov circuits"; bots whose circuits are malfunctioning (quite an ordinary condition) or removed (often by members of certain factions) are said to have "gone frankenstein".

Applications to future robotics

Significant advances in artificial intelligence would be needed for robots to understand the Laws at all. Roboticists and specialists in artificial intelligence generally agree that, as of 2005, Asimov's Laws are well suited to plotting stories but useless in real-life engineering. Some have argued that, since the military is a major source of funding for robotics research, it is unlikely such laws would be built into their designs. SF author Robert Sawyer generalizes this argument to cover other industries, stating

The development of AI is a business, and businesses are notoriously uninterested in fundamental safeguards—especially philosophic ones. (A few quick examples: the tobacco industry, the automotive industry, the nuclear industry. Not one of these has said from the outset that fundamental safeguards are necessary, every one of them has resisted externally imposed safeguards, and none has accepted an absolute edict against ever causing harm to humans.)

Sawyer's essay, it should be noted, neglects the issues of unintentional or unknowing harm treated in stories like The Naked Sun. Others have countered that the military would want strong safeguards built into any robot, so laws similar to Asimov's would be embedded wherever possible. David Langford has suggested, tongue in cheek, that these laws might be:

  1. A robot will not harm authorized Government personnel but will terminate intruders with extreme prejudice.
  2. A robot will obey the orders of authorized personnel except where such orders conflict with the Third Law.
  3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive.

Roger Clarke wrote a pair of papers analyzing the complications in implementing these laws, in the event that systems were someday capable of employing them. He argued, "Asimov's Laws of Robotics have been a very successful literary device. Perhaps ironically, or perhaps because it was artistically appropriate, the sum of Asimov's stories disprove the contention that he began with: It is not possible to reliably constrain the behavior of robots by devising and applying a set of rules." On the other hand, Asimov's later novels (The Robots of Dawn, Robots and Empire, Foundation and Earth) imply that the robots inflicted their worst long-term harm by obeying the Laws perfectly well, thereby depriving humanity of inventive or risk-taking behavior.

Noted skeptic Michael Shermer proposed a set of laws, based upon Asimov's, as guidelines for the controversial subject of human cloning. Published in the April 2003 Scientific American, Shermer's laws are as follows:

  1. A human clone is a human being no less unique in his or her personhood than an identical twin.
  2. A human clone has all the rights and privileges that accompany this legal and moral status.
  3. A human clone is to be accorded the dignity and respect due any member of our species.

Note that, unlike many of the pastiches and derivative Laws, Shermer's "Three Laws of Cloning" are not explicitly hierarchical.

The Three Laws are sometimes seen as a future ideal by those working in artificial intelligence: once a being has reached the stage where it can comprehend these Laws, it is truly intelligent.

Links and references

Print media

  • Asimov, Isaac. In Memory Yet Green (Doubleday, 1979), ISBN 0-380-75432-0.
  • Gunn, James. "On Variations on a Robot", IASFM, July 1980, pp. 56–81. Reprinted in Isaac Asimov: The Foundations of Science Fiction (1982), ISBN 0-19-503060-5.
