(Tech Xplore)—Roboticists have plenty to brag about. They have built robots that can flip pancakes, make sandwiches, ask children and adults questions, and generate expressions of happiness, wonder and sadness.
Technologists have succeeded in teaching robots a range of tasks and behaviors, but how are we doing at teaching them our value systems? That is a daunting challenge, and more scientists are stepping up to say it is time to work it out.
The latest news is the publication of a guide that sets forth a standard for robotic ethics: "Robots and robotic devices. Guide to the ethical design and application of robots and robotic systems," from BSI, a business standards company.
BS 8611, as it is referred to, was written by a committee of scientists, academics, ethicists, philosophers and users. Its target readers include device designers and managers. BSI works with business experts, government bodies, trade associations and consumer groups to capture best practice; BSI says it was the world's first National Standards Body.
The guidelines are for the identification of potential ethical harm. The standard builds on existing safety requirements for different types of robots, covering industrial, personal-care and medical devices. The guide is not so much about safety per se as it is about ethics; the authors raise issues that manufacturers should consider.
London-based Mike Brown, writing in Inverse, gives us a brief look at topics that should be explored. One is the possibility that some children will develop considerable bonds with robots and, in doing so, will be less aware than adults that the robot is not human. How easily can they accept that their beloved robot is a machine?
Hannah Devlin, the Guardian's science correspondent, also pointed to this as a contentious issue in robotics: whether an emotional bond with a robot is desirable. This could be especially worthy of attention in the field of assistive technologies, where robots are fashioned to interact with the elderly and children.
Devlin quoted a robotics professor on the subject. Alan Winfield, a professor of robotics at the University of the West of England, welcomed news of the guide, saying it represented "the first step towards embedding ethical values into robotics and AI."
Devlin described the guide as "written in the dry language of a health and safety manual, but the undesirable scenarios it highlights could be taken directly from fiction."
She also quoted Dan Palmer, head of manufacturing at BSI, whose remarks underscore that attention must be paid to the ethical issues surrounding robots.
"Using robots and automation techniques to make processes more efficient, flexible and adaptable is an essential part of manufacturing growth. For this to be acceptable, it is essential that ethical issues and hazards such as dehumanisation of humans or over-dependence on robots, are identified and addressed.
"This new guidance on how to deal with various robot applications will help designers and users of robots and autonomous systems to establish this new area of work."
Tom Brant, meanwhile, reported in PCMag that the EU was working on robot ethics standards of its own, including discussion of provisions that would require robots to act in humans' best interests and forbid users from modifying a robot so that it could function as a weapon.