Autonomy, when applied to humans, means the freedom to make choices and to act on those choices without constraints imposed by others. There is an implicit assumption that an autonomous person remains subject to the laws and other ethical constraints that apply to everyone else. It is generally accepted that the current development of robots and highly automated systems will continue. However, concerns have been raised about whether these developments pose risks to individuals and to society as a whole. Such concerns can only be alleviated by showing that this new type of product can be trusted to behave in an ethical manner. The book presents engineering approaches to this problem.

The terms 'robot', 'intelligent system' and 'Artificial Intelligence (AI)' are frequently used, but with little consensus about their exact meaning. Clear definitions of terms are essential in both engineering and law, so various definitions are provided and discussed. An autonomous system is defined as 'a system which has the ability to perform intended tasks based on current state, knowledge and sensing, without human intervention'. Autonomy level was once a specialist concept but is now gaining more widespread use, so the range of definitions of the relevant terms is explained, together with a selection of industry-specific definitions. The chapter also reviews various issues relating to current thinking on automated and autonomous weapon systems.
Automated control and autonomy