Robots that can adapt like animals
| Published in | Nature (London), Vol. 521, No. 7553, pp. 503–507 |
|---|---|
| Main Authors | Cully, Antoine; Clune, Jeff; Tarapore, Danesh; Mouret, Jean-Baptiste |
| Format | Journal Article |
| Language | English |
| Published | London: Nature Publishing Group UK, 28.05.2015 |
| ISSN | 0028-0836 (print); 1476-4687 (electronic) |
| DOI | 10.1038/nature14422 |
| Abstract | An intelligent trial-and-error learning algorithm is presented that allows robots to adapt in minutes to compensate for a wide variety of types of damage.

Robots built to adapt

Autonomous mobile robots would be extremely useful in remote or hostile environments such as space, deep oceans or disaster areas. An outstanding challenge is to make such robots able to recover after damage. Jean-Baptiste Mouret and colleagues have developed a machine learning algorithm that enables damaged robots to quickly regain their ability to perform tasks. When they sustain damage — such as broken or even missing legs — the robots adopt an intelligent trial-and-error approach, trying out possible behaviours that they calculate to be potentially high-performing. After a handful of such experiments they discover, in less than two minutes, a compensatory behaviour that works in spite of the damage.

Robots have transformed many industries, most notably manufacturing [1], and have the power to deliver tremendous benefits to society, such as in search and rescue [2], disaster response [3], health care [4] and transportation [5]. They are also invaluable tools for scientific exploration in environments inaccessible to humans, from distant planets [6] to deep oceans [7]. A major obstacle to their widespread adoption in more complex environments outside factories is their fragility [6,8]. Whereas animals can quickly adapt to injuries, current robots cannot ‘think outside the box’ to find a compensatory behaviour when they are damaged: they are limited to their pre-specified self-sensing abilities, can diagnose only anticipated failure modes [9], and require a pre-programmed contingency plan for every type of potential damage, an impracticality for complex robots [6,8]. A promising approach to reducing robot fragility involves having robots learn appropriate behaviours in response to damage [10,11], but current techniques are slow even with small, constrained search spaces [12]. Here we introduce an intelligent trial-and-error algorithm that allows robots to adapt to damage in less than two minutes in large search spaces without requiring self-diagnosis or pre-specified contingency plans. Before the robot is deployed, it uses a novel technique to create a detailed map of the space of high-performing behaviours. This map represents the robot’s prior knowledge about what behaviours it can perform and their value. When the robot is damaged, it uses this prior knowledge to guide a trial-and-error learning algorithm that conducts intelligent experiments to rapidly discover a behaviour that compensates for the damage. Experiments reveal successful adaptations for a legged robot injured in five different ways, including damaged, broken, and missing legs, and for a robotic arm with joints broken in 14 different ways. This new algorithm will enable more robust, effective, autonomous robots, and may shed light on the principles that animals use to adapt to injury. |
|---|---|
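The two-phase method the abstract describes can be sketched concretely. The pre-deployment map is built with the "mapping elites" (MAP-Elites) technique of reference 35, which keeps the best controller found for each niche of a discretized behaviour space; after damage, the paper guides trial and error with Gaussian-process Bayesian optimization over that map. The Python sketch below is only a minimal illustration under simplifying assumptions: the toy `simulate` function, the 6-parameter controller, the grid size, and the "remove tested niche" update are hypothetical stand-ins for the paper's physics simulator, 36-parameter gait controller, and Gaussian-process model. Only the overall loop structure — fill a map in simulation, then test the most promising behaviours on the damaged robot until measured performance reaches roughly 90% of the map's best prediction — follows the paper.

```python
import random

GRID = 10        # niches per behaviour-descriptor dimension (hypothetical)
BUDGET = 20000   # evaluations in simulation before deployment (hypothetical)
DIM = 6          # controller parameters (hypothetical; the paper uses 36)

def simulate(genome, damaged=False):
    """Toy stand-in for the physics simulator: maps a controller to a 2-D
    behaviour descriptor in [0,1]^2 plus a performance score."""
    desc = (sum(genome[:3]) / 3, sum(genome[3:]) / 3)
    perf = 1.0 - abs(genome[0] - genome[-1])
    if damaged:                              # damage changes which behaviours still work
        perf *= 0.2 if genome[0] > 0.5 else 1.0
    return desc, perf

def niche(desc):
    """Discretize a behaviour descriptor into a grid cell."""
    return tuple(min(GRID - 1, int(d * GRID)) for d in desc)

def map_elites():
    """Phase 1 (before deployment): fill a behaviour-performance map,
    keeping only the best controller found for each niche."""
    archive = {}                                       # niche -> (genome, perf)
    for _ in range(BUDGET):
        if archive and random.random() < 0.9:          # mutate a random elite...
            parent = random.choice(list(archive.values()))[0]
            genome = [min(1.0, max(0.0, g + random.gauss(0.0, 0.1))) for g in parent]
        else:                                          # ...or sample a new controller
            genome = [random.random() for _ in range(DIM)]
        desc, perf = simulate(genome)
        key = niche(desc)
        if key not in archive or perf > archive[key][1]:
            archive[key] = (genome, perf)
    return archive

def adapt(archive, max_trials=10):
    """Phase 2 (after damage): map-guided trial and error. The paper updates a
    Gaussian-process model after each trial; this sketch merely removes each
    tested niche from consideration, which preserves the loop's shape."""
    predicted = {key: perf for key, (_, perf) in archive.items()}
    best_genome, best_perf = None, -1.0
    for _ in range(max_trials):
        key = max(predicted, key=predicted.get)        # most promising untested behaviour
        genome = archive[key][0]
        _, real_perf = simulate(genome, damaged=True)  # one trial on the damaged robot
        if real_perf > best_perf:
            best_genome, best_perf = genome, real_perf
        del predicted[key]                             # crude stand-in for a GP update
        if not predicted or best_perf >= 0.9 * max(predicted.values()):
            break                                      # stop near the map's best prediction
    return best_genome, best_perf

archive = map_elites()
genome, perf = adapt(archive)
print(f"compensatory behaviour found: performance {perf:.2f} "
      f"using {len(archive)} mapped behaviours")
```

The property this sketch preserves is the one the abstract emphasizes: trials on the physical robot are spent only on behaviours the map already predicts to be high-performing, which is why adaptation takes a handful of experiments rather than a full learning run.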
| Audience | Academic |
| Affiliations | Antoine Cully, Danesh Tarapore and Jean-Baptiste Mouret: Sorbonne Universités, Université Pierre et Marie Curie (UPMC), Paris 06, CNRS, UMR 7222, Institut des Systèmes Intelligents et de Robotique (ISIR). Jeff Clune: Department of Computer Science, University of Wyoming. Jean-Baptiste Mouret (corresponding author, jean-baptiste.mouret@inria.fr) also: Inria, Team Larsen, and Université de Lorraine, CNRS, Loria, UMR 7503. Present addresses: Department of Electronics, University of York, York YO10 5DD, UK (D.T.); Inria, Villers-lès-Nancy, F-54600, France (J.-B.M.). |
| Copyright | Springer Nature Limited 2015. Distributed under a Creative Commons Attribution 4.0 International License. |
| License | Distributed under a Creative Commons Attribution 4.0 International License: http://creativecommons.org/licenses/by/4.0 |
| ORCID | 0000-0002-3190-7073; 0000-0002-2513-027X |
| OpenAccessLink | https://hal.science/hal-01158243 |
| PMID | 26017452 |
| References | BongardJZykovVLipsonHResilient machines through continuous self-modelingScience2006314111811212006Sci...314.1118B1:CAS:528:DC%2BD28Xht1SmtrjO10.1126/science.1133687 PougetABeckJMMaWJLathamPEProbabilistic brains: knowns and unknownsNature Neurosci.201316117011781:CAS:528:DC%2BC3sXht1ylu7vJ10.1038/nn.3495 KlugerJLovellJApollo 132006 DerégnaucourtSMitraPPFehérOPytteCTchernichovskiOHow sleep affects the developmental learning of bird songNature20054337107162005Natur.433..710D10.1038/nature03275 MurphyRRTrial by fireRobot. Automat. Mag.200411506110.1109/MRA.2004.1337826 BorjiAIttiLBayesian optimization explains human active searchAdv. Neural Inform. Process. Syst.20132655631373.94046 NagataniKEmergency response to the nuclear accident at the Fukushima Daiichi nuclear power plants using mobile rescue robotsJ. Field Robot.201330446310.1002/rob.21439 SproewitzAMoeckelRMayeJIjspeertALearning to move in modular robots using central pattern generators and online optimizationInt. J. Robot. Res.20082742344310.1177/0278364907088401 ThrunSStanley: the robot that won the DARPA grand challengeJ. Field Robot.20062366169210.1002/rob.20147 Kohl, N. & Stone, P. Policy gradient reinforcement learning for fast quadrupedal locomotion. In Proc. IEEE Int. Conf. on ‘Robotics and Automation’ (ICRA) 2619–2624 (IEEE, 2004). BlankeMKinnaertMLunzeJStaroswieckiMDiagnosis and Fault-Tolerant Control20061126.93004 AntonelliGFossenTIYoergerDRSicilianoBKhatibOSpringer Handbook of Robotics2008987100810.1007/978-3-540-30301-5_44 SicilianoBKhatibOSpringer Handbook of Robotics200810.1007/978-3-540-30301-5 Tesch, M. Schneider, J. & Choset, H. Using response surfaces and expected improvement to optimize snake robot gait parameters. In Proc. IEEE/RSJ Int. Conf. on ‘Intelligent Robots and Systems (IROS)’ 1069–1074 (IEEE, 2011). SandersonKMars rover Spirit (2003–10)Nature20104636001:CAS:528:DC%2BC3cXhsVyns70%3D10.1038/463600a BroadbentEStaffordRMacDonaldBAcceptance of healthcare robots for the older population: review and future directionsInt. J. Social Robot.2009131933010.1007/s12369-009-0030-6 MockusJBayesian Approach to Global Optimization: Theory and Applications20130693.49001 CarlsonJMurphyRRHow UGVs physically fail in the fieldIEEE Trans. Robot.20052142343710.1109/TRO.2004.838027 Mouret, J.-B. & Clune, J. Illuminating search spaces by mapping elites. Preprint at http://arxiv.org/abs/1504.04909 (2015). WagnerUGaisSHaiderHVerlegerRBornJSleep inspires insightNature20044273523552004Natur.427..352W1:CAS:528:DC%2BD2cXltFChuw%3D%3D10.1038/nature02223 KoberJBagnellJAPetersJReinforcement learning in robotics: a surveyInt. J. Robot. Res.2013321238127410.1177/0278364913495721 RasmussenCEWilliamsCKIGaussian Processes for Machine Learning20061177.68165 ItoMControl of mental activities by internal models in the cerebellumNature Rev. Neurosci.200893043131:CAS:528:DC%2BD1cXjsFCks70%3D10.1038/nrn2332 VermaVGordonGSimmonsRThrunSReal-time fault diagnosisRobot. Automat. Mag.200411566610.1109/MRA.2004.1310942 JarvisSLKinematic and kinetic analysis of dogs during trotting after amputation of a thoracic limbAm. J. Vet. Res.2013741155116310.2460/ajvr.74.9.1155 KördingKPWolpertDMBayesian integration in sensorimotor learningNature20044272442472004Natur.427..244K10.1038/nature02169 GrillnerSThe motor infrastructure: from ion channels to neuronal networksNature Rev. Neurosci.200345735861:CAS:528:DC%2BD3sXltVWmtr4%3D10.1038/nrn1137 Lizotte, D. J. Wang, T. Bowling, M. H. & Schuurmans, D. 
Automatic gait optimization with Gaussian process regression. In Proc. Int. Joint Conf. on ‘Artificial Intelligence’ (IJCAI) 944–949 (2007). FuchsAGoldnerBNolteISchillingNGround reaction force adaptations to tripedal locomotion in dogsVet. J.20142013073151:STN:280:DC%2BC2cjns1Ogsw%3D%3D10.1016/j.tvjl.2014.05.012 Benson-AmramSHolekampKEInnovative problem solving by wild spotted hyenasProc. R. Soc. Lond. B201227940874095 ArgallBDChernovaSVelosoMBrowningBA survey of robot learning from demonstrationRobot. Auton. Syst.20095746948310.1016/j.robot.2008.10.024 Calandra, R. Seyfarth, A., Peters, J. & Deisenroth, M. P. An experimental comparison of bayesian optimization for bipedal locomotion. In Proc. IEEE Int. Conf. on ‘Robotics and Automation’ (ICRA) 1951–1958 (IEEE, 2014). ChristensenDJSchultzUPStoyKA distributed and morphology-independent strategy for adaptive locomotion in self-reconfigurable modular robotsRobot. Auton. Syst.2013611021103510.1016/j.robot.2013.05.009 SantelloMPostural hand synergies for tool useJ. Neurosci.19981810105101151:CAS:528:DyaK1cXnslyqsLs%3D10.1523/JNEUROSCI.18-23-10105.1998 WolpertDMGhahramaniZFlanaganJRPerspective and problems in motor learningTrends Cogn. Sci.2001548749410.1016/S1364-6613(00)01773-3 S Thrun (BFnature14422_CR5) 2006; 23 RR Murphy (BFnature14422_CR2) 2004; 11 J Carlson (BFnature14422_CR8) 2005; 21 K Nagatani (BFnature14422_CR3) 2013; 30 BD Argall (BFnature14422_CR18) 2009; 57 CE Rasmussen (BFnature14422_CR21) 2006 KP Körding (BFnature14422_CR27) 2004; 427 S Benson-Amram (BFnature14422_CR25) 2012; 279 J Mockus (BFnature14422_CR22) 2013 J Kluger (BFnature14422_CR15) 2006 B Siciliano (BFnature14422_CR1) 2008 G Antonelli (BFnature14422_CR7) 2008 SL Jarvis (BFnature14422_CR16) 2013; 74 J Bongard (BFnature14422_CR14) 2006; 314 DJ Christensen (BFnature14422_CR11) 2013; 61 S Grillner (BFnature14422_CR24) 2003; 4 DM Wolpert (BFnature14422_CR19) 2001; 5 V Verma (BFnature14422_CR13) 2004; 11 M Santello (BFnature14422_CR20) 1998; 18 A Fuchs (BFnature14422_CR17) 2014; 201 M Blanke (BFnature14422_CR9) 2006 BFnature14422_CR34 BFnature14422_CR33 BFnature14422_CR35 J Kober (BFnature14422_CR12) 2013; 32 U Wagner (BFnature14422_CR29) 2004; 427 BFnature14422_CR32 BFnature14422_CR31 A Borji (BFnature14422_CR23) 2013; 26 A Pouget (BFnature14422_CR26) 2013; 16 E Broadbent (BFnature14422_CR4) 2009; 1 S Derégnaucourt (BFnature14422_CR28) 2005; 433 K Sanderson (BFnature14422_CR6) 2010; 463 A Sproewitz (BFnature14422_CR10) 2008; 27 M Ito (BFnature14422_CR30) 2008; 9 12838332 - Nat Rev Neurosci. 2003 Jul;4(7):573-86 17110570 - Science. 2006 Nov 17;314(5802):1118-21 15716944 - Nature. 2005 Feb 17;433(7027):710-6 23955561 - Nat Neurosci. 2013 Sep;16(9):1170-8 9822764 - J Neurosci. 1998 Dec 1;18(23):10105-15 23977887 - Am J Vet Res. 2013 Sep;74(9):1155-63 14737168 - Nature. 2004 Jan 22;427(6972):352-5 22874748 - Proc Biol Sci. 2012 Oct 7;279(1744):4087-95 26017437 - Nature. 2015 May 28;521(7553):426-7 14724638 - Nature. 2004 Jan 15;427(6971):244-7 11684481 - Trends Cogn Sci. 2001 Nov 1;5(11):487-494 24881509 - Vet J. 2014 Sep;201(3):307-15 18319727 - Nat Rev Neurosci. 2008 Apr;9(4):304-13 20130624 - Nature. 2010 Feb 4;463(7281):600 |
| Snippet | An intelligent trial-and-error learning algorithm is presented that allows robots to adapt in minutes to compensate for a wide variety of types of damage.... Robots have transformed many industries, most notably manufacturing, and have the power to deliver tremendous benefits to society, such as in search and... As robots leave the controlled environments of factories to autonomously function in more complex, natural environments, they will have to respond to the... (see the code sketch at the end of this record) |
| SourceID | hal proquest gale pubmed crossref springer |
| SourceType | Open Access Repository; Aggregation Database; Index Database; Enrichment Source; Publisher |
| StartPage | 503 |
| SubjectTerms | 639/705/117; Adaptation level (Psychology); Adaptation, Physiological; Adaptive control; Algorithms; Analysis; Animal behavior; Animals; Artificial Intelligence; Behavior, Animal; Biomimetics; Biomimetics - methods; Computer Science; Control; Design; Disaster management; Dogs; Extremities - injuries; Extremities - physiopathology; Humanities and Social Sciences; letter; Machine learning; Maintenance and repair; Methods; Mobile robots; Motor Skills; multidisciplinary; Oceans; Robotics; Robotics - instrumentation; Robotics - methods; Robots; Science; Search and rescue; Technology application; Time Factors |
| Title | Robots that can adapt like animals |
| URI | https://link.springer.com/article/10.1038/nature14422 https://www.ncbi.nlm.nih.gov/pubmed/26017452 https://www.proquest.com/docview/1684954951 https://www.proquest.com/docview/1684430738 https://hal.science/hal-01158243 |
| Volume | 521 |
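The Snippet field above summarizes the paper's core loop: prior knowledge from a precomputed map of high-performing behaviours guides a trial-and-error search, so a handful of physical tests suffices to find a compensatory behaviour. Below is a minimal, illustrative sketch of such a loop, assuming a Gaussian process whose prior mean is the map's predicted performance and an upper-confidence-bound rule for picking each trial. Every name and constant here (descriptors, map_prior, true_performance, kappa, stop_at) is a hypothetical stand-in, not the authors' implementation.

```python
import numpy as np

# Hedged sketch (not the authors' code): trial-and-error adaptation in
# which a precomputed behaviour-performance map serves as the prior mean
# of a Gaussian process; each physical trial refines the posterior until
# a good-enough compensatory behaviour is found.

rng = np.random.default_rng(0)

N, D = 200, 2                      # hypothetical: 200 candidate behaviours, 2-D descriptors
descriptors = rng.random((N, D))   # behaviour descriptors (e.g. gait parameters)
map_prior = rng.random(N)          # performance predicted by the precomputed map

def true_performance(i):
    """Stand-in for one physical trial on the damaged robot."""
    return 0.5 * map_prior[i] + rng.normal(0.0, 0.01)

def kernel(A, B, length=0.2):
    """Squared-exponential kernel on behaviour descriptors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * length ** 2))

tried, observed = [], []
kappa = 0.05                       # hypothetical exploration weight (UCB)
stop_at = 0.4 * map_prior.max()    # hypothetical "good enough" threshold

for trial in range(20):
    if tried:
        X = descriptors[tried]
        K = kernel(X, X) + 1e-6 * np.eye(len(tried))
        k_s = kernel(descriptors, X)               # (N, t) cross-covariances
        alpha = np.linalg.solve(K, np.array(observed) - map_prior[tried])
        mu = map_prior + k_s @ alpha               # posterior mean over all behaviours
        var = 1.0 - np.einsum('ij,ji->i', k_s, np.linalg.solve(K, k_s.T))
    else:
        mu, var = map_prior.copy(), np.ones(N)     # before any trial, the map is the belief
    pick = int(np.argmax(mu + kappa * np.sqrt(np.clip(var, 0.0, None))))
    tried.append(pick)
    observed.append(true_performance(pick))
    if observed[-1] >= stop_at:                    # stop once a compensatory behaviour works
        break

print(f"behaviour {tried[-1]} selected after {len(tried)} trial(s); "
      f"performance {observed[-1]:.3f}")
```

On this synthetic setup the loop typically stops after one or two simulated trials, echoing the "handful of experiments" the record describes; on a real robot, true_performance would be replaced by an actual timed trial of the chosen gait.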