DOI: 10.1126/science.abd3643
How to improve robotic touch
Subramanian Sundaram
2020-11-13
Journal: Science
Publication year: 2020
English abstract: Human hands are densely covered with touch receptors called mechanoreceptors that provide us with the sense of touch. This continuous tactile feedback from objects we touch, and information on hand articulation and movement—called proprioception—enable us to effortlessly handle diverse objects with fine dexterity. Bell's prescient treatise on the hand, written in 1833, refers to the human hand as the "consummation of all perfection as an instrument" (1). This sentiment echoes even more strongly today, in part fueled by the steep challenge of effectively instrumenting a robotic hand to provide similar feedback with the goal of attaining human-level dexterity. Recent insights into the primate tactile system and advances in machine learning (ML) may offer new prospects for tackling this old robotics challenge.

Roboticists have looked at all aspects of human tactile physiology (ranging from mechanoreceptors to neural coding schemes to grasping strategies) for inspiration in designing robotic touch systems. Most of the attention has focused on creating electronic sensors that mimic mechanoreceptors (see the figure) (2). Four types of mechanoreceptors—each responding to different types of tactile stimuli—innervate the skin: the fast-adapting receptors FA I and FA II, and the slowly adapting receptors SA I and SA II (3). The spatial coverage of these distinct mechanoreceptor types in the human hand has been systematically mapped since 1970, bringing attention to their collective functional roles. However, the subcellular molecular transducers of force have been harder to identify. The mechanically activated ion channels PIEZO1 and PIEZO2 were found to be force transducers in mammalian cells (4), and evidence of PIEZO2's role in human tactile and proprioceptive function is enabling an understanding of the initial signals involved in human touch (5).

PIEZO2 is known to be involved in many low-threshold mechanoreceptors. Currently, there are many unknowns in our understanding of the exact links between these initial transducers of force (ion channels) and mechanoreceptors. Many families of mechanosensitive ion channels other than PIEZOs are believed to contribute to touch because PIEZOs alone do not account for all human mechanosensation (6), but what these other families are, and how signals from these transducers are processed together, are not known. Even if these transducing channels were identified, it is not clear what role they play in enabling the distinct mechanoreceptor types that respond to specific stimuli. Mechanoreceptor responses may encode signals from many diverse force transducers, and a better understanding of the mammalian system may guide artificial sensor design.

First-order tactile neurons that collect signals from mechanoreceptors have a branched architecture and therefore integrate signals from multiple mechanoreceptors at distant points of the skin, collectively known as the neuron's receptive field (3). This complex spatial arrangement allows each first-order neuron to record object features during contact—like edges—as opposed to detecting forces at single points (7). As a result, the signal carried by a first-order neuron captures many properties as seen through multiple transducers and from many different locations. Furthermore, this signal may depend on material properties such as hardness and surface texture.
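To make the receptive-field idea concrete, here is a minimal toy simulation (not from the article; the grid size, sampled sites, weights, and edge stimulus are all invented for illustration). A single "first-order neuron" pools weighted responses from scattered mechanoreceptor sites, so its output reflects spatially extended contact features such as an edge rather than force at a single point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 20x20 patch of skin: mechanoreceptor responses in arbitrary units.
# An "edge" pressed onto the patch activates one band of receptors.
skin = np.zeros((20, 20))
skin[:, 7:13] = 1.0                              # edge contact across a band
skin += 0.05 * rng.standard_normal(skin.shape)   # sensor noise

# A branched first-order neuron samples many scattered sites (its receptive
# field) with heterogeneous weights, rather than reading one location.
receptive_field = rng.choice(20 * 20, size=40, replace=False)
weights = rng.uniform(0.5, 1.5, size=40)

def first_order_response(patch, sites, w):
    """Weighted sum of mechanoreceptor signals across the receptive field."""
    return float(patch.ravel()[sites] @ w)

# The pooled signal depends on how contact is distributed over the field,
# not on any single point of pressure.
print("edge contact   :", first_order_response(skin, receptive_field, weights))
print("light flat touch:", first_order_response(np.full((20, 20), 0.05),
                                                receptive_field, weights))
```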
Prior to sensing by mechanoreceptors, object contact is mediated by the skin, which modulates its own mechanical properties through hydration. Moisture secreted from sweat glands hydrates keratin in the skin and softens it. This process changes the contact dynamics between the fingertip and an object; the junctional contact area increases more slowly when touching a hard, smooth glass surface than when touching soft rubber (8).

Replicating all of the subtle features of mechanoreceptors and the skin in contemporary robots has remained daunting. However, substantial progress is being made in the creation of soft, stretchable electronic skins that can sense loads normal to the surface or shear forces against the surface; in both cases, static or dynamic forces can be sensed (2, 9). Creative new fabrication strategies that use soft materials and high-performance organic electronics have also enabled neuromorphic sensors that produce action potential–like voltage spikes in response to forces (10).

Neuromorphic tactile systems designed for robots today are still nascent when compared with the architectures in humans. In humans, the spatial density of receptors is typically 240 units/cm² at the fingertips, with over 17,000 receptors covering the hand (3). Contemporary robots lag by two to three orders of magnitude, and the number of levels of distributed processing needs much improvement. However, this approach is promising because spike-based processing can efficiently encode temporal relationships, motion, and other object properties with high fidelity and energy efficiency.
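As a rough sketch of the spike-based encoding described above, the code below converts a sampled force signal into action potential–like events with a generic leaky integrate-and-fire rule. This is a common neuromorphic abstraction, not the organic circuit of ref. 10; the gain, leak, threshold, and force trace are assumed values chosen only for the example.

```python
import numpy as np

def integrate_and_fire(force, dt=1e-3, gain=50.0, threshold=1.0, leak=5.0):
    """Convert a sampled force signal into spike times (seconds): a membrane
    variable integrates the stimulus, decays with a leak term, and emits a
    spike (then resets) whenever it crosses the threshold."""
    v, spikes = 0.0, []
    for i, f in enumerate(force):
        v += dt * (gain * f - leak * v)   # leaky integration of the stimulus
        if v >= threshold:                # threshold crossing -> spike
            spikes.append(i * dt)
            v = 0.0
    return spikes

# Assumed test stimulus: a 0.5 s press-and-hold ramp followed by release.
t = np.arange(0.0, 0.5, 1e-3)
force = np.clip(np.minimum(t / 0.1, 1.0) - 10.0 * np.maximum(t - 0.4, 0.0), 0.0, 1.0)

spike_times = integrate_and_fire(force)
print(f"{len(spike_times)} spikes, first few at {spike_times[:5]}")
# Stronger or faster-changing forces produce denser spike trains, which is how
# spike-based front ends compactly encode contact timing and intensity.
```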
Passive touch is not the only way in which we acquire tactile information. To identify a grasped object, and simultaneously estimate its material characteristics, texture, and weight—all without the aid of vision—humans often use active haptic sensing, the use of exploratory and informative actions to refine the sensory information acquired (11). Thus, we can determine texture by sliding our fingers and reliably distinguish fabrics like velvet from wool.

Figure: Human versus robotic touch. Human hands are covered with sensors (mechanoreceptors) that provide continuous touch-sensation feedback that allows us to identify and control objects. For example, we can exert the right amount of force to hold sharp and pointy fruits such as a horned melon (kiwano) with great poise and dexterity. Robotic hands do not have these capabilities but continue to improve. GRAPHIC: C. BICKEL/SCIENCE

Although robotic manipulation of objects with similar sensorimotor control is limited by the tactile hardware in robot hands, there has been exciting progress in using visual data. The ubiquity of cameras and the ability to acquire large datasets—which are critical for effectively utilizing deep convolutional neural networks (CNNs) and reinforcement learning algorithms—are contributing to the success of visuomotor policies in manipulation tasks (12, 13). Ultimately, relying too heavily on vision alone has its problems. This approach works well in fixed settings without visual occlusions, such as robotic pick and place. Achieving manual dexterity with vision alone may prove difficult in highly unstructured and dynamic settings without continuous line-of-sight access, such as searching through piles of rubble in disaster response.

The success of robot vision–based object manipulation has strong implications for robotic touch. It confirms that the emerging ML tools are effective at distilling information from high-dimensional images into actionable control policies. Many classes of algorithms, such as CNNs, can be readily used with tactile data (14), and vision-based ML strategies can be used for grasp planning.

A roadmap for hardware advances in tactile sensors may focus on at least three broad themes. There should be renewed emphasis on the reliability of tactile sensors over a robot's lifetime (several years) and on producing large amounts of high-quality data. Multimodal transducers that record light touch, deformation, vibrations, and temperature using array architectures with increased distributed processing and reduced wiring are needed. Robot hands specifically suited for sensorimotor control will benefit from a codesign of actuators and sensors with a high density of coverage (>100 sensors/cm² over large areas). In addition, robust proprioception will be needed in such applications for reliably localizing object contact.

Neuromorphic tactile hardware (and software) advances will strongly influence the future of bionic prostheses—a compelling application of robotic hands. Adhering to encoding techniques that closely mirror natural tactile coding schemes has been critical in providing users with realistic sensory feedback through peripheral nerve interfaces (15). In that regard, increasing the density of these connections will be crucial.
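To illustrate the earlier point that CNNs can be readily used with tactile data, the minimal PyTorch sketch below treats frames from a hypothetical pressure-sensor array as single-channel images and maps them to object classes. The 32x32 array resolution, layer sizes, and 10 object classes are arbitrary assumptions for this example, not the architecture used in ref. 14.

```python
import torch
from torch import nn

class TactileCNN(nn.Module):
    """Tiny CNN over tactile frames: each input is a 1-channel 32x32 pressure
    map from an assumed sensor array; the output is one of 10 object classes."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                 # (batch, 32, 8, 8)
        return self.classifier(x.flatten(1)) # class logits

# Smoke test on random data standing in for recorded tactile frames.
model = TactileCNN()
frames = torch.randn(4, 1, 32, 32)           # batch of 4 synthetic pressure maps
print(model(frames).shape)                   # torch.Size([4, 10])
```

Swapping the image input of a standard vision model for tactile frames in this way is what allows much of the existing ML tooling for vision to carry over to touch with little modification.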
References:
1. C. Bell, The Hand: Its Mechanism and Vital Endowments as Evincing Design (The Bridgewater Treatises, William Pickering, London, UK, 1833), vol. 4.
2. C. Bartolozzi, L. Natale, F. Nori, G. Metta, Nat. Mater. 15, 921 (2016).
3. A. B. Vallbo, R. S. Johansson, Hum. Neurobiol. 3, 3 (1984).
4. B. Coste et al., Science 330, 55 (2010).
5. A. T. Chesler et al., N. Engl. J. Med. 375, 1355 (2016).
6. F. Moehring, P. Halder, R. P. Seal, C. L. Stucky, Neuron 100, 2 (2018).
7. J. A. Pruszynski, R. S. Johansson, Nat. Neurosci. 17, 1404 (2014).
8. B. Dzidek, S. Bochereau, S. A. Johnson, V. Hayward, M. J. Adams, Proc. Natl. Acad. Sci. U.S.A. 114, 10864 (2017).
9. A. Chortos, J. Liu, Z. Bao, Nat. Mater. 15, 937 (2016).
10. Y. Kim et al., Science 360, 998 (2018).
11. S. J. Lederman, R. L. Klatzky, Cognit. Psychol. 19, 342 (1987).
12. S. Levine et al., J. Mach. Learn. Res. 17, 1334 (2016).
13. M. Andrychowicz et al., Int. J. Robot. Res. 39, 3 (2020).
14. S. Sundaram et al., Nature 569, 698 (2019).
15. J. A. George et al., Sci. Robot. 4, 32 (2019).

Acknowledgments: S.S. is supported by American Heart Association grant 20POST35210045.
Field: Climate Change; Resources & Environment
Document type: Journal article
Item identifier: http://119.78.100.173/C666/handle/2XK7JSWQ/304123
Collection: Climate Change; Resources & Environment Science
Recommended citation:
GB/T 7714: Subramanian Sundaram. How to improve robotic touch[J]. Science, 2020.
APA: Subramanian Sundaram. (2020). How to improve robotic touch. Science.
MLA: Subramanian Sundaram. "How to improve robotic touch." Science (2020).
Files in this item:
No files are associated with this item.