DOI: 10.1126/science.abi4692
Ecology in the age of automation
Timothy H. Keitt; Eric S. Abelson
2021-08-20
Journal: Science
Publication Year: 2021
Abstract: The accelerating pace of global change is driving a biodiversity extinction crisis (1) and is outstripping our ability to track, monitor, and understand ecosystems, which is traditionally the job of ecologists. Ecological research is an intensive, field-based enterprise that relies on the skills of trained observers. This process is both time-consuming and expensive, thus limiting the resolution and extent of our knowledge of the natural world. Although technology will never replace the intuition and breadth of skills of the experienced naturalist (2), ecologists cannot ignore the potential to greatly expand the scale of our studies through automation. The capacity to automate biodiversity sampling is being driven by three ongoing technological developments: the commoditization of small, low-power computing devices; advances in wireless communications; and an explosion of automated data-recognition algorithms in the field of machine learning. Automated data collection and machine learning are set to revolutionize in situ studies of natural systems.

Automation has swept across all human endeavors over recent decades, and science is no exception. The extent of ecological observation has traditionally been limited by the costs of manual data collection. We envision a future in which data from field studies are augmented with continuous, fine-scale, remotely sensed data recording the presence, behavior, and other properties of individual organisms. As automation drives down the costs of these networks, there will not simply be an expansion in the quantity of data. Rather, the high resolution and broad extent of these data will lead to qualitatively new findings and discoveries about the natural world that will enable ecologists to better predict and manage changing ecosystems (3). This will be especially true as different types of sensing networks, including mobile elements such as drones, are connected to provide a rich, multidimensional view of nature. Given the role that biodiversity plays in lending resilience to the ecosystems on which humans depend (4), monitoring the distribution and abundance of species along with climate and other variables is a critical need for developing ecological hypotheses and adapting to emerging global challenges.

Ecosystems are alive with sound and motion that can be captured with audio and video sensors. Rapid advances in audio and video classification algorithms allow the recognition of species and the labeling of complex traits and behaviors, tasks that were traditionally the domain of manual identification by experts. The major advance has been the development of deep convolutional neural networks (5). These algorithms extract fundamental aspects of contrast and shape in a manner analogous to how we and other animals recognize objects in our visual field. Applied to audio signals, these neural networks are highly effective at classifying natural and anthropogenic sounds (6). A canonical example is the classification of bird songs; other acoustic examples include insects, amphibians, and disturbance indicators such as chainsaws. Naturally, these algorithms also lend themselves to species identification from images and videos. Where animals display complex color patterns, individuals may be distinguished, allowing minimally invasive mark-recapture, an important tool in population studies and conservation (7).
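As a concrete illustration of the acoustic classification step described above, the short sketch below (not taken from the article) converts a sound clip into a mel spectrogram and scores it with a small convolutional network in PyTorch. The species labels, the synthetic waveform standing in for a real recording, and the untrained toy network are placeholder assumptions, not the authors' pipeline.

```python
# Minimal sketch: score a bird-song clip with a small convolutional network.
# The waveform, class list, and network weights are placeholders; a real
# deployment would use labeled recordings and a published architecture.
import torch
import torch.nn as nn
import torchaudio

SPECIES = ["wood_thrush", "carolina_wren", "background"]  # hypothetical labels

# Convert a 5-second mono clip (here synthetic noise) into a mel spectrogram,
# the standard image-like input for acoustic convolutional networks.
sample_rate = 22050
waveform = torch.randn(1, sample_rate * 5)  # stand-in for torchaudio.load("clip.wav")
mel = torchaudio.transforms.MelSpectrogram(sample_rate=sample_rate, n_mels=64)(waveform)
spectrogram = torchaudio.transforms.AmplitudeToDB()(mel).unsqueeze(0)  # (batch, chan, mel, time)

# A toy convolutional classifier: stacked conv/pool layers and a linear head.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, len(SPECIES)),
)

with torch.no_grad():
    probs = model(spectrogram).softmax(dim=-1).squeeze()
for name, p in zip(SPECIES, probs):
    print(f"{name}: {float(p):.2f}")
```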
Beyond sight and sound, sensors can target a wide range of physical, chemical, and biological phenomena. Particularly intriguing is the possibility of widespread environmental sensing of biomolecular compounds that could, for example, allow quantification of "DNA-scapes" by means of laboratory-on-a-chip–type sensors (8).

Several technological trends are shaping the emergence of large-scale sensor networks. One is the ongoing miniaturization of technology, allowing deployment of extended arrays of low-power sensor devices across landscapes [for example, (9)]. In many cases, these can be solar powered in remote locations. The widespread availability of computer-on-a-chip devices along with various attached sensors is enabling the construction of large distributed sensing networks at price points that were formerly unattainable. Similarly, the ubiquitous availability of cloud-based computing and storage for back-end processing is facilitating large-scale deployments.

Another trend is the advancement of wireless communications. For example, the emerging internet of things (10) enables low-power devices to establish ad hoc mesh networks that can pass information from node to node, eventually reaching points of aggregation and analysis. The same technology used to connect smart doorbells and lightbulbs can be leveraged to move data across sensor networks distributed across a landscape. These protocols are designed for low power consumption but may not have sufficient bandwidth for all applications. An alternative, although more power hungry, is cellular technology, which has increasing coverage globally. In remote locations where commercial cellular data services are not available, researchers can consider a private cellular network for on-site telemetry and satellite uplinks for internet streaming. However, in the near term, telecommunications costs and per-device power requirements may prove prohibitive for certain high-bandwidth applications, such as video and audio streaming. An alternative for sites where communications bandwidth is limited by cost, isolation, or power constraints is edge computing (11). In this design, computation is moved to the sensing devices themselves, which then transmit filtered or classified results for analysis, greatly reducing transmission requirements.

One more trend is the advancement of machine-learning methods (12) that can classify and extract patterns from data streams. Much of this technology has been commoditized through intensive development efforts in the technology sector, resulting in widely available software libraries usable by nonexperts. The aforementioned convolutional neural networks can be coded both to segment data into units and to label these units with appropriate classes. The major bottleneck is in training classifiers, because initial training inputs must be labeled manually by experts. Although labeled training sets exist in some domains (most notably, image recognition), future analysts may be able to skip much of the training step as large collections of pretrained networks become available. These pretrained networks can be combined and modified for specific tasks without the requirement of comprehensive training sets.
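The reuse of pretrained networks described above can be sketched as follows, assuming a recent torchvision release with publicly downloadable ImageNet weights; the camera-trap class list and the dummy batch are hypothetical, and a real workflow would substitute labeled field images.

```python
# Minimal sketch of adapting a pretrained network to a new task: reuse a
# publicly available ImageNet backbone and retrain only a small output head
# for hypothetical camera-trap classes, avoiding a comprehensive training set.
import torch
import torch.nn as nn
from torchvision import models

CAMERA_TRAP_CLASSES = ["white_tailed_deer", "bobcat", "raccoon", "empty_frame"]  # hypothetical

# Load a pretrained backbone and freeze its feature-extraction layers.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final fully connected layer; only this layer will be trained.
backbone.fc = nn.Linear(backbone.fc.in_features, len(CAMERA_TRAP_CLASSES))

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch; a real workflow would
# iterate over labeled camera-trap images instead.
images = torch.randn(8, 3, 224, 224)          # stand-in for a batch of photos
labels = torch.randint(0, len(CAMERA_TRAP_CLASSES), (8,))
optimizer.zero_grad()
loss = loss_fn(backbone(images), labels)
loss.backward()
optimizer.step()
print(f"training loss on dummy batch: {loss.item():.3f}")
```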
Of particular interest from the standpoint of automation are new developments in continual learning (13), in which networks adjust in response to changing inputs. This holds the promise of automating model adaptation for detecting emerging phenomena, such as species shifting their ranges in response to climate change or other shifts in ecosystem properties.

Ecologists could leverage these developments to create automated sensing networks at scales previously unimaginable. As an example, consider the North American Breeding Bird Survey, a highly successful citizen-science initiative running since the late 1960s with continental-scale coverage. Expert observers conduct point counts of birds along routes, generating data that have proved invaluable in tracking trends in songbird populations (14). Although we hope to see such efforts continue, imagine what could be learned if, instead of sampling these communities once per year, a long-term, continental-scale songbird observatory could be constructed to record and classify bird vocalizations in near-real time along with environmental covariates. Similar networks could use camera traps or video streams to reveal details of diurnal and seasonal variation across diverse floras and faunas. As with all sampling methods, sensing networks will not be without biases in sensitivity and discrimination, yet they hold the extraordinary promise of regional sampling of biodiversity at the organismal scale, something that has proven difficult with, for example, traditional satellite-based remote sensing. These efforts would complement the ongoing development of continental-scale observatories in ecology [for example, (15)] by increasing the spatial and temporal resolution of sampling.
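As a rough sketch of how an observatory node like the one imagined above might pair on-device classification with low-bandwidth telemetry (the edge-computing design noted earlier), the example below transmits only compact, high-confidence detection records alongside an environmental covariate. The station identifier, record fields, confidence threshold, and classify() stub are all hypothetical illustrations rather than a specification from the article.

```python
# Minimal sketch of the edge-computing pattern: an autonomous recorder
# classifies audio locally and transmits only compact detection records
# (species label, confidence, time, location, covariates) rather than raw
# audio. The record fields and the classify() stub are hypothetical.
import json
import random
import time
from dataclasses import dataclass, asdict

CONFIDENCE_THRESHOLD = 0.8  # only report confident detections to save bandwidth

@dataclass
class Detection:
    station_id: str
    timestamp: float
    species: str
    confidence: float
    air_temp_c: float  # example environmental covariate logged alongside the call

def classify(audio_window):
    """Stand-in for an on-device neural network classifier."""
    return random.choice(["wood_thrush", "background"]), random.random()

def process_window(audio_window, station_id="TX-001", air_temp_c=24.5):
    species, confidence = classify(audio_window)
    if species == "background" or confidence < CONFIDENCE_THRESHOLD:
        return None  # nothing worth transmitting
    record = Detection(station_id, time.time(), species, confidence, air_temp_c)
    return json.dumps(asdict(record))  # payload for the mesh or cellular uplink

# Simulate a few recording windows on the device.
for _ in range(5):
    payload = process_window(audio_window=None)
    if payload:
        print(payload)
```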
References
1. S. Díaz et al., Science 366, eaax3100 (2019).
2. J. Travis, Am. Nat. 196, 1 (2020).
3. M. C. Dietze et al., Proc. Natl. Acad. Sci. U.S.A. 115, 1424 (2018).
4. B. J. Cardinale et al., Nature 486, 59 (2012).
5. Y. LeCun, Y. Bengio, G. Hinton, Nature 521, 436 (2015).
6. S. S. Sethi et al., Proc. Natl. Acad. Sci. U.S.A. 117, 17049 (2020).
7. R. C. Whytock et al., Methods Ecol. Evol. 12, 1080 (2021).
8. B. C. Dhar, N. Y. Lee, Biochip J. 12, 173 (2018).
9. A. P. Hill et al., Methods Ecol. Evol. 9, 1199 (2018).
10. L. Atzori, A. Iera, G. Morabito, Comput. Netw. 54, 2787 (2010).
11. W. Shi, J. Cao, Q. Zhang, Y. Li, L. Xu, IEEE Internet Things J. 3, 637 (2016).
12. M. I. Jordan, T. M. Mitchell, Science 349, 255 (2015).
13. R. Aljundi, K. Kelchtermans, T. Tuytelaars, Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 11254–11263.
14. J. R. Sauer, W. A. Link, J. E. Fallon, K. L. Pardieck, D. J. Ziolkowski Jr., N. Am. Fauna 79, 1 (2013).
15. M. Keller, D. S. Schimel, W. W. Hargrove, F. M. Hoffman, Front. Ecol. Environ. 6, 282 (2008).

Acknowledgments: Our perspective on autonomous sensing was developed with the support of the Stengl-Wyer Endowment and the Office of the Vice President for Research Bridging Barriers programs at the University of Texas at Austin, and the National Science Foundation (BCS-2009669). Comments from members of the Keitt laboratory, Planet Texas 2050, A. Wolf, and M. Abelson were invaluable in refining our ideas.
Field: Climate Change; Resources and Environment
Document Type: Journal Article
Item Identifier: http://119.78.100.173/C666/handle/2XK7JSWQ/336024
Collections: Climate Change; Resources and Environmental Science
Recommended Citation:
GB/T 7714: Timothy H. Keitt, Eric S. Abelson. Ecology in the age of automation[J]. Science, 2021.
APA: Timothy H. Keitt, & Eric S. Abelson. (2021). Ecology in the age of automation. Science.
MLA: Timothy H. Keitt, et al. "Ecology in the age of automation." Science (2021).