{"id":12,"date":"2015-05-04T11:00:45","date_gmt":"2015-05-04T11:00:45","guid":{"rendered":"http:\/\/aideproject.umh.es\/?page_id=12"},"modified":"2016-09-16T10:16:32","modified_gmt":"2016-09-16T10:16:32","slug":"descripcion-del-proyecto","status":"publish","type":"page","link":"https:\/\/aideproject.umh.es\/es\/","title":{"rendered":"Descripci\u00f3n del Proyecto"},"content":{"rendered":"<p><\/p>\n<p style=\"text-align: center\"><iframe  style=\"display: block; margin: 0px auto;\"  id=\"_ytid_11914\"  width=\"480\" height=\"270\"  data-origwidth=\"480\" data-origheight=\"270\" src=\"https:\/\/www.youtube.com\/embed\/IB4l7iYnlUw?enablejsapi=1&autoplay=0&cc_load_policy=0&cc_lang_pref=&iv_load_policy=1&loop=0&modestbranding=0&rel=1&fs=1&playsinline=0&autohide=2&theme=dark&color=red&controls=1&\" class=\"__youtube_prefs__  no-lazyload\" title=\"YouTube player\"  allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen data-no-lazy=\"1\" data-skipgform_ajax_framebjll=\"\"><\/iframe><\/p>\n<p>The AIDE concept goes beyond the current state of the art in using a novel modular multimodal perception system to customize an adaptive multimodal interface towards disabled people needs. The multimodal interface will analyse and extract relevant information from the identification of residual abilities, behaviours, emotional state and intentions of the user, from analysis of the environment and from context factors. Finally, the humanEmachine cooperative system will be designed in accordance with specific user needs. A series of applications for the AIDE system have been identified across several domains in which disabled people could greatly benefit:<\/p>\n<div>\n<ol>\n<li><b>Co<\/b><b>mm<\/b><b>un<\/b><b>ication<\/b>: The main objective is to improve the communication of severely disabled people for social autonomy. The user will be assisted in communicating with her\/his relatives and friends. 
Communication will be provided by using standard Internet services, such as email, Skype and WhatsApp, and standard social networks (e.g., Facebook and Twitter). The developed system will provide support for web browsing as well.<\/li>\n<li><b>Home Automation<\/b>: The goal is to allow severely disabled people to interact with the devices present in their smart home environments. In short, the user will be supported by the AIDE multimodal interaction system in daily activities, such as turning lights, radio and television on and off, answering or initiating telephone calls, locking or unlocking a door, opening or closing drapes, changing environmental settings, and handling medical emergency situations.<\/li>\n<li><b>Wearable robots for assisting in ADL<\/b> (activities of daily living): The goal is to adaptively and dynamically modify the level of assistance provided by the intelligent robotic exoskeleton in accordance with specific user needs.<\/li>\n<li><b>Entertainment<\/b>: Severely impaired people have reported participation in normal entertainment activities, such as playing a computer game or watching a movie, as an important need. 
Thus, a main objective is to support the user in playing computer games, expressing his\/her feelings, playing music and\/or engaging in painting, and so on.<\/li>\n<\/ol>\n<p style=\"text-align: center\"><a href=\"https:\/\/aideproject.umh.es\/files\/2015\/05\/AIDE_concept1.png\"><img loading=\"lazy\" class=\"aligncenter wp-image-17\" src=\"https:\/\/aideproject.umh.es\/files\/2015\/05\/AIDE_concept1.png\" alt=\"AIDE_concept\" width=\"527\" height=\"261\" srcset=\"https:\/\/aideproject.umh.es\/files\/2015\/05\/AIDE_concept1.png 812w, https:\/\/aideproject.umh.es\/files\/2015\/05\/AIDE_concept1-300x148.png 300w\" sizes=\"(max-width: 527px) 100vw, 527px\" \/><\/a><\/p>\n<p><a href=\"https:\/\/aideproject.umh.es\/files\/2015\/05\/EU.png\"><img loading=\"lazy\" class=\"size-medium wp-image-49 alignleft\" src=\"https:\/\/aideproject.umh.es\/files\/2015\/05\/EU-300x58.png\" alt=\"EU\" width=\"300\" height=\"58\" srcset=\"https:\/\/aideproject.umh.es\/files\/2015\/05\/EU-300x58.png 300w, https:\/\/aideproject.umh.es\/files\/2015\/05\/EU.png 494w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\n<\/div>\n<p><\/p>","protected":false},"excerpt":{"rendered":"<p>The AIDE concept goes beyond the current state of the art by using a novel modular multimodal perception system to customize an adaptive multimodal interface to the needs of disabled people. 
The multimodal interface will analyse and extract relevant information by identifying the user's residual abilities, behaviours, emotional state and intentions, and by analysing [&#8230;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":1,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_links_to":"","_links_to_target":""},"_links":{"self":[{"href":"https:\/\/aideproject.umh.es\/es\/wp-json\/wp\/v2\/pages\/12"}],"collection":[{"href":"https:\/\/aideproject.umh.es\/es\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/aideproject.umh.es\/es\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/aideproject.umh.es\/es\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/aideproject.umh.es\/es\/wp-json\/wp\/v2\/comments?post=12"}],"version-history":[{"count":0,"href":"https:\/\/aideproject.umh.es\/es\/wp-json\/wp\/v2\/pages\/12\/revisions"}],"wp:attachment":[{"href":"https:\/\/aideproject.umh.es\/es\/wp-json\/wp\/v2\/media?parent=12"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}