Computer processing of body language
A computer is conventionally operated by a person who controls it directly, issuing commands with a mouse or keyboard. Recent technology, however, may allow a computer not only to detect body language but also to respond to it. Devices are being developed and tested that could let a computer recognize and react to an individual's hand gestures, specific movements, or facial expressions.
Researchers are using mathematical models to teach computers to interpret human movements, hand gestures, and even facial expressions. This differs from the usual way people communicate with computers, namely through physical contact such as mouse clicks and keystrokes.
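For illustration, the following is a minimal sketch (not taken from any specific project described here; the coordinate scale, threshold, and gesture names are hypothetical) of how a tracked hand position over time can be mapped to a command, the kind of gesture-to-input translation such research aims to automate:

```python
# Minimal sketch: mapping a tracked hand movement to a command.
# The coordinate scale, threshold, and gesture labels are hypothetical.

def classify_swipe(positions, threshold=0.3):
    """Classify a horizontal swipe from a sequence of (x, y) hand positions,
    with x and y normalized to the range 0..1 (left to right, top to bottom)."""
    if len(positions) < 2:
        return "no gesture"
    dx = positions[-1][0] - positions[0][0]  # net horizontal displacement
    if dx > threshold:
        return "swipe right"
    if dx < -threshold:
        return "swipe left"
    return "no gesture"

# Example: a hand tracked moving from the right side of the frame to the left.
track = [(0.8, 0.5), (0.6, 0.5), (0.4, 0.5), (0.2, 0.5)]
print(classify_swipe(track))  # prints "swipe left"
```

In a real system the positions would come from a camera-based hand tracker, and the fixed threshold would typically be replaced by a statistical or machine-learning classifier trained on recorded gestures.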
MIAUCE and Chaabane Djeraba
Research of this kind is being carried out by a group of European researchers and other scientists, including a project called MIAUCE (Multimodal interactions analysis and exploration of users within a Controlled Environment), whose scientists are working to make these advances in computer technology a reality. Chaabane Djeraba, the project coordinator, stated, "The motivation of the project is to put humans in the loop of interaction between the computer and their environment."
Researchers are also trying to apply these technologies to the everyday needs of businesses and public places such as shopping malls and airports. Djeraba explained, "We would like to have a form of ambient intelligence where computers are completely hidden… this means a multimodal interface so people can interact with their environment. The computer sees their behavior and then extracts information useful for the user." The MIAUCE group has developed several real-world prototypes of computer systems that use body language as a means of communication and operation.
See also
- Emotion recognition
- Facial recognition system
- Facial Action Coding System
- Machine translation of sign languages
- 3D pose estimation
References
- Moursund, David. Brief Introduction to Educational Implications of Artificial Intelligence. Oregon: Dave Moursund, 2006. Print.
- Braffort, Annelies. Gesture-based Communication in Human-computer Interaction: International Gesture Workshop, GW '99, Gif-sur-Yvette, France, March 17–19, 1999 : Proceedings. Berlin: Springer, 1999. Print.
- Fried, Ina. "Gates: Natal to Bring Gesture Recognition to Windows Too." CNET News, 14 July 2009. Web. 18 Nov. 2010. <http://news.cnet.com/8301-13860_3-10286309-56.html>.
- Hansen, Evan. "Building a Better Computer Mouse." CNET News. CNET, 2 Oct. 2002. Web. 20 Nov. 2010. <http://news.cnet.com/2100-1023-960408.html>.
- Unknown. "How Computers Can Read Body Language." EUROPA - European Commission. 22 Oct. 2010. Web. 22 Nov. 2010. <http://ec.europa.eu/research/headlines/news/article_10_10_22_en.html>.
- Yang, Ming-Hsuan, and Narendra Ahuja. Face Detection and Gesture Recognition for Human-computer Interaction. Boston: Kluwer Academic, 2001. Print.
External links
- Computers Detecting Body Language
- Artificial Intelligence
- John McCarthy
- Subfields of Computer Science
- Online Artificial Intelligence Resource
- Computers and Gestures
- Mathematics and Computer Science
- https://web.archive.org/web/20110717201127/http://www.faculty.iu-bremen.de/llinsen/publications/theses/Alen_Stojanov_Guided_Research_Report.pdf
- http://www.physorg.com/news/2010-11-human-computer-music-links-musical-gestures.html
- Tecce, J (1998). "Eye movement control of computer functions". International Journal of Psychophysiology. 29 (3): 319–325. doi:10.1016/S0167-8760(98)00020-8. PMID 9666385.