Body Language

= Body Language = (Looking at the body as a functioning unit, as a tool for non-verbal communication)
 * For the last two projects, I have focused more on hand gestures as a form of communication. Now, I will be looking at how the body works as a unit to communicate a message non-verbally.
 * My original idea was to focus on sign language, gathering quantitative information about the hearing impaired. However, not many people have studied or recorded this area in quantifiable terms; it has mostly been examined from psychological, sociological, and similar perspectives.
 * Instead, I will focus on comparing body language between different cultures, since gestures are often confused across cultures and misinterpreted. This is especially dangerous in mass media broadcast around the world: someone in an authoritative position might intend the v-sign as a strong gesture of peace, yet have their determined facial expression read as anger in another culture.

**Main Focus: Interactions**
 * The main focus will be on creating and deconstructing facial expressions and what they mean.
 * Mouse overs, drag and drop (mix and match), drop-down menus

**Information**

It was difficult to find the different types of information (chronological, quantitative, spatial) for the type of media I was dealing with. Gestures and facial expressions can create very interesting spatial compositions, but they lack the other two types of information.

There are records of who first studied and documented facial features and expressions (Darwin), but beyond that there is not much on the subject. New technology is now advancing the field, but much of it is so recent that little information is available. This is why I chose to include examples of the emotions and expressions displayed by Kismet (a robot).

**Final Site Info:**

I was playing with the user's ability to create emotions and expressions. There are three areas/bars you can click, labeled 'Eyebrows', 'Eyes/Lids' and 'Mouth'. Clicking one changes that part of the face. Once you match up the face, information blocks appear that you can roll over to get more information about that specific emotion; the blocks only appear when the face is matched.

You can also roll over 'Non-Verbal Communication' in the top right corner to get more information on the subject, and 'Kismet?' in the bottom left corner to get more information on the robot. You can roll over these buttons at any point.

I also color-coded the faces to convey each emotion for greater understanding.
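The original piece was built in Flash, so its ActionScript is not shown here; as a rough sketch only, the match-and-reveal behavior described above (clicking a bar cycles that face part, and info blocks appear once all three parts show the same emotion) could be modeled in plain JavaScript. The class and emotion names below are illustrative, not taken from the project files.

```javascript
// Illustrative sketch of the face-matching interaction (assumed names).
const EMOTIONS = ["happy", "sad", "angry", "surprised"]; // example set

class FacePuzzle {
  constructor() {
    // Each bar cycles independently through the emotion list.
    this.bars = { eyebrows: 0, eyes: 0, mouth: 0 };
  }

  // Clicking a bar swaps in the next face part for that area.
  click(bar) {
    this.bars[bar] = (this.bars[bar] + 1) % EMOTIONS.length;
  }

  // Info blocks appear only when all three parts show the same emotion.
  matchedEmotion() {
    const { eyebrows, eyes, mouth } = this.bars;
    return eyebrows === eyes && eyes === mouth ? EMOTIONS[eyebrows] : null;
  }
}

const face = new FacePuzzle();
face.click("eyes"); // parts no longer match
console.log(face.matchedEmotion()); // null
face.click("eyebrows");
face.click("mouth"); // all parts back in sync on the next emotion
console.log(face.matchedEmotion()); // "sad"
```

The key design point is that the reveal is a pure function of the three bar states, so the info blocks never need their own bookkeeping.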

**Main Screen**:
**Clicked Middle Bar**:
[[image:emotions_clicked.png]]
**Matched 'Happy': info blocks appear**
**Rolled over one of the info blocks**
**Matched 'Sad' face and rolled over another info block**
**Messing around with faces and rolled over 'Non-Verbal Communication' for more info**
**Rolled over 'Kismet?' for more information**

Final: [|2005p_pro02_08_jabbey.swf]

**Sources:**

//Data Face//. 2008. Corel Corp., Hemera. 28 March 2008 <http://www.face-and-emotion.com/dataface/emotion/expression.jsp>.

//Facial Expression//. 2002. David B. Givens/Center for Nonverbal Studies. 26 March 2008 <http://members.aol.com/nonverbal3/facialx.htm>.

//StockXchng//. 2008. HAAP Media Ltd. 26 March 2008 <http://www.sxc.hu/index.phtml>.