2. Course
• develop gesture-based Natural User Interfaces (NUI)
• in various interactive application platforms, for sonic, musical, 2D or 3D interfaces
Friday, 25 March 2011, Week
3. Course In General
• the goal of the course is not to teach Processing, Pure Data or Unity
• but to develop applications involving bodily interaction using these programming environments and the Kinect sensor
• the prerequisite is knowing the basics of one of them; you will learn more about them by doing
4. Schedule
• 24th Mar 2011 12-14: Introduction
• 28th Mar 2011 13-15: Kinect with Processing (Joint session with Processing Club)
• 31st Mar 2011 12-14: Kinect with Pure Data (4th floor Mac Classroom)
WS WEEK (Optional Sessions on Unity)
• 11th Apr 2011 13-15: Skeletal Tracking with Processing and Pure Data
• 14th Apr 2011 12-14: Small Project Presentation/Concept Idea Presentation/Group
Formation
• 18th Apr 2011 13-15: Hands on Project Work/Tutoring
• 21st Apr 2011 12-14: Hands on Project Work/Tutoring
• 28th Apr 2011 12-14: Hands on Project Work/Tutoring
• 2nd May 2011 13-15: Final Project Presentation (Group-work)
Paja is booked for you during Weeks 16-18, between 9-17 (except 18.04.2011).
5. Small Project
• Very simple project
• Done in Processing, PD, Unity or QC
• Some interaction with body data
• Individually
• Presented in the class
6. Final Project
• Concept Development
• Sonic/Musical, 2D, 3D applications (from image viewer to media art)
• Example themes: Virtual Reality, Augmented Reality, Puppetry, Multiuser interactive environment, Augmented interactive dance performance, Gestural musical instrument, Architectural Projection
• Groups of 2 people (strongly suggested)
• Exhibition
• Public space in TaiK
• Venue outside (public space outside)
• Spring Demoday 26.05
7. Programming
• Processing?
• Pure Data?
• Unity?
• Any other?
• Level of knowledge? (0-5)
8. User Interface
the system by which
users (people) interact (communicate) with a machine.
9. Components of UI
• The user interface includes
• hardware (physical)
• software (logical) components.
• Provide a means of:
• Input, allowing the users to manipulate a system,
and/or
• Output, allowing the system to indicate the
effects of the users' manipulation.
10. Interaction
• the means by which a user inputs changes to the system, and the feedback the system supplies in response
11. Interaction
• How do you do?
• How do you feel?
• How do you know?
12. Timeline of UIs
Command-line Interface
17. Timeline of UIs
• Command-line Interface
• Graphical User Interface (WIMP)
• Post-WIMP Era
• Tangible User Interface
• Gesture-Based Interface
• Natural User Interface
19. Natural Interaction
• Experience (Human-Computer --> Human-Human)
• People communicate through
• gestures
• expressions
• movements
• People discover by
• looking around
• manipulating physical stuff
20. Why?
• Less cognitive load
• Simpler (for certain applications) than typing:
• changing slides, navigating in 3D VR
• Direct manipulation
21. Bodily Interaction
• whole body in context
• multi-modality
• are humans multi-sensory?
• user as an input modality
• direct input from the user's sensory modalities
• manipulate digital data with the body
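A minimal sketch of the last point, mapping a tracked hand position to a sound parameter. The joint coordinates, ranges, and function names here are illustrative assumptions, not a real Kinect or OpenNI API; frameworks like OpenNI typically report joint positions in millimetres relative to the sensor.

```python
# Hypothetical sketch: manipulate a digital parameter (pitch) with the body.
# The coordinate range (-1000..1000 mm) is an assumed tracking volume.

def normalize(value, lo, hi):
    """Clamp value into [lo, hi] and scale it to 0.0-1.0."""
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)

def hand_to_pitch(hand_x_mm, pitch_lo=220.0, pitch_hi=880.0):
    """Map a horizontal hand position (assumed -1000..1000 mm) to a frequency in Hz."""
    t = normalize(hand_x_mm, -1000.0, 1000.0)
    return pitch_lo + t * (pitch_hi - pitch_lo)

# Sweeping the hand from left to right sweeps the pitch from 220 Hz to 880 Hz.
print(hand_to_pitch(-1000.0))  # 220.0
print(hand_to_pitch(0.0))      # 550.0
print(hand_to_pitch(1000.0))   # 880.0
```

The same normalize-then-scale pattern applies to any joint and any output parameter, whether the target is a synth in Pure Data or an object transform in Unity.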
23. Put-that-There, 1979
• "Voice and Gesture at the Graphics Interface": a pioneering multimodal application that combined speech and gesture recognition
• Put-That-There, MIT, 1979
• Put-That-There, MIT, 1983
• Richard A. Bolt, "Put-That-There": Voice and Gesture at the Graphics Interface (pdf), SIGGRAPH '80
24. Kinect Sensor
(diagram: IR laser projector and IR camera)
25. How Does it Work
Watch the Video
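In short: the Kinect projects an IR speckle pattern and triangulates the observed disparity against a reference pattern to recover depth. A hedged sketch of the underlying triangulation formula, z = f * b / d; the focal length (~580 px) and baseline (~7.5 cm) are rough, commonly cited approximations used only for illustration, not official specs.

```python
def depth_from_disparity(disparity_px, focal_px=580.0, baseline_m=0.075):
    """Structured-light triangulation: depth z = f * b / d.

    focal_px and baseline_m are assumed approximate Kinect values.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A smaller disparity means a more distant surface: with these values,
# a 20 px disparity corresponds to a bit over 2 m.
print(depth_from_disparity(20.0))
```

Halving the disparity doubles the estimated depth, which is why depth resolution degrades with distance.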
33. Technical Specs of Kinect
• Field of View
Horizontal field of view: 57 degrees
Vertical field of view: 43 degrees
Physical tilt range: ± 27 degrees
Depth sensor range: 1.2m – 3.5m (10m)
• Skeletal Tracking System
Tracks up to 6 people, including 2 active players
Tracks 20 joints per active player
• Works in complete darkness
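From the field-of-view figures above you can estimate how wide a scene the sensor covers at a given distance, via w = 2 * d * tan(FOV / 2). A small sketch; the function names are my own, and the depth range is the 1.2-3.5 m figure from the slide.

```python
import math

def coverage_width(distance_m, fov_deg):
    """Scene width covered at a given distance: w = 2 * d * tan(FOV / 2)."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg / 2.0))

def in_depth_range(distance_m, near_m=1.2, far_m=3.5):
    """True if the distance falls inside the reliable depth range from the slide."""
    return near_m <= distance_m <= far_m

# At 2 m, the 57-degree horizontal FOV covers roughly 2.17 m of scene width,
# which bounds how much room users have to move for full-body interaction.
print(round(coverage_width(2.0, 57.0), 2))  # 2.17
print(in_depth_range(2.0))                  # True
print(in_depth_range(0.5))                  # False
```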