As part of the University of the Arts' B.S. Industrial Design department: design a cyborg-like (HCI & Human Augmentation) product or service using 'communication' as your primary focal point, with whatever skills and technology are accessible within 48 hours.*
* The premise of this project initially seemed straightforward; however, further discussion with my team of three (2 industrial designers & 1 multimedia designer) revealed our shared dislike of conceptual designs over built prototypes. I won't air my personal opinions on how each department is run, but needless to say, we upped the challenge by deciding to build a functioning prototype to show, all within the 48 hours.
Our final concept was an automated, interactive communication system to share information and ideas, and to translate nonverbal communication into a visual language. Inspiration was drawn from South Korean and Japanese music videos that depict interactive interfaces and elements, such as those below:
Our initial brainstorming session broke nonverbal communication down into categories we could build on and branch from. Some were more technical than others, so we narrowed the list to what we could realistically work on within the 48 hours.
We agreed on a system using a few of the more basic elements from the list above, and this led to the final idea.
Imagine yourself in a dimly lit room, a perfect square, large enough to fit a dozen people. Each wall has a front-facing projector capable of mapping and tracking you as you move. Think of each wall, the floor, and the ceiling as a computer screen; just cheaper and more interactive, like a touch-screen device.
By using natural gestures, you can form basic shapes or notations commonly used in Western cultures. And because the room is already quite wired up, installing Wi-Fi and Bluetooth sensors wouldn't be an issue. So rather than needing to plug in a USB drive to retrieve media, a Bluetooth-enabled device's media could be retrieved on voice command, much like Apple's Siri or AirDrop. This media could be displayed on any number of screens using gestures such as pointing with a finger or, more importantly, a whole arm (a larger blob to track), and moved to a given wall (i.e., screen).
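We never formalized this interaction in code during the 48 hours, but a minimal sketch of the idea, resolving a tracked arm's shoulder-to-hand direction into one of the four walls, might look like this; all the names and coordinate conventions are hypothetical:

```python
import math

def pointed_wall(shoulder, hand):
    """Return which wall of a square room an arm points at.

    shoulder, hand: (x, y) blob centroids in room coordinates,
    with x growing toward the east wall and y toward the north wall.
    (A real system would get these centroids from camera blob tracking.)
    """
    dx = hand[0] - shoulder[0]
    dy = hand[1] - shoulder[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360  # 0 degrees = east
    if 45 <= angle < 135:
        return "north"
    if 135 <= angle < 225:
        return "west"
    if 225 <= angle < 315:
        return "south"
    return "east"
```

Pointing straight ahead from the room's center, `pointed_wall((0, 0), (0, 2))` would resolve to `"north"`, and the media could then be routed to that wall's projector.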
Four storyboards were made in total for demoing Progesture to the department; however, they were condensed into one to save time. And because Progesture can both blob-track and projection-map, the user scenario was set at a tech conference. The gist: these conferences draw both introverts and extroverts, yet they are by nature discriminating, doing little to assist those who are introverts.
Thus we storyboarded around the idea that shapes carrying your info (an ID badge 2.0), such as your name and portfolio, could be shown around you as you walked. At times Progesture would recognize that a user was being singled out, not part of the larger crowds talking, and would herd (for lack of a better word) a crowd over to meet that singled-out introvert. At other times Progesture could detect when users were stressed, using a heart-rate (BPM) sensor and a galvanic skin response (GSR) sensor.
This was storyboarded with two users, myself and another member of our group, shown in differing colors based on projected mood. Once Progesture had motioned for us to meet and we began to calm, it would shift us to a single matched color, visible to others as a nonverbal heads-up.
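As a hypothetical illustration of that mood coloring (nothing we actually implemented in the 48 hours), a calmness value derived from the sensors could be interpolated between red and green before being projected:

```python
def mood_color(calmness):
    """Map a calmness value in [0, 1] to an (R, G, B) tuple.

    0.0 = fully stressed (red), 1.0 = fully calm (green).
    The scale and colors are invented for illustration.
    """
    calmness = min(1.0, max(0.0, calmness))  # clamp out-of-range readings
    return (int(255 * (1 - calmness)), int(255 * calmness), 0)
```

Two users who calm down together would converge on the same green, giving bystanders the matched-color cue described above.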
The opposite case was to show how Progesture would adapt to a hostile confrontation, actual or potential. Raised BPM, GSR, and vocal pitch could drive this prediction, at least in theory. Our final storyboard and demonstration to the department showcased this possibility: a professor within the department was asked to hold a mock phone conversation. Once the call became heated and her pitch and word choice crossed a predetermined 'negativity level,' Progesture would motion for the surrounding users to vacate the area, i.e., give her some 'personal space.'
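In theory, that 'negativity level' could be a simple score over how far each sensor reading deviates from a resting baseline. The baselines, weighting, and threshold below are all invented for illustration, not values we ever calibrated:

```python
def stress_level(bpm, gsr_microsiemens, pitch_hz,
                 resting_bpm=70.0, resting_gsr=2.0, resting_pitch=180.0):
    """Average the fractional rise of each signal above its resting baseline.

    Returns 0.0 when every signal is at or below baseline; grows as
    heart rate, skin conductance, and vocal pitch climb.
    """
    bpm_dev = max(0.0, (bpm - resting_bpm) / resting_bpm)
    gsr_dev = max(0.0, (gsr_microsiemens - resting_gsr) / resting_gsr)
    pitch_dev = max(0.0, (pitch_hz - resting_pitch) / resting_pitch)
    # Equal weighting for simplicity; a real system would calibrate per user.
    return (bpm_dev + gsr_dev + pitch_dev) / 3.0

NEGATIVITY_THRESHOLD = 0.25  # hypothetical cut-off

def should_clear_area(bpm, gsr, pitch):
    """True when the combined score crosses the 'negativity level'."""
    return stress_level(bpm, gsr, pitch) > NEGATIVITY_THRESHOLD
```

In the heated-phone-call scenario, rising BPM, GSR, and pitch would push the score past the threshold, cueing Progesture to motion bystanders away.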
Once all the above had been done, it left us little time to actually build the device. In ad-hoc fashion, we took what supplies we could 'borrow' from multiple departments and began integrating them into the ID department: removing ceiling panels, drilling holes in walls, and wiring up multiple projectors and webcams. I don't want to say it was a fire hazard, but near the end it probably was. As we were only able to secure two projectors within the 48 hours, we projected onto the front wall and the floor of the room, adjusting the programming accordingly.