Front-end & UX Designer

The non-verbal mood scarf

Background:

The basic idea for this interactive device comes from personal experience, and perhaps that of others as well. I make it no secret anymore that I have long-term severe depression. Given that, others often have difficulty knowing my mood, and rather than always keep them guessing, keep them at a distance, or make an announcement of it, what if it were possible to design an auto-sensing wearable HUD of sorts? After some 30 years, my poker face makes it impossible at times to read my mood through ordinary body language.

 

Code Repository:

All code used in this project can be found on GitHub at

https://github.com/GeekyMoore/moodscarf

 

Inventory List:

The items below are sold directly via Adafruit's website.
// Not including whatever tools and prototyping wire/resistors/etc. I may have used; YMMV.

thelist.jpg

Inspiration:

This project's idea came out of the need to communicate my state of mind to my girlfriend in some rapid and repeatable manner that doesn't become an annoyance. I could tell her "I'm ok, let's go do something together" or "I'm in a bad mood, I need you to keep away", but multiply that by several times a week, each month, each year, and so on. We'd experimented with using a fox and rabbit, and at times it worked; at others it didn't. Something more repeatable was needed.

Shout out to Adafruit's design:

https://learn.adafruit.com/chameleon-scarf/overview

 

 

Part 1: Lighting output and color vividness test
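For the vividness test, the usual approach is to sweep the NeoPixels through the full hue range and judge saturation through the scarf fabric. As a sketch of that idea (the function name and structure are mine, in the style of the classic Adafruit NeoPixel "Wheel" example, not necessarily what the repo does), the hue sweep reduces to mapping a 0-255 position onto a packed RGB value:

```cpp
#include <cstdint>

// Illustrative helper: map a wheel position 0-255 to a packed 0xRRGGBB
// color, in the style of Adafruit's NeoPixel "Wheel" example. On the
// Flora this value would be handed to strip.setPixelColor(); it is
// written as plain C++ here so the color math can be checked on a desktop.
uint32_t wheel(uint8_t pos) {
    pos = 255 - pos;
    if (pos < 85) {                 // red -> blue segment
        return ((uint32_t)(255 - pos * 3) << 16) | (uint32_t)(pos * 3);
    }
    if (pos < 170) {                // blue -> green segment
        pos -= 85;
        return ((uint32_t)(pos * 3) << 8) | (uint32_t)(255 - pos * 3);
    }
    pos -= 170;                     // green -> red segment
    return ((uint32_t)(pos * 3) << 16) | ((uint32_t)(255 - pos * 3) << 8);
}
```

Stepping the position by one each frame cycles every pixel through fully saturated colors, which is exactly what a vividness test through fabric needs.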


Part 2: Heart-rate (BPM) sensor via ear, output light blink rate
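Driving the blink rate from the pulse comes down to simple arithmetic: one beat at B BPM lasts 60000/B milliseconds, and the LED spends half of that on and half off. A minimal sketch (the function name and the clamping bounds are my assumptions, not values from the project):

```cpp
#include <cstdint>

// Illustrative helper: convert a beats-per-minute reading from an
// ear-clip pulse sensor into a blink half-period in milliseconds
// (LED on for half a beat, off for half a beat). Readings outside a
// plausible human range are clamped so one noisy sample can't freeze
// or strobe the scarf. The 40/200 BPM bounds are assumptions.
uint32_t blinkHalfPeriodMs(uint32_t bpm) {
    const uint32_t kMinBpm = 40;
    const uint32_t kMaxBpm = 200;
    if (bpm < kMinBpm) bpm = kMinBpm;
    if (bpm > kMaxBpm) bpm = kMaxBpm;
    uint32_t beatMs = 60000 / bpm;  // one full beat in milliseconds
    return beatMs / 2;              // on-time (and off-time) per blink
}
```

At a resting 60 BPM this gives 500 ms on / 500 ms off; on the Flora the toggling itself would be timed with `millis()` rather than `delay()` so the sensor keeps being read.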

 

Part 3: Gathering of all parts required & beginning to build


Part 4: 12 NeoPixels sewn to the Flora & testing lighting/pattern possibilities
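One candidate pattern for a ring of 12 pixels is a single-pixel "chase". The only real logic is deciding which pixel is lit at a given animation step; a sketch (my own illustration, not code from the repo):

```cpp
// Illustrative sketch of one pattern candidate: a single-pixel "chase"
// around a 12-pixel ring. Given an animation step count, it reports
// which NeoPixel should be lit; each frame, the Flora loop would clear
// the strip and light just that index.
const int kNumPixels = 12;  // matches the 12 sewn-on NeoPixels

int chaseIndex(int step) {
    // Wrap the step (including negative steps) so the lit pixel
    // walks 0..11 and then starts over.
    return ((step % kNumPixels) + kNumPixels) % kNumPixels;
}
```

The same wrap-around index works for multi-pixel variants (e.g., lighting `chaseIndex(step)` and `chaseIndex(step - 1)` for a two-pixel tail).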

 

Part 5: Adding the Adafruit accelerometer
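A common first use of the accelerometer in a wearable is a simple "is the wearer moving?" check: at rest the acceleration vector's magnitude sits near 1 g, so a large deviation from that suggests motion. A hedged sketch of that idea (the function, the units-in-g assumption, and the 0.25 g threshold are all mine, not values from the project):

```cpp
#include <cmath>

// Illustrative helper for the accelerometer step: decide whether the
// wearer is moving by comparing the acceleration vector's magnitude
// against the 1 g reading expected at rest. Inputs are assumed to be
// in g's; the 0.25 g threshold is a guess to be tuned on the device.
bool isMoving(float x, float y, float z) {
    float magnitude = std::sqrt(x * x + y * y + z * z);
    return std::fabs(magnitude - 1.0f) > 0.25f;  // deviation from rest
}
```

On the scarf, a result like this could gate or modulate the lighting patterns from the earlier parts (e.g., calmer colors while still, livelier patterns while moving).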