As a graduate student, you've probably been spending most of your days sitting at a desk. Over time, we develop a lot of bad habits without even noticing: bad sitting posture, falling asleep while working, forgetting to put on more clothes when it gets cold. DeskPal aims to improve these common behaviors by interacting with the user in their desktop workspace.
DeskPal communicates with the user mainly through different "emotional states." Transitions between states are determined by the current sensor inputs, and each state has its own actions. For example, when DeskPal senses that the user is in a bad sitting posture, it starts to vibrate with an angry face, and it only stops when the user touches it and corrects their posture (DeskPal then puts on a smiley face and stops vibrating). A second scenario: when the room starts to get cold, DeskPal turns blue and vibrates in a distinctive pattern; the user notices, puts on some clothes, and touches DeskPal to calm it down.
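The state logic described above could be sketched roughly like this. This is just an illustration of the idea, not the final design: the mood names, the sensor fields, and the 18 °C "cold" threshold are all placeholder assumptions.

```cpp
#include <cassert>

// Possible emotional states (placeholder names for illustration).
enum class Mood { Happy, Angry, Cold };

// A simplified snapshot of the sensor inputs (hypothetical fields).
struct Sensors {
    bool badPosture;   // e.g. derived from a bend sensor
    bool touched;      // capacitive touch
    float tempC;       // room temperature in Celsius
};

const float COLD_THRESHOLD_C = 18.0f;  // assumed "cold room" cutoff

// One step of the state machine: current mood + sensor inputs -> next mood.
Mood nextMood(Mood current, const Sensors& s) {
    if (s.badPosture)
        return Mood::Angry;   // bad posture always angers DeskPal, touch or not
    if (s.tempC < COLD_THRESHOLD_C && !s.touched)
        return Mood::Cold;    // cold room that the user hasn't acknowledged yet
    if (s.touched)
        return Mood::Happy;   // a touch (with posture fixed) calms it down
    return current;           // otherwise keep the current mood
}
```

On the real device this function would run once per loop iteration, with each mood driving its own combination of face, color, and vibration.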
Sitting posture – This is the part I haven't figured out yet. I was thinking of using a bend sensor on the person's neck or back, but I don't know how effective that will be, and it would also be better to have this sensor on the object itself to avoid separate parts.
Light – the pet can detect whether the ambient light is suitable for reading and jump to a different emotional state to give feedback to the user
Temperature – temperature is measured simply to give the companion a color; when it is low, the companion appears more blue-ish
Capacitive touch – a touch input can make the companion jump to a happier emotional state
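For the light sensor, the "suitable for reading" check could be as simple as a band of acceptable readings on a 10-bit analog input. The bounds here are made-up values; on real hardware they would have to be calibrated against the actual photoresistor and room.

```cpp
#include <cassert>

// Hypothetical bounds on a 10-bit analog light reading (0-1023);
// these would need calibration against a real sensor.
const int LIGHT_MIN = 300;   // dimmer than this: too dark to read
const int LIGHT_MAX = 900;   // brighter than this: glare

// True when the ambient light falls in the comfortable reading band.
bool lightOkForReading(int rawLight) {
    return rawLight >= LIGHT_MIN && rawLight <= LIGHT_MAX;
}
```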
LED matrix – outputs the emotions (facial expressions)
RGB LED – provides the overall color reflecting temperature
Vibration motor – makes it jumpy and wild
Sound – beeps or chirps in some way
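The temperature-to-color mapping for the RGB LED could be a simple linear blend from blue (cold) toward a warm white. The 15 °C / 25 °C endpoints and the two anchor colors are assumptions for the sketch, not measured values.

```cpp
#include <cassert>

struct Rgb { int r, g, b; };  // 0-255 per channel

const float T_COLD = 15.0f;  // fully blue at or below this (assumed)
const float T_WARM = 25.0f;  // fully warm at or above this (assumed)

// Map room temperature to the companion's body color.
Rgb colorForTemp(float tempC) {
    // Normalize temperature into a 0..1 "warmth" factor, clamped.
    float t = (tempC - T_COLD) / (T_WARM - T_COLD);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    // Linear blend: blue (0,0,255) at t=0 -> warm white (255,200,120) at t=1.
    return Rgb{
        static_cast<int>(255 * t),
        static_cast<int>(200 * t),
        static_cast<int>(255 + (120 - 255) * t)
    };
}
```

A smooth blend like this (rather than a hard blue/not-blue switch) keeps the color change gradual, so the companion drifts toward blue as the room cools instead of flipping abruptly.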