About the project
Supporting your speech with signs is a proven way to enable early communication and support language development from birth. But how do you teach toddlers and parents to use them? And how can a device like an iPad improve on existing methods? With these questions in mind, I set out to develop an app for exactly that, using a user-centered interface design approach.
A lot of effort went into researching early development, different learning methods, and use cases such as speech therapists who already avidly use supporting signs. I think the strength of this concept lies in letting the learning happen outside of the iPad, through the social and proven method of one-on-one teaching. The app only guides the process and triggers different parts of the brain to help children learn faster.
Another way to stimulate learning was to integrate the parent and child parts of the interface. This way, there is no distraction for either parent or child, and the necessary information is shown only when needed.
Research and user testing are key for this kind of app. You might assume, for instance, that the interactions are too simple, or that kids won't accept you taking the iPad from them.
In practice, it turns out kids are perfectly fine with handing over the iPad. It all comes down to the expectations you create in the child, and the direction in which you, as a designer, steer the use.