A tap with vision


Tappy is a smart tap that uses computer vision to detect what you are placing under it, then adjusts the flow rate and duration to suit, improving water efficiency.

Tappy was built in response to a brief to create a cyber-physical system connected to Critical Infrastructure as part of the 3AI Master of Applied Cybernetics. The team comprises Nischal Mainali (Nepali computational neuroscientist), Ned Cooper (Australian lawyer & tech strategist), Teffera Teffera (Ethiopian-American media guru), and Lorenn Ruster (Australian strategy consultant and intrapreneur).

Although originally conceived within the context of water efficiency – as a way of mitigating the threat of drought to our critical infrastructure – Tappy's applications are far more wide-reaching, particularly for people who are unable to adjust the water flow and duration of a tap as it is currently designed.

With Tappy, the tap ‘sees’ the object under it and does the rest. It learns how much water you use for particular items and adjusts accordingly, so there is no need to twist or turn a handle to get the water you need. This makes Tappy especially useful for people with disabilities.
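The source does not describe Tappy's implementation, but the idea above – a vision classifier's label mapped to a learned flow plan – can be sketched in a few lines of Python. All class names, flow values, and the learning rule here are hypothetical illustrations, not the team's actual design.

```python
from dataclasses import dataclass

@dataclass
class FlowProfile:
    rate_l_per_min: float  # flow rate in litres per minute
    duration_s: float      # how long to run the tap, in seconds

# Hypothetical learned defaults for a few object classes the camera might see.
PROFILES = {
    "hands": FlowProfile(rate_l_per_min=4.0, duration_s=20.0),
    "cup": FlowProfile(rate_l_per_min=2.0, duration_s=5.0),
    "saucepan": FlowProfile(rate_l_per_min=6.0, duration_s=30.0),
}
FALLBACK = FlowProfile(rate_l_per_min=3.0, duration_s=10.0)

def plan_dispense(detected_class: str) -> FlowProfile:
    """Map the vision classifier's label to a flow plan; unknown objects
    get a conservative fallback."""
    return PROFILES.get(detected_class, FALLBACK)

def update_profile(detected_class: str, observed_duration_s: float,
                   alpha: float = 0.2) -> None:
    """'Learn' from use: nudge the stored duration toward what the user
    actually needed (simple exponential moving average)."""
    profile = PROFILES.setdefault(
        detected_class,
        FlowProfile(FALLBACK.rate_l_per_min, FALLBACK.duration_s),
    )
    profile.duration_s = (1 - alpha) * profile.duration_s + alpha * observed_duration_s

def litres_used(profile: FlowProfile) -> float:
    """Water consumed by one dispense under this plan."""
    return profile.rate_l_per_min * profile.duration_s / 60.0
```

In a real build, `detected_class` would come from an object-detection model watching the sink, and the flow plan would drive a solenoid valve; the moving-average update stands in for whatever learning the prototype actually uses.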

You can find out more by visiting our website and watching the demonstration of the Tappy prototype below.

The Tappy Team (Left to Right): Lorenn Ruster, Ned Cooper, Nischal Mainali, Teffera Teffera

Tappy Demonstration, November 2020