Insight: ‘Elowan’ and plant augmentation, Harpreet Sareen

 

Much has been said and written about human enhancement, robotic prosthetics, cyborgs and even augmented human-animal communication. But what about plants? What if machines and biorobots interacted not only with animals and humans but with plants and other living components of our environment? This question is what Harpreet has been addressing in his research.

 

Harpreet is an interaction designer; traditionally, designers use digital products to create a conversation with humans. In recent years, however, Harpreet has been working from an analogy between the capabilities of our environment and what we do with our artificial electronics. Our view of the environment and of sustainability efforts has been largely passive, always about saving; but if we start looking at the capabilities already present in the environment, we align ourselves with its development rather than diverging from it. He calls this new type of design convergent design. Harpreet is currently an Assistant Professor of Media and Interaction Design at Parsons School of Design, New York, where he researches biological futures and their implications for interaction design. He graduated from the MIT Media Lab’s Fluid Interfaces group, where he remains a Research Affiliate, and recently appeared in the news for his work at Google with plants, a project called Project Oasis: a terrarium you can talk to via Google Home to recreate the weather of any location around the world.

 

One of his recent projects addresses the issue of plant augmentation. Elowan (the name derives from Welsh/Celtic and means “good light”) is a plant-robot hybrid driven by a plant’s natural signals. It is a cybernetic lifeform in which the plant, the living organism, controls the movement rather than the robot. It recalls the creations of Gilberto Esparza, such as Plantas Nómadas (2008-2013) or Perejil Buscando al Sol (2007).

 

Elowan is driven by the bio-electrochemical signals of plants (which is also why it is relatively easy to build plant-based sound interfaces that exploit a plant’s conductivity). Plants produce these signals naturally in response to changes in their environment: light, gravity, mechanical stimulation, temperature and wounding all trigger electrical signals very similar to those in our own bodies. The experimental setup includes electrodes inserted into the plant’s stems and into the soil. The weak signals are amplified and sent to the robot. If no such signals occur inside the plant, the robot does not move — meaning the plant is deeply integrated and has agency over the technology.
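The control loop described above can be sketched in a few lines. This is not Sareen’s actual implementation — the gain, threshold and function names below are illustrative assumptions — but it captures the principle: a weak electrode reading is amplified, and only a genuine plant signal above the noise floor translates into robot movement.

```python
# Illustrative sketch of an Elowan-style control loop (hypothetical values).
# A weak bio-electrochemical signal is sampled from electrodes, amplified,
# and only signals above a noise threshold produce a drive command.

GAIN = 1000.0      # assumed amplifier gain for the millivolt-range plant signal
THRESHOLD = 50.0   # assumed noise floor: below this, the robot stays still

def amplify(raw_signal_mv: float) -> float:
    """Amplify the raw electrode reading (in millivolts)."""
    return raw_signal_mv * GAIN

def robot_command(raw_signal_mv: float, light_direction: int) -> str:
    """Return a drive command only when the plant actually produces a signal.

    light_direction: -1 if the light source is to the left, +1 if to the right.
    Returns 'stay' when the amplified signal is under the threshold, so the
    plant, not the robot, retains agency over movement.
    """
    if amplify(raw_signal_mv) < THRESHOLD:
        return "stay"
    return "move_left" if light_direction < 0 else "move_right"
```

For example, a 0.01 mV background reading amplifies to 10, below the threshold, so the robot stays put; a 0.2 mV response to a light change amplifies to 200 and drives the robot toward the light.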

 

(Video credit: Harpreet Singh Sareen, CC 4.0; illustration/animation assisted by Elbert Tiao)

 

Throughout history, humans have domesticated certain plants, as with animals, selecting the desired species for specific traits. A few became house plants, while others were made fit for agricultural practice. From natural habitats to micro-climates, the environments of these plants have been significantly altered. As humans, we rely on technological augmentations to tune our fitness to the environment. However, this acceleration of evolution through technology needs to move from a human-centric to a holistic, nature-centric view. Elowan is an attempt to highlight the natural capabilities inside plants, how such capabilities can be used to power our artificial functions, and the need to shift technological design from a human-centric to a nature-centric view. Through this robot, we’re demonstrating a way towards a bioelectronics world, where artificial devices could be powered by natural means.


Harpreet sees Elowan as a demonstration of plants working as light sensors. That is exactly what Elowan was designed to convey: a deep integration of technology with nature. Even one small capability, such as the response of plants to light, shows how plants could be harnessed for our physical devices or for interaction purposes.

 

This leads to applications such as sensing the surrounding environment through a plant’s or a tree’s signals, or routing those signals through our interactive devices. Plants could be used as sensing platforms for monitoring their own health or minute changes in the environment, or to give rise to new organic interactive devices. Harpreet is even expanding the scope of this research through a new initiative at Parsons School of Design called Synthetic Ecosystems, which looks at interaction design with nature at large. He says he sometimes flips the user of technologies to non-humans. This leads to rethinking what design really is and to creating products for animals, plants or other organisms: something we never do, but why don’t we?

 

 

Text by CLOT Magazine (Twitter @clotmagazine)

 

 

Website www.media.mit.edu/projects/elowan-a-plant-robot-hybrid/overview/
(Photo Credit Harpreet Singh Sareen)

 

17 Dec 2018