
Hosted by

Jan Jongboom

Co-Founder & CTO - Edge Impulse


Giving your LoRaWAN device eyes and ears

Where

LoRaWAN Theatre

When

Friday 23 September 2022

11:50 am - 12:15 pm CEST


There’s lots of information in the real world that humans can easily perceive, but machines cannot. It’s trivial to see whether a parking spot is available or whether there are people in a meeting room; and it’s easy to hear gas hissing from a stove or an elephant trumpeting nearby. But our IoT devices don’t have our primary senses, so we end up building poor abstractions of them in our deployments: maybe a PIR motion sensor in the meeting room, or a gas sensor near a furnace. That sort of works, but it easily falls apart if your use case is complex (“should activate on a human, but not on a cat”) or if there’s no off-the-shelf sensor for it (“one elephant-detector I2C sensor, please”).

Thus, let’s give our LoRaWAN devices some eyes and ears! Using cheap cameras and microphones, combined with embedded machine learning, we can now make our devices understand the world the way we do: from counting the number of cars in a parking garage to hearing water leaks in pipes. And because all of this runs on the IoT device itself, it even fits within the typical power budget of a battery-powered LoRaWAN device.
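To make the approach concrete, here is a minimal, hedged C++ sketch of the pattern the abstract describes: run the ML model on the device and send only a tiny classification result over LoRaWAN, never the raw audio or image. The functions capture_audio(), classify(), and lorawan_send_uplink() are placeholders for the target’s microphone driver, the embedded inference call (for example, a classifier generated by Edge Impulse), and the LoRaWAN stack; they are assumptions for illustration, not a specific API.

```cpp
// Sketch: on-device audio classification with a compact LoRaWAN uplink.
// The three functions declared below are placeholders to be provided by the
// target board support package; they are not a real library API.
#include <cstdint>
#include <cstddef>

constexpr size_t SAMPLE_COUNT = 16000;        // 1 second of 16 kHz audio
constexpr float  CONFIDENCE_THRESHOLD = 0.8f; // only report confident detections

// Placeholder platform hooks (assumptions):
bool  capture_audio(int16_t *buf, size_t len);                  // fill buf from the microphone
float classify(const int16_t *buf, size_t len, uint8_t *label); // run the ML model, return confidence
bool  lorawan_send_uplink(const uint8_t *payload, size_t len, uint8_t port);

int main() {
    static int16_t samples[SAMPLE_COUNT];

    for (;;) {
        if (!capture_audio(samples, SAMPLE_COUNT)) {
            continue; // retry on capture failure
        }

        uint8_t label = 0;
        float confidence = classify(samples, SAMPLE_COUNT, &label);

        // Only the classification result leaves the device: a 2-byte payload
        // fits comfortably in a LoRaWAN uplink and keeps airtime and power low.
        if (confidence >= CONFIDENCE_THRESHOLD) {
            uint8_t payload[2] = { label, static_cast<uint8_t>(confidence * 100.0f) };
            lorawan_send_uplink(payload, sizeof(payload), 1);
        }

        // In a real deployment the MCU would deep-sleep between inferences
        // to stay within a battery-powered power budget.
    }
}
```

The design choice this illustrates is the one from the abstract: inference happens locally, so the radio only ever carries a few bytes of meaning (“elephant detected, 87% confidence”) instead of a stream of raw sensor data.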
