So we looked at this, and it adds a lot of weight and battery drain. There are a million micro-robots out there; believe it or not, this form factor and simplicity is what makes us a go-to tool (contrary to what you might think, first responders under a lot of stress may not want to drive something around, given the additional cognitive load).
You already need someone watching the screens, so I don't see the driving being a big increase over that. Weight and battery life don't seem like showstoppers either: there are drive mechanisms that use mass distribution or inertia to roll the body like a wheel and don't use much energy. That's especially true if you're throwing it and just need to retrieve it after a bad toss, or need to clear an additional angle.
The idea is to throw this in instead of sending a person or a dog, reducing the odds they get hurt or civilians are unnecessarily hurt. And we keep costs as low as possible to reduce the tears if one breaks (though honestly the camera is usually fine; it's pretty tough).
YES! That's the neat trick, and clever of you to get it without prompting; I figure you must have some imaging background. We're hundreds to thousands of times more efficient than traditional SIFT/SURF or other feature-detection methods, which is the only way we can do this at the edge on the camera itself (without melting it or draining the battery in five minutes).
Interesting. I assumed it was just sloppy journalism, but it sounds like you've seen this confusion before. The name didn't strike me as hard to remember.
So I commented (Bounce CEO here) with some explanations and posted a sample video of the thermal that I quickly put up on YouTube until my team can access our official account. Posting it here too: https://youtube.com/shorts/PlmG9HdzltU?feature=share
Domestically (i.e., not on exported cameras, which are restricted to 9 FPS), you can actually hit ~27 FPS reasonably easily, and with our built-in IMU that is in fact usable in flight. That said, thermal is hard enough to interpret even when stationary, so the main use case for the THERMAL variant is once it lands in the middle of a room, or goes up on a pole to search attics, etc. Our visual cameras, though, are super easy to work with in flight: if you download our app (Bounce Viewer) and go to the demo videos in History, you can see one working on the back of a dog or thrown in the air.
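If you're curious what the IMU buys you, here's a rough sketch of the general idea (illustrative Python, not our actual code, and the names are mine): once every frame carries an orientation estimate, the viewer can look up the same world-frame direction in every frame, so the view stays level no matter how the ball tumbles.

```python
# Minimal sketch of IMU-based view stabilization (illustrative only).
import numpy as np

def quat_to_rot(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def stabilized_view_dir(view_dir_world, imu_quat):
    """Map a fixed world-frame viewing direction into the camera frame.

    Because the IMU reports how the ball is oriented each frame, the
    same world direction can be rendered every frame, which is what
    makes spinning in-flight video watchable.
    """
    R_world_from_cam = quat_to_rot(imu_quat)
    return R_world_from_cam.T @ view_dir_world  # world -> camera frame

# Example: always render "north, level with the horizon" even mid-tumble.
north = np.array([1.0, 0.0, 0.0])
frame_quat = np.array([0.96, 0.0, 0.28, 0.0])  # some mid-flight pose
print(stabilized_view_dir(north, frame_quat))
```

The real work is in getting a good orientation estimate at frame rate, but the rendering side really is this simple: one rotation per view ray.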
And I mean thousands of these systems are in use around the world (of the visual version; the thermal Pit Viper was just announced), and they have been thrown through windows, down stairs, and used on ropes, poles, vehicles, dogs...
Hi! I'm the CEO of Bounce Imaging. Not sure exactly how to post here, but excited to see ourselves on here after someone sent me the link. Happy to share some sample video as requested: https://youtube.com/shorts/PlmG9HdzltU?feature=share
(I posted this quickly on my own YouTube, as I'll need my colleagues at work to put a better one up on our official page.)
As noted, our near-IR cameras (prior generations, the Recce360 Mini etc.) are indeed much easier to interpret in flight than thermal, just because panoramic thermal is harder to get your head around. If you want to see full 360 video of that, download our app (Bounce Viewer), scroll down to the demo videos in the History section, and you can pan around as the camera is thrown in the air or riding on the back of a dog. Note that the stabilization is along multiple axes!
Indeed, Jonas' setup was pretty cool, but this had actually been tried many times before that, including cool designs by the Brits, the US Navy, and others from decades ago, and a conceptual design by Franziska Faro (spelling, sorry!). All of them, however, ran into the same challenge: real-time, low-latency processing with automatic stabilization in flight, without melting the camera through too much processing. The way we cracked it, first for our visual cameras and now for thermal, is a stitching method that is 200x-2000x more efficient and noise-insensitive because it is not based on SURF/SIFT feature detection (if you're into the nerdy side of things).
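For the nerds, here's a toy illustration of why skipping feature detection matters (my own back-of-the-envelope Python, not our shipping code; the camera model is faked). With rigidly mounted lenses and a one-time calibration, the pixel-to-panorama mapping is fixed, so it can be precomputed as a lookup table; each frame is then stitched with plain memory gathers instead of per-frame feature detection, matching, and RANSAC.

```python
# Hedged sketch: calibration-driven stitching via precomputed lookup
# tables. Illustrative only; sizes and the "camera model" are made up.
import numpy as np

PANO_H, PANO_W = 512, 1024   # output equirectangular panorama
CAM_H, CAM_W = 480, 640      # one fisheye sensor (illustrative sizes)

def build_lookup_table(cam_model):
    """One-time step (e.g., factory calibration): for every panorama
    pixel, record which source pixel it comes from."""
    ys, xs = np.mgrid[0:PANO_H, 0:PANO_W]
    # A real cam_model would project panorama rays through the lens
    # geometry; here we fake it with a fixed warp to show the data flow.
    src_y = (ys * CAM_H // PANO_H).astype(np.int32)
    src_x = (xs * CAM_W // PANO_W).astype(np.int32)
    return src_y, src_x

LUT_Y, LUT_X = build_lookup_table(cam_model=None)  # computed once, reused

def stitch_frame(sensor_image):
    """Per-frame cost is a single gather: O(pixels), no features,
    no matching, no RANSAC, and it is insensitive to image noise."""
    return sensor_image[LUT_Y, LUT_X]

frame = np.random.randint(0, 255, (CAM_H, CAM_W), dtype=np.uint8)
pano = stitch_frame(frame)
print(pano.shape)  # (512, 1024)
```

That fixed per-frame cost is the whole game at the edge: a gather per pixel fits a small embedded processor's power budget, where repeated feature extraction would not.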
Have you looked at producing 3D reconstructions over the thrown trajectory? And/or something like a Gaussian-splat-based representation for viewing the whole trajectory at once?