cobright
DIS Veteran
- Joined: Jan 6, 2013
- Messages: 2,760
Okay, I'm not claiming the next great thing, but I built something that I think is pretty cool and is also maybe a bit much, or controversial. I'm not a social creature, so I would appreciate any unvarnished reactions...
Last month a friend's 7-year-old child was involved in an attempted stranger abduction on the way to school. The man was caught, and the information coming out about the guy is horrible. I won't get into specifics because they would 'out' some of the people involved, and they aren't really important. Except for one: the guy had been following this kid for days before the attempted abduction.
Kid's mom, my friend, is very messed up over this and has taken to what some may call extreme levels of protecting the kiddo from the outside world.
So this, and the arrival of a new computer board, got me thinking. The board is called the Jetson Nano; it's tiny and was developed to make AI, machine learning, and computer vision easier to build, and boy does it. What it's really good at is taking a camera feed, identifying specific things in that video, and also, to varying extents, figuring out what those things are doing. Think of the new cameras that identify faces, can tell when you're smiling, and can tell when you blink so they avoid snapping the shutter then. Think of self-driving cars avoiding pedestrians. That's the sort of thing this little computer is designed to do.
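For the curious, the "who is looking, and for how long" part boils down to simple per-frame bookkeeping once the vision side has done its job. Here's a rough sketch of just that accumulation step; the face/gaze detector itself is assumed to exist upstream, and the frame rate and names are placeholders, not my actual code:

```python
from collections import defaultdict

FPS = 15.0  # assumed camera frame rate, frames per second


def accumulate_watch_time(frames):
    """frames: one list per video frame of (face_id, is_watching) pairs,
    as produced by an upstream face/gaze detector (not shown here).
    Returns total seconds each face spent looking at the wearer."""
    seconds = defaultdict(float)
    for detections in frames:
        for face_id, is_watching in detections:
            if is_watching:
                # each frame where the face is watching adds 1/FPS seconds
                seconds[face_id] += 1.0 / FPS
    return dict(seconds)


# 30 frames (~2 s): face "A" watches the whole time, "B" glances once
frames = [[("A", True), ("B", i == 0)] for i in range(30)]
print(accumulate_watch_time(frames))
```

Totals like these are what separate "more than a fleeting glimpse" from background passersby.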
So here's what I built...
- It's a 4"x4"x1.5" box that fits into kiddo's backpack.
- It connects to a 1"x1"x1" 360° camera that snaps onto the top of the right backpack strap.
- It records kiddo's GPS location and sends it to kiddo's mom's computer or phone (this is nothing new). There is also a button in the strap that kiddo can press to send an alert to mom and begin storing her current location on a cloud server.
- It views the area around kiddo, identifies faces, and stores those faces. It also observes those faces and determines whether they are watching kiddo and, if so, for how long.
- Each night, we go through the list of faces of people who were looking specifically at kiddo for more than a fleeting glimpse. Anyone we know, we tag as such.
- Anyone who happens to be in the same-ish location at the same-ish time, we flag as such.
- This step goes on for several weeks.
- The computer learns kiddo's normal routine and many of the people who she will encounter on a normal day.
- The computer can identify a lot of mitigating behavior, like someone looking at kiddo followed by kiddo turning around to face the person and having a conversation. It can also identify someone watching her who turns away every time kiddo turns to look at them.
- So now I'm testing this, and I have written an algorithm that pretty reliably identifies when a stranger is watching me, whether he's watching me over the course of several days or weeks or showing up in unusual locations, and whether he's watching me but doesn't wish to be seen doing so.
- The algorithm can alert me if I'm being followed, or if someone just happens to be where I am far more often than normal. It's a 360° cam, so it can watch everyone around me all the time. It compiles reports identifying suspicious persons, actively seeks those people out, and geo-locates them in a report.
- Not relevant to this particular use-case, but images of a particular person, like a non-custodial parent or known neighborhood sex offenders, could be added and flagged for immediate alert.
- So far in my testing (me wearing the backpack), the computer does a great job of observing and categorizing the people around me as not suspicious, because 99.999% of the people I walk by every day are not suspicious. But when I test it with friends playing "strangers", it does a great job of flagging new faces and attaching appropriate risk flags based on certain behaviors.
- New machine learning software modules propose "identifying human intent" by observing things like posture, face, and gesture. Essentially, picking people out of a crowd that "act guilty". My current system doesn't use anything this advanced but it would not take much to integrate new learning into my current software.
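To give a flavor of what the nightly review and flagging steps above reduce to, here's a sketch of the scoring logic: recurring unknown faces, faces showing up in many distinct places, and covert watchers who avert their gaze. The thresholds, field names, and flag wording are illustrative placeholders, not the actual code:

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Sighting:
    face_id: str
    day: int        # which day the sighting occurred
    place: str      # coarse location bucket (e.g. a grid cell)
    watching: bool  # looking at the wearer for more than a glance
    averted: bool   # looked away whenever the wearer turned toward them


def flag_suspicious(sightings, known_faces, min_days=3, min_covert=2):
    """Flag unknown faces that recur across several days, turn up in
    several distinct locations, or repeatedly avert their gaze."""
    days = defaultdict(set)    # face -> days seen watching
    places = defaultdict(set)  # face -> distinct locations seen watching
    covert = defaultdict(int)  # face -> count of gaze-aversion events
    for s in sightings:
        if s.face_id in known_faces:
            continue  # already tagged during the nightly review
        if s.watching:
            days[s.face_id].add(s.day)
            places[s.face_id].add(s.place)
        if s.averted:
            covert[s.face_id] += 1

    flags = {}
    for face in set(days) | set(covert):
        reasons = []
        if len(days[face]) >= min_days:
            reasons.append("recurring over %d days" % len(days[face]))
        if len(places[face]) >= min_days:
            reasons.append("seen in %d distinct places" % len(places[face]))
        if covert[face] >= min_covert:
            reasons.append("averted gaze %d times" % covert[face])
        if reasons:
            flags[face] = reasons
    return flags
```

Tagged family, friends, and teachers fall out of the report immediately, so what's left each night is a short list of genuinely unfamiliar, repeat, or covert watchers.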
Kiddo's mom wants it. Now I'm having doubts. In the state I live in, as well as the state my friend lives in, there are no legal privacy issues. The device geo-locates and can be told to turn off within school property. Even audio can be saved, as both states are single-party consent states.
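The school shutoff is just a geofence check on the GPS fix: compute the great-circle distance to the school and disable recording inside a radius. A sketch, with placeholder coordinates and radius:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6371000.0


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))


SCHOOL = (40.7580, -73.9855)  # placeholder coordinates, not the real school
SCHOOL_RADIUS_M = 150.0       # placeholder geofence radius


def recording_allowed(lat, lon):
    """Camera stays off whenever the wearer is inside school grounds."""
    return haversine_m(lat, lon, *SCHOOL) > SCHOOL_RADIUS_M
```

Each new GPS fix runs through `recording_allowed` before a frame is stored, so nothing is captured on school property.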
I can't think of a solid reason not to put the gizmo into use. Letting her have it will give the kid a little bit of normalcy back: being able to go off with friends, walk to school, and such. What's holding me back is this mantra I've been repeating since I started playing with machine learning: "Do we want Skynet? This is how we get Skynet."
Thoughts? Good outweighs creepiness? Vice-versa?