"Keep absolutely still - its vision is based on movement!"
It's a classic scene from Jurassic Park: Sam Neill as Dr Alan Grant saves Jeff Goldblum's Dr Ian Malcolm. When one of our dev advocates heard that line, he asked himself: "I wonder what that would actually look like?"
He was so preoccupied with whether or not he could do it that he didn't stop to think if he should.
So he devised an edge detection effect, using Augmented Reality and Virtual Reality. He generated the view by calculating the difference between two frames of a live video stream. Both frames are converted from colour to greyscale, and then each pixel of each image is checked. If no significant change has occurred, the pixel is coloured black. But if a significant change is detected, the pixel turns red - exactly the colour for a bloodthirsty dinosaur to see.
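The frame-differencing idea described above can be sketched in a few lines of Python. This is a minimal illustration, not the advocate's actual implementation: the function names, the pixel representation (nested lists of RGB tuples), and the change threshold are all assumptions made for the example.

```python
# Minimal sketch of the frame-differencing effect described above.
# Frames are rows of (r, g, b) tuples; THRESHOLD is an assumed value,
# not taken from the original app.

THRESHOLD = 25  # minimum greyscale change that counts as "movement"

def to_grey(frame):
    # Convert colour to greyscale using standard luminance weights
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in frame]

def diff_frames(prev, curr):
    # Compare each greyscale pixel between two frames:
    # red where significant change (movement), black where still
    grey_prev, grey_curr = to_grey(prev), to_grey(curr)
    out = []
    for row_a, row_b in zip(grey_prev, grey_curr):
        out.append([(255, 0, 0) if abs(a - b) > THRESHOLD else (0, 0, 0)
                    for a, b in zip(row_a, row_b)])
    return out

# Example: a 1x2 image where only the second pixel changes
frame1 = [[(0, 0, 0), (200, 200, 200)]]
frame2 = [[(0, 0, 0), (0, 0, 0)]]
result = diff_frames(frame1, frame2)
# → [[(0, 0, 0), (255, 0, 0)]]  (still pixel black, moving pixel red)
```

In a real app the same per-pixel comparison would run on full camera frames, typically via a library such as OpenCV rather than nested Python lists.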
Our adventurous advocate is a member of the IBM Developer Ecosystems Group. He and his colleagues provide inspiration, education and content to help every developer be successful. Come meet him and other members of the team, and see how they can inspire and assist you. You can ask him about his AI-enabled picture frame, his AI-powered word-counter, and his bottomless hard drive.
Footnote: bonus points if you can work out what the dinosaur is seeing in our main image. It's an auditorium at a hackathon at which the app was shared.
Study Buddy was developed with IBM Watson Natural Language Understanding, IBM Watson Studio, and IBM Watson Knowledge Studio.
Come and see Study Buddy helping to school a human companion, and talk to its inventor about the theory and technology behind the idea. It's bound to be a demo you'll remember.