Artificial Intelligence: A Future of Integrations



Artificial Intelligence (AI) is the hottest technology disruption of 2017. Integrated machine components will soon enable a full analysis-and-decision process to run on coin-sized hardware. Until that becomes reality, however, the complicated analytical tasks of AI will continue to require a great deal of compute power. Ben Vigoda and his team at Gamalon Machine Intelligence have made powerful strides by bringing analysis and decision processes down to a laptop-sized compute unit, and services like Amazon Rekognition apply the same concepts so that rich AI capabilities can be used across a wide variety of AI projects.

Amazon Web Services (AWS) announced Rekognition at AWS re:Invent 2016. Not intended for instant analysis, Rekognition provides a web service API for incorporating sophisticated deep-learning image analytics into applications. For example, an application might be programmed to find a specific face in a crowd and estimate the person's age, gender, and mood. From that analysis, the application might conclude that this person enjoys large parties with 20+ attendees but isn't fond of small parties with fewer than 10 attendees.
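As a rough sketch of what such an integration can look like, the snippet below parses the kind of face details Rekognition's DetectFaces API returns. The live API call is shown only in comments, since it requires AWS credentials, the boto3 SDK, and an image of your own; the sample response dict is illustrative, not real output.

```python
# Sketch of reading face attributes from Amazon Rekognition's DetectFaces API.
# The live call requires AWS credentials and the boto3 SDK:
#
#   import boto3
#   client = boto3.client("rekognition")
#   with open("crowd.jpg", "rb") as f:
#       response = client.detect_faces(Image={"Bytes": f.read()},
#                                      Attributes=["ALL"])

def summarize_face(face):
    """Condense one FaceDetails entry into age range, gender, and top emotion."""
    age = face["AgeRange"]
    # Emotions come back with confidence scores; keep the most confident one.
    mood = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f"{age['Low']}-{age['High']}",
        "gender": face["Gender"]["Value"],
        "mood": mood["Type"],
    }

# Sample payload shaped like Rekognition's response, for illustration only.
sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 26, "High": 38},
            "Gender": {"Value": "Female", "Confidence": 99.1},
            "Emotions": [
                {"Type": "HAPPY", "Confidence": 93.4},
                {"Type": "CALM", "Confidence": 4.2},
            ],
        }
    ]
}

for face in sample_response["FaceDetails"]:
    print(summarize_face(face))
    # → {'age': '26-38', 'gender': 'Female', 'mood': 'HAPPY'}
```

Each `FaceDetails` entry also carries bounding boxes, landmarks, and pose data, which is what would let an application track a specific face in a crowd.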

When our current tiny processor capabilities are used in conjunction with web API services like Rekognition, we produce powerful and rich AI application experiences, much like those of Amazon Alexa. Tiny integrated components easily conduct the analytics and decisions required for immediate action, while web services assess the details and build a smarter cache of knowledge within the tiny localized solutions.

A Theoretical Use Case for Tiny Components and Web Services

A theoretical example of this design principle is an autonomous vehicle driving down a road. What happens when a small animal begins crossing the street 50 feet ahead of the vehicle? If the vehicle's components do not detect the animal, confirm it, and change the forward velocity, damage of some kind is likely. It could take a web service 30 seconds or more to receive an image from the vehicle, process it, and return the result that the small animal is a dog. The vehicle, however, doesn't need to know that the animal is a dog, only that there is an obstruction in the road.


The tiny components recognize that there is an obstruction in the road, slow the vehicle, and instantly send a picture of the obstacle to web services for more detailed analysis. As the vehicle collects information, it begins to recognize obstructions using the web services' detailed analysis, and so does every other car linked to the same AI. Each car then has a higher likelihood of avoiding the same mistake again, a feedback loop that would take a human driver years to match. These details allow the tiny components to distinguish living from inanimate obstructions, size classifications, type and subtype classifications, and directional velocity. The tiny components then learn to decelerate for dogs crossing the road, stop for trees in the road, and just brake and hope for the best for squirrels, because there's no supercomputer on the planet that can predict what those little guys are going to do.
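The split described above can be sketched in a few lines: a local controller reacts immediately to any obstruction without waiting on the network, while the cloud's classification arrives later and enriches a local knowledge cache so the next reaction is more specific. Every class, method, and reaction name here is hypothetical, not a real vehicle API.

```python
# Illustrative sketch of the local-fast-path / cloud-enrichment pattern.
# All names (ObstacleController, on_obstruction, etc.) are hypothetical.

class ObstacleController:
    def __init__(self):
        # Local cache of reactions learned from cloud analysis.
        self.known_reactions = {}

    def on_obstruction(self, signature):
        """Fast local path: act immediately, never block on the network."""
        # Use a learned, specific reaction if the cloud has taught us one;
        # otherwise fall back to the safe default.
        action = self.known_reactions.get(signature, "slow_down")
        self.upload_for_analysis(signature)  # asynchronous in a real system
        return action

    def upload_for_analysis(self, signature):
        # Placeholder: in a real system this would send the sensor image
        # to an image-analysis web service without blocking the vehicle.
        pass

    def on_cloud_result(self, signature, classification):
        """Slow cloud path: enrich the local cache for next time."""
        reactions = {"dog": "decelerate", "tree": "stop", "squirrel": "brake"}
        self.known_reactions[signature] = reactions.get(classification, "slow_down")

car = ObstacleController()
print(car.on_obstruction("blob-42"))   # unknown obstruction → slow_down
car.on_cloud_result("blob-42", "dog")  # cloud later classifies it as a dog
print(car.on_obstruction("blob-42"))   # learned reaction → decelerate
```

The design choice that matters is that the fast path always returns an action from local state; the cloud round trip only ever improves future decisions, never gates the current one.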

AI is already being incorporated into the world as we experience it today. Manufacturing and autonomous activities are easy targets for AI and machine learning; the only boundaries and limits are in our imagination. What remains is the creative design work of combining component-level processing with web-service integrations, where more robust and scalable compute is available. The technology necessary to make a significant impact already exists. It's time to create! What are you working on right now? Do you see a bright future for AI?

Always be in the Know, Subscribe to the Relus Cloud Blog!