As I write this blog post it’s Back to the Future day. You can’t escape it; people are talking about it all over social media. Looking at what the movie predicted got me thinking about technology and, of course, a favorite subject of mine: robotics and AI. The flying cars of Back to the Future may still be a little way off, but we do have the partially self-driving Tesla, and video conferencing, pure imagination back then, is now commonplace. We’re even about to see the launch of the first true hoverboard. Is it really going to be that long before robots are an integral part of our lives, just like the robotic waiters in Back to the Future Part II?
We’re already a good part of the way there. I was really interested in the Indiegogo campaign for the Buddy robot, which was hugely successful, reaching an impressive 617% of its funding goal. People have a fascination with AI and the potential of having robots in their lives on a regular basis. That’s why previous crowdfunding campaigns, such as those for Jibo, Personal Robot and Homey, have been so successful. Do we sometimes fall in love with the idea of the robot rather than its functionality, though?
Looking at the main types of AI system
As a starting point to answering this question, I wanted to take a look at the different AI systems in use, and what they have to offer us as human beings.
Software based personal AI systems
Smartphone users have access to assistants such as Siri, Cortana and Google Now, and these apps are already an integral part of many people’s lives. You can check the weather, get directions, make payments and set reminders. These assistants are becoming a user’s primary interface with their device, one through which they can control other apps and devices.
It seems that we’re accessing an ever-increasing amount of functionality from something we can hold in one hand. AI enables us to interact with our devices in an intelligent manner. Your device knows your movements; it knows where you like to eat and what you like to buy. Through AI, your smartphone becomes your personal companion.
Stationary robotic AI systems
They may not look exactly like the vision of a robot you have in your head. Thanks to the movies, when many people think “robot” they picture R2-D2 or C-3PO, but these stationary devices are still robots, and pretty smart ones at that. They may look more like household appliances, or even decorations, but they enable you to control temperature and security, play music, make video calls and even learn new things. You have a household companion that, through the use of AI, can interact with all of your home’s smart controls, including your security alarms. It can create music playlists for you and remind you about important events in your life. Here are a few examples of this type of robotic device.
- Jibo – Looking a little like some future world South Park character, Jibo was successful in raising almost $4 million on Indiegogo. It’s designed to be an all-round household companion, with hi-res cameras, speech and hearing functions and AI algorithms that help it learn your requirements.
- Amazon Echo – It looks like nothing more than a large cylinder, but Amazon Echo gives you full control over your smart devices and answers your questions.
- Homey – Looking a bit more futuristic, the Homey orb is a multi-functional personal assistant that enables full interaction with the smart devices in your home. Temperature control, entertainment selection and home security can all be at your fingertips.
So it seems that AI robotics has become a familiar part of life without us even really noticing. We just accept that technology continues to make our lives easier. We’re not quite at the stage of the robotic waiters of the Back to the Future world, but next we’ll look at how AI robots are becoming more mobile, and ask how much that mobility actually adds to the experience.
Here comes Buddy
Now that the crowdfunding campaign for Buddy is over, he’s not far from making his first real-world appearance, with early shipping scheduled for April 2016. The thing about Buddy, and other AI robots such as Personal Robot and Pepper, is that they move. They also look a little more as though they’re interacting with you.
There’s no doubt about it, Buddy is very cute, with a screen in his head that displays facial features and can switch to a video screen for video calling. The actual AI functionality is much the same as that of the static devices, though. Buddy may feel more “real,” so you’re more likely to interact with him naturally, but we’re not quite at the WALL-E stage yet.
It’s when we reach that stage that we’ll really see the benefit of mobility in AI robots. If we’re simply using a robot to control smart devices or make video calls, mobility doesn’t add anything to the equation. But if we want a robot butler or maid that does all of the housework, it’s a different story: then mobility is key to performance.
Integral to our lives
There’s little doubt that robotics and AI are already becoming integral to our lives. Use of the first two categories of robotic AI technology will continue to develop in the near future. Mobile robots have a little way to go before they match the predictions of Back to the Future, but they’ve made a start. Maybe we do love them a little more for the idea of them than for their functionality right now, but that will change. I’m sure Buddy and his descendants will eventually become just as much a part of everyday life as smartphones are today. In the meantime, I’ll just spend my time having fun with my Buddy companion.