In the next decade I am pretty sure we will be interacting with objects by touching them: no longer switches and knobs, but surfaces that look like screens. We have got used to touching a tablet screen, which is what I am doing right now as I write this post, and touching surfaces to interact with an object will become just as natural.
A few technologies can support this kind of interaction. One was showcased by Sony at MWC in March 2017 and will become available as a product in the first half of 2017.
It is based on a projector with an integrated IR sensor. The projector projects an image onto a surface (covering a rectangle roughly equivalent to a 30" screen), and the sensor detects the presence of your hand or finger at a specific location, as well as the gesture you are making. Software that comes with the projector lets an application understand what you are doing and react accordingly. The effect is that of using a gigantic tablet: you are actually touching the light. There are plenty of possible applications, basically all the ones we use today on a tablet, plus new ones tied to the type of object.
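The pipeline described above (the sensor reports a finger position, software turns a sequence of positions into a gesture, the application reacts) can be sketched roughly as follows. This is a hypothetical illustration, not Sony's actual SDK; every name here (`TouchEvent`, `classify_gesture`, `on_gesture`) is invented for the example.

```python
from dataclasses import dataclass

# Hypothetical event reported by the IR sensor: a finger position
# within the projected rectangle, plus a timestamp.
@dataclass
class TouchEvent:
    x: float  # horizontal position, 0.0-1.0 of the projected width
    y: float  # vertical position, 0.0-1.0 of the projected height
    t: float  # time in seconds

def classify_gesture(events):
    """Very rough gesture classifier over a short touch sequence.

    A real system would track multiple fingers and filter sensor
    noise; here we only distinguish a tap from a horizontal swipe.
    """
    if not events:
        return "none"
    dx = events[-1].x - events[0].x  # horizontal travel
    dt = events[-1].t - events[0].t  # elapsed time
    if abs(dx) > 0.2 and dt < 0.5:   # fast horizontal motion = swipe
        return "swipe-right" if dx > 0 else "swipe-left"
    return "tap"

def on_gesture(gesture):
    """The application layer maps the interpreted gesture to an action."""
    actions = {
        "tap": "select item under finger",
        "swipe-left": "previous page",
        "swipe-right": "next page",
    }
    return actions.get(gesture, "ignore")
```

For instance, a finger moving from x = 0.1 to x = 0.5 within 0.3 seconds would be classified as a swipe to the right, and the application would turn to the next page.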
The projector, the Xperia Touch, will be marketed at $1,500.
What I find interesting is the transformation of our perception of objects into entities we can interact with in a much more seamless way.
Down the road, in the next decade, I can imagine smart materials transforming any object's surface into an interactive interface.