To establish a symbiotic “relation”, two systems need to exchange “information”. The types of information exchange vary widely, as do the means supporting the exchange and the actual protocol for the exchange. Furthermore, the exchange can be direct (from one system to the other) or indirect, mediated by another system (or the ambient environment).
Foxgloves and bumblebees live in a sort of symbiotic relation: the former need the latter for pollination, and the latter need the former as “food”. The random chance of evolution brought these two different species into a symbiotic relationship (even though it is likely that neither realizes the importance of the other).
Depending on the systems involved, specific interactions are needed. A swarm of robots may interact using direct communications (like Bluetooth) or indirect communications, as happens in swarms of bees or flocks of birds, by following a specific set of rules enforcing a distance from one another. In the case of a robotic swarm this can be achieved, as an example, by proximity sensors or by analyzing images streamed by cameras giving “sight” to each robot. In Nature, as is the case for autonomous systems today, the communication is indirect.
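The distance-keeping rule mentioned above can be sketched in a few lines. The following is a minimal, hypothetical illustration (function names and parameters are assumptions, not taken from any real swarm framework): each robot computes a repulsive velocity away from neighbors detected closer than a minimum distance, which is all that indirect coordination requires.

```python
import math

def separation_velocity(position, neighbors, min_dist=1.0, gain=0.5):
    """Compute a 2D velocity steering a robot away from any neighbor
    closer than min_dist (a simple 'keep your distance' swarm rule)."""
    vx, vy = 0.0, 0.0
    for nx, ny in neighbors:
        dx, dy = position[0] - nx, position[1] - ny
        dist = math.hypot(dx, dy)
        if 0 < dist < min_dist:
            # Push away along the line between the two robots,
            # with a strength that grows as they get closer.
            vx += gain * (dx / dist) * (min_dist - dist)
            vy += gain * (dy / dist) * (min_dist - dist)
    return vx, vy

# A robot at the origin with one neighbor too close on its right
# is pushed to the left (negative x direction).
v = separation_velocity((0.0, 0.0), [(0.5, 0.0)], min_dist=1.0)
print(v)  # -> (-0.25, 0.0)
```

No message passing is needed: each robot reacts only to what its proximity sensors (or cameras) report, which is exactly the indirect communication described above.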
In a more distant future, and 2050 may be a reasonable threshold, autonomous systems might have the capability to establish a direct communication with other autonomous systems and negotiate a joint activity to pursue a goal. This is a tough challenge, since it basically requires the capacity to create a language to convey meaning.
In the case of human-to-artifact symbiosis, the communication happens by design. An implant is designed to become aware of the body's situation (for the specifics that matter) and react accordingly. The first artificial pancreas for insulin delivery was approved by the FDA in September 2016, and clinical trials opened in February 2017.
More sophisticated examples are provided by prosthetics that interact with muscles or nerves to mimic the functionality of the replaced body part. As an example, sensors pick up electrical signals from the arm and use them to control a prosthetic hand. More sophisticated, recent prosthetics interface with the brain, receiving commands and feeding back sensations.
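To make the muscle-signal example concrete, here is a deliberately simplified sketch of the processing chain: rectify and smooth the raw electrical (EMG-like) signal into an activity envelope, then map the activity level to a hand command. All names, thresholds and the two-state command are illustrative assumptions, not how any actual prosthetic controller works.

```python
def emg_envelope(samples, window=5):
    """Rectify raw EMG-like samples and smooth them with a moving
    average, yielding a muscle-activity envelope (illustrative only)."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        env.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return env

def hand_command(activity, threshold=0.3):
    """Map the activity level to a hypothetical two-state hand command."""
    return "close" if activity > threshold else "open"

# A quiet baseline followed by a burst of muscle activity.
signal = [0.01, -0.02, 0.5, -0.6, 0.55, -0.5, 0.02]
env = emg_envelope(signal)
print(hand_command(env[4]))  # -> close
```

Real controllers use far richer feature extraction and pattern recognition, but the principle is the same: the artifact is designed to turn a bodily signal it can sense into an action.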
There is even an Open Hand Project to stimulate innovation in this area and dramatically decrease prosthetic cost (a prosthetic hand may cost up to $100,000).
Notice that today, in the case of brain-computer interactions, the artifact is designed to “speak” a certain language: to pick up certain data through sensors and to process them using a logic, signal processing, that is getting more and more sophisticated. However, a good portion of the communication “meaning” is managed by the brain which, by experiencing the behavior of the artifact in consequence of what the brain does, rearranges itself (learns) to provide the signaling leading to the desired result.
This is an area where research on signal processing, languages and semantics needs to, and likely will, make significant progress.
The likelihood of having an artifact connected to the brain and immediately “speaking” its language is slim, even over a long timeframe. There might be specific situations, and specific interfaces, where this will become possible, like the interfacing of a camera with the optic nerve or the interfacing of an artificial limb, but in general interfacing with what a brain “thinks” is well beyond our observation horizon.
This goes both ways, so do not expect to be able to “download” data onto your brain in the next decades. Of course, interactions mediated by our senses will become better and better, and this will often result in seamless communications and hence in stronger symbioses.
At the physical level, it may be worth noting that in the coming decade we may move from communications based on the decoding of electrical fields created by electrons (which is what happens in our electronic artifacts) to the decoding of electrical fields created by protons (protonics). The latter promises to be much more accurate, being able to capture the electrical activity of a single neuron (dendrite and axon). The technology for using protons rather than electrons works in prototypes but is still far from industrial products.
The first results in this area go back to the end of the last decade, with the creation of the first transistor working on protons rather than electrons. More recently a further step was taken, again at Washington University, in collaboration with Yale, Pittsburgh and Leipzig universities, by understanding the mechanism of proton movement in water, which is at the basis of electrical communication in living cells.
Going back to artifacts interacting with other artifacts and with the ambient environment, significant work is going on, and will progress, in the area of 3D sensing. Interesting in this respect is the NASA roadmap on Robotics and Autonomous Systems (area 4.1.1). These sensors will provide more, and more precise, data that can increase the awareness of the artifact(s) and its capability to interpret the “intention” of the other interacting autonomous system (including interaction with a human being). This is a first step in increasing the intelligence of the system itself, of its interactions and of the symbiotic relationship.