The result is that we are exposed to content that aligns with how we already think. So rather than being challenged by different lines of thought that force us to reconsider our points of view, those points of view are reinforced. Gaps between groups become wider and harder to bridge because the common ground is lost.
Taking it a few steps further, we can envision a world where content is curated at the sentence level to match audience preferences. Imagine reading an online article in which every sentence is devised to match your preferences in tone, style, political alignment, biases, etc.
Human thought has a binary nature. Right and wrong. Up and down. Forward and backward. Thought evolves through polarities. What happens to thought once you minimize the dialogical nature of ideas and turn them into monologues, with each idea further corroborating past behavior, tendencies, and biases? I don't know what happens, but it certainly does not look like something positive to me.
In sharing our lives with other humans, we run into conflicts constantly. Unlike AI, people are not consistently able to predict what we are going to do next. As AI evolves, predicting where we are going before we even type an address into a map navigator, or what we want to buy for our home before we even begin a search in a marketplace, we become more and more unaccustomed to having to explain how we think, what we want, and why we want it. AI can create the illusion that the world is made to the taste of each individual. Psychologically, this could easily foment social withdrawal and a lack of desire to engage with others, even more so in adversarial situations.
No need to drive because the car will get you there. No need to enter the destination because it already knows, based on your previous behavioral patterns, calendar, searches, and other data points. We have not seen the full ramifications of this reality, but the technological conditions for this scenario are already in place. What's left?
As we disengage from manual tasks, there is a lot more bandwidth to think, structure, plan, and architect. But do we have the educational framework to make use of this new bandwidth in a positive way? Or will we use this extra bandwidth to consume more and worry more? I am curious to see how we will evolve emotionally, spiritually, socially, and politically to keep up with the ever-increasing pace of technological transformation.
I approach it with optimism and recognize the transformative potential that AI brings, but I believe that our attitude, flexibility, and discipline are crucial to developing a healthy relationship with this new world that AI is opening up.
Gabriel Fairman, CEO of Bureau Works

Last modified on Monday, 26 August 2019