Monday, May 2, 2011

Our lazy brain and the machines.

The brain stores and organizes information so that it can prioritize courses of action. Routines and automatic reactions are usually triggered in the subconscious department, helping the brain avoid being overwhelmed by petty decisions all the time.

But now, technology in the form of web tools and applications is helping us store and organize information as well. It handles very practical matters, such as top 10 movie lists, book lists, and to-do lists, to help ourselves or others make risk-averse decisions. What if we created apps that could help make emotional decisions? For example: ranking your top 200 friends as listed on Facebook (so you don't have to crack your head deciding whom to invite to your parties), picking babies' names (based on our favorites and likes), or scoring partner compatibility (based on search results from two different people).

We have seen fun applications and websites spewing daily horoscopes, personality tests, and love thermometers for the longest time, so there's hardly anything novel about them. But when these 'future telling' tools produce results based on concrete data, they ought to be quite scary. Because if machines take over data collection and analysis for us, our conscious brain will be reduced to making decisions of A or B, Yes or No, Black or White. Imagine if technology could tell you that you are only 23% compatible with your current partner, and actually tell you why, based on both of your search histories. The next step is a pretty clear one, if you ask me.

The good side is, we might be liberated from unnecessary emotional weight; the bad side is, we become less and less human. We lose our confidence and might become overly data-reliant, because we learn to trust equations and algorithms more than our own intuition and faith. And that is when we become machines.

The power to think and make good judgments will never be robbed from us. We just have to know when to stop giving them away.
