Where are Computers Headed in the Next Five Years?

A 2-minute read, posted on Wed, Jul 5, 2017

Do you remember when dial-up was a hot trend? It's ok if you don't. The rest of us are still trying to forget.

Consider for a moment how far computers have come in the last century. At one time, a single computer could take up an entire room. Now you can hold one in your hand.

So where are computers headed in the next 5 years?

Big Data: The Hot New Trend

Big Data involves sifting through large data sets to discover trends, patterns, and associations. It is often used to make marketing decisions and to study consumer behavior. Businesses turn to it as a way to become more efficient and increase profit, and analyzing Big Data also gives them an edge over competitors who haven't caught up with the trend. A small sketch of what that kind of analysis looks like follows below.
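
As a rough illustration, here is a minimal sketch of the kind of consumer-behavior analysis described above, written with the pandas library. The file name and column names are hypothetical assumptions, not taken from any real dataset.

```python
# A minimal sketch of trend-spotting in purchase data, assuming a
# hypothetical CSV with "region", "product", and "amount" columns.
import pandas as pd

# Load the (hypothetical) purchase history.
purchases = pd.read_csv("purchases.csv")

# Aggregate spending by region and product to surface patterns,
# e.g. which products sell best in which markets.
trend = (
    purchases
    .groupby(["region", "product"])["amount"]
    .sum()
    .sort_values(ascending=False)
)

print(trend.head(10))  # top region/product combinations by revenue
```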

Online and brick-and-mortar universities are rushing to create programs to fill this need, with demand fueled by both students and businesses. Colleges have started to offer bachelor's and master's degrees in Big Data.

Although Big Data still requires human skill, much of the work of analyzing large data sets is expected to be handed over to artificial intelligence.

Artificial Intelligence

Artificial Intelligence describes the development of technology that can perform human tasks such as making complex decisions. In 2011, IBM's Watson supercomputer competed on Jeopardy! and won, even beating Ken Jennings, who holds the record for the longest winning streak on the show.

While some fear that AI may take jobs away from humans, others see it as a way to hand menial tasks over to computers so that we can pursue more desirable work.

Moore’s Law: Efficiency to the Limit

Moore’s Law predicts that the number of transistors that can be placed on an integrated circuit will double roughly every two years (sometimes quoted as every 18 months). A quick back-of-the-envelope projection of what that doubling means is sketched below.
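
Here is a tiny worked example of that doubling, assuming a two-year doubling period and an illustrative starting count of 2 billion transistors; neither figure refers to a specific chip.

```python
# Back-of-the-envelope Moore's Law projection: transistor count doubling
# every 2 years. The starting figure (2 billion) is an illustrative
# assumption, not a measurement of any particular processor.
start_transistors = 2_000_000_000
doubling_period_years = 2

for years in range(0, 11, 2):
    count = start_transistors * 2 ** (years / doubling_period_years)
    print(f"Year {years:2d}: ~{count:,.0f} transistors")
```

After a decade at that pace, the count has grown about 32-fold, which is why even "slow" exponential doubling adds up quickly.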

Computers will keep getting more efficient until it is no longer feasible to improve them further. Greg Satell, creator of and contributor to the blog Digital Tonto, predicts that processing efficiency will increase 100-fold, but also that we are nearing a new digital paradigm in which devices become so efficient that the laws of physics won't allow us to push them any further.

Computing will continue to become more efficient as Big Data and Artificial Intelligence grow in popularity, and the future holds the promise of technology that surpasses our wildest dreams.