Software Engineering challenges of Deep Learning

Artificial Intelligence is nothing new. It has been in and out of the spotlight since the 1950s. So why is everyone saying we’re experiencing a revolution unlike anything we’ve ever seen before? The reason stems from a handful of breakthroughs in computational power, data collection and an AI technique called deep learning.

The graphics card surprise

The rapid proliferation of AI would not have been possible without the exponential growth in computing power over the last half-century. The major breakthrough came when graphics processing units (GPUs), originally designed for video gaming and graphics editing, unexpectedly took center stage in the world of AI.

This was simply because they happened to be designed for the very operations cutting-edge AI requires: arrays of linked processors operating in parallel to supercharge throughput. GPUs proved to be 20 to 50 times more efficient than the hardware previously used for deep-learning computations. Suddenly, AI no longer needed to run on supercomputers in specialized labs. Instead, ever-faster, ever-cheaper chips put the hardware required for AI within reach of organizations of all sizes.
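To see why GPUs fit deep learning so well, consider a minimal sketch (in NumPy, with illustrative shapes chosen here for the example): a single dense layer of a neural network is essentially one large matrix multiplication, in which every output element can be computed independently, and therefore in parallel.

```python
import numpy as np

# A dense neural-network layer boils down to a matrix multiply
# followed by an elementwise nonlinearity. The shapes below are
# illustrative: a batch of 64 flattened 28x28 "images" passing
# through a layer with 128 units.
rng = np.random.default_rng(0)

inputs = rng.standard_normal((64, 784))    # batch of input vectors
weights = rng.standard_normal((784, 128))  # the layer's parameters

# The core workload: 64 * 128 independent dot products of length 784.
# Each of these dot products can run on a separate processing element,
# which is exactly the kind of parallelism GPUs provide.
activations = np.maximum(inputs @ weights, 0.0)  # ReLU nonlinearity

print(activations.shape)  # (64, 128)
```

On a CPU, NumPy executes this one chunk at a time; a GPU framework runs the same mathematical operation across thousands of cores at once, which is where the order-of-magnitude speedups come from.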

Unlimited access to data

If computing power is AI’s engine and algorithms are its design, then data is AI’s fuel. To solve problems and make improvements in manufacturing, medicine, finance, transportation...everywhere, AI needs data to process and learn from.

It is no coincidence that today’s AI awakening has arrived alongside the rise of Big Data: the widespread adoption of cloud computing, self-monitoring cell phones, and a plethora of tiny, powerful cameras and sensors offers up trillions of new data points every day for AI to glean insights from. So the question is no longer: When will powerful AI arrive? It already has. Instead, we must ask: What data do we have for it?

Conclusion

The current AI revolution rests on two enablers: cheap, massively parallel computing power, delivered unexpectedly by GPUs, and an unprecedented supply of data for deep-learning systems to learn from. Together, they have moved AI out of specialized labs and into organizations of all sizes.
