We will never know how it felt to be part of the industrial revolution, or how people felt when electricity was transforming the way we lived. But we are no strangers to revolutions; well, at least the digital revolution (also called the 'third industrial revolution') happened during our lifetime. That revolution started half a century ago with the advent of computers and is still ongoing. Are we at the cusp of a fourth industrial revolution, marked by transformations from the convergence of artificial intelligence, robotics, IoT, etc.? Well, we are no oracle, and we will let you be the judge. We are starting a series of blogs to discuss various aspects of this potential fourth revolution, and we hope you enjoy reading this series as much as we enjoyed writing it!
A Bird's-Eye View of AI, ML and Deep Learning
New Wine in a New Bottle?
No. Let me start by saying that none of AI, ML or Deep Learning is a new area. These fields have existed for at least half a century (1956 is the official year of birth of AI), though not under their currently popular names. In fact, AI is presently seeing its third wave of optimism (the previous two were followed by long 'AI winters', i.e. periods of low interest due to performance lagging behind expectations). Is another winter coming? Game of Thrones fans might know better, not us! As we said in the introduction to our blog series, we are no oracle!
Tide Me Over the Jargon
Before we go further, a few definitions are in order.
Artificial Intelligence (AI) refers to the ability of machines to think or act like humans and to perform cognitive functions and tasks that we typically associate with humans, such as problem solving, perception, learning, etc. Typical examples of AI applications include playing chess, face recognition (computer vision), writing poetry (natural language generation), cognitive agents, autonomous vehicles, etc. This definition is not static: as soon as a technology becomes traditional and standard over time, it is 'fashionably' excluded from AI, i.e. no longer considered cognitive enough. For instance, Optical Character Recognition is no longer considered part of AI.
Machine Learning is a subset of AI. It consists of a set of algorithms that allow machines/computers to learn from data, rather than requiring humans to programme them explicitly. For instance, given historical data on non-delinquent and delinquent borrowers, a computer programme is able to identify patterns that discriminate between these two classes of borrowers. Why do we want machines to identify such patterns? The answer is short and simple: to make predictions. Why don't we explicitly identify and program these rules ourselves? A simple but longer answer: we might fail to identify all the patterns in the data, especially when the data and the number of variables are huge.
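To make the borrower example concrete, here is a minimal sketch in Python/numpy of a machine learning the pattern from data rather than being given the rule. The two features (debt-to-income ratio, missed payments) and the hidden delinquency rule are invented purely for illustration; a real credit model would use far richer data and a vetted methodology.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, synthetic borrower data: two features per borrower.
n = 200
dti = rng.uniform(0.1, 0.9, n)        # debt-to-income ratio (invented)
missed = rng.integers(0, 6, n)        # missed payments (invented)

# The "true" rule, hidden from the learner, used only to label data.
y = (3.0 * dti + 0.8 * missed > 2.5).astype(float)

# Logistic regression trained by gradient descent: the pattern
# (the weights w) is learned from examples, not hand-coded.
X = np.column_stack([dti, missed, np.ones(n)])   # add a bias column
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))             # predicted probabilities
    w -= 0.1 * X.T @ (p - y) / n                 # gradient step

pred = (1.0 / (1.0 + np.exp(-X @ w)) > 0.5).astype(float)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Note that nowhere did we tell the programme the rule "3·dti + 0.8·missed > 2.5"; it recovered a discriminating boundary from labelled examples alone, which is exactly the point of the paragraph above.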
What is Deep Learning (DL) then? DL is a subset of ML and is the fancier name for neural networks (NN). NN are a set of algorithms that learn patterns from data in a layered manner. For instance, to classify images of different animals, NN algorithms would first identify simple features such as horizontal lines, vertical lines, etc. In the next layer, the NN would combine these simple features to identify slightly more complex features such as arcs, circles, rectangles, etc. Subsequent layers would help the NN identify eyes, noses, ears, skin type, etc. The final layer would then help the NN classify the image into the right category. Compared to other ML algorithms, DL excels at tasks that were traditionally dominated by humans, such as face recognition, voice recognition, natural language generation, translation, medical diagnosis, etc.
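The layered idea above can be sketched in a few lines of numpy. This is an untrained toy network with random weights, so it learns nothing; it only shows the structure, i.e. each layer transforming the previous layer's features into more abstract ones. All sizes, and the mapping of layers to "edges", "shapes" and "classes", are illustrative assumptions, not a real architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    # Standard non-linearity applied between layers.
    return np.maximum(0.0, x)

image = rng.random(64)               # a flattened 8x8 toy "image"
W1 = rng.standard_normal((32, 64))   # layer 1: simple features (lines)
W2 = rng.standard_normal((16, 32))   # layer 2: combined features (arcs, shapes)
W3 = rng.standard_normal((4, 16))    # final layer: 4 hypothetical animal classes

h1 = relu(W1 @ image)   # first layer of features
h2 = relu(W2 @ h1)      # features built from features
logits = W3 @ h2

# Softmax turns the final layer's scores into class probabilities.
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print("class probabilities:", np.round(probs, 3))
```

In a real deep network the weight matrices would be learned from labelled images, and the intermediate layers would genuinely come to respond to edges, shapes and parts, as described above.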
Why the resurgence?
We can attribute the third wave of AI and DL to three major factors. First, DL algorithms work better the more data you feed them, and if there is anything the digital revolution has led to, it is data proliferation. Second, computational power has increased tremendously over the last decade; in particular, the use of Graphics Processing Units (GPUs) has enabled much faster execution of DL algorithms. Third, though DL/NN is quite old, there have been significant improvements in algorithm implementation and tuning. It is like car engines: they have existed for over a century but keep improving through innovation and tuning. Combine these three reinforcing factors and we get the third wave of AI.
Why should I and my organization care?
Just as electricity fundamentally transformed various industries, it is expected that the third wave of AI, aka the fourth industrial revolution, will transform the way we live, interact and do business. Ignore such a fundamental shift at your own peril! In the next blog, we will look at AI and ML algorithms and their applications, especially for financial services.
Contact the Authors
Dinesh Chaudhary
Director - Risk Analytics, Artificial Intelligence & Machine Learning