Why is AI being featured recently?
- Harish Lakshmanapathi

- Aug 8, 2019
- 4 min read
Scientists and engineers had already built autonomous vehicle systems back in the 1980s using sophisticated machine learning algorithms, so why did it take until the past decade for Machine Learning and Artificial Intelligence to get integrated into industry?
Well, that is the question we are going to answer in this post, folks.
There are three crucial ingredients needed to realise a Machine Learning algorithm:
1. Data
2. Computational Power
3. Pattern
We will discuss the effect of each of these parameters in good detail.
Note: Don't confuse Machine Learning and Artificial Intelligence. Machine Learning is a subset of Artificial Intelligence; their applications and the amount of complexity involved in each is what distinguishes them.
DATA
Well, talking about data sets: back in 1980 there was no YouTube, no Google, and none of the social media websites most of us are hooked onto today. You might be wondering what these popular websites have to do with Machine Learning. Well, the amount of data passing through the internet's cables has exploded in the past decade. With the advent of 4G technology, far more data is sent within the same time period, so more data is flowing through the branches of the internet across the globe. And we don't know what this plot will look like once 5G is deployed at large scale throughout the globe...
[Figure: data generated by Facebook users in India]
That is a mind-boggling figure from India alone. With this much data we can easily extract application-specific data sets to train a computer to solve a particular problem. And once again, remember that this is from Facebook alone; think about Amazon, and the figure rises dramatically. Using natural language processing techniques, analytics companies extract the crucial points from reviews given to products on any website. They use that data to understand the mindset of people, which also helps companies predict trends so that their clients are prepared to face them, and can prepare in advance to make more profits.
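To make the review-mining idea concrete, here is a minimal, purely illustrative sketch. Real analytics companies use trained NLP models; the keyword lists and the `review_sentiment` function below are hypothetical and only show the basic idea of turning review text into a signal about customer mindset.

```python
# Toy illustration of extracting sentiment from product reviews.
# Real pipelines use trained NLP models; these word lists are
# hypothetical and only sketch the idea.

POSITIVE = {"great", "excellent", "love", "fast", "reliable"}
NEGATIVE = {"bad", "slow", "broken", "poor", "disappointing"}

def review_sentiment(review: str) -> str:
    """Classify a review as positive/negative/neutral by keyword counts."""
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

reviews = [
    "great product and fast delivery",
    "broken on arrival and the support was poor",
]
print([review_sentiment(r) for r in reviews])  # → ['positive', 'negative']
```

Aggregated over millions of reviews, even a signal this crude starts to reveal trends; trained models simply do the same job far more accurately.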
Let's have a look at a global level scale too...
[Figure: global internet traffic per month, in exabytes]
Yup, I hadn't heard of exabytes either (one exabyte is a billion gigabytes). Once again, multiply that monthly figure by 12 to get the internet usage per year; you can do the math, guys.
So by looking at all these plots, we can see from a data perspective why Machine Learning has taken off in the past decade.
COMPUTATIONAL POWER
Machine Learning algorithms generally take a lot of time to learn when a data set is fed into them, and that time grows rapidly as the size of the data set increases. If the data set is an image data set and you are performing image processing with Convolutional Neural Networks, it takes even longer to produce the desired output.
Back in the late 20th century we didn't have Intel i9 processors running at 4.5 GHz under turbo boost; processing speeds were very low. With the advent of faster and smaller processors, we became able to implement complicated Machine Learning algorithms on large data sets (though it could still take days to train a model). And we should not forget the major role of the GPU (Graphics Processing Unit) in boosting a computer's performance and reducing the time required to train a model. When SSDs (Solid State Drives) and NVMe (Non-Volatile Memory Express) came into the picture, data transfer speeds inside the computer reached new heights, which also reduced training time to a good extent, because they are far faster at moving data between storage and RAM while a Machine Learning algorithm runs.
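The claim that training time grows with data set size can be sketched in a few lines: one epoch of batch gradient descent must touch every sample, so per-epoch work scales linearly with the number of samples (and image data processed by a CNN multiplies the per-sample cost further). The `train_linear` function and its learning rate below are hypothetical, chosen only to count the work, not as a serious regression recipe.

```python
# A minimal sketch of why training cost grows with data set size:
# one epoch of batch gradient descent touches every sample once,
# so per-epoch work is proportional to the number of samples.

def train_linear(xs, ys, epochs=100, lr=0.0001):
    """Fit y = w*x with batch gradient descent; return (w, samples_touched)."""
    w, touched = 0.0, 0
    for _ in range(epochs):
        grad = 0.0
        for x, y in zip(xs, ys):
            grad += 2 * (w * x - y) * x   # derivative of squared error w.r.t. w
            touched += 1
        w -= lr * grad / len(xs)
    return w, touched

small = list(range(1, 11))    # 10 samples of y = 3x
big = list(range(1, 101))     # 100 samples of y = 3x
_, t1 = train_linear(small, [3 * x for x in small])
_, t2 = train_linear(big, [3 * x for x in big])
print(t2 // t1)  # → 10: ten times the data means ten times the work per run
```

Faster CPUs shrink the cost of each of those sample-touches, and GPUs let many of them happen at once, which is exactly why hardware progress mattered so much.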
[Figure: growth of microprocessor performance over time]
The above plot gives a good picture of the rise of microprocessors, catering to our growing computational needs over time.
Now let's have a look at the trend of storage devices and network transfer speed.
[Figure: trends in storage device and network transfer speeds]
Once again, you won't be disappointed by the above plot: faster storage and networks quench the thirst of all those hungry processors and avoid memory bottlenecks.
Last but not least, we can't ignore the GPU, which boosts the computation process and enables parallel computing to carry out a large number of tasks in a short span of time.
[Figure: CPU vs GPU performance comparison]
In the above graph, the blue line represents the capability of a CPU and the green line showcases the performance of a dedicated GPU. The reason a GPU outperforms a CPU is that the CPU has to run other processes as well: the Operating System, data transfer monitoring, network traffic monitoring, and the cache management our computer relies on. A GPU, by contrast, is dedicated solely to processing the given request, and that is why it is ahead in terms of computing power and efficiency.
Once again, the above plots clearly show the way ahead for Machine Learning from a computational perspective.
PATTERN
Well, the above two parameters were technical ones, whereas patterns differ from problem to problem and from one data set to another. It's the responsibility of a Machine Learning Engineer to find the patterns present in a given data set in order to build a Machine Learning model.
Many experts say we have just scratched the surface of Artificial Intelligence and there is a lot more to be discovered to benefit humanity. AI has its own pros and cons, but one thing is for sure: the AI revolution has only just started, and it will not fade the way the software boom of the early 21st century levelled off after 12 to 13 years. Machine Learning and Artificial Intelligence will stay with us till we perish, for sure...
Kudos to everyone who read this post till the end!