Deep Learning is a branch of Machine Learning that mimics the way our brains function. Our brain consists of billions of neurons interacting with one another; although each interaction is simple at a fundamental level, the behaviour becomes complex when billions of neurons interact. Deep learning uses the same approach to help machines learn.
Let's not get into what intelligence is; let us just agree that a system is intelligent if it predicts future outcomes based on current and past trends.
Alright, we are going to build a system that predicts future land rates.
First, train the system: provide some knowledge to it by informing it that, in the past, under some set of conditions, the land rate was x.
Gather data on last year's land rates.
Feed 10 to the system and the system spits out some rate (output). Check whether the rate is 1000; if it differs from 1000, alter the system so that its output moves closer to 1000.
Repeat the process with the rest of the past data.
Just as you fed the past data, feed a future input and you will get some rate. Since the model is trained to produce outputs close to the actual rates for past data, by induction, the actual future rate will be close to the predicted rate.
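The train-then-predict loop described above can be sketched in a few lines of code. This is a minimal illustration, not a real deep learning system: it assumes a one-parameter model (rate = weight × input), and the data, learning rate, and function names are made up for the example.

```python
# Minimal sketch of the train-then-predict loop described above.
# The model, the data, and the learning rate are illustrative assumptions.

def train(data, lr=0.0001, epochs=1000):
    """Fit rate ~ weight * x by nudging the weight after every example."""
    weight = 0.0
    for _ in range(epochs):
        for x, actual_rate in data:
            predicted = weight * x            # the system spits out some rate
            error = predicted - actual_rate   # how far off it was
            weight -= lr * error * x          # alter the system toward the truth
    return weight

# Past data: an input of 10 should map to a rate near 1000.
past_data = [(10, 1000), (12, 1200), (15, 1500)]
weight = train(past_data)

# Feed a future input, just like the past ones, to get a predicted rate.
print(round(weight * 20))
```

Each pass nudges the weight so the predicted rate moves closer to the actual one; after enough passes, a future input yields a rate consistent with the past trend.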
So the fundamental rule is simple, and we can easily understand how it works for the single system described above; but when billions of these systems (neurons) interact, with trillions of variables (data), it feels like magic and we call it intelligent.
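To make the "many simple units" picture concrete, here is a forward pass through a tiny two-layer network. The weights are illustrative placeholders, not a trained model; it only shows how simple units compose into something larger.

```python
import math

def neuron(inputs, weights, bias):
    """One simple unit: a weighted sum squashed into (0, 1)."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-total))   # sigmoid activation

def forward(inputs, layers):
    """Feed the inputs through each layer of neurons in turn."""
    for layer in layers:
        inputs = [neuron(inputs, w, b) for w, b in layer]
    return inputs

# Illustrative weights: 2 inputs -> 2 hidden neurons -> 1 output neuron.
layers = [
    [([0.5, -0.3], 0.1), ([0.8, 0.2], -0.4)],  # hidden layer
    [([1.2, -0.7], 0.0)],                      # output layer
]
print(forward([1.0, 2.0], layers))
```

Every unit follows the same simple rule; the complexity comes only from how many of them are wired together.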
The current capacity of our computing power is still far less than that of our brains.
As we saw earlier, we predicted future land rates with a simple model; similarly, we can create a somewhat more complex model and predict a really useful outcome, such as a company's profit or a user's interest in our product.
It is not just prediction: with the same basic rule, we can make our model classify things. We can classify whether an image shows a human or an animal, whether a user is profitable or not, and so on.
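Classification follows the same predict-compare-adjust rule. Here is a minimal perceptron-style sketch; the features (say, spend and visits per user), labels, and data are made-up assumptions for the example.

```python
# Perceptron-style sketch: classify with the same predict-compare-adjust rule.
# Features, labels, and data are illustrative assumptions.

def train_classifier(data, epochs=50):
    """Nudge the weights whenever a prediction is wrong."""
    w, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:
            predicted = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
            error = label - predicted       # 0 when the guess was right
            w[0] += error * x1              # alter the system so the
            w[1] += error * x2              # guess moves toward the label
            bias += error
    return w, bias

# Toy users: (spend, visits) -> profitable (1) or not (0).
users = [((1, 1), 0), ((2, 1), 0), ((6, 4), 1), ((7, 5), 1)]
w, bias = train_classifier(users)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
```

The only change from the rate predictor is that the output is thresholded into a yes/no answer; the adjust-on-error rule is the same.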
With an exponential increase in computing resources, we will be able to create much more complex models; that is, we can make our machines more intelligent.
It boils down to two important things:
1. Data
2. Computing power
Use your data with ever-growing computing power to create a machine that guides you in every aspect of your business.
Micropyramid is a software development and cloud consulting partner for enterprise businesses across the world. We work on Python, Django, Salesforce, Angular, ReactJS, React Native, MySQL, PostgreSQL, Docker, Linux, Ansible, Git, and Amazon Web Services. We are an Amazon and Salesforce consulting partner with 5 years of cloud architect experience. We develop e-commerce, retail, banking, machine learning, CMS, CRM, web, and mobile applications.
Django-CRM: Customer relationship management based on Django
Django-blog-it: Django blog with complete customization, ready to use with a one-click installer
Django-webpacker: A Django compressor tool
Django-MFA: Multi-Factor Authentication
Docker-box: Web interface to manage full-blown Docker containers and images