I’ve been teaching myself machine learning for a long time, going from one Coursera specialization to a course to a YouTube playlist. But then I ran into the problem of starting: I am learning the theory with some basic applications, but I don’t know how to go on by myself and start a project, analyze the data, find the right architecture, fine-tune the parameters, and so on…
Then the old idea came to me one more time: I need a mentor who knows how such a professional life works and what really matters. After some searching and asking, I found one through a friend of a friend of a friend, and she contacted me and offered help.
After I explained to her what I know and what I want to get out of this exercise, she formulated a plan for me. So here I am posting it.
I was almost done with all the steps when life got busy once again. Now I want to start the engines again and move on to a personal project to learn more, but I have serious doubts that my homework was 100% clean. That’s why I want to post my solution for every week again after revising it, cleaning it, and making sure that I can understand and present it well.
Week 1: Feedforward networks
A good start is the simple MNIST dataset, so train a feedforward network on it to study the basics of neural networks.
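Something like this minimal tf.keras sketch is enough (the layer sizes and epoch count are my own choices, not part of the plan):

import tensorflow as tf

# load MNIST and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# a plain feedforward network: flatten the image, one hidden layer, softmax output
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
model.evaluate(x_test, y_test)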
Week 2: Convolutional networks
Change the previous network to a convolutional network to study the basics of convolutional networks.
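Only the model definition really changes. One possible layout, assuming the MNIST arrays from Week 1 (the filter counts are my own choice):

import tensorflow as tf

# conv layers need a channel dimension: (28, 28) -> (28, 28, 1)
x_train = x_train[..., None]
x_test = x_test[..., None]

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation='softmax'),
])
# compile and fit exactly as in Week 1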
Week 3: Hyperparameter optimization
Vary the number of layers, the learning rate, and other hyperparameters to learn about validation and hyperparameter optimization.
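A simple grid search over a validation split is enough at this stage. A sketch, assuming the Week 1 MNIST arrays (the grids and the build_model helper are mine, for illustration):

import tensorflow as tf

def build_model(n_hidden):
    # stack n_hidden dense layers between input and output
    layers = [tf.keras.layers.Flatten(input_shape=(28, 28))]
    layers += [tf.keras.layers.Dense(128, activation='relu') for _ in range(n_hidden)]
    layers.append(tf.keras.layers.Dense(10, activation='softmax'))
    return tf.keras.Sequential(layers)

best = None
for lr in [1e-2, 1e-3, 1e-4]:
    for n_hidden in [1, 2, 3]:
        model = build_model(n_hidden)
        model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                      loss='sparse_categorical_crossentropy',
                      metrics=['accuracy'])
        history = model.fit(x_train, y_train, epochs=5,
                            validation_split=0.1, verbose=0)
        val_acc = history.history['val_accuracy'][-1]  # select by validation, not test, accuracy
        if best is None or val_acc > best[0]:
            best = (val_acc, lr, n_hidden)
print('best (val_acc, lr, n_hidden):', best)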
Week 4: Dropout and batch normalization
Introduce dropout and batch normalization to the network to learn regularization and the rest of hyperparameter optimization. Up to this point, it’s about learning deep learning basics rather than doing a project.
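In Keras both are just extra layers. A sketch of where they typically go, continuing the Week 2 convnet (the 0.5 rate and the layer order are my own choices):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), input_shape=(28, 28, 1)),
    tf.keras.layers.BatchNormalization(),   # normalize activations before the nonlinearity
    tf.keras.layers.Activation('relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.5),           # randomly zero half the units, during training only
    tf.keras.layers.Dense(10, activation='softmax'),
])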
Weeks 5 & 6: CIFAR-10 dataset
Repeat the classification project above, but switch to the CIFAR-10 dataset. There might be a few changes in the process, like input normalization and the need for more conv layers, as sketched below. But this should solidify the knowledge.
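For example, input normalization can now mean per-channel statistics rather than a plain /255 rescale; a sketch of the data-loading change:

import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()

# normalize each RGB channel with statistics computed on the training set only
mean = x_train.mean(axis=(0, 1, 2), keepdims=True)
std = x_train.std(axis=(0, 1, 2), keepdims=True)
x_train = (x_train - mean) / std
x_test = (x_test - mean) / std  # reuse the training statistics on the test set

# images are now 32x32 color, so the first layer becomes
# tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3))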
Week 7: Transfer learning
Since you won’t always train from scratch on your own (no time and no resources), we sometimes borrow the lower layers from pre-trained networks and refine them. Use the VGG model’s parameters with another dataset and retrain to fine-tune.
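A sketch of the usual Keras recipe (the head layers and learning rate are my own choices; num_classes is a placeholder for whatever dataset you pick):

import tensorflow as tf

num_classes = 10  # placeholder: set to your dataset's class count

# borrow VGG16's convolutional layers, pre-trained on ImageNet, without its classifier head
base = tf.keras.applications.VGG16(weights='imagenet', include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False  # freeze the borrowed lower layers

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dense(num_classes, activation='softmax'),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
# run images through tf.keras.applications.vgg16.preprocess_input first;
# once the new head converges, unfreeze the top of `base` and retrain
# with a small learning rate: that is the fine-tuning step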
Which tools to use?
Since the whole project is deep learning, stick to TensorFlow and Keras. Keras is way easier, but it is not as flexible, so you may want to choose between them based on your end goal. But give both a try, and make sure you understand the basics at a theoretical level first. Consult a tutorial or a book or whatever you’re comfortable with; there are tons of materials online. The Stanford course is one: it is more academic but easy to follow. You won’t need other libraries like scikit-learn unless you want to play around and compare with other algorithms, or perhaps do a little input manipulation with them.
What’s next?
I assume that if you’ve reached this level, you’ll have pretty decent knowledge of working with clean data. I’ll look into other datasets to play with, as I know people who don’t consider the standard research datasets a project but rather more of a tutorial-following exercise. So when you add it to your CV, it’ll look better.