I clearly didn't complete my 30 Days of Learning Machine Learning project. I went to the India/China border instead.
I will soon do another marathon ML-learning project, this time with a real problem at hand.
I started by trying to understand this implementation of an RNN, which involves some language modelling. Soon I had to go back to the basics!
So I watched a few hours of the Oxford lessons here, more precisely Lectures 3 and 4. A far better introduction to language modelling than the other tutorials I've seen so far.
Must not lose focus!
I went back to the TensorFlow tutorials. Since I still don't get the high-level APIs, I figured the best way is to understand these smaller programs and play around with them.
Also, got completely distracted by some climate change reports, then spent time daydreaming about the end of the world. Not productive, Harsha.
Took a day off from all the machine learning and watched Looney Tunes instead. No regrets :-)
Very unproductive day. Spent some time trying to get my cat-or-dog model to improve its accuracy. Zero, zilch, zip, nada, nothing. Not sure what is happening.
Perhaps it needs more data, perhaps more layers, perhaps better hyper-parameters. Who knows what's going on inside that model!?
It is quite hard to keep focus and not get distracted by all the noise.
Spent quite a bit of time re-writing/adjusting my cat-or-dog CNN. Now I have an easy way to feed images to the model. Accuracy is still less than 50%, though. Hyper-parameter tuning, here I come!
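For posterity, the shape of the thing I ended up with. This isn't my actual feeding code, just a minimal NumPy sketch of what "an easy way to feed images to the model" can look like (the name `make_batches` and the toy data are mine):

```python
import numpy as np

def make_batches(images, labels, batch_size, shuffle=True, seed=0):
    """Yield (batch_images, batch_labels) pairs, pixels scaled to [0, 1]."""
    n = len(images)
    order = np.arange(n)
    if shuffle:
        np.random.default_rng(seed).shuffle(order)
    for start in range(0, n, batch_size):
        idx = order[start:start + batch_size]
        # Normalise uint8 pixels to float32 in [0, 1] before feeding the model.
        yield images[idx].astype(np.float32) / 255.0, labels[idx]

# Toy data: 10 "images" of 64x64 RGB, labels 0 (cat) or 1 (dog).
images = np.random.randint(0, 256, size=(10, 64, 64, 3), dtype=np.uint8)
labels = np.random.randint(0, 2, size=10)
batches = list(make_batches(images, labels, batch_size=4))
```

With 10 images and a batch size of 4 this yields batches of 4, 4 and 2.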
On a side note: now I see why I need GPUs. Each run takes 30 to 60 minutes.
Once I get to some acceptable accuracy (>90%), I'll move on to RNNs and voice recognition. Although I still don't know much about the TensorFlow high-level APIs. Or should I learn Keras? Uggh.
Mood: Sherlock Holmes
Woah! Struggling to understand all the high-level APIs of TensorFlow.
So, I tried writing a cat-or-dog classifier using a CNN. I mostly copy-pasted code from everywhere and tried stitching it together with whatever I understand. The code runs! I downloaded some 25,000 pics of cats and dogs and ran the model on some 1,000 of them.
Now I have a model that overfits: 100% accuracy on the images it has seen, but less than 50% when evaluating images it hasn't :-/ Must understand how to tweak a model.
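That "perfect on seen data, coin-flip on unseen data" gap is classic overfitting, and it isn't specific to CNNs. A tiny NumPy experiment with polynomial regression (my own toy illustration, nothing to do with the cat-or-dog code) shows the same symptom:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a simple underlying function.
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, 10)
x_test = np.linspace(0.03, 0.97, 10)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.3, 10)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on data (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# A degree-9 polynomial can thread all 10 training points exactly.
overfit = np.polyfit(x_train, y_train, 9)
# A degree-3 polynomial has far less capacity to memorise the noise.
modest = np.polyfit(x_train, y_train, 3)

print(mse(overfit, x_train, y_train))  # essentially zero: memorised training data
print(mse(overfit, x_test, y_test))    # typically far larger on unseen points
print(mse(modest, x_test, y_test))
```

The fix is the same in spirit for a CNN: less capacity, more data, or regularisation.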
More than that, I don't really understand how the high-level APIs work. So I'll spend a day (or 5) on this.
Mood: code, and more code.
Nothing much to write about! Mostly stuck on understanding how RNNs work. I need to understand the manipulations of data inside an LSTM cell, with concrete examples of data.
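Since I couldn't find the data walk-through I wanted, here is one timestep of the standard LSTM equations in plain NumPy. This is just a sketch of the textbook cell (the function name and the tiny toy sizes are mine), not anybody's library code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM timestep. x: (input_dim,), h_prev/c_prev: (hidden_dim,),
    W: (input_dim + hidden_dim, 4 * hidden_dim), b: (4 * hidden_dim,)."""
    hidden = h_prev.shape[0]
    z = np.concatenate([x, h_prev]) @ W + b   # all four gates in one matmul
    f = sigmoid(z[0 * hidden:1 * hidden])     # forget gate: what to drop from c
    i = sigmoid(z[1 * hidden:2 * hidden])     # input gate: what to write to c
    g = np.tanh(z[2 * hidden:3 * hidden])     # candidate cell values
    o = sigmoid(z[3 * hidden:4 * hidden])     # output gate: what to expose as h
    c = f * c_prev + i * g                    # new cell state
    h = o * np.tanh(c)                        # new hidden state
    return h, c

# Tiny example: 3-dim inputs, 2-dim hidden state, 5 timesteps.
rng = np.random.default_rng(0)
W = rng.normal(size=(3 + 2, 4 * 2))
b = np.zeros(4 * 2)
h, c = np.zeros(2), np.zeros(2)
for x in rng.normal(size=(5, 3)):
    h, c = lstm_step(x, h, c, W, b)
```

Running it with tiny shapes like this makes the data flow visible: `c` is the long-term memory the gates edit, and `h` is a squashed, gated view of it.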
I did watch an hour of lectures on image manipulation with CNNs. Also read about data prep for voice samples. Unless I actually start working/coding on these things, it won't make much difference.
MUST MAKE SOME PROGRESS!!
Mood: Meh. Bleh. Uggh.
Much of the day I was trying to figure out the dimensions of convolutional networks. Then I saw that TensorFlow works these out on its own. Neat.
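For the record, the arithmetic being done on my behalf is just the standard output-size formula, small enough to write down by hand (a sketch, not TensorFlow's actual code):

```python
def conv_output_size(input_size, filter_size, stride=1, padding=0):
    """Spatial output size of a convolution: floor((W - F + 2P) / S) + 1."""
    return (input_size - filter_size + 2 * padding) // stride + 1

# 28x28 MNIST image, 5x5 filter:
print(conv_output_size(28, 5))             # 24: no padding ("VALID")
print(conv_output_size(28, 5, padding=2))  # 28: padding 2 keeps size ("SAME")
print(conv_output_size(24, 2, stride=2))   # 12: a 2x2 pool with stride 2 halves it
```

Chaining the function layer by layer is exactly the bookkeeping TensorFlow hides.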
I understand most of the code I wrote for the TensorFlow implementations of CNNs and multilayer perceptron networks. I should get better at graphs as well as data visualisation. It's rather impressive that a simple 50-line TensorFlow program can identify MNIST handwritten digits at 99% accuracy.
I also realised I've forgotten my awks and seds, as well as shell scripting. The struggle is real.
Mood: Optimistic, curious.
Must stay off Reddit and Twitter, though.
Lost in the abstractions of TensorFlow. I gotta find a nice tutorial that explains these mystical high-level APIs.
Watched a bit more of the lectures; I now have an idea of MFCC, HMM, etc. As in, I have vague clues about what they do.
Off to CNNs we go today!
Mood: Meh, bleh, uggh.
Half of my 30 Days of Learning Machine Learning is over already! I feel I haven't progressed much. I blame it on Twitter, YouTube and Reddit. It is so much easier to blame something else than my own stupidity :-)
I did quite a bit of reading and wrote code for logistic regression and a simple multilayered neural network. I understand what I coded, which is nice. Not knowing data structures in Python hurts. I did spend quite a bit of time debugging some monster that was a dictionary inside a list, which was inside another dictionary[?].
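For my own future sanity, here is the kind of monster shape I was fighting, with made-up data, and the trick that finally worked: index one layer at a time instead of one giant chained expression.

```python
# A dictionary holding a list of dictionaries -- the shape that bit me.
history = {
    "runs": [
        {"epoch": 1, "metrics": {"loss": 0.9, "accuracy": 0.42}},
        {"epoch": 2, "metrics": {"loss": 0.6, "accuracy": 0.55}},
    ]
}

# Unpack step by step; each intermediate name is easy to print and inspect.
runs = history["runs"]        # a list of dicts
last_run = runs[-1]           # a dict
accuracy = last_run["metrics"]["accuracy"]
print(accuracy)               # 0.55

# A list comprehension pulls one field out of every nested level in one pass.
losses = [run["metrics"]["loss"] for run in history["runs"]]
print(losses)                 # [0.9, 0.6]
```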
From now on, I think I'll just get on with coding until I can understand whatever these guys are talking about here.
Spent much of the day trying to code, and had a little success too! I've now coded a linear regression model as well as a polynomial regression model, and played around with random data and hyper-parameters. This is a good reference.
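A stripped-down version of the kind of experiment I was running, with NumPy's `polyfit` standing in for my own models (the generating functions and noise levels here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(-3, 3, 50)

# Linear regression: recover slope and intercept from noisy y = 2x + 1.
y_lin = 2 * x + 1 + rng.normal(0, 0.1, x.size)
slope, intercept = np.polyfit(x, y_lin, 1)
print(round(slope, 1), round(intercept, 1))   # ~2.0, ~1.0

# Polynomial regression: degree-3 fit to noisy y = x^3 - 2x.
# polyfit returns coefficients highest degree first: [a3, a2, a1, a0].
y_poly = x**3 - 2 * x + rng.normal(0, 0.1, x.size)
coeffs = np.polyfit(x, y_poly, 3)
print(np.round(coeffs, 1))                    # ~[1.0, 0.0, -2.0, 0.0]
```

Cranking the noise up or the sample count down is a quick way to watch the recovered hyper-parameters drift from the true ones.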
Slow progress, progress nonetheless!
Mood: Happy, gimme more coffee!
Also did a sort of retrospective of last 10 days of effort:
Things that I think went well:
* Sticking to the daily struggle to learn ML and related areas!
* Spending equal time on theory and coding.
* Not giving up when I don't understand a concept! Good job, Harsha.
Things that I think did not go well:
* Not having a defined curriculum to study. This led to having to think "what to do today" every day. Not cool.
* Not having a fixed goal [Ex: I will learn RNNs end to end OR I will learn the TensorFlow API]. This led to a lack of focus and much context switching.
* Not utilising available hours to learn more. I still get distracted by Twitter, YouTube and Reddit. Not cool.
Focus for next 10 days:
* The TensorFlow API, and getting comfortable implementing easy RNN/CNN/LSTM examples with TF.
* Learning the basics of speech recognition: understanding MFCC, CTC and related concepts.
* Getting comfortable with data pre-processing. Mostly how to use Python, I think.
Mood: Happy, focussed. [could be just a caffeine high]
Watched a couple more episodes of Stanford's NLP with Deep Learning. I have a slightly better understanding of the process involved in the mammoth task of audio processing!
I will watch these lessons in full, along with TensorFlow examples for NLP. Should be fun!
Mood: Lazy, reckless.
Spent some time going through the TensorFlow APIs. Spent some more time reading about information theory. Meh.
I'll spend some more time understanding speech processing before getting into coding. I am a headless chicken.