Episodes

  • 23. Deep NLP 2

    · Machine Learning Guide

    RNN review, bi-directional RNNs, LSTM & GRU cells.

    ## Resources
    - Overview Articles:
      ** Unreasonable Effectiveness of RNNs (http://karpathy.github.io/2015/05/21/rnn-effectiveness/) `article:easy`
      ** Deep Learning, NLP, and Representations (http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/) `article:medium`
      ** Understanding LSTM Networks (http://colah.github.io/posts/2015-08-Understanding-LSTMs/) `article:medium`
    - Stanford cs224n: Deep NLP (https://www.youtube.com/playlist?list=PL3FW7Lu3i5Jsnh1rnUwq_TcylNr7EkRe6) `course:medium` (replaces cs224d)
    - TensorFlow Tutorials (https://www.tensorflow.org/tutorials/word2vec) `tutorial:medium` (start at Word2Vec + next 2 pages)
    - The usual DL resources (pick one):
      ** Deep Learning Book (http://amzn.to/2tXgCiT) (Free HTML version (http://www.deeplearningbook.org/)) `book:hard` comprehensive DL bible; highly mathematical
      ** Fast.ai (http://course.fast.ai/) `course:medium` practical DL for coders
      ** Neural Networks and Deep Learning (http://neuralnetworksanddeeplearning.com/) `book:medium` shorter online "book"

    ## Episode
    - RNN review
      ** Vanilla RNN: when the words plus a running context are sufficient
      ** POS, NER, stocks, weather
      ** Bidirectional RNN (BiLSTM): when context from the right helps too
      ** Encoder/decoder or Seq2seq: when the model should hear everything first, then spin it a different way
      ** Classification, sentiment, translation
      ** Now with word embeddings
    - Training: backpropagation through time
      ** Vanishing/exploding gradients
    - LSTMs (http://colah.github.io/posts/2015-08-Understanding-LSTMs/)
      ** ReLU vs Sigmoid vs TanH (nonlinearities covered in a future episode)
      ** Forget gate layer
      ** Input gate layer: decides which values to update
      ** Tanh layer: creates new candidate values
      ** Output layer
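
    The LSTM gate structure above can be made concrete with a minimal NumPy sketch of a single cell step; the weight shapes and names here are illustrative, not taken from any particular library.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1 / (1 + np.exp(-x))

    def lstm_step(x, h_prev, c_prev, W, b):
        """One LSTM time step; W maps [h_prev, x] to the four stacked gate pre-activations."""
        z = W @ np.concatenate([h_prev, x]) + b
        f, i, o, g = np.split(z, 4)                   # forget, input, output gates + candidate
        f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)  # gates squashed to (0, 1)
        g = np.tanh(g)                                # tanh layer: new candidate values
        c = f * c_prev + i * g                        # forget old state, add selected new values
        h = o * np.tanh(c)                            # output gate filters the cell state
        return h, c

    # toy dimensions: 3-dim input, 4-dim hidden/cell state
    hidden, inputs = 4, 3
    rng = np.random.default_rng(0)
    W, b = rng.normal(size=(4 * hidden, hidden + inputs)), np.zeros(4 * hidden)
    h, c = np.zeros(hidden), np.zeros(hidden)
    h, c = lstm_step(rng.normal(size=inputs), h, c, W, b)
    print(h.shape, c.shape)  # (4,) (4,)
    ```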


  • 22. Deep NLP 1

    · Machine Learning Guide

    Recurrent Neural Networks (RNNs) and Word2Vec.

    ## Resources
    - Overview Articles:
      ** Unreasonable Effectiveness of RNNs (http://karpathy.github.io/2015/05/21/rnn-effectiveness/) `article:easy`
      ** Deep Learning, NLP, and Representations (http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/) `article:medium`
      ** Understanding LSTM Networks (http://colah.github.io/posts/2015-08-Understanding-LSTMs/) `article:medium`
    - Stanford cs224n: Deep NLP (https://www.youtube.com/playlist?list=PL3FW7Lu3i5Jsnh1rnUwq_TcylNr7EkRe6) `course:medium` (replaces cs224d)
    - TensorFlow Tutorials (https://www.tensorflow.org/tutorials/word2vec) `tutorial:medium` (start at Word2Vec + next 2 pages)
    - Deep Learning Resources (http://ocdevel.com/podcasts/machine-learning/9)

    ## Episode
    Deep NLP pros
    - Handles language complexity & nuances
      ** Feature engineering becomes feature learning
      ** Salary = degree * field (features interact multiplicatively), not degree + field
      ** Multiple layers: pixels => lines => objects
      ** Multiple layers of language
    - One model to rule them all; end-to-end (E2E) models

    Sequence vs non-sequence
    - DNN = ANN = MLP = feed-forward
    - RNNs for sequences (time series)

    RNNs
    - Looped hidden layers; learn nuances from combined features
    - Carry information through time: language model
    - Translation, sentiment, classification, POS, NER, ...
    - Seq2seq, encoder/decoder

    Word2Vec (https://www.tensorflow.org/tutorials/word2vec)
    - One-hot (sparse) encoding doesn't help (plus sparse = wasted compute)
    - Word embeddings
      ** Euclidean distance for synonyms / similarity, cosine for "projections": king - man + woman ≈ queen
      ** t-SNE (t-distributed stochastic neighbor embedding)
    - Vector Space Models (VSMs): learn from context; predictive vs count-based methods
    - Predictive methods (neural probabilistic language models): learn model parameters which predict contexts
      ** Word2vec
      ** CBOW / Skip-Gram (CBOW predicts the center word from its context, Skip-Gram predicts the context from the center word; small vs large datasets respectively)
      ** DNN with a softmax hypothesis fn; NCE loss (noise contrastive estimation)
    - Count-based methods / distributional semantics: compute the statistics of how often a word co-occurs with its neighbor words in a large text corpus, then map these count statistics down to a small, dense vector for each word
      ** GloVe
      ** Linear algebra machinery (PCA, LSA, SVD)
      ** Pros (?): faster, more accurate, incremental fitting. Cons (?): data hungry, more RAM. More info (http://blog.aylien.com/overview-word-embeddings-history-word2vec-cbow-glove/)
    - DNNs (or RNNs) for POS, NER
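
    As a concrete illustration of the CBOW vs Skip-Gram distinction above, here is a tiny sketch of how the two kinds of training pairs are built from a context window (illustrative only; real Word2Vec adds subsampling, negative sampling / NCE, and the embedding layers themselves):

    ```python
    corpus = "the quick brown fox jumps over the lazy dog".split()
    window = 2

    skipgram_pairs = []   # (center -> context): Skip-Gram predicts context words from the center word
    cbow_pairs = []       # (context -> center): CBOW predicts the center word from its context
    for i, center in enumerate(corpus):
        context = [corpus[j]
                   for j in range(max(0, i - window), min(len(corpus), i + window + 1))
                   if j != i]
        skipgram_pairs += [(center, c) for c in context]
        cbow_pairs.append((context, center))

    print(skipgram_pairs[:3])  # [('the', 'quick'), ('the', 'brown'), ('quick', 'the')]
    print(cbow_pairs[0])       # (['quick', 'brown'], 'the')
    ```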

  • 20. Natural Language Processing 3

    · Machine Learning Guide

    Natural Language Processing classical/shallow algorithms.

    ## Resources
    - Speech and Language Processing (http://amzn.to/2uZaNyg) `book:hard` comprehensive classical-NLP bible
    - Stanford NLP YouTube (https://www.youtube.com/playlist?list=PL6397E4B26D00A269) `course|audio:medium`
    - NLTK Book (http://www.nltk.org/book) `book:medium`
    - Convert video to audio:
      ** mp4 => mp3: `for f in *.mp4; do ffmpeg -i "$f" "${f%.mp4}.mp3" && rm "$f"; done`
      ** youtube => mp3: set up youtube-dl (https://github.com/rg3/youtube-dl) and run `youtube-dl -x youtube.com/playlist?list=`

    ## Episode
    - Parsing
      ** Constituents
      ** Grammar: Context-Free Grammars (CFGs), Probabilistic CFGs (PCFGs), Cocke–Younger–Kasami (CYK)
      ** Dependency trees: greedy transition-based parsing (stack/buffer)
      ** SyntaxNet (English model = Parsey McParseface)
    - Relationship extraction
    - Question answering / textual entailment (TF-IDF + cosine similarity; parsing; NER)
    - Automatic summarization (TF-IDF; TextRank)
    - Machine translation (details here (https://www.youtube.com/watch?v=QuELiw8tbx8&list=PL3FW7Lu3i5Jsnh1rnUwq_TcylNr7EkRe6&index=9))
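
    The "TF-IDF + cosine similarity" approach mentioned for question answering and document similarity can be sketched in a few lines with scikit-learn (the documents and query below are made up for illustration):

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "RNNs handle sequences such as text and time series",
        "Convolutional networks excel at image recognition",
        "TF-IDF weights words by how informative they are in a corpus",
    ]
    query = ["which model works on time series text"]

    vec = TfidfVectorizer(stop_words="english")
    doc_matrix = vec.fit_transform(docs)   # one TF-IDF row per document
    query_vec = vec.transform(query)       # project the query into the same space

    scores = cosine_similarity(query_vec, doc_matrix)[0]
    best = scores.argmax()
    print(best, docs[best])                # expect the RNN / time-series document
    ```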

  • 19. Natural Language Processing 2

    · Machine Learning Guide

    Natural Language Processing classical/shallow algorithms.

    ## Resources
    - Speech and Language Processing (http://amzn.to/2uZaNyg) `book:hard` comprehensive classical-NLP bible
    - Stanford NLP YouTube (https://www.youtube.com/playlist?list=PL6397E4B26D00A269) `course|audio:medium`
    - NLTK Book (http://www.nltk.org/book) `book:medium`
    - Convert video to audio:
      ** mp4 => mp3: `for f in *.mp4; do ffmpeg -i "$f" "${f%.mp4}.mp3" && rm "$f"; done`
      ** youtube => mp3: set up youtube-dl (https://github.com/rg3/youtube-dl) and run `youtube-dl -x youtube.com/playlist?list=`

    ## Episode
    - Edit distance: Levenshtein distance (see the sketch below)
    - Stemming/lemmatization: Porter Stemmer
    - N-grams, tokens: regex
    - Language models
      ** Machine translation, spelling correction, speech recognition
    - Classification / sentiment analysis: SVM, Naive Bayes
    - Information extraction (POS, NER). Models: MaxEnt, Hidden Markov Models (HMM), Conditional Random Fields (CRF)
    - Generative vs discriminative models
      ** Generative: HMM, Naive Bayes, LDA
      ** Discriminative: SVMs, MaxEnt / LogReg, ANNs
      ** Pros/cons: generative models get by with less data (and NLP tends to have little labeled data)
      ** MaxEnt vs Naive Bayes: Bayes' independence assumption, etc. ("Hong" "Kong")
    - Topic modeling and keyword extraction: Latent Dirichlet Allocation (LDA)
      ** LDA ~= LSA ~= LSI: latent Dirichlet allocation, latent semantic indexing, latent semantic analysis
    - Search / relevance / document similarity: bag-of-words, TF-IDF
    - Similarity measures: Jaccard, Cosine, Euclidean
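
    A minimal dynamic-programming implementation of the Levenshtein edit distance mentioned above (plain Python, no libraries assumed):

    ```python
    def levenshtein(a: str, b: str) -> int:
        """Minimum number of single-character insertions, deletions, or substitutions turning a into b."""
        prev = list(range(len(b) + 1))                # distances from "" to every prefix of b
        for i, ca in enumerate(a, start=1):
            curr = [i]                                # distance from a[:i] to ""
            for j, cb in enumerate(b, start=1):
                cost = 0 if ca == cb else 1
                curr.append(min(prev[j] + 1,          # deletion
                                curr[j - 1] + 1,      # insertion
                                prev[j - 1] + cost))  # substitution (or free match)
            prev = curr
        return prev[-1]

    print(levenshtein("kitten", "sitting"))  # 3
    ```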

  • 18. Natural Language Processing 1

    · Machine Learning Guide

    Introduction to Natural Language Processing (NLP) topics.

    ## Resources
    - Speech and Language Processing (http://amzn.to/2uZaNyg) `book:hard` comprehensive classical-NLP bible
    - Stanford NLP YouTube (https://www.youtube.com/playlist?list=PL6397E4B26D00A269) `course|audio:medium`
    - NLTK Book (http://www.nltk.org/book) `book:medium`
    - Convert video to audio:
      ** mp4 => mp3: `for f in *.mp4; do ffmpeg -i "$f" "${f%.mp4}.mp3" && rm "$f"; done`
      ** youtube => mp3: setup youtube-dl (https://github.com/rg3/youtube-dl) and run `youtube-dl -x youtube.com/playlist?list=`

    ## Errata
    22:21 "cat & car different by one word" should be "different by one letter"

    ## Episode
    Syntax vs semantics

    Parts
    - Corpus
    - Lexicon
    - Morphology
      ** Lemmas & stems (reduce morphological variation; lemmatization is more sophisticated)
      ** Tokens
      ** Stop words
      ** Edit distance
      ** Word sense disambiguation

    Syntax / tasks
    - Information extraction (POS, NER, relationship extraction)
    - Parsing

    Goals
    - Spell check
    - Classification
      ** Tagging (topic modeling / keyword extraction)
      ** Sentiment analysis
    - Search / relevance, document similarity
    - Natural language understanding
      ** Question answering
      ** Textual entailment
      ** Machine translation (AI-complete)
      ** NLU vs NLP
    - Natural language generation
      ** Image captioning
      ** Chatbots
      ** Automatic summarization
    - Won't cover
      ** Optical character recognition (OCR)
      ** Speech (TTS, STT, segmentation, diarization)
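
    A short NLTK sketch of the tokens / stems / lemmas / stop-words ideas above (NLTK is the library behind the NLTK Book in Resources; the download calls fetch its data packages on first run):

    ```python
    import nltk
    from nltk.corpus import stopwords
    from nltk.stem import PorterStemmer, WordNetLemmatizer

    for pkg in ("punkt", "stopwords", "wordnet"):
        nltk.download(pkg, quiet=True)

    text = "The cats were running faster than the dogs"
    tokens = [t for t in nltk.word_tokenize(text.lower())
              if t not in stopwords.words("english")]

    stemmer, lemmatizer = PorterStemmer(), WordNetLemmatizer()
    print([stemmer.stem(t) for t in tokens])                   # crude suffix stripping, e.g. 'running' -> 'run'
    print([lemmatizer.lemmatize(t, pos="v") for t in tokens])  # dictionary-based, e.g. 'running' -> 'run'
    ```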

  • 17. Checkpoint

    · Machine Learning Guide

    Checkpoint - learn the material offline!

    45m/d ML
    - Coursera (https://www.coursera.org/learn/machine-learning) `course:hard`
    - Python (http://amzn.to/2mVgtJW) `book:medium`
    - Deep Learning Resources (http://ocdevel.com/podcasts/machine-learning/9)
    - Go deeper on shallow algos
      ** Elements of Statistical Learning (http://amzn.to/2tWW8He) `book:hard`
      ** Pattern Recognition and Machine Learning (http://amzn.to/2sDIIfb) (Free PDF? (https://goo.gl/aX038j)) `book:hard`

    15m/d Math
    - Either LinAlg (https://www.khanacademy.org/math/linear-algebra) `course:medium` OR Fast.ai (http://www.fast.ai/2017/07/17/num-lin-alg/) `course:medium`
    - Stats (https://www.khanacademy.org/math/statistics-probability) `course:medium`
    - Calc (https://www.khanacademy.org/math/calculus-home) `course:medium`

    Audio
    - (removed CS229 - very heavy chalkboard use lends poorly to audio)
    - The Master Algorithm (http://amzn.to/2kLOQjW) `audio:medium` semi-technical overview of ML basics & main algorithms
    - Mathematical Decision Making (https://goo.gl/V75I49) `audio|course:hard` course on "Operations Research", similar to ML
    - Statistics (https://projects.iq.harvard.edu/stat110/youtube) `audio|course:hard`
    - Calculus 1 (https://goo.gl/fcLP3l) 2 (https://goo.gl/sBpljN) 3 (https://goo.gl/8Hdwuh) `course|audio:hard`
    - Convert video to audio:
      ** mp4 => mp3: `for f in *.mp4; do ffmpeg -i "$f" "${f%.mp4}.mp3" && rm "$f"; done`
      ** youtube => mp3: setup youtube-dl (https://github.com/rg3/youtube-dl) and run `youtube-dl -x youtube.com/playlist?list=`

    Kaggle.com (https://www.kaggle.com/)

  • 16. Consciousness

    · Machine Learning Guide

    Can AI be conscious?

    ## Resources
    Philosophy of Mind: Brains, Consciousness, and Thinking Machines (http://amzn.to/2kQGgk5) `audio:easy`

    ## Episode
    Inspirations for AI
    - economic automation
    - singularity
    - consciousness

    Definitions
    - cogsci: neuroscience, neuro-x (biology, physiology, computational, etc.), psychology, philosophy, AI
      ** computational neuroscience => perceptron
      ** Frank Rosenblatt, Warren McCulloch, Walter Pitts - all brain guys (neurobiology, neurophysiology, computational neuroscience respectively)
    - intelligence (computation) vs consciousness (soul); intelligence in scale (animals); brain in scale; consciousness in scale?
    - perception, self-identity, memory, attention (self-reflection is just a human-special component)
    - awareness (qualia / sentience / subjective experience); modified by attention? (driving, dreams, coma)
    - missing: emotions; just a built-in goal reinforcer. Plus we don't know how machines experience reinforcement (floor-is-lava)

    Hard vs soft problem
    ** soft problem = neuroscience
    ** hard problem = philosophy
    ** dualism: pineal gland, issue with the physical->metaphysical link; society of mind / connected intelligences
    ** maybe definitively non-science, since subjective
    ** maybe a matter of time; philosophy is pre-science at each juncture; science turns magic => known (sickness). Either the hard problem is unscientific (philosophy) or around the corner

    Emergence (emergent property)

    Computational theory of mind
    - intelligence & consciousness connected / the same
    - think: word2vec = understanding?
    - consciousness in scale; does this mean every layer has its own consciousness? Panpsychism. I don't know - just concerned with that which does exhibit intelligence
    - integrated information theory
    - free will; the conscious/awareness center activates after the decision is made; all the information already in place; Westworld

    Biological plausibility
    - planes vs brains
    - sans bio-plausibility: functionalism; zombies; Turing test; Searle's Chinese room

  • 15. Performance

    · Machine Learning Guide

    Performance evaluation & improvement.

    ## Episode
    Performance evaluation
    - Performance measures: accuracy, precision, recall, F1/F2 score
    - Cross validation: split your data into train, validation, and test sets
      ** Training set is for training your algorithm
      ** Validation set is for testing your algorithm's performance; it can be used to inform changes to your model (i.e., hyperparameters)
      ** Test set is used for your final score; it can't be used to inform changes to your model

    Performance improvement
    - Modify hyperparameters
    - Data: collect more, fill in missing cells, normalize fields
    - Regularize: tune regularization to balance overfitting (high variance) against underfitting (high bias)
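
    A minimal scikit-learn sketch of the evaluation workflow above: split the data, score on the validation set while tuning, and only touch the test set for the final number (dataset and model choices here are arbitrary):

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
    X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

    model = LogisticRegression(max_iter=5000).fit(X_train, y_train)

    val_pred = model.predict(X_val)   # validation scores may inform hyperparameter changes
    print("val acc/precision/recall/F1:",
          accuracy_score(y_val, val_pred), precision_score(y_val, val_pred),
          recall_score(y_val, val_pred), f1_score(y_val, val_pred))

    # report the test score once; don't tune against it
    print("final test accuracy:", accuracy_score(y_test, model.predict(X_test)))
    ```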

  • 14. Shallow Algos 3

    · Machine Learning Guide

    Speed run of anomaly detection, recommenders (content filtering vs collaborative filtering), and Markov Chain Monte Carlo (MCMC).

    ## Resources
    - Andrew Ng Week 9 (https://www.coursera.org/learn/machine-learning/resources/szFCa)

    ## Episode
    - Anomaly detection algorithm
    - Recommender systems (content filtering, collaborative filtering)
    - Markov chains & Monte Carlo
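
    As a taste of the anomaly-detection algorithm from Ng's Week 9 (fit a Gaussian per feature, flag examples whose density falls below a threshold epsilon), here is a rough NumPy/SciPy sketch with made-up data:

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    X = rng.normal(loc=0.0, scale=1.0, size=(500, 2))   # "normal" behaviour
    X_new = np.array([[0.1, -0.2], [6.0, 6.0]])         # second example is an obvious outlier

    mu, sigma = X.mean(axis=0), X.std(axis=0)

    def density(x):
        # naive independence assumption: product of per-feature Gaussian densities
        return norm.pdf(x, mu, sigma).prod(axis=-1)

    epsilon = 1e-4
    print(density(X_new) < epsilon)   # [False  True] -> only the outlier is flagged as anomalous
    ```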

  • 13. Shallow Algos 2

    · Machine Learning Guide

    Speed run of Support Vector Machines (SVMs) and the Naive Bayes classifier.

    ## Resources
    - Andrew Ng Week 7 (https://www.coursera.org/learn/machine-learning/resources/Es9Qo)
    - Machine Learning with R (http://amzn.to/2n5fSUF) `book:medium`
    - Mathematical Decision Making (https://goo.gl/V75I49) `audio|course:hard` course on "Operations Research", similar to ML
    - Which algo to use?
      ** Pros/cons table for algos (https://blog.recast.ai/machine-learning-algorithms/2/) `picture`
      ** Decision tree of algos (http://scikit-learn.org/stable/tutorial/machine_learning_map/) `picture`

    ## Episode
    - Support Vector Machines (SVM)
    - Naive Bayes classifier
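
    A quick scikit-learn sketch contrasting the two classifiers covered in this episode (iris is just a convenient built-in dataset):

    ```python
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    svm = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)   # maximum-margin classifier with a kernel
    nb = GaussianNB().fit(X_train, y_train)                # probabilistic, assumes feature independence

    print("SVM accuracy:", svm.score(X_test, y_test))
    print("Naive Bayes accuracy:", nb.score(X_test, y_test))
    ```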

  • 12. Shallow Algos 1

    · Machine Learning Guide

    Speed-run of some shallow algorithms: K-Nearest Neighbors (KNN), K-Means, Apriori, PCA, and Decision Trees.

    ## Resources
    - Andrew Ng Week 8 (https://www.coursera.org/learn/machine-learning/resources/kGWsY)
    - Tour of Machine Learning Algorithms (http://machinelearningmastery.com/a-tour-of-machine-learning-algorithms) `article:easy`
    - Elements of Statistical Learning (http://amzn.to/2tWW8He) `book:hard`
    - Pattern Recognition and Machine Learning (http://amzn.to/2sDIIfb) (Free PDF? (https://goo.gl/aX038j)) `book:hard`
    - Machine Learning with R (http://amzn.to/2n5fSUF) `book:medium`
    - Which algo to use?
      ** Pros/cons table for algos (https://blog.recast.ai/machine-learning-algorithms/2/) `picture`
      ** Decision tree of algos (http://scikit-learn.org/stable/tutorial/machine_learning_map/) `picture`

    ## Episode
    - KNN (supervised)
    - Unsupervised
      ** Clustering -> K-Means
      ** Association rule learning / market basket -> Apriori
      ** Dimensionality reduction -> PCA
    - Decision Trees (supervised, classify/regress)
      ** Random Forests
      ** Gradient Boosting
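
    A one-screen scikit-learn tour of most of the algorithms name-checked here (Apriori isn't in scikit-learn, so it's omitted; data and hyperparameters are arbitrary):

    ```python
    from sklearn.cluster import KMeans
    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)

    knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)          # supervised: vote of the nearest neighbors
    kmeans = KMeans(n_clusters=3, n_init=10).fit(X)              # unsupervised clustering
    pca = PCA(n_components=2).fit(X)                             # dimensionality reduction
    forest = RandomForestClassifier(n_estimators=100).fit(X, y)  # ensemble of decision trees

    print(knn.predict(X[:1]), kmeans.labels_[:5],
          pca.explained_variance_ratio_, forest.score(X, y))
    ```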

  • 11. Checkpoint

    · Machine Learning Guide

    Checkpoint - start learning the material offline!

    45m/d ML
    - Coursera (https://www.coursera.org/learn/machine-learning) `course:hard`
    - Python (http://amzn.to/2mVgtJW) `book:medium`
    - Deep Learning Resources (http://ocdevel.com/podcasts/machine-learning/9)

    15m/d Math (KhanAcademy)
    - Either LinAlg (https://www.khanacademy.org/math/linear-algebra) `course:medium` OR Fast.ai (http://www.fast.ai/2017/07/17/num-lin-alg/) `course:medium`
    - Stats (https://www.khanacademy.org/math/statistics-probability) `course:medium`
    - Calc (https://www.khanacademy.org/math/calculus-home) `course:medium`

    Audio
    - The Master Algorithm (http://amzn.to/2kLOQjW) `audio:medium` semi-technical overview of ML basics & main algorithms
    - Mathematical Decision Making (https://goo.gl/V75I49) `audio|course:hard` course on "Operations Research", similar to ML
    - Statistics (https://projects.iq.harvard.edu/stat110/youtube) `audio|course:hard`
    - Calculus 1 (https://goo.gl/fcLP3l) 2 (https://goo.gl/sBpljN) 3 (https://goo.gl/8Hdwuh) `course|audio:hard`
    - Convert video to audio:
      ** mp4 => mp3: `for f in *.mp4; do ffmpeg -i "$f" "${f%.mp4}.mp3" && rm "$f"; done`
      ** youtube => mp3: setup youtube-dl (https://github.com/rg3/youtube-dl) and run `youtube-dl -x youtube.com/playlist?list=`

  • 10. Languages & Frameworks

    · Machine Learning Guide

    Languages & frameworks comparison. Languages: Python, R, MATLAB/Octave, Julia, Java/Scala, C/C++. Frameworks: Hadoop/Spark, Deeplearning4J, Theano, Torch, TensorFlow.

    ## Resources
    - Python (http://amzn.to/2mVgtJW) `book:medium`
    - TensorFlow Tutorials (https://www.tensorflow.org/get_started/get_started) `tutorial:medium`
    - Hands-On Machine Learning with Scikit-Learn and TensorFlow (http://amzn.to/2tVdIXN) `book:medium`

    ## Episode
    Languages
    - C/C++
      ** Performance
      ** GPU (CUDA/cuDNN)
    - Math langs
      ** R
      ** MATLAB / Octave
      ** Julia
    - Java / Scala
      ** Data mining
      ** Hadoop + Mahout / Spark + SparkML
      ** Deeplearning4j
    - Python
      ** R => Pandas
      ** MATLAB => NumPy
      ** C/C++/GPU => TensorFlow (or another symbolic-graph framework)
      ** Data mining => PySpark
      ** Server (Flask, Django)
    - Analogy: Data => Analytics (biz intelligence, etc.) => AdSense
    - Other languages like Node, Go, Rust (forgot to mention): see my answer (https://goo.gl/9d21xE) for why NOT to use them.
    - Articles
      ** Best Programming Language for Machine Learning (http://machinelearningmastery.com/best-programming-language-for-machine-learning)
      ** Data Science Job Report 2017 (http://r4stats.com/2017/02/28/r-passes-sas)

    Frameworks
    - ML libraries
      ** NumPy, Pandas, scikit-learn
    - Computational/symbolic graphs
      ** Automatic differentiation (see the sketch below)
    - Theano
      ** Math layer
      ** Blocks/Lasagne ML layer
      ** Keras DL layer
    - Torch
      ** CNNs
      ** note about RNNs
    - TensorFlow
      ** Performance over time
      ** Mobile etc.
      ** Keras
    - Others
      ** Caffe (old-and-dying, C++)
      ** CNTK (MS)
      ** MXNet (Amazon)
      ** DL4J
      ** OpenCV (vision only)
    - Articles
      ** An Overview of Python Deep Learning Frameworks (http://www.kdnuggets.com/2017/02/python-deep-learning-frameworks-overview.html)
      ** Evaluation of Deep Learning Toolkits (https://github.com/zer0n/deepframeworks/blob/master/README.md)
      ** Comparing Frameworks: Deeplearning4j, Torch, Theano, TensorFlow, Caffe, Paddle, MxNet, Keras & CNTK (https://deeplearning4j.org/compare-dl4j-torch7-pylearn) - grain of salt, it's heavy DL4J propaganda (written by them)
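
    The "computational graph + automatic differentiation" idea these frameworks share fits in a few lines; this sketch assumes TensorFlow 2.x (newer than the frameworks discussed in the episode, but the concept is identical):

    ```python
    import tensorflow as tf

    # The framework records the ops applied to x and derives the gradient for us,
    # which is what makes backprop essentially "free" for the model author.
    x = tf.Variable(3.0)
    with tf.GradientTape() as tape:
        y = x ** 2 + 2 * x          # y = x^2 + 2x
    print(tape.gradient(y, x))      # dy/dx = 2x + 2 = 8.0
    ```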

  • 9. Deep Learning

    · Machine Learning Guide

    Deep learning and neural networks. How to stack our logistic regression units into a multi-layer perceptron.

    ## Resources
    - Overview:
      ** Deep Learning Simplified (https://www.youtube.com/watch?v=b99UVkWzYTQ) `video:easy` quick series to get a lay of the land
    - Quickstart:
      ** TensorFlow Tutorials (https://www.tensorflow.org/get_started/get_started) `tutorial:medium`
    - Deep-dive code (pick one):
      ** Fast.ai (http://course.fast.ai/) `course:medium` practical DL for coders
      ** Hands-On Machine Learning with Scikit-Learn and TensorFlow (http://amzn.to/2tVdIXN) `book:medium`
    - Deep-dive theory (pick one):
      ** Deep Learning Book (http://amzn.to/2tXgCiT) (Free HTML version (http://www.deeplearningbook.org/)) `book:hard` comprehensive DL bible; highly mathematical
      ** Neural Networks and Deep Learning (http://neuralnetworksanddeeplearning.com/) `book:medium` shorter online "book"

    ## Episode
    - Value
      ** Represents the brain? Magic black box
      ** Feature learning (a layer removed from the programmer)
      ** Subsumes AI
    - Stacked shallow learning
      ** Logistic regression = Lego, neural network = castle
    - Deep Learning => ANNs => MLPs (& RNNs, CNNs, DQNs, etc.)
      ** MLP: perceptron vs LogReg / sigmoid activation
    - Architecture
      ** (Feed forward) Input => hidden layers => hypothesis fn
      ** "Feed forward" vs recurrent (RNNs, later)
      ** (Loss function) Cross entropy
      ** (Learn) Back propagation
    - Price ~ smoking + obesity + age^2
      ** 1-layer MLP
    - Face? ~ pixels
      ** Extra layer = hierarchical breakdown
      ** Inputs => employees => supervisors => boss
    - Backprop / gradient descent
      ** Optimizers: Adagrad, Adam, ... vs plain gradient descent
    - Silver bullet, but don't abuse it
      ** linear problems (housing market)
      ** features that don't combine
      ** expensive: like hiring a whole company when the boss h(x) does all the work
    - Brain comparison (dendrites, axons); early pioneers were neuroscientists / cogsci people
    - Different types
      ** vs brain
      ** RNNs
      ** CNNs
    - Activation fns
      ** Activation units / neurons (hidden layer)
      ** ReLU, TanH, Sigmoid
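
    A minimal Keras sketch of the MLP described above - logistic-regression-like units stacked into a hidden layer (the "employees") feeding a final sigmoid (the "boss"); the data and layer sizes are made up for illustration:

    ```python
    import numpy as np
    import tensorflow as tf

    X = np.random.rand(200, 3)                           # e.g. smoking, obesity, age (already scaled)
    y = (X @ np.array([2.0, 1.5, 3.0]) > 3).astype(int)  # toy binary target

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(8, activation="relu", input_shape=(3,)),  # hidden layer
        tf.keras.layers.Dense(1, activation="sigmoid"),                 # hypothesis fn
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=10, verbose=0)                # backprop / gradient descent
    print(model.evaluate(X, y, verbose=0))               # [loss, accuracy]
    ```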

  • 8. Math

    · Machine Learning Guide

    Introduction to the branches of mathematics used in machine learning: linear algebra, statistics, calculus.

    ## Resources
    Come back here after you've finished Ng's course, or learn these resources in tandem with ML (say 1 day a week).

    Primers (PDFs)
    - See "Section Notes" of cs229 (http://cs229.stanford.edu/materials.html) `handout:medium`

    KhanAcademy
    - Either LinAlg (https://www.khanacademy.org/math/linear-algebra) `course:medium` OR Fast.ai (http://www.fast.ai/2017/07/17/num-lin-alg/) `course:medium`
    - Stats (https://www.khanacademy.org/math/statistics-probability) `course:medium`
    - Calc (https://www.khanacademy.org/math/calculus-home) `course:medium`

    Books
    - Linear Algebra Done Right (http://amzn.to/2t28p8F) `book:hard`
    - All of Statistics (http://amzn.to/2t2dOwg) `book:hard`
    - Calculus (http://amzn.to/2tXfXhp) `book:hard`

    Audio (supplementary material)
    - Statistics (https://projects.iq.harvard.edu/stat110/youtube) `audio|course:hard`
    - Calculus 1 (https://goo.gl/fcLP3l) 2 (https://goo.gl/sBpljN) 3 (https://goo.gl/8Hdwuh) `course|audio:hard`
    - Mathematical Decision Making (https://goo.gl/V75I49) `audio|course:hard` course on "Operations Research", similar to ML
    - Information Theory (http://www.thegreatcourses.com/courses/the-science-of-information-from-language-to-black-holes.html) `course|audio:hard`
    - Convert video to audio:
      ** mp4 => mp3: `for f in *.mp4; do ffmpeg -i "$f" "${f%.mp4}.mp3" && rm "$f"; done`
      ** youtube => mp3: setup youtube-dl (https://github.com/rg3/youtube-dl) and run `youtube-dl -x youtube.com/playlist?list=`

    ## Episode
    - Linear algebra = matrix (or "tensor") math: Wx + b. The chopping step in our analogy.
    - Stats = probability/inference, the heart of machine learning. The recipes/cookbook.
    - Calculus = learning: moving our error dot to the bottom of the valley. Baking, the actual "cook" step.
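
    The three branches in one toy NumPy snippet - linear algebra computes Wx + b, a squared-error loss summarizes how wrong we are, and calculus moves the weights downhill (all numbers are arbitrary):

    ```python
    import numpy as np

    W, b = np.array([[0.5, -1.0]]), np.array([0.1])
    x, target = np.array([2.0, 1.0]), np.array([1.0])

    pred = W @ x + b                            # linear algebra: the "chopping" step
    loss = ((pred - target) ** 2).mean()        # statistics: squared-error loss
    grad_W = 2 * (pred - target)[:, None] * x   # calculus: d(loss)/dW
    W = W - 0.1 * grad_W                        # one gradient-descent step toward the valley floor
    print(loss, W)
    ```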

  • 7. Logistic Regression

    · Machine Learning Guide

    Your first classifier: logistic regression. That plus linear regression, and you're a 101 supervised learner!

    ## Resources
    You've started Ng's Coursera course (https://www.coursera.org/learn/machine-learning), right? Riight?

    ## Episode
    See Andrew Ng Week 3 Lecture Notes (https://www.coursera.org/learn/machine-learning/resources/Zi29t)
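
    If you want the one-line intuition before Ng's notes: logistic regression squashes a linear score through the sigmoid to get a probability, then thresholds it. A tiny sketch with made-up weights:

    ```python
    import numpy as np

    def h(x, w, b):
        return 1 / (1 + np.exp(-(np.dot(w, x) + b)))   # sigmoid(w.x + b)

    w, b = np.array([1.2, -0.7]), -0.3
    x = np.array([2.0, 1.0])
    prob = h(x, w, b)
    print(round(prob, 3), int(prob >= 0.5))            # ~0.802, class 1
    ```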

  • 6. Certificates & Degrees

    · Machine Learning Guide

    Discussion of certificates and degrees, from Udacity Nanodegrees to a Master's degree.

    ## Resources
    - Discussions: 1 (http://canyon289.github.io/DSGuide.html#DSGuide) 2 (https://news.ycombinator.com/item?id=13654127) 3 (http://cole-maclean.github.io/blog/Self%20Taught%20AI/) 4 (https://news.ycombinator.com/item?id=12516441)

    ## Episode
    Self-edify
    - Coursera Specialization - flat $500
    - Udacity Nanodegree - $200/m (discount for timely completion)
      ** Great for self-teaching, not a recognized degree
      ** Machine Learning (https://www.udacity.com/course/machine-learning-engineer-nanodegree--nd009)
      ** Self-Driving Car (https://www.udacity.com/drive)
      ** Artificial Intelligence (https://www.udacity.com/ai)

    OMSCS (https://www.omscs.gatech.edu/): great & cheap online master's degree

    Portfolio: most important for getting a job

  • 5. Linear Regression

    · Machine Learning Guide

    Introduction to the first machine-learning algorithm, the "hello world" of supervised learning: linear regression.

    ## Resources
    - Andrew Ng's Machine Learning Coursera course (https://www.coursera.org/learn/machine-learning) `course:hard` No question, the most essential, important, recommended resource in my entire series _period_. Consider it required, not optional.

    ## Episode
    See Andrew Ng Week 2 Lecture Notes (https://www.coursera.org/learn/machine-learning/resources/QQx8l)
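
    The "hello world" itself, as a few lines of scikit-learn: fit a line to (x, y) pairs and predict (the numbers are made up so the fit is exact):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    X = np.array([[1.0], [2.0], [3.0], [4.0]])   # feature, e.g. number of rooms
    y = np.array([150, 200, 250, 300])           # label, e.g. price in $k

    model = LinearRegression().fit(X, y)
    print(model.coef_, model.intercept_)         # ~[50.]  ~100.0
    print(model.predict([[5.0]]))                # ~[350.]
    ```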

  • 4. Algorithms - Intuition

    · Machine Learning Guide

    Overview of machine learning algorithms. Infer/predict -> error/loss -> train/learn. Supervised, unsupervised, and reinforcement learning.

    ## Resources
    - Tour of Machine Learning Algorithms (http://machinelearningmastery.com/a-tour-of-machine-learning-algorithms) `article:easy`
    - The Master Algorithm (http://amzn.to/2kLOQjW) `audio:medium` semi-technical overview of ML basics & main algorithms

    ## Episode
    Learning (ML)
    - 3-step process
      ** Infer / predict
      ** Error / loss
      ** Train / learn
    - First as a batch from a spreadsheet, then "online" going forward
      ** Pre-train your "model"
      ** "Examples"
      ** "Weights"
    - Housing cost example
      ** "Features"
      ** Infer cost based on num_rooms, sq_foot, etc.
      ** Error / loss function

    Categories
    - Supervised learning
      ** Vision (CNN)
      ** Speech (RNN)
    - Unsupervised
      ** Market segmentation
    - Reinforcement & semi-supervised
      ** Planning (DQN): games (chess, Mario); robot movement
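
    The infer -> error -> train loop from this episode, written out as plain gradient descent on a made-up housing example:

    ```python
    import numpy as np

    X = np.array([[3, 2000], [2, 1200], [4, 2500], [1, 800]], dtype=float)  # num_rooms, sq_foot
    X = (X - X.mean(axis=0)) / X.std(axis=0)          # scale the features
    y = np.array([400.0, 250.0, 500.0, 150.0])        # price in $k
    w, b, lr = np.zeros(2), 0.0, 0.1                  # the "weights" of our "model"

    for _ in range(500):
        pred = X @ w + b                  # 1. infer / predict
        err = pred - y                    # 2. error / loss
        w -= lr * (X.T @ err) / len(y)    # 3. train / learn: step the weights downhill
        b -= lr * err.mean()

    print(w, b)                           # learned weights, ready to score new "examples" online
    ```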
