This list of resources is targeted at Web Developers and Data Scientists, so do with it what you will.
This list borrows heavily from multiple lists created by sindresorhus.
Machine learning is a subfield of artificial intelligence, which is broadly defined as the capability of a machine to imitate intelligent human behavior. Artificial intelligence systems are used to perform complex tasks in a way that is similar to how humans solve problems.
The goal of AI is to create computer models that exhibit "intelligent behaviors" like humans, according to Boris Katz, a principal research scientist and head of the InfoLab Group at CSAIL. This means machines that can recognize a visual scene, understand a text written in natural language, or perform an action in the physical world.
Machine learning is one way to use AI. It was defined in the 1950s by AI pioneer Arthur Samuel as "the field of study that gives computers the ability to learn without being explicitly programmed."
[📖] Good to Great: Why Some Companies Make the Leap...And Others Don't
[📖] Hello, Startup: A Programmer's Guide to Building Products, Technologies, and Teams
[📖] How Google Works
[📖] Learn to Earn: A Beginner's Guide to the Basics of Investing and Business
[📖] Rework
[📖] The Airbnb Story
[📖] The Personal MBA
[🅤]ꭏ App Monetization
[🅤]ꭏ App Marketing
Natural language processing
Natural language processing is a field of machine learning in which machines learn to understand natural language as spoken and written by humans, instead of the data and numbers normally used to program computers. This allows machines to recognize language, understand it, and respond to it, as well as create new text and translate between languages. Natural language processing enables familiar technology like chatbots and digital assistants like Siri or Alexa.
Neural networks
Neural networks are a commonly used, specific class of machine learning algorithms. Artificial neural networks are modeled on the human brain, in which thousands or millions of processing nodes are interconnected and organized into layers.
In an artificial neural network, cells, or nodes, are connected, with each cell processing inputs and producing an output that is sent to other neurons. Labeled data moves through the nodes, with each cell performing a different function. In a neural network trained to identify whether a picture contains a cat, the nodes would assess the information and arrive at an output indicating whether the picture features a cat.
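The layer-by-layer flow described above can be sketched in a few lines of Python; this is a toy forward pass with made-up weights, not a trained model:

```python
import math

def sigmoid(x):
    # Squash a node's weighted sum into the range (0, 1)
    return 1 / (1 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each node computes a weighted sum of its inputs plus a bias,
    # then applies an activation function
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Two input features flow through one hidden layer into a single output node
hidden = layer([0.5, 0.9], weights=[[0.8, -0.2], [0.4, 0.6]], biases=[0.0, 0.1])
output = layer(hidden, weights=[[1.2, -0.7]], biases=[0.0])[0]
print(round(output, 3))  # a "cat / not cat" score between 0 and 1
```

Training would adjust the weights and biases so the score matches the labels; here they are fixed for illustration.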
Be familiar with how Machine Learning is applied at other companies
[📰] How Facebook uses super-efficient AI models to detect hate speech
[📰] Cannes: How Machine Learning saves us $1.7M a year on document previews
- [ ] Real-world AI Case Studies
- [ ] Andrej Karpathy on AI at Tesla (Full Stack Deep Learning - August 2018)
- [ ] Jai Ranganathan at Data Science at Uber (Full Stack Deep Learning - August 2018)
- [ ] John Apostolopoulos of Cisco discusses "Machine Learning in Networking"
0:48:44
- [ ] Joaquin Candela, Director of Applied Machine Learning, Facebook in conversation with Esteban Arcaute
0:52:27
- [ ] Eric Colson, Chief Algorithms Officer, Stitch Fix
0:53:57
- [ ] Claudia Perlich, Advisor to Dstillery and Adjunct Professor NYU Stern School of Business
0:51:59
- [ ] Jeff Dean, Google Senior Fellow and SVP Google AI - Deep Learning to Solve Challenging Problems
0:58:45
- [ ] James Parr, Director of Frontier Development Lab (NASA), FDL Europe & CEO, Trillium Technologies
0:55:46
- [ ] Daphne Koller, Founder & CEO of Insitro - In Conversation with Carlos Bustamante
0:49:29
- [ ] Eric Horvitz, Microsoft Research - AI in the Open World: Advances, Aspirations, and Rough Edges
0:56:11
- [ ] Tony Jebara, Netflix - Machine Learning for Recommendation and Personalization
0:55:20
[📺 ] How does YouTube recommend videos? - AI EXPLAINED!
0:33:53
[📺 ] How does Google Translate's AI work?
0:15:02
[📺 ] Data Science in Finance
0:17:52
- [📺 ] The Age of AI
- [ ] How Far is Too Far? | The Age of A.I.
0:34:39
- [ ] Healed through A.I. | The Age of A.I.
0:39:55
- [ ] Using A.I. to build a better human | The Age of A.I.
0:44:27
- [ ] Love, art and stories: decoded | The Age of A.I.
0:38:57
- [ ] The 'Space Architects' of Mars | The Age of A.I.
0:30:10
- [ ] Will a robot take my job? | The Age of A.I.
0:36:14
- [ ] Saving the world one algorithm at a time | The Age of A.I.
0:46:37
- [ ] How A.I. is searching for Aliens | The Age of A.I.
0:36:12
[📺 ] Using Intent Data to Optimize the Self-Solve Experience
[📺 ] Trillions of Questions, No Easy Answers: A (home) movie about how Google Search works
[📺 ] Netflix Machine Learning Mock Interview: Type-ahead Search
[📺 ] Engineering Systems for Real-Time Predictions @DoorDash
[📺 ] How Gmail Uses Iterative Design, Machine Learning and AI to Create More Assistive Features
[📺 ] Wayfair Data Science Explains It All: Human-in-the-loop Systems
[📺 ] Leaving the lab: Building NLP applications that real people can use
[📺 ] Machine Learning at Uber (Natural Language Processing Use Cases)
[📺 ] Machine learning across industries with Vicki Boykis
0:34:02
[📺 ] Rachael Tatman - Conversational A.I. and Linguistics
0:36:51
[📺 ] Nicolas Koumchatzky - Machine Learning in Production for Self Driving Cars
0:44:56
[📺 ] Brandon Rohrer - Machine Learning in Production for Robots
0:34:31
Be able to frame a Machine Learning problem
[📰] Software 2.0
[📰] Most impactful AI trends of 2018: the rise of Machine Learning Engineering
[📰] Building machine learning products: a problem well-defined is a problem half-solved.
[📰] Simple considerations for simple people building fancy neural networks
[📖] AI Superpowers: China, Silicon Valley, and the New World Order
[📖] Prediction Machines: The Simple Economics of Artificial Intelligence
[📖] Building Machine Learning Powered Applications: Going from Idea to Product
[ ] Pluralsight: How to Think About Machine Learning Algorithms
[📺 ] Vincent Warmerdam: The profession of solving (the wrong problem) | PyData Amsterdam 2019
[📺 ] Hugging Face, Transformers | NLP Research and Open Source | Interview with Julien Chaumond
[📺 ] Vincent Warmerdam - Playing by the Rules-Based-Systems | PyData Eindhoven 2020
Be familiar with data ethics
- [ ] Lesson 1: Disinformation
- [ ] Lesson 2: Bias & Fairness
- [ ] Lesson 3: Ethical Foundations & Practical Tools
- [ ] Lesson 4: Privacy and surveillance
- [ ] Lesson 4 continued: Privacy and surveillance
- [ ] Lesson 5.1: The problem with metrics
- [ ] Lesson 5.2: Our Ecosystem, Venture Capital, & Hypergrowth
- [ ] Lesson 5.3: Losing the Forest for the Trees, guest lecture by Ali Alkhatib
- [ ] Lesson 6: Algorithmic Colonialism, and Next Steps
[📺 ] Lecture 9: Ethics (Full Stack Deep Learning - Spring 2021)
1:04:50
[📺 ] SE4AI: Ethics and Fairness
1:18:37
[📺 ] SE4AI: Security
1:18:24
[📺 ] SE4AI: Safety
1:17:37
Be able to import data from multiple sources
Be able to set up data annotation efficiently
[📰] Create A Synthetic Image Dataset — The “What”, The “Why” and The “How”
[📰] Machine Learning Infrastructure Tools for Data Preparation
[📰] Exploring the Role of Human Raters in Creating NLP Datasets
[📺 ] Snorkel: Dark Data and Machine Learning - Christopher Ré
[📺 ] Training a NER Model with Prodigy and Transfer Learning
[📺 ] Training a New Entity Type with Prodigy – annotation powered by active learning
[📺 ] ECCV 2020 WSL tutorial: 4. Human-in-the-loop annotations
[📺 ] Active Learning: Why Smart Labeling is the Future of Data Annotation | Alectio
[📺 ] Lecture 8: Data Management (Full Stack Deep Learning - Spring 2021)
0:59:42
[📺 ] Lab 6: Data Labeling (Full Stack Deep Learning - Spring 2021)
0:05:06
[📺 ] SE4AI: Data Quality
1:07:15
[📺 ] SE4AI: Data Programming and Intro to Big Data Processing
0:33:04
[📺 ] SE4AI: Managing and Processing Large Datasets
1:21:27
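The active-learning idea behind several of these talks — label the examples the model is least sure about first — can be sketched with toy scores (the documents and probabilities below are hypothetical):

```python
def uncertainty(prob):
    # Distance from a confident prediction: 0.5 is maximally uncertain
    return 1 - abs(prob - 0.5) * 2

# Model scores for a pool of unlabeled examples (hypothetical values)
pool = {"doc_a": 0.97, "doc_b": 0.51, "doc_c": 0.10, "doc_d": 0.45}

# Ask annotators to label the most uncertain examples first
queue = sorted(pool, key=lambda k: uncertainty(pool[k]), reverse=True)
print(queue)  # doc_b and doc_d come before the confident doc_a and doc_c
```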
Be able to manipulate data with Numpy
[📰] NumPy Fundamentals for Data Science and Machine Learning
[ ] Pluralsight: Working with Multidimensional Data Using NumPy
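A minimal NumPy warm-up covering two operations that come up constantly, broadcasting and boolean masking (toy data):

```python
import numpy as np

# A small matrix of three samples with four features each
data = np.array([[1.0, 2.0, 3.0, 4.0],
                 [5.0, 6.0, 7.0, 8.0],
                 [9.0, 10.0, 11.0, 12.0]])

# Column-wise standardization via broadcasting: (x - mean) / std
standardized = (data - data.mean(axis=0)) / data.std(axis=0)

# Boolean masking: select rows whose first feature exceeds 2
tall = data[data[:, 0] > 2]
print(standardized.mean(axis=0), tall.shape)
```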
Be able to manipulate data with Pandas
[📰] A Gentle Visual Intro to Data Analysis in Python Using Pandas
[📰] Comprehensive Guide to Grouping and Aggregating with Pandas
[📰] 8 Python Pandas Value_counts() tricks that make your work more efficient
[ ] edX: Implementing Predictive Analytics with Spark in Azure HDInsight
- [📰] Modern Pandas
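The core pandas moves from the grouping and value_counts articles above, on a toy table:

```python
import pandas as pd

# A toy sales table (hypothetical data)
df = pd.DataFrame({
    "region": ["north", "south", "north", "south"],
    "product": ["a", "a", "b", "b"],
    "revenue": [100, 150, 200, 50],
})

# Group, aggregate, and count — the bread and butter of pandas analysis
totals = df.groupby("region")["revenue"].sum()
counts = df["region"].value_counts()
print(totals.to_dict(), counts.to_dict())
```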
Be able to manipulate data in spreadsheets
Be able to manipulate data in databases
[💻] Intermediate SQL
[💻] Reporting in SQL
[💻] Database Design
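The querying patterns from these courses can be practiced without a database server using Python's built-in sqlite3 module:

```python
import sqlite3

# An in-memory database to try out aggregation and ordering
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("ada", 30.0), ("bob", 20.0), ("ada", 50.0)])

# Aggregate revenue per customer, largest first
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
).fetchall()
print(rows)  # [('ada', 80.0), ('bob', 20.0)]
conn.close()
```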
Be able to use Linux
[📰] Understand Linux Load Averages and Monitor Performance of Linux
[📰] Command-line Tools can be 235x Faster than your Hadoop Cluster
[ ] Calmcode: entr
- [ ] MIT: The Missing Semester of CS Education
- [ ] Lecture 1: Course Overview + The Shell (2020)
0:48:16
- [ ] Lecture 2: Shell Tools and Scripting (2020)
0:48:55
- [ ] Lecture 3: Editors (vim) (2020)
0:48:26
- [ ] Lecture 4: Data Wrangling (2020)
0:50:03
- [ ] Lecture 5: Command-line Environment (2020)
0:56:06
- [ ] Lecture 6: Version Control (git) (2020)
1:24:59
- [ ] Lecture 7: Debugging and Profiling (2020)
0:54:13
- [ ] Lecture 8: Metaprogramming (2020)
0:49:52
- [ ] Lecture 9: Security and Cryptography (2020)
1:00:59
- [ ] Lecture 10: Potpourri (2020)
0:57:54
- [ ] Lecture 11: Q&A (2020)
0:53:52
[ ] Thoughtbot: tmux
[🅤]ꭏ Shell Workshop
[📺 ] GNU Parallel
Be able to perform feature selection and engineering
[📺 ] Applied Machine Learning 2020 - 04 - Preprocessing
1:07:40
[📺 ] Applied Machine Learning 2020 - 11 - Model Inspection and Feature Selection
1:15:15
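A hand-rolled sketch of two simple selection heuristics covered in such lectures — dropping zero-variance features and ranking the rest by correlation with the target (toy data):

```python
import numpy as np

# Toy feature matrix: column 0 is constant, column 2 tracks the target
X = np.array([[1.0, 0.2, 10.0],
              [1.0, 0.5, 20.0],
              [1.0, 0.1, 30.0],
              [1.0, 0.9, 40.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])

# Drop zero-variance features — they carry no information
keep = X.var(axis=0) > 0.0

# Rank the survivors by absolute correlation with the target
corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) if keep[j] else 0.0
         for j in range(X.shape[1])]
best = int(np.argmax(corrs))
print(keep.tolist(), best)  # column 2 correlates perfectly with y
```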
Be able to experiment in a notebook
[📰] Securely storing configuration credentials in a Jupyter Notebook
[ ] Pluralsight: Getting Started with Jupyter Notebook and Python
[📺 ] I Like Notebooks
[📺 ] I don't like notebooks. - Joel Grus (Allen Institute for Artificial Intelligence)
[📺 ] Ryan Herr - After model.fit, before you deploy | JupyterCon 2020
Be able to visualize data
Be able to do literature review using research papers
[ ] Paper: Efficient Estimation of Word Representations in Vector Space
[ ] Paper: Sequence to Sequence Learning with Neural Networks
[ ] Paper: Neural Machine Translation by Jointly Learning to Align and Translate
[ ] Paper: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
[ ] Paper: XLNet: Generalized Autoregressive Pretraining for Language Understanding
[ ] Paper: Synonyms Based Term Weighting Scheme: An Extension to TF.IDF
[ ] Paper: RoBERTa: A Robustly Optimized BERT Pretraining Approach
[ ] Paper: GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
[ ] Paper: Amazon.com Recommendations Item-to-Item Collaborative Filtering
[ ] Paper: Collaborative Filtering for Implicit Feedback Datasets
[ ] Paper: BPR: Bayesian Personalized Ranking from Implicit Feedback
[ ] Paper: Neural Factorization Machines for Sparse Predictive Analytics
[ ] Paper: Multiword Expressions: A Pain in the Neck for NLP
[ ] Paper: PyTorch: An Imperative Style, High-Performance Deep Learning Library
[ ] Paper: ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations
[ ] Paper: Self-supervised Visual Feature Learning with Deep Neural Networks: A Survey
[ ] Paper: A Simple Framework for Contrastive Learning of Visual Representations
[ ] Paper: Self-Supervised Learning of Pretext-Invariant Representations
[ ] Paper: FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence
[ ] Paper: Self-Labelling via Simultaneous Clustering and Representation Learning
[ ] Paper: A survey on Semi-, Self- and Unsupervised Techniques in Image Classification
[ ] Paper: Train Once, Test Anywhere: Zero-Shot Learning for Text Classification
[ ] Paper: Zero-shot Text Classification With Generative Language Models
[ ] Paper: Beyond Accuracy: Behavioral Testing of NLP models with CheckList
[ ] Paper: Boosting Self-Supervised Learning via Knowledge Transfer
[ ] Paper: Question Generation via Overgenerating Transformations and Ranking
[ ] Paper: Good Question! Statistical Ranking for Question Generation
[ ] Paper: Towards Machine Learning Engineering: A Brief History Of TensorFlow Extended (TFX)
[ ] Paper: Pest Management In Cotton Farms: An AI-System Case Study from the Global South
[ ] Paper: BERT2DNN: BERT Distillation with Massive Unlabeled Data for Online E-Commerce Search
[ ] Paper: On the surprising similarities between supervised and self-supervised models
[ ] Paper: All-but-the-Top: Simple and Effective Postprocessing for Word Representations
[ ] Paper: Simple and Effective Dimensionality Reduction for Word Embeddings
[ ] Paper: AutoCompete: A Framework for Machine Learning Competitions
[ ] Paper: Cost-effective Deployment of BERT Models in Serverless Environment
[📺 ] mixup: Beyond Empirical Risk Minimization (Paper Explained)
Be able to model problems mathematically
- [ ] 3Blue1Brown: Essence of Calculus
- [ ] The Essence of Calculus, Chapter 1
0:17:04
- [ ] The paradox of the derivative | Essence of calculus, chapter 2
0:17:57
- [ ] Derivative formulas through geometry | Essence of calculus, chapter 3
0:18:43
- [ ] Visualizing the chain rule and product rule | Essence of calculus, chapter 4
0:16:52
- [ ] What's so special about Euler's number e? | Essence of calculus, chapter 5
0:13:50
- [ ] Implicit differentiation, what's going on here? | Essence of calculus, chapter 6
0:15:33
- [ ] Limits, L'Hôpital's rule, and epsilon delta definitions | Essence of calculus, chapter 7
0:18:26
- [ ] Integration and the fundamental theorem of calculus | Essence of calculus, chapter 8
0:20:46
- [ ] What does area have to do with slope? | Essence of calculus, chapter 9
0:12:39
- [ ] Higher order derivatives | Essence of calculus, chapter 10
0:05:38
- [ ] Taylor series | Essence of calculus, chapter 11
0:22:19
- [ ] What they won't teach you in calculus
0:16:22
- [ ] 3Blue1Brown: Essence of linear algebra
- [ ] Vectors, what even are they? | Essence of linear algebra, chapter 1
0:09:52
- [ ] Linear combinations, span, and basis vectors | Essence of linear algebra, chapter 2
0:09:59
- [ ] Linear transformations and matrices | Essence of linear algebra, chapter 3
0:10:58
- [ ] Matrix multiplication as composition | Essence of linear algebra, chapter 4
0:10:03
- [ ] Three-dimensional linear transformations | Essence of linear algebra, chapter 5
0:04:46
- [ ] The determinant | Essence of linear algebra, chapter 6
0:10:03
- [ ] Inverse matrices, column space and null space | Essence of linear algebra, chapter 7
0:12:08
- [ ] Nonsquare matrices as transformations between dimensions | Essence of linear algebra, chapter 8
0:04:27
- [ ] Dot products and duality | Essence of linear algebra, chapter 9
0:14:11
- [ ] Cross products | Essence of linear algebra, Chapter 10
0:08:53
- [ ] Cross products in the light of linear transformations | Essence of linear algebra chapter 11
0:13:10
- [ ] Cramer's rule, explained geometrically | Essence of linear algebra, chapter 12
0:12:12
- [ ] Change of basis | Essence of linear algebra, chapter 13
0:12:50
- [ ] Eigenvectors and eigenvalues | Essence of linear algebra, chapter 14
0:17:15
- [ ] Abstract vector spaces | Essence of linear algebra, chapter 15
0:16:46
[📰] Introduction to Linear Algebra for Applied Machine Learning with Python
[📰] Entropy of a probability distribution — in layman’s terms
[📰] PageRank - How Eigenvectors Power the Algorithm Behind Google Search
- [ ] MIT: 18.06 Linear Algebra (Professor Strang)
- [ ] 1. The Geometry of Linear Equations
0:39:49
- [ ] 2. Elimination with Matrices.
0:47:41
- [ ] 3. Multiplication and Inverse Matrices
0:46:48
- [ ] 4. Factorization into A = LU
0:48:05
- [ ] 5. Transposes, Permutations, Spaces R^n
0:47:41
- [ ] 6. Column Space and Nullspace
0:46:01
- [ ] 9. Independence, Basis, and Dimension
0:50:14
- [ ] 10. The Four Fundamental Subspaces
0:49:20
- [ ] 11. Matrix Spaces; Rank 1; Small World Graphs
0:45:55
- [ ] 14. Orthogonal Vectors and Subspaces
0:49:47
- [ ] 15. Projections onto Subspaces
0:48:51
- [ ] 16. Projection Matrices and Least Squares
0:48:05
- [ ] 17. Orthogonal Matrices and Gram-Schmidt
0:49:09
- [ ] 21. Eigenvalues and Eigenvectors
0:51:22
- [ ] 22. Diagonalization and Powers of A
0:51:50
- [ ] 24. Markov Matrices; Fourier Series
0:51:11
- [ ] 25. Symmetric Matrices and Positive Definiteness
0:43:52
- [ ] 27. Positive Definite Matrices and Minima
0:50:40
- [ ] 29. Singular Value Decomposition
0:40:28
- [ ] 30. Linear Transformations and Their Matrices
0:49:27
- [ ] 31. Change of Basis; Image Compression
0:50:13
- [ ] 33. Left and Right Inverses; Pseudoinverse
0:41:52
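The eigenvalue lectures connect directly to the PageRank article above: the dominant eigenvector can be found numerically by power iteration, sketched here on a small symmetric matrix:

```python
import numpy as np

# Power iteration: repeatedly apply A and renormalize to converge
# on the eigenvector of the dominant eigenvalue
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 0.0])
for _ in range(50):
    v = A @ v
    v = v / np.linalg.norm(v)

# The Rayleigh quotient recovers the corresponding eigenvalue
eigenvalue = v @ A @ v
print(round(eigenvalue, 4), np.round(v, 4))  # dominant eigenvalue is 3
```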
- [ ] StatQuest: Statistics Fundamentals
- [ ] StatQuest: Histograms, Clearly Explained
0:03:42
- [ ] StatQuest: What is a statistical distribution?
0:05:14
- [ ] StatQuest: The Normal Distribution, Clearly Explained!!!
0:05:12
- [ ] Statistics Fundamentals: Population Parameters
0:14:31
- [ ] Statistics Fundamentals: The Mean, Variance and Standard Deviation
0:14:22
- [ ] StatQuest: What is a statistical model?
0:03:45
- [ ] StatQuest: Sampling A Distribution
0:03:48
- [ ] Hypothesis Testing and The Null Hypothesis
0:14:40
- [ ] Alternative Hypotheses: Main Ideas!!!
0:09:49
- [ ] p-values: What they are and how to interpret them
0:11:22
- [ ] How to calculate p-values
0:25:15
- [ ] p-hacking: What it is and how to avoid it!
0:13:44
- [ ] Statistical Power, Clearly Explained!!!
0:08:19
- [ ] Power Analysis, Clearly Explained!!!
0:16:44
- [ ] Covariance and Correlation Part 1: Covariance
0:22:23
- [ ] Covariance and Correlation Part 2: Pearson's Correlation
0:19:13
- [ ] StatQuest: R-squared explained
0:11:01
- [ ] The Central Limit Theorem
0:07:35
- [ ] StatQuickie: Standard Deviation vs Standard Error
0:02:52
- [ ] StatQuest: The standard error
0:11:43
- [ ] StatQuest: Technical and Biological Replicates
0:05:27
- [ ] StatQuest - Sample Size and Effective Sample Size, Clearly Explained
0:06:32
- [ ] Bar Charts Are Better than Pie Charts
0:01:45
- [ ] StatQuest: Boxplots, Clearly Explained
0:02:33
- [ ] StatQuest: Logs (logarithms), clearly explained
0:15:37
- [ ] StatQuest: Confidence Intervals
0:06:41
- [ ] StatQuickie: Thresholds for Significance
0:06:40
- [ ] StatQuickie: Which t test to use
0:05:10
- [ ] StatQuest: One or Two Tailed P-Values
0:07:05
- [ ] The Binomial Distribution and Test, Clearly Explained!!!
0:15:46
- [ ] StatQuest: Quantiles and Percentiles, Clearly Explained!!!
0:06:30
- [ ] StatQuest: Quantile-Quantile Plots (QQ plots), Clearly Explained
0:06:55
- [ ] StatQuest: Quantile Normalization
0:04:51
- [ ] StatQuest: Probability vs Likelihood
0:05:01
- [ ] StatQuest: Maximum Likelihood, clearly explained!!!
0:06:12
- [ ] Maximum Likelihood for the Exponential Distribution, Clearly Explained! V2.0
0:09:39
- [ ] Why Dividing By N Underestimates the Variance
0:17:14
- [ ] Maximum Likelihood for the Binomial Distribution, Clearly Explained!!!
0:11:24
- [ ] Maximum Likelihood For the Normal Distribution, step-by-step!
0:19:50
- [ ] StatQuest: Odds and Log(Odds), Clearly Explained!!!
0:11:30
- [ ] StatQuest: Odds Ratios and Log(Odds Ratios), Clearly Explained!!!
0:16:20
- [ ] Live 2020-04-20!!! Expected Values
0:33:00
[🅤]ꭏ Statistics
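A worked example of the maximum-likelihood ideas from the StatQuest videos: for a normal distribution the MLE variance divides by N, which is exactly why it underestimates the true variance:

```python
# For a normal distribution, the maximum-likelihood estimates are the
# sample mean and the variance computed with a divisor of N (not N - 1);
# dividing by N is why the MLE variance is biased low
data = [4.0, 6.0, 5.0, 7.0, 3.0]
n = len(data)
mle_mean = sum(data) / n
mle_var = sum((x - mle_mean) ** 2 for x in data) / n            # divides by N
unbiased_var = sum((x - mle_mean) ** 2 for x in data) / (n - 1)  # divides by N-1
print(mle_mean, mle_var, unbiased_var)  # 5.0 2.0 2.5
```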
Be able to set up project structure
[📰] pydantic
[📰] Organizing machine learning projects: project management guidelines
[📰] Best practices to write Deep Learning code: Project structure, OOP, Type checking and documentation
[📰] Reproducible and upgradable Conda environments: dependency management with conda-lock
[📰] Options for packaging your Python code: Wheels, Conda, Docker, and more
[📰] Making model training scripts robust to spot interruptions
[ ] Calmcode: tqdm
[💻] Conda Essentials
[🅤]ꭏ Writing READMEs
[📺 ] Hydra configuration
[📺 ] OO Design and Testing Patterns for Machine Learning with Chris Gerpheide
[📺 ] Tutorial: Sebastian Witowski - Modern Python Developer's Toolkit
[📺 ] Lecture 13: Machine Learning Teams (Full Stack Deep Learning - Spring 2021)
0:58:13
[📺 ] Lecture 5: Machine Learning Projects (Full Stack Deep Learning - Spring 2021)
1:13:14
[📺 ] Lecture 6: Infrastructure & Tooling (Full Stack Deep Learning - Spring 2021)
1:07:21
Be able to version control code
[📰] How to track large files in Github / Bitbucket? Git LFS to the rescue
[📰] Keep your git directory clean with git clean and git trash
[📺 ] Git & Scripting
Be able to version control data
- [📺 ] DVC Basics
[📺 ] Data versioning in machine learning projects - Dmitry Petrov
0:34:44
Be able to use experiment management tools
[📰] Supercharge your Training with Pytorch Lightning + Weights & Biases
[📺 ] Lab 5: Experiment Management (Full Stack Deep Learning - Spring 2021)
0:30:41
[📺 ] Weights & Biases
[📺 ] SE4AI: Versioning, Provenance, and Reproducibility
1:18:29
Be able to set up model validation
[📰] Precision, Recall, Accuracy, and F1 Score for Multi-Label Classification
[📰] The Complete Guide to AUC and Average Precision: Simulations and Visualizations
[📰] Best Use of Train/Val/Test Splits, with Tips for Medical Data
[📰] The correct way to evaluate online machine learning models
[📰] Proxy Metrics
[📺 ] Applied Machine Learning 2020 - 09 - Model Evaluation and Metrics
1:18:23
[📺 ] Machine Learning Fundamentals: Cross Validation
0:06:04
[📺 ] Machine Learning Fundamentals: The Confusion Matrix
0:07:12
[📺 ] Machine Learning Fundamentals: Sensitivity and Specificity
0:11:46
[📺 ] Machine Learning Fundamentals: Bias and Variance
0:06:36
[📺 ] ROC and AUC, Clearly Explained!
0:16:26
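The metrics covered above reduce to a few counts from the confusion matrix; a from-scratch sketch on a toy binary problem:

```python
# Confusion-matrix metrics from scratch on toy labels and predictions
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 0, 1]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

precision = tp / (tp + fp)   # of everything flagged positive, how much was right
recall = tp / (tp + fn)      # of all true positives, how many were found
f1 = 2 * precision * recall / (precision + recall)
print(precision, recall, round(f1, 3))
```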
Be familiar with inner working of models
Bayes' theorem is super interesting and applicable ==> [📰] Naive Bayes classification
[📰] Decision trees
[📰] Random forests
[📰] Boosted trees
[📰] Hacker's Guide to Fundamental Machine Learning Algorithms with Python
[📰] One-vs-Rest and One-vs-One for Multi-Class Classification
- [ ] StatQuest: Machine Learning
- [ ] StatQuest: Fitting a line to data, aka least squares, aka linear regression.
0:09:21
- [ ] StatQuest: Linear Models Pt.1 - Linear Regression
0:27:26
- [ ] StatQuest: Linear Models Pt.2 - t-tests and ANOVA
0:11:37
- [ ] StatQuest: Odds and Log(Odds), Clearly Explained!!!
0:11:30
- [ ] StatQuest: Odds Ratios and Log(Odds Ratios), Clearly Explained!!!
0:16:20
- [ ] StatQuest: Logistic Regression
0:08:47
- [ ] Logistic Regression Details Pt1: Coefficients
0:19:02
- [ ] Logistic Regression Details Pt 2: Maximum Likelihood
0:10:23
- [ ] Logistic Regression Details Pt 3: R-squared and p-value
0:15:25
- [ ] Saturated Models and Deviance
0:18:39
- [ ] Deviance Residuals
0:06:18
- [ ] Regularization Part 1: Ridge (L2) Regression
0:20:26
- [ ] Regularization Part 2: Lasso (L1) Regression
0:08:19
- [ ] Ridge vs Lasso Regression, Visualized!!!
0:09:05
- [ ] Regularization Part 3: Elastic Net Regression
0:05:19
- [ ] StatQuest: Principal Component Analysis (PCA), Step-by-Step
0:21:57
- [ ] StatQuest: PCA main ideas in only 5 minutes!!!
0:06:04
- [ ] StatQuest: PCA - Practical Tips
0:08:19
- [ ] StatQuest: PCA in Python
0:11:37
- [ ] StatQuest: Linear Discriminant Analysis (LDA) clearly explained.
0:15:12
- [ ] StatQuest: MDS and PCoA
0:08:18
- [ ] StatQuest: t-SNE, Clearly Explained
0:11:47
- [ ] StatQuest: Hierarchical Clustering
0:11:19
- [ ] StatQuest: K-means clustering
0:08:57
- [ ] StatQuest: K-nearest neighbors, Clearly Explained
0:05:30
- [ ] Naive Bayes, Clearly Explained!!!
0:15:12
- [ ] Gaussian Naive Bayes, Clearly Explained!!!
0:09:41
- [ ] StatQuest: Decision Trees
0:17:22
- [ ] StatQuest: Decision Trees, Part 2 - Feature Selection and Missing Data
0:05:16
- [ ] Regression Trees, Clearly Explained!!!
0:22:33
- [ ] How to Prune Regression Trees, Clearly Explained!!!
0:16:15
- [ ] StatQuest: Random Forests Part 1 - Building, Using and Evaluating
0:09:54
- [ ] StatQuest: Random Forests Part 2: Missing data and clustering
0:11:53
- [ ] The Chain Rule
0:18:23
- [ ] Gradient Descent, Step-by-Step
0:23:54
- [ ] Stochastic Gradient Descent, Clearly Explained!!!
0:10:53
- [ ] AdaBoost, Clearly Explained
0:20:54
- [⨊ ] Part 1: Regression Main Ideas
0:15:52
- [⨊ ] Part 2: Regression Details
0:26:45
- [⨊ ] Part 3: Classification
0:17:02
- [⨊ ] Part 4: Classification Details
0:36:59
- [⨊ ] Support Vector Machines, Clearly Explained!!!
0:20:32
- [ ] Support Vector Machines Part 2: The Polynomial Kernel
0:07:15
- [ ] Support Vector Machines Part 3: The Radial (RBF) Kernel
0:15:52
- [ ] XGBoost Part 1: Regression
0:25:46
- [ ] XGBoost Part 2: Classification
0:25:17
- [ ] XGBoost Part 3: Mathematical Details
0:27:24
- [ ] XGBoost Part 4: Crazy Cool Optimizations
0:24:27
- [ ] StatQuest: Fitting a curve to data, aka lowess, aka loess
0:10:10
- [ ] Statistics Fundamentals: Population Parameters
0:14:31
- [ ] Principal Component Analysis (PCA) clearly explained (2015)
0:20:16
- [ ] Decision Trees in Python from Start to Finish
1:06:23
- [📺 ] Neural Networks from Scratch in Python
- [ ] Neural Networks from Scratch - P.1 Intro and Neuron Code
0:16:59
- [ ] Neural Networks from Scratch - P.2 Coding a Layer
0:15:06
- [ ] Neural Networks from Scratch - P.3 The Dot Product
0:25:17
- [ ] Neural Networks from Scratch - P.4 Batches, Layers, and Objects
0:33:46
- [ ] Neural Networks from Scratch - P.5 Hidden Layer Activation Functions
0:40:05
[📺 ] Applied Machine Learning 2020 - 03 Supervised learning and model validation
1:12:00
[📺 ] Applied Machine Learning 2020 - 05 - Linear Models for Regression
1:06:54
[📺 ] Applied Machine Learning 2020 - 06 - Linear Models for Classification
1:07:50
[📺 ] Applied Machine Learning 2020 - 07 - Decision Trees and Random Forests
1:07:58
[📺 ] Applied Machine Learning 2020 - 08 - Gradient Boosting
1:02:12
[📺 ] Applied Machine Learning 2020 - 18 - Neural Networks
1:19:36
[📺 ] Applied Machine Learning 2020 - 12 - AutoML (plus some feature selection)
1:25:38
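Several of the videos above (gradient descent, logistic regression) combine into one short from-scratch sketch — logistic regression trained by plain gradient descent on a toy separable dataset:

```python
import numpy as np

# Logistic regression trained with plain gradient descent
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # a linearly separable toy target

w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(300):
    p = 1 / (1 + np.exp(-(X @ w + b)))      # sigmoid of the linear score
    w -= lr * (X.T @ (p - y)) / len(y)      # gradient of the log loss
    b -= lr * (p - y).mean()

accuracy = (((1 / (1 + np.exp(-(X @ w + b)))) > 0.5) == y).mean()
print(accuracy)  # high accuracy on this separable toy problem
```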
Be able to improve models
[📰] Normalizing your data (specifically, input and batch normalization)
[📰] In-layer normalization techniques for training very deep neural networks
[📰] Uncertainty Quantification Part 4: Leveraging Dropout in Neural Networks (CNNs)
[📺 ] Applied Machine Learning 2020 - 10 - Calibration, Imbalanced data
1:16:14
Be familiar with fundamental Machine Learning concepts
[📰] Connections: Log Likelihood, Cross Entropy, KL Divergence, Logistic Regression, and Neural Networks
[📰] Gradient descent
[📰] Dismantling Neural Networks to Understand the Inner Workings with Math and Pytorch
[💻] AI Fundamentals
[ ] Elements of AI
[📺 ] Deep Double Descent
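The log-likelihood / cross-entropy / KL connections article above can be checked numerically: cross entropy decomposes as H(p, q) = H(p) + KL(p || q). A quick sketch:

```python
import math

# Cross entropy = entropy of p plus the KL divergence from p to q:
# the extra bits paid for modeling p with the wrong distribution q
p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]

entropy = -sum(pi * math.log2(pi) for pi in p)
cross_entropy = -sum(pi * math.log2(qi) for pi, qi in zip(p, q))
kl = sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))
print(entropy, cross_entropy, kl)  # cross_entropy == entropy + kl
```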
Implement models in scikit-learn
[ ] Pluralsight: Building Machine Learning Models in Python with scikit-learn
[📺 ] dabl: Automatic Machine Learning with a Human in the Loop
0:25:43
[📺 ] Multilabel and Multioutput Classification - Machine Learning with TensorFlow & scikit-learn on Python
[📺 ] DABL: Automatic machine learning with a human in the loop - AI Latin American SumMIT Day 1
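A minimal example of the canonical scikit-learn workflow (pipeline, fit, predict) on toy data:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Two well-separated clusters of one-feature samples (toy data)
X = [[1.0], [2.0], [3.0], [10.0], [11.0], [12.0]]
y = [0, 0, 0, 1, 1, 1]

# Chain preprocessing and a model into one estimator, then fit and predict
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)
print(model.predict([[2.5], [10.5]]))
```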
Be able to implement models in Tensorflow and Keras
Be able to implement models in PyTorch
[📰] An introduction to PyTorch Lightning with comparisons to PyTorch
[📰] From PyTorch to PyTorch Lightning — A gentle introduction
[📰] Introducing PyTorch Lightning Sharded: Train SOTA Models, With Half The Memory
[📰] Sharded: A New Technique To Double The Size Of PyTorch Models
[📰] A developer-friendly guide to mixed precision training with PyTorch
[📰] A developer-friendly guide to model quantization with PyTorch
[📰] Tricks for training PyTorch models to convergence more quickly
[📰] PyTorch Lightning Bolts — From Linear, Logistic Regression on TPUs to pre-trained GANs
[📰] Training Neural Nets on Larger Batches: Practical Tips for 1-GPU, Multi-GPU & Distributed setups
[📰] PyTorch Lightning 0.9 — synced BatchNorm, DataModules and final API!
[📰] PyTorch Multi-GPU Metrics Library and More in PyTorch Lightning 0.8.1
[📰] Distributed model training in PyTorch using DistributedDataParallel
[📰] Distributed model training in PyTorch using DistributedDataParallel
[📰] EINSUM IS ALL YOU NEED - EINSTEIN SUMMATION IN DEEP LEARNING
[📰] Faster Deep Learning Training with PyTorch – a 2021 Guide
[📰] Fit More and Train Faster With ZeRO via DeepSpeed and FairScale
[📰] PyTorch Lightning V1.2.0- DeepSpeed, Pruning, Quantization, SWA
[📰] Taming LSTMs: Variable-sized mini-batches and why PyTorch is good for your health
[📰] Pad pack sequences for Pytorch batch processing with DataLoader
[ ] Deeplizard: Neural Network Programming - Deep Learning with PyTorch
[📺 ] PyTorch Performance Tuning Guide
0:26:41
- [📺 ] Skin Cancer Detection with PyTorch
- [ ] [PART 1] Skin Cancer Detection with PyTorch
0:10:21
- [ ] [PART 2] Skin Cancer Detection with PyTorch
0:21:57
- [ ] [PART 3] Skin Cancer Detection with PyTorch
0:22:24
[📺 ] Learn with Lightning
[📺 ] PyTorch Tutorial - RNN & LSTM & GRU - Recurrent Neural Nets
0:15:51
- [📺 ] Pytorch Zero to All
- [ ] PyTorch Lecture 01: Overview
0:10:18
- [ ] PyTorch Lecture 02: Linear Model
0:12:52
- [ ] PyTorch Lecture 03: Gradient Descent
0:08:24
- [ ] PyTorch Lecture 04: Back-propagation and Autograd
0:15:25
- [ ] PyTorch Lecture 05: Linear Regression in the PyTorch way
0:11:50
- [ ] PyTorch Lecture 06: Logistic Regression
0:10:41
- [ ] PyTorch Lecture 07: Wide and Deep
0:10:37
- [ ] PyTorch Lecture 08: PyTorch DataLoader
0:06:41
- [ ] PyTorch Lecture 09: Softmax Classifier
0:18:47
- [ ] PyTorch Lecture 10: Basic CNN
0:15:52
- [ ] PyTorch Lecture 11: Advanced CNN
0:12:58
- [ ] PyTorch Lecture 12: RNN1 - Basics
0:28:47
- [ ] PyTorch Lecture 13: RNN 2 - Classification
0:17:22
[📺 ] Lightning Chat: How a Grandmaster Won a Kaggle Competition Using Pytorch Lightning
[📺 ] JAX: accelerated machine learning research via composable function transformations in Python
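Before reaching for autograd, it helps to see the gradient computation PyTorch automates done by hand; a NumPy sketch for a one-parameter linear model:

```python
import numpy as np

# What autograd does under the hood: forward pass, loss, and a manual
# backward pass for the one-parameter model y = w * x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])   # true relationship: y = 2x
w = 0.0

for _ in range(100):
    y_pred = w * x                        # forward pass
    loss = ((y_pred - y) ** 2).mean()     # MSE loss
    grad = (2 * (y_pred - y) * x).mean()  # dLoss/dw via the chain rule
    w -= 0.05 * grad                      # gradient-descent step

print(round(w, 4))  # converges to 2.0
```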
Be able to implement models using cloud services
[ ] AWS: Amazon Transcribe Deep Dive: Using Feedback Loops to Improve Confidence Level of Transcription
[ ] AWS: Build a Text Classification Model with AWS Glue and Amazon SageMaker
[ ] AWS: Deep Dive on Amazon Rekognition: Building Computer Vision Based Smart Applications
[ ] AWS: Introduction to AWS Marketplace - Machine Learning Category
[ ] edX: Amazon SageMaker: Simplifying Machine Learning Application Development
Be able to apply unsupervised learning algorithms
[📰] GANs in computer vision - Conditional image synthesis and 3D object generation
[📰] GANs in computer vision - Introduction to generative learning
[📰] Paper Summary: DeViSE: A Deep Visual-Semantic Embedding Model
[📰] Self-supervised learning: The dark matter of intelligence
[📰] Understanding self-supervised and contrastive learning with "Bootstrap Your Own Latent" (BYOL)
[📰] Build a simple Image Retrieval System with an Autoencoder
[📰] An overview of proxy-label approaches for semi-supervised learning
[📰] From Research to Production with Deep Semi-Supervised Learning
[📰] A Framework For Contrastive Self-Supervised Learning And Designing A New Approach
[📰] A gentle introduction to HDBSCAN and density-based clustering
[📰] Unsupervised Learning of Visual Features by Contrasting Cluster Assignments
- [ ] Berkeley: Deep Unsupervised Learning Spring 2020
- [ ] L1 Introduction -- CS294-158-SP20 Deep Unsupervised Learning -- UC Berkeley, Spring 2020
1:10:02
- [ ] L2 Autoregressive Models -- CS294-158-SP20 Deep Unsupervised Learning -- UC Berkeley, Spring 2020
2:27:23
- [ ] L3 Flow Models -- CS294-158-SP20 Deep Unsupervised Learning -- UC Berkeley -- Spring 2020
1:56:53
- [ ] L4 Latent Variable Models (VAE) -- CS294-158-SP20 Deep Unsupervised Learning -- UC Berkeley
2:19:33
- [ ] Lecture 5 Implicit Models -- GANs Part I --- UC Berkeley, Spring 2020
2:32:32
- [ ] Lecture 6 Implicit Models / GANs part II --- CS294-158-SP20 Deep Unsupervised Learning -- Berkeley
2:09:14
- [ ] Lecture 7 Self-Supervised Learning -- UC Berkeley Spring 2020 - CS294-158 Deep Unsupervised Learning
2:20:41
- [ ] L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
0:41:51
- [ ] L9 Semi-Supervised Learning and Unsupervised Distribution Alignment -- CS294-158-SP20 UC Berkeley
2:16:00
- [ ] L10 Compression -- UC Berkeley, Spring 2020, CS294-158 Deep Unsupervised Learning
3:09:49
- [ ] L11 Language Models -- guest instructor: Alec Radford (OpenAI) --- Deep Unsupervised Learning SP20
2:38:19
- [ ] L12 Representation Learning for Reinforcement Learning --- CS294-158 UC Berkeley Spring 2020
2:01:56
[ ] Deck: Demystifying Self-Supervised Learning for Visual Recognition
[G] Clustering
[ ] Wandb: Unsupervised Visual Representation Learning with SwAV
[📺 ] Applied Machine Learning 2020 - 14 - Clustering and Mixture Models
1:26:33
[📺 ] Applied Machine Learning 2020 - 13 - Dimensionality reduction
1:30:34
[📺 ] BYOL: Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning (Paper Explained)
[📺 ] A critical analysis of self-supervision, or what we can learn from a single image (Paper Explained)
[📺 ] Week 10 – Lecture: Self-supervised learning (SSL) in computer vision (CV)
[📺 ] CVPR 2020 Tutorial: Towards Annotation-Efficient Learning
[📺 ] Yuki Asano | Self-Supervision | Self-Labelling | Labelling Unlabelled videos | CV | CTDS.Show #81
[📺 ] Variational Autoencoders - EXPLAINED!
0:17:36
[📺 ] OptaPro Analytics Forum – Learning to watch football: Self-supervised representations for tracking data
[📺 ] Can a Neural Net tell if an image is mirrored? – Visual Chirality
[📺 ] Deep InfoMax: Learning deep representations by mutual information estimation and maximization
[ ] Deep Learning Lecture Summer 2020
- [ ] Deep Learning: Unsupervised Learning - Part 1
- [ ] Deep Learning: Unsupervised Learning - Part 2
- [ ] Deep Learning: Unsupervised Learning - Part 3
- [ ] Deep Learning: Unsupervised Learning - Part 4
- [ ] Deep Learning: Unsupervised Learning - Part 5
- [ ] Deep Learning: Weakly and Self-Supervised Learning - Part 1
- [ ] Deep Learning: Weakly and Self-Supervised Learning - Part 2
- [ ] Deep Learning: Weakly and Self-Supervised Learning - Part 3
- [ ] Deep Learning: Weakly and Self-Supervised Learning - Part 4
[ ] ECCV 2020: New Frontiers for Learning with Limited Labels or Data
[📺 ] Self-Supervised Learning - What is Next? - Workshop at ECCV 2020, August 28th
- [ ] Next Challenges for Self-Supervised Learning - Aäron van den Oord
0:20:13
- [ ] Perspectives on Unsupervised Representation Learning - Paolo Favaro
0:42:41
- [ ] Learning and Transferring Visual Representations with Few Labels - Carl Doersch
0:32:53
- [ ] Self-Supervision as a Path to a Post-Dataset Era - Alexei Alyosha Efros
0:38:06
[📺 ] Sebastian Ruder: Neural Semi-supervised Learning under Domain Shift
Be able to implement NLP models
[📰] Fixing common Unicode mistakes with Python – after they’ve been made
[📰] 10 Popular Keyword Extraction Algorithms in Natural Language Processing
[📰] How To Create Data Products That Are Magical Using Sequence-to-Sequence Models
[📰] Locality-sensitive Hashing and Singular to Plural Noun Conversion
[📰] Build A Keyword Extraction API with Spacy, Flask, and FuzzyWuzzy
[📰] Unsupervised creation of interpretable sentence representations
[📰] Advance BERT model via transferring knowledge from Cross-Encoders to Bi-Encoders
[📰] T5 — a model that explores the limits of transfer learning
[📰] How to build a State-of-the-Art Conversational AI with Transfer Learning
[📰] Language Models
[📰] Paraphrasing
[📰] Poor man’s GPT-3: Few shot text generation with T5 Transformer
[📰] Text Generation
[📰] Controlling Text Generation with Plug and Play Language Models
[📰] The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)
[📰] The Illustrated GPT-2 (Visualizing Transformer Language Models)
[📰] Neural Language Models as Domain-Specific Knowledge Bases
[📰] Rebuilding the spellchecker, pt.2: Just look in the dictionary, they said!
[📰] Rebuilding the spellchecker, pt.3: Lookup—compounds and solutions
[📰] Rebuilding the spellchecker, pt.4: Introduction to suggest algorithm
[📰] Rebuilding the spellchecker: Hunspell and the order of edits
[📰] A Review of the Neural History of Natural Language Processing
[📰] Neural Transfer Learning for Natural Language Processing
[📰] Implementing Bengio’s Neural Probabilistic Language Model (NPLM) using Pytorch
[📰] Leveraging Pre-trained Language Model Checkpoints for Encoder-Decoder Models
[📰] Understanding Climate Change Domains through Topic Modeling
[📰] Question Classification using Self-Attention Transformer — Part 1
[📰] Question Classification using Self-Attention Transformer — Part 1.1
[📰] Question Classification using Self-Attention Transformer — Part 2
[📰] Question Classification using Self-Attention Transformer — Part 3
[📰] Porting fairseq wmt19 translation system to transformers
[📰] On word embeddings - Part 3: The secret ingredients of word2vec
[📰] DaCy: New Fast and Efficient State-of-the-Art in Danish NLP!
[📰] How To Create Natural Language Semantic Search For Arbitrary Objects With Deep Learning
[📰] How to Implement a Beam Search Decoder for Natural Language Processing
[📰] How to Use n-gram Models to Detect Format Errors in Datasets
[📰] The Unreasonable Effectiveness of Recurrent Neural Networks
[📰] GPT-2: A nascent transfer learning method that could eliminate supervised learning in some NLP tasks
[📰] Spelling Correction: How to make an accurate and fast corrector
[📰] Speller100: Zero-shot spelling correction at scale for 100-plus languages
[📰] Trends in input representation for state-of-art NLP models (2019)
[📰] An Overview of Multi-Task Learning in Deep Neural Networks
[📰] Multi-Task Learning Objectives for Natural Language Processing
[📰] The Current Best of Universal Word Embeddings and Sentence Embeddings
[📰] How to Outperform GPT-3 by Combining Task Descriptions With Supervised Learning
[📰] LSTM Primer With Real Life Application (DeepMind Kidney Injury Prediction)
[📰] Using an NLP Q&A System To Study Climate Hazards and Nature-Based Solutions
[📰] Automatically Summarize Trump’s State of the Union Address
[📰] Shrinking fastText embeddings so that it fits Google Colab
[📰] Exploring LSTMs
[📰] pyLDAvis: Topic Modelling Exploration Tool That Every NLP Data Scientist Should Know
[📰] 3 subword algorithms help to improve your NLP model performance
[📰] Building a sentence embedding index with fastText and BM25
[📰] Key topics extraction and contextual sentiment of users reviews
[📰] Google mT5 multilingual text-to-text transformer: A Brief Paper Analysis
[📰] Faster and smaller quantized NLP with Hugging Face and ONNX Runtime
[📰] Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention)
[📰] How I Used Deep Learning To Train A Chatbot To Talk Like Me (Sorta)
[📰] Performers: The Kernel Trick, Random Fourier Features, and Attention
[📰] Text Similarities: Estimate the degree of similarity between two texts
[📰] How we used Universal Sentence Encoder and FAISS to make our search 10x smarter
[📰] How GPT3 Works
[📰] A Spellchecker Used to Be a Major Feat of Software Engineering
[📰] Using embeddings to help find similar restaurants in Search
[📰] Evolution of and experiments with feed ranking at Swiggy
[📰] Find My Food: Semantic Embeddings for Food Search Using Siamese Networks
[📰] String similarity — the basic know your algorithms guide!
[📖] Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax
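A thread running through the spelling-correction articles above (the spellchecker series, the noisy-channel pieces) is minimum edit distance. A standard dynamic-programming sketch, stdlib only:

```python
def edit_distance(a, b):
    """Levenshtein distance: minimum number of single-character
    insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))      # row for the empty prefix of a
    for i, ca in enumerate(a, start=1):
        curr = [i]                      # deleting i characters of a
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # delete from a
                            curr[j - 1] + 1,      # insert into a
                            prev[j - 1] + cost))  # substitute / match
        prev = curr
    return prev[-1]

print(edit_distance("kitten", "sitting"))  # 3
```

A noisy-channel spelling corrector ranks candidate words by combining a distance-based error model like this with a language model of word frequency.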
[ ] CMU: Low-resource NLP Bootcamp 2020
- [ ] CMU Low resource NLP Bootcamp 2020 (1): NLP Tasks
1:46:06
- [ ] CMU Low resource NLP Bootcamp 2020 (2): Linguistics - Phonology and Morphology
1:24:08
- [ ] CMU Low resource NLP Bootcamp 2020 (3): Machine Translation
1:55:59
- [ ] CMU Low resource NLP Bootcamp 2020 (4): Linguistics - Syntax and Morphosyntax
2:00:21
- [ ] CMU Low resource NLP Bootcamp 2020 (5): Neural Representation Learning
1:19:57
- [ ] CMU Low resource NLP Bootcamp 2020 (6): Multilingual NLP
2:04:34
- [ ] CMU Low resource NLP Bootcamp 2020 (7): Speech Synthesis
2:22:14
[ ] CMU: Neural Nets for NLP 2021
- [ ] CMU Neural Nets for NLP 2021 (1): Introduction
1:22:40
- [ ] CMU Neural Nets for NLP 2021 (2): Language Modeling, Efficiency/Training Tricks
0:58:24
- [ ] CMU Neural Nets for NLP 2021 (3): Building A Neural Network Toolkit for NLP, minnn
0:34:42
- [ ] CMU Neural Nets for NLP 2021 (4): Efficiency Tricks for Neural Nets
0:43:28
- [ ] CMU Neural Nets for NLP 2021 (5): Recurrent Neural Networks
0:38:50
- [ ] CMU Neural Nets for NLP 2021 (6): Conditioned Generation
0:45:06
- [ ] CMU Neural Nets for NLP 2021 (7): Attention
0:38:23
- [ ] CMU Neural Nets for NLP 2021 (8): Distributional Semantics and Word Vectors
0:42:44
- [ ] CMU Neural Nets for NLP 2021 (9): Sentence and Contextual Word Representations
0:50:53
- [ ] CMU Neural Nets for NLP 2021 (10): Debugging Neural Nets (for NLP)
0:43:58
- [ ] CMU Neural Nets for NLP 2021 (11): Structured Prediction with Local Independence Assumptions
0:36:43
- [ ] CMU Neural Nets for NLP 2021 (12): Model Interpretation
0:28:52
- [ ] CMU Neural Nets for NLP 2021 (13): Generating Trees and Graphs
0:41:05
- [ ] CMU Neural Nets for NLP 2021 (14): Margin-based and Reinforcement Learning for Structured Prediction
0:47:20
- [ ] CMU Neural Nets for NLP 2021 (15): Sequence-to-sequence Pre-training
0:27:22
- [ ] CMU Neural Nets for NLP 2021 (16): Machine Reading w/ Neural Nets
0:43:08
- [ ] CMU Neural Nets for NLP 2021 (17): Neural Nets + Knowledge Bases
0:44:19
- [ ] CMU Neural Nets for NLP 2021 (18): Advanced Search Algorithms
0:47:58
- [ ] CMU Neural Nets for NLP 2021 (19): Adversarial Methods
0:41:56
- [ ] CMU Neural Nets for NLP 2021 (20): Models w/ Latent Random Variables
0:41:06
- [ ] CMU Neural Nets for NLP 2021 (21): Multilingual Learning
0:33:10
- [ ] CMU Neural Nets for NLP 2021 (22): Bias in NLP
0:32:44
- [ ] CMU Neural Nets for NLP 2021 (23): Document-level Models
0:40:04
[ ] CMU: Multilingual NLP 2020
- [ ] CMU Multilingual NLP 2020 (1): Introduction
1:17:29
- [ ] CMU Multilingual NLP 2020 (2): Typology - The Space of Language
0:37:13
- [ ] CMU Multilingual NLP 2020 (3): Words, Parts of Speech, Morphology
0:38:58
- [ ] CMU Multilingual NLP 2020 (4): Text Classification and Sequence Labeling
0:45:56
- [ ] CMU Multilingual NLP 2020 (5): Advanced Text Classification/Labeling
0:49:40
- [ ] CMU Multilingual NLP 2020 (6): Translation, Evaluation, and Datasets
0:46:17
- [ ] CMU Multilingual NLP 2020 (7): Machine Translation/Sequence-to-sequence Models
0:43:51
- [ ] CMU Multilingual NLP 2020 (8): Data Augmentation for Machine Translation
0:24:42
- [ ] CMU Multilingual NLP 2020 (9): Language Contact and Similarity Across Languages
0:30:25
- [ ] CMU Multilingual NLP 2020 (10): Multilingual Training and Cross-lingual Transfer
0:39:58
- [ ] CMU Multilingual NLP 2020 (11): Unsupervised Translation
0:51:17
- [ ] CMU Multilingual NLP 2020 (12): Code Switching, Pidgins, and Creoles
0:46:37
- [ ] CMU Multilingual NLP 2020 (13): Speech
0:41:16
- [ ] CMU Multilingual NLP 2020 (14): Automatic Speech Recognition
0:39:33
- [ ] CMU Multilingual NLP 2020 (15): Low Resource ASR
0:43:38
- [ ] CMU Multilingual NLP 2020 (16): Text to Speech
0:39:00
- [ ] CMU Multilingual NLP 2020 (17): Morphological Analysis and Inflection
0:45:22
- [ ] CMU Multilingual NLP 2020 (18): Dependency Parsing
0:38:15
- [ ] CMU Multilingual NLP 2020 (19): Data Annotation
0:53:08
- [ ] CMU Multilingual NLP 2020 (20): Active Learning
0:28:37
- [ ] CMU Multilingual NLP 2020 (21): Information Extraction
0:41:00
- [ ] CMU Multilingual NLP 2020 (22): Multilingual NLP for Indigenous Languages
1:21:58
- [ ] CMU Multilingual NLP 2020 (23): Universal Translation at Scale
1:27:33
[ ] CS685: Advanced Natural Language Processing
- [ ] UMass CS685 (Advanced NLP): Attention mechanisms
0:48:53
- [ ] UMass CS685 (Advanced NLP): Question answering
0:59:50
- [ ] UMass CS685 (Advanced NLP): Better BERTs
0:52:23
- [ ] UMass CS685 (Advanced NLP): Text generation decoding and evaluation
1:02:32
- [ ] UMass CS685 (Advanced NLP): Paraphrase generation
1:10:59
- [ ] UMass CS685 (Advanced NLP): Crowdsourced text data collection
0:58:31
- [ ] UMass CS685 (Advanced NLP): Model distillation and security threats
1:09:25
- [ ] UMass CS685 (Advanced NLP): Retrieval-augmented language models
0:52:13
- [ ] UMass CS685 (Advanced NLP): Implementing a Transformer
1:12:36
- [ ] UMass CS685 (Advanced NLP): vision + language
1:06:28
- [ ] UMass CS685 (Advanced NLP): exam review
1:24:36
- [ ] UMass CS685 (Advanced NLP): Intermediate fine-tuning
1:10:35
- [ ] UMass CS685 (Advanced NLP): ethics in NLP
0:56:57
- [ ] UMass CS685 (Advanced NLP): probe tasks
0:54:30
- [ ] UMass CS685 (Advanced NLP): semantic parsing
0:48:49
- [ ] UMass CS685 (Advanced NLP): commonsense reasoning (guest lecture by Lorraine Li)
0:58:53
[ ] Notebook: Bi-LSTM with Attention - Binary Sentiment Classification
[ ] RNN and LSTM
[ ] Spacy Tutorial
[ ] Stanford CS224U: Natural Language Understanding | Spring 2019
- [ ] Lecture 1 – Course Overview | Stanford CS224U: Natural Language Understanding | Spring 2019
1:12:59
- [ ] Lecture 2 – Word Vectors 1 | Stanford CS224U: Natural Language Understanding | Spring 2019
1:17:10
- [ ] Lecture 3 – Word Vectors 2 | Stanford CS224U: Natural Language Understanding | Spring 2019
1:16:52
- [ ] Lecture 4 – Word Vectors 3 | Stanford CS224U: Natural Language Understanding | Spring 2019
0:38:20
- [ ] Lecture 5 – Sentiment Analysis 1 | Stanford CS224U: Natural Language Understanding | Spring 2019
1:10:44
- [ ] Lecture 6 – Sentiment Analysis 2 | Stanford CS224U: Natural Language Understanding | Spring 2019
1:03:23
- [ ] Lecture 7 – Relation Extraction | Stanford CS224U: Natural Language Understanding | Spring 2019
1:19:04
- [ ] Lecture 8 – NLI 1 | Stanford CS224U: Natural Language Understanding | Spring 2019
1:15:02
- [ ] Lecture 9 – NLI 2 | Stanford CS224U: Natural Language Understanding | Spring 2019
1:15:35
- [ ] Lecture 10 – Grounding | Stanford CS224U: Natural Language Understanding | Spring 2019
1:23:15
- [ ] Lecture 11 – Semantic Parsing | Stanford CS224U: Natural Language Understanding | Spring 2019
1:07:05
- [ ] Lecture 12 – Evaluation Methods | Stanford CS224U: Natural Language Understanding | Spring 2019
1:18:32
- [ ] Lecture 13 – Evaluation Metrics | Stanford CS224U: Natural Language Understanding | Spring 2019
1:11:32
- [ ] Lecture 14 – Contextual Vectors | Stanford CS224U: Natural Language Understanding | Spring 2019
1:14:33
- [ ] Lecture 15 – Presenting Your Work | Stanford CS224U: Natural Language Understanding | Spring 2019
1:15:11
[ ] Stanford CS224N: NLP with Deep Learning | Winter 2019
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 1 – Introduction and Word Vectors
1:21:52
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 2 – Word Vectors and Word Senses
1:20:43
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 3 – Neural Networks
1:18:50
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 4 – Backpropagation
1:22:15
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 5 – Dependency Parsing
1:20:22
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 6 – Language Models and RNNs
1:08:25
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 7 – Vanishing Gradients, Fancy RNNs
1:13:23
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 8 – Translation, Seq2Seq, Attention
1:16:56
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 9 – Practical Tips for Projects
1:22:39
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 10 – Question Answering
1:21:01
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 11 – Convolutional Networks for NLP
1:20:18
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 12 – Subword Models
1:15:30
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 13 – Contextual Word Embeddings
1:20:18
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 14 – Transformers and Self-Attention
0:53:48
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 15 – Natural Language Generation
1:19:37
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 16 – Coreference Resolution
1:19:20
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 17 – Multitask Learning
1:11:54
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 18 – Constituency Parsing, TreeRNNs
1:20:37
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 19 – Bias in AI
0:56:03
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 20 – Future of NLP + Deep Learning
1:19:15
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2020 | Low Resource Machine Translation
1:15:45
- [ ] Stanford CS224N: NLP with Deep Learning | Winter 2020 | BERT and Other Pre-trained Language Models
0:54:28
[ ] Stanford: Natural Language Processing | Dan Jurafsky, Christopher Manning
- [ ] Course Introduction
0:12:52
- [ ] Regular Expressions
0:11:25
- [ ] Regular Expressions in Practical NLP
0:06:05
- [ ] Word Tokenization
0:14:26
- [ ] Word Normalization and Stemming
0:11:48
- [ ] Sentence Segmentation
0:05:34
- [ ] Defining Minimum Edit Distance
0:07:05
- [ ] Computing Minimum Edit Distance
0:05:55
- [ ] Backtrace for Computing Alignments
0:05:56
- [ ] Weighted Minimum Edit Distance
0:02:48
- [ ] Minimum Edit Distance in Computational Biology
0:09:30
- [ ] Introduction to N grams
0:08:41
- [ ] Estimating N gram Probabilities
0:09:38
- [ ] Evaluation and Perplexity
0:11:09
- [ ] Generalization and Zeros
0:05:15
- [ ] Smoothing Add One
0:06:31
- [ ] Interpolation
0:10:25
- [ ] Good Turing Smoothing
0:15:35
- [ ] Kneser Ney Smoothing
0:08:59
- [ ] The Spelling Correction Task
0:05:40
- [ ] The Noisy Channel Model of Spelling
0:19:31
- [ ] Real Word Spelling Correction
0:09:20
- [ ] State of the Art Systems
0:07:10
- [ ] What is Text Classification
0:08:12
- [ ] Naive Bayes
0:03:20
- [ ] Formalizing the Naive Bayes Classifier
0:09:29
- [ ] Naive Bayes Learning
0:05:23
- [ ] Naive Bayes Relationship to Language Modeling
0:04:36
- [ ] Multinomial Naive Bayes A Worked Example
0:08:59
- [ ] Precision, Recall, and the F measure
0:16:17
- [ ] Text Classification Evaluation
0:07:17
- [ ] Practical Issues in Text Classification
0:05:57
- [ ] What is Sentiment Analysis
0:07:18
- [ ] Sentiment Analysis A baseline algorithm
0:13:27
- [ ] Sentiment Lexicons
0:08:38
- [ ] Learning Sentiment Lexicons
0:14:46
- [ ] Other Sentiment Tasks
0:11:02
- [ ] Generative vs Discriminative Models
0:07:50
- [ ] Making features from text for discriminative NLP models
0:18:12
- [ ] Feature Based Linear Classifiers
0:13:35
- [ ] Building a Maxent Model The Nuts and Bolts
0:08:05
- [ ] Generative vs Discriminative models
0:12:10
- [ ] Maximizing the Likelihood
0:10:30
- [ ] Introduction to Information Extraction
0:09:19
- [ ] Evaluation of Named Entity Recognition
0:06:35
- [ ] Sequence Models for Named Entity Recognition
0:15:06
- [ ] Maximum Entropy Sequence Models
0:13:02
- [ ] What is Relation Extraction
0:09:47
- [ ] Using Patterns to Extract Relations
0:06:17
- [ ] Supervised Relation Extraction
0:10:51
- [ ] Semi Supervised and Unsupervised Relation Extraction
0:09:53
- [ ] The Maximum Entropy Model Presentation
0:12:14
- [ ] Feature Overlap Feature Interaction
0:12:52
- [ ] Conditional Maxent Models for Classification
0:04:11
- [ ] Smoothing Regularization Priors for Maxent Models
0:29:24
- [ ] An Intro to Parts of Speech and POS Tagging
0:13:19
- [ ] Some Methods and Results on Sequence Models for POS Tagging
0:13:04
- [ ] Syntactic Structure Constituency vs Dependency
0:08:46
- [ ] Empirical Data Driven Approach to Parsing
0:07:11
- [ ] The Exponential Problem in Parsing
0:14:31
- [ ] Instructor Chat
0:09:03
- [ ] CFGs and PCFGs
0:15:30
- [ ] Grammar Transforms
0:12:06
- [ ] CKY Parsing
0:23:26
- [ ] CKY Example
0:21:25
- [ ] Constituency Parser Evaluation
0:09:46
- [ ] Lexicalization of PCFGs
0:07:03
- [ ] Charniak's Model
0:18:24
- [ ] PCFG Independence Assumptions
0:09:44
- [ ] The Return of Unlexicalized PCFGs
0:20:53
- [ ] Latent Variable PCFGs
0:12:08
- [ ] Dependency Parsing Introduction
0:10:25
- [ ] Greedy Transition Based Parsing
0:31:05
- [ ] Dependencies Encode Relational Structure
0:07:21
- [ ] Introduction to Information Retrieval
0:09:16
- [ ] Term Document Incidence Matrices
0:08:59
- [ ] The Inverted Index
0:10:43
- [ ] Query Processing with the Inverted Index
0:06:44
- [ ] Phrase Queries and Positional Indexes
0:19:46
- [ ] Introducing Ranked Retrieval
0:04:27
- [ ] Scoring with the Jaccard Coefficient
0:05:07
- [ ] Term Frequency Weighting
0:06:00
- [ ] Inverse Document Frequency Weighting
0:10:17
- [ ] TF IDF Weighting
0:03:42
- [ ] The Vector Space Model
0:16:23
- [ ] Calculating TF IDF Cosine Scores
0:12:48
- [ ] Evaluating Search Engines
0:09:03
- [ ] Word Senses and Word Relations
0:11:50
- [ ] WordNet and Other Online Thesauri
0:06:23
- [ ] Word Similarity and Thesaurus Methods
0:16:18
- [ ] Word Similarity Distributional Similarity I
0:13:15
- [ ] Word Similarity Distributional Similarity II
0:08:16
- [ ] What is Question Answering
0:07:29
- [ ] Answer Types and Query Formulation
0:08:48
- [ ] Passage Retrieval and Answer Extraction
0:06:38
- [ ] Using Knowledge in QA
0:04:25
- [ ] Advanced Answering Complex Questions
0:04:53
- [ ] Introduction to Summarization
0:04:46
- [ ] Generating Snippets
0:07:35
- [ ] Evaluating Summaries ROUGE
0:05:03
- [ ] Summarizing Multiple Documents
0:10:42
- [ ] Instructor Chat II
0:05:24
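The n-gram portion of the Jurafsky/Manning course above (estimating n-gram probabilities, add-one smoothing) boils down to counting. A toy bigram model with add-one (Laplace) smoothing; the corpus below is made up purely for illustration:

```python
from collections import Counter

corpus = "the cat sat on the mat the cat ate".split()
vocab = set(corpus)
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def p_add_one(w2, w1):
    """P(w2 | w1) with add-one smoothing: (c(w1 w2) + 1) / (c(w1) + V)."""
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + len(vocab))

print(round(p_add_one("cat", "the"), 3))  # 0.333 — "the cat" occurs often
print(round(p_add_one("mat", "cat"), 3))  # 0.125 — "cat mat" is unseen but not zero
```

Smoothing is what keeps unseen bigrams from zeroing out a sentence's probability; Good-Turing and Kneser-Ney (covered in the lectures above) are more refined versions of the same fix.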
[ ] TextBlob & Polyglot NLP Tutorials
- [ ] Natural Language Processing Tutorial With TextBlob - Tokens, Translation and Ngrams
0:11:01
- [ ] NLP Tutorial With TextBlob and Python - Parts of Speech Tagging
0:05:59
- [ ] NLP Tutorial With TextBlob & Python - Lemmatizating
0:06:32
- [ ] NLP Tutorial with TextBlob & Python - Sentiment Analysis (Polarity, Subjectivity)
0:06:31
- [ ] Building a NLP-based Flask App with TextBlob
0:37:30
- [ ] Natural Language Processing with Polyglot - Installation & Intro
0:12:49
[📺 ] fast.ai Code-First Intro to Natural Language Processing
- [ ] What is NLP? (NLP video 1)
0:22:42
- [ ] Topic Modeling with SVD & NMF (NLP video 2)
1:06:39
- [ ] Topic Modeling & SVD revisited (NLP video 3)
0:33:05
- [ ] Sentiment Classification with Naive Bayes (NLP video 4)
0:58:20
- [ ] Sentiment Classification with Naive Bayes & Logistic Regression, contd. (NLP video 5)
0:51:29
- [ ] Derivation of Naive Bayes & Numerical Stability (NLP video 6)
0:23:56
- [ ] Revisiting Naive Bayes, and Regex (NLP video 7)
0:37:33
- [ ] Intro to Language Modeling (NLP video 8)
0:40:58
- [ ] Transfer learning (NLP video 9)
1:35:16
- [ ] ULMFit for non-English Languages (NLP Video 10)
1:49:22
- [ ] Understanding RNNs (NLP video 11)
0:33:16
- [ ] Seq2Seq Translation (NLP video 12)
0:59:42
- [ ] Word embeddings quantify 100 years of gender & ethnic stereotypes-- Nikhil Garg (NLP video 13)
0:47:17
- [ ] Text generation algorithms (NLP video 14)
0:25:39
- [ ] Implementing a GRU (NLP video 15)
0:23:13
- [ ] Algorithmic Bias (NLP video 16)
1:26:17
- [ ] Introduction to the Transformer (NLP video 17)
0:22:54
- [ ] The Transformer for language translation (NLP video 18)
0:55:17
- [ ] What you need to know about Disinformation (NLP video 19)
0:51:21
- [📰] Zero to Hero with fastai - Beginner
- [📰] Zero to Hero with fastai - Intermediate
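The Naive Bayes videos in the fast.ai course above can be condensed into a tiny multinomial classifier with add-one smoothing; the toy sentiment corpus and labels below are invented for illustration:

```python
import math
from collections import Counter, defaultdict

docs = [("good great fun", "pos"), ("great movie good", "pos"),
        ("bad boring awful", "neg"), ("awful bad plot", "neg")]

class_counts = Counter(label for _, label in docs)
word_counts = defaultdict(Counter)
for text, label in docs:
    word_counts[label].update(text.split())
vocab = {w for text, _ in docs for w in text.split()}

def predict(text):
    """argmax over classes of log P(c) + sum_w log P(w|c), add-one smoothed."""
    best, best_score = None, -math.inf
    for c in class_counts:
        total = sum(word_counts[c].values())
        score = math.log(class_counts[c] / len(docs))
        for w in text.split():
            score += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = c, score
    return best

print(predict("good fun movie"))  # pos
print(predict("boring awful"))    # neg
```

Working in log space avoids underflow from multiplying many small probabilities, which is the numerical-stability point video 6 above makes.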
-
- [ ] Word Embeddings
- [ ] Text Classification
- [ ] Language Modeling
- [ ] Seq2seq and Attention
[📺 ] BERT Research Series
[ ] YouTube: Level 3 AI Assistant Conference 2020
- [📺 ] Conversation Analysis Theory in Chatbots | Michael Szul
- [📺 ] Designing Practical NLP Solutions | Ines Montani
- [📺 ] Effective Copywriting for Chatbots | Hans Van Dam
- [📺 ] Distilling BERT | Sam Sucik
- [📺 ] Transformer Policies that improve Dialogues: A Live Demo by Vincent Warmerdam
- [📺 ] From Research to Production – Our Process at Rasa | Tanja Bunk
- [📺 ] Keynote: Perspective on the 5 Levels of Conversational AI | Alan Nichol
[📺 ] RASA Algorithm Whiteboard
- [ ] Introducing The Algorithm Whiteboard
0:01:16
- [ ] Rasa Algorithm Whiteboard - Diet Architecture 1: How it Works
0:23:27
- [ ] Rasa Algorithm Whiteboard - Diet Architecture 2: Design Decisions
0:15:06
- [ ] Rasa Algorithm Whiteboard - Diet Architecture 3: Benchmarking
0:22:34
- [ ] Rasa Algorithm Whiteboard - Embeddings 1: Just Letters
0:13:48
- [ ] Rasa Algorithm Whiteboard - Embeddings 2: CBOW and Skip Gram
0:19:24
- [ ] Rasa Algorithm Whiteboard - Embeddings 3: GloVe
0:19:12
- [ ] Rasa Algorithm Whiteboard - Embeddings 4: Whatlies
0:14:03
- [ ] Rasa Algorithm Whiteboard - Attention 1: Self Attention
0:14:32
- [ ] Rasa Algorithm Whiteboard - Attention 2: Keys, Values, Queries
0:12:26
- [ ] Rasa Algorithm Whiteboard - Attention 3: Multi Head Attention
0:10:55
- [ ] Rasa Algorithm Whiteboard: Attention 4 - Transformers
0:14:34
- [ ] Rasa Algorithm Whiteboard - StarSpace
0:11:46
- [ ] Rasa Algorithm Whiteboard - TED Policy
0:16:10
- [ ] Rasa Algorithm Whiteboard - TED in Practice
0:14:54
- [ ] Rasa Algorithm Whiteboard - Response Selection
0:12:07
- [ ] Rasa Algorithm Whiteboard - Response Selection: Implementation
0:09:25
- [ ] Rasa Algorithm Whiteboard - Countvectors
0:13:32
- [ ] Rasa Algorithm Whiteboard - Subword Embeddings
0:11:58
- [ ] Rasa Algorithm Whiteboard - Implementation of Subword Embeddings
0:10:01
- [ ] Rasa Algorithm Whiteboard - BytePair Embeddings
0:12:44
[📺 ] The Transformer neural network architecture explained. “Attention is all you need” (NLP)
[📺 ] How does a Transformer architecture combine Vision and Language? ViLBERT - NLP meets Computer Vision
[📺 ] Strategies for pre-training the BERT-based Transformer architecture – language (and vision)
[📺 ] What is GPT-3? Showcase, possibilities, and implications
[📺 ] TextAttack: A Framework for Data Augmentation and Adversarial Training in NLP
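The attention videos above (the RASA whiteboard series and the Transformer explainers) all revolve around one formula: softmax(QKᵀ/√d)·V. A stdlib-only sketch on toy 2-D vectors, with made-up numbers:

```python
import math

def softmax(xs):
    m = max(xs)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: each output is a softmax-weighted
    average of the value vectors, weighted by query-key similarity."""
    d = len(queries[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[i] for w, v in zip(weights, values))
                    for i in range(len(values[0]))])
    return out

Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
print(attention(Q, K, V))  # output pulled toward the first value vector
```

Multi-head attention (whiteboard video 3 above) simply runs several of these in parallel on learned projections of Q, K, and V and concatenates the results.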
[📺 ] Spacy IRL 2019
- [ ] Sebastian Ruder: Transfer Learning in Open-Source Natural Language Processing (spaCy IRL 2019)
0:31:24
- [ ] Giannis Daras: Improving sparse transformer models for efficient self-attention (spaCy IRL 2019)
0:20:13
- [ ] Peter Baumgartner: Applied NLP: Lessons from the Field (spaCy IRL 2019)
0:18:44
- [ ] Justina Petraitytė: Lessons learned in helping ship conversational AI assistants (spaCy IRL 2019)
0:23:48
- [ ] Yoav Goldberg: The missing elements in NLP (spaCy IRL 2019)
0:30:27
- [ ] Sofie Van Landeghem: Entity linking functionality in spaCy (spaCy IRL 2019)
0:20:08
- [ ] Guadalupe Romero: Rethinking rule-based lemmatization (spaCy IRL 2019)
0:14:49
- [ ] Mark Neumann: ScispaCy: A spaCy pipeline & models for scientific & biomedical text (spaCy IRL 2019)
0:18:59
- [ ] Patrick Harrison: Financial NLP at S&P Global (spaCy IRL 2019)
0:21:42
- [ ] McKenzie Marshall: NLP in Asset Management (spaCy IRL 2019)
0:20:32
- [ ] David Dodson: spaCy in the News: Quartz's NLP pipeline (spaCy IRL 2019)
0:20:56
- [ ] Matthew Honnibal & Ines Montani: spaCy and Explosion: past, present & future (spaCy IRL 2019)
0:54:13
[📺 ] Sentiment Analysis: Key Milestones, Challenges and New Directions
[📺 ] Simple and Efficient Deep Learning for Natural Language Processing, with Moshe Wasserblat, Intel AI
[📺 ] Why not solve biological problems with a Transformer? BERTology meets Biology
[📺 ] Self-attention step-by-step | How to get meaning from text
[ ] NLP with Friends
- [ ] NLP with Friends, Featured Friend: Tom McCoy
0:36:48
- [ ] NLP with Friends, Featured Friend: Maarten Sap
0:36:11
- [ ] NLP with Friends, featured friend: Nitika Mathur
1:01:42
- [ ] NLP with Friends, Featured Friend: Sabrina J Mielke
1:01:28
[ ] Crash Course Linguistics
- [ ] Crash Course Linguistics Preview
0:02:50
- [ ] What is Linguistics?: Crash Course Linguistics #1
0:11:11
[📺 ] Talks # 3: Lorenzo Ampil - Introduction to T5 for Sentiment Span Extraction
[📺 ] Frontiers in ML: Learning from Limited Labeled Data: Challenges and Opportunities for NLP
[📺 ] spaCy v3.0: Bringing State-of-the-art NLP from Prototype to Production
0:22:40
[📺 ] Conversational AI with Transformers and Rule-Based Systems
1:53:24
[ ] Talk: EmoTag1200: Understanding the Association between Emojis and Emotions
[📺 ] Research Paper Walkthrough
- [ ] Simple Unsupervised Keyphrase Extraction using Sentence Embeddings | Research Paper Walkthrough
0:21:23
- [ ] Leveraging BERT for Extractive Text Summarization on Lectures | Research Paper Walkthrough
0:20:10
- [ ] Data Augmentation Techniques for Text Classification in NLP | Research Paper Walkthrough
0:14:33
- [ ] CRIM at SemEval-2018 Task 9: A Hybrid Approach to Hypernym Discovery | Research Paper Walkthrough
0:23:47
- [ ] Data Augmentation using Pre-trained Transformer Model (BERT, GPT2, etc) | Research Paper Walkthrough
0:17:43
- [ ] A Supervised Approach to Extractive Summarisation of Scientific Papers | Research Paper Walkthrough
0:19:01
- [ ] BLEURT: Learning Robust Metrics for Text Generation | Research Paper Walkthrough
0:13:38
- [ ] DeepWalk: Online Learning of Social Representations | Machine Learning with Graphs | Research Paper Walkthrough
0:17:44
- [ ] LSBert: A Simple Framework for Lexical Simplification | Research Paper Walkthrough
0:20:27
- [ ] SpanBERT: Improving Pre-training by Representing and Predicting Spans | Research Paper Walkthrough
0:14:21
- [ ] Text Summarization of COVID-19 Medical Articles using BERT and GPT-2 | Research Paper Walkthrough
0:21:52
- [ ] Extractive & Abstractive Summarization with Transformer Language Models | Research Paper Walkthrough
0:16:58
- [ ] Unsupervised Multi-Document Summarization using Neural Document Model | Research Paper Walkthrough
0:15:11
- [ ] SummPip: Multi-Document Summarization with Sentence Graph Compression | Research Paper Walkthrough
0:16:54
- [ ] Combining BERT with Static Word Embedding for Categorizing Social Media | Research Paper Walkthrough
0:13:51
- [ ] Reformulating Unsupervised Style Transfer as Paraphrase Generation | Research Paper Walkthrough
0:19:41
- [ ] PEGASUS: Pre-training with Gap-Sentences for Abstractive Summarization | Research Paper Walkthrough
0:15:04
- [ ] Evaluation of Text Generation: A Survey | Human-Centric Evaluations | Research Paper Walkthrough
0:15:54
- [ ] TOD-BERT: Pre-trained Transformers for Task-Oriented Dialogue Systems (Research Paper Walkthrough)
0:15:25
- [ ] TextRank: Bringing Order into Texts (Research Paper Walkthrough)
0:14:34
- [ ] Node2Vec: Scalable Feature Learning for Networks | Machine Learning with Graphs (Research Paper Walkthrough)
0:14:33
- [ ] HARP: Hierarchical Representation Learning for Network | Machine Learning with Graphs (Research Paper Walkthrough)
0:15:10
- [ ] URL2Video: Automatic Video Creation From a Web Page | AI and Creativity (Research Paper Walkthrough)
0:15:21
- [ ] On Generating Extended Summaries of Long Documents (Research Paper Walkthrough)
0:14:24
- [ ] Nucleus Sampling: The Curious Case of Neural Text Degeneration (Research Paper Walkthrough)
0:12:48
- [ ] T5: Exploring Limits of Transfer Learning with Text-to-Text Transformer (Research Paper Walkthrough)
0:12:47
- [ ] DialoGPT: Generative Training for Conversational Response Generation (Research Paper Walkthrough)
0:13:17
- [ ] Hierarchical Transformers for Long Document Classification (Research Paper Walkthrough)
0:12:46
- [ ] Beyond Accuracy: Behavioral Testing of NLP Models with CheckList (Best Paper ACL 2020)
0:14:00
- [ ] Simple Unsupervised Keyphrase Extraction using Sentence Embeddings | Research Paper Walkthrough
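Several of the walkthroughs above (TextRank, DeepWalk, Node2Vec) rest on the same core idea: run a PageRank-style power iteration over a graph. A minimal sketch of that iteration, assuming a hypothetical pre-computed sentence-similarity matrix `sim`:

```python
# Minimal TextRank-style ranking: PageRank power iteration over a
# (hypothetical) sentence-similarity matrix. Entry sim[i][j] is the
# similarity between sentence i and sentence j.
def textrank_scores(sim, damping=0.85, iters=50):
    n = len(sim)
    # Column-normalize so each sentence distributes its score over neighbors.
    col_sums = [sum(sim[i][j] for i in range(n)) or 1.0 for j in range(n)]
    scores = [1.0 / n] * n
    for _ in range(iters):
        scores = [
            (1 - damping) / n
            + damping * sum(sim[i][j] / col_sums[j] * scores[j] for j in range(n))
            for i in range(n)
        ]
    return scores
```

Sentences with the highest scores become the extractive summary; the real papers differ mainly in how `sim` is built.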
- [ ] NLP Summit 2020
- [ ] The 2020 Trends for Applied Natural Language Processing | NLP Summit 2020
0:21:10
- [ ] NLP Industry Survey Analysis: the landscape of natural language use cases in 2020 | NLP Summit 2020
0:20:23
- [ ] Auto NLP: Pretrain, Tune & Deploy State-of-the-art Models Without Coding
0:19:57
- [ ] Proof-of-Concept Delight | NLP Summit 2020
0:16:50
- [ ] Distributed Natural Language Processing Apps for Financial Engineering | NLP Summit 2020
0:34:49
- [ ] Bleeding Edge Applications of 2020 Transformers | NLP Summit 2020
0:33:34
- [ ] How Freshworks Freddy AI leverages NLP for Ethics-First Customer Experiences | NLP Summit 2020
0:26:49
- [ ] NLP for Recruitment Automation: Building a Chatbot from the Job Description | NLP Summit 2020
0:22:31
[📺 ] Gibberish Detector
- [📺 ] NLP Lecture 7 Constituency Parsing
- [ ] NLP Lecture 7 - Overview of Constituency Parsing Lecture
0:01:50
- [ ] NLP Lecture 7 - Introduction to Constituency Parsing
0:10:29
- [ ] NLP Lecture 7(a) - Context Free Grammar
0:17:03
- [ ] NLP Lecture 7(b) - Constituency Parsing
0:13:28
- [ ] NLP Lecture 7(c) - Statistical Constituency Parsing
0:09:38
- [ ] NLP Lecture 7(d) - Dependency Parsing
0:17:15
- [📺 ] SpaCy for Digital Humanities with Python Tutorials
- [ ] Introduction to SpaCy and Cleaning Data (SpaCy and Python Tutorials for DH - 01)
0:06:07
- [ ] How to Install SpaCy and Models (Spacy and Python Tutorial for DH 02)
0:07:40
- [ ] How to Separate Sentences in SpaCy (SpaCy and Python Tutorials for DH - 03)
0:08:33
- [ ] Spacy and Named Entity Recognition NER (Spacy and Python Tutorial for DH 04)
0:08:32
- [ ] Finding Parts of Speech (SpaCy and Python Tutorial for DH 05)
0:02:55
- [ ] Extracting Nouns and Noun Chunks (SpaCy and Python Tutorial for DH 06)
0:05:46
- [ ] Extracting Verbs and Verb Phrases (SpaCy and Python Tutorial for DH 07)
0:08:10
- [ ] Lemmatization: Finding the Roots of Words (Spacy and Python Tutorial for DH 08)
0:04:52
- [ ] Data Visualization with DisplaCy (Spacy and Python Tutorial for DH 09)
0:09:13
- [ ] Customizing DisplaCy Render Data Visualization (Spacy and Python Tutorial for DH 10)
0:08:19
- [ ] Finding Quotes in Sentences (SpaCy and Python Tutorial for DH 11)
0:08:45
- [ ] Introduction to Named Entity Recognition (NER for DH 01)
0:16:43
- [ ] Machine Learning NER with Python and spaCy (NER for DH 03 )
0:13:36
- [ ] How to Use spaCy's EntityRuler (Named Entity Recognition for DH 04 | Part 01)
0:36:50
- [ ] How to Use spaCy to Create an NER training set (Named Entity Recognition for DH 04 | Part 02)
0:10:32
- [ ] How to Train a spaCy NER model (Named Entity Recognition for DH 04 | Part 03)
0:15:40
- [ ] Examining a spaCy Model in the Folder (Named Entity Recognition for DH 05)
0:15:06
- [ ] What are Word Vectors (Named Entity Recognition for DH 06)
0:18:49
- [ ] How to Generate Custom Word Vectors in Gensim (Named Entity Recognition for DH 07)
0:23:05
- [ ] How to Load Custom Word Vectors into spaCy Models (Named Entity Recognition for DH 08)
0:10:46
- [ ] Getting the Data for Custom Labels (Holocaust NER for DH 09.01)
0:11:00
- [ ] How to Add a Custom NER Pipe in spaCy and a Custom Label (NER for DH 09.02 )
0:07:49
- [ ] How to Training Custom Entities into spaCy Models (Named Entity Recognition for DH 09 03)
0:15:29
- [ ] How to Add and Place Pipes from other Models into a New Model (NER for DH 09 04)
0:12:24
- [ ] How to Add Custom Functions to spaCy Pipeline (NER for DH 09.05)
0:15:20
- [ ] Precision vs. Recall and Adding PERSON to Holocaust NER Pipeline (Named Entity Recognition DH 09.06)
0:26:02
- [ ] Finalizing the Holocaust NER Pipeline (Named Entity Recognition for DH 09.07)
0:14:16
- [ ] Classical Latin Named Entity Recognition (NER for DH 10.01)
0:55:30
- [ ] How to Package spaCy Models (Even with Custom Factories) (NER for DH 11)
0:15:31
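The precision-vs-recall trade-off discussed for the NER pipeline above reduces to two set ratios: of the entities we predicted, how many were correct (precision), and of the true entities, how many we found (recall). A small sketch with made-up entity sets:

```python
# Precision and recall for entity extraction, treating predictions
# and gold annotations as sets of entity strings.
def precision_recall(predicted, gold):
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)                    # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall
```

A conservative EntityRuler tends to score high precision and low recall; loosening the patterns trades one for the other.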
[📺 ] Cheuk Ting Ho - Fuzzy Matching Smart Way of Finding Similar Names Using Fuzzywuzzy
[📺 ] What's in a Name? Fast Fuzzy String Matching - Seth Verrinder & Kyle Putnam - Midwest.io 2015
[📺 ] Jiaqi Liu Fuzzy Search Algorithms How and When to Use Them PyCon 2017
[📺 ] 1 + 1 = 1 or Record Deduplication with Python | Flávio Juvenal @ PyBay2018
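The fuzzy-matching and deduplication talks above all build on an edit distance. The classic dynamic-programming Levenshtein distance, plus a normalized ratio similar in spirit to fuzzywuzzy's `ratio`, can be sketched in pure Python:

```python
# Levenshtein distance via dynamic programming: the minimum number of
# single-character insertions, deletions, and substitutions.
def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]

def similarity(a, b):
    """Normalized similarity in [0, 1]; 1.0 means identical strings."""
    if not a and not b:
        return 1.0
    return 1 - levenshtein(a, b) / max(len(a), len(b))
```

Production libraries add blocking and token-level heuristics on top, but the distance itself is this small.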
[📺 ] Approximate nearest neighbors and vector models, introduction to Annoy
[📺 ] Librosa Audio and Music Signal Analysis in Python | SciPy 2015 | Brian McFee
- [📺 ] Deep Learning (for Audio) with Python
- [ ] 1- Deep Learning (for Audio) with Python: Course Overview
0:08:02
- [ ] 2- AI, machine learning and deep learning
0:31:15
- [ ] 3- Implementing an artificial neuron from scratch
0:19:05
- [ ] 4- Vector and matrix operations
0:25:51
- [ ] 5- Computation in neural networks
0:23:19
- [ ] 6- Implementing a neural network from scratch in Python
0:21:03
- [ ] 7- Training a neural network: Backward propagation and gradient descent
0:21:41
- [ ] 8- TRAINING A NEURAL NETWORK: Implementing backpropagation and gradient descent from scratch
1:03:00
- [ ] 9- How to implement a (simple) neural network with TensorFlow 2
0:24:37
- [ ] 10 - Understanding audio data for deep learning
0:32:55
- [ ] 11- Preprocessing audio data for Deep Learning
0:25:05
- [ ] 12- Music genre classification: Preparing the dataset
0:37:45
- [ ] 13- Implementing a neural network for music genre classification
0:33:25
- [ ] 14- SOLVING OVERFITTING in neural networks
0:26:29
- [ ] 15- Convolutional Neural Networks Explained Easily
0:35:23
- [ ] 16- How to Implement a CNN for Music Genre Classification
0:49:10
- [ ] 17- Recurrent Neural Networks Explained Easily
0:28:35
- [ ] 18- Long Short Term Memory (LSTM) Networks Explained Easily
0:28:08
- [ ] 19- How to Implement an RNN-LSTM Network for Music Genre Classification
0:14:29
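The "artificial neuron from scratch" step in the course above fits in a few lines: a weighted sum of inputs plus a bias, squashed through a sigmoid. A minimal sketch:

```python
import math

# A single sigmoid neuron: activation = sigmoid(w · x + b).
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)
```

Stacking layers of these and learning the weights by backpropagation is the rest of the course.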
- [📺 ] Advanced Information Retrieval 2021 - TU Wien
- [ ] AIR - 2021 Course Introduction
0:21:39
- [ ] Crash Course IR - Fundamentals
0:46:31
- [ ] Crash Course IR - Evaluation
0:37:15
- [ ] Crash Course IR - Test Collections
0:51:12
- [ ] Word Representation Learning
0:42:02
- [ ] Sequence Modelling with CNNs and RNNs
0:55:04
- [ ] Transformer and BERT Pre-training
0:47:15
- [ ] Introduction to Neural Re-Ranking
0:59:20
- [ ] Transformer Contextualized Re-Ranking
0:49:06
- [ ] Domain Specific Applications
0:38:32
- [ ] Dense Retrieval ❤ Knowledge Distillation
0:59:28
- [📺 ] Introduction to Dense Text Representation
- [ ] Introduction to Dense Text Representations - Part 1
0:12:56
- [ ] Introduction to Dense Text Representations - Part 2
0:23:13
- [ ] Introduction to Dense Text Representation - Part 3
0:38:07
- [ ] Training State-of-the-Art Sentence Embedding Models
0:43:43
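Dense text representations are compared with cosine similarity: the angle between two embedding vectors, ignoring their magnitudes. A self-contained sketch:

```python
import math

# Cosine similarity between two embedding vectors: 1.0 for parallel
# vectors, 0.0 for orthogonal ones, regardless of vector length.
def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)
```

Sentence-embedding models are trained so that semantically similar texts land close under exactly this measure.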
[📺 ] Fine-tuning a large language model without your own supercomputer
[📺 ] Transformers 🤗 to Rule Them All? Under the Hood of the AI Recruiter Chatbot 🤖, with Keisuke Inoue
[📺 ] Artificial Intelligence and Natural Language Processing in E-Commerce by Katherine Munro | smec
[📺 ] Abhishek Thakur - Classifying Search Queries Without User Click Data
[📺 ] Chatbots Revisted | by Abhishek Thakur | Kaggle Days Warsaw
[📺 ] Design Considerations for building ML-Powered Search Applications - Mark Moyou
[📺 ] Productionizing an unsupervised machine learning model to understand customer feedback
[📺 ] Extracting topics from reviews using NLP - Dr. Tal Perri
[📺 ] Bringing innovation to online retail: automating customer service with NLP
[📺 ] Transform customer service with machine learning (Google Cloud Next '17)
[📺 ] Real life aspects of opinion sentiment analysis within customer reviews - Dr. Jonathan Yaniv
[📺 ] Deep Learning Methods for Emotion Detection from Text - Dr. Liron Allerhand
[📺 ] Learning How to Learn NLP : Developing Introductory Concepts Through Scaffolded Discoveries
[📺 ] Applied Machine Learning 2020 - 15 - Working with Text Data
1:27:08
[📺 ] Applied Machine Learning 2020 - 16 - Topic models for text data
1:18:34
[📺 ] Applied Machine Learning 2020 - 17 - Word vectors and document embeddings
1:03:04
[📺 ] A Briefish Introduction to Discourse Representation Theory
[📺 ] HuggingFace Crash Course - Sentiment Analysis, Model Hub, Fine Tuning
- [📺 ] Transformers From Scratch
- [ ] How-to Use HuggingFace's Datasets - Transformers From Scratch #1
0:14:21
- [ ] Build a Custom Transformer Tokenizer - Transformers From Scratch #2
0:14:17
- [ ] Building MLM Training Input Pipeline - Transformers From Scratch #3
0:23:11
- [ ] Training and Testing an Italian BERT - Transformers From Scratch #4
0:30:38
Be familiar with multi-modal machine learning
- [ ] CMU: MultiModal Machine Learning Fall 2020
- [ ] Lecture 1.1: Course Introduction
- [ ] Lecture 1.2: Multimodal applications and datasets
- [ ] Lecture 2.1: Basic concepts: neural networks
- [ ] Lecture 2.2: Basic concepts: network optimization
- [ ] Lecture 3.1: Visual unimodal representations
- [ ] Lecture 3.2: Language unimodal representations
- [ ] Lecture 4.1: Multimodal representation learning
- [ ] Lecture 4.2: Coordinated representations
- [ ] Lecture 5.1: Multimodal alignment
- [ ] Lecture 5.2: Alignment and representation
- [ ] Lecture 7.1: Alignment and translation
- [ ] Lecture 7.2: Probabilistic graphical models
- [ ] Lecture 8.1: Discriminative graphical models
- [ ] Lecture 8.2: Deep Generative Models
- [ ] Lecture 9.1: Reinforcement learning
- [ ] Lecture 9.2: Multimodal RL
- [ ] Lecture 10.1: Fusion and co-learning
- [ ] Lecture 10.2: New research directions
Be familiar with Recommendation Systems
[ ] Pluralsight: Understanding Algorithms for Recommendation Systems
[📺 ] Learning to Rank: From Theory to Production - Malvina Josephidou & Diego Ceccarelli, Bloomberg
[📺 ] Learning to rank search results - Byron Voorbach & Jettro Coenradie [DevCon 2018]
Be able to implement computer vision models
[📰] EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
[📰] Understanding the receptive field of deep convolutional networks
[📰] Deep learning in medical imaging - 3D medical image segmentation with PyTorch
[📰] Intuitive Explanation of Skip Connections in Deep Learning
[📰] Understanding coordinate systems and DICOM for deep learning medical image analysis
[📰] The 9 Deep Learning Papers You Need To Know About (Understanding CNNs Part 3)
[📰] DenseNet Architecture Explained with PyTorch Implementation from TorchVision
[📰] Squeeze and Excitation Networks Explained with PyTorch Implementation
[📰] Object Detection for Dummies Part 1: Gradient Vector, HOG, and SS
[📰] Object Detection for Dummies Part 2: CNN, DPM and Overfeat
[📰] How to extract Key-Value pairs from Documents using deep learning
[📰] Essential Pil (Pillow) Image Tutorial (for Machine Learning People)
[📰] A Brief History of CNNs in Image Segmentation: From R-CNN to Mask R-CNN
[📰] NonCompositional
[📰] Part 1: Deep Representations, a way towards neural style transfer
[📰] Looking Inside The Blackbox — How To Trick A Neural Network
- [ ] Stanford: CS231N Winter 2016
- [ ] CS231n Winter 2016: Lecture 1: Introduction and Historical Context
1:19:08
- [ ] CS231n Winter 2016: Lecture 2: Data-driven approach, kNN, Linear Classification 1
0:57:28
- [ ] CS231n Winter 2016: Lecture 3: Linear Classification 2, Optimization
1:11:23
- [ ] CS231n Winter 2016: Lecture 4: Backpropagation, Neural Networks 1
1:19:38
- [ ] CS231n Winter 2016: Lecture 5: Neural Networks Part 2
1:18:37
- [ ] CS231n Winter 2016: Lecture 6: Neural Networks Part 3 / Intro to ConvNets
1:09:35
- [ ] CS231n Winter 2016: Lecture 7: Convolutional Neural Networks
1:19:01
- [ ] CS231n Winter 2016: Lecture 8: Localization and Detection
1:04:57
- [ ] CS231n Winter 2016: Lecture 9: Visualization, Deep Dream, Neural Style, Adversarial Examples
1:18:20
- [ ] CS231n Winter 2016: Lecture 10: Recurrent Neural Networks, Image Captioning, LSTM
1:09:54
- [ ] CS231n Winter 2016: Lecture 11: ConvNets in practice
1:15:03
- [ ] CS231n Winter 2016: Lecture 12: Deep Learning libraries
1:21:06
- [ ] CS231n Winter 2016: Lecture 14: Videos and Unsupervised Learning
1:17:36
- [ ] CS231n Winter 2016: Lecture 13: Segmentation, soft attention, spatial transformers
1:10:59
- [ ] CS231n Winter 2016: Lecture 15: Invited Talk by Jeff Dean
1:14:49
[📺 ] Autoencoders - EXPLAINED
0:10:53
[📺 ] Building an Image Captioner with Neural Networks
0:12:54
[📺 ] Convolution Neural Networks - EXPLAINED
0:19:20
[📺 ] Depthwise Separable Convolution - A FASTER CONVOLUTION!
0:12:43
[📺 ] Generative Adversarial Networks - FUTURISTIC & FUN AI !
0:14:20
[📺 ] Mask Region based Convolution Neural Networks - EXPLAINED!
0:09:34
[📺 ] Sound play with Convolution Neural Networks
0:11:57
[📺 ] The Evolution of Convolution Neural Networks
0:24:02
[📺 ] An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale (Paper Explained)
[📺 ] Deep Residual Learning for Image Recognition (Paper Explained)
[📺 ] Evolution of Face Generation | Evolution of GANs
0:12:23
[📺 ] ConvNets Scaled Efficiently
0:13:19
[📺 ] DETR: End-to-End Object Detection with Transformers (Paper Explained)
[📺 ] Unpaired Image-Image Translation using CycleGANs
0:16:22
[📺 ] Formula Indexing, Search, and Entry in the MathSeer Project
0:52:07
[📺 ] What's new in computer vision | July Queensland AI
1:21:33
Be able to model graphs and network data
Be able to implement models for timeseries and forecasting
Be familiar with basics of Reinforcement Learning
- [ ] DeepLizard: Reinforcement Learning - Goal Oriented Intelligence
- [ ] Reinforcement Learning Series Intro - Syllabus Overview
0:05:51
- [ ] Markov Decision Processes (MDPs) - Structuring a Reinforcement Learning Problem
0:06:34
- [ ] Expected Return - What Drives a Reinforcement Learning Agent in an MDP
0:06:47
- [ ] Policies and Value Functions - Good Actions for a Reinforcement Learning Agent
0:06:52
- [ ] What do Reinforcement Learning Algorithms Learn - Optimal Policies
0:06:21
- [ ] Q-Learning Explained - A Reinforcement Learning Technique
0:08:37
- [ ] Exploration vs. Exploitation - Learning the Optimal Reinforcement Learning Policy
0:10:06
- [ ] OpenAI Gym and Python for Q-learning - Reinforcement Learning Code Project
0:07:52
- [ ] Train Q-learning Agent with Python - Reinforcement Learning Code Project
0:08:59
- [ ] Watch Q-learning Agent Play Game with Python - Reinforcement Learning Code Project
0:07:22
- [ ] Deep Q-Learning - Combining Neural Networks and Reinforcement Learning
0:10:50
- [ ] Replay Memory Explained - Experience for Deep Q-Network Training
0:06:21
- [ ] Training a Deep Q-Network - Reinforcement Learning
0:09:07
- [ ] Training a Deep Q-Network with Fixed Q-targets - Reinforcement Learning
0:07:35
- [ ] Deep Q-Network Code Project Intro - Reinforcement Learning
0:06:26
- [ ] Build Deep Q-Network - Reinforcement Learning Code Project
0:16:51
- [ ] Deep Q-Network Image Processing and Environment Management - Reinforcement Learning Code Project
0:21:53
- [ ] Deep Q-Network Training Code - Reinforcement Learning Code Project
0:19:46
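The tabular Q-learning update covered in the series, Q(s,a) ← Q(s,a) + α[r + γ·max Q(s′,·) − Q(s,a)], plus the epsilon-greedy action choice, can be sketched directly (the Q-table here is a plain dict of lists):

```python
import random

# One tabular Q-learning update. Q maps state -> list of action values;
# s_next is None at terminal states.
def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    best_next = max(Q[s_next]) if s_next is not None else 0.0
    Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])
    return Q[s][a]

# Epsilon-greedy policy: explore with probability epsilon, else exploit.
def epsilon_greedy(Q, s, epsilon=0.1):
    if random.random() < epsilon:
        return random.randrange(len(Q[s]))
    return max(range(len(Q[s])), key=lambda a: Q[s][a])
```

Deep Q-learning replaces the table with a neural network but keeps this same update as its training target.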
Be able to perform hyperparameter tuning
[ ] Coursera: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
[📺 ] How do GPUs speed up Neural Network training?
0:08:19
[📺 ] Why use GPU with Neural Networks?
0:09:43
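Beyond grid search, random search over the hyperparameter space is often competitive at a fraction of the cost. A minimal sketch, assuming a hypothetical `objective` that returns a validation score to maximize:

```python
import random

# Random search: sample n_trials configurations from a discrete space
# and keep the best-scoring one. `space` maps parameter name -> choices.
def random_search(objective, space, n_trials=20, seed=0):
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {name: rng.choice(choices) for name, choices in space.items()}
        score = objective(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

Libraries like the ones covered in the Coursera course add smarter samplers (Bayesian optimization, Hyperband), but the loop structure is the same.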
Be familiar with literature on model interpretability
[📰] TracIn — A Simple Method to Estimate Training Data Influence
[📰] How to Explain the Prediction of a Machine Learning Model?
[📺 ] Jay Alammar - Take A Look Inside Language Models With Ecco
[📺 ] How do we check if a neural network has learned a specific phenomenon?
[📺 ] What is Adversarial Machine Learning and what to do about it? – Adversarial example compilation
[📺 ] SE4AI: Explainability and Interpretability (Part 1)
1:17:45
[📺 ] SE4AI: Explainability and Interpretability (Part 2)
1:21:50
Be able to optimize models for inference
[📰] FasterAI
[📰] Is the future of Neural Networks Sparse? An Introduction (1/N)
[📰] Sparse Neural Networks (2/N): Understanding GPU Performance.
[📰] Block Sparse Matrices for Smaller and Faster Language Models
[📰] How to accelerate and compress neural networks with quantization
[📰] Scaling up BERT-like model Inference on modern CPU - Part 1
Be able to write unit tests
[📰] How to Unit Test Deep Learning: Tests in TensorFlow, mocking and test coverage
[📰] Test-Driven Machine Learning Development (Deployment Series: Guide 07)
[🅤]ꭏ Software Testing
[🅤]ꭏ Software Debugging
[📺 ] Beyond Accuracy: Behavioral Testing of NLP Models with CheckList | AISC
[📺 ] Lecture 10: Machine Learning Testing & Explainability (Full Stack Deep Learning - Spring 2021)
1:41:12
[📺 ] Lab 8: Testing and Continuous Integration (Full Stack Deep Learning - Spring 2021)
0:13:26
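A common pattern from the testing resources above: unit-test the deterministic pieces of an ML pipeline (preprocessing, shapes, invariants) rather than exact model weights. A small sketch using the standard `unittest` module and a made-up scaling helper:

```python
import unittest

# A deterministic preprocessing step worth unit-testing.
def min_max_scale(values):
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

class TestPreprocessing(unittest.TestCase):
    def test_output_range(self):
        scaled = min_max_scale([2, 4, 6])
        self.assertEqual(min(scaled), 0.0)
        self.assertEqual(max(scaled), 1.0)

    def test_constant_input_does_not_divide_by_zero(self):
        self.assertEqual(min_max_scale([5, 5]), [0.0, 0.0])

# Run with: python -m unittest <this_module>
```

The same idea scales up to mocking data loaders and asserting tensor shapes, as in the TensorFlow article above.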
Be familiar with Machine Learning System Design
[📰] How to build scalable Machine Learning systems — Part 1/2
[📰] How to properly ship and deploy your machine learning model
[📰] Key Concepts for Deploying Machine Learning Models to Mobile
[📰] Machine Learning Infrastructure Tools for Model Building
[📰] Machine Learning Infrastructure Tools for Production (Part 1)
[📰] Building scalable and efficient Machine Learning Pipelines
- [📺 ] Applied Machine Learning in Production
- [ ] Objective · Applied Machine Learning in Production
0:02:38
- [ ] Solution · Applied Machine Learning in Production
0:08:53
- [ ] Evaluation · Applied Machine Learning in Production
0:04:02
- [ ] Iteration · Applied Machine Learning in Production
0:04:35
- [ ] Annotation · Applied Machine Learning in Production
0:14:34
- [ ] Exploratory Data Analysis (EDA) · Applied Machine Learning in Production
0:09:02
[📺 ] SE4AI: Software Architecture of AI-Enabled Systems
1:14:24
[📺 ] SE4AI: Invited Talk Molham Aref "Business Systems with Machine Learning "
0:47:53
[📺 ] MLOps #4: Shubhi Jain - Building a Machine Learning Platform @SurveyMonkey
0:55:42
[📺 ] MLOps Meetup #6: Mid-Scale Production Feature Engineering with Dr. Venkata Pingali
1:01:35
[📺 ] MLOps meetup #5 High Stakes Machine Learning with Flavio Clesio
0:55:27
[📺 ] MLOps meetup #7 Alex Spanos // TrueLayer 's MLOps Pipeline
0:56:17
[📺 ] #11 Machine Learning at scale in Mercado Libre with Carlos de la Torre
0:59:28
[📺 ] MLOps #14: Kubeflow vs MLflow with Byron Allen
0:54:57
[📺 ] MLOps #15 - Scaling Human in the Loop Machine Learning with Robert Munro
0:55:04
[📺 ] MLOps #18 // Nubank - Running a fintech on ML
0:53:19
[📺 ] Feature Stores: An essential part of the Machine Learning stack to build great data / Kevin Stumpf - CTO at Tecton
1:05:46
[📺 ] MLOps #31 Path to Production and Monetizing Machine Learning // Vin Vashishta - Data Scientist
0:56:35
[📺 ] MLOps #35: Streaming Machine Learning with Apache Kafka and Tiered Storage // Kai Waehner, Confluent
0:52:50
[📺 ] Luigi in Production // MLOps Coffee Sessions #18 // Luigi Patruno - Machine Learning in Production
0:47:23
[📺 ] The Current MLOps Landscape // Nathan Benaich & Timothy Chen // MLOps Meetup #43
0:58:31
- [ ] Stanford MLSys Seminar Episode 0: Machine Learning + Systems
0:11:49
- [ ] Stanford MLSys Seminar Episode 1: Marco Tulio Ribeiro
1:00:38
- [ ] Stanford MLSys Seminar Episode 2: Matei Zaharia
0:59:44
- [ ] Stanford MLSys Seminar Episode 3: Virginia Smith
1:00:55
- [ ] Stanford MLSys Seminar Episode 4: Alex Ratner
1:13:34
- [ ] Stanford MLSys Seminar Episode 5: Chip Huyen
1:06:44
[📺 ] Xavier Amatriain on Practical Deep Learning Systems (Full Stack Deep Learning - November 2019)
[📺 ] How to avoid a single point of failure in distributed systems
[📺 ] How to start with distributed systems? Beginner's guide to scaling systems.
[📺 ] What is a microservice architecture and its advantages?
[📺 ] Whatsapp System Design: Chat Messaging Systems for Interviews
[📺 ] Capacity Estimation: How much data does YouTube store daily?
[📺 ] How Netflix onboards new content: Video Processing at scale
[📺 ] Distributed Consensus and Data Replication strategies on the server
[📺 ] System design : Design Autocomplete or Typeahead Suggestions for Google search
Be able to serve Machine Learning models
[📰] Deploy a Keras Deep Learning Project to Production with Flask
[📰] Selecting gunicorn worker types for different python web applications.
[📰] What Does it Mean to Deploy a Machine Learning Model? (Deployment Series: Guide 01)
[📰] Software Interfaces for Machine Learning Deployment (Deployment Series: Guide 02)
[📰] Batch Inference for Machine Learning Deployment (Deployment Series: Guide 03)
[📰] The Challenges of Online Inference (Deployment Series: Guide 04)
[📰] Online Inference for Machine Learning Deployment (Deployment Series: Guide 05)
[📰] Model Registries for Machine Learning Deployment (Deployment Series: Guide 06)
[📰] 5 Challenges to Running Machine Learning Systems in Production
[📰] Enabling Machine-Learning-as-a-Service Through Privacy Preserving Machine Learning
- [ ] Cortex Blog
- [ ] Server-side batching: Scaling inference throughput in machine learning
- [ ] How we served 1,000 models on GPUs for $0.47
- [ ] Designing a machine learning platform for both data scientists and engineers
- [ ] How to serve batch predictions with TensorFlow Serving
- [ ] How to deploy Transformer models for language tasks
- [ ] How we scale machine learning model deployment on Kubernetes
- [ ] Why we built a serverless machine learning platform—instead of using AWS Lambda
- [ ] Why we don’t deploy machine learning models with Flask
- [ ] How to deploy machine learning models from a notebook to production
- [ ] A/B testing machine learning models in production
- [ ] How to deploy 1,000 models on one CPU with TensorFlow Serving
- [ ] How to reduce the cost of machine learning inference
- [ ] Improve NLP inference throughput 40x with ONNX and Hugging Face
- [ ] How to deploy PyTorch Lightning models to production
[🅤]ꭏ HTTP & Web Servers
[📺 ] PyConBY 2020: Sebastian Ramirez - Serve Machine Learning models easily with FastAPI
[📺 ] Python pydantic Introduction – Give your data classes super powers
[📺 ] PyData Vancouver meetup: cortex.dev : Serving machine learning models in production
[📺 ] Lecture 11A: Deploying Machine Learning Models (Full Stack Deep Learning - Spring 2021)
0:53:25
[📺 ] Hands-on serving models using KFserving // Theofilos Papapanagiotou // MLOps Meetup #40
0:57:40
[📺 ] Shawn Scully: Production and Beyond: Deploying and Managing Machine Learning Models
Be able to setup batch inference
[📰] Distill: Why do we need Flask, Celery, and Redis? (with McDonalds in Between)
[📰] Celery: an overview of the architecture and how it works
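The Flask + Celery + Redis pattern from the articles above boils down to: a broker (queue) decouples the web process from workers that run slow inference jobs. The same shape can be sketched in-process with stdlib queues and threads (`predict` here stands in for any slow model call):

```python
import queue
import threading

# Worker loop: pull (job_id, payload) pairs off the broker queue,
# run inference, record the result. A (None, None) sentinel stops it.
def worker(tasks, results, predict):
    while True:
        job_id, payload = tasks.get()
        if job_id is None:
            break
        results[job_id] = predict(payload)
        tasks.task_done()

# Enqueue a batch, fan it out over n_workers, and collect results in order.
def run_batch(payloads, predict, n_workers=2):
    tasks, results = queue.Queue(), {}
    threads = [threading.Thread(target=worker, args=(tasks, results, predict))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for job_id, payload in enumerate(payloads):
        tasks.put((job_id, payload))
    for _ in threads:
        tasks.put((None, None))   # one sentinel per worker
    for t in threads:
        t.join()
    return [results[i] for i in range(len(payloads))]
```

Celery replaces the in-process queue with a real broker (Redis, RabbitMQ) so workers can live on other machines and survive web-process restarts.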
Be able to build interactive UI for models
Be able to use Docker for containerization
[📰] Pass Docker Environment Variables During The Image Build
[📰] Setting Default Docker Environment Variables During Image Build
[📰] How Docker Can Help You Become A More Effective Data Scientist
[📰] Deploying conda environments in (Docker) containers - how to do it right
[📰] A Beginner-Friendly Introduction to Containers, VMs and Docker
[📰] Using Docker to Generate Machine Learning Predictions in Real Time
[📰] Connection refused? Docker networking and how it impacts your image
[📰] Where’s your code? Debugging ImportError and ModuleNotFoundErrors in your Docker image
[📰] A tableau of crimes and misfortunes: the ever-useful docker history
[📰] Broken by default: why you should avoid most Dockerfile examples
[📰] A review of the official Dockerfile best practices: good, bad, and insecure
[📰] The best Docker base image for your Python application (February 2021)
[📰] Building on solid ground: ensuring reproducible Docker builds for Python
[📰] Less capabilities, more security: minimizing privilege escalation in Docker
[📰] Security scanners for Python and Docker: from code to dependencies
[📰] Speed up pip downloads in Docker with BuildKit’s new caching
[📰] Multi-stage builds #2: Python specifics—virtualenv, –user, and other methods
[📰] Multi-stage builds #3: Why your build is surprisingly slow, and how to speed it up
[📰] What’s running in production? Making your Docker images identifiable
[📰] Docker BuildKit: faster builds, new features, and now it’s stable
[📰] Docker vs. Singularity for data processing: UIDs and filesystem access
[📺 ] Docker
[📺 ] Why Your Web Server Should Log to Stdout (Especially with Docker)
Be able to use Cloud
[📰] Getting started with large-scale ETL jobs using Dask and AWS EMR
[ ] Pluralsight: AWS Networking Deep Dive: Virtual Private Cloud (VPC)
[ ] Pluralsight: Building Applications Using Elastic Beanstalk
[ ] Whitepaper: Architecting for the Cloud AWS Best Practices
[ ] Whitepaper: Optimizing Enterprise Economics with Serverless Architectures
[ ] Whitepaper: Practicing Continuous Integration and Continuous Delivery on AWS
Be familiar with serverless architecture
Be able to monitor Machine Learning models
[📰] Production Machine Learning Monitoring: Outliers, Drift, Explainers & Statistical Performance
[📰] The Playbook to Monitor Your Model’s Performance in Production
[📰] Lessons Learned from 15 Years of Monitoring Machine Learning in Production
[📰] Using Statistical Distances for Machine Learning Observability
[📺 ] Instrumentation, Observability & Monitoring of Machine Learning Models
[📰] Machine Learning Infrastructure Tools — Machine Learning Observability
[📺 ] MLOps #24 Monitoring the Machine Learning stack // Lina Weichbrodt
0:55:32
[📺 ] Josh Wills: Visibility and Monitoring for Machine Learning Models
[📺 ] Lecture 11B: Monitoring Machine Learning Models (Full Stack Deep Learning - Spring 2021)
0:36:55
[📺 ] OpML '20 - How Machine Learning Breaks: A Decade of Outages for One Large Machine Learning Pipeline
[📺 ] MLOps #28 Machine Learning Observability // Aparna Dhinakaran - Chief Product Officer at Arize AI
0:55:04
[📺 ] MLOps #29 Continuous Evaluation & Model Experimentation // Danny Ma - Founder of Sydney Data Science
1:00:46
[📺 ] SE4AI: Quality Assessment in Production
1:18:45
[📺 ] SE4AI: Infrastructure Quality, Deployment and Operations
1:04:54
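The drift-monitoring ideas above (statistical distances, outlier detection) can be illustrated with a deliberately crude check: flag a feature whose production mean has moved more than `k` reference standard deviations from its training-time mean. This is a sketch, not a substitute for proper distance metrics like PSI or KS tests:

```python
import statistics

# Flag drift when the production mean of a feature shifts more than
# k reference standard deviations away from the training-time mean.
def mean_drift(reference, production, k=3.0):
    mu = statistics.mean(reference)
    sigma = statistics.stdev(reference)
    shift = abs(statistics.mean(production) - mu)
    return shift > k * sigma
```

Real monitoring systems compare full distributions per time window, but every one of them starts from this "compare production against a reference" framing.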
Be able to perform load testing
Be able to perform A/B testing
[📰] A/B Testing Machine Learning Models (Deployment Series: Guide 08)
[🅤]ꭏ A/B Testing
[📺 ] Hypothesis testing with Applications in Data Science
0:10:33
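The hypothesis-testing machinery behind an A/B test on two model variants is the two-proportion z-test: are the conversion rates of variant A and variant B significantly different? A minimal sketch:

```python
import math

# Two-proportion z-test: conv_a successes out of n_a trials for variant A,
# conv_b out of n_b for variant B. Returns the z statistic; |z| > 1.96
# corresponds to p < 0.05 for a two-sided test.
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```

In practice you would fix the sample size in advance (power analysis) rather than peeking at z as data arrives.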
Be proficient in Python
[📰] Speeding Up Python with Concurrency, Parallelism, and asyncio
[📰] A Python prompt into a running process: debugging with Manhole
[📖] A Byte of Python
[📖] Python 201
[ ] Calmcode: ray
[📺 ] Python 3 Programming Tutorial - Regular Expressions / Regex with re
[📺 ] Python Tutorial: re Module - How to Write and Match Regular Expressions (Regex)
[📺 ] Aaron Richter- Parallel Processing in Python| PyData Global 2020
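The `re` workflow the regex videos above walk through, in brief: compile a pattern once, then match against it with named groups. A small sketch with a made-up log-line pattern:

```python
import re

# Compile once; named groups make the match result self-documenting.
LOG_LINE = re.compile(r"(?P<level>INFO|ERROR)\s+(?P<msg>.+)")

def parse_log(line):
    m = LOG_LINE.search(line)
    return (m.group("level"), m.group("msg")) if m else None
```

`re.findall` and `re.sub` follow the same pattern-first shape for extraction and substitution.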
Have a general understanding of other parts of the stack
[📖] Refactoring UI
[ ] Pluralsight: CSS: Specificity, the Box Model, and Best Practices
[ ] Treehouse: HTML
[🅤]ꭏ Intro to Javascript
[🅤]ꭏ Object Oriented JS 1
[🅤]ꭏ Object Oriented JS 2
Be familiar with fundamental Computer Science concepts
[📰] Asymptotic Analysis Explained with Pokémon: A Deep Dive into Complexity Analysis
[🅤]ꭏ Intro to Algorithms
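Asymptotics in one concrete contrast, as the Pokémon article above quantifies: a linear scan does up to n comparisons (O(n)), while binary search on sorted data does about log₂(n) (O(log n)):

```python
from bisect import bisect_left

# O(n): check items one by one until the target is found.
def linear_search(items, target):
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

# O(log n): halve the search interval each step (requires sorted input).
def binary_search(sorted_items, target):
    i = bisect_left(sorted_items, target)
    return i if i < len(sorted_items) and sorted_items[i] == target else -1
```

At n = 1,000,000 that is a million comparisons versus about twenty, which is the whole point of complexity analysis.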
Be able to apply proper software engineering process
[ ] Pluralsight: Security Awareness: Basic Concepts and Terminology
[ ] Pluralsight: Clean Architecture: Patterns, Practices, and Principles
[🅤]ꭏ Product Design
[🅤]ꭏ Rapid Prototyping
Be able to efficiently use a text editor
[📰] How To Use Visual Studio Code for Remote Development via the Remote-SSH Plugin
[📺 ] What every GitHub user should know about VS Code - GitHub Satellite 2020
[⚙] and GitHub
Be able to communicate and collaborate well
[📖] Leaders Eat Last: Why Some Teams Pull Together and Others Don't
[📺 ] Building a psychologically safe workplace | Amy Edmondson | TEDxHGSE
Be familiar with the hiring pipeline
[📰] Salesforce Data Science Interview— Acing the AI Interview
[📰] Data Science Interview Questions and Solutions — Linear and Logistic regression
[💻] Practicing Machine Learning Interview Questions in Python
[🅤]ꭏ Optimize your GitHub
[🅤]ꭏ Refresh Your Resume
[🅤]ꭏ Technical Interview
[📺 ] Panel Discussion: Do I need a PhD to work in ML? (Full Stack Deep Learning - Spring 2021)
[📺 ] The Importance of Writing in a Tech Career - Eugene Yan
[📺 ] How to prepare for Machine Learning interviews- Part 1 | Applied AI Course
[📺 ] How to prepare for Machine Learning interviews- Part 2 | Applied AI Course
[📺 ] Guest Lecture - Chip Huyen - Machine Learning Interviews - Full Stack Deep Learning
Broaden Perspective
[📖] Atomic Habits
[📖] Deep Work
[📖] The 10X Rule
[📺 ] Why specializing early doesn't always mean career success | David Epstein
[📺 ] Chamath Palihapitiya, Founder and CEO Social Capital, on Money as an Instrument of Change
[📺 ] How to Use Twitter
[📺 ] A. Jesse Jiryu Davis - Write an Excellent Programming Blog - PyCon 2016
[📺 ] The Great Machine Learning Stagnation (Mark Saroufim and Dr. Mathew Salvaris)
[📺 ] What Machine Learning Can Teach Us About Life: 7 Lessons - Talk Python Live Stream
[📺 ] Ross Tuck - Things I Believe Now That I'm Old at Laracon EU 2014