High Performance Time Series
Welcome to High Performance Time Series!
High-Performance Time Series - Become the Time Series Expert for Your Organization (2:34)
Private Slack Channel - How to Join
Video Subtitles (Captions)
What is a High-Performance Forecasting System?
[IMPORTANT] System Requirements - R + Python Requirements & Common Issues
Would You Like To Become An Affiliate (And Earn 20% On Your Sales)?
Prerequisites
Prerequisite - Data Science for Business Part 1
Getting Help
Getting Help (IMPORTANT!!!)
Module 0 - Introduction to High-Performance Forecasting
High-Performance Forecasting - What You're Learning, Why You're Learning It (0:43)
0.1 Forecast Competition Review
The Forecasting Competition Review & Course Progression (3:34)
2014 Kaggle Walmart Recruiting Challenge (5:11)
2018 M4 Competition (3:37)
2018 Kaggle Wikipedia Website Traffic Forecasting Competition (4:30)
2020 M5 Competition (5:59)
5 Key Takeaways from the Forecast Competition Review (5:41)
0.2 Course Projects - Google Analytics, Email Subscribers, & Sales Forecasting
The Business Case - Developing a Best-in-Class Forecasting System (3:03)
0.3 What Tools are in Your Toolbox?
Timetk: Time Series Data Preparation, Visualization, & Preprocessing (5:54)
Modeltime: Time Series Machine Learning (5:25)
GluonTS: Time Series Deep Learning (2:01)
[Cheat Sheet] Forecasting Workflow
Module 01 - Time Series Jumpstart
Time Series Jumpstart (0:54)
1.1 Time Series Project Setup
Project Setup (2:28)
Course Data (File Download) (1:02)
R Package Installation - Part 1 (File Download) (5:26)
R Package Installation - Part 2 (5:14)
Jumpstart Setup (File Download) (0:44)
1.2 Business Understanding & Dataset Terminology
Establish Relationships, Part 1 - Google Analytics Summary Dataset (4:11)
Establish Relationships, Part 2 - Google Analytics Top 20 Pages (5:23)
Build Relationships - Mailchimp & Learning Lab Events (4:49)
Generate Course Revenue - Transaction Revenue & Product Events (3:03)
Code Checkpoint (File Download) (0:54)
1.3 TS Jumpstart: Dive into Forecasting Email Subscribers!
Read This! - Time Series Jumpstart Intent
Time Series Jumpstart - Setup (File Download) (3:20)
Libraries & Data (3:13)
1.3.1 Exploratory Data Analysis for Time Series
EDA for Time Series (1:08)
Summarize By Time (5:46)
Time Series Summary Diagnostics (4:47)
Pad by Time (4:08)
Visualize the Time Series (3:12)
1.3.2 Evaluation & Train/Test Windows
Evaluation Window - Filter By Time (4:43)
Time Series Train/Test Split (4:53)
1.3.3 Forecasting with Prophet
Training a Prophet Model with Modeltime (4:21)
Modeltime Forecasting Workflow - Round 1 (7:43)
1.3.4 Forecasting with Feature Engineering
Visualizing Seasonality (4:34)
Feature Engineering - Part 1 (5:45)
Feature Engineering - Part 2 (5:51)
Machine Learning with Workflows (3:35)
Modeltime Forecasting Workflow - Round 2 (5:59)
1.3.5 Recap & Code Checkpoint - Module 01 - TS Jumpstart
Here's where you are going. (3:11)
Code Checkpoint (File Download)
[Part 1] Time Series with Timetk
Welcome to Part 1 - Time Series with Timetk! (2:17)
Module 02 - Time Series Visualization
Setup (File Download) & Overview - Visualization (2:11)
Data Preparation - Part 1 (4:29)
Data Preparation - Part 2 (3:23)
2.1 Time Series Plots [MUST KNOW FUNCTION]
[MUST KNOW] Plotting Time Series (5:31)
Plotting with Transformations (4:37)
Adjusting the Smoother (6:11)
Smoother for Groups (1:54)
Interactive & Static Plots (2:00)
2.2 Autocorrelation Plots
ACF & PACF Concepts - Autocorrelation & Partial Autocorrelation
ACF & PACF Plotting (7:49)
Lag Adjustment (1:24)
CCF Plotting - Cross Correlations (7:58)
2.3 Seasonality Plots
Seasonality Box Plot (5:52)
Seasonality Violin Plot (0:53)
2.4 Anomaly Plots
Anomaly Plot Basics (4:50)
Getting the Anomaly Data (2:00)
Working with Grouped Data (1:43)
2.5 STL Decomposition & Regression Plots
STL Decomposition Plot (4:44)
STL Decomposition - Grouped Time Series (2:11)
2.6 Regression Plots [SECRET WEAPON FOR FEATURE ENGINEERING]
[SECRET WEAPON] Time Series Regression Plot (7:08)
Time Series Regression Plot - Grouped Time Series (4:05)
2.7 Code Checkpoint - Module 02 - Visualization
Code Checkpoint (File Download)
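For quick reference between lessons, here is a minimal sketch of the timetk plotting functions this module covers. The tibble google_analytics_tbl and its columns (date, value, id) are placeholder names, not the course files:

```r
library(dplyr)
library(lubridate)
library(timetk)

# Placeholder grouped tibble with columns `id`, `date`, `value`
google_analytics_tbl %>%
  group_by(id) %>%
  plot_time_series(date, value, .smooth_period = "3 months", .interactive = TRUE)

google_analytics_tbl %>% plot_acf_diagnostics(date, value, .lags = 100)  # ACF / PACF
google_analytics_tbl %>% plot_seasonal_diagnostics(date, value)          # seasonality box plots
google_analytics_tbl %>% plot_anomaly_diagnostics(date, value)           # anomaly detection plot
google_analytics_tbl %>% plot_stl_diagnostics(date, value)               # STL decomposition

# Time series regression plot (illustrative formula)
google_analytics_tbl %>%
  plot_time_series_regression(date, value ~ as.numeric(date) + month(date, label = TRUE))
```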
Module 03 - Time Series Data Wrangling
Setup (File Download) & Overview - Data Wrangling (2:34)
3.1 Summarise By Time [MUST KNOW]
Single & Grouped Time Series Summarizations (4:37)
Using Across (to Summarize Wide-Format Tibbles by Time) (5:11)
Weekly/Monthly/Quarterly/Yearly Aggregations (3:33)
Floor, Ceiling, Round (5:15)
3.2 Pad by Time
Filling in Gaps (2:54)
From Low-Frequency to High-Frequency (3:36)
3.3 Filter By Time
Zooming & Slicing (5:14)
Offsetting by Time (2:01)
3.4 Mutate By Time
Extrapolate the Mean, Median, Max, Min By Time (7:57)
3.5 Joining By Time
Combining Subscribers & Web Traffic (3:48)
Inspecting the Join (3:00)
Formatting the Join for Feature Relationships (5:49)
Join Cross Correlations (3:22)
3.6 Time Series Index Operations
Making a Time Series (4:39)
Making a Holiday Sequence (3:14)
Time Offsets (3:01)
Making a Future Time Series (3:12)
3.7 Forecasting with Future Frames
The Future Frame (2:47)
[FORECAST SPOTLIGHT] Forecasting with the Future Frame (6:53)
3.8 Code Checkpoint - Module 03 - Data Wrangling
Code Checkpoint (File Download)
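A minimal sketch of the data-wrangling verbs from this module (summarise_by_time(), pad_by_time(), filter_by_time(), future_frame()). The subscribers_tbl tibble, its columns, and the window sizes are placeholders:

```r
library(dplyr)
library(timetk)

# Placeholder event-log tibble with a date-time column `optin_time`
subscribers_daily_tbl <- subscribers_tbl %>%
  summarise_by_time(.date_var = optin_time, .by = "day", optins = n()) %>%
  pad_by_time(.date_var = optin_time, .by = "day", .pad_value = 0)

# Zoom into a window, then generate the future dates to forecast over
subscribers_daily_tbl %>%
  filter_by_time(.date_var = optin_time, .start_date = "2019", .end_date = "end") %>%
  future_frame(.date_var = optin_time, .length_out = "8 weeks")
```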
Module 04 - Transformations for Time Series
Setup (File Download) & Overview - Transformations (2:15)
Libraries & Data (2:12)
4.1 Variance Reduction Transformations - Log & Box Cox [MUST KNOW]
Why is Variance Reduction Important? (4:43)
Log - Log (and Log1P) Transformation (4:17)
Log - Assessing the Benefit of Log1P Transformation (2:51)
Log - Groups & Inversion (3:43)
Box Cox - What is the Box Cox Transformation? (2:34)
Box Cox - Assessing the Benefit (4:04)
Box Cox - Inversion (2:05)
Box Cox - Managing Grouped Transformations & Inversion (8:36)
4.2 Rolling & Smoothing Transformations
Introduction to Rolling & Smoothing (1:49)
Rolling Windows - What is a Moving Average? (File Download) (3:53)
Rolling Windows - Moving Average & Median Applied (8:53)
Loess Smoother (7:02)
Rolling Correlation - Slidify, Part 1 (4:16)
Rolling Correlation - Slidify, Part 2 (7:40)
[BUSINESS SPOTLIGHT] The Problem with Forecasting using a Moving Average (6:43)
4.3 Range Reduction Transformations
Introduction to Normalization & Standardization (0:59)
What is Normalization? [Min = 0, Max = 1] (4:50)
What is Standardization? [Mean = 0, Standard Deviation = 1] (2:31)
4.4 Imputation & Outlier Cleaning
Introduction to Imputation & Outlier Cleaning (0:44)
Imputation - Time Series NA Repair (6:40)
Anomalies - Time Series Outlier Cleaning (7:22)
Anomalies - When to Remove Outliers (5:21)
4.5 Lags & Differencing Transformations [MUST KNOW]
Introduction to Lags & Differencing (1:08)
Lags - What is a Lag? (1:49)
Lags - Lag Detection with ACF/PACF (3:54)
Lags - Regression with Lags (5:06)
Differencing - Growth vs Change (4:00)
Differencing - Acceleration (6:22)
Differencing - Comparing Multiple Time Series (4:44)
Differencing - Inversion (0:57)
4.6 Fourier Series [MUST KNOW]
Introduction to the Fourier Series (7:23)
Fourier Regression (4:24)
4.7 Constrained Interval Forecasting [FORECAST SPOTLIGHT]
What is the Log Interval Transformation? (5:47)
Visualizing the Transformation (4:12)
Transformations & Preprocessing (5:09)
Modeling (6:29)
Preparing Future Data (3:36)
Making Predictions (1:05)
Combining the Forecast Data (4:08)
Estimating Confidence Intervals (8:24)
Visualizing Confidence Intervals (2:10)
Inverting the Log Interval Transformation (4:08)
4.8 Code Checkpoint - Module 04 - Transformations
Code Checkpoint (File Download)
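A minimal sketch of the variance-reduction and range-reduction transformations from this module, using timetk's vectorized helpers. The vector x and the lambda value are illustrative, not the course data:

```r
library(timetk)

# Placeholder numeric vector of daily opt-ins (zeros present)
x <- subscribers_daily_tbl$optins

# Log1P handles zeros; invert with expm1()
x_log  <- log1p(x)
x_back <- expm1(x_log)

# Box Cox: transform with a chosen lambda, invert with the same lambda
x_bc     <- box_cox_vec(x + 1, lambda = 0.25)
x_bc_inv <- box_cox_inv_vec(x_bc, lambda = 0.25) - 1

# Standardize (mean 0, sd 1); keep the mean & sd so forecasts can be inverted later
x_std     <- standardize_vec(x_log, mean = mean(x_log), sd = sd(x_log))
x_std_inv <- standardize_inv_vec(x_std, mean = mean(x_log), sd = sd(x_log))
```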
Challenge #1 - Exploring Transactions & Web Page Traffic
Challenge #1 Discussion (File Download) (4:21)
Solution - Part 1 (File Download) (7:18)
Solution - Part 2: Begins at "Identify Relationships" (7:51)
Module 05 - Introduction to Feature Engineering (for Time Series)
Setup (File Download) & Overview - Intro to Feature Engineering (2:30)
Data Prep, Part 1 - Log Standardize (5:27)
Data Prep, Part 2 - Getting Ready to Clean (5:01)
Data Prep, Part 3 - Targeted Cleaning with Between Time (4:18)
5.1 Time-Based Features (Trend & Seasonal/Calendar) [MUST KNOW]
The Time Series Signature (7:55)
Feature Removal (3:28)
Linear Trend (2:10)
Non-Linear Trend - Basis Splines (4:41)
Non-Linear Trend - Natural Splines (Stiffer than Basis Splines) (4:29)
Seasonal Features - Weekday & Month (3:21)
Seasonal Features - Combining with Trend (5:23)
5.2 Interactions
Interaction Features - Spikes Every Other Wednesday (7:35)
5.3 Fourier Features
Selecting & Adding Fourier Frequency Features (4:21)
Modeling & Visualizing the Fourier Effects (2:07)
5.4 Autocorrelated Lag Features
Selecting & Adding Lag Features (6:59)
Modeling & Visualizing the Lag Effects (5:20)
5.5 Special Event Features
Preparing Event Data for Analysis (6:34)
Visualizing Events (2:57)
Modeling & Visualizing Event Effects (2:08)
Fixing the Spline (2:07)
5.6 External Regressors (Xregs)
Transforming Xregs (5:05)
Joining Xregs (1:49)
Examining Cross Correlations (1:53)
Modeling with Xregs (3:28)
Visualizing PageViews vs Optins & Modeling Lags (6:58)
5.7 Recommended Model Features
Collecting the Recommended Model (3:44)
Saving the Model Artifact (2:28)
5.8 Code Checkpoint - Module 05 - Introduction to Feature Engineering
Code Checkpoint (File Download)
Module 06 - Advanced Feature Engineering Workflow
Forecasting Workflow [CHEAT SHEET] (3:40)
Setup (File Download) & Overview - Advanced Feature Engineering (1:43)
Data Preparation (4:42)
6.1 Creating the "Full" Dataset - Extending & Adding Lagged Features & Events [IMPORTANT]
The "Full" Dataset (2:50)
Extending - Future Frame (3:21)
Adding Lag Features (4:02)
Add Lagged Rolling Features (5:03)
Add Events (External Regressors) (2:57)
Format Column Names (3:09)
6.2 Separate into Modeling Data & Forecast Data
Data Prepared / Future Data Split (2:48)
6.3 Separate into Training Data & Testing Data
Train / Test Split (3:55)
6.4 Recipes - Feature Engineering Pipeline Steps
Recipes Intro (2:41)
Step - Time Series Signature Features (5:48)
Step - Feature Removal (3:10)
Step - Standardization (2:11)
Step - One-Hot Encoding (1:55)
Step - Interaction Features (2:28)
Step - Fourier Series Features (2:03)
6.5 Building the Spline Model
Model Spec: LM Model (1:02)
Recipe Spec: Spline Features (5:59)
Workflow: Spline Recipe + LM Model (2:49)
6.6 Introduction to Modeltime Workflow
Modeltime Table & Calibration (2:08)
Forecasting the Test Data (2:40)
Measuring the Test Accuracy (1:19)
Comparing the Training & Testing Accuracy (1:32)
6.7 Building the Lag Model
Recipe Spec: Lag Features (3:00)
Workflow: Lag Recipe + LM Model (2:40)
Modeltime: Comparing Spline & Lag Models (4:23)
6.8 Forecasting the Future
Refitting the Models (4:37)
Transformation Inversion (5:23)
Visualizing the Forecast in the Original Scale (1:59)
Overfitting (An Optional Fix)
6.9 Saving the Artifacts
Creating an Artifact List, Part 1 (4:34)
Creating an Artifact List, Part 2 (3:11)
Organizing the Artifacts List (1:57)
Saving the Artifacts (1:28)
6.10 Code Checkpoint - Module 06 - Advanced Feature Engineering
Code Checkpoint (File Download)
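A minimal sketch of the recipe-plus-workflow pattern this module builds. The column names, the matches() patterns, and the splits object (from time_series_split()) are illustrative placeholders:

```r
library(tidymodels)
library(timetk)

# `splits` is a placeholder time_series_split() of the prepared data
recipe_spec_base <- recipe(optins_trans ~ ., data = training(splits)) %>%
  step_timeseries_signature(optin_time) %>%
  step_rm(matches("(iso)|(xts)|(hour)|(minute)|(second)|(am.pm)")) %>%
  step_normalize(matches("(index.num)|(year)|(yday)")) %>%
  step_dummy(all_nominal(), one_hot = TRUE) %>%
  step_fourier(optin_time, period = c(7, 14, 30, 90, 365), K = 2)

# Bundle a model spec with the recipe and fit on the training split
workflow_fit_lm <- workflow() %>%
  add_model(linear_reg() %>% set_engine("lm")) %>%
  add_recipe(recipe_spec_base) %>%
  fit(training(splits))
```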
Challenge #2 - Feature Engineering & Modeltime Workflow [YOU'VE GOT THIS!]
Challenge Discussion, Part 1 (File Download) - Feature Preparation (5:11)
Challenge Discussion, Part 2 - Feature Engineering & Modeling (4:56)
Challenge #2 - Solution
Solution, Part 1 (File Download) - Collect & Prepare Data (3:49)
Solution, Part 2 - Visualizations (3:19)
Solution, Part 3A - Create Full Dataset (5:46)
Solution, Part 3B - Visualize the Full Dataset (3:47)
Solution, Part 4 - Model/Forecast Data Split (1:05)
Solution, Part 5 - Train/Test Data Split (0:56)
Solution, Part 6 - Feature Engineering (4:18)
Solution, Part 7 - Modeling: Spline Model (6:08)
Solution, Part 8 - Modeling: Lag Model (2:25)
Solution, Part 9 - Modeltime (4:03)
Solution, Part 10 - Forecast (6:49)
Challenge #2 Bonus - Regularization
Regularization, Part 1 (File Download) - Model: GLMnet (4:01)
Regularization, Part 2 - Improving the Lag Model with GLMNet (5:28)
Regularization, Part 3 - Forecasting the Future Data with GLMNet + Lag Recipe (3:02)
Part 1 Complete - You rock!
WOOO HOOO - You crushed it!
[Part 2] Machine Learning for Time Series with Modeltime
Picking Up From Part 1 (Project Download)
Module 07 - Modeltime Workflow [DEEP DIVE]
Setup - Modeltime Workflow [In-Depth] (1:25)
Overview - Modeltime Workflow [In-Depth] (1:16)
Libraries & Artifacts Preparation (2:33)
7.1 Making Models - Object Types & Requirements
Model Requirements for Modeltime (1:34)
Parsnip Object Models - Univariate (3:37)
Workflow Objects - Multivariate, Date-Based Features (7:14)
Workflow Object - Multivariate, External Features (4:53)
7.2 Modeltime Table
Modeltime Table - Key Requirements (4:27)
7.3 Calibration Table
Calibration Table - How It Works (3:29)
7.4 Measuring Model Accuracy [IMPORTANT!!!]
Primary Accuracy Metrics & Uses [SUPER IMPORTANT] (7:40)
Custom Metric Sets using Yardstick (3:54)
Customizing the Accuracy Table Output (3:28)
7.5 Forecasting the Test Data
Modeltime Forecast - How It Works (6:22)
Customizing the Forecast Visualization (5:00)
7.6 Model Refitting & Forecasting
Refitting - How It Works (3:02)
Making the Forecast (5:20)
7.7 Code Checkpoint - Module 07A - Modeltime Workflow [In-Depth]
Code Checkpoint (File Download)
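A minimal sketch of the core Modeltime workflow covered in Module 07: table, calibrate, accuracy, forecast, refit. The data_prepared_tbl tibble and the two fitted models are placeholders carried over from earlier sketches:

```r
library(tidymodels)
library(modeltime)
library(timetk)

# Placeholder prepared data with a date column; models fitted in earlier sketches
splits <- time_series_split(data_prepared_tbl, assess = "8 weeks", cumulative = TRUE)

calibration_tbl <- modeltime_table(workflow_fit_lm, model_fit_prophet) %>%
  modeltime_calibrate(new_data = testing(splits))

calibration_tbl %>% modeltime_accuracy()          # test-set accuracy metrics

calibration_tbl %>%                               # test-set forecast
  modeltime_forecast(new_data = testing(splits), actual_data = data_prepared_tbl) %>%
  plot_modeltime_forecast()

calibration_tbl %>%                               # refit on full data & forecast forward
  modeltime_refit(data = data_prepared_tbl) %>%
  modeltime_forecast(h = "8 weeks", actual_data = data_prepared_tbl) %>%
  plot_modeltime_forecast()
```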
7.8 New Features of Modeltime 0.1.0 - Module 07B
Setup (File Download) - Modeltime New Features (1:53)
Expedited Forecasting - Modeltime Table (5:20)
Expedited Forecasting - Skip Straight to Forecasting (2:20)
Visualizing a Fitted Model (2:57)
Calibration - In-Sample vs Out-of-Sample Accuracy (5:25)
Residual Diagnostics - Getting Residuals (2:16)
Residuals - Time Plot (2:39)
Residuals - Plot Customization (2:29)
Residuals - ACF Plot (4:06)
Residuals - Seasonality Plot (3:50)
7.9 Code Checkpoint - Module 07B - Modeltime New Features!
Code Checkpoint (File Download)
Module 08 - ARIMA
Setup (File Download) (0:40)
ARIMA Training Overview (1:29)
Libraries & Artifacts Setup (1:49)
8.1 ARIMA Concepts
Auto-Regressive Functions: ar() & arima() (5:15)
Auto-Regressive (AR) Modeling with Linear Regression (3:11)
Single-Step Forecast for AR Models (4:43)
Multi-Step Recursive Forecasting for AR Models (4:44)
Integration (Differencing) (5:42)
Moving Average (MA) Process (Error Modeling) (7:36)
Seasonal ARIMA (SARIMA) (4:29)
Adding XREGS (SARIMAX) (4:44)
8.2 ARIMA in Modeltime
Setting Up Basic ARIMA in Modeltime (4:45)
Trying Different ARIMA Parameters (5:11)
About AIC (Akaike Information Criterion) (3:42)
8.3 Modeltime Auto ARIMA
Implementing Auto ARIMA in Modeltime (1:49)
How Auto ARIMA Works - Lazy Grid Search (1:27)
Comparing ARIMA & Auto ARIMA (3:15)
Adding Fourier Features to Pick Up More than 1 Seasonality (3:49)
Adding Event Features to Improve R-Squared (Variance Explained) (1:33)
Refitting & Reviewing the Forecast (2:57)
Adding Month Features to Account for February Increase - BEST MAE 0.564 (3:35)
8.4 Recap - ARIMA
ARIMA Strengths & Weaknesses (and Strategies that Worked) (3:56)
Saving Artifacts - Best ARIMA Model (3:28)
8.5 Code Checkpoint - Module 08 - ARIMA
Code Checkpoint (File Download)
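A minimal sketch of ARIMA modeling with modeltime's arima_reg(). The column names, Fourier period, and SARIMA orders are illustrative, not the tuned values from the lessons:

```r
library(tidymodels)
library(modeltime)
library(timetk)

# Auto ARIMA; a Fourier term is added through the formula (placeholder columns)
model_fit_auto_arima <- arima_reg() %>%
  set_engine("auto_arima") %>%
  fit(optins_trans ~ optin_time + fourier_vec(optin_time, period = 90),
      data = training(splits))

# Manually specified SARIMA(2,1,2)(1,0,1)[7]
model_fit_sarima <- arima_reg(
  seasonal_period = 7,
  non_seasonal_ar = 2, non_seasonal_differences = 1, non_seasonal_ma = 2,
  seasonal_ar     = 1, seasonal_differences     = 0, seasonal_ma     = 1
) %>%
  set_engine("arima") %>%
  fit(optins_trans ~ optin_time, data = training(splits))
```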
Module 09 - Prophet
Setup (File Download) (0:27)
Prophet Training Overview (0:51)
Libraries & Artifacts (2:02)
9.1 Prophet with Modeltime
Prophet Regression: prophet_reg() (3:23)
Modeltime Workflow (2:02)
Adjusting the Key Prophet Parameters (5:13)
9.2 Prophet Concepts
Extracting the Prophet Model from Modeltime (3:11)
Visualizing the Effect of Key Parameters on the Prophet Model (5:48)
Understanding Prophet Components & Additive Model (2:37)
9.3 Back to Modeling with Prophet - XREGS!
Fitting Prophet w/ Events (2:19)
Comparing No Events vs Events - BEST MAE 0.488 (w/ Events) (3:05)
Making the Forecast (2:10)
9.4 Recap - Prophet
Logging (Saving) Your Progress (2:40)
Recap - Prophet Strengths & Weaknesses (3:02)
9.5 Checkpoint - Module 09 - Prophet
Code Checkpoint (File Download)
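A minimal sketch of a Prophet model with event regressors via prophet_reg(). The parameter values and the event column names are illustrative placeholders:

```r
library(tidymodels)
library(modeltime)

# Prophet with external regressors (event columns are placeholder names)
model_fit_prophet_xregs <- prophet_reg(
  changepoint_num    = 25,
  changepoint_range  = 0.8,
  seasonality_yearly = TRUE,
  seasonality_weekly = TRUE
) %>%
  set_engine("prophet") %>%
  fit(optins_trans ~ optin_time + event_november_sale + event_product_launch,
      data = training(splits))
```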
Module 10 - Exponential Smoothing, TBATS, & Seasonal Decomposition
Setup (File Download) (0:18)
Overview - Exponential Smoothing (0:35)
Libraries & Artifacts (1:37)
10.1 Exponential Smoothing
The Exponential Weighting Function (4:50)
Applying the Exponential Weighting Function to Make a Forecast (2:41)
ETS Model: exp_smoothing() (3:52)
Visualizing the ETS Model (4:48)
10.2 TBATS
TBATS Model: seasonal_reg() (3:36)
Visualizing the TBATS Model (2:48)
10.3 Seasonal Decomposition Models
Seasonal Decomposition & Multiple Seasonality Time Series (MSTS) Objects (2:28)
STLM ETS Model (2:33)
STL Plot & Relationship to STLM ETS Model (2:49)
STLM ARIMA Model (1:55)
STLM ARIMA - Adding XREGS (1:08)
10.4 Evaluation
Preparing the Test Forecast Visualization (3:30)
Comparing Multiple Models - ETS, TBATS, STLM ARIMA & ETS - BEST MAE 0.523 (TBATS) (3:45)
Refitting - Examining the Future Forecasts (3:34)
10.5 Recap - ETS, TBATS, Seasonal Decomp
Saving Artifacts (2:22)
Strengths & Weaknesses - ETS, TBATS, Seasonal Decomp (2:05)
10.6 Code Checkpoint - Module 10 - ETS, TBATS, & Seasonal Decomposition
Code Checkpoint (File Download)
Challenge #3 - ARIMA + Prophet + ETS + TBATS
Challenge #3 Discussion, Part 1 (File Download) - Start through ARIMA (5:32)
Challenge #3 Discussion, Part 2 - Prophet to End of Challenge (2:33)
Challenge #3 - Solution
Solution, Part 1 - Train/Test Setup (Solution File Download) (1:55)
Solution, Part 2 - ARIMA (Model 1): Basic Auto ARIMA (3:03)
Solution, Part 3 - ARIMA (Model 2): Auto ARIMA + Adding Product Events (2:14)
Solution, Part 4 - ARIMA (Model 3): Auto ARIMA + Events + Seasonality (2:08)
Solution, Part 5 - ARIMA (Model 4): Forcing Seasonality with Manual ARIMA (1:17)
Solution, Part 6 - ARIMA (Model 5): Auto ARIMA + Events + Fourier Series (0:57)
Solution, Part 7 - ARIMA - Modeltime Workflow (2:26)
Solution, Part 8 - ARIMA - Forecast Review (3:18)
Solution, Part 9 - Prophet Models: Basic (6), Yearly Seasonality (7), Events (8), Events + Fourier (9) (2:52)
Solution, Part 10 - Prophet - Modeltime Workflow (1:38)
Solution, Part 11 - Prophet - Forecast Review (3:13)
Solution, Part 12 - Exponential Smoothing Models: ETS (10), TBATS (11) (3:24)
Solution, Part 13 - Exponential Smoothing - Modeltime Workflow (1:45)
Solution, Part 14 - Exponential Smoothing - Forecast Review (1:30)
Solution, Part 15 - Forecasting the Future Data - ARIMA, Prophet & ETS/TBATS (3:40)
Solution, Part 16 - Final Review - ARIMA, Prophet, & ETS/TBATS (2:47)
Challenge #3 BONUS - ARIMA & Prophet vs Linear Model
Bonus, Part 1 (File Download) - Adding the LM from Challenge #2 (4:43)
Bonus, Part 2 - Why is the LM forecast high in March? (4:41)
11.0 Machine Learning Algorithms [IMPORTANT]
Welcome to Machine Learning for Time Series (File Download) (5:22)
11.1 Elastic Net Algorithm (GLMNet) - Linear
GLMNet - Model Spec (3:43)
GLMNet - Spline & Lag Workflows (2:40)
GLMNet - Calibration, Accuracy, & Plot (4:06)
GLMNet - Tweaking Parameters - BEST MAE 0.519 (Lag Model) (2:33)
*** Plotting Utility *** - Let's make a helper function to speed evaluation up!
calibrate_and_plot() (5:50)
Visualizing the Effect of Parameter Adjustments (3:19)
11.2 Multiple Adaptive Regression Splines (MARS) - Linear
We come from MARS (3:30)
MARS - A Simple Example (6:55)
MARS - Spline & Lag Models - BEST MAE 0.518 (Spline Model) (4:28)
11.3 Support Vector Machine (SVM) - Polynomial
SVM Polynomial - Model Specification (2:54)
SVM Poly - Tweaking Parameters - BEST MAE 0.615 (Spline Model) - BOOO (5:09)
11.4 Support Vector Machine (SVM) - Radial Basis Function
16% Improvement - SVM RBF vs SVM Poly (2:29)
SVM RBF - Parameter Tweaking (3:11)
SVM RBF - Lag Model - BEST MAE 0.520 (Spline Model) - Niiiice! (1:55)
11.5 [Important Concept] KNN & Tree-Based Algorithms - The Problem with Predicting Time Series Trends
Strengths/Weakness - KNN & Tree-Based Algorithms Can't Predict Beyond the Min/Max (1:24)
KNN vs GLMNET - Making Sample Data with Trend (2:08)
KNN vs GLMNET - Making Simple Trend Models (4:12)
KNN vs GLMNET - Visualize the Trend Predictions w/ Modeltime - Yikes, GLMNET just schooled KNN (4:14)
11.5 K-Nearest Neighbors (KNN) - Similarity (Distance) Based
KNN - Spline Model (3:30)
KNN - Tweaking Key Parameters (5:52)
KNN - Lag Model - BEST MAE 0.558 (Spline Model) (2:05)
You're kicking butt... But, don't forget to take breaks
[COFFEE BREAK] With Bill Murray
11.6 Random Forest (Tree-Based)
RF - Spline Model (4:27)
RF - Lag Model - 32% Better vs Spline Model (3:11)
RF - Tweaking Parameters - BEST MAE 0.516 (Lag Model) (4:02)
11.7 XGBoost (Gradient Boosting Machine) - Tree-Based
XGBoost - Spline & Lag Models (5:00)
XGBoost - Tweaking Parameters - 0.484 MAE (Lag Model) (6:35)
XGBoost - Tweaking Parameters 2 - BEST MAE 0.484 (Lag Model) (3:32)
11.8 Cubist - Combo of Trees (Rules) + Linear Models at Nodes
Cubist - Spline & Lag Models - 0.514 MAE out of the gate! (4:53)
Cubist - Tweaking Parameters - OPTIMAL MAE / R-SQUARED (0.524 / 0.316) (5:48)
11.9 Neural Net (NNET) - Like a Linear Regression but Better
NNET - Spline & Lag Models (4:57)
NNET - Tweaking Parameters - BEST MAE 0.553 (Spline Model) (5:39)
11.10 NNETAR - Combining AR Terms with a NNET!
What the heck is NNETAR? (NNET + ARIMA - IMA = NNETAR) (2:22)
NNETAR - Model, Recipe, & Workflow (4:11)
NNETAR - Tweaking AR Parameters (2:24)
NNETAR - Tweaking NNET Parameters - BEST MAE 0.512 (4:13)
11.11 Modeltime Experimentation Review
Organizing in a Modeltime Table (4:22)
Updating the Descriptions Programmatically (4:02)
Model Selection - Process & Tips (using Accuracy Table) (3:39)
Model Inspection - Process & Tips (using Test Forecast Visualization) (3:03)
Model Inspection - Visualizing the Future Forecast (5:42)
11.12 Saving Your Work - Artifacts!
Saving Models (2:34)
Saving your calibrate_and_plot() function (1:29)
11.13 Checkpoint - Module 11 - Machine Learning Algorithms
Code Checkpoint (File Download)
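A minimal sketch of how the machine-learning specs in this module plug into the same workflows. The penalty, mixture, and XGBoost values are illustrative, and workflow_fit_lm is a placeholder workflow from an earlier sketch:

```r
library(tidymodels)

# Elastic Net (GLMNet) spec; penalty/mixture values are illustrative
model_spec_glmnet <- linear_reg(penalty = 0.1, mixture = 0.5) %>%
  set_engine("glmnet")

# XGBoost spec
model_spec_xgboost <- boost_tree(mode = "regression", trees = 500, learn_rate = 0.35) %>%
  set_engine("xgboost")

# Reuse an existing workflow (e.g., the spline or lag recipe) by swapping in a new model
workflow_fit_glmnet <- workflow_fit_lm %>%
  update_model(model_spec_glmnet) %>%
  fit(training(splits))
```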
12.0 Boosted Algorithms - Prophet Boost & ARIMA Boost
Boosted Algorithms - A Powerful Technique for Improving Performance (3:37)
12.1A Prophet Baseline Model
Baseline: Best Prophet Model (2:38)
[Pro Tip] How to Fix a Broken Model (2:50)
Prophet Baseline - Best Model MAE 0.488 (0:54)
12.1B Prophet Boost
Recipe for Prophet Boost (3:33)
Model Strategy - Using XGBOOST for Seasonality/XREG Modeling (4:39)
Workflow - No Parameter Tweaking (3:41)
[KEY CONCEPT] Prophet Boost - Modeling Trend with Prophet, Residuals with XGBoost (3:00)
Prophet Boost - Tweaking Parameters - BEST MAE 0.457 (6:33)
12.2 ARIMA Boost
Modeling Strategy - ARIMA for trend, XGBOOST for XREGS (3:50)
ARIMA Boost - Model Specification (5:57)
ARIMA Boost - Tweaking Parameters - BEST MAE 0.523 (4:34)
12.3 Boosted Models - Modeltime Workflow
Modeltime - Accuracy Evaluation & Identifying Broken Models (2:43)
Modeltime - Forecast Test Data (2:10)
Modeltime - Refitting & Forecasting Future (3:08)
Save Your Work (1:26)
12.4 Code Checkpoint - Boosted Algorithms
Code Checkpoint (File Download)
13.0 Hyper Parameter Tuning & Cross Validation - For Time Series
Hyperparameter Tuning for Time Series (File Downloads) (3:56)
[CHEAT SHEET] Hyperparameter Tuning Workflow (4:47)
Getting Started - Setup & Workflow (3:09)
13.1 Reviewing 28 Models (It's Easy with Modeltime)
Combining Our Artifacts - 28 Models! (3:06)
Accuracy Review & Hyperparameter Tuning Candidate Selection (This Used to Take Me Weeks To Do) (4:36)
13.2 [SEQUENTIAL MODELS] NNETAR - Hyperparameter Tuning Process
What are Sequential Models? (& Why do we need to tune them differently?) (2:55)
Extracting the Workflow from a Modeltime Table: pluck_modeltime_model() (1:40)
Time Series Cross Validation (TSCV) Specification, Part 1: time_series_cv() (4:34)
Time Series Cross Validation (TSCV), Part 2: plot_time_series_cv_plan() (4:14)
Identify Tuning Parameters - Recipe Spec (3:07)
Identify Tuning Parameters - Model Spec (5:14)
Make a Grid for Parameters - Grid Spec (5:55)
13.2.1 - NNETAR Tuning, Round 1 - Default Params
Grid Latin Hypercube Specification: grid_latin_hypercube() (3:19)
Tuning Workflow Preparation (3:30)
Tune Grid & Show Results (7:24)
Visualize the Parameter Results (3:24)
13.2.2 NNETAR Tuning, Round 2 - Finding the Sweet Spot!
Update Grid Parameter Ranges (8:13)
Parallel Processing - Speed-Up Tuning (5:13)
Speed Comparison (Parallel vs Serial) - 3.4X Speed Boost (44 sec vs 151 sec)
Review Parameters vs Performance Metrics (1:09)
NNETAR - Train the Final Model - Best RMSE 0.507 (4:15)
13.3 [NON-SEQUENTIAL MODELS] Prophet Boost - Hyperparameter Tuning Process
What are Non-Sequential Models? (2:44)
Model Extraction: pluck_modeltime_model() (1:04)
K-Fold Cross Validation (Use with Non-Sequential Models ONLY) (4:23)
Prophet Boost - Recipe (1:10)
Prophet Boost - Model Spec (Identify Parameters for Tuning) (3:57)
13.3.1 Prophet Boost Tuning, Round 1 - Default Parameters
Grid Specification - Grid Latin Hypercube w/ Default Parameters (4:52)
Tuning the Grid (in Parallel) (6:18)
Visualize Results - Learning Rate Dominates (2:58)
13.3.2 Prophet Boost Tuning, Round 2 - Controlling Learning Rate
Grid Specification - Controlling Learning Rate (4:45)
Hyperparameter Tuning - Round 2 - We can see parameter trends! (3:17)
13.3.3 Prophet Boost Tuning, Round 3 - Honing In
Grid Specification & Tuning - Honing the parameter ranges in (5:49)
Best RMSE Model (Central Tendency) - MAE 0.466, RMSE 0.630, RSQ 0.450 (6:13)
Best R-Squared Model (Variance Explained) - MAE 0.464, RMSE 0.643, RSQ 0.459 (2:42)
13.4 Saving Our Progress
Recap & Saving the Models (6:53)
13.5 Code Checkpoint - Model 13 - Hyperparameter Tuning
Code Checkpoint (File Download)
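A minimal sketch of the tuning workflow from this module: time series cross validation plus tune_grid(). The resample sizes are illustrative, and workflow_tunable is a placeholder workflow whose recipe/model spec contain tune() parameters:

```r
library(tidymodels)
library(modeltime)
library(timetk)

# Time series CV resamples on the training data (placeholder column names)
resamples_tscv <- time_series_cv(
  data        = training(splits),
  date_var    = optin_time,
  cumulative  = TRUE,
  assess      = "8 weeks",
  skip        = "4 weeks",
  slice_limit = 6
)

resamples_tscv %>%
  tk_time_series_cv_plan() %>%
  plot_time_series_cv_plan(optin_time, optins_trans)

# Tune over a grid of 10 latin-hypercube candidates
set.seed(123)
tune_results <- tune_grid(
  object    = workflow_tunable,   # placeholder tunable workflow
  resamples = resamples_tscv,
  grid      = 10,
  metrics   = default_forecast_accuracy_metric_set(),
  control   = control_grid(verbose = TRUE, save_pred = TRUE)
)

show_best(tune_results, metric = "rmse", n = 5)
```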
14.0 Ensemble Time Series Models (Stacking)
Competition Ensembling Review (5:57)
What is an Ensemble Model? (7:21)
Modeltime Ensemble: Documentation (2:01)
Forecasting Cheat Sheet Upgrade [Download Here] (1:00)
14.1 Model Performance Review
Code Setup [File Download] (6:49)
Reviewing Models - Combining Tables & Organizing Results (4:24)
Reviewing Models - Making Sub-Model Selections (7:46)
14.2 Average Ensemble
Mean Ensemble - RMSE 0.640 vs 0.630 (Best Submodel) (5:00)
Median Ensemble - RMSE 0.648 vs 0.630 (Best Submodel) (2:23)
14.3 Weighted Average Ensembles
Introduction to Weighted Ensembles (1:02)
Loading Selection (4:29)
Accuracy Assessment - RMSE 0.628 vs RMSE 0.630 (Baseline) (2:37)
14.4.A Stacked Ensembles - Stacking Process
Introduction to Meta-Learner Ensembling with Modeltime Ensemble (3:57)
Resampling: Time Series Cross Validation (TSCV) Strategy (5:17)
Making Sub-Model CV Predictions - modeltime_fit_resamples() (4:27)
Resampling & Sub-Model Prediction: K-Fold Strategy (6:28)
Linear Regression Stack - TSCV - RMSE 1.00 (Ouch!) (7:16)
Linear Regression Stack - K-Fold - RMSE 0.651 (Much Better, but We Can Do Better) (3:25)
14.4.B Stacked Ensembles - Stacking with Tunable Algorithms
GLMNET Stack - RMSE 0.641 (On the right track) (6:38)
Modeltime Ensemble - In-Sample Prediction Error - Bug Squashed (1:10)
Random Forest Stack - RMSE 0.587!!! (7% improvement) (4:33)
Neural Net Stack - RMSE 0.643 (4:05)
XGBoost Stack - RMSE 0.585!!! (4:29)
Cubist Stack - RMSE 0.649 (3:11)
SVM Stack - RMSE 0.608!! (3:26)
14.5 Multi-Level Stacking
Level 2 - Model Evaluation & Selection (4:27)
Level 3 - Weighted Ensemble Creation, Evaluation, & Selection - RMSE 0.595 (Level 2 RF is New Baseline RMSE 0.585) (3:34)
14.6 Modeltime Workflow for Ensembles
Ensemble Calibration (4:45)
Ensemble Refitting, Method 1: Retraining Submodels Only (5:43)
Ensemble Refitting, Method 2: Retraining both Sub-Models & Super-Learners (5:33)
14.7 Saving Your Work
Save the Multi-Level Ensemble (1:27)
Object Size: 50MB! Here's why. ๐ก (3:15)
14.8 Code Checkpoint - Module 14 - Ensemble Methods
Code Checkpoint [File Download]
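A minimal sketch of the modeltime.ensemble functions covered in this module. The submodels_tbl modeltime table and the loadings are placeholders:

```r
library(tidymodels)
library(modeltime)
library(modeltime.ensemble)

# `submodels_tbl` is a placeholder modeltime table of fitted sub-models
ensemble_fit_mean <- submodels_tbl %>% ensemble_average(type = "mean")
ensemble_fit_med  <- submodels_tbl %>% ensemble_average(type = "median")

# Weighted ensemble; loadings rank the sub-models by test accuracy (illustrative)
ensemble_fit_wt <- submodels_tbl %>% ensemble_weighted(loadings = c(3, 2, 1))

# Ensembles drop into the standard modeltime evaluation workflow
modeltime_table(ensemble_fit_mean, ensemble_fit_med, ensemble_fit_wt) %>%
  modeltime_calibrate(new_data = testing(splits)) %>%
  modeltime_accuracy()
```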
15.0 Forecasting at Scale - Time Series Groups [Panel Data]
Welcome to Module 15 - Forecasting at Scale using Panel Data (Non-Recursive) Strategies (2:30)
Setup [File Download] (4:30)
15.1 Data Understanding & Preparation
Data Understanding (4:33)
Data Prep, Part 1: Padding by Group | Ungrouped Log Transformation (3:53)
Data Prep, Part 2: Extend by Group (2:44)
Data Prep, Part 3: Fourier Features & Lag Features by Group (6:03)
Data Prep, Part 4: Rolling Features by Group | Adding a Row ID (4:59)
Future & Prepared Data - Preparation (7:34)
15.2 Time Splitting - Train/Test
Time Series Split (Train/Test) (3:50)
15.3 Preprocessing & Recipes
Cleaning Outliers by Group (5:18)
Recipe, Part 1: Time Series Calendar Features (3:24)
Recipe, Part 2: Normalization (Standardization) & Categorical Encoding (5:36)
15.4 Modeling: Make 7 Panel Models
Panel Model 1: Prophet with Regressors (2:11)
UPDATE: HARDHAT 1.0.0 FIX
Panel Model 2: XGBoost (2:41)
Panel Model 3: Prophet Boost (1:57)
Panel Model 4: SVM (Radial) (2:02)
Panel Model 5: Random Forest (1:31)
Panel Model 6: Neural Net (1:27)
Panel Model 7: MARS (1:27)
Accuracy Check - This will help us select models for tuning (3:22)
15.5 Hyperparameter Tuning the Panel Models
Tuning Resamples: K-Fold Cross Validation (2:45)
Panel Model 8: XGBoost Tuned | Tunable Workflow Spec (3:37)
Panel Model 8: XGBoost Tuned | Hyperparameter Tuning (8:12)
Panel Model 9: Random Forest Tuned | Tunable Workflow Spec (1:56)
Panel Model 9: Random Forest Tuned | Hyperparameter Tuning (3:28)
Panel Model 10: MARS Tuned | Tunable Workflow Spec (2:00)
Panel Model 10: MARS Tuned | Hyperparameter Tuning (3:07)
15.6 Modeltime Panel Evaluation
Modeltime Table, Calibration & Accuracy for Panel Data [No Changes] (4:37)
Forecast Visualization for Panel Data [Use keep_data = TRUE] (4:23)
15.7 Time Series Cross Validation (Modeltime Resample)
Time Series Cross Validation (TSCV) (3:37)
Modeltime Fit Resamples (1:48)
Modeltime Resample Accuracy (3:53)
Plot Modeltime Resamples (2:15)
15.8 Ensemble Panel Models
Ensemble Average (Mean) & Sub-Model Selection (2:47)
Accuracy (Test Set, No Inversion) (1:18)
Forecast Visualization (Test Set, Inverted) (3:57)
Accuracy by Group (Test Set, Inverted): summarize_accuracy_metrics() [MAE 46] (4:29)
Refitted Ensemble & Future Forecast (6:11)
Ensemble Median: Avoid Overfitting (3:29)
15.9 Recap
Congrats - You Just Forecasted 20 Time Series Using Panel Data Techniques! (2:28)
15.10 Code Checkpoint - Panel Data
Code Checkpoint [File Download]
[PART 3] Deep Learning with GluonTS
Welcome to Part 3 - Deep Learning with GluonTS (0:53)
RStudio IDE Preview Version | Best for Working with Python
Module 16 - Setting Up GluonTS & Intro to Reticulate
What is a Python Environment? And why do I need it?
Setup [File Download] (1:19)
R Package Installation Requirements (2:30)
16.1 Configuring the "r-gluonts" Python Environment
GluonTS Environment Setup Overview (2:10)
Installing the Python "r-gluonts" Environment (2:15)
Connecting to the "r-gluonts" Environment (2:48)
Troubleshooting Installation (2:50)
16.2 Testing Modeltime GluonTS ๐งช
Deep Learning Experiment - Predict a Straight Line, Part 1 (3:08)
Deep Learning Experiment - Predict a Straight Line, Part 2 (3:32)
16.3 Getting to Know Reticulate & Your Python Environments ๐
Managing Python Environments with Reticulate - Conda & Virtual Env (3:18)
Which Environment am I using & What's in it? (4:43)
16.4 Using a Custom GluonTS Environment
Setting Up a Custom Python Environment (6:58)
Activating (Connecting to) a Custom Python Environment (5:39)
Reactivating the Default GluonTS Environment (2:13)
16.5 Code Checkpoint - GluonTS Environment Setup
Code Checkpoint [File Download]
Module 17 - Using GluonTS with Reticulate
GluonTS Deep Learning | Navigating the Documentation (4:46)
Setup & Introduction [File Download] (3:27)
Load Libraries (0:42)
17.1 Reticulated Python Basics
Reticulated Python, Part 1 (7:00)
Reticulated Python, Part 2 (4:36)
17.2 Data, Preprocessing, & GluonTS ListDatasets
Getting the Weekly Transactions Data (1:35)
Preparing the Full Data for Deep Learning (4:36)
Creating a GluonTS ListDataset from a Data Frame (Tibble) (3:10)
Examining a GluonTS ListDataset (5:33)
Converting from GluonTS ListDataset to Pandas Series (7:20)
17.3 DeepAR GluonTS Model
The DeepAREstimator & Trainer (8:43)
Making Our First DeepAR Model (5:14)
The Prediction (Generator) (3:27)
Probabilistic Forecasting (5:06)
17.4 Visualizing the Forecast | matplotlib, ggplot, & plotly
Matplotlib, Part 1 (5:06)
Matplotlib, Part 2 (3:47)
ggplot + plotly (Interactive), Part 1 (6:26)
ggplot + plotly (Interactive), Part 2 (4:43)
17.5 Introducing Modeltime GluonTS
Modeltime DeepAR | Workflow Benefits (6:56)
Modeltime DeepAR | Adding More Epochs (1:17)
17.6 Saving & Loading GluonTS Models
Save & Load | Using GluonTS & Reticulate (6:06)
Save & Load | Modeltime GluonTS Models (3:28)
17.7 Bonus!! GluonTS Deep Factor Models
Creating a DeepFactorEstimator (5:11)
Visualizing the Deep Factor Predictions with Matplotlib (3:17)
17.8 Conclusions & Pro/Cons
Reticulated GluonTS vs Modeltime GluonTS (Pros & Cons) (4:43)
17.9 Code Checkpoint - GluonTS Reticulate
Code Checkpoint [File Download]
Module 18 - Time Series Groups with Modeltime GluonTS
Deep Learning At Scale (with Modeltime GluonTS)
Setup [File Download] (2:52)
18.1 Data Collection & Initial Preparation
Getting the Data | GA Webpage Visits Daily (2:17)
Full Data | Padding the Data (4:02)
Alternative Padding Strategy
Full Data | Log1P Transformation (Target) (1:01)
Full Data | Extend (Future Frame) (1:41)
Full Data | Group-Wise Fourier Series (2:33)
Full Data | Group-Wise Adding Lagged Features (1:47)
Full Data | Group-Wise Rolling Features (3:10)
Full Data | Adding a Row ID (0:52)
Data Prepared | skimr::skim() - Watch out for missing data (2:11)
Future Data | skimr::skim() - Watch out for missing data (4:07)
Split Data Prepared (Train/Test) (2:15)
Visually Inspect the Train/Test Splits - Inspect for missing groups (3:37)
18.2 Deep Learning Models - DeepAR
Modeltime GluonTS Recipe (4:07)
DeepAR (Model 1) | Understanding deep_ar() & Training Our 1st Model (9:56)
DeepAR (Model 1) | Model Accuracy Evaluation (MAE 0.546) (4:07)
Ahhh My Model Errored (Skimr to the Rescue!) (3:59)
DeepAR (Model 2) | Adjusting Hyperparameters (4:19)
DeepAR (Model 2) | Model Accuracy Evaluation (MAE 0.537) (1:49)
DeepAR (Model 3) | Scaling by Group (3:31)
DeepAR (Model 3) | Model Accuracy (MAE 0.509) (1:17)
18.3 Deep Learning Models - N-BEATS
N-BEATS (Model 4) | Understanding nbeats() & Training Our 1st N-BEATS Model (9:57)
N-BEATS (Model 5) | Improving our model with a new loss_function (MAE 0.611) (4:25)
N-BEATS (Model 6) | Ensemble Multiple N-BEATS (7:09)
N-Beats (Model 6) | Model Accuracy (MAE: 0.544) (3:04)
Future Forecast | Inspect Refitted Models (6:01)
18.4 Machine Learning - XGBoost
Setting up the Parallel Processing Backend (1:33)
Recipes for ML (XGBoost Model) (7:01)
XGBoost Tunable Model Spec (2:34)
Hyperparameter Tuning the XGBoost Model (6:20)
18.5 Evaluating Our ML & DL Models
Evaluate Accuracy on the Testing Set (MAE: 0.527) (4:35)
Visualize the Testing Set Forecast (2:46)
Refit & Visualize the Future Forecast (2:40)
18.6 Ensembles of ML & DL Models [The Best of Both Worlds]
Ensembles | Combining ML & DL (MAE: 0.496) (5:54)
Ensemble | Refitting & Forecasting the Future (4:31)
18.7 DL Wrapup - Saving/Loading Models & Conclusions
Saving | Ensemble & Submodels (5:59)
Loading | Ensemble & Submodels (4:23)
Conclusions | Deep Learning with Modeltime & GluonTS (2:40)
Code Checkpoint [File Download]
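A minimal sketch of a DeepAR panel model with modeltime.gluonts, assuming the r-gluonts Python environment from Module 16 is installed. The id column, column names, and sizes are placeholders:

```r
library(tidymodels)
library(modeltime)
library(modeltime.gluonts)

# DeepAR panel model (placeholder id/column names and horizon)
model_fit_deepar <- deep_ar(
  id                = "page_id",
  freq              = "D",
  prediction_length = 28,
  lookback_length   = 56,
  epochs            = 5
) %>%
  set_engine("gluonts_deepar") %>%
  fit(visits_trans ~ date + page_id, data = training(splits))

# DeepAR models drop into the standard modeltime evaluation workflow
modeltime_table(model_fit_deepar) %>%
  modeltime_calibrate(new_data = testing(splits)) %>%
  modeltime_accuracy()
```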
CONGRATULATIONS!!! You. Did. It.
WOO HOO!!! Get YOUR Certificate & a discount on your next purchase! (1:07)