
LSTM Google Stock Prediction, Part 2: Seasonal Decomposition of the Time Series
Level 3, 22 minutes

This is the second part of our LSTM stock prediction project with TensorFlow. Now that our deep learning model with an LSTM recurrent layer is set up, we focus on the time series side of the project and dive deep into the seasonal decomposition of our MACD, RSI, Fast Stochastic, and pct_change indicators. If we simply used the last 20 days of each indicator, we would end up with a hundred features. As is common in time series stock market prediction, we are limited in the number of rows, so we cannot afford that many features. Instead we use the seasonal_decompose function from the statsmodels.tsa.seasonal library. This lets us study the autocorrelation of the seasonal component alone and extract which of the past days are actually correlated with today, so we can choose only a few features instead of a hundred.


The main idea here is to help reduce overfitting in our time series stock market predictions with TensorFlow.












Send data science teacher Brandyn a message if you have any questions.










Here we build our LSTM recurrent neural network with TensorFlow. This deep learning model uses time series data to predict the stock price of GOOG.
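A minimal sketch of such a model in TensorFlow/Keras. The layer sizes, window length, and feature count here are illustrative assumptions, not the exact values from the project:

```python
import numpy as np
import tensorflow as tf

n_timesteps, n_features = 20, 5  # assumed window of past days and indicator count

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_timesteps, n_features)),
    tf.keras.layers.LSTM(32),   # recurrent layer reads the whole time window
    tf.keras.layers.Dense(1),   # regression output: the predicted price
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Dummy batch just to confirm the shapes line up
X = np.random.rand(8, n_timesteps, n_features).astype("float32")
out = model.predict(X, verbose=0)
print(out.shape)  # (8, 1)
```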


As we create our indicators, we will introduce several null values that need to be dropped.
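A sketch of computing the four indicators with plain pandas, on synthetic prices. The column names and window lengths are the standard textbook defaults, assumed rather than taken from the post's exact code; the point is that rolling windows and diffs leave NaNs at the top of the frame:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({"Close": 100 + rng.normal(0, 1, 300).cumsum()})
df["Low"] = df["Close"] - 1   # synthetic stand-ins for real OHLC columns
df["High"] = df["Close"] + 1

# MACD: difference of the 12- and 26-day EMAs
ema12 = df["Close"].ewm(span=12, adjust=False).mean()
ema26 = df["Close"].ewm(span=26, adjust=False).mean()
df["MACD"] = ema12 - ema26

# RSI (14-day, simple rolling means of gains and losses)
delta = df["Close"].diff()
gain = delta.clip(lower=0).rolling(14).mean()
loss = (-delta.clip(upper=0)).rolling(14).mean()
df["RSI"] = 100 - 100 / (1 + gain / loss)

# Fast stochastic %K (14-day)
low14 = df["Low"].rolling(14).min()
high14 = df["High"].rolling(14).max()
df["FastK"] = 100 * (df["Close"] - low14) / (high14 - low14)

# Daily percent change
df["pct_change"] = df["Close"].pct_change()

# The rolling/diff warm-up rows are NaN, so we drop them
df = df.dropna()
print(df.isna().sum().sum())  # 0
```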


Here we use the seasonal decomposition function from statsmodels.tsa. This will allow us to plot and choose certain lagged days to predict today.


Here we plot the ACF with plot_acf from the statsmodels time series library. Looking at the seasonal component, we can find where the autocorrelation lies and from there limit the number of past days we use to predict today.

Here we engineer our time series features from what we learned in the seasonal decomposition.
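A sketch of that feature engineering with pandas shift, keeping only the lags the ACF pointed to. The lag list and column names here are assumed examples:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({"MACD": rng.normal(size=100),
                   "RSI": rng.normal(size=100)})

chosen_lags = [1, 5, 10]  # assumed: the days the seasonal ACF flagged
for col in ["MACD", "RSI"]:
    for lag in chosen_lags:
        df[f"{col}_lag{lag}"] = df[col].shift(lag)

df = df.dropna()  # shifting leaves NaNs at the top of the frame
print(df.columns.tolist())
```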

Heading into the train/test split, it can be useful to build a list of your features; this supports the experimental process, letting you add and remove features as you head into the deep learning modeling section with TensorFlow.
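A sketch of a chronological train/test split driven by such a feature list, on stand-in data. The 80/20 ratio and the column names are assumptions; the key point for time series is that the split preserves time order instead of shuffling:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
df = pd.DataFrame({
    "MACD_lag1": rng.normal(size=100),
    "RSI_lag5": rng.normal(size=100),
    "pct_change_lag1": rng.normal(size=100),
    "target": rng.normal(size=100),
})

# Edit this list to experiment with different feature subsets
features = ["MACD_lag1", "RSI_lag5", "pct_change_lag1"]

split = int(len(df) * 0.8)  # no shuffling: train on the past, test on the future
X_train, X_test = df[features].iloc[:split], df[features].iloc[split:]
y_train, y_test = df["target"].iloc[:split], df["target"].iloc[split:]
print(X_train.shape, X_test.shape)  # (80, 3) (20, 3)
```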


Pandas plots are a great little shortcut for plotting the training history of our TensorFlow neural network's R squared and MAE over the epochs.
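The trick is that Keras's history dict converts straight into a DataFrame, and one .plot() call draws every metric curve. The history values below are mocked so the snippet runs without training a model:

```python
import pandas as pd

history = {  # what model.fit(...).history might look like after 4 epochs
    "loss": [0.9, 0.5, 0.3, 0.2],
    "mae": [0.8, 0.5, 0.35, 0.3],
    "val_loss": [1.0, 0.6, 0.4, 0.35],
    "val_mae": [0.9, 0.6, 0.45, 0.4],
}

hist_df = pd.DataFrame(history)  # one row per epoch, one column per metric
print(hist_df.shape)  # (4, 4)

# One line plots all four curves against the epoch index:
# hist_df.plot(title="Training history", xlabel="epoch")
```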


