How can we use the "cleaned" version from the end of the notebook to make predictions? Is there a specific method for rescaling back to the original scale of the data set?
@juanotavalo 2 years ago
You can check out that topic in this video: kzbin.info/www/bejne/qpvJopl6dtaniJY&ab_channel=ritvikmath
@nisargpatel2015 2 years ago
This is an invaluable series of videos, thank you! I have a question about the constant volatility requirement for ARMA models: why is it needed? Wouldn't the AR & MA components be able to capture the volatility?
@gauravipatil183 4 years ago
I must say, your videos make the concepts really easy to understand. Thank you so much.
@oliesting4921 4 years ago
Great videos. I am learning a lot from them. Thanks!
@socksdealer 3 years ago
Extremely useful and clear video! Thank you!
@enes98li1 3 years ago
And how does that help with predictions?
@jiezhiguo509 3 years ago
Hi Ritvik, would you be concerned about "data leakage" when we detrend and remove seasonality like this?
@climbscience4813 9 months ago
Your concern is right, I have the same one. He essentially uses future values for normalization and thereby leaks data into the prediction. It's not only in removing seasonality: subtracting the mean and dividing by the standard deviation also use future data. If you replace these operations with their respective equivalents computed from just the prior year or two, you can remove that bias again. ;-)
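For anyone who wants to try that fix, here is a minimal sketch (my own, not from the video; the series `y` and all variable names are hypothetical) of a leak-free version, where every statistic at time t is computed only from observations strictly before t:

```python
import numpy as np
import pandas as pd

# hypothetical monthly series, just so the sketch runs end to end
idx = pd.date_range("2015-01-01", periods=72, freq="MS")
y = pd.Series(np.random.default_rng(0).normal(size=72).cumsum() + 10, index=idx)

# expanding-window mean/std shifted by one step, so each value is normalized
# using only observations that came strictly before it
past_mean = y.expanding().mean().shift(1)
past_std = y.expanding().std().shift(1)
y_norm = (y - past_mean) / past_std

# seasonal adjustment with only prior data: for each calendar month, subtract
# the average of that month in *previous* years (NaN until a prior year exists)
past_month_avg = y.groupby(y.index.month).transform(lambda s: s.expanding().mean().shift(1))
y_deseason = y - past_month_avg
```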
@sandyherho1997 3 years ago
Thanks Ritvik. Could you share the math notation for each step in this cleaning process?
@mmczhang 4 years ago
Thanks Ritvik, this is a useful video. My question is: can we drop beta 1 when beta 2 is significant but beta 1 is not? Do we need to keep both betas in the GARCH model?
@grantchan3724 4 years ago
Hi Ritvik, thank you for the excellent video! I have a question about your use of normalization. In this case, why was it necessary? Isn't normalization typically used in a multivariate scenario to prevent the relative scale of different variables from skewing the analysis?
@SKINhead-lq8yn 2 years ago
I'm not sure about this particular case, but in general, if you want to use Yule-Walker estimation (which is, I think, the most common method for estimating time series parameters), the series should have zero mean, hence the normalization.
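As a small illustration of that point (my own sketch, not something from the video): statsmodels ships a Yule-Walker routine, and since the classical Yule-Walker equations are derived for a zero-mean process, the series is demeaned before fitting.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker

# simulate an AR(2) process with a nonzero mean so demeaning actually matters
rng = np.random.default_rng(0)
y = np.zeros(1000)
for t in range(2, 1000):
    y[t] = 5.0 + 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()

# subtract the sample mean before estimating the AR coefficients
rho, sigma = yule_walker(y - y.mean(), order=2)
print(rho)    # AR coefficient estimates, roughly [0.6, -0.3]
print(sigma)  # estimate of the innovation standard deviation
```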
@sanchitgoyal6720 3 years ago
Couldn't we have removed seasonality by differencing with a lag of 12?
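For reference, that kind of seasonal differencing is a one-liner in pandas (a sketch, assuming a monthly Series named `y`):

```python
# subtract the value from the same month one year earlier;
# the first 12 observations have no lag-12 partner and become NaN
y_seasonal_diff = y.diff(12).dropna()
```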
@chintamanigautam2523 3 years ago
Hi, would you please give me an idea: how can we normalize time series data that are not normal even after taking the natural log?
@eshaankirpal 3 years ago
How do you automate this process to remove seasonality when dealing with lots of time series?
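One way to scale the same idea (just a sketch, assuming each series is a column of a monthly-indexed DataFrame `df`, which is a hypothetical name) is to compute every column's month-of-year mean in one groupby and subtract it, deseasonalizing all series at once:

```python
# month-of-year means per column, broadcast back to the original shape
monthly_means = df.groupby(df.index.month).transform("mean")
deseasonalized = df - monthly_means
```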
@wussboi 4 years ago
Thank you sir! Very helpful!
@phillipgallas 3 years ago
Hello Ritvik! I have a question. I have time series data with a number of different variables, and originally they are highly correlated. After adjusting my data for variance, though, basically all values in the correlation matrix became closer to zero. Does that mean that these data points lost their observable connection with each other and thus cannot be used to train a neural net? Btw, great content. I really like your videos on time series.
@minghanliu9390 4 years ago
Great video! But I have some problems with removing seasonality and volatility: I can't figure out how to invert back to the original data when forecasting. Can you please explain this? Thanks a lot.
@minghanliu9390 4 years ago
Or maybe I should ask whether it is more reasonable to divide by the past period's average and volatility. It would help a lot if you can help me solve this problem!
@ritvikmath 4 years ago
Hi, thanks for your kind words! Perhaps this video of mine can help; it talks about undoing your transformations to get back the original series: kzbin.info/www/bejne/qpvJopl6dtaniJY
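For readers who just want the shape of the answer in code: whatever cleaning steps were applied, the inverse applies them in reverse order. Below is a minimal sketch with hypothetical names (`mu`, `sigma`, `monthly_avg`, `y_clean_forecast` are placeholders, not variables from the notebook), assuming the cleaning was "subtract monthly averages, then normalize".

```python
import pandas as pd

# hypothetical quantities saved while cleaning:
#   monthly_avg      - Series of month-of-year averages, indexed 1..12
#   mu, sigma        - mean and std used in the normalization step
#   y_clean_forecast - model forecast on the cleaned scale, monthly DatetimeIndex

# undo the steps in the reverse order they were applied:
z = y_clean_forecast * sigma + mu                      # 2) undo normalization
seasonal = pd.Series(
    y_clean_forecast.index.month.map(monthly_avg).values,
    index=y_clean_forecast.index,
)
y_forecast = z + seasonal                              # 1) add seasonality back
```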
@minghanliu9390 4 years ago
@@ritvikmath Thanks! I did see this video while trying to find the answer, but the question is: when inverting back, how can I possibly know the volatility of this year, or the seasonality of this season? Thanks for the response above; it would be really helpful if you can answer my question.
@zollen123 3 years ago
Am I correct that *all* time series data must go through all these mandatory steps (normalization, first differences, removing volatility and seasonal patterns)?
@sgpleasure 3 years ago
I have the same question: for what practical purpose do we need to 'cleanse' the data?
@EvsEntps 2 years ago
@@sgpleasure All of that stuff makes it harder to isolate the stochastic part of the process we're trying to model - seasonal effects and time trends are not stochastic properties, they're deterministic. In theory, by 'cleaning' the time series we're removing all the deterministic stuff so that we're left only with a time series that looks like it was generated by a (non-random-walk) stochastic process, i.e. a time series that allows us to use all the theory and tools around ARIMA. We use our ARIMA model to forecast the future values of the stochastic part of the series and then 'add' this to the future values of the deterministic part of the series (usually through reverse transformation) to get the complete forecast of our time series. Note: there is no guarantee that this whole ARIMA approach will actually be any good at modelling and forecasting your time series; it just so happens that a lot of real-world time series data can be decomposed and modelled this way successfully. That's why it's popular, but it's not always appropriate.
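To make that concrete, here is a heavily simplified sketch of that workflow in Python. The deterministic part here is just month-of-year means, the ARMA order is arbitrary, and `y` is assumed to be a monthly pandas Series with a frequency set on its DatetimeIndex (so the forecast index carries dates); this is not the video's exact recipe.

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# 1. deterministic part: month-of-year means (a crude stand-in for trend/seasonality)
seasonal_means = y.groupby(y.index.month).mean()
deterministic = pd.Series(y.index.month.map(seasonal_means).values, index=y.index)

# 2. the 'cleaned' residual should look closer to a stationary stochastic process
residual = y - deterministic

# 3. model and forecast the stochastic part (an ARMA(1,1) chosen arbitrarily)
fit = ARIMA(residual, order=(1, 0, 1)).fit()
residual_forecast = fit.forecast(steps=12)

# 4. add the deterministic part back onto the stochastic forecast
future_seasonal = pd.Series(
    residual_forecast.index.month.map(seasonal_means).values,
    index=residual_forecast.index,
)
full_forecast = residual_forecast + future_seasonal
```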
@Rm-no6jr 9 months ago
Great, thanks!
@user-wr4yl7tx3w 10 months ago
But how do you then make a prediction from the model and still have the result on the scale of the original data?