Autoregressive order 1 process - conditions for Stationary Covariance and Weak Dependence

95,184 views

Ben Lambert

This video explains the conditions necessary for an Autoregressive Order One process to have a constant covariance structure and for it to be weakly dependent. Check out ben-lambert.co... for course materials and information regarding updates on each of the courses. Quite excitingly (for me at least), I am about to publish a whole series of new videos on Bayesian statistics on YouTube. See here for information: ben-lambert.co... Accompanying this series, there will be a book: www.amazon.co....
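The condition at issue is |ρ| < 1 (with iid errors of finite variance). Below is a minimal simulation sketch, not code from the course (the helper name simulate_ar1 is mine), illustrating that with |ρ| < 1 the sample variance and autocovariances settle near σ²/(1 − ρ²) and ρ^h σ²/(1 − ρ²), the latter decaying to zero in h (weak dependence):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(rho, sigma=1.0, T=100_000, burn_in=1_000):
    """Simulate x_t = rho * x_{t-1} + eps_t, discarding a burn-in so the start-up value washes out."""
    eps = rng.normal(0.0, sigma, size=T + burn_in)
    x = np.zeros(T + burn_in)
    for t in range(1, T + burn_in):
        x[t] = rho * x[t - 1] + eps[t]
    return x[burn_in:]

rho, sigma = 0.7, 1.0
x = simulate_ar1(rho, sigma)

# Under |rho| < 1: Var(x_t) = sigma^2 / (1 - rho^2) and
# Cov(x_t, x_{t+h}) = rho^h * Var(x_t), which decays to 0 as h grows (weak dependence).
var_theory = sigma**2 / (1 - rho**2)
print(f"sample variance {x.var():.3f}  vs theory {var_theory:.3f}")
for h in (1, 5, 20):
    cov_sample = np.cov(x[:-h], x[h:])[0, 1]
    print(f"lag {h:>2}: sample cov {cov_sample:.3f}  vs theory {rho**h * var_theory:.3f}")
```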

Comments: 30
@joshdunning9068 4 years ago
Literally spent 4 hours trying to understand stationarity from my class lecture slides, as well as nearly 30 different resources all over the web, and nothing explained it well until this video. Thank you so much!
@luisloretdemola1870 9 years ago
Very helpful. I use your videos often to supplement my Time Series Class. Great stuff! Thank you
@barbone4646 10 years ago
Very insightful lecture. A small remark: the covariance of X_t and eps_{t+h-i} is zero not only because eps is iid but also because t+h-i > t. Indeed, X_t and eps_n would be correlated for t > n.
@SpartacanUsuals 10 years ago
Hi, yes you are correct to point out that slip of the tongue. Many thanks for pointing this out. Best, Ben
@hahaliu1421 2 years ago
Hi, I am still confused about this part. Q1: The iid assumption only shows that the eps are independent of each other; how can we use it to prove that the eps are independent of X_t? Q2: Why can we conclude that the covariance of X_t and eps_{t+h-i} is zero from the fact that t+h-i > t? Thank you very much!
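For readers stuck on the same two questions, a sketch of the argument, assuming |ρ| < 1 so that X_t can be written as a weighted sum of current and past shocks only:

```latex
\begin{aligned}
X_t &= \sum_{j=0}^{\infty} \rho^{j}\,\varepsilon_{t-j}
  &&\text{(a function of shocks dated } t,\ t-1,\ t-2,\ \dots), \\
\operatorname{Cov}\!\left(X_t,\ \varepsilon_{t+h-i}\right)
  &= \sum_{j=0}^{\infty} \rho^{j}\,\operatorname{Cov}\!\left(\varepsilon_{t-j},\ \varepsilon_{t+h-i}\right) = 0,
  &&0 \le i \le h-1,
\end{aligned}
```

because t + h - i is at least t + 1, which exceeds t - j for every j, and iid shocks at distinct dates are uncorrelated. So Q1 and Q2 are really one point: the iid assumption makes each shock uncorrelated with all earlier shocks, and X_t is built only from shocks dated t or earlier.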
@olekristensen9096 8 years ago
You just made me pass my exam!!
@SpartacanUsuals 11 years ago
Hi, thanks for your message. OK - I am generally talking about modelling processes using an AR(1) model. However, you are correct - adding AR(1) errors can be used to remove serial correlation. The method you propose below should work fine. Best, Ben
@nihalchandrakashyap453 6 years ago
Sir, what is your email ID? I have two questions to ask.
@zhaoxunyan4016 6 years ago
Thank you for the clear illustration.
@sukirtiranasaria736 5 years ago
For the covariance of AR(p), shouldn't the term contain an X_{t-h} term instead of just X_t?
@spes9850401 2 years ago
Thank you sir!
@YourSkyliner 6 years ago
0:36 It seems to me like you're iterating backwards to get your X_{t+h}. Continuing like this, you'd actually get an expression for X_t depending on X_{t-h} (and error terms ranging from t-h+1 to t). It works, of course, because the covariance function only depends on the modulus of h, but it's kind of unintuitive. Great video otherwise!
@MrFrenchfriesman 11 years ago
So please correct me if I am wrong: when you add an AR(1) term into your model, its coefficient will be rho, in order to solve the autocorrelation problem; we keep adding AR(1), AR(2), ..., AR(n) terms until we get an insignificant rho (a large p-value on the AR(n) coefficient).
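A minimal sketch of that procedure (an illustration only, assuming statsmodels' AutoReg; nothing here is taken from the video):

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(1)

# Illustrative data: an AR(2) process, so lags beyond 2 should come back insignificant.
T = 5_000
eps = rng.normal(size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + eps[t]

# Fit AR(1), AR(2), ... and stop once the highest-order coefficient is insignificant.
for p in range(1, 6):
    res = AutoReg(y, lags=p).fit()
    pval_last = res.pvalues[-1]  # p-value of the lag-p coefficient
    print(f"AR({p}): p-value on lag {p} = {pval_last:.3f}")
    if pval_last > 0.05:
        print(f"Lag {p} looks insignificant; an AR({p - 1}) seems sufficient.")
        break
```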
@Maxxulis 4 years ago
THANK YOU
@mobileentertainment212 1 year ago
Does anyone have a link to a video explaining why the sample ACF has an approximate variance of 1/T?
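The result behind this question is that, for an iid (white-noise) series of length T, the sample autocorrelations at fixed lags are approximately N(0, 1/T). A quick simulation sketch of that claim (the helper sample_acf is mine):

```python
import numpy as np

rng = np.random.default_rng(2)
T, n_sims, lag = 500, 2_000, 1

def sample_acf(x, h):
    """Sample autocorrelation at lag h (full-sample mean and sum of squares in the denominator)."""
    x = x - x.mean()
    return np.dot(x[:-h], x[h:]) / np.dot(x, x)

r1 = np.array([sample_acf(rng.normal(size=T), lag) for _ in range(n_sims)])
print(f"std of lag-1 sample ACF: {r1.std():.4f}  vs 1/sqrt(T) = {1 / np.sqrt(T):.4f}")
```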
@okanaybar 9 years ago
WONDERFUL
@lastua8562 4 years ago
Why is the derived expression for x(t+h) equal to rho^h * x(t)? Should it not be rho^(t+h) * x(h)? This would be analogous to the first derivation in video 77.
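Writing the substitution out makes the exponent clear: iterating the AR(1) equation back from X_{t+h} to X_t involves h substitutions, so ρ is raised to the power h (the calendar date t never enters the exponent):

```latex
\begin{aligned}
X_{t+h} &= \rho X_{t+h-1} + \varepsilon_{t+h} \\
        &= \rho\left(\rho X_{t+h-2} + \varepsilon_{t+h-1}\right) + \varepsilon_{t+h} \\
        &\;\;\vdots \\
        &= \rho^{h} X_{t} + \sum_{i=0}^{h-1} \rho^{i}\,\varepsilon_{t+h-i}.
\end{aligned}
```

The same expansion also shows why the error sum stops at i = h - 1, a point that comes up again further down the thread.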
@jameshosking7801 7 years ago
Why, when you simplify var(X_t), is sigma squared also divided by (1 - rho^2)?
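A short derivation of that variance, assuming covariance stationarity (so Var(X_t) = Var(X_{t-1}) = γ_0) and Cov(X_{t-1}, ε_t) = 0:

```latex
\begin{aligned}
\operatorname{Var}(X_t)
  &= \operatorname{Var}(\rho X_{t-1} + \varepsilon_t)
   = \rho^{2}\operatorname{Var}(X_{t-1}) + \sigma^{2}, \\
\gamma_0 &= \rho^{2}\gamma_0 + \sigma^{2}
  \;\Longrightarrow\;
  \gamma_0 = \frac{\sigma^{2}}{1-\rho^{2}},
\end{aligned}
```

which is finite and positive only when |ρ| < 1.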
@halfvolley11 5 years ago
What about the conditions for an MA process?
@looploop6612 5 years ago
How do you take the rho out of the Cov(·) term?
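Covariance is linear in each argument, so a constant such as ρ^h simply factors out; sketched for the AR(1) case:

```latex
\operatorname{Cov}(X_{t+h}, X_t)
  = \operatorname{Cov}\!\Big(\rho^{h} X_t + \textstyle\sum_{i=0}^{h-1}\rho^{i}\varepsilon_{t+h-i},\; X_t\Big)
  = \rho^{h}\operatorname{Var}(X_t) + 0.
```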
@orchisamadas2222 7 years ago
What about the covariance of an AR(p) process? How do we derive that?
@zhaoxunyan4016 6 years ago
That involves linear algebra and is not covered in this tutorial.
@lastua8562 4 years ago
@@zhaoxunyan4016 Can you recommend a resource?
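For what it's worth, the standard route is the Yule-Walker system; a brief sketch for a mean-zero AR(p) (multiply the defining equation by X_{t-k} and take expectations):

```latex
\begin{aligned}
X_t &= \sum_{j=1}^{p} \phi_j X_{t-j} + \varepsilon_t, \\
\gamma_k &= \sum_{j=1}^{p} \phi_j\,\gamma_{k-j}, \qquad k \ge 1, \\
\gamma_0 &= \sum_{j=1}^{p} \phi_j\,\gamma_{j} + \sigma^{2},
\end{aligned}
```

a linear system that can be solved for γ_0, ..., γ_p; most standard time series texts cover it under the heading "Yule-Walker equations".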
@uj1xt5m98ap 8 years ago
Shouldn't the summation in the 4th line be from 0 to h (instead of h-1)?
@SpartacanUsuals 8 years ago
Hi, thanks for your comment. I just checked the video, and it is correct - you only need to sum to h-1. This follows from the pattern above, for the few examples I show. Best, Ben
@uj1xt5m98ap 8 years ago
+Ben Lambert: Ok. Got it - let me watch these videos and work it out. Thanks for the reply! These videos are *really* well made and very helpful! :)
@uj1xt5m98ap 8 years ago
Yep! You were right! I worked out the equations on paper and it checks out. This reminds me of what my high school teacher used to say - don't try to do complicated algebra in your head.
Autoregressive vs Moving Average Order One processes - part 1
3:49
An introduction to Moving Average Order One processes
8:08
Ben Lambert
145K views
Time Series Talk : Autoregressive Model
8:54
ritvikmath
325K views
On Autocovariances and Weak Stationarity
6:35
Justin Eloriaga
12K views
The covariance matrix
13:57
Serrano.Academy
96K views
5. Stochastic Processes I
1:17:41
MIT OpenCourseWare
900K views
The AR(1) process
32:51
Jochumzen
28K views
What are Autoregressive (AR) Models
5:01
Aric LaBarr
122K views
AR(1) Autoregressive Process: Mean, Autocovariances, ACF
7:50
ARMA Stationarity, Invertibility, and Causality [Time Series]
11:15
Moving Average processes - Stationary and Weakly Dependent
7:08
Ben Lambert
81K views