I'm starting a new channel on AI at youtube.com/@parttimeai, please subscribe!
@DavidZShi 3 years ago
Best channel on YouTube.
@stevenaohio 3 years ago
Thanks again PTL... getting caught up on your new videos... you continue to shorten my learning curve. Amazing content & instruction.
@akin242002 3 years ago
Stocks + Python programming language = Liked video.
@DavidCVdev 3 years ago
Thank you for all these videos, I'm learning a lot more than I did in some courses I paid for.
@yarinifrach5664 3 years ago
Dude, you are a genius, my friend. Thank you for all these videos!
@detroiter4eva 3 years ago
Yeah, I was thinking the same thing.
@rexxsv 3 years ago
Hi Larry, I rarely comment on videos (never, to be honest; this is actually my first time ever, and I've been on YouTube daily for probably more than a decade now). Just wanted to let you know I really appreciate your work, it's been VERY informative! I've added some web-scraping and archiving functionality to this project, plus some other features I'd like to share with you :)
@parttimelarry 3 years ago
Thanks! If you wanna share something, my email is in the About section of the channel, or you can send me a message on Twitter @parttimelarry
@Rickynator13 3 years ago
Thank you for all your videos! You are awesome man!
@nassehk 3 years ago
Great video. Thanks for sharing your knowledge.
@rpraka 3 years ago
Awesome idea Larry!
@rpraka 3 years ago
I read about some tests showing that Postgres was in fact faster than TimescaleDB for many tasks, and that Timescale had only marginal benefits in some cases. Do those results match your own experience with the two?
@julianriise5618 3 years ago
Links to the previous and next videos would be good to have; as a new sub I'm thrown into a series of random videos, so it's a little hard to follow along :)
@SushiToyota 3 years ago
Your videos are amazing, but they keep making me realize how little I know. I started with the 100-lines-of-code trading bot, then went back to watch from the start, etc. I have like 5 of your videos open that I need to get through! Would you mind referencing the previous videos you build upon in a comment on future vids? The WSB scraper one brought me here.
@edwardhasekamp3104 3 years ago
Thanks Larry! Awesome video
@EMoscosoCam 3 years ago
I think you can tell when they updated the CSV from the file itself: the first column is a date, and comparing it with the downloadable Holdings PDF, one can infer that its header means "As Of".
@parttimelarry 3 years ago
Good point, this is a better solution.
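A minimal sketch of that idea in pandas (the file path is a placeholder; skipfooter=1 mirrors the disclaimer row at the bottom of the ARK files, as in the other snippets in this thread):

import pandas as pd

# Hypothetical local copy of an ARK holdings file
df = pd.read_csv("ARK_INNOVATION_ETF_ARKK_HOLDINGS.csv", skipfooter=1, engine="python")

# The first column holds the "As Of" date of the holdings snapshot
as_of = pd.to_datetime(df[df.columns[0]].iloc[0]).date()
print(f"Holdings as of {as_of}")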
@Kate-ud2xc 2 years ago
You are the best teacher :) Thanks so much!!!
@wajihmsedi1230 3 years ago
Thank you so much for your videos, Larry. A practical question: since I am in Europe, I do not have access to Alpaca. Would you know what the alternative would be?
@ballyoracle 3 years ago
I love your voice, man!!! Thank you for sharing your knowledge... this is awesome stuff.
@kristopherleslie8343 3 years ago
Sounds creepy loving another man's voice lol
@sasucarefree4694 3 years ago
Doesn't sound creepy to me, more like a compliment. 🤔
@ballyoracle 3 years ago
@@kristopherleslie8343 This is respect for a man with the knowledge and attitude to share it with everyone.
@cetilly 3 years ago
Really great vid, and a cool topic
@uday2794 3 years ago
Why not use pandas DataFrames for both reading in the SQL query results and the files?
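For what it's worth, a minimal sketch of that approach (connection settings are placeholders; the stocks table and is_etf column are from the setup discussed elsewhere in this thread):

import pandas as pd
import psycopg2

# Placeholder connection settings for the local Postgres/TimescaleDB instance
conn = psycopg2.connect(host="localhost", dbname="etf_db", user="postgres", password="secret")

# SQL query results go straight into a DataFrame
# (pandas accepts a DBAPI connection; newer versions warn and prefer SQLAlchemy)
etfs = pd.read_sql("SELECT * FROM stocks WHERE is_etf = TRUE", conn)

# ...and the holdings CSV is read the same way (skipfooter=1 drops ARK's disclaimer row)
holdings = pd.read_csv("ARK_INNOVATION_ETF_ARKK_HOLDINGS.csv", skipfooter=1, engine="python")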
@Jason-ru7xt 3 years ago
Great content, Larry. Just curious: can we allow remote access to our Postgres/Timescale database, so that we can access the DB on one PC from another PC?
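In case it helps anyone, the usual Postgres recipe (a sketch; adjust the subnet and auth method for your own network): set listen_addresses = '*' in postgresql.conf, add a rule like host all all 192.168.1.0/24 md5 to pg_hba.conf, restart the server, then connect from the other PC, e.g.:

import psycopg2

# 192.168.1.50 is a placeholder for the database host's LAN address;
# dbname/user/password are placeholders for your own setup
conn = psycopg2.connect(host="192.168.1.50", dbname="etf_db", user="postgres", password="secret")
print(conn.server_version)  # quick connectivity check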
@TuranInsight 3 years ago
Are you also going to show us how you are going to periodically track the holdings, via scheduling a job or through scraping the website?
@赢家交易策略 3 years ago
I love your channel very much. Thank you very much!
@pedrojesusrangelgil5064 3 years ago
Hey, what is the book in the thumbnail? Thanks!
@parttimelarry 3 years ago
It is a book written near the peak of the dot-com bubble in 1999, before tech stocks crashed: en.m.wikipedia.org/wiki/Dow_36,000
@TransformationApplied 3 years ago
DBeaver Universal Database Manager is also pretty cool and has a Community Edition.
@degenviking6068 3 years ago
Thanks Larry!
@JellyBean012 a year ago
Hey PTL - can you help with the issue that ARK changed the format of the CSV files, adding thousands commas to the shares and percent formatting to the weight?
@JellyBean012 a year ago
This was my approach... I think it works... thanks to ChatGPT:

import os
import pandas as pd

folder_paths = ['data/2023-01-18', 'data/2023-01-19', 'data/2023-01-20']

for folder_path in folder_paths:
    for filename in os.listdir(folder_path):
        if filename.endswith('.csv'):
            file_path = os.path.join(folder_path, filename)
            # skipfooter=1 drops ARK's disclaimer row; it requires the python engine
            df = pd.read_csv(file_path, skipfooter=1, engine='python')
            # strip thousands commas from the shares column
            df[df.columns[5]] = pd.to_numeric(df[df.columns[5]].str.replace(',', ''))
            # strip the percent sign from the weight column
            df[df.columns[7]] = df[df.columns[7]].astype(str)
            df[df.columns[7]] = pd.to_numeric(df[df.columns[7]].str.replace('%', ''))
            df.to_csv(file_path, index=False)
@bmveee3597 3 years ago
Hi Larry - great video and awesome explanation. A quick question, though: when I try to download ARK ETF holdings on a daily basis, their Cloudflare service blocks my GET requests. Have you figured out how to automatically download the daily updates without getting blocked?
@micfrancs 3 years ago
Love your vids, Larry! What would be a good way to pull the CSVs daily instead of manually retrieving them?
@EMoscosoCam 3 years ago
You could read the CSV from the web into a pandas DataFrame; sending a browser-like User-Agent header gets past the block. For example:

from io import StringIO
from urllib.request import Request, urlopen
import pandas as pd

# A browser-like User-Agent avoids the default urllib one being rejected
req = Request(
    "https://ark-funds.com/wp-content/fundsiteliterature/csv/ARK_INNOVATION_ETF_ARKK_HOLDINGS.csv",
    headers={'User-Agent': 'Mozilla/5.0'},
)
df = pd.read_csv(StringIO(urlopen(req).read().decode('utf-8')))
@kylehammerberg3875 2 years ago
The ARK CSVs use string numbers with commas now. Is there a good way to deal with this with psycopg2? Would it be easier/possible to just use pandas?
@kylehammerberg3875 2 years ago
As of now, the best solution I can think of is to convert the CSVs to pandas DataFrames, clean them there, and re-save them as CSVs so psycopg2 can interface with them. Another issue is that not all rows have tickers, so reading the file as a dict causes an index error once the loop hits one of those rows.
@kylehammerberg3875 2 years ago
Update for others following along with this video now: I developed a solution to deal with the way the CSVs are structured currently. If you're interested, feel free to PM.
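For anyone who can't PM, a minimal sketch of the cleaning approach described above (the path and the column names ticker, shares, and weight (%) are assumptions based on the current ARK files; adjust to match yours):

import pandas as pd

path = "ARK_INNOVATION_ETF_ARKK_HOLDINGS.csv"  # placeholder path

# skipfooter=1 drops ARK's disclaimer row at the bottom of the file
df = pd.read_csv(path, skipfooter=1, engine="python")

# Drop rows without a ticker (e.g. cash positions) that break dict-based loops
df = df.dropna(subset=["ticker"])

# Strip thousands commas and percent signs so the columns are numeric again
df["shares"] = pd.to_numeric(df["shares"].astype(str).str.replace(",", ""))
df["weight (%)"] = pd.to_numeric(df["weight (%)"].astype(str).str.replace("%", ""))

# Re-save so the psycopg2 insert code from the video can ingest the cleaned file
df.to_csv(path, index=False)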
@AnonymousSkimaHarvey 3 years ago
Twitter up 40% since Cathie bought it
@parttimelarry 3 years ago
🚀🚀🚀 $70
@silentrobcanada 3 years ago
Man, I wish Sequel Ace worked with Postgres.
@PB_Chill 3 years ago
With this and the tradekit, I am set. The tradekit is not too big and is very fast. I ran UPDATE stocks SET is_etf = TRUE WHERE name LIKE '%ETF%'. I could not find a non-ETF stock with ETF in the name, or an ETF without it. A few assets like commodities are trusts or funds. I will try Alpaca, then Yahoo.
@MrClueso102 3 years ago
Is it possible to create a REST API on top of the Postgres/TimescaleDB database?
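One common pattern is a thin Flask layer over psycopg2. A minimal sketch (the connection settings, holdings table, and column names are placeholders modeled on the video's schema, not its actual code):

from flask import Flask, jsonify
import psycopg2

app = Flask(__name__)

@app.route("/holdings/<ticker>")
def holdings(ticker):
    # Placeholder connection settings for the local TimescaleDB instance
    conn = psycopg2.connect(host="localhost", dbname="etf_db", user="postgres", password="secret")
    cur = conn.cursor()
    # %s placeholders let psycopg2 escape the user-supplied ticker safely
    cur.execute("SELECT company, shares, weight FROM holdings WHERE ticker = %s", (ticker,))
    rows = cur.fetchall()
    conn.close()
    # cast numerics to float so they serialize cleanly to JSON
    return jsonify([{"company": r[0], "shares": float(r[1]), "weight": float(r[2])} for r in rows])

if __name__ == "__main__":
    app.run()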
@SolidBuildersInc 3 years ago
Kudos... Have you heard of BitsGap? I would be interested in the Grid strategy they use with the Trailing Up, done in Python, Flask, Postgres and Dash... It appears to be the most accurate ninja analysis on the market... It's only crypto, and we could do Forex and others as well... Really the best risk-management strategy that I can dance with... Your thoughts? P.S. Are those lick machines doing anything with a band, or are you just freestyling some vibes? Let's hear it, bro, come on!!!!
@hermannandarusdy9277 3 years ago
Wonderful idea... why don't you just scrape it, though?
@parttimelarry 3 years ago
Since Reddit provides an official API with structured responses, it is much better to use it IMO. Scraping is more error-prone, since the underlying markup of the web page can easily change. Also, as mentioned in the video, the library provides generators to allow for easy pagination and exponential backoff in case of throttling. I only use scraping when there is no official API; for instance, I use BeautifulSoup in my video on the CNN Fear and Greed Index.
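For reference, a minimal PRAW sketch of that (the credentials are placeholders you create at reddit.com/prefs/apps):

import praw

# Placeholder credentials from your own Reddit app registration
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="wsb-scanner by u/yourusername",
)

# .new() returns a generator that paginates for you, and PRAW backs off
# automatically when Reddit throttles the client
for submission in reddit.subreddit("wallstreetbets").new(limit=100):
    print(submission.title, submission.score)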
@yourpersonaldatadealer2239 3 years ago
Found a bug in the code I sent: the aiofile read needs to be awaited, and it wasn't before (working code below). BTW, I absolutely cannot connect to the dockerized TimescaleDB instance, but I think it's something to do with being on Windows; I have tried everything on Stack Overflow (changing the IP, etc.) with no luck. I will have to move to my Linux machine at some point to get all this working with Docker, but for now it works when not dockerized:

@_connection_transaction
async def _create_all_tables_if_not_exist(self, _connection):
    """Creates all required tables within the database (if they do not already exist)."""
    try:
        # async_open comes from the aiofile package; its read() must be awaited
        async with async_open(os.path.join("database", "sql", "create_tables.sql"), 'r') as async_file:
            await self._execute(_connection, await async_file.read())
    except Exception:
        self.LOG.exception("Failed")
    else:
        self.LOG.info("Success")
@parttimelarry 3 years ago
Thanks so much for the code, still getting to this! I don't think I did anything special with my Docker setup, though I did have to stop my local PostgreSQL server. Will see if I can reproduce this in my Windows VM.
@apcdeeplearning 3 years ago
Twitter in the $70s now. I watched this video too late :(