I'm starting a new channel on AI at youtube.com/@parttimeai, please subscribe! Twitter: twitter.com/PartTimeLarry | Website: hackingthemarkets.com | GitHub: github.com/hackingthemarkets | Source Code: github.com/hackingthemarkets/ark-funds-tracker
@retrofutur1st · 3 years ago
Slowly I'm working through your awesome vids, great stuff!
@m1k3thirteen81 · 3 years ago
I like what you have done with your videos and setup. The lighting and video are crisp and clean. It looks great, PTL! We all appreciate the great content! Thank you!!!
@IonPerpelea · 3 years ago
Brilliant as always.
@Jon_Dang · 3 years ago
Nice vid, keep up the good work!
@x.e.b.u · 3 years ago
Thank you for the video. It is very reassuring that your mac desktop is similar to mine 😂
@ivanchan7769 · 3 years ago
The design of the stock table will cause problems if we execute the insert SQL more than once. We need to update the stock list daily because there are always newly listed companies in the market. Could we use the id from the Alpaca API as the primary key to decide whether a record is new? But I am not sure the id of a stock stays the same if its symbol or other information changes.
@parttimelarry · 3 years ago
In the full stack tutorial, I do a simple SELECT FROM stock WHERE symbol = 'NEWINCOMINGSYMBOL', and only insert if there are no existing records returned. I ran a populate_stocks script daily using a cron job. I didn't go over it again in this tutorial since I felt like I would be re-explaining the same thing to people who just watched the other series, and I wanted to get to the PostgreSQL/Timescale-specific stuff.
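For readers who skipped the full stack series, here is a minimal sketch of that kind of existence check. The stock table columns and the config names follow the earlier videos but should be treated as assumptions; this is illustrative, not the exact code from the repo.

import config  # assumed to hold DB_HOST, DB_NAME, DB_USER, DB_PASS, API keys
import psycopg2
import alpaca_trade_api as tradeapi

connection = psycopg2.connect(host=config.DB_HOST, database=config.DB_NAME,
                              user=config.DB_USER, password=config.DB_PASS)
cursor = connection.cursor()

api = tradeapi.REST(config.API_KEY, config.API_SECRET, base_url=config.API_URL)
assets = api.list_assets()

for asset in assets:
    # only insert symbols that are not already in the stock table
    cursor.execute("SELECT id FROM stock WHERE symbol = %s", (asset.symbol,))
    if cursor.fetchone() is None:
        cursor.execute(
            "INSERT INTO stock (symbol, name, exchange) VALUES (%s, %s, %s)",
            (asset.symbol, asset.name, asset.exchange)
        )

connection.commit()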
@ivanchan7769 · 3 years ago
@@parttimelarry I get it. But take Vivendi as an example: Vivendi used "V" as its stock symbol, and now Visa uses "V". If Vivendi's "V" were already in our table, the incoming "V" for Visa would not be inserted. As you mentioned in a previous video, a stock symbol can be changed or reused by another company, so checking for the symbol's existence in the stock table cannot avoid this problem.
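One way to sidestep the symbol-reuse issue, continuing the sketch above, would be to store Alpaca's asset id in its own UNIQUE column and upsert on that instead of checking the symbol. The alpaca_id column is an assumption and not part of the schema from the videos, and if your schema also makes symbol UNIQUE, that constraint would need revisiting too.

# assumes the column was added once with:
#   ALTER TABLE stock ADD COLUMN alpaca_id TEXT UNIQUE;
for asset in assets:
    cursor.execute("""
        INSERT INTO stock (alpaca_id, symbol, name, exchange)
        VALUES (%s, %s, %s, %s)
        ON CONFLICT (alpaca_id)
        DO UPDATE SET symbol = EXCLUDED.symbol, name = EXCLUDED.name
    """, (asset.id, asset.symbol, asset.name, asset.exchange))
connection.commit()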
@redrum4486 · 3 years ago
I'm using my paper trading API key and secret but I still get 'alpaca_trade_api.rest.APIError: request is not authorized'. Any ideas?
@skeye4972 · 3 years ago
Hello Part Time Larry, great content! But I would like to ask: do you have any recommendation other than Alpaca, since Alpaca is only for US citizens? Do you know of any EU equivalent?
@parttimelarry · 3 years ago
I think most people use Interactive Brokers (though it is a bit harder to use)
@TransformationApplied · 3 years ago
Alpaca carries "ETF" in the asset name, so for the "is_etf" column value I used: True if " ETF" in asset.name else False. Not bulletproof, but it helps assign is_etf values while populating.
@parttimelarry · 3 years ago
Nice! This is a good idea to save some time. Will try that method out and see what edge cases there are.
@TransformationApplied · 3 years ago
@@parttimelarry Then I had to change the query you use in the CSV processing video to: cursor.execute("select symbol, name from stock where is_etf = true and (name like '%ARK% ETF%' or name like 'The 3D Printing%')"). I guess this is the downside of assigning is_etf for every asset when you are only focusing on ARK.
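For reference, a short sketch of how that heuristic could slot into the populate loop — same assumed connection, cursor, and assets list as the earlier sketch, and the heuristic is knowingly imperfect:

for asset in assets:
    is_etf = " ETF" in asset.name  # heuristic from the comment above, not bulletproof
    cursor.execute(
        "INSERT INTO stock (symbol, name, exchange, is_etf) VALUES (%s, %s, %s, %s)",
        (asset.symbol, asset.name, asset.exchange, is_etf)
    )
connection.commit()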
@petteri5439 · 3 years ago
Is there any alternative service to Alpaca? I am from Europe and Alpaca is only for US citizens :(( Please help!
@yourpersonaldatadealer2239 · 3 years ago
When creating hypertables through a script, I came across a problem a few weeks ago when they already exist, so I started using this if anyone gets stuck:

-- Indices
CREATE INDEX IF NOT EXISTS stock_id_symbol_idx ON stock (id, symbol);
CREATE INDEX IF NOT EXISTS etf_holding_etf_id_date_idx ON etf_holding (etf_id, date DESC);
CREATE INDEX IF NOT EXISTS stock_price_stock_id_date_idx ON stock_price (stock_id, date DESC);

-- TimescaleDB
CREATE EXTENSION IF NOT EXISTS timescaledb CASCADE;

SELECT create_hypertable(
    'stock_price', 'date',
    if_not_exists => TRUE,
    chunk_time_interval => INTERVAL '1 days'
);

SELECT create_hypertable(
    'etf_holding', 'date',
    if_not_exists => TRUE,
    chunk_time_interval => INTERVAL '1 days'
);
@yourpersonaldatadealer2239 · 3 years ago
If you read this Larry, it would be good to do a vid on how you get a connection from your .py file through to the Docker container instance of Postgres... I can't seem to do it unless the DB is not in a container. Cheers dude
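For anyone stuck on the same thing, one common setup — the container name, image tag, password, and port below are assumptions, not details from the video — is to publish the container's Postgres port to the host and then connect from the .py file via localhost. A minimal sketch:

# container started with the port published to the host, e.g.:
#   docker run -d --name timescaledb -p 5432:5432 \
#       -e POSTGRES_PASSWORD=password timescale/timescaledb:latest-pg12
import psycopg2

connection = psycopg2.connect(
    host="localhost",   # reachable from the host because -p 5432:5432 publishes the port
    port=5432,
    user="postgres",
    password="password",
    dbname="postgres"
)
print(connection.get_dsn_parameters())

If the Python script itself also runs in a container, the host would instead be the database container's name on a shared Docker network rather than localhost.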
@parttimelarry · 3 years ago
Thanks for the tips here. You are already ahead of me on Timescale here, but I will be catching up as this series progresses. Also, some developer advocates from TimescaleDB saw the first video and liked it, so I think they will be stopping by. I bet they will be willing to provide some additional insight and best practices that will be helpful. Definitely point out anything you have already run into and share code if you are willing. I am going to take a stab at using asyncpg and aiohttp like you suggested and I know you have spent some time optimizing the speed here, so any code suggestions would be awesome.
@yourpersonaldatadealer2239 · 3 years ago
@@parttimelarry Sure dude, we all owe you big time anyway. Best coding channel on YT right now and I suspect you'll blow up in the future. For the asyncpg I created this Database class (I've cut out some functions and just given you the basics to get started). It creates a connection pool (it doesn't work with Docker, but I suspect that is down to me not knowing how to get the port correctly, as it'll no longer be localhost if it's contained, right? Hoping you can help in the Docker realm), which allows you to have multiple async connections at once in your asyncio.gather(*tasks) (a function in another file). The _connection_transaction wrapper I wrote can be applied to other functions (like _execute) through the use of a decorator. Note that the fastest way to write/read data with asyncpg is by using the copy_to_table or copy_from_table functions that the lib provides.

import asyncpg
from aiofile import async_open
from pandas import DataFrame
from config import DB_LOGGING_REF


class Database:
    """ Asynchronous TimescaleDB (Postgresql extension) class """

    def __init__(self, DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_NAME, LOG):
        self._database_host = DB_HOST
        self._database_username = DB_USERNAME
        self._database_password = DB_PASSWORD
        self._database_port = DB_PORT
        self._database_name = DB_NAME
        self.LOG = LOG
        self._connection_pool = None

    async def init_async(self):
        """ Asynchronous function initializations for this class """
        await self._create_connection_pool()
        await self._create_all_tables_if_not_exist()

    async def _create_connection_pool(self):
        """ Creates a pool of database connections for asynchronous usage """
        if self._connection_pool is None:
            try:
                self._connection_pool = await asyncpg.create_pool(
                    host=self._database_host,
                    port=self._database_port,
                    user=self._database_username,
                    password=self._database_password,
                    database=self._database_name,
                    command_timeout=300
                )
            except Exception:
                self.LOG.exception("Failed")
            else:
                self.LOG.info(f"Success: {DB_LOGGING_REF}")

    async def _destroy(self):
        """ Class destruction process """
        if self._connection_pool is not None:
            await self._connection_pool.close()

    def _connection_transaction(function):
        """ Wraps function with pool connection and transaction block (used as a decorator within this class body) """
        async def _connection_transaction_wrapper(self, *args):
            try:
                async with self._connection_pool.acquire() as _connection:
                    async with _connection.transaction():
                        # Database interaction function that we wrap with a connection and transaction
                        return await function(self, _connection, *args)
            except Exception:
                self.LOG.exception('Connection pool or transaction error: ')
        return _connection_transaction_wrapper

    async def _execute(self, _connection, sql_statement):
        """ Executes an sql_statement command (or commands) """
        try:
            await _connection.execute(sql_statement)
        except Exception:
            self.LOG.exception(f"{sql_statement} ")

    @_connection_transaction
    async def _create_all_tables_if_not_exist(self, _connection):
        """ Creates all required tables within the database (if they do not already exist) """
        try:
            async with async_open("/sql/create_tables.sql", 'r') as async_file:
                # aiofile's read() is a coroutine, so it needs its own await
                await self._execute(_connection, await async_file.read())
        except Exception:
            self.LOG.exception("Failed")
        else:
            self.LOG.info("Success")
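Since the comment above points at asyncpg's copy functions, here is a minimal sketch of bulk-loading price rows through that pool with copy_records_to_table. The table and column names mirror the series' stock_price table and are assumptions here; the records themselves are placeholders.

async def insert_stock_prices(pool, records):
    """ records: iterable of (stock_id, date, open, high, low, close, volume) tuples """
    async with pool.acquire() as connection:
        async with connection.transaction():
            # COPY is much faster than per-row INSERT statements for bulk loads
            await connection.copy_records_to_table(
                'stock_price',
                records=records,
                columns=['stock_id', 'date', 'open', 'high', 'low', 'close', 'volume']
            )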
@yourpersonaldatadealer2239 · 3 years ago
To create the Database class you'll need a helper function to init the async methods. I wrote this one, which goes in the same directory as the Database class:

from database.Database import Database
from config import DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_NAME


async def create_database(LOG):
    """ Creates an instance of the Database class and initializes its asynchronous functionality """
    database = Database(DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_NAME, LOG)
    await database.init_async()
    return database
@yourpersonaldatadealer2239 · 3 years ago
And then this file would essentially be your main:

import aiohttp, asyncio, logging.config

from database.async_creators import create_database
from scrapers.async_creators import create_ark_invest_scraper
from http_client.HttpClient import HttpClient
from logs.log_config import LOGGING_CONFIG
from config import APP_NAME


async def _init(client_session):
    """ Initializes this script by providing logging functionality and all required resources """
    # Load the logging configuration
    logging.config.dictConfig(LOGGING_CONFIG)
    LOG = logging.getLogger("get_data")
    LOG.info(f"{APP_NAME} - started")

    # Initialise client and database singletons
    http_client = HttpClient(client_session, LOG)
    database = await create_database(LOG)

    return http_client, database, LOG


async def _destroy(database, LOG):
    """ Tear-down script, used to destroy objects and clean up resources before exiting """
    await database._destroy()
    LOG.info(f"{APP_NAME} - exited")


async def main():
    """ Sets up the database connection, a singleton HTTP-client session and logging functionality
        before obtaining and locally (or cloud, if configured) storing the required historical data """
    async with aiohttp.ClientSession() as client_session:
        # Initializations
        http_client, database, LOG = await _init(client_session)

        # ark_invest_scraper = await create_ark_invest_scraper(http_client, database, LOG)
        # stocks = await create_stocks(database, http_client, LOG)

        await _destroy(database, LOG)


# Entry point
if __name__ == "__main__":
    asyncio.get_event_loop().run_until_complete(main())
@austinbyersking · 3 years ago
For some reason I get ModuleNotFoundError: No module named 'alpaca_trade_api'. However, if I run pip list in the terminal, it shows that I have alpaca_trade_api installed.
@phananhacs · 3 years ago
Check whether the Python that your VS Code is using is the same Python that has alpaca_trade_api installed. Likely they're not.
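A quick way to confirm which interpreter is actually running the script, and therefore which site-packages it sees:

import sys

print(sys.executable)  # the Python binary VS Code / the terminal is launching
print(sys.path)        # where that interpreter looks for installed packages
# If alpaca_trade_api is missing for that interpreter, install into it explicitly:
#   <path printed above> -m pip install alpaca-trade-api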
@coreybonnette5787 · 3 years ago
How do I properly import config.py if it is in another folder called config?
@012345678952752 · 3 years ago
from folder import config
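In other words, with a layout roughly like the one below (the folder needs an __init__.py to be importable as a package; the file and variable names here are only examples):

# project/
# ├── config/
# │   ├── __init__.py      # can be empty
# │   └── config.py        # API_KEY, DB settings, ...
# └── populate_stocks.py

# inside populate_stocks.py:
from config import config

api_key = config.API_KEY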
@ajax0116 · 3 years ago
Has anyone else figured out how to install psycopg2 on macOS 11.2.1? I've also tried it with Homebrew per Stack Overflow recommendations.
@ajax0116 · 3 years ago
All - so I used the command pip install psycopg2-binary. This worked, even though the PyCharm IDE gave me an error. I ignored the error and moved forward.
@jatindhamecha2725 · 3 years ago
Hey Larry, thanks a lot for making these videos. However, there's something you might need to address: I went to Alpaca and they are not available to non-US citizens (invite-only beta). So I went to Alpha Vantage and had some trouble working it out; now I'm stuck with an error stating "module 'alpha_vantage' has no attribute 'REST'". I'd also like to mention I have almost no experience in coding at all, just following your videos and googling when I'm stuck. Any help is greatly appreciated! EDIT 1: experienced the same when using Alpha Vantage via RapidAPI.
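For what it's worth, that error usually means the Alpaca code was left as-is after swapping the import: the alpha_vantage package does not expose a REST class the way alpaca_trade_api does, so api = alpha_vantage.REST(...) fails. A rough sketch of how that library is typically used instead (the key and symbol are placeholders):

from alpha_vantage.timeseries import TimeSeries

ts = TimeSeries(key='YOUR_ALPHA_VANTAGE_KEY', output_format='pandas')
data, meta_data = ts.get_daily(symbol='TSLA', outputsize='compact')
print(data.head())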
@brianrowe1152 · 3 years ago
Hi, and help :) I'm trying to get Ubuntu 20.04 with Python 3 from Anaconda set up with everything installed, but can't get there. psycopg2 will only install via conda, and the alpaca_trade_api package from PyPI fails, so if I pip install the requirements.txt it fails. Has anyone gotten all of these packages to install on Ubuntu 20.04? Thanks
@kylerpettitt4013 · 3 years ago
This should work. Run the following commands in a terminal:

sudo apt-get install libpq-dev
sudo apt-get install python3-psycopg2
@ajax0116 · 3 years ago
Larry, have you thought about charging for your courses? This would help with hiring a TA.