Is there an easy way to add whether the card is first edition?
@josephsatow4822 • 2 days ago
Great explanation and visuals, thank you!
@barnestim • 1 month ago
I didn't manage to get this working using the AZURE_STORAGE_ACCESS_KEY method alone, but did get it working using 2 environment variables, AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_ACCESS_KEY, with their relevant values. Also, a shapefile didn't work, but a COG raster did. Just to re-highlight: adding/modifying the environment variables requires restarting QGIS. I found that some older values I had tried were being retained if I didn't close all instances of QGIS. The best way to check is to examine the 'Current environment variables (read-only - .....)' section and confirm the AZURE_.... values are as expected.
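A quick way to verify those values is from the QGIS Python console; a minimal sketch, assuming the two variable names barnestim lists above:

import os
# print what QGIS actually sees; stale values mean QGIS still needs a full restart
for name in ("AZURE_STORAGE_ACCOUNT", "AZURE_STORAGE_ACCESS_KEY"):
    print(name, "=", os.environ.get(name, "<not set>"))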
@kiddjami25 • 2 months ago
Thank you for this video. I hope you can assist me with a problem I am having. After I refresh the data, under Queries & Connections, it shows I have errors. I do not know how to check these or fix them. Can you help? Thanks!
@dakotakristenthornton4614 • 2 months ago
Love the video! I'm trying to do this for my MTG collection. How would you go about inputting cards when PriceCharting doesn't have the number available in the URL of the specific card? An example would be the plain-jane Consuming Blob.
@pavanpenugonda4924 • 2 months ago
Simple, straightforward video, really helpful
@sammyvr94 • 2 months ago
When we do this, it appears we need to use a security integration of type SAML instead of the account parameter
@LimitedWard • 3 months ago
I'm lost. Why wouldn't the maintainers of pip simply apply the same optimizations as uv, then, if it doesn't actually need to download all that extra content?
@dotpi5907 • 3 months ago
Good question. I think pip is still considered the more stable package manager; I imagine pip will adopt some things from uv pip once uv reaches version 1.0.0
@rogercarlson2319 • 3 months ago
My issue is I want to link one of those tables that are duplicated in separate databases (as you showed in the beginning). This produces an error in Access: "Cannot define field more than once." I suspect the quick and dirty solution is to create a Snowflake role that can only see that database; thus it will appear only once. The cleaner solution is to link the table programmatically. This will require building an ODBC connection string. I'm looking into that now. If I get a chance, I'll post a solution.
@dotpi5907 • 3 months ago
Thanks! It would be great to hear what you find
@mak448a • 3 months ago
It’s way faster when making venvs!
@dotpi5907 • 3 months ago
That's good to know!
@RoamingAdhocrat • 3 months ago
ooh. I support a SaaS app and _hate_ Azure Storage Explorer with a burning passion. If I can access logs etc. from Python instead of ASE, that would be a very happy rabbit hole to go down. I suspect I don't have access to those keys, though
@RoamingAdhocrat • 3 months ago
How can it be faster? It's three extra keystrokes
@thecowmilk4857 • 3 months ago
Rust is the new programming contagion
@dotpi5907 • 3 months ago
The Rustaceans are sweeping the nations
@muhammadraza5108 • 3 months ago
Hi, a very good tutorial. I am using SQL Server 2019, where the ORDER BY command is not allowed inside a CTE. Please suggest an alternative. Also, tell me which environment you are working in. Regards
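One common workaround, sketched in T-SQL (table and column names are placeholders, not from the video): ORDER BY is not allowed inside the CTE definition itself, so compute a ROW_NUMBER inside the CTE and sort in the outer query:

WITH cte AS (
    SELECT col_a, col_b,
           ROW_NUMBER() OVER (ORDER BY col_a) AS rn  --the ordering moves into a window function
    FROM my_table
)
SELECT col_a, col_b
FROM cte
ORDER BY rn;  --ORDER BY is legal here, in the outermost SELECT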
@ZuperPotato • 3 months ago
You're so underrated
@dotpi5907 • 3 months ago
Thanks! The support means a lot
@clockblower6414 • 3 months ago
Hate using pip because of venvs. At least pipx exists when it's just Python programs
@TheStanglehold • 3 months ago
Pray tell, why does venv make pip difficult?
@clockblower6414 • 3 months ago
It's not difficult, Stangie, it's just extra steps
@EwaneGigga • 3 months ago
Thanks a lot for this very clear video. I spent hours trying to do this until I luckily stumbled across your video. I agree that this video should definitely have more views!!
@dotpi5907 • 3 months ago
I'm glad it helped. Thanks for the support!
@adasalda • 4 months ago
Hey, do you know by any chance how to connect to Microsoft Azure Data Lake Storage Gen2? I can't do it the same way as you are doing it. It needs the vsiadls file system handler (your method uses vsiaz), but I don't know how to set it :(
@dotpi5907 • 3 months ago
Hi, thanks for the comment. I can have a go this weekend; I don't have QGIS installed on my computer at the moment. My only thought would be to set the Type as HTTP/HTTPS/FTP in the Data Source Manager window and pass in a URL to your files, rather than selecting Microsoft Azure Blob. It looks like the authentication for vsiadls and vsiaz should be the same, so I don't imagine there would be any change needed to the environment variable. Let me know if you figure it out.
@barnestim • 1 month ago
There's a provider/Protocol Type now in the 'Add layer' dialogue: 'Microsoft Azure Data Lake Storage', try that?
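For anyone scripting this instead, a minimal sketch with GDAL's Python bindings and the /vsiadls/ handler; per the GDAL docs the account/key config options are the same ones /vsiaz/ uses, and the account, filesystem, and file names below are placeholders:

from osgeo import gdal

# same credentials GDAL uses for /vsiaz/
gdal.SetConfigOption("AZURE_STORAGE_ACCOUNT", "mystorageaccount")
gdal.SetConfigOption("AZURE_STORAGE_ACCESS_KEY", "my-access-key")

# open a raster sitting in an ADLS Gen2 filesystem
ds = gdal.Open("/vsiadls/my-filesystem/folder/raster.tif")
print(ds.RasterXSize, ds.RasterYSize)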
@vardhanreddy435 • 4 months ago
Hi, thanks for your info. I want to display column headings without single quotes before and after in the output. You displayed 'BA', 'AB', 'CA', and I aim to dynamically achieve BA, AB, CA. Can you explain how to achieve that?
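A hedged sketch of one way to get plain headings in Snowflake, assuming a PIVOT like the video's (table and column names are placeholders): give the pivot an alias with a column list, which renames the quoted 'BA'-style columns to bare identifiers:

select *
from my_table
    pivot (sum(VALUE) for CATEGORY in ('BA', 'AB', 'CA'))
    as p ("INDEX", BA, AB, CA);  --output columns are now INDEX, BA, AB, CA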
@latebloomer1020 • 4 months ago
This is not working these days. Don't waste your time.
@dotpi5907 • 4 months ago
Hi, which part doesn't work for you, and what error message do you get?
@TaylorDezio • 4 months ago
How do you add ungraded, along with all the graded prices?
@TravelChroniclesHD • 4 months ago
Could you put the code somewhere?
@dotpi5907 • 4 months ago
Thanks for the reminder! I remembered I had trouble adding the code to the description because you can't add < and > signs in a video description. I'll paste the code in this comment until I figure it out.
@dotpi5907 • 4 months ago
--create a FIFO schema
create schema EXAMPLE_DB.FIFO;

--create a BUY table
create or replace table EXAMPLE_DB.FIFO.BUY ("INDEX" integer, QUANTITY float, PRICE float);

--create a SELL table
create or replace table EXAMPLE_DB.FIFO.SELL ("INDEX" integer, QUANTITY float);

--add values to the initial tables
insert into EXAMPLE_DB.FIFO.BUY ("INDEX", QUANTITY, PRICE)
values (1, 200, 10), (2, 150, 12), (3, 225, 16);

insert into EXAMPLE_DB.FIFO.SELL ("INDEX", QUANTITY)
values (1, 100), (2, 300);

CREATE OR REPLACE VIEW INVENTORY ("INDEX", QUANTITY, PRICE, TOTAL_COST) as
--get the total number of units sold
WITH total_units_sold AS (
    SELECT SUM(QUANTITY) TOTAL_UNITS_SOLD
    FROM EXAMPLE_DB.FIFO.SELL
),
--cumulative_sum: the BUY table with a running total of quantity bought
cumulative_sum AS (
    SELECT *,
        SUM(QUANTITY) OVER (PARTITION BY NULL ORDER BY "INDEX") CUMULATIVE_SUM
    FROM EXAMPLE_DB.FIFO.BUY
    --do a cross join to add TOTAL_UNITS_SOLD as a new column
    cross join total_units_sold
),
updated_quantity AS (
    SELECT *,
        CUMULATIVE_SUM - TOTAL_UNITS_SOLD BUY_SELL_DIFFERENCE,
        --whatever is left over from the buy-sell difference needs to be subtracted
        IFF(BUY_SELL_DIFFERENCE > 0 AND LAG(BUY_SELL_DIFFERENCE) OVER (PARTITION BY NULL ORDER BY "INDEX") < 0, BUY_SELL_DIFFERENCE, QUANTITY) UPDATED_QUANTITY
    FROM cumulative_sum
)
SELECT "INDEX", UPDATED_QUANTITY QUANTITY, PRICE, UPDATED_QUANTITY*PRICE TOTAL_COST
FROM updated_quantity
--purchases that get completely sold out will have a negative buy-sell difference, so they are removed from the inventory
WHERE BUY_SELL_DIFFERENCE >= 0;

--cost of goods sold
CREATE OR REPLACE VIEW cogs ("INDEX", cogs) as
WITH cumulative_units_sold AS (
    SELECT *,
        SUM(QUANTITY) OVER (PARTITION BY NULL ORDER BY "INDEX") CUMULATIVE_SELL
    FROM EXAMPLE_DB.FIFO.SELL
),
--work out the total amount of stock sold as of the previous sale
previous_sales AS (
    SELECT SELL.INDEX SELL_INDEX, SELL.QUANTITY SELL_QUANTITY,
        BUY.INDEX BUY_INDEX, BUY.QUANTITY BUY_QUANTITY, BUY.PRICE BUY_PRICE,
        CUMULATIVE_SELL - SELL_QUANTITY SOLD_PREVIOUSLY,
        --deal with the previously sold amount first, then subtract the remainder
        SUM(BUY_QUANTITY) OVER (PARTITION BY SELL_INDEX ORDER BY BUY_INDEX) CUMULATIVE_BUY,
        --how much is left in the stock
        CUMULATIVE_BUY - SOLD_PREVIOUSLY BUY_SELL_DIFFERENCE
    FROM cumulative_units_sold SELL
    --pair every sale with every purchase (the original FULL OUTER JOIN had no ON clause; a cross join gives the intended pairing)
    cross join EXAMPLE_DB.FIFO.BUY BUY
    ORDER BY SELL_INDEX, BUY_INDEX
),
--update the stocks to what we would have after the previous sale
--rows with negative (or 0) buy-sell differences are removed in the next table
updated_inv_quantity AS (
    SELECT SELL_INDEX, SELL_QUANTITY, BUY_INDEX, BUY_QUANTITY, BUY_PRICE, BUY_SELL_DIFFERENCE,
        --remaining_inv_rank will be 1 for any partially remaining stock, which will come after some of the previous or same rows have used up stock
        rank() over (partition by sell_index order by iff(buy_sell_difference <= 0, NULL, buy_index)) remaining_inv_rank,
        --make an adjustment for any partially depleted stock; keep non-partially depleted stock at the number that was bought
        iff(remaining_inv_rank = 1, BUY_SELL_DIFFERENCE, BUY_QUANTITY) updated_inv --updated holdings after the previous sale
    FROM previous_sales
    ORDER BY SELL_INDEX, BUY_INDEX
),
cogs_whole AS (
    SELECT SELL_INDEX, SELL_QUANTITY, BUY_INDEX, BUY_PRICE, updated_inv,
        SUM(updated_inv) OVER (PARTITION BY SELL_INDEX ORDER BY BUY_INDEX) CUMULATIVE_inv,
        CUMULATIVE_inv - SELL_QUANTITY UPDATED_BUY_SELL_DIFFERENCE,
        --if a buy record has been completely depleted, that is counted as a whole cogs
        IFF(UPDATED_BUY_SELL_DIFFERENCE <= 0, UPDATED_inv*BUY_PRICE, 0) COGS_WHOLE
    FROM updated_inv_quantity
    WHERE BUY_SELL_DIFFERENCE > 0
    ORDER BY SELL_INDEX, BUY_INDEX
),
cogs_part as (
    select *,
        rank() over (partition by sell_index order by iff(cogs_whole = 0, buy_index, NULL)) cogs_part_rank,
        --the first not-fully-depleted buy record contributes the partial remainder of the sale
        IFF(cogs_part_rank = 1, (UPDATED_inv - UPDATED_BUY_SELL_DIFFERENCE)*BUY_PRICE, 0) cogs_part,
        cogs_whole + cogs_part cogs_all
    from cogs_whole
    ORDER BY SELL_INDEX, BUY_INDEX
)
select sell_index, sum(cogs_all) cogs
from cogs_part
group by sell_index;
@TravelChroniclesHD • 4 months ago
Love it!
@dotpi5907 • 4 months ago
Glad it helped!
@kevinduffy2428 • 5 months ago
What if you did not want to bring the files down to the local machine? How would you process the files up on Azure, and run the Python code on Azure? For instance, the files were placed in blob storage and now you want to process them, clean them up, and then save the results out to blob storage. The Python code is not complicated; just what are the pieces/configuration up on Azure?
@busydoingnothing2677 • 2 months ago
You'd probably need a VM
@ruigerd • 1 month ago
I'm trying to do this in Azure Data Factory; a custom activity in the Data Factory can execute Python files that are saved on a blob. You will need an Azure Batch account with a pool that has Python installed on it (pools are based on VMs, and some VMs have Python pre-installed). Another way could be Azure Function Apps, but I have not tried that enough
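For the blob-in, blob-out part of the question, a minimal sketch with the azure-storage-blob package (container, blob, and environment-variable names are placeholders; the script still needs somewhere to run, such as the Batch pool or Function App mentioned above):

import os
from azure.storage.blob import BlobServiceClient

# authenticate with a connection string, as in the video (env var name is an assumption)
service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])

# read the raw file straight from blob storage, no local copy
raw = service.get_blob_client("input-container", "data.csv").download_blob().readall()

# clean it up in memory (placeholder transformation)
cleaned = raw.decode("utf-8").upper().encode("utf-8")

# write the result back to another container
service.get_blob_client("output-container", "data_clean.csv").upload_blob(cleaned, overwrite=True)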
@hipotenus6154 • 5 months ago
Can I create different layouts and size them in a specific area, without them stepping outside of the limits I gave them?
@10052941 • 5 months ago
Nice video! I completed what you did, but I want to make an addition. I would like to have a column with the grade, so that Power Query gets the web info for that specific grade, i.e. to have the "Ungraded" between square brackets replaced with a value of a column in the source file:
= Table.AddColumn(#"Added Custom", "Value", each Web.Page(Web.Contents([urls]))[Data]{0}[Ungraded]{0})
But after trying, I'm not getting the desired results, only errors that the specific column can't be found in the table of the URL (understandable). Do you have an idea how to fix this?
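A hedged tweak that may do it, assuming the source file has a Grade column whose values exactly match the column names on the PriceCharting page: Record.Field looks a field up by name at runtime instead of hard-coding [Ungraded]:

= Table.AddColumn(#"Added Custom", "Value", each Record.Field(Web.Page(Web.Contents([urls]))[Data]{0}{0}, [Grade]))

Here [Data]{0}{0} takes the first row of the first table as a record, and Record.Field pulls out whichever grade that row of the source file names.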
@gilissantos6532 • 6 months ago
I'm having connection problems, but this video is solving them, thank you very much!
@MarcGKiwi • 6 months ago
Hi, can you write to Blob storage from ArcGIS Pro using these connections?
@cargouvu • 6 months ago
How do you remove the single quotes on the column headers after the pivot? Also, I am unable to retrieve them in another SELECT statement once they have the single quotes.
@georgetong • 6 months ago
Just saying thanks for this. The online docs for how to authenticate with cloud providers are absolutely trash, and this was the only one that showed you need to go to Settings to add env vars. Helped me figure out how to authenticate with GCS
@dotpi5907 • 3 months ago
Thanks! Yes, this took me ages to figure out, so I had to share
@dhruvajmeri8677 • 7 months ago
Thank you for this video! It saved me time
@k2line706 • 7 months ago
Fantastic video. Very clear explanation and clean code for us to follow. Thank you!
@ryansylvester1993 • 7 months ago
Dude, thank you so much!!!
@beemac79 • 7 months ago
What about Snowflake roles? How do you assign them to a user?
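A minimal sketch of the usual pattern in Snowflake SQL (the role, user, and object names are placeholders, not from the video):

--create a role and grant it privileges
create role analyst;
grant usage on database EXAMPLE_DB to role analyst;
grant usage on schema EXAMPLE_DB.FIFO to role analyst;
grant select on all tables in schema EXAMPLE_DB.FIFO to role analyst;

--then assign the role to a user
grant role analyst to user some_user;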
@paulopraca6197 • 7 months ago
Thank you for this video! It's exactly what I'm looking for. Thanks again!
@VikneshKoodalingam-qu2bl • 8 months ago
It is a very detailed and clear explanation! Great work :)
@dashathomas9395 • 8 months ago
What happens to the connection when I share the workbook with colleagues? Will it just not refresh, or does the data revert to errors of some sort? (For users who don't have a Snowflake driver or creds.)
@b3llydrum • 8 months ago
Notice the decorator function is *returning a function*, not simply calling it. That's a key part to understand.
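A minimal illustration of that point, using a hypothetical log_calls decorator (not one from the video):

import functools

def log_calls(func):
    @functools.wraps(func)  # keep the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper  # the wrapper is returned; func is not called here

@log_calls
def add(a, b):
    return a + b

print(add(2, 3))  # prints "calling add", then 5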
@Kay-qg5td • 8 months ago
I need to establish a direct connection; I do not want to export data to Excel, as my data tables are larger in size. How do I achieve that?
@saltrocklamp199 • 9 months ago
I had to put "dsn=my-db-name" in the "Database" field, instead of just "my-db-name". Not sure why.
@remorabay • 9 months ago
I have a Python script that reads an EDI file and, from there, creates unique data tags and elements (basically a CSV file with one tag and data field per line). I need a process to load this into Azure and, for the outbound, to extract into the same tags+data. This looks close. Anyone interested in giving me a quote for this (and can you show it working)? Thanks.
@o.chatri • 9 months ago
Thanks a lot for this! It's working for me.
@luismedina8746 • 9 months ago
My database already has 2 columns: Lat and Long. Can I just call these two columns straight into GIS Pro, and the software will identify them? Or do I have to convert them using the same format as you?
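If the table already has Lat/Long columns, the usual route in ArcGIS Pro is the XY Table To Point tool, which reads them directly; a minimal arcpy sketch (paths and field names are placeholders):

import arcpy

# build a point feature class from a table with coordinate columns
arcpy.management.XYTableToPoint(
    r"C:\data\sites.csv",           # input table
    r"C:\data\demo.gdb\sites_pts",  # output feature class
    x_field="Long",                 # longitude is the X field
    y_field="Lat",                  # latitude is the Y field
    coordinate_system=arcpy.SpatialReference(4326),  # WGS84
)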
@Justice429 • 9 months ago
Thanks! That was very helpful
@dotpi5907 • 9 months ago
Thanks for watching!
@hebasamir7431 • 10 months ago
Thanks ☺
@dotpi5907 • 9 months ago
Thanks for watching!
@subhamchakraborty6822 • 10 months ago
Hi, how do I download that data and store it in Excel or CSV? Thanks in advance.
@CapitanFeeder • 10 months ago
Videos like yours should have way more views. Thank you for what you do.
@dotpi5907 • 9 months ago
Thanks so much! I really appreciate the support
@kartikgupta8413 • 10 months ago
Thank you for this video!
@dotpi5907 • 9 months ago
Thanks for watching!
@ericbixby • 10 months ago
Do you have any suggestions for how to then write a file in a similar fashion to the storage blob?
@MengoMyth • 2 months ago
I have the same question.
@investing3370 • 11 months ago
What happens when you have a SAS token on hand? Can it be used in place of the account key?
@dotpi5907 • 10 months ago
Hi @investing3370, try changing line 16 to:
sas_token = 'your sas token'
connect_str = 'DefaultEndpointsProtocol=https;SharedAccessSignature=' + sas_token + ';EndpointSuffix=core.windows.net'
You won't need lines 11 to 13. Let me know if that works.
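For what it's worth, a quick way to check that such a connection string is accepted; a hedged sketch assuming the azure-storage-blob package from the video and an account-level SAS that includes the List permission:

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(connect_str)
# listing containers is a cheap round-trip that confirms the SAS token works
print([c.name for c in service.list_containers()])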