Introduction OpenAI Assistant
7:40
Comments
@Steve.Goldberg 5 hours ago
Bought the guide but having trouble simply connecting my NocoDB account to n8n. I created an API token and added my Railway NocoDB URL and it says the connection is successful, but it can't pull in any of my workspaces. I read both docs and I am lost.
@derekcheungsa 5 hours ago
Hi Steve, thanks for purchasing the template. Let me reach out to you via email and get you up and running.
@Steve.Goldberg 4 hours ago
@derekcheungsa I figured out the issue. NocoDB recently got rid of workspaces completely for self-hosted instances, so it just says No Workspace. My auth worked after deleting and creating a new one.
@derekcheungsa 4 hours ago
@Steve.Goldberg Awesome! Great to hear, Steve.
@CristinaPena-b2z 1 day ago
I'm the same person. The v9 template is not available on Gumroad. Can you help me with that, please?
@derekcheungsa 1 day ago
Hi Cristina, I've sent an update via email. It should be on Gumroad now as well.
@CristinaPenaMartinez 1 day ago
The template doesn't work. n8n indicates that Data AI Agent is a custom node and it is not installed. Can you please give any information on how to fix it? Which n8n release are you using?
@derekcheungsa 1 day ago
Thanks for your feedback, Cristina. I'm on 1.72.1. I've updated the Gumroad template to be backward compatible with earlier releases of n8n. Please download the v9 template.
@CristinaPena-b2z 1 day ago
@derekcheungsa The v9 template is not available on Gumroad. Can you please help me with that?
@derekcheungsa 1 day ago
@CristinaPena-b2z I think the Gumroad server might just need a few minutes to sync. It's there now, but let me email it to you to make it a bit easier 🙂
@themorethemerrier281 1 day ago
Your delivery is up there among the best. This is perfect for me and I will build it. But I want to do it all locally. Wish me luck! 😊
@derekcheungsa 1 day ago
Thanks! Good luck 🙂
@delphiguy23 3 days ago
Hi, I'm quite interested in what you have here. I've installed n8n on my local network via Docker and am trying it out now. I even purchased your template. Question: does the AI have to be Claude, or can I change it to DeepSeek since it's cheaper? ;) Thanks
@derekcheungsa 2 days ago
Thanks for buying the template. Unfortunately, DeepSeek doesn't yet support multimodal input, so that LLM can't be used. In my evaluation of LLMs for this use case, I found Claude Sonnet to be the best. I have also seen good results with gpt-4o, which is slightly cheaper than Sonnet. I don't recommend weaker models like gpt-4o-mini, as the results are not as good.
@delphiguy23 2 days ago
@derekcheungsa Cool, thanks for the reply ;)
@jimlee7272 4 days ago
Is it possible to connect to SQL Server, fetch data, and then perform data analysis?
@derekcheungsa 4 days ago
Hi Jim, yes, absolutely. That's a good idea. There are two key things needed: the meta info for the table column names, and then a filter mechanism. NocoDB has a simple SQL-like filter that makes this easy, and a Postgres database such as Supabase has a simple filter ability as well, so this should be possible to adapt. Do you have a use case in mind?
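For readers adapting this, here is a rough sketch of the two pieces Derek mentions, written against NocoDB's REST API in TypeScript (Node 18+). The instance URL, table ID, and exact endpoint paths are assumptions; check them against your own instance's API docs.

```ts
const NOCODB_URL = 'https://your-nocodb.example.com'; // hypothetical instance URL
const TABLE_ID = 'your_table_id';                     // hypothetical table id
const headers = { 'xc-token': process.env.NOCODB_TOKEN! };

// 1) Column metadata, so the agent knows which fields it can filter on.
const meta = await fetch(`${NOCODB_URL}/api/v2/meta/tables/${TABLE_ID}`, { headers })
  .then((r) => r.json());
console.log(meta.columns?.map((c: { title: string }) => c.title));

// 2) A simple SQL-like filter instead of generating raw SQL.
const where = encodeURIComponent('(Region,eq,West)~and(Sales,gt,1000)'); // illustrative filter
const rows = await fetch(
  `${NOCODB_URL}/api/v2/tables/${TABLE_ID}/records?where=${where}&limit=100`,
  { headers },
).then((r) => r.json());
console.log(rows.list?.length, 'matching records');
```

Adapting this to SQL Server would mean swapping these two calls for an information_schema query and a parameterized WHERE clause.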
@jimlee7272 3 days ago
@derekcheungsa Ah yeah, I am a data analyst working for a company. It would be really convenient if I could interact with the database while conducting data analysis to gain insights.
@Steve.Goldberg 4 hours ago
@derekcheungsa I'm thinking of using Supabase and connecting NocoDB to it as the data source for easier management. Have you ever done this? Also, I would love to see a crash course or video going more in depth on using NocoDB for formulas, prompt templates, and prompt chains instead of putting prompts in n8n, which is a pain to manage and keep track of.
@user-uv3nv2bc6v 6 days ago
Derek, can you make this for Flowise? I guess many users prefer Flowise because of the MIT license. Chat with PDF with Flowise.
@RamonTomzer 7 days ago
Another awesome video, man. Keep it up!
@derekcheungsa 7 days ago
Thanks Ramon!
@SibghatUllahDevelopLogix 7 days ago
Can we do it using a Supabase SQL DB? And if it is efficient to do it using the Supabase vector store, please refer me to a video explaining the use of an AI agent for big data analysis with a vector DB. Thanks.
@silverheretic7327 7 days ago
Can the chart visualization and insight be sent to Telegram?
@derekcheungsa 7 days ago
Yes, a bit more work is needed to send to a Telegram node that displays the image as a photo. I'm curious what use case you are thinking of.
@silverheretic7327 7 days ago
@derekcheungsa I want to create a data analysis agent that sends its output to Telegram, both the chart as an image and the insight. Can you guide me on how that can be sent?
@derekcheungsa 7 days ago
Generally, you would add a Telegram send-text node to the data analyst agent, and in the Visualization Tool workflow add a Telegram send-photo node. There are a few tricky parts: before sending to the Telegram send-photo node, you'll need to download the image first. Additionally, you'll need to store the chat_id from the Telegram trigger in a global variable so that it can be referenced in the send-photo node in the tool.
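As a rough sketch of the chat_id step, this is what a small n8n Code node right after the Telegram Trigger could look like. It is plain JavaScript as used in Code nodes; $input and $getWorkflowStaticData are helpers n8n provides there, and the field name telegramChatId is just an example.

```ts
// Grab the chat id from the Telegram Trigger payload.
const chatId = $input.first().json.message?.chat?.id;

// Option 1: keep it in workflow static data so later nodes in this workflow can read it back.
const staticData = $getWorkflowStaticData('global');
if (chatId) staticData.telegramChatId = chatId;

// Option 2 (often simpler): pass it along on the item itself, so it can be handed to the
// Visualization Tool sub-workflow as an ordinary input field for its Telegram send-photo node.
return $input.all().map((item) => ({
  json: { ...item.json, telegramChatId: chatId },
}));
```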
@silverheretic7327 5 days ago
@derekcheungsa Thanks for the ideas. Do you have any video explaining the Telegram integration part?
@TheIdosimon 7 days ago
I bought this to modify with chat triggers and haven't gotten it to work, as I keep getting Perplexity errors, either on API authorization or on json.query.query. I have tried both Perplexity and OpenRouter to no avail. How can this be fixed?
@derekcheungsa 7 days ago
Thanks for the feedback. Happy to help get this sorted out. Can you send me a screenshot of what you are seeing via the support email? Additionally, there are now additional setup steps on the Gumroad page.
@TheIdosimon 7 days ago
@derekcheungsa Thanks for the quick reply! I was trying to find an email before I commented but must have missed it. Can you point me to where it is? I'll also look at the Gumroad page again.
@derekcheungsa 6 days ago
@TheIdosimon Just following up. Were you able to find the information you needed?
@TheIdosimon 6 days ago
@derekcheungsa Thanks for following up; I got the Perplexity stuff working. Is there a way to auto-publish with Ghost, or can it only do auto-draft and manual publishing?
@derekcheungsa 6 days ago
@TheIdosimon Yes, you can use a custom API call in n8n to publish directly without the auto-draft step. Here's how you would go about it: ghost.org/docs/admin-api/
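For context, a minimal sketch of what that custom API call could look like (TypeScript, Node 18+), assuming you have already generated an Admin API token as described in the linked docs; the site URL and environment variable name are placeholders:

```ts
const GHOST_URL = 'https://your-site.example.com'; // hypothetical Ghost site URL
const ADMIN_JWT = process.env.GHOST_ADMIN_JWT!;    // short-lived token minted per the Admin API docs

async function publishPost(title: string, html: string) {
  const res = await fetch(`${GHOST_URL}/ghost/api/admin/posts/?source=html`, {
    method: 'POST',
    headers: {
      Authorization: `Ghost ${ADMIN_JWT}`,
      'Content-Type': 'application/json',
    },
    // status: 'published' is what skips the draft stage
    body: JSON.stringify({ posts: [{ title, html, status: 'published' }] }),
  });
  if (!res.ok) throw new Error(`Ghost Admin API error: ${res.status}`);
  return res.json();
}
```

In n8n this would be an HTTP Request node with the same URL, header, and body.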
@paritoshmishra9875 7 days ago
The Call n8n Workflow Tool does not have a return value field.
@derekcheungsa 7 days ago
If you use it as a tool, it does :-)
@sleepless-nite 8 days ago
I noticed you did not have any system prompt. Does your agent hallucinate, or does it sometimes respond without calling your tool?
@krisszostak4849 8 days ago
You really deserve more recognition, my man!!!
@derekcheungsa 8 days ago
Thanks very much! I appreciate it.
@ashutoshsharma328 10 days ago
Newbie here. What benefit does it have over Gemini Deep Research or regular Perplexity Pro research?
@derekcheungsa 10 days ago
Thanks for the feedback. It uses the Perplexity API as a tool in the automated workflow, with the ability to specify the length, number of sections, and tone of voice in generation. It also uses a form of chain of thought to break down the work by first creating a table of contents. This workflow can also be adapted to create full ebooks :-) That's something I will work on a bit later.
@wape-info 10 days ago
Thanks for sharing, this helps a lot.
@derekcheungsa 10 days ago
Thanks for your feedback :-)
@70sJazzRockFusion101 16 days ago
Hello, nice work. I get an error on the Telegram trigger saying webhook URLs need to be HTTPS but they are HTTP.
@yttech.7108 17 days ago
I found that we can use n8n to build a lot of practical agents, and I am wondering when we should consider using Flowise. Thanks for the good videos.
@MH-xx6df 18 days ago
But really great work! I need this!
@derekcheungsa 18 days ago
Thanks!
@MH-xx6df 18 days ago
I'd like to see this combined with E2B somehow for charting functionality as well.
@derekcheungsa 18 days ago
If you have a specific use case idea, please let me know :-)
@MH-xx6df 18 days ago
Flowise has an E2B and Anthropic flow where you can do data analysis and generate charts, similar to PandasAI functionality.
@24pfilms 19 days ago
Hello Derek, great stuff. Will these templates be rolled into your Udemy course? I am very interested in getting access to these and more than willing to pay for the Udemy course. Please let me know.
@derekcheungsa 19 days ago
Yup, it's already there. All my paid templates are in the course.
@24pfilms 19 days ago
@derekcheungsa Thanks Derek, I signed up ;)
@kitlee888 19 days ago
Thanks... this is a nice alternative to using Google Sheets as a database. Can it be dynamically or automatically connected to NocoDB?
@derekcheungsa 19 days ago
Do you mean selecting which DB at runtime? If so, yes.
@kitlee888 19 days ago
I mean, if there's an update to the Google Sheets data, can it carry through (update in real time) to NocoDB, so that I don't have to manually convert the Google Sheet to CSV and then load the file into NocoDB?
@derekcheungsa 19 days ago
@kitlee888 I see. Yes, to do this you set up a monitor flow that watches for changes in Google Sheets and then makes the update in NocoDB.
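In n8n that is usually just a Google Sheets trigger wired into the NocoDB node, but as a sketch of the update half done with a raw HTTP call (TypeScript, Node 18+; the endpoint shape, table ID, and field names are assumptions to verify against your instance):

```ts
const NOCODB_URL = 'https://your-nocodb.example.com'; // hypothetical
const TABLE_ID = 'your_table_id';                     // hypothetical

// Push one changed sheet row into NocoDB: PATCH updates an existing record by Id, POST creates one.
async function upsertRow(row: { Id?: number; Name: string; Email: string }) {
  const res = await fetch(`${NOCODB_URL}/api/v2/tables/${TABLE_ID}/records`, {
    method: row.Id ? 'PATCH' : 'POST',
    headers: {
      'xc-token': process.env.NOCODB_TOKEN!,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(row),
  });
  if (!res.ok) throw new Error(`NocoDB sync failed: ${res.status}`);
  return res.json();
}
```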
@kitlee888 19 days ago
@derekcheungsa Thank you. Sorry, I'm just a beginner in AI automation but very interested in learning (hopefully there's an n8n template available for this use case; I'll get it to test and learn). Thanks again.
@huahuapro 20 days ago
Can you share this in n8n?
@derekcheungsa 19 days ago
Yes, I've updated the video description with the link.
@phantomslayer9714 20 days ago
Thanks for the content, bro ❤😎. Btw, if I wanted to scrape multiple websites, would I just have different workflows?
@derekcheungsa 19 days ago
Thanks for the feedback! You can hook up a Google Sheets node with the rows of websites you want and then connect it to a split node that effectively acts as a loop. You put these two nodes in front of the scrape workflow.
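If it helps, the glue between the Google Sheets node and the loop can be as small as this Code node (n8n Code node JavaScript; the "website" column name is an assumption about your sheet):

```ts
// Emit one item per URL so the loop feeds the scrape workflow a single site at a time.
return $input.all().map((item) => ({
  json: { url: item.json.website },
}));
```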
@phantomslayer9714 19 days ago
Thanks, dude. Appreciate the response 😎🔥
@MisCopilotos 20 days ago
Derek, can you teach us how to use the chatflow tools? In this case, could that be an option instead of building the custom tool?
@MarcBauBenavent 20 days ago
This is a very inefficient way of doing it; it could never be used in an environment that comes close to reality. You have to use a vector database. Keep in mind that in reality a normal database usually has thousands of records, and you can't pass all of that in the prompt because of context tokens. Greetings.
@derekcheungsa 20 days ago
Hi Marc, thanks for your comment. Please keep in mind that the example I'm using has 14,000 records and over 40 columns. I'm pretty sure this approach can scale to 100,000 rows or more, as it essentially uses an SQL-style approach to get exactly the data needed for the LLM's context window. One of the key insights in this video is that with NocoDB the filtering needed is super easy, unlike traditional SQL, which essentially requires LLM code generation and is more prone to variability.
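To make the contrast concrete, here is an illustrative comparison (not taken from the video) of what the model has to produce in each approach:

```ts
// The same question expressed two ways.
const question = 'Total sales for the West region above $1,000';

// NocoDB: the agent only emits a short filter string; column names come from the table metadata.
const nocodbWhere = '(Region,eq,West)~and(Sales,gt,1000)';

// Traditional SQL: the LLM must generate a full query, with more room for dialect,
// quoting, and schema mistakes.
const rawSql = "SELECT SUM(sales) FROM orders WHERE region = 'West' AND sales > 1000;";

console.log(question, nocodbWhere, rawSql);
```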
@Praskand_Upadhyay 19 days ago
Could you share the CSV?
@istaruscanada6572 21 days ago
It would be nice if you made a video building the whole thing again and explaining the steps. This was good, thanks.
@derekcheungsa 20 days ago
Thanks for the suggestion. I appreciate it.
@MisCopilotos 21 days ago
Nice tutorial! Question, Derek: is it possible to do the same analysis with the Flowise CSV Agent?
@derekcheungsa 21 days ago
Thanks for the question. No, the CSV agent won't scale to this size of dataset. There are 14K rows.
@wulfrum5567 26 days ago
Is this locally hosted?
@derekcheungsa 26 days ago
On railway.app
@fredrickskogstad3084 27 days ago
Hey, great video. I am having problems configuring Perplexity the same way as you. I would also much appreciate it if you could do a node-by-node walkthrough :))
@ricardot3763 28 days ago
Can it click on elements of a list and extract from there?
@derekcheungsa 20 days ago
Hi Ricardo, unfortunately not.
@payuharris2686 29 days ago
It keeps returning a pre-built message about emails... if I disable the email tool, it still returns these two phantom emails.
@Lanc840930 1 month ago
Hi, great video! May I know which framework it uses? LangGraph or something like that?
@derekcheungsa 1 month ago
Thanks for your feedback. n8n uses the LangChain framework for its visual AI implementation.
@vitalis 1 month ago
As a Perplexity Pro user, I find it often doesn't actually scrape the latest news; it searches its vector database for the news. Also, it often gives dead URLs as citations. I wonder if it is different with the API.
@vitalis 1 month ago
How does it handle hallucinations?
@derekcheungsa 1 month ago
Thanks for the feedback. Indeed, it's important to use AI with a co-pilot mindset. We have to do our own due diligence to verify the output. But that's where the citations help, because you can then see the source.
@vitalis 1 month ago
Yes, I'm asking specifically because I have Perplexity Pro and it often gives fake URLs.
@GrowStackAi 1 month ago
With AI, every challenge is an opportunity in disguise 🔥
@derekcheungsa 1 month ago
That is so true.
@soniyabanerjee4031 11 days ago
😮😅 😊 0:21
@user-uv3nv2bc6v 1 month ago
WOW Derek, this is super great!
@derekcheungsa 1 month ago
Thanks for your feedback!
@musumo1908 1 month ago
Nice! How does it compare to LangGraph... and is the template in your newsletter? Thx
@derekcheungsa 1 month ago
Thanks :-) Yes, I've just sent out the newsletter with the template included. This is going to be a paid template on Gumroad soon. As for the comparison to LangGraph, n8n is very powerful and I find it more expressive; for example, a split node already parallelizes agent execution.
@musumo1908 1 month ago
@derekcheungsa Thanks! Any idea how to get Perplexity to stick to a time frame? I'm finding it can pull really outdated info. Thx
@AIVisionaryLab 1 month ago
@Derek could you please share the workflow?
@derekcheungsa 1 month ago
I've added this to the bonus section of my Udemy course and have sent out a download link to my newsletter community. I want to give them first access :-)
@AIVisionaryLab 1 month ago
@derekcheung2598 Thanks, I have already purchased your course on Udemy, but there's no link 😢
@derekcheungsa 1 month ago
@AIVisionaryLab Thanks for your support! It's in lecture 32, in the resources section.
@krisszostak4849 1 month ago
That's amazing, Derek! Is there a reason you chose to use the Perplexity tool as a separate workflow rather than the HTTP Request tool? Is it because different agents use the same tool?
@derekcheungsa 1 month ago
Thanks, Kris. Yes, it's because I need to get the citations from the output. That needs some additional processing that I can't do with just the HTTP Request tool.
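A rough sketch of that kind of post-processing, as an n8n Code node placed after the Perplexity HTTP call; the choices and citations field names follow Perplexity's chat-completions-style response as I understand it, so verify them against the actual payload in your execution log:

```ts
const resp = $input.first().json;
const answer = resp.choices?.[0]?.message?.content ?? '';
const citations = resp.citations ?? [];

return [{
  json: {
    answer,
    // Turn the array of source URLs into numbered footnotes the Editor agent can keep.
    citations: citations.map((url, i) => `[${i + 1}] ${url}`),
  },
}];
```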
@funnydogfargo1026 1 month ago
Great video, as always. Thank you. What is the cost for one such article? Judging by how long the prompts are, the token count must be quite large.
@derekcheungsa 1 month ago
Great question. There are two main factors in cost: token count and model cost per million tokens. gpt-4o-mini is very inexpensive, while gpt-4o is a bit more expensive. The Research Assistant agents do a lot of the heavy lifting in the workflow, and they use mini. The Editor node requires a stronger model to get all the citation editing right, so it uses gpt-4o. For an article of 1,000 words, the cost is about $0.10.
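As a back-of-envelope illustration only (the token split and per-million rates below are assumptions, not figures from the video):

```ts
// cost = tokensIn / 1e6 * rateIn + tokensOut / 1e6 * rateOut, summed per model
const usd = (tokensIn: number, tokensOut: number, rateIn: number, rateOut: number) =>
  (tokensIn / 1e6) * rateIn + (tokensOut / 1e6) * rateOut;

// Hypothetical split for a ~1000-word article: research agents on the cheap model,
// the Editor pass on the stronger model.
const researchCost = usd(60_000, 15_000, 0.15, 0.6); // gpt-4o-mini-class rates (assumed)
const editorCost = usd(10_000, 3_000, 2.5, 10);      // gpt-4o-class rates (assumed)
console.log((researchCost + editorCost).toFixed(3)); // about 0.073, in the ballpark of the $0.10 above
```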
@funnydogfargo1026 1 month ago
@derekcheung2598 Not so expensive. Nice.
@FarisandFarida 1 month ago
🎉🎉🎉 Love it! Is this blueprint going into the Udemy course? Because I don't see it.
@derekcheungsa 1 month ago
Yup, I'm uploading this to the bonus section in the course along with the template. It should be available by tomorrow.
@funnydogfargo1026 1 month ago
@derekcheungsa I bought your Udemy course some time ago, and now I'm wondering if it's the same one, or do you have multiple courses on Udemy?
@derekcheungsa 1 month ago
@funnydogfargo1026 Thanks for your support. It's the same course on n8n. I'm adding the most popular tutorials and templates to the bonus section of the course. You can download the template from the bonus section now.
@funnydogfargo1026 1 month ago
@derekcheungsa Nice. Thank you.
@phillycitas 1 month ago
So PDF-to-image is more accurate than using an extract-from-PDF node?
@derekcheungsa 1 month ago
For complex tables and graphs, using an LLM vision approach will yield better results. For simple text, the PDF extraction node works well.
@sonsilos 1 month ago
Thanks for feeding me a good video. I learned a lot from you, and I've now supported you on Udemy!
@derekcheungsa 1 month ago
Thanks for your support!
@Dev_skoll 1 month ago
Can't tell if Chat-IMG is working. Edit: you must use "x-api-key" as the name for the key.
@derekcheungsa 1 month ago
Thanks for your feedback. In your chat window, on the right, you'll see a log of all the calls the agent is making and their output. Also, at the top part of the screen there is an Executions tab; you can use that to see the node values as well. But mostly, you should be able to see the chart visually displayed. Make sure to set the exchange variable in the settings node; it defaults to NYSE.
@Dev_skoll 1 month ago
@derekcheungsa Got it. It was acting weird because I was also changing and testing right away without saving. I thought it would use the current instance. Thanks!