Thanks for the review of my custom integration. Extremely well presented as always! I'll be working on the automatic YAML output next; it's nearly there. And hopefully local LLM support too, if people would like that. ❤
@BJGrady 2 months ago
This is an amazing integration. Thank you to the developer, and thank you, Bearded Tinker, for letting us know about it.
@brianmorgan7207 2 months ago
The timing of this video is perfect. I was at the HA get-together at the GitHub office this past weekend. Paulus and Frank were there, and during the discussion session another participant and I talked about having a function that did something similar to this AI assistant. We said that we would like a function that looked at usage patterns - that is, who did what, with which entities - and made suggestions about automations. AI Assistant is a good first step. I believe I will get involved in this project. Thanks.
@BeardedTinker 2 months ago
Awesome!!! Hope you had a great time! I organised a local meetup here with Paulus in July this year - it was a blast (plus an early spoiler of the upcoming voice assistant device). Yes, this one is a great start, and I know they were also looking at how to bring AI into HA to help with automations.
@brianmorgan7207 2 months ago
We were shown the same new device. It was completely working.
@dominoxan 2 months ago
Hi! Thanks for the video! Very interesting topic. To be honest, I had started preparing a similar project, but I'll hold off for a bit as this one seems mature enough. I'm not sure about the part which says it "will suggest automations based on newly added devices". What about existing ones - should I re-add all my integrations? I hope the project moves through all the planned phases; done well, it could be a game changer for Home Assistant, which is far away from being ... intelligent. Also, local Ollama / LM Studio support would be nice to have if this solution at some point needs to send a huge amount of data from the SQL database logs to the cloud.
@BeardedTinker 2 months ago
Not sure how this will work with an LLM, but for this type of integration speed is not critical - I wouldn't care if it took hours to chew through the data, as this is not a voice assistant that requires fast responses. Did you start working on your own version, or were you still in the planning phase?
@grahamahosking 2 months ago
I have some thoughts on logging newly added devices and existing entities so that we can segregate this type of request. I will add it to the issues log to be worked on. Loving all these ideas to make this integration better - thank you!
@Tntdruid 2 months ago
Now we get Skynet in HA 😄
@BeardedTinker 2 months ago
Skynet lives forever ;)
@grahamahosking 2 months ago
The end goal is to always have a human involved in the process. However, the Suggester could generate the YAML automation for you, allowing you to simply press "GO" to execute it. I'm exploring ways to help people create exciting automations without requiring extensive initial cognitive effort. While it's rewarding to develop automations from scratch, it's also beneficial to understand the range of possibilities and discover features you might have overlooked ;) Skynet rulez
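(For illustration only - not actual output from the integration: a suggestion like this would ultimately boil down to a small Home Assistant automation in YAML, roughly like the sketch below. The alias, trigger, and entity IDs are hypothetical placeholders.)
alias: "Suggested: hallway light on motion after sunset"
trigger:
  - platform: state
    entity_id: binary_sensor.hallway_motion   # hypothetical motion sensor
    to: "on"
condition:
  - condition: sun
    after: sunset
action:
  - service: light.turn_on
    target:
      entity_id: light.hallway                # hypothetical light entity
mode: single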
@ScienceGuyRoy 2 months ago
Thanks for the video! I have never gone down the path of getting an API key. I see a lot of different models. Do you have a recommendation for someone new to API keys, please?
@grahamahosking 2 months ago
OpenAI API keys operate on a pay-as-you-go model, meaning more requests increase the cost, but it's only a matter of pence. Another option is to use local LLMs, which involve only compute costs. With this custom integration speed is not a critical factor, which makes local LLMs well suited to this use case. I plan to enhance the custom integration over the next few weeks.
@ScienceGuyRoy 2 months ago
@grahamahosking Thanks so much! I made an account and will try the install later today.
@rumunn95 2 months ago
So this doesn't work on the free plan?
@BeardedTinker 2 months ago
I don't think so - API access is different from a free account, AFAIK.
@grahamahosking 2 months ago
It would be cool to call out to "free" services, but that's not how the models work today. I'm working on integrating with Ollama for local model access, which would then be at no cost for open-source models.
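(For context only - this is not necessarily how the integration will wire it up: Ollama exposes a local HTTP API on port 11434, which Home Assistant can already reach today with something like a rest_command. The service name, model, and prompt template below are hypothetical placeholders.)
rest_command:
  ollama_generate:
    url: "http://localhost:11434/api/generate"
    method: post
    content_type: "application/json"
    # model and prompt are placeholders; any locally pulled Ollama model would work
    payload: '{"model": "llama3", "prompt": "{{ prompt }}", "stream": false}'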
@geejayem59 2 months ago
How does the billing work with these things? Presumably it doesn't bill 1c (or whatever) for each query. Do you pay an amount up front that gets drawn down by each query, then top up when it gets low?
@grahamahosking 2 months ago
You can top up your OpenAI account with an amount, and then each query draws from it. It's important to set limits on your OpenAI account, but the amount will last for a long time.
@gemargordon6885 2 months ago
Will this work with Gemini?
@grahamahosking 2 months ago
Yes, the latest version works with the following AI providers: OpenAI, Anthropic, Google, Groq, LocalAI, and Ollama, using their models and APIs.
@gemargordon6885 2 months ago
@ thank you
@ianrobson9612 2 months ago
It keeps telling me the entity is unavailable. I tried with a new API key and also checked the funding.
@BeardedTinker 2 months ago
Do you have any errors in the log file? You can try adding this to your configuration.yaml file:
logger:
  default: warning
  logs:
    custom_components.ai_suggester: debug
    openai: debug
Also, in the troubleshooting section there is a part related to a dependency - github.com/ITSpecialist111/ai_automation_suggester?tab=readme-ov-file#troubleshooting
@BeardedTinker 2 months ago
I don't see my answer (again) - do you see anything in the log files? You can try enabling debugging for this integration and check if there is anything in the logs too... It looks like it is not loading up. There is only one prerequisite - maybe that's the problem. Check the troubleshooting section on GitHub... and the logs, check the logs.
@ianrobson9612 2 months ago
With the help of the developer, Graham, I was able to get it working. We deleted the integration from HACS, deleted the ai_automation_suggester directory in custom_components, and restarted Home Assistant. Then I reinstalled the integration, used the new API key, and everything worked as expected. Great video and great integration.
@michaelthompson657 2 months ago
So if you have ChatGPT Plus, would you still get charged?
@BeardedTinker 2 months ago
I would say yes, but I'm really not sure. API tokens are different from plain ChatGPT.
@grahamahosking 2 months ago
The ChatGPT chatbot is charged separately from access to the APIs. API access is charged on a pay-as-you-go basis, which isn't very much at all for GPT-4o-mini :) Other models will be available soon.
@michaelthompson657 2 months ago
@ Thanks! So even with the subscription, it only covers the chatbot part, not the API?
@grahamahosking 2 months ago
@michaelthompson657 No, there is no API access on the chatbot side. Head over to the OpenAI playground to set up your usage.
@michaelthompson657 2 months ago
@ no problem, thanks!
@KubedPixel 2 months ago
Sorry, but as soon as you said it has to have access to an external "AI" processor, I lost interest, which resulted in a thumbs down. The whole idea and core purpose of Home Assistant is local control. I won't have a third party having access to devices on my network.
@grahamahosking 2 months ago
Local access to the LLM will be available in the coming weeks, allowing for complete local control. I understand that Home Assistant users prefer everything to be local; however, for the initial version of the integration it was simpler to build on public APIs for ease of use. Local access is part of our roadmap, so please be patient.