Firstly, this explains the inner workings well. (I had jumped directly to the code and could only understand about 70-80% of it, but watching this video made more sense of what's happening overall inside.) Is there any way to track the number of API calls made for a given task, and also the total number of input tokens sent and output tokens generated? I looked briefly through the code and could not find it. I would be happy to contribute such functionality if it is wanted.
@johntanchongmin · 7 days ago
Indeed, we could have a counter in the llm function to track this. You could implement this yourself in your own custom llm function. If you would like to contribute this functionality for the default llm, you can directly modify the chat function in base.py. Thanks!
@tharunbhaskar6795 · 7 days ago
@johntanchongmin That's exactly what I thought of after a while. One method is to use a counter in the llm function directly, which does not alter the actual codebase.
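The counter approach discussed above can be sketched like this. It assumes TaskGen's custom-llm convention of a callable taking a system prompt and a user prompt and returning a string; the `CountingLLM` wrapper, the whitespace token estimate, and the `fake_backend` stub are illustrative assumptions, not TaskGen code, so the framework itself stays untouched.

```python
class CountingLLM:
    """Wraps any (system_prompt, user_prompt) -> str backend and tracks usage."""

    def __init__(self, backend):
        self.backend = backend        # the real API call would go here
        self.num_calls = 0
        self.input_tokens = 0
        self.output_tokens = 0

    def __call__(self, system_prompt: str, user_prompt: str) -> str:
        # Naive whitespace token estimate; swap in a real tokenizer
        # (e.g. tiktoken for OpenAI models) for accurate counts.
        self.num_calls += 1
        self.input_tokens += len(system_prompt.split()) + len(user_prompt.split())
        reply = self.backend(system_prompt, user_prompt)
        self.output_tokens += len(reply.split())
        return reply


def fake_backend(system_prompt: str, user_prompt: str) -> str:
    """Stub standing in for an actual LLM API call."""
    return "stub reply"


llm = CountingLLM(fake_backend)
llm("You are helpful.", "Say hi.")
print(llm.num_calls, llm.input_tokens, llm.output_tokens)  # → 1 5 2
```

Passing `llm` in place of the default llm function gives per-task usage stats without modifying base.py.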
@faisalIqbal_AI · 4 months ago
Thanks
@princemathew9034 · 1 month ago
Really loved your session. I have been looking at agent frameworks and felt like most of them are too bloated, but I have high hopes for this one. Is there a way I can contribute? Please let me know.
@johntanchongmin · 1 month ago
@princemathew9034 Sure, you can go to the GitHub and make a pull request anytime. Do chat with me on my Discord to find out more about what we are doing next :) We also have a session tomorrow to run through the latest TaskGen paper.
@johntanchongmin · 1 month ago
Discord link is here, btw: discord.com/invite/bzp87AHJy5
@princemathew9034 · 1 month ago
@johntanchongmin Persisting the agent is a great feature to have. Also, if I have a chat interface or API interface, the user and agent can have a conversation. How do you handle that here?
@johntanchongmin · 1 month ago
Check out shared variables for persistent memory. We also have a conversational interface in tutorial 6 that wraps over the agent.
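A minimal sketch of the idea above: a chat loop wrapping an agent, with a shared-variables style dict giving persistent memory across turns. `SimpleAgent` and `chat_once` are hypothetical stand-ins for illustration, not TaskGen's actual Agent class or conversational interface; see tutorial 6 for the real one.

```python
class SimpleAgent:
    """Hypothetical agent stand-in: handles a task while reading/writing
    a shared-variables dict, so state persists between calls."""

    def __init__(self, shared_variables: dict):
        self.shared_variables = shared_variables

    def run(self, task: str) -> str:
        history = self.shared_variables.setdefault("history", [])
        history.append(task)
        return f"Handled task #{len(history)}: {task}"


def chat_once(agent: SimpleAgent, user_msg: str) -> str:
    # Each user message becomes a task; memory persists via shared_variables.
    return agent.run(user_msg)


shared = {}
agent = SimpleAgent(shared)
print(chat_once(agent, "hello"))            # → Handled task #1: hello
print(chat_once(agent, "what did I say?"))  # → Handled task #2: what did I say?
```

The design choice here is that memory lives outside the agent object, so the same dict can be saved, reloaded, or shared between an API handler and the agent.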
@princemathew9034 · 1 month ago
@johntanchongmin I had jumped directly to creating an agent; I'll go check that out.