Learn how to become an Expert at Travel Planning with Tommy Travel AI

163 views

Fast and Simple Development

A day ago

Comments: 10
@fastandsimpledevelopment, a year ago
People have asked what the tech stack is - Java 17, Spring Boot, AWS, Linux, Docker, MongoDB, JavaScript, Python, microservices
@ExpertKNowledgeGroup, a year ago
Amazing concepts, love the ChatGPT integration
@eprohoda, a year ago
Good night, Fast. Super, you did a good vlog. =)
@fastandsimpledevelopment, a year ago
Thanks
@lololoololdudusoejdhdjswkk347, 8 months ago
Great project. I was wondering, since you used an LLM for this, how did you integrate real-time data and other things into it? Did you perhaps use a web scraper or a real-time API and then push the results into the prompt? Sorry for the bombardment! Very, very cool!
@lololoololdudusoejdhdjswkk347, 8 months ago
I'm also interested: if you did make API calls, how did you change the query into the form of an API call? Did you perhaps need to create dummy data in a database for knowledge embedding, or did you just create a prompt to configure API requests based on user input?
@fastandsimpledevelopment, 8 months ago
I created a context workflow at the beginning of the system, so based on the context (one pass through the LLM) I then called external APIs to collect that data (like flight info for a date, with prices) and then let the LLM have access to that information. So yes, I did use real-time APIs for many things, including stock prices and history, Jira and Confluence, as well as Slack integrations for the LLM.
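
A minimal sketch of that two-pass flow, assuming a Python service; call_llm and fetch_flights are hypothetical placeholders standing in for whatever LLM client and travel-data API the real system uses, not the author's actual code:

import json

def call_llm(prompt: str) -> str:
    # Hypothetical wrapper around an LLM chat endpoint; returns the model's text reply.
    raise NotImplementedError("wire this to your LLM provider")

def fetch_flights(origin: str, destination: str, date: str) -> list[dict]:
    # Hypothetical wrapper around a real-time flight-search API.
    raise NotImplementedError("wire this to your flight-data provider")

def answer(user_query: str) -> str:
    # Pass 1: ask the LLM only for the context (intent and parameters) as JSON.
    ctx = json.loads(call_llm(
        "Extract origin, destination and date from this travel request "
        "and reply with JSON only: " + user_query
    ))

    # With the context known, call the external API for the real-time data.
    flights = fetch_flights(ctx["origin"], ctx["destination"], ctx["date"])

    # Pass 2: hand the fresh data back to the LLM to write the final answer.
    return call_llm(
        "Using only this flight data, answer the user's request.\n"
        "Flight data: " + json.dumps(flights) + "\n"
        "Request: " + user_query
    )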
@lololoololdudusoejdhdjswkk347, 8 months ago
Interesting, I wonder how much your AWS cost must be for hosting the LLM. I'm still a beginner, but if I were to try to recreate the structure, I would assume you had a prompt set up to get the context of the query, then made it generate a desired response as well as a URL (from the context) for the correct API you needed to call. Then I assume you would return this response. Is there perhaps a way for an LLM to call an API directly from the prompt?
@lololoololdudusoejdhdjswkk347, 8 months ago
Looking forward to more of your videos too.
@lololoololdudusoejdhdjswkk347, 8 months ago
Also, would it be plausible to organize/force the response from the LLM to be a dictionary (url, response), so it's much easier to classify the data and get what you want?
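
The thread doesn't answer this, but one common way to get a dictionary-shaped reply (an illustrative sketch, not the author's method) is to ask for JSON with fixed keys and validate it before use; llm_call below is a hypothetical stand-in for the model call:

import json
from typing import Callable

def dict_response(llm_call: Callable[[str], str], user_query: str) -> dict:
    # Ask for JSON with exactly the keys we want, then parse and validate it.
    raw = llm_call(
        'Reply with JSON only, using exactly the keys "url" and "response": ' + user_query
    )
    data = json.loads(raw)
    if not {"url", "response"} <= data.keys():
        raise ValueError("model reply missing required keys: " + raw)
    return data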