Please do more on AWS Bedrock for developing RAG applications... your explanation is simple and effective... stay motivated and upload more videos about LLMs.
@AboniaSojasingarayar 8 months ago
Thanks for your kind words! Sure, I will do it.
@akshaykotawar5816 8 months ago
Yes, I want the same thing @@AboniaSojasingarayar
@AboniaSojasingarayar 7 months ago
Here is the tutorial link for deploying a Retrieval-Augmented Generation (RAG) application in AWS: kzbin.info/www/bejne/nZrGpJVvpZyooJY
@raulpradodantas9386 5 months ago
Saved my life creating Lambda layers... I had been trying for days. Thanks!
@AboniaSojasingarayar 4 months ago
@@raulpradodantas9386 Glad to hear that! You are most welcome.
@zerofive3699 9 months ago
Very informative
@AboniaSojasingarayar 8 months ago
Glad it was helpful!
@humayounkhan7946 6 months ago
Hi Abonia, thanks for the thorough guide, but I'm a bit confused about the lambda_layer.zip file. Why did you have to create it through Docker? Is there an easier way to provide the dependencies in a zip file without going through Docker? Thanks in advance!
@AboniaSojasingarayar 6 months ago
Hi Humayoun Khan, yes we can, but Docker facilitates including the runtime interface client for Python, making the build compatible with AWS Lambda. It also ensures a consistent, reproducible environment for the Lambda function's dependencies, which is crucial for avoiding discrepancies between development, testing, and production environments. Hope this helps.
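For reference, a containerized build like the one described can be scripted. This is only a rough sketch, not the exact commands from the video: the AWS SAM build image tag and the requirements.txt file name are assumptions, and the helper simply reports when Docker is unavailable.

```python
# Sketch of a containerized Lambda layer build (image and file names are
# assumptions). Building inside an Amazon Linux based image means compiled
# wheels (e.g. numpy) match the Lambda runtime, unlike a zip built on
# macOS or Windows.
import shutil
import subprocess

BUILD_CMD = (
    "docker run --rm -v $PWD:/var/task public.ecr.aws/sam/build-python3.12 "
    '/bin/sh -c "pip install -r requirements.txt -t python/ '
    '&& zip -r lambda_layer.zip python/"'
)

def build_layer():
    """Run the build in Docker if available; otherwise explain how to proceed."""
    if shutil.which("docker") is None:
        return "docker not found: install Docker, or build the zip on Amazon Linux"
    return subprocess.run(BUILD_CMD, shell=True).returncode
```

The key detail is that the dependencies are installed into a `python/` directory before zipping, which is the layout Lambda expects for Python layers.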
@NJ-hn8yu 6 months ago
Hi Abonia, thanks for sharing. I am facing this error; can you please tell me how to resolve it? "errorMessage": "Unable to import module 'lambda_function': No module named 'langchain_community'"
@AboniaSojasingarayar 6 months ago
Hello, you are most welcome. You must prepare your ZIP file with all the necessary packages. You can refer to the instructions starting at 09:04.
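A common cause of this "No module named ..." error is a layer zip whose packages are not under the `python/` prefix that Lambda adds to `sys.path`. A minimal sketch (the helper name is mine, not from the video) that zips an installed-packages directory with the required layout:

```python
# Sketch: zip a directory of pip-installed packages into a Lambda layer
# archive. For Python layers, Lambda only puts "python/" (and its
# site-packages subpaths) on sys.path, so every package must sit under a
# top-level python/ directory inside the zip.
import os
import zipfile

def build_layer_zip(site_packages_dir: str, out_path: str) -> None:
    """Write every file under site_packages_dir into out_path, prefixed with python/."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(site_packages_dir):
            for name in files:
                full = os.path.join(root, name)
                rel = os.path.relpath(full, site_packages_dir)
                # Prefix each entry with python/ so Lambda can import it.
                zf.write(full, os.path.join("python", rel))
```

Typical usage would be `pip install langchain_community -t deps/` followed by `build_layer_zip("deps", "lambda_layer.zip")`, then uploading the zip as a layer.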
@tapiaomars 2 months ago
Hi, is it possible to integrate DynamoDB to store and retrieve the context of the last user prompts in the Lambda function?
@AboniaSojasingarayar 2 months ago
Hello, yes: you can use DynamoDB, S3, or in-memory storage depending on your requirements. Associate each piece of context with a user ID and a conversation ID so contexts stay isolated per user. Hope this helps.
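A minimal sketch of that per-user, per-conversation keying (the table and attribute names here are my assumptions, not from the video):

```python
# Sketch: one context entry per prompt/response turn, keyed by user_id
# (partition key) and a conversation_id#timestamp sort key, so each
# user's conversations stay isolated and can be read back in order.
import time

def make_context_item(user_id, conversation_id, prompt, response):
    """Build the DynamoDB item to store for one prompt/response turn."""
    return {
        "user_id": user_id,                                    # partition key
        "sk": f"{conversation_id}#{int(time.time() * 1000)}",  # sort key
        "prompt": prompt,
        "response": response,
    }

def save_context(item, table_name="chat_context"):
    """Persist one turn; assumes a table with user_id (HASH) + sk (RANGE) keys."""
    import boto3  # imported lazily so the sketch runs without AWS set up
    boto3.resource("dynamodb").Table(table_name).put_item(Item=item)
```

Querying by `user_id` with a `begins_with(sk, conversation_id)` key condition would then return just that conversation's turns.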
@tapiaomars 2 months ago
@@AboniaSojasingarayar Thanks, I'll try it and let you know how it goes.
@Bumbblyfestyle 9 months ago
👍👍
@AboniaSojasingarayar 8 months ago
😊😊
@akshaykotawar5816 8 months ago
Very informative, thanks for uploading.
@AboniaSojasingarayar 8 months ago
Glad it helped!
@RajuSubramaniam-ho6kd 6 months ago
Thanks for the video. Very useful for me as I am new to AWS Lambda and Bedrock. Can you please upload the Lambda function source code? Thanks again!
@AboniaSojasingarayar 6 months ago
Glad it helped. Sure, you can find the code and the complete article on this topic in the description. In any case, here is the link to the code: medium.com/@abonia/build-and-deploy-llm-application-in-aws-cca46c662749
@fkeb37e9w0 7 months ago
Can we use OpenAI and ChromaDB on AWS?
@AboniaSojasingarayar 7 months ago
Yes we can! In the tutorial below, I demonstrate how to create and deploy a Lambda layer via a container for larger dependencies: kzbin.info/www/bejne/nZrGpJVvpZyooJYsi=F_X7-6YCAb0Kz3Jc
@fkeb37e9w0 7 months ago
@@AboniaSojasingarayar Yes, but can this be done without EKS or containers?
@AboniaSojasingarayar 7 months ago
Yes! You can try creating a custom Lambda layer. If you face issues, include only the required libraries and remove any unnecessary dependencies from your zip file. Hope this helps.
@user-wr4yl7tx3w 3 months ago
Can we simply rely on open source only without using Amazon? What if it is just prototyping?
@AboniaSojasingarayar 3 months ago
Yes, you can build it entirely with open-source tools.
@Nabeel27 2 months ago
I get this error: Runtime.ImportModuleError: Unable to import module 'lambda_function': Error importing numpy: you should not try to import numpy from its source directory; please exit the numpy source tree, and relaunch your python interpreter from there. I followed all the steps in your video.
@Nabeel27 2 months ago
Looks like I had to set up the Lambda as arm64 and the layer (created with Docker on a Mac) as arm64 too. It also requires Bedrock setup and requesting access to a Llama model. Llama 2 is no longer available; you have to request Llama 3 8B or something else.
@AboniaSojasingarayar 2 months ago
Hello Nabeel, are you still facing the above issue?
@Nabeel27 2 months ago
@@AboniaSojasingarayar Thank you so much for following up! The error I am getting now is this: "errorMessage": "Error raised by bedrock service: An error occurred (AccessDeniedException) when calling the InvokeModel operation: User: arn:aws:sts::701934491353:assumed-role/test_demo-role-sfu6wu6d/test_demo is not authorized to perform: bedrock:InvokeModel on resource: arn:aws:bedrock:us-east-1::foundation-model/meta.llama3-8b-instruct-v1:0 because no identity-based policy allows the bedrock:InvokeModel action"
@Nabeel27 2 months ago
@@AboniaSojasingarayar I was able to solve it. I got permission to use Llama 3 and also had to update the role's permissions to use Bedrock.
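For anyone hitting the same AccessDeniedException: the fix Nabeel describes amounts to attaching an identity-based policy to the Lambda execution role that allows `bedrock:InvokeModel`. A minimal sketch (the model ARN is copied from the error message above; adjust the region and model, or broaden the resource, to fit your setup):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/meta.llama3-8b-instruct-v1:0"
    }
  ]
}
```

Note that this IAM permission is separate from requesting model access in the Bedrock console; both are needed before `InvokeModel` succeeds.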