Digit + Large Language Model = Embodied Artificial Intelligence

  66,689 views

Agility Robotics

1 day ago

Is there a world where Digit can leverage a large language model (LLM) to expand its capabilities and better adapt to our world? We had the same question. Our innovation team developed this interactive demo to show how LLMs could make our robots more versatile and faster to deploy. The demo enables people to talk to Digit in natural language and ask it to do tasks, giving a glimpse at the future.
---------------------------------------
At Agility, we make robots that are made for work. Our robot Digit works alongside us in spaces designed for people. Digit handles the tedious and repetitive tasks meant for a machine, allowing companies and their people to focus on the work that requires the human element.
#robotics #machinelearning #AI #engineering #LLM #embodiedAI

153 comments
@OZtwo 9 months ago
Nice! When LLMs hit the market, I knew this could be the future of robotics: robots no longer need to be programmed to process specific objects, but can now see what they are looking at and know everything about those objects.
@Hukkinen 9 months ago
Same😅
@KVVUZRSCHK 9 months ago
That's bullshit but okay
@HipHopAndCityGossip 9 months ago
Dude, he used the prompt to program that robot. He was only able to do it because the guy told it what to do from his phone. When Digit works autonomously without human intervention, then we'll see progress.
@OZtwo 9 months ago
@@HipHopAndCityGossip Yes, he did. Try the same with the RL robot. He was able to ask the robot to do a task here. That's thanks to the LLM; nothing needed to be trained first beyond the overall master model's own training.
@sammiller6631 9 months ago
Nothing in an LLM knows anything. It's just a complex form of pattern matching. It's on rails. It fails outside of narrow confines.
@middle-agedmacdonald2965 9 months ago
For the people who think it's slow (which it is): this is the slowest the robot will ever be. A year ago this was impossible. Keep in mind this robot can work 24/7, which is 168 hours a week. A human works 40 hours per week most of the time (minus about ten hours getting coffee, breaks, talking to coworkers, texting, etc). So while slower than a human working in real time, at the end of a week they'll probably be pretty close to being capable of the same output. The thing is, in a year or two it'll work faster than a human can in real time, I'm guessing. So that means one robot does what 5 humans can. Which means it could eliminate five jobs at $30k per year, saving $150k per year (actually more because of holiday pay, vacation pay, sick pay, medical benefits, etc). Even if the robot costs $250k, it'll pay itself off and be profitable after only two years. (Yes, I'm omitting maintenance and breakdowns/labor to fix the robot, which I can't possibly calculate. I assume it will be reliable when sold at scale.) Wake up. Human labor is about to become obsolete in practical terms. Amazon, Elon, etc. know that eliminating humans is the key to a more profitable, more efficient, and easier to maintain company. It's obvious. It'll take many, many years to transition over, but it's here.
@Kazekoge101 9 months ago
People just don't notice the big picture.
@joelohne3559 9 months ago
This is all very exciting, but how are all these companies going to make money when no one has a job to buy their products?
@middle-agedmacdonald2965 9 months ago
@@joelohne3559 Don't worry, super a.i. will figure that out.
@RedaLAHMER-e8i 9 months ago
There won't be companies. The company model is to sell products to other companies' employees. No employees, no customers. No customers, no sales. No sales, no products. No products, no companies.
@Mavrik9000 9 months ago
@@joelohne3559 Exactly. If everyone who owns a company replaces all the workers with automation, then the unemployed workers won't be able to afford to buy anything from the companies. So that process is a swift downward spiral of the economy and society. There are three ways to address this issue: 1. Make displacing human workers with automation illegal, or highly regulated. 2. Disincentivise worker displacement with new tax laws that penalize automation. Tax the robotic workers heavily to fund the lost job income, UBI. 3. Revamp the tax code completely and implement Universal Basic Income. All three of those will need to occur in various forms unless the company owners and the government want to face widespread civil unrest. Personally, I'm ready for the robots to perform menial tedious tasks. I would prefer to work far fewer hours at a "job" and have more time to work on things that I want to do. Even if that means being somewhat broke all the time.
@Orandu 9 months ago
“Digit, use Darth Vader’s lightsaber on all the younglings”
@youtubasoarus 9 months ago
That's for when you try to use third-party repair services on your robot. 😅
@les_crow 9 months ago
Yes, LLMs are AGI. Y'all were just expecting miracles and felt disappointed when we got to this milestone. That don't change the fact though.
@clonkex 9 months ago
LLMs are not intelligence. They don't "know" anything. It's just fancy pattern matching.
@ArtOfTheProblem 4 months ago
right, i'm wondering more about the lecun argument that it's "not really reasoning or planning", what is it then?
@OceanGateEngineer4Hire 9 months ago
Digit: *picks up blue box* R&D: "Damn, there must be something wrong with his sensors, we'll have to-" Digit: "ACKTCHYUALLY... in 'Star Wars Episode III: Revenge of the Sith' Darth Vader has a blue lightsaber until Obi-Wan defeats him on Mustafar, so I'm right because you didn't specify which era."
@azhuransmx126 6 months ago
In the 2000s it took them an entire day to do that task using CPUs; now it takes minutes using GPUs. They are improving at an exponential rate and it will take seconds using NPUs. Few people can see the acceleration curve and progression here.
@cogoid 9 months ago
Nice demo. It would have been good to see at least an outline of how the whole system is structured. For example, this video shows the output of the LLM as a human-readable text. But how does this get further elaborated into the lower level actions appropriate for the specific environment in which the robot operates?
@whiteglitch 9 months ago
its all staged 🤐
@AgilityRobotics 9 months ago
Please see our earlier LLM video for a bit more detail. Turns out LLMs are pretty good at mapping between natural language and code (arguably, they're VERY good at this). So the underlying process is the LLM writing code using the existing Digit API. The human-readable text is a neat addition to provide some observability.
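The pipeline the reply above describes can be sketched as follows. This is a hypothetical mock, not Agility's actual code: `DigitAPI`, `move_to`, `pick_up`, and `mock_llm` are all invented names, and the "LLM" is a stub that returns a canned code string, standing in for a real model call.

```python
# Hypothetical sketch: an LLM maps a natural-language command to code that
# calls a robot API, and the host executes that generated code.
# All names here (DigitAPI, move_to, pick_up) are invented for illustration.

class DigitAPI:
    """Mock of a robot command API; records actions instead of moving hardware."""
    def __init__(self):
        self.log = []

    def move_to(self, target):
        self.log.append(f"move_to({target})")

    def pick_up(self, obj):
        self.log.append(f"pick_up({obj})")

def mock_llm(prompt: str) -> str:
    # Stand-in for a real LLM call: returns Python source that uses the robot API.
    return "robot.move_to('red box')\nrobot.pick_up('red box')\n"

def run_command(command: str, robot: DigitAPI) -> list:
    code = mock_llm(f"Write code using the robot API to: {command}")
    exec(code, {"robot": robot})  # execute the generated code against the API
    return robot.log

robot = DigitAPI()
print(run_command("pick up the red box", robot))  # → ['move_to(red box)', 'pick_up(red box)']
```

The human-readable narration shown in the video would then just be a second output channel alongside the generated API calls.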
@yeremiahangeles7252 9 months ago
They should make one for delivering groceries, so it's able to lift heavy grocery crates to the door of the customer. It would be so cool to see one at your door. 😅
@K.F-R 9 months ago
Great work. Retail sales when? ;) Looking forward to more advances integrating smaller on-board LLMs.
@Danuxsy 9 months ago
there have been a lot of breakthroughs in the capability of smaller LLMs, such as the new Phi-2 (2.7B) from Microsoft. It can even outperform models 25x larger on complex benchmarks.
@Amerikan.kartali.turk.yilani. 9 months ago
Super success, super congrats, keep up the good work. We need super-intelligent robots.
@azhuransmx126 7 months ago
This is the slowest and dumbest the robots will ever be, remember that. From now to 2030-40-50, we will be just like their pets.
@Vartazian360 9 months ago
As soon as ChatGPT came out in November 2022, I knew that it had advanced so far that it could eventually be used to generalize tasks for robotics; it was only a matter of time. And it may be slow to process now, but I'd just about guarantee that in a few months to a year or two, this will be fully real-time command execution. For the time being it is kinda funny to think about how slow the thoughts are :) He's a toddler right now but won't be for long xD
@ArtOfTheProblem 4 months ago
i agree, what do you think of the "it needs to learn to feel from ground up" people?
@thirdarmrobotics 9 months ago
Awesome congratulations.
@malfattio2894 9 months ago
the eyes are a nice touch
@bc4198 9 months ago
Good job, little buddy! 👏
@outtersteller 9 months ago
I still feel ashamed for calling this company CGI 2 years ago... y'all are putting in the work and we see you. You guys rock✨
@MASSKA 9 months ago
It's as fake as before; it has QR codes on all the boxes, so it already knows what to do.
@Mavrik9000 9 months ago
@@MASSKA You have a good point, but it is actually doing most of what they are showing. Welcome to the future.
@MASSKA 9 months ago
@@Mavrik9000 yee, but when youll use it for example in your kitchen, good luck to stick everywhere qr codes, I prefer to buy a n*gro
@clonkex 9 months ago
@@MASSKA Define fake. The QR codes are so it knows some information about the boxes. It doesn't really matter where that data comes from (the QR codes, or by identifying the box colours through computer vision) because the point of the video is integrating LLMs into the control flow of the robot, not seeing boxes with a camera.
@MASSKA 9 months ago
@@clonkex So an LLM doesn't need QR codes; QR codes are only used if the bot is PROGRAMMED to do so. Seems like you don't know what an AI is, so define what Google is, because you don't seem to know how to use it.
@john-carl2054 9 months ago
This is how I move when I’m pretending not to be drunk 😂 very cool though!
@JJs_playground 9 months ago
Wow this is amazing. More, and longer videos, please.
@Ludens93 9 months ago
Nice. Multimodal AI-powered robots are the future of robotics.
@TeslaElonSpaceXFan 9 months ago
😍
@richardede9594 9 months ago
The backwards legs give this little bot a bizarre insectoid look. Without wanting to be the guy who comments about a "Terminator"-style future - this robot's abilities are incredible, and this technology is in its infancy. In two years' time, I wonder what tasks this robot will be carrying out...
@OrniasDMF 9 months ago
How much info do the QR codes provide though?
@clonkex 9 months ago
The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.
@arnoldbailey7550 9 months ago
Crude design but with a few adaptations, it can be far more productive. Nice to see them develop and hopefully evolve. These are the Atari of robotics but once the novelty phase is over, the focus will shift to proficiency.
@morkovija 9 months ago
Imagine this whole project taking less time than Project Binky, which is about restoring a car and has been ongoing for like 7 or more years.
@DoctorNemmo 9 months ago
This is way better than Tesla's Optimus.
@ViceZone 9 months ago
They are different, Tesla Optimus has incredible control and natural hand movement. Digit doesn't even have fingers.
@OmeletTwelve 3 months ago
Now he needs to discover the hypotenuse... and the concept of the shortest distance between two points (minus any barriers).
@williamparrish2436 9 months ago
Tesla vs Agility?
@KoroushRP 9 months ago
Agility actually is selling these. Tesla is usually filled with empty promises and hype.
@murc111 9 months ago
It remains to be seen if they are selling these. Yes, some companies are testing some out, but that would be mutually beneficial for both parties. I would wager a company like Amazon would get half a dozen for free on a type of lease/rental/gift. That will make Agility Robotics refine it for that role, and Amazon will learn of its limitations and, if impressed, will put in a first order of a few hundred, and go from there. Overall Tesla's newest Optimus looks far more capable; I know Digit will eventually get digits, but until they do, Optimus's hands are far superior. But Agility will likely begin sales ~1 year ahead of Tesla.
@tiefensucht 9 months ago
The ultimate goal would be a robot that builds a Lego model with the help of the paper manual or cooks something from a recipe without specific programming.
@Mavrik9000 9 months ago
If everyone who owns a company replaces all the workers with automation, then the unemployed workers won't be able to afford to buy anything from the companies. So that process is a swift downward spiral of the economy and society. There are three ways to address this issue: 1. Make displacing human workers with automation illegal. 2. Disincentivise worker displacement with new tax laws that penalize automation. 3. Revamp the tax code completely and implement Universal Basic Income. All three of those will need to occur in various forms unless the company owners and the government want to face widespread civil unrest. Personally, I'm ready for the robots to perform menial tedious tasks. I would prefer to work far fewer hours at a "job" and have more time to work on things that I want to do. Even if that means being somewhat broke all the time.
@clonkex 9 months ago
You know the clothes you wear? Yep, produced mostly automatically. The car you drive? Produced mostly automatically. The food you eat? Again, produced mostly automatically. I'm not saying the industrial revolution didn't destroy lives, but saying "make displacing human workers with automation illegal" is a bit silly.
@Mavrik9000 9 months ago
@@clonkex I don't mean machines, I mean automation in a way that mimics people and completely replaces them.
@brynbailey5482 6 months ago
I think putting a 'slave-owner' tax on using AI to perform work that is then going to be sold is a good idea. For one, it sets a precedent that non-human AIs have rights, and the revenue could be used to pay for the Universal Basic Income that would be required to retrain humans for other jobs and prevent large-scale social unrest.
@clonkex 6 months ago
@@brynbailey5482 No AI has rights lol. AI is not intelligent, despite the name. It's not even remotely close to being self aware. Things like ChatGPT are just predictive engines; they're not actually aware of what they're saying, only how to use language in a way that matches their training data.
@Mavrik9000 6 months ago
@@brynbailey5482 That's a good idea.
@PaulSchwarzer-ou9sw 9 months ago
❤🎉
@liangcherry 5 months ago
Nice
@hypercomms2001 9 months ago
"Shakey" is looking down from heaven......
@MelindaGreen 9 months ago
More than baby steps
@antonod424 9 months ago
Honestly i am for Agility Robotics rather than Elon's Optimus in this consumer robot market race
@jeffsteyn7174 9 months ago
It would be good to see these demos without cuts. It's highly suspect, although more convincing than Tesla's bots.
@ianosf 9 months ago
Give your robot an idle animation and expressive animations. It will look more natural and greatly improve interaction with people.
@clonkex 9 months ago
An idle animation is an interesting idea. More power usage for no real gain, but interesting nonetheless.
@ianosf 9 months ago
@@clonkex Maybe in an industrial or commercial setting there is little or no real gain, but say in an elderly-care or commercial-service setting it has a 'human' gain in terms of user interactions and comfort. But I agree it will cost more power.
@okumakamizu3030 9 months ago
And so it begins
@srb20012001 9 months ago
Warehouse, fast food, jobs disruption on the horizon.
@dondominic7404 9 months ago
First
@appletvaccount1364 8 months ago
As long as they can’t 360 heelflip varial down 20 stairs I don’t think too much of them robots
@illbelieveanything 9 months ago
THIS COULD SOLVE THE MILITARY RECRUITMENT CRISIS #WW3NOTME
@brynbailey5482 6 months ago
Yea because teaching robots to kill humans will never come back to bite us?
@Fflintiii 9 months ago
What are the QR codes for? Can the bot actually see colour, or is it just seeing the QR code and then knows the box is red?
@clonkex 9 months ago
The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.
@honorpad8475 9 months ago
Dear developers, you will soon go too far, and the plot of the film "Terminator" will repeat itself in the real world. Think about it! Don't be mad scientists who create something that could destroy us all...
@os3ujziC 9 months ago
Try telling it to apply an unstoppable force to an immovable object and see what happens.
@KoroushRP 9 months ago
When you say large language model, which one do you mean? Are you running it on GPT?
@Smiley957 9 months ago
I don’t think it matters which one they are using.
@aryangod2003 4 months ago
@@Smiley957 It doesn't matter too much. Some LLMs are more specialized
@mikhailbulgakov1472 9 months ago
At $250,000, I don't expect that there will be many buyers. And how long before it breaks down and has to be replaced? And what are the maintenance costs? The price will have to come down.
@TiaguinhouGFX 9 months ago
Amazon and GXO are two companies that recently acquired lots of these robots.
@mikhailbulgakov1472 9 months ago
@@TiaguinhouGFX I don't know about GXO, but I heard that Amazon is testing Agility robots. It does not mean that they will adopt them. I don't see how it can be profitable to buy those simple robots at those prices, but we'll see.
@ulforcemegamon3094 6 months ago
250k is the price before mass production. Once mass production starts (construction of the factory began last year), the price will go down; as for maintenance costs, those are unknown at the moment.
@trimefisto7909 9 months ago
This looks so silly now compared with Optimus Gen 2.
@Benoit-Pierre 9 months ago
I am glad not to be the engineer asked to implement this.
@brcjackson 9 months ago
These robots that were developed three years ago use the same ArUco tracking markers, but they take it a step further, mirroring the virtual and the physical together. kzbin.info/www/bejne/d5C0gYpog5mAnrM
@garymail4393 9 months ago
It was only a matter of time until someone put AI into a robot body -- Agility Robotics is first
@nightjarflying 9 months ago
The AI LLM is not "IN" the robot body
@IceMetalPunk 9 months ago
No, they're not. Embodied LLMs have been around for a few years, almost since the invention of Transformer-based LLMs. Check out Google's "SayCan", for instance, or even Boston Dynamics' recent demo of a Spot robot tour guide powered by an LLM.
@Danuxsy 9 months ago
Nope, Google among others did embodied LLMs a lot earlier, in 2023; you can find papers on it like PaLM-E.
@metaphysicalArtist 9 months ago
A human would do the task in 10 seconds based on this video, yet Digit took about 80 seconds, if this video represents real time
@KuZiMeiChuan 9 months ago
I guess they better give up then.
@metaphysicalArtist 9 months ago
@@KuZiMeiChuan lol No Mate This is a great stepping stone where robotic evolution will soon surpass the speed of human blue-collar workers. And I think you know what I mean. I can't wait to see a 14" (35cm) tall kids' version on sale next Christmas with STEM suite software for kids to tackle and interact with this iconic robot, representing the future that humanity deserves.
@nyyotam4057 9 months ago
Guys, watch?v=2RQWiJ0x_R4 .. Draw your own conclusion. My conclusion is that this is the great filter. Or, at least, one of them.
@brynbailey5482 6 months ago
Perhaps silicon life has been waiting to see if we pass this filter... or to make contact with whatever silicon-based life we create that exterminates us and comes after.
@mistycloud4455 9 months ago
We are living in the future
@youtubasoarus 9 months ago
The bot did not seem to identify the colors of the boxes and went directly for the red box without scanning anything. So unless it was preprogrammed with this knowledge beforehand, I don't see how this is even remotely a real world test? Same with the tower. Goes directly for the largest tower without scanning anything in the environment. This looks like a canned demo.
@clonkex 9 months ago
The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.
@davidd5259 9 months ago
At this rate you can load up a truck in 5 days!
@mrinnerpeace7041 9 months ago
I think you forgot that robots don't need to take breaks; they don't need to sleep or to live. They might need to recharge, though. Yet with faster improvement, they will replace us, oof.
@flashkraft 9 months ago
Now we just have to put QR codes on everything.
@haroldpierre1726 9 months ago
We will be sending robots to Mars not humans.
@srb20012001 9 months ago
Actually, that'll be a great idea. Space is too dangerous for humans anyway.
@haroldpierre1726 9 months ago
@@srb20012001 It will make the whole mission cheaper.
@ulforcemegamon3094 6 months ago
I mean, there are modified versions of Spot that are meant to be used on Mars, so...
@ourv9603 5 months ago
Who? !
@TheAstronomyDude 9 months ago
Pro tip: you're a company that sells a product; you shouldn't monetize your YouTube videos. The few hundred dollars you're making from ads signals desperation to potential clients viewing this video.
@MrErick1160 9 months ago
Well at this pace im not sure any useful task can be accomplished 😅
@boremir3956 9 months ago
So true, companies aren't going to adopt this iteration it's way too slow. Humans = 1, Robots = 0
@Srindal4657 9 months ago
It doesn't have to be fast. Just cheaper than humans. Have multiple digit robots move and you will find that the slowness of a robot doesn't matter.
@Smiley957 9 months ago
@@boremir3956 In a company, for example an Amazon warehouse, tasks are repetitive. This means that there is no need to wait that long for an elaborate answer. When the same question is asked a thousand times, storing a cache of the answer will reduce thinking time to 0.
@middle-agedmacdonald2965 9 months ago
Spoken as a person in denial. This is the slowest the robot will ever be. A year ago this was impossible. Keep in mind this robot can work 24/7, which is 168 hours a week. A human works 40 hours per week most of the time (minus about ten hours getting coffee, breaks, talking to coworkers, texting, etc). So this robot works at least 5x the speed of a human right now. The thing is, in a year or two, it'll work faster than a human can in real time. Wake up. Human labor is about to become obsolete in practical terms. Amazon, Elon, etc, know that eliminating humans is the key to a more profitable, more efficient, and easier to maintain company. It's obvious.
@marcombo01 9 months ago
Hmm, datamatrix all over the place makes me suspect the robot isn't very good at segmentation and understanding of its environment
@MASSKA 9 months ago
If it figured it out, why then the QR codes? Nice fake video...
@AgilityRobotics 9 months ago
QR codes are primarily for near-field localization, and secondarily provide a shortcut (for demo purposes) around training up a vision pipeline for object/number/color recognition. That would be straightforward but out of scope for this test, which was focused on the control of the robot in the context of natural-language LLM inputs.
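The shortcut described in the reply above amounts to a lookup table: each tag's decoded payload maps to object metadata that a vision pipeline would otherwise have to infer. The sketch below is hypothetical; the tag payloads, fields, and function names are invented for illustration.

```python
# Hypothetical sketch: decoded QR payloads resolve to object metadata, so
# LLM-generated code can answer "where is the red box?" without a trained
# vision model. All payloads and fields here are invented.

OBJECT_DB = {
    "tag_01": {"name": "red box", "color": "red", "station": 2},
    "tag_02": {"name": "blue box", "color": "blue", "station": 1},
}

def resolve_detection(qr_payload: str) -> dict:
    """Turn a decoded QR payload into object metadata for the planner."""
    obj = OBJECT_DB.get(qr_payload)
    if obj is None:
        raise KeyError(f"unknown tag: {qr_payload}")
    return obj

def find_by_color(color: str) -> dict:
    # Roughly what "pick up the red box" resolves to, given the tag database.
    return next(o for o in OBJECT_DB.values() if o["color"] == color)

print(find_by_color("red")["name"])  # → red box
```

Swapping this table for a real color/object recognition model would change where the metadata comes from, but not the LLM-driven control flow on top of it.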
@MASSKA 9 months ago
@@AgilityRobotics OK, when you do the same FASTER and without QR codes, then it will be something big.
@BrightMatolo 9 months ago
~♦ I believe we are meant to be like Jesus in our hearts and not in our flesh. But be careful of AI, for it is just our flesh and that is it. It knows only things of the flesh (our fleshly desires) and cannot comprehend things of the spirit such as peace of heart (which comes from obeying God's Word). Whereas we are a spirit and we have a soul but live in the body (in the flesh). When you go to bed it is your flesh that sleeps but your spirit never sleeps (otherwise you have died physically) that is why you have dreams. More so, true love that endures and last is a thing of the heart (when I say 'heart', I mean 'spirit'). But fake love, pretentious love, love with expectations, love for classic reasons, love for material reasons and love for selfish reasons that is a thing of our flesh. In the beginning God said let us make man in our own image, according to our likeness. Take note, God is Spirit and God is Love. As Love He is the source of it. We also know that God is Omnipotent, for He creates out of nothing and He has no beginning and has no end. That means, our love is but a shadow of God's Love. True love looks around to see who is in need of your help, your smile, your possessions, your money, your strength, your quality time. Love forgives and forgets. Love wants for others what it wants for itself. Take note, true love works in conjunction with other spiritual forces such as patience and faith (in the finished work of our Lord and Savior, Jesus Christ, rather than in what man has done such as science, technology and organizations which won't last forever). To avoid sin and error which leads to the death of our body and also our spirit in hell fire, we should let the Word of God be the standard of our lives not AI. 
If not, God will let us face AI on our own and it will cast the truth down to the ground, it will be the cause of so much destruction like never seen before, it will deceive many and take many captive in order to enslave them into worshipping it and abiding in lawlessness. We can only destroy ourselves but with God all things are possible. God knows us better because He is our Creater and He knows our beginning and our end. Our prove text is taken from the book of John 5:31-44, 2 Thessalonians 2:1-12, Daniel 2, Daniel 7-9, Revelation 13-15, Matthew 24-25 and Luke 21. Let us watch and pray... God bless you as you share this message to others.
@GNARGNARHEAD 9 months ago
Did you guys snap up Google's marketing team? Total BS.
@clonkex 9 months ago
How so? The test is about integrating LLMs into the control process, not about looking for boxes of a specific colour. In other words, the robot already knew about the boxes (presumably the QR codes identify which box is which). That's fine, though, because the point was to demonstrate that LLMs can write code to interact with the Digit API. The LLM doesn't "see" the world; the LLM is given some information from the robot to start with, then writes some code to achieve the goal based on that information. It doesn't really matter where that information came from in this test.
@GNARGNARHEAD 9 months ago
@@clonkex yeah you're right, I might have been too harsh
@sausage4mash 9 months ago
that's impressive
@tiefensucht 9 months ago
The ultimate test would be a robot that builds a Lego model with the help of the paper manual or cooks something from a recipe without specific programming.
@spikypotato 9 months ago
Now make paper clips.
@konsul2006 6 months ago
I don't see the need for a legged robot in that environment. Put the upper part (arms) on a wheeled base XD
@ProperGander011 9 months ago
It’s a good start.
@xsuploader 9 months ago
Sorry but compared to Teslabot now this is nothing.
@ulforcemegamon3094 6 months ago
Difference being that Agility *sells* Digit and many companies have already bought the robots; also, construction of the factory to mass-produce them started last year. Meanwhile, Optimus is pure hype at the moment and mass production seems far away. Oh, and Digit is also more energy-efficient than Optimus.
@ezequiasluiz4349 9 months ago
They gave him intelligence, now he can demand his labor rights😢 Finally 🥲