I've fallen at the first hurdle. Before building the robot you have to turn the servo motor by 90 degrees, but the Python script they provide returns the error: ModuleNotFoundError: No module named 'machine'. I'm running the script directly from the Pico, and I pre-installed MicroPython on it. Any ideas gratefully received.
@thecosmicbee · 1 day ago
I can't recall offhand whether it came with CircuitPython or MicroPython (apologies, it has been a while since I worked with the car). If it was the former, I think you'll need to either modify the script to work with the other or install MicroPython first to run it. IIRC I installed MicroPython myself and then ran the logic.
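For reference, the 90-degree move itself is just a PWM pulse. Below is a minimal sketch assuming MicroPython on the Pico; the GPIO pin and the 500-2500 us pulse range are assumptions, so check the kit's docs for the actual wiring and servo limits. The helper is plain Python so it runs anywhere; the `machine` calls only work once MicroPython is actually flashed and the script runs on the board.

```python
def angle_to_duty_u16(angle, min_us=500, max_us=2500, period_us=20000):
    """Map 0-180 degrees to a 16-bit PWM duty value for a 50 Hz servo signal."""
    pulse_us = min_us + (max_us - min_us) * angle / 180
    return int(pulse_us * 65535 // period_us)

# On the Pico itself (with MicroPython flashed), usage would look like:
#   from machine import Pin, PWM   # 'machine' only exists under MicroPython
#   servo = PWM(Pin(0))            # pin 0 is an assumption
#   servo.freq(50)
#   servo.duty_u16(angle_to_duty_u16(90))  # ~1.5 ms pulse, roughly centered
```

The ModuleNotFoundError usually means the script ran under desktop Python (or CircuitPython) instead of on-board MicroPython, since `machine` only exists there.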
@awapppp · 2 months ago
That's so cool, bro. I just got my own and was trying to see if you could make it a static color out of the box, but I don't think you can, unfortunately.
@_MomeM · 3 months ago
Definitely a funny idea
@kingkasma4660 · 3 months ago
Good work!
@Apollost · 5 months ago
Nice
@JayIsDed · 5 months ago
Very cool video! Glad I came across it. I'm currently working on a project with the same ESP32-C3 dev board. So far it's running an I2C OLED display, but I might get one of those 4 or 7 inch e-inks after seeing your video. Didn't know there was support for these displays in ESPHome.
@thecosmicbee · 5 months ago
Hey, they work great. There's the issue with partial refreshing, but full refresh has been pretty consistent. There's also a PR someone mentions in this thread re: the three-color displays. I have a few of those, so I'm hoping to try it out in the future. github.com/esphome/feature-requests/issues/2101
@JayIsDed · 5 months ago
@thecosmicbee Definitely looking forward to seeing how those work! I've been told their refresh is a bit slower, but for most data-statistic displays that shouldn't be an issue unless it's something absurd like over 5 minutes. Either way, great work and I'll sub for future videos! Best of luck to you and your projects!
@christopherarendt3531 · 5 months ago
I guess it's giving a pic to Llama periodically, then projecting the new result in a loop
@thecosmicbee · 5 months ago
Yeah, taking a photo of the sand on each change for use with DepthAnythingV2 and ControlNet, via the AUTOMATIC1111 Stable Diffusion web UI's API, which handles the rendering. It detects hands in the photo each iteration to determine whether it should rerender (once hands leave the viewport).
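The hands-leave-then-rerender gating described above can be sketched as a tiny state machine. This is my illustration, not the project's code; the frame threshold is an assumption, and the actual hand detection isn't shown.

```python
class RerenderGate:
    """Fire one rerender after the sand changed AND hands have been out of
    the viewport for a run of consecutive frames (threshold is assumed)."""

    def __init__(self, quiet_frames=30):
        self.quiet_frames = quiet_frames  # frames without hands before firing
        self.quiet_count = 0
        self.dirty = False

    def update(self, hands_present, sand_changed):
        """Feed one frame's observations; returns True when a rerender should run."""
        if sand_changed:
            self.dirty = True
        if hands_present:
            self.quiet_count = 0
            return False
        self.quiet_count += 1
        if self.dirty and self.quiet_count >= self.quiet_frames:
            self.dirty = False
            self.quiet_count = 0
            return True
        return False
```

Each camera frame feeds `update()`; the render call (the expensive diffusion step) only runs on a True result, so sculpting with hands in view never triggers it.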
@Ila_Aras · 5 months ago
Doesn't really look like any terrain, I don't understand.
@Ila_Aras · 5 months ago
Never mind, I see now. You can directly input what it should be. Very cool
@majverhovnik3911 · 5 months ago
Pretty cool. How does this detect the relief?
@thecosmicbee · 5 months ago
Using DepthAnythingV2 -- it takes a photo and uses OpenCV to warp it so it's constrained to the sandbox, based on the ChArUco board projection. That warped image is then used as the input to the Stable Diffusion ControlNet depth model for a set prompt. The prompt can be updated via voice.
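For a rough picture of the rendering call, a ControlNet depth request to the AUTOMATIC1111 web UI API looks something like the sketch below. The key layout follows the common ControlNet extension API, but treat the exact field names, model name, and endpoint as assumptions for any given install.

```python
import base64
import json

def build_txt2img_payload(prompt, depth_png):
    """Build a txt2img request body with a single ControlNet depth unit.

    `depth_png` is the warped depth image as PNG bytes. The model name and
    key layout are assumptions based on the ControlNet extension's API.
    """
    return {
        "prompt": prompt,
        "steps": 20,
        "width": 512,
        "height": 512,
        "alwayson_scripts": {
            "controlnet": {
                "args": [
                    {
                        # Depth map is precomputed, so no preprocessor module
                        "module": "none",
                        "model": "control_v11f1p_sd15_depth",
                        "input_image": base64.b64encode(depth_png).decode("ascii"),
                    }
                ]
            }
        },
    }

# The body would then be POSTed to the web UI, e.g.:
#   POST http://127.0.0.1:7860/sdapi/v1/txt2img
payload = build_txt2img_payload("aerial view of a desert canyon", b"\x89PNG...")
body = json.dumps(payload)
```

Swapping the `prompt` string (here via voice) while keeping the same depth input is what lets the same sand shape rerender as different terrain.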
@adrianinvents · 5 months ago
What a cool project!
@thecosmicbee · 5 months ago
Thank you!
@Electromakerio · 5 months ago
Fantastic video and an interesting project! We love the Blues Notecards at Electromaker -- they are so simple to use! Thanks for uploading your project tutorial to the Electromaker project hub :)
@frankdearr2772 · 5 months ago
Great topic, thanks 👍
@RixtronixLAB · 6 months ago
Creative video, keep it up, thanks :)
@codydyck · 6 months ago
Man, when it was delayed while you were telling it to park, I've never felt someone's struggle that much. Mine tracks people using Google's CV pose detection and it always drives too far and hits stuff.
@thecosmicbee · 6 months ago
Yeah, for sure. In retrospect I should have used one of the custom commands and taught it just "Stop" instead of the built-in "Park a Car", which feels a bit harder to parse. Definitely had me sweating when it ignored me the first time, haha. I think the mecanum car's wheels on the floor make it a bit harder to hear, as they were pretty loud. If I were to do it again, I'd make the voice part a handheld device that just sends the movement commands over its hosted access point network, perhaps with custom commands for shorter movements like advancing only a few feet in a direction each time.
@codydyck · 6 months ago
@thecosmicbee I like that idea: just have a headset on and send the commands. Then connect multiple robots together in ArduPilot Rover and have a voice-controlled robot swarm.
@uihi9660 · 6 months ago
Really nice~
@thecosmicbee · 6 months ago
Intentionally included the API key as part of the demonstration -- I had deleted the bot / invalidated the key prior to posting the video. In general you want to avoid sharing it (unless demonstrating as part of a tutorial) as it could give access to others (in this case giving them the ability to spam you).
@The22v10 · 6 months ago
Very cool, great Seeed hardware.
@thecosmicbee · 7 months ago
Apologies re: the mention of the I2C display -- I misspoke. That's a TM1637, so ESPHome is using its CLK and DIO pins to control the display. I'm so used to I2C displays that my mind blanked when I saw the two-wire control.
@ernestogabrielreyesvazquez6273 · 7 months ago
Hey bro, can you explain how you program the module please?
@thecosmicbee · 7 months ago
They have Python and Arduino libraries for the sensor: github.com/DFRobot/DFRobot_DF2301Q My Rust module is located at github.com/Cosmic-Bee/df2301q.rs docs.esp-rs.org/book/installation/rust.html has some information about getting set up for Rust development on the ESP32 platform (I used it here for this module). The Arduino library may be the easiest path forward, though, as you can search from the Arduino IDE to install it and get up and running fairly quickly.
@aleksandarshishkov3052 · 8 months ago
Hi, could you tell me where to find this custom flash? Thank you
@thecosmicbee · 8 months ago
Hey there, sorry for the late reply. Right now it's in a fork of the SenseCAP Indicator ESP32 repository as a pull request. The pull request: github.com/Seeed-Solution/SenseCAP_Indicator_ESP32/pull/24 The associated branch: github.com/Timo614/SenseCAP_Indicator_ESP32/tree/timo614-indicator-matter I recently updated it to a more recent ESP-IDF version, as the underlying repository was updated as well.
@SpencerTechMelody · 9 months ago
Cool 😎 Bro.
@f_erbey · 10 months ago
It seems like a weak recognition algorithm. I want to create an app that, while reading a text aloud, recognizes the voice and highlights the word or phrase in the text. Do you think this crate would be suitable for that? And if not, what kind of library should I use?
@thecosmicbee · 11 months ago
The DK's schematic is located at: www.st.com/content/ccc/resource/technical/layouts_and_diagrams/schematic_pack/group1/0a/d5/78/b6/d7/bd/45/de/mb1292_shematics/files/MB1292-WB5MM-B01_Schematic.PDF/jcr:content/translations/en.MB1292-WB5MM-B01_Schematic.PDF It has an LD1117S50TR on VIN, which drops the voltage down to 5V and provides up to 800 mA. I opted to use a 5V regulator from WeAct instead, as it has a 5V 5A output via its SY8205. I'm also using a WeAct PD module (which I believe uses a CH224K) for USB PD; I have it set to 12V, which I'm using to power the board via VIN and to feed the 5V regulator.
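For anyone wondering why a switching regulator matters at 12 V in: a linear regulator like the LD1117 burns the whole input-output drop as heat, which a quick back-of-the-envelope check makes obvious (numbers below are illustrative, not measurements of this board).

```python
def ldo_dissipation_w(v_in, v_out, current_a):
    """Heat dissipated by a linear regulator: (Vin - Vout) * I."""
    return (v_in - v_out) * current_a

# LD1117 dropping 12 V to 5 V at its 800 mA rating:
heat = ldo_dissipation_w(12.0, 5.0, 0.8)  # roughly 5.6 W of waste heat
# A buck converter like the SY8205 transfers most of that power to the
# load instead of burning it, which is why it can supply 5 A at 5 V.
```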
@thecosmicbee · 11 months ago
Apologies in advance for my terrible soldering. I tried to use the leads to connect the different components, and then ended up letting the solder pool a bit (probably a terrible idea) to connect a few of the junctions. I probably should have cut individual wires to connect it all in some nice, neat way. It works, it just isn't the prettiest. I plan on creating a proper board for it in the future and ordering it, time permitting, so once I do that this issue should disappear.
@thecosmicbee · 11 months ago
My soldering is a bit rough on the protoboard (some blobs, etc.) -- unfortunately I didn't see it pooling there when I connected the headers at the end, but it works well enough. I also opted to make the larger blobs when connecting the leads on the back. I wish I'd had time to put together a proper board, but it wouldn't ship before the contest date, so I'm opting for this. At least the wires look somewhat nice. Time permitting, I'll put together a proper board in the future to replace this one.
@dylanwillyams · 11 months ago
So it’s a clock that’s really hard to use. Sweet
@ericbenitez4082 · 11 months ago
Nice
@tueurdelombre7395 · 1 year ago
Red LEDs could make it even better. But it's already really good
@iSEKaiSG · 1 year ago
Hi, I wanna ask: how do you power an LED strip with one 3.7V Li-ion battery? I thought you would need a few cells in series to power it, since the strip probably runs off 5V/12V/24V?
@thecosmicbee · 1 year ago
I had originally seen a similar setup in the Ruiz Brothers' Darksaber tutorial, so I knew I could drive LED strips from the PropMaker itself, and went with a similar setup here: learn.adafruit.com/ble-darksaber-propmaker/circuit-diagram I just took a look at the schematic now to see if perhaps it had some boost element, but it doesn't look like it does: learn.adafruit.com/assets/69243 It looks like it sends either VBAT or VBUS to the V+ line, which is then used for the NeoPixel out port. So yeah, nothing there boosting it. I suppose it must be powering the strip with whatever voltage the battery has, and that must be enough for the WS2812B strips (I had thought they were 5V myself and that it was boosting it).
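One practical caveat when running WS2812Bs straight off a LiPo: current adds up fast. A commonly cited worst case is about 60 mA per pixel at full white, so a quick budget check looks like the sketch below (the per-pixel figure is a rule-of-thumb assumption, not a measurement).

```python
def strip_current_a(num_pixels, ma_per_pixel=60):
    """Worst-case (all pixels full white) draw for a WS2812B strip, in amps."""
    return num_pixels * ma_per_pixel / 1000.0

# A 30-pixel strip at full white:
draw = strip_current_a(30)  # 1.8 A, which is a lot for a small LiPo
# Capping brightness or using sparse animation patterns keeps the real
# draw far below this worst case.
```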
@adampouliot8409 · 1 year ago
Great work, what is it?
@adampouliot8409 · 1 year ago
Hope to see more!
@thecosmicbee · 1 year ago
Hey, this was an LED strip for a poolside lane indicator for night swimming. Hackster had a contest for creating devices for assisted living, and one of the topics was visual issues while swimming. I went super low-tech with this one, as I ran out of time. I had originally hoped to do something with machine learning to help indicate the user's location in the lane and then update the LED strip (for example, flooding the entire right side red to indicate you've gone out of the lane).
@MonkeyNeuronActivation · 1 year ago
So it's 4 touch sensors with a joystick?
@thecosmicbee · 1 year ago
Yup. From talking with Vivek, one of the big issues he had was with normal mice and the movement involved in both positioning and clicking. The conductive paint worked well for the touch elements, making for a mouse that triggers easily and responds well with little movement on the user's part.
@abhiramsp9972 · 1 year ago
After turning it on with the wake-up command, what's the maximum duration it remains active without interruption? Also, can I use clapping sounds as the wake-up command?
@thecosmicbee · 1 year ago
You can configure the maximum duration of the active state with the set wake time command. From their header file, it looks like 255 seconds may be the limit: github.com/DFRobot/DFRobot_DF2301Q/blob/master/DFRobot_DF2301Q.h#L168 You can set up a new wake word and it'll learn it. I haven't tried clapping for the wake word, but I think it would work. You can't configure the opening phrase it says when it boots, but for low-key setups it seems like a really neat way to get up and running with simple voice commands without needing to do any ML. I wish there were a way to disable all of the commands and then enable a subset, as it's a bit weird for it to respond to things like "Park a Car" when I'm using it as a trigger for lights in my kid's room.
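The 255-second ceiling suggests the wake duration is stored as a single byte. A sketch of clamping a requested duration before handing it to the sensor's set-wake-time call (the function name is mine, not the DFRobot API):

```python
def wake_time_byte(seconds):
    """Clamp a requested wake duration to the sensor's apparent 0-255 s range."""
    return max(0, min(255, int(seconds)))

# Requests beyond the ceiling just saturate rather than wrapping:
#   wake_time_byte(300) -> 255
#   wake_time_byte(30)  -> 30
```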
@thecosmicbee · 1 year ago
Shocked I have to say this here: it's not a real creature, it's ferrofluid. Two people somehow found my video and got confused, reaching out to me fearing others would be as well. I put that in the description. I was just messing around; you have nothing to be concerned about regarding dangerous magnetic lifeforms.