Testing Brain-Computer Interfaces

139,787 views

James Bruton

1 day ago

Comments: 331
@jamesbruton 4 years ago
You can support me on Patreon or through YouTube channel membership - Patrons and Members get access to all the videos up to a week early! www.patreon.com/XRobots
@GeeveGeorge 4 years ago
Would be cool if you could build a user interface like Emotiv's, where they record spikes for a few seconds and ask the user to make specific movements. They then use pattern recognition to classify the spikes accordingly in real time.
@JClemente1980 3 years ago
Emotiv EPOC also has an SDK for everyone to use, and it has included from the start a small FPV demo where you control the movement with your own mind. They have been around for at least 7-8 years. The difference from your set and approach is that they calibrate the movement from your own thought, which means you can still use your limbs and have an extension. For example, to control a third arm, or even a tail :P . I do not like their wet electrodes; I had already been testing with dry electrodes, a design very similar to the ones you're using. I've thought about using that in my own PhD: I wanted to supply my own EEG to a few growing neurons on a petri dish with electrodes, just to see what happened... What I was actually testing was the coating for electrodes to be implanted, but it could be a more realistic situation if they were being stimulated with my own waves... P.S. No, I did not want to download my consciousness to a set of neurons!!!!
@Graham_Wideman 4 years ago
James: There is a huge literature on EEG signal detection and interpretation that you could be drawing on. You are confronting at least two problems here. One is using just a few electrodes on the scalp to attempt to localize signals from a specific part of the brain. This is known as the "inverse" problem, and it's non-trivial, to say the least. Search for papers on that topic.

The second problem is trying to pick up signals that convincingly correlate with some pattern of thought. Those are quite small signals relative to the coarse waves seen at an electrode, which represent the sum of activity of thousands or millions of neurons near that electrode. To detect such a signal, experimenters use a paradigm like: present the experimental subject with a succession of trials, repeating the same stimulus over and over again (intermingled with control different-stimulus trials), and then average the signals from like trials together, aligned to the time of the stimulus onset. Then subtract the average control signal from that of the stimulus trials to produce a result signal. Hoping to discern a thought with a single trial, no control trial, and no sync to stimulus is pretty optimistic.

As a side note: 18:02 -- "the electrode is pointing in to the other side of my brain". No, that's not a thing. Those electrodes are simply making contact with the skin on your head, and the only way they receive a signal is plain old conduction of the signal from all the firing neurons, through the tissue of the brain, the outer membrane that surrounds the brain, the cerebrospinal fluid (CSF), the skull, and the skin, with all the inhomogeneities that involves.
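[Editor's note] The trial-averaging paradigm described in the comment above can be sketched with purely synthetic data. Everything here is simulated (the evoked-response shape, noise level, and trial counts are illustrative assumptions, not real EEG):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                        # sample rate in Hz, typical for consumer EEG
n_trials, n_samples = 200, fs   # 200 one-second trials per condition

t = np.arange(n_samples) / fs
# Simulated evoked response: a small bump peaking ~300 ms after stimulus
# onset, which will be buried in noise with twice its amplitude.
erp = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

# Single trials: the evoked bump is invisible under 10 uV-scale noise.
stimulus_trials = erp + rng.normal(0, 10, (n_trials, n_samples))
control_trials = rng.normal(0, 10, (n_trials, n_samples))

# Average like trials together, aligned to stimulus onset, then subtract
# the control average -- uncorrelated noise cancels, the evoked part stays.
result = stimulus_trials.mean(axis=0) - control_trials.mean(axis=0)

peak_time = t[np.argmax(result)]
print(f"recovered peak near {peak_time * 1000:.0f} ms after stimulus onset")
```

With 200 trials per condition the noise in the average drops by a factor of about sqrt(200), which is exactly why single-trial "mind reading" is so optimistic.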
@MollyWi 4 years ago
Yeah, I agree, the problem is sorting through all the large noise to find something localized. Even then, the electronics would need to be very accurate differential amplifiers that can detect signals below the noise floor. Electromagnetic interference, from say a television in another room, is far more likely to appear on that graph than an inner brain signal. Getting any sort of muscle signal is a good starting point though.
@jamesbruton 4 years ago
Super interesting, thanks!
@minibigs5259 4 years ago
Excited to see James develop some self-trials (baseline, cat, cat, cat, dog, etc.) along with the deep learning for filtering!!
@noaht5654 4 years ago
You always have to worry about what you are actually measuring. EEG picks up electrical potentials on the skin. Anything that can cause those potentials can produce a signal, like your 50 Hz line noise. Around 7:20, you may be producing a muscle signal ever so slightly on your scalp. Try clenching your jaw next time you wear the headset; you'll clearly see the muscle signal. 19:00 If you are thinking about controlling a robot limb, it may still be beneficial to have sensors over the somatosensory and (pre)motor cortices. Think of it like your brain "simulating" its own limb movements. These are great resources to learn more: www.mikexcohen.com/#books. Some people like Cohen, some people don't. I think he at least tries to fully explain competing schools of thought on EEG signals, so he may be a good reference.
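[Editor's note] For the 50 Hz line noise mentioned above, a notch filter is the standard first step. Here is a minimal sketch with a hand-rolled biquad (coefficients from the widely used RBJ audio-EQ cookbook formulas; the signal is synthetic, and the Q value is an illustrative choice):

```python
import numpy as np

def notch_coeffs(f0, fs, q=30.0):
    # Biquad notch (RBJ cookbook formulas), normalized so a0 == 1.
    w0 = 2 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2 * q)
    b = np.array([1.0, -2 * np.cos(w0), 1.0])
    a = np.array([1 + alpha, -2 * np.cos(w0), 1 - alpha])
    return b / a[0], a / a[0]

def biquad_filter(b, a, x):
    # Direct-form I difference equation:
    # y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]
    y = np.zeros_like(x)
    for n in range(len(x)):
        y[n] = b[0] * x[n]
        if n >= 1:
            y[n] += b[1] * x[n - 1] - a[1] * y[n - 1]
        if n >= 2:
            y[n] += b[2] * x[n - 2] - a[2] * y[n - 2]
    return y

fs = 250
t = np.arange(4 * fs) / fs
# 10 Hz "alpha" plus 50 Hz mains pickup of equal interest.
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 50 * t)

b, a = notch_coeffs(50.0, fs)
y = biquad_filter(b, a, x)

def amplitude_at(sig, f):
    # Amplitude estimate at frequency f over the last 2 s (past the transient).
    tail = sig[-2 * fs:]
    spectrum = 2 * np.abs(np.fft.rfft(tail)) / len(tail)
    freqs = np.fft.rfftfreq(len(tail), 1 / fs)
    return spectrum[np.argmin(np.abs(freqs - f))]

print(f"50 Hz before: {amplitude_at(x, 50):.2f}, after: {amplitude_at(y, 50):.2f}")
```

The narrow notch leaves the 10 Hz component essentially untouched while killing the mains frequency, which is why it is usually applied before any further analysis.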
@donaldviszneki8251 4 years ago
The weirdest thing he suggested was that the electrodes were "pointing" somewhere inside his brain. That's not how it works.
@zabridges 4 years ago
Man, there is a DIY kit for everything! Incredible!
@thesacredsword7230 4 years ago
DIY replacement kidney, next step in DIY kits
@donaldviszneki8251 4 years ago
The 8 channel base kit is $500. That's affordable, but not affordable enough IMO.
@GusCraft460 3 years ago
@@donaldviszneki8251 compared to what it would normally cost, that is insanely cheap. Of course most people don’t just have $500 lying around to spend on something like this, but it’s still not completely out of most people’s price range for something like a birthday gift.
@OMAR-fq4qi 3 years ago
Link to buy all electronic parts please
@RIPtechnoblade1 2 years ago
A DIY kit for getting $1.5k is what I need
@alexisandersen1392 4 years ago
Because this is open source, I imagine that the data on display is rather immediate and raw from the sensors themselves (which is good and bad)... The problem with EEG sensors is that blood pressure has an effect: you can see that when you tense your muscles and it lights up your whole brain. That's not brain activity though, it's blood going to your head rather than to your constricted extremities. A bio-sensor for core blood pressure, typically positioned on the lower neck or chest, can be used as a point of reference to filter blood pressure out of the sensor data. While cerebral blood flow is decoupled from body blood pressure, you still have the skin and muscles over the skull, with bodily blood pressure interfering with the probes. Typical commercial products will already filter out these discontinuities to some degree, so they're easy to work with out of the box, but it also means the data has less fidelity to the sensors' measurements. The sensor data needs treatment and filtering to be useful, but treating the data taints it; with an open source setup, how much and what kind of treatment is up to you. If you want to use the sensor data, it needs to be heavily treated so that you can get faithfully reproducible interaction. There are effectively too many variables, and they're all interfering with your intent; you can't really look at one sensor's data and say much of anything.
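[Editor's note] The reference-sensor idea above can be sketched as a simple regression: record a second channel that mostly sees the artifact, fit it to the scalp channel by least squares, and subtract the fit. All signals below are simulated and the channel names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 250, 2500
t = np.arange(n) / fs

brain = np.sin(2 * np.pi * 10 * t)            # activity we actually want
pulse = np.sin(2 * np.pi * 1.2 * t)           # ~72 bpm pressure artifact
scalp = brain + 3.0 * pulse + rng.normal(0, 0.3, n)   # contaminated channel
chest_ref = 0.9 * pulse + rng.normal(0, 0.1, n)       # reference sensor

# Least-squares fit of the reference (plus a constant offset) to the scalp
# channel, then subtract the fitted artifact estimate.
A = np.column_stack([chest_ref, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, scalp, rcond=None)
cleaned = scalp - A @ coef

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

print(f"pulse correlation before: {corr(scalp, pulse):.2f}, "
      f"after: {corr(cleaned, pulse):.2f}")
```

This only works to the extent that the reference channel really does see the same artifact waveform; in practice adaptive filters generalize this idea to time-varying coupling.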
@izhamazman209 4 years ago
Sorry, I didn't fully understand your comment, since I'm only just starting to get into these kinds of things and not really a person with biology knowledge. But I gotta ask: you said the data does not fully indicate brain activity, since there are too many variables at play. Is there any possible way to gather only brain signal data, aside from being invasive like Neuralink? Would a fully paralyzed person with only a functioning brain provide only brain signal, since muscle is out of the equation? How much difference is there in the data between being awake and sleeping? Is the brain less active during sleep? Would the data be similar between an awake fully paralyzed person and a person in deep sleep? Hope you can answer even a few of those questions. Thanks in advance
@alexisandersen1392 4 years ago
There are far too many factors at play when it comes to human brains in general, much less accounting for abnormal brain conditions. That said, someone being paralyzed doesn't necessarily indicate a problem with their brain, as is the case with spinal cord damage. If the brain is typical and undamaged, it should behave similarly to other typical, undamaged brains. As for sleep vs. awake, it really depends on what stage of sleep the brain is in; sleep is a whole other can of worms. Regardless, since neural signals are so finely granular, and because there are so many connections, each with its own signal, it's infeasible to obtain perfect data from the brain. There will always be interference, if not from peripheral systems of the brain, then from the concert of neurons all "talking at once" into your extremely limited number of sensors. It's like trying to reconstruct a stadium full of sound over space and time with a single microphone. You will experience a loss of signal fidelity; it's just the nature of the problem. However, we can use the limited information we can gather, plus intelligent systems, to infer signal data, and this is what is typically used even with neural implants: the signals from the sensors must be fed into a system that can find known patterns associated with intended signals. One approach is an artificial neural network that can be fed the sensor data and trained to infer the implied signals, but the topology of a network capable of inferring brain-implied signals from sensor signals is its own field of study as well.
@donaldviszneki8251 4 years ago
Hey why do you say cerebral blood flow is decoupled from body blood pressure?
@alexisandersen1392 4 years ago
@@donaldviszneki8251 Cerebral blood pressure is regulated separately from the rest of the body, as too much blood pressure could lead to bruising of the brain matter.
@monad_tcp 3 years ago
@@alexisandersen1392 Brains are so complex. Can you imagine the advances we'll be able to make when we start having 100,000 probes in a human brain, so we can actually see what's happening between small groups of neurons?
@mathieusan 4 years ago
You're one head hit away from discovering the flux capacitor there
@World_Theory 4 years ago
I didn't know there were brain interface kits like this! That's pretty darn cool. I assume that you'll eventually use some machine learning with those readings to interpret what's going on, but I'm not 100% confident in this assumption.

This has sparked an idea, though. I've been diving into the subject of VR lately, so that's the track my mind tends to be on now. I've seen some really interesting tracking technology and ways to drive VR avatars, but I've never seen someone use a brain-computer interface (other than in fiction) to help control an avatar. With a brainwave reading kit, though, I think you could control at least the body language of non-human body parts in an avatar. How much you're concentrating, for example, could be used to modify the idle animation of a tail, wings, or ears, taking inspiration from real-life animals when they're concentrating on something. I think that combining brainwave tracking with full body tracking and facial expression tracking would give a neural network (AI) interpreter a lot of cues to guess at the state of your mind. Perhaps just enough cues to actually fully animate a non-human avatar, despite the lack of physical body parts, in some cases, to actually track with traditional means. (With "traditional" being relative, considering that VR and body tracking are still relatively new fields, even counting the movie industry's use of CGI motion capture.)

Another thing that I think could be useful is using VR as a prototyping tool for control and tracking methods for robots and other things. Instead of having to spend money on physical materials to build a prototype, just to work out the bugs in your control interface, you could use a 3D avatar. Even if the avatar is hideous and super basic, the movement of the bones should still be useful, provided the skeleton measurements are accurate.
@vgaggia 4 years ago
I wonder, if you set that up for movement and applied some sort of anesthetic to the person, whether they'd be able to move like they're actually in the VR. Obviously it'd take some machine learning voodoo and insanely accurate sensors, but it would still be interesting. Edit: could be useful for people with medical issues who can't move anything but whose brains are still functioning perfectly.
@feda9562 4 years ago
@@vgaggia That's exactly what BCIs were first developed for in the '70s
@matthewcollier3482 4 years ago
"Mooom can we get Neuralinks please?" "No we have Neuralinks at home!" Neuralinks at home:
@mrmartinwatson1 4 years ago
Probably be the only person to successfully log off of SAO
@TechMarine 4 years ago
If I may suggest, you should use the "tree" support structure; you would save so much plastic on that kind of construction
@ApanLoon 4 years ago
“Do you know what this means? It means that this damn thing doesn’t work at all!”
@StevenIngram 4 years ago
I thought of the same quote. LOL
@jonmayer 4 years ago
Listen, Doc.
@sunboy4224 4 years ago
I actually have worked in labs that did BCIs, and just graduated with a PhD in biomedical engineering with a focus on neuroengineering. I'm sorry to say, but this project is dead in the water. It's been about 5 years since I've looked at an EEG signal, so my memory is admittedly a bit fuzzy on the details, but my lab was attempting to decode leg kinematics from EEG. We were using decently complex filtering (unscented Kalman filters, H-infinity filters, all kinds of frequency-domain filters, as well as PCA reconstruction [for offline analysis]), and in the year that I was there we weren't able to decode a signal that "looked good" (we were getting some metrics suggesting it was kind of working, but DEFINITELY nothing that looked correct to the eye). We believe this is because leg kinematics are actually encoded in the spine rather than the brain, but even if that wasn't the case (as with hand movements), the EEG signal is just WAY too noisy to do anything with. It's been a while since I've looked at EEG, but I'm pretty confident in saying that most or all of the signal you were seeing on your headset was EMG artifact. If you WERE to see some kind of movement signal, it probably wouldn't look like what you think it would. It's not going to be that enormous amplitude spike, 5x larger than the noise floor. It would probably be some small increase in spectral power that correlates loosely with the limb moving. You just don't have enough electrodes in the area to get a useful signal, and you're probably not going to without opening up your skull and sticking something in. As for putting electrodes on your frontal cortex, well... you might actually have a little bit better luck there. Again, you're not going to be able to think "cup" and have the computer understand.
But, MAYBE you can put the probes on your visual cortex and get the computer to recognize when you look at a very large red piece of paper that you mean cup, and a very large blue piece of paper means something else (and even then, you have to be very careful that the computer isn't just seeing the difference between "look left" and "look right", i.e. EMG from your eyes). A great example of a functioning EEG device that you might base your design on is the "P300 speller". The P300 wave is a repeatable brain signal that occurs when you observe a specific stimulus among a background of similar stimuli (imagine a room full of people who are sporadically clapping, and you focus on a specific person and count every time they clap). Using a specially built system, a computer is able to determine which letter a person is focusing on in a displayed keyboard, and then type the chosen letter, allowing someone to type without using their hands. THAT would be a cool project to get going...either spell the object you want ("cup", "remote", etc), or just display pictures of each object and have a "P300 speller"-like system pick out what you want.
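[Editor's note] The row/column logic of a P300 speller can be sketched with simulated epochs. The P300 shape, amplitudes, noise level, and repetition count below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 250
epoch_t = np.arange(int(0.6 * fs)) / fs     # 600 ms epoch after each flash
# Idealized P300: a positive deflection peaking ~300 ms post-stimulus.
p300 = 3.0 * np.exp(-((epoch_t - 0.3) ** 2) / (2 * 0.04 ** 2))

target_row, target_col = 2, 4               # the letter the user attends to
n_reps = 15                                 # each row/column flashes 15 times

def flash_epochs(is_target):
    # One noisy epoch per flash; only attended flashes evoke the P300.
    signal = p300 if is_target else 0.0
    return signal + rng.normal(0, 8, (n_reps, len(epoch_t)))

# Average the epochs for each row and each column, then score the average
# in the 250-400 ms window where the P300 should appear.
window = (epoch_t > 0.25) & (epoch_t < 0.40)
row_scores = [flash_epochs(r == target_row).mean(axis=0)[window].mean()
              for r in range(6)]
col_scores = [flash_epochs(c == target_col).mean(axis=0)[window].mean()
              for c in range(6)]

decoded = (int(np.argmax(row_scores)), int(np.argmax(col_scores)))
print(f"decoded grid cell: {decoded}")
```

The decoded (row, column) pair indexes the letter on the displayed keyboard; repetition is what makes this robust, exactly as with the trial-averaging paradigm discussed earlier in the thread.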
@1kreature 4 years ago
You said it yourself... these are just single-conductor electrodes that contact the skin. So I don't know why you would think they "point" in any direction and thus have directional pickup capability. As for the moving of the arms: try not waving the arms close to the headset; less noise. Sideways out from the body, or simply making a flat hand/fist with arms resting on your lap, works better.
@benGman69 4 years ago
*James tenses his arms and suddenly drops to the floor with spit around his mouth
@clancywiggum3198 3 years ago
For what it's worth, and I admit I'm a bit late here: the primary motor cortex is effectively just a bank of output pins for the brain. It contains the brain end of the neurons that run down the spine and, through a couple of subsequent more or less direct connections, connect to parts of muscles, so if you are directly reading it you can only sense actual physical movement. It's also perfectly mirrored: all neurons for one side of the body go to the other side of the brain. BCIs get more interesting if you target the motor planning areas, because you can plan how to move part of you without actually having to move it, so there's scope for pure mind control of computers or hardware through that technique, although of course the resolution is low. I would presume that's where you've accidentally targeted, because it's a larger area, and the suggestions of mixed right and left hemisphere involvement in any given limb would probably come from mixed processing in the motor planning areas, as the true primary motor cortex does little if any actual processing.
@ollie-d 3 years ago
I have been doing EEG-based BCI research for just under 6 years now, and there are a lot of errors in this video, but it's a fairly good start. It wouldn't be productive to nitpick everything here, but know that you're unlikely to get a good motor imagery-based system with more than two classes using the OpenBCI. You need to collect more controlled data. If you want to be making inferences about imagined movement, make sure you're not moving your limbs but rather imagining movements (rotating a door knob works well). You should see a decrease in the mu rhythms over the sensorimotor cortex during imagined movement. You'll typically see a decrease in both C3 and C4 mu power, but the mu power over the electrode contralateral to the imagined hand should be lower. If you wanted to use the activity from the actual muscle movements, you'll have a much easier time using EMG electrodes on the arms, since those signals are orders of magnitude more powerful and classification is trivial
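[Editor's note] The mu-power comparison described above can be sketched numerically. C3 and C4 here are simulated channels, and the contralateral drop in 8-13 Hz power is injected by construction, so this only illustrates the measurement, not the neuroscience:

```python
import numpy as np

rng = np.random.default_rng(3)
fs, n = 250, 250 * 4
t = np.arange(n) / fs

def channel(mu_amp):
    # Background white noise plus a mu-rhythm (~10 Hz) oscillation.
    return rng.normal(0, 1, n) + mu_amp * np.sin(2 * np.pi * 10 * t)

def band_power(x, lo, hi):
    # Sum of FFT power within [lo, hi] Hz.
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return float(psd[(freqs >= lo) & (freqs <= hi)].sum())

c3 = channel(mu_amp=0.5)   # contralateral to imagined right hand: desynchronized
c4 = channel(mu_amp=2.0)   # ipsilateral: mu rhythm intact

mu_c3 = band_power(c3, 8, 13)
mu_c4 = band_power(c4, 8, 13)
print(f"mu power C3: {mu_c3:.0f}, C4: {mu_c4:.0f}")
```

In a real system the two band powers (or their ratio) would be the features fed to a classifier, computed over short sliding windows rather than one long recording.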
@LucasMcDonald 4 years ago
James, I used to do "biofeedback" as a training method for ADD. It used basically the same interfaces as you're using here. The mantra they taught me to "brain train" with, I found years later, is just mindful meditation. I'm sure looking into this will help you with your results.
@Rooey129 4 years ago
That helmet is a giant antenna, especially at low frequencies; you should shield it and ground it out.
@jamesbruton 4 years ago
I might put my whole head in a metal box
@Rooey129 4 years ago
@@jamesbruton That's a good idea. If it works, you could even just run the sensors off a shielded Cat6 cable, making sure to ground the shield, with aluminum foil around the helmet.
@quattrocity9620 4 years ago
I've seen bigger...
@noahluppe 4 years ago
@@jamesbruton The Man in the Iron Mask 2: Electroencephalographic Boogaloo
@masterninjaworrior 4 years ago
Can’t wait to see what you do with this, I am very interested in BCI!
@wbretherton 4 years ago
'We're getting closer to Elon Musk's pig' is my favourite quote of the day
@Fury9er 4 years ago
This was very interesting, I would like to try this out one day so hopefully the open source stuff will become more common and a little cheaper in the future.
@mossm717 4 years ago
Glad to see you're doing this. I always wanted to try one of these EEG setups, but they're so expensive
@deepakjoshi6242 4 years ago
This guy always brings interesting stuff in a fun-to-watch way.
@ted5610 3 years ago
just being able to see the different outputs from the electrodes is wild :o
@georgemathieson6097 4 years ago
Favourite quote: "We're getting close to Elon Musk's pig"
@georgemathieson6097 4 years ago
Great video by the way James, keep them coming.
@Genubath1 3 years ago
Something that might affect the signals is a phenomenon called irradiation: even if you are using one main muscle for a movement, there are tons of other muscles that support it, and muscles that support those, and so on. When you lift your legs, you are also flexing your core and moving your arms to balance. When you make a fist, you flex muscles all the way up your arm and into your core.
@tiagotiagot 3 years ago
It would be interesting to see a visualization that colors each region based on the frequencies picked up by the respective sensor: mapping the lowest frequency to red, the highest to blue, and interpolating in between, then blending the various hues weighted by their respective intensities. That would produce white where all frequencies are detected at maximum strength together, and black where none are detected at all.
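[Editor's note] That visualization maps naturally onto FFT band powers. Here is a hedged sketch of the per-sensor frequency-to-hue weighting (the function name, band limits, and weighting curves are my own invented choices):

```python
import numpy as np

def band_rgb(x, fs):
    """Map a channel's spectrum to an RGB triple: low frequencies weight
    red, high frequencies weight blue, and mid frequencies weight green."""
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    keep = (freqs >= 1) & (freqs <= 45)       # ignore DC and mains frequencies
    f, p = freqs[keep], power[keep]
    pos = (f - f[0]) / (f[-1] - f[0])         # hue position, 0 (low) .. 1 (high)
    red = np.sum(p * (1 - pos))
    blue = np.sum(p * pos)
    green = np.sum(p * (1 - np.abs(2 * pos - 1)))
    rgb = np.array([red, green, blue])
    return rgb / rgb.max()                    # normalize brightest channel to 1.0

fs = 250
t = np.arange(fs * 2) / fs
slow = np.sin(2 * np.pi * 3 * t)    # delta-ish activity -> red-dominant
fast = np.sin(2 * np.pi * 40 * t)   # gamma-ish activity -> blue-dominant
print(band_rgb(slow, fs))
print(band_rgb(fast, fs))
```

Broadband activity across all frequencies would push all three weights up together, approaching the white case the comment describes.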
@BLBlackDragon 4 years ago
Even a rudimentary system like OpenBCI can have a number of applications. Motor control, bio-feedback training, etc. This could be fun to play with.
@farazsayed5730 4 years ago
What would be super awesome is a headset with loads and loads of pins connected to something like an FPGA, and some software to configure the FPGA to pick and choose regions of interest. You could even get away with using spring-loaded test probes that probably won't hurt your head if you had them sufficiently densely packed
@michaelarview 2 years ago
Also with cones around the pins to increase range
@richardstock1 4 years ago
Great video, looking forward to the next one. Subjects relating to the brain really intrigue me.
@wesmatchett615 4 years ago
This is amazingly similar to Dr. Emmett Brown's thought helmet from Back To The Future
@ViniciusMiguel1988 4 years ago
Oh James, come on, do it properly! Drill your head and stick the electrodes into the brain! 😁
@tiagotiagot 3 years ago
Maybe you could train some sort of neural net with videos of your face and the EEG readings, to identify and subtract the patterns produced by blinking, eye movements, etc. when those patterns are present, while leaving the rest of the signal intact?
@zeekjones1 4 years ago
I'd point towards the upper forehead, as there is more there to work with. As you said, to record a limb it's better to probe from said limb; however, thought is only in the head, so targeting logic and reasoning can approximate a new digital 'limb'. With tandem AI and brain training, eventually you could move a cursor, then a game controller, maybe even some custom macros on the PC.
@reggiep75 4 years ago
Reminds me of the numerous EEG tests I had for epilepsy, but this kit is just more enjoyable and fun.
@BenKDesigns 3 years ago
James, you're like John Oliver, if John Oliver were an incredibly cool nerd. And, of course, I mean that in the best way possible. Love your channel.
@graealex 4 years ago
"That's my brain" WTF, put it back!
@RupertBruce 4 years ago
A 12-node band across the top, two on either side at the back for visual, two on the temples for easy signaling/button pressing. The top band is 3 side-by-side on each side. I wouldn't expect much from lower limbs, but I'd love to build a neural net with the sensors as input and Nvidia's gesture model from a camera pointed at you, so that it can build a body language model.
@LokiLeDev 4 years ago
Cool stuff! It is really hard to extract meaningful information from non-invasive probes like that. I've seen a conference talk where they said we can actually read about 1 bit/s from the brain! They tried machine learning to decode the signals, but it was actually easier to train the brain to produce clearer signals!
@jamesbruton 4 years ago
Interesting idea...
@jesseshakarji9241 4 years ago
It may be interesting to pair this with an artificial neural network (ANN) that you could train on your brain sensor data. This way, instead of looking for specific jumps of activity on certain channels, you could use an ANN to classify the data for you, so it would know if you're, say, moving a leg vs. an arm or other muscle groups. Seems like the sensors are pretty sensitive, so who knows how well this would work.
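[Editor's note] A step below a full ANN, the same idea can be sketched with a tiny logistic-regression classifier on synthetic band-power features. The feature values and class separation below are invented; real EEG features would be far noisier:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic feature vectors: [mu power over C3, mu power over C4].
# Class 0 = rest, class 1 = imagined movement (lower power on channel 0).
def make_data(n_per_class):
    rest = rng.normal([5.0, 5.0], 1.0, (n_per_class, 2))
    move = rng.normal([2.0, 5.0], 1.0, (n_per_class, 2))
    X = np.vstack([rest, move])
    y = np.array([0] * n_per_class + [1] * n_per_class)
    return X, y

X, y = make_data(200)
X = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize features
Xb = np.column_stack([X, np.ones(len(X))])    # append a bias column

# Minimal logistic regression trained by batch gradient descent.
w = np.zeros(Xb.shape[1])
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))         # predicted P(class 1)
    w -= 0.1 * Xb.T @ (p - y) / len(y)        # gradient of the log-loss

pred = (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)
accuracy = float((pred == y).mean())
print(f"training accuracy: {accuracy:.2f}")
```

A neural network would replace the single linear layer with something deeper, but for two well-separated band-power features a linear model is usually the right first baseline.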
@jonathanballoch 2 years ago
Awesome stuff!! Did you ever build the new adjustable headset?
@marc-antoinebelanger2841 3 years ago
@James Bruton: My father-in-law has just been diagnosed with ALS. I would like to build a dataset of his speech and EEG signals. Before buying the kit, I would like to know if it is possible to sacrifice an EEG input for a microphone. Or is it possible to expand the board to add a microphone?
@bgg4865 4 years ago
My first thought was that experimenting on yourself is not a good idea: you have to think about what you want to do, and you can't help but think about what you're seeing on the charts as you do the movement. I'd say get someone to give you commands, and don't look at the plot as you obey them. Get the assistant to write out the moves in a different order too, so you don't know what's coming. Record as you go and try to interpret later.
@HKallioGoblin 3 years ago
Secret services created a very advanced form of technological mindcontrol in 2008 that can understand all those thoughts. All readings can be animated to screen, so we know what those readings mean.
@kicktangerines8528 4 years ago
I was just doing a paper on this. SO AWESOME!
@MuhammadDaudkhanTV100 4 years ago
Great ideas and good work
@EngineeringSpareTime 4 years ago
Very interesting! Did you think about EMI on the sensor cables? They are very sensitive... bundling them might look better, but it might not be better in terms of data quality (analog output?)
@pawzubr 4 years ago
Read about the cocktail party problem: you have to use some kind of decoupling method, e.g. Independent Component Analysis
@wesleymays1931 3 years ago
I know! Let's use an FFT!
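[Editor's note] Taking the FFT suggestion at face value: a minimal sketch of what a spectrum buys you here, separating a simulated 10 Hz alpha rhythm from stronger 50 Hz mains pickup (synthetic data throughout):

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 250
t = np.arange(4 * fs) / fs
# Synthetic channel: 10 Hz alpha, stronger 50 Hz mains pickup, white noise.
x = (1.5 * np.sin(2 * np.pi * 10 * t)
     + 2.0 * np.sin(2 * np.pi * 50 * t)
     + rng.normal(0, 1, len(t)))

freqs = np.fft.rfftfreq(len(x), 1 / fs)
amp = 2 * np.abs(np.fft.rfft(x)) / len(x)   # per-frequency amplitude estimate

dominant = freqs[np.argmax(amp[1:]) + 1]    # skip the DC bin
alpha_amp = amp[freqs == 10.0][0]
print(f"dominant component: {dominant:.1f} Hz, alpha amplitude ~{alpha_amp:.1f}")
```

The FFT cleanly separates the components by frequency, but it cannot unmix multiple sources sharing the same band; that is where ICA-style decoupling, as suggested above, comes in.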
@seeigecannon 4 years ago
I bought an Emotiv a while back for a project that I never got around to. Unfortunately, I would not recommend them. The dongle is paired to the headset in such a way that if you lose the receiving dongle, the headset is effectively bricked. The lowest version also hides all of the outputs and does not have any kind of API for doing something with the data. There is a Python program somebody made to decrypt the stream, but they sent out a mass email about how the Python library is taking money from their pockets and they must raise the price of all headsets, because hobbyists want to actually do stuff with the headsets without paying $500. Note: I last interacted with the headset around 7 or so years ago, so this review may be very out of date. Also, with the full helmet, could you try different sensations like pain, hot, cold, and maybe something like sweet/sour/spicy, to see if anything interesting falls out?
@jamesbruton 4 years ago
Thanks for that, I'm intending to stick with this one for now.
@barrettdent405 4 years ago
Reminded of Doc Brown's rig in Back to the Future.
@excitedbox5705 4 years ago
You could try a tight-fitting bath cap or a swimming cap and mount probes all over it. You really just need wires connected to a metal tack; if you coat the tack in conductive gel you will get better readings.
@antonwinter630 4 years ago
what a time to be alive
@JMRCREATIONS 4 years ago
Nearly 1 million 👏👏🎶🎶🎇
@oliverer3 4 years ago
The fact that you specified that it wasn't your actual brain made me laugh. Also, side note: can we consider this a thinking hat?
@pvic6959 4 years ago
lol i didnt need to be told. i would assume his real brain is 10x that size lolol
@wolf1066 2 years ago
This was very interesting, since I was wondering about OpenBCI and how well it would work to control a robot arm. It looks like a lot of fiddling about would be required before I could get it to read brainwaves to the degree of precision I would need.
@Shinika01 4 years ago
Sooooo EPIC!!!!! Please go further with this toy!!!! Show us what could be done!
@jamesbruton 4 years ago
I will try my best!
@badWithComputer 4 years ago
I'd love to be able to show 1989 me this video
@andy-in-indy 4 years ago
You don't need to reposition the probes if you use the interaction of several probes to "triangulate" the point of activity. That will be mathematically complex, since the signal is not a point source and it is surrounded by other signal sources. An interesting possibility is to train a neural net to read your brain pattern instead of trying to calculate all the equations simultaneously. I expect the training would involve something like a camera detecting the joint positions and angles of your movement, compared against the neural net's predictions of what the joint positions and rotations should be. Eventually, you should get a close correlation between the actual and predicted. That information could then be fed into a robot arm or something like the performance robots you built. Once there is a correlation between actual movement and the output, you would need to begin training the neural net to detect visualization instead of motor cortex activity. I expect the training would be to visualize a movement routine and train the neural net to output something that matches it. The routine would have to be changed up frequently to prevent the neural net from learning to just output the routine instead of trying to match your visualization. Anyway, if I had more time and money, that's how I would approach it. That may be a bit too long and boring a process for YouTube videos, though.
@sunboy4224 4 years ago
The problem with this is that EEG doesn't really have hand kinematics information in it (or if it does, it's INCREDIBLY hard to get to). Chances are, if a scheme like this works, it will be because the neural network is recognizing EMG artifacts as features.
@mhnoni 2 years ago
Not trying to be religious here, but someone made that brain at 15:04. What engineering! I really have no idea how some people think the human/universe is just a matter of randomness. I wonder if the signals that go from our brain to a target like the fingers are coded or analog. I mean, does each finger have a single nerve connected to our brain like a copper wire, or do multiple fingers get their signal from a single nerve? If there is a nerve that controls more than one system, then the human brain is more complex than I thought.
@nasim3269
@nasim3269 4 years ago
I saw that the electrodes that spiked for no reason had a lower amplitude than the electrodes that made sense. I think if you do further filtering you'll get much better resolution on the motor command signals.
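Filtering like the comment suggests usually means isolating a band of interest. Here is a toy sketch on synthetic data: a 10 Hz "mu-rhythm" buried in broadband noise, recovered with a crude FFT band-pass. The sample rate matches the Cyton's 250 Hz, but the signal, noise level, and band edges are all made up for illustration.

```python
import numpy as np

# Toy sketch: crude FFT band-pass (8-13 Hz) to suppress out-of-band noise
# before looking for motor-related rhythms. All signals are synthetic.
fs = 250                                  # Cyton sample rate, Hz
t = np.arange(fs * 4) / fs                # 4 seconds of samples
rng = np.random.default_rng(1)

signal = 2.0 * np.sin(2 * np.pi * 10 * t)   # pretend 10 Hz mu rhythm
raw = signal + 5.0 * rng.normal(size=t.size)  # buried in broadband noise

def bandpass_fft(x, fs, lo, hi):
    """Zero out FFT bins outside [lo, hi] Hz and transform back."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0
    return np.fft.irfft(spec, n=x.size)

filtered = bandpass_fft(raw, fs, 8, 13)

# Correlation with the clean rhythm improves once out-of-band noise is gone
before = abs(np.corrcoef(raw, signal)[0, 1])
after = abs(np.corrcoef(filtered, signal)[0, 1])
print(before < after)
```

In practice a proper IIR/FIR filter (and a notch at mains frequency) would replace this brick-wall FFT approach, but the principle is the same.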
@CyberSyntek
@CyberSyntek 4 years ago
As much as I love the fact that you have this up and running, and believe that if anyone can pump out great DIY results on this project it's you, James... my god, man, you are jumping from project to project on a weekly basis! XD Perhaps work on one thing at a time and really make solid gains on it for a little longer. I understand it's fun to experiment and play with new concepts and designs, but... you are James Bruton! You have the ability to really push these projects to the next level. Either way, keep doing you; I love what you are doing. I get that you need to keep fresh content coming to bring in more supporters. We need a James clone at some point to allow for project focus time. XD
@vincentpaniccia109
@vincentpaniccia109 4 years ago
2030: Here’s the upload your consciousness DIY kit.
@wesleymays1931
@wesleymays1931 3 years ago
I sure f**king hope so, getting tired of this stupid human body
@ryansummer1589
@ryansummer1589 4 years ago
This is pretty awesome!
@motbus3
@motbus3 4 years ago
Hey James. I think the signal is just too complex and has amplitudes too low to be visually curated. You'll probably have to record some length of the movements and build some models to understand it. I know your stuff is robotics, but you may find some pretrained deep learning models to validate the idea. I think some Transformer models could be built to predict the labels of the movements, but it would require too much data to get working from scratch. If you think this idea is cool, just give a shout-out in the next video and I'd be really happy :)
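Before reaching for Transformers, the record-and-label idea above can be tried with something far simpler. This toy sketch (all data synthetic, labels and band layout made up) reduces each recorded window to coarse band-power features and classifies movement labels with a nearest-centroid rule:

```python
import numpy as np

# Toy sketch, all synthetic: label recorded EEG windows, reduce each
# window to band-power features, classify with nearest centroid.
rng = np.random.default_rng(2)

def band_power_features(window):
    """Crude per-window feature vector: log power in 4 coarse FFT bands."""
    spec = np.abs(np.fft.rfft(window)) ** 2
    bands = np.array_split(spec, 4)
    return np.log([b.sum() for b in bands])

def make_window(label, n=250, fs=250):
    """Synthetic stand-in: each movement label boosts a different band."""
    x = rng.normal(size=n)
    x += 3 * np.sin(2 * np.pi * (5 + 60 * label) * np.arange(n) / fs)
    return x

# "Record" 40 labelled windows per class (e.g. 0 = rest, 1 = hand movement)
labels = (0, 1) * 40
X = np.array([band_power_features(make_window(l)) for l in labels])
y = np.array(labels)

# Nearest-centroid classifier: one mean feature vector per class
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
accuracy = (pred == y).mean()
print(accuracy)
```

If a baseline like this can't separate the classes on real recordings, a bigger model trained on the same data probably won't either, which makes it a cheap sanity check before collecting the large dataset a Transformer would need.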
@AmaroqStarwind
@AmaroqStarwind 3 years ago
Brain-Computer Interfaces would probably be really effective when used in conjunction with conventional user inputs. Imagine a neural interface helmet for an LMP1 race car.
@stefanguiton
@stefanguiton 4 years ago
This would be amazing on your Exo suit!
@bloodypommelstudios7144
@bloodypommelstudios7144 3 years ago
The other thing about the pig walking is that it's a pretty predictable movement: if you know where in the walk cycle it is, you should be able to predict fairly accurately what each limb is doing.
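The point above can be shown numerically: in a steady gait, each limb's trajectory is just a phase-shifted copy of the others, so knowing the cycle phase pins down every limb. This is a toy model (sinusoidal limb trajectories and a trot-style phase layout are my assumptions, not anything measured from the pig):

```python
import numpy as np

# Toy model: in a steady trot, limb position is a function of gait phase,
# with diagonal limb pairs half a cycle apart.
phase = np.linspace(0, 2 * np.pi, 100, endpoint=False)
limb_offsets = {"LF": 0.0, "RH": 0.0, "RF": np.pi, "LH": np.pi}  # assumed
limbs = {name: np.sin(phase + off) for name, off in limb_offsets.items()}

# Predicting RF from LF is just a half-cycle shift (50 of 100 samples)
pred_RF = np.roll(limbs["LF"], 50)
err = np.max(np.abs(pred_RF - limbs["RF"]))
print(err)
```

So a decoder only has to estimate one number (phase) to "predict" all four limbs of a regular walk, which is worth keeping in mind when judging how impressive such demos are.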
@thorley1983
@thorley1983 4 years ago
Could you calibrate the input with a hand movement followed by a jaw movement, and use signal strength so that the position of the electrodes isn't as critical?
@Saraseeksthompson0211
@Saraseeksthompson0211 3 years ago
You are an absolute genius
@flloyd86
@flloyd86 4 years ago
I'm sure you've already thought of this, so: are you aiming to apply machine learning? I saw you've spoken about it while working on the Jetson. It would be great to translate the input from the cranial sensors into, dare I say, ROS messages? That would be cool, and I might be able to find a repository to help. :)
@Quasar_QSO
@Quasar_QSO 2 years ago
I sure hope this mind reading tech gets so much better soon. I want an omnidirectional wheelchair that I can control with my thoughts.
@tedhuntington7692
@tedhuntington7692 4 years ago
Fairly soon we may see a kind of menograph, a device that records thought-audio, as imagined by Hugo Gernsback and mentioned in his Ralph 124C 41+ in 1911 CE.
@johncaccioppo1142
@johncaccioppo1142 3 years ago
Have you figured out how to convert this into a synth controller yet? I think the Spitfire audio orchestra would overlay perfectly.
@negative258
@negative258 4 years ago
Maybe you can look into the MyoWare EMG sensor to activate the claws. EEG-activated tasks will be pretty interesting, though.
@RIchardBH3
@RIchardBH3 4 years ago
awesome product
@isbestlizard
@isbestlizard 3 years ago
I wonder if this could recreate those famous experiments where people make detectable intentions to do things slightly before they consciously attribute the decision point. That'd be cool to see!
@electronetiq
@electronetiq 3 years ago
Can I translate brain waves to text with this sensor? Please answer, thank you.
@thebagelbomb
@thebagelbomb 4 years ago
Imagine an Iron Man helmet opening and closing when you think it.
@at0mic282
@at0mic282 3 years ago
Can you test responses to sudden shocks or surprises? Maybe protective hardware would then be possible, to shield the user when they are suddenly put in a precarious situation.
@chrism9976
@chrism9976 2 years ago
I'd like to see a BCI connected to a keyboard. I'd like to know if one could communicate with others in the event of a paralyzing stroke.
@marks47
@marks47 4 years ago
So are the pros using AI learning on the patterns associated with the limbs so the prosthetics can perform tasks semi-automatically? Or would they be better off controlling it "manually" in real time so minute adjustments can be made?
@jamesbruton
@jamesbruton 4 years ago
I'd like to use deep learning to process the results, but I need some consistent results first!
@thanhvinhpham3586
@thanhvinhpham3586 3 years ago
Is there a way to prevent others from using BCI technology on your body?
@ahmedkamel821
@ahmedkamel821 4 years ago
The video is amazing as usual; I'm just surprised by how bad the 3D printing quality is from such a professional maker!
@jamesbruton
@jamesbruton 4 years ago
It could have been better, but each half took 12 hours as it was, so it wasn't on the highest quality settings.
@aamirbangash985
@aamirbangash985 2 years ago
Thanks a lot for the video. How can I use this runtime data to control a wheelchair using a Raspberry Pi as the microcontroller? In other words, is this board compatible with the Raspberry Pi? Regards
@ebybbob
@ebybbob 3 years ago
Hey man, looking down like that in a superhero fight is dangerous... even if it lets you charge up your war face... stay safe out there.
@outsider7654
@outsider7654 3 years ago
Combine this product with itrack 5 and an eye-tracker camera; use Xpadder to assign signals to the keyboard and eye tracker, and itrack 5 to the mouse. Then you can play without moving or using your hands. And don't even think about what could happen if you add a VR set.
@Vionbringer
@Vionbringer 4 years ago
This is really awesome and there are plenty of possibilities, but the main question I have is why is it so bulky?
@cosmicrider5898
@cosmicrider5898 4 years ago
When you realize he's been building these robots to upload into and become an android with gray hair.
@TheAstronomyDude
@TheAstronomyDude 4 years ago
This is so cool! Can I make my own using the arduino touch sensor library?
@jamesbruton
@jamesbruton 4 years ago
maybe?
@userou-ig1ze
@userou-ig1ze 4 years ago
Brilliant; I was just thinking last week that it would be great if you'd try the OpenBCI stuff. It's prohibitively expensive at around $1000 (the lowest you can go, AFAIK), but it's still better than the non-open-source alternatives, in the sense that it's actually open source. Emotiv even encrypts their electrode readings. They were originally open source, then the urge for money kicked in, and now you pay for every recording *facepalm*, i.e. to decrypt their own electrode readings.
@TheArashhak
@TheArashhak 3 years ago
Can you share the CAD files (STL?) for the custom headset, please? I'm interested in targeting just the hand movement for a custom assistive device.
@CNGboyevil
@CNGboyevil 4 years ago
I'm certainly no expert, but a couple of thoughts: maybe some brass mesh to shield from external signals, and maybe chain up a bunch of grounds and place them in the comfort pegs.
@toniwalter2911
@toniwalter2911 3 years ago
At 15:28, what does the diagram mean by "trunk"? Was it made for elephants, or am I not getting it right?
@BillyHardcase
@BillyHardcase 4 years ago
Well, there is definitely some activity there ;) Very interesting.
@4.0.4
@4.0.4 4 years ago
I wonder if you could train a neural network to take those signals and convert them into an estimated intention.
@rustycobalt5072
@rustycobalt5072 4 years ago
Build a Faraday cage around you to remove all noise, with the Bluetooth receiver inside as well.
@rcninjastudio
@rcninjastudio 4 years ago
Just one step closer to James becoming a cyborg 😂. Seriously though, I find things like this fascinating. We've been to the Moon and sent probes into deep space, but we still don't fully know how the brain works.
@OMAR-fq4qi
@OMAR-fq4qi 3 years ago
Link to buy all the electronic parts, please?
@doublea47
@doublea47 4 years ago
Taran is taking notes
@tylerwright6006
@tylerwright6006 3 years ago
This is like the James Hoffmann that never drank coffee and got into electronics instead.
@kestergascoyne6924
@kestergascoyne6924 4 years ago
Mind-blowing stuff!
@catalinalb1722
@catalinalb1722 3 years ago
Hello, where can I order the kit with sensors and boards?