Person: Is the answer A or B? Quantum Computer: Yes.
@predraggrujic9734 жыл бұрын
And no. And neither kinda.
@badnietzsche4 жыл бұрын
Tyler the creator xDDD
@ElonMuskTheOne4 жыл бұрын
It is rather: CPU: yes. QPU: not sure (but if you really want to know, I will tell you the definite answer)
@gutefrage94252 жыл бұрын
The answer is 42
@LiamBall-j8g9 ай бұрын
this is very realistic.
@weekendresearcher5 жыл бұрын
2019: Google achieves Quantum Supremacy. 2029: Google can use a Quantum CNN to recognize Schrödinger's cat.
@gaurangbelekar49304 жыл бұрын
lol
@belakun12443 жыл бұрын
Turns out the cat is undead
@shivamsinghcuchd3 жыл бұрын
🤣
@gokuldas0276 жыл бұрын
This truly made me like Ant-Man.. "do you guys just put the word quantum in front of everything" 😂
@statikkkkk5 жыл бұрын
Okay time to learn quantum physics again. Hope I can still find my old textbook.
@aichabenlahrech6355 жыл бұрын
Yeah 😂😂😂
@eddyeffy2 жыл бұрын
I don't understand what you are saying, but I won't stop watching these videos. I believe that with consistency they will begin to make sense.
@ianprado14886 жыл бұрын
The dislikes are from people overexposed in Bitcoin
@antoniomartinez34295 жыл бұрын
@@galihindra5901 If we get to quantum computing, the cryptography that keeps Bitcoin secure will be broken, so its value will go to $0.
@andrewfowler58454 жыл бұрын
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMonster kill
@andrewkaufman12764 жыл бұрын
@@andrewfowler5845 lmfao couldn't have put it better XD
@deidara_85983 жыл бұрын
@@galihindra5901 The more advanced explanation is that Bitcoin wallets verify their identities using digital signatures generated with an elliptic curve. Since quantum computers can efficiently compute discrete logarithms, this signature protocol is effectively broken: anyone in possession of a sufficiently strong quantum computer could impersonate any Bitcoin wallet they wanted by cracking the private key and generating their own signatures.
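For anyone curious, here's a toy classical sketch of the discrete-log problem those signatures rest on (plain modular arithmetic rather than elliptic curves, with made-up tiny parameters). Brute-forcing it scales exponentially with key size; Shor's algorithm is what makes it cheap on a large fault-tolerant quantum computer.

```python
# Toy discrete logarithm: find x such that g**x % p == target.
# Easy to verify, infeasible to brute-force at cryptographic sizes.

def brute_force_dlog(g: int, target: int, p: int) -> int:
    """Exhaustive search for x with g^x = target (mod p)."""
    value = 1
    for x in range(p):
        if value == target:
            return x
        value = (value * g) % p
    raise ValueError("no discrete log found")

p, g = 101, 2               # tiny toy parameters; real curves use ~256-bit groups
secret = 57                 # the "private key"
public = pow(g, secret, p)  # the "public key"
print(brute_force_dlog(g, public, p))  # recovers 57, but only because p is tiny
```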
@2049bits5 жыл бұрын
Thank You! Finally a cohesive overview of the programming environment! Also, Yay Python!!
@даладновам4 жыл бұрын
Programming a quantum computer. New statement: if true then maybe...;
@doublevgreen2 жыл бұрын
Awesome video. Great to see videos that go past the usual "qubits can be both 1 and 0 at the same time".
@d3solace6 жыл бұрын
At 4:01 Dave states that the Controlled-NOT gate is shown containing an "ampersand symbol", but the diagram displayed contains an @ (at sign), not an ampersand (&).
@d3solace6 жыл бұрын
Examples from the Cirq docs contain @. No worries, just commenting this here for the curious.
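For example, a minimal Cirq circuit shows the convention: the control of the CNOT is drawn as "@" in the text diagram (output reproduced from a recent Cirq version, so the exact spacing may differ):

```python
import cirq

# Two-qubit circuit: Hadamard on qubit 0, then CNOT with qubit 0 as the control.
q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit([cirq.H(q0), cirq.CNOT(q0, q1)])
print(circuit)
# 0: ───H───@───
#           │
# 1: ───────X───
```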
@mattcall905 жыл бұрын
correct
@dreyplatt53285 жыл бұрын
I get the arroba and ampersand confused all the time.
@truedreams14 жыл бұрын
Just to further confuse you.
@louistech1125 жыл бұрын
This is some groundbreaking, bleeding-edge tech learning here.
@truedreams14 жыл бұрын
Quantum processors aren't binary, which means your code doesn't always have to be boiled down to binary logic; you can instead build the circuit logic directly in your Cirq code (with the limitations mentioned). So if you're building, for example, a rock/paper/scissors game, a regular binary processor would take many computation steps, but a quantum processor could do it in one processor stroke (or whatever they're called in the quantum world, quantum strokes?), as long as the logic fits into the circuit.
@nigeljohnson98205 жыл бұрын
How about explaining what the gates can do! There are so many new terms being used without first being defined; by the end you might as well be explaining in an obscure dialect of Eskimo.
@aichabenlahrech6355 жыл бұрын
After one or two years, the YouTube tutorials will be like: the final video in the course, how to print "Hello world" 😴
@ipkeez4 жыл бұрын
Geez
@kim157423 жыл бұрын
Basically like Haskell ^^
@stevenmael4 жыл бұрын
Just want to point out that, apparently, earlier this year someone figured out a way to take quantum noise down to zero using machine learning.
@GeorgWilde2 жыл бұрын
So what runs faster: a simulation of a quantum computer on a classical computer, or a quantum computer? I'm not optimistic.
@ifstatementifstatement27045 жыл бұрын
Oh I’m gonna pip install cirq as soon as I get back to the office.
@parimalarenga925 жыл бұрын
TensorFlow for artificial intelligence using Python... and now Google has released the Cirq Python framework for quantum computing...
@jairacarvalhojr5 жыл бұрын
Just saw the presentation by Mr. Bacon at the TQC-NIST conference. Awesome talk. Very grounded in the near-term outlook, but optimistic nonetheless. Congrats 👏👍
@MCWaffles2003-15 жыл бұрын
2:39 That's a 4-way Factorio balancer.
@RajvirSingh13133 жыл бұрын
I want some more quantum computing videos!!! Google make more videos explaining and practical uses of quantum computers please!
@BigyanChap3 жыл бұрын
What is the Hadamard gate? What is Controlled-NOT? Neither Cirq nor Qiskit explains these kinds of things before jumping straight into the circuit. Where can I find the documentation?
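In the meantime, one quick way to see what these gates actually do is to ask Cirq for their matrices (a minimal sketch; the printed values are rounded):

```python
import cirq
import numpy as np

# Hadamard (H): sends |0> to an equal superposition of |0> and |1>.
print(np.round(cirq.unitary(cirq.H).real, 3))
# [[ 0.707  0.707]
#  [ 0.707 -0.707]]

# Controlled-NOT (CNOT): flips the target qubit only when the control qubit is |1>.
print(cirq.unitary(cirq.CNOT).real.astype(int))
# [[1 0 0 0]
#  [0 1 0 0]
#  [0 0 0 1]
#  [0 0 1 0]]
```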
@t3hPoundcake Жыл бұрын
I'm starting to get deeper, but I'm still looking for an explanation (no matter how in-depth, scientific, or high-level) of how Google or IBM takes this Python syntax and interprets/compiles it to manipulate a physical qubit on a quantum chip. Is the Python code literally compiled and executed by the quantum machine natively, like when you compile a C++ program on a Windows machine? Or is this Python code just a tool for helping them physically configure a quantum processor? Does anyone have more resources on how we go from this code, or Qiskit code, to execution on a quantum processor?
@ankitdei15 жыл бұрын
Hi, any link to documentation for learning Cirq?
@jaimevarela58005 жыл бұрын
Yes, that sounds like a strong statement on the question of whether we can build a large-scale quantum computer.
@MuhammadNurdinnewspecies4 жыл бұрын
At the end of the video... My brain: doesn't catch anything.
@Czeckie3 жыл бұрын
what's the difference between cirq and qiskit? Should I learn cirq, if I already know some qiskit?
@jujharbansal5 жыл бұрын
Great video - well balanced introductory tutorial
@jujharsingh54615 жыл бұрын
Lol, we share the same name.
@iKnowOfficialYT4 жыл бұрын
Hello Google, instead of using a parallelogram grid for the quantum processing chips, can you instead use a triangular grid to allow more adjacent qubits to interact?
@iKnowOfficialYT4 жыл бұрын
Possibly arranged in an equilateral triangular manner, grouped into hexagons, which are the best 2D tiling shape.
@ming-yuanyu55976 жыл бұрын
3:50 Why 1993?
@dabacon6 жыл бұрын
It was the year I graduated high school ;)
@jonclement4 жыл бұрын
When a screen of text took a minute to load on a 14.4 kbps modem, you dared not download an image!
@pragueexpat51064 жыл бұрын
Is this hardware-independent?
@M.G.R...5 жыл бұрын
*It will become simpler in the future*
@nethacker915 жыл бұрын
Yeah, compare assembly, where you handle registers and memory directly, with something like C, or even Java, which runs on a JVM.
@kvenkat6650 Жыл бұрын
TensorFlow Quantum is not installing?
@fu22012 жыл бұрын
Hmmm the quantum circuit looks a lot like the pentatonic minor scale.....
@MrYingding3 жыл бұрын
I don't get it. With the Python GIL in place, Python doesn't seem like a good language for a quantum computer? Or is a quantum computer just supposed to be lightning fast, with no need for parallel tasks?
@alxjones2 жыл бұрын
Python is just the language used to build the circuits. In the end, the circuit is compiled to a quantum assembly language (QASM) to be run on real quantum hardware. The only things that rely on Python's performance are the construction of the circuit object, which is fairly simple even with hardware-constraint checking, and the running of the simulator, which is going to be slow on anything. Python is a popular language among scientists because it's easy to learn and use, so it seems like a natural choice for this application, even if it's not the one I personally would have chosen.
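A rough sketch of that flow: Python builds the circuit object, a classical simulator can run it locally, and the same object can be serialized for other toolchains (the Simulator calls are standard Cirq; to_qasm() is assumed from recent releases, so check your version's interop docs):

```python
import cirq

# Build the circuit object in Python: a Bell pair plus measurement.
q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit([
    cirq.H(q0),
    cirq.CNOT(q0, q1),
    cirq.measure(q0, q1, key='m'),
])

# Option 1: run on a classical simulator (fine for a handful of qubits).
result = cirq.Simulator().run(circuit, repetitions=100)
print(result.histogram(key='m'))  # roughly equal counts of 0 (|00>) and 3 (|11>)

# Option 2: serialize to OpenQASM for toolchains/hardware that consume it.
print(circuit.to_qasm())
```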
@alexandrumoraru42866 жыл бұрын
Are you going to provide support for Julia programming language?
@CraigGidney6 жыл бұрын
At this time, there are no plans to port Cirq into languages besides Python.
@pigizoid99243 жыл бұрын
Take this sentence seriously: I have made a quantum superposition in a game called Mindustry, and it has codeable processors in the game that can do calculations quite fast.
@circulartextАй бұрын
I'm sorry I'm so late to class. I thought AI was going to be the limit; this is the new limit. I was here 5 years ago for things like the draw-a-shape game my kids and I loved.
@obsidian99984 жыл бұрын
1:30 Zachtronics needs to make a game inspired by this. It's weird how some of their game designs remind me of this diagram.
@neelg70574 жыл бұрын
I was watching this video while running "pip install cirq" in another terminal...
@IvaikinInsights4 жыл бұрын
So you've watched YouTube in the terminal too?
@rajupatel61335 жыл бұрын
Good sir
@CirqueAlvis5 жыл бұрын
Hi Cirq!
@heyakash58504 жыл бұрын
Nice work...!
@fathiyul3 жыл бұрын
Hello, I'm new here. I don't quite understand how my PC could run the quantum computation program without having a quantum processor in it. Anyone? Also, since 2 years have passed, have we surpassed the NISQ era? Thanks in advance.
@alxjones2 жыл бұрын
No. As far as I know, 127 qubits is the current maximum, achieved by IBM in November 2021. The general rule is that 10 to 100 noisy qubits equate to 1 "perfect" qubit, so this means our best possible error-corrected computer could theoretically have somewhere around 1 to 13 effective qubits. Realistically, there's more to it than just the number of qubits, so I don't think we're really even capable of that. NISQ isn't particularly well defined, but to get an idea, Shor's algorithm requires on the order of 10^3 qubits, which, if error corrected, corresponds to 10^4 to 10^5 noisy qubits at best. This video (7:33) estimates it closer to 10^6, or 100,000 noisy qubits. IBM has a lofty goal of a 1,000-qubit processor by 2023 and 1,000,000 qubits by 2030, so even if they stay on track, it will be quite a few years before we're definitively out of NISQ.
@onkarkoli4586 жыл бұрын
Hey, what is the future of cybersecurity with quantum computers?
@user-ye4ox7hz5r4 жыл бұрын
I just wanted to print "hello world".
@doubtingdaniel83206 жыл бұрын
What will happen if someone hacks from a quantum computer?
@nickyn2865 жыл бұрын
Change the time-space continuum.
@gabrieljosereyesacosta70073 жыл бұрын
But can it run doom?
@Ms.Robot.4 жыл бұрын
THEM: Quantum computer capable of accomplishing a task in a second that would take a supercomputer years to accomplish ME: Prove it.
@eyitayoadebiyi45604 жыл бұрын
Who else watches this and acts like they understand what they’re saying
@mahogs95 жыл бұрын
I don't know a whole lot about programming or computers, so this idea might be completely stupid. Is it possible that quantum computers might not replace standard computers? Instead, could we combine quantum computers with blockchain to build a ledger faster than a traditional computer could, but have a traditional computer use that ledger?
@johnkelfy72564 жыл бұрын
We said the same thing about GPUs. Plus, I don't see people cooling their systems to near absolute zero just to watch ray-traced he*ntai.
@unlockwithjsr4 жыл бұрын
I just think everything is possible
@alxjones2 жыл бұрын
I don't know why blockchain had to enter this discussion, but generally, yes. We already know that there are things classical computers do better than quantum, cost and efficiency among them. Hybrid classical-quantum architectures seem to be the best of both worlds, though we are quite far from a consumer QPU. Until then, we'll likely be limited to cloud platforms like GCP for hybrid computing tasks.
@dom70155 жыл бұрын
But can you overclock it?🧐
@ethanlai10444 жыл бұрын
Idk, I did find some info that the fastest one runs at 125 MHz
@sahajrajmalla5 жыл бұрын
What are the skills or knowledge required to jump into this tutorial, brother?
@tombradford70353 жыл бұрын
Is he good or is he good? Cirq: yes and no.
@Zachstetson5 жыл бұрын
What do I have to do to learn and get a job with Google?
@eliewilhelm2 жыл бұрын
I think when he said "all the goodness of python" he meant "all the godness of python".
@linkcell6 жыл бұрын
need... node package... now...
@samtube43585 жыл бұрын
😅😅🖕
@primodernious5 жыл бұрын
So how many millions of years will it take to make a quantum computer laptop?
@muratahmetgenc69422 жыл бұрын
Don't worry, general AI would solve this problem overnight...
@DineshThakur-hv1jz3 жыл бұрын
2:57 "Welcome to the circus"
@jonclement4 жыл бұрын
"...procedure for turning a bunch of noisy qbits into a fewer number of much less noisy qbits.." Time to look up the different flavors of gates
@dishonfano75995 жыл бұрын
Cool...This is cool
@diceblue68174 жыл бұрын
7:06 No, and that's why videos like this always end up talking more about Python than just listing the gate operations on one hand and explaining them.
@Xinvoker6 жыл бұрын
10/10.
@eeeeeek Жыл бұрын
i lost it at qubits
@Thegreatpablopicaso6 жыл бұрын
I thought that I was going to get familiar with a new programming language, but all in vain.
@roberttoyonaga1715 жыл бұрын
Boom!
@2by34 жыл бұрын
Boom! Boom!
@billsandmarks2 жыл бұрын
Now I'll go watch the cat videos.
@ravisoni62625 жыл бұрын
So can I attach a quantum chip to my computer?
@soufianosse3334 жыл бұрын
Yeah, like any external GPU; it works over USB, Bluetooth, or WiFi.
@ajaykumarbharaj32836 жыл бұрын
How do you run the code on a real quantum computer?
@aze2165 жыл бұрын
I want that tee shirt.
@jajwarehouse16 жыл бұрын
Ah, quantum computers: the pursuit of adding more hardware to correct the errors generated by the hardware, before the hardware outputs random results depending on how it feels at the moment, which then have to be tallied by the billions to see which of the possible results came up most often, so that we may assume it is the most probable correct answer. I do not want to dismiss quantum computers altogether, but I have yet to see anyone show a single example of a problem that a quantum computer has solved that could easily be solved by a normal computer, let alone a problem that a normal computer could not possibly solve.
@alxjones2 жыл бұрын
This is the idea of quantum supremacy. In the most technical sense, this has been shown, but not in any useful sense. It's extremely reasonable to be skeptical of quantum hardware until someone has built a significant post-NISQ processor, but it has been mathematically proven that some quantum algorithms beat the best known classical ones, under the assumption that such hardware exists. The fact that the results are random is actually a non-issue in many cases. Many problems are hard to solve but easy to check, so we can just verify whether the solution is correct and re-run if not. For example, factoring is hard but multiplying is easy, searching an array is hard but resolving an index is easy, solving matrix equations is hard but multiplying matrices is easy, etc. Retry logic is constant-order complexity, so any quantum speedup is preserved (though possibly requiring larger problems to see real effects).
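A toy sketch of that verify-and-retry pattern, using factoring as the example. Everything here is classical and hypothetical: sample_factor_candidate() just stands in for the random output of a quantum subroutine (e.g. the order-finding step in Shor's algorithm).

```python
import random

def sample_factor_candidate(n: int) -> int:
    """Placeholder for a quantum sampler; here it's just a classical random guess."""
    return random.randrange(2, n)

def factor_with_retries(n: int, max_tries: int = 10_000) -> int:
    """Sample candidates and keep the first one that passes the cheap classical check."""
    for _ in range(max_tries):
        candidate = sample_factor_candidate(n)
        if n % candidate == 0:  # verification is easy even when finding is hard
            return candidate
        # Wrong answer: sample again. Retries only add a constant factor.
    raise RuntimeError("no factor found within the retry budget")

print(factor_with_retries(15))  # prints 3 or 5
```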
@cembayraktar20002 жыл бұрын
I don't think they could build this computer. I would do better if it were me.
@oorcinus6 жыл бұрын
Please tell me this isn’t the quantum annealing crap yet again.
@rhopsi-q6b5 жыл бұрын
I would NOT call quantum annealing "crap" :) But, no/yes, this is NOT for the annealers. At least not directly.
@noconomyzh12295 жыл бұрын
So in the far future, AI or network programmers need to learn quantum computing??? Wow... Intel is enough for me...