I need your help to customize/modify the MediaPipe models to have better pose detection. Is there any tutorial?
@MohammadFaizanKhanJ 10 days ago
In machine learning, knowledge distillation or *model distillation* is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have more knowledge capacity than small models, this capacity might not be fully utilized. It can be just as computationally expensive to evaluate a model even if it utilizes little of its knowledge capacity. *Knowledge distillation transfers knowledge from a large model to a smaller one without loss of validity.* As smaller models are less expensive to evaluate, they can be deployed on less powerful hardware (such as a mobile device).
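In practice, distillation is typically done by training the student to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that loss (function names and the temperature value are illustrative, not from any particular library):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

teacher = np.array([[2.0, 0.5, -1.0]])
student = np.array([[0.0, 2.0, 1.0]])

# A student that exactly matches the teacher incurs zero loss;
# any mismatch produces a positive KL divergence.
print(distillation_loss(teacher, teacher))  # ~0.0
print(distillation_loss(student, teacher))  # > 0
```

During training this term is usually mixed with the ordinary cross-entropy loss on the hard labels.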
@genericmeme 8 months ago
I am obsessed with the ditty at the beginning of the video! The rest of the video is also good, I suppose.
@alexd-tl3mw 1 year ago
Hahahaha, the Model Maker is my IBDP ICT EE! I have used the old one for a lot of projects! Thanks!
@Villaainn 1 year ago
When I use the average word classifier in the Colab notebook and try to export it, it's not exporting the vocab.txt.
@viettruonghoang4879 1 year ago
Thanks for your contribution!
@nikhilbwadibhasme 1 year ago
Hope we can use all the features in JS.
@khanhleviet5416 1 year ago
Yes, you can use all the features in JS today.
@nikhilbwadibhasme 1 year ago
@khanhleviet5416 Thanks. I thought Objectron and a few other features were not supported on the web (the CodePen examples aren't working on an iPhone XR).
@yapayzekaokulum 1 year ago
How can I find the Colab file?
@dnksoftware5178 1 year ago
thank you!
@JL-tt1qw 1 year ago
Where can I download the slides?
@tomschuring 1 year ago
Is there still a C/C++ interface, or has it been dropped?
@khanhleviet5416 1 year ago
The C++ API is still there. You'll need to build with Bazel to integrate with it. All the other language wrappers (Python, JS, Java, etc.) are built on top of the C++ API.
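As a rough sketch of what building against the C++ API with Bazel looks like, the MediaPipe repo ships a hello_world desktop example that can be built from a cloned checkout. The target path and flags below follow the public docs but may differ across releases, so treat them as an illustration rather than a guaranteed recipe:

```shell
# From the root of a cloned MediaPipe checkout; CPU-only desktop build.
bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 \
    mediapipe/examples/desktop/hello_world:hello_world

# Run the built binary, sending log output to stderr.
GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/hello_world/hello_world
```

Your own C++ code would live in a Bazel package inside (or depending on) the MediaPipe workspace, with a BUILD target that lists the MediaPipe libraries it needs as deps.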