Rasa Algorithm Whiteboard - Incremental Training

  7,801 views

Rasa

A day ago

Comments: 20
@parthmadan671 2 years ago
Exactly what I was looking for. Thanks!
@LNMLucasMasiero 11 months ago
Love this video, and love the analysis behind the execution you've put into making training more incremental and modular. Thinking further ahead about scalability: I've found that machine learning usually runs into a lot of centralization, or vertical and monolithic scaling, which is annoying... I hope that, as in microservice architectures, models will be able to learn incrementally by reusing previous knowledge. For example, a model that first learns the language itself, then builds on those pre-trained neurons to learn medical information, laws, etc. That way you reuse training and it scales. I wonder why I don't see any videos talking about this.
@ifranrahman 2 years ago
Hi. Is there a paper that describes the incremental learning approach you have demonstrated?
@growwitharosh5052 3 years ago
Hi, is it possible to do incremental learning with creme on a CNN for facial emotion recognition?
@RasaHQ 3 years ago
(Vincent here) The creme project recently merged with another project to become river; more info here: github.com/online-ml/river. While I know some of the creators of that package, I should stress that it's completely unrelated to Rasa. Two final comments: what use case exactly do you have for facial emotion recognition? There are ethical concerns in that realm, and I might run through this checklist before proceeding: deon.drivendata.org/. Finally, when you train on batches of data, you're technically learning on a stream too. Keras and PyTorch also natively support online learning in that sense.
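To make the "learning on a stream" idea concrete, here is a minimal, dependency-free sketch of online learning in the learn-one-example-at-a-time style that river's API popularized. Everything here (class and method names included) is illustrative, not river's actual implementation:

```python
import math

# A tiny online logistic regression: weights update one example at a time,
# so new data can be absorbed without retraining from scratch.
class OnlineLogReg:
    def __init__(self, lr=0.1):
        self.lr = lr
        self.weights = {}  # feature name -> weight

    def _score(self, x):
        z = sum(self.weights.get(f, 0.0) * v for f, v in x.items())
        return 1.0 / (1.0 + math.exp(-z))

    def learn_one(self, x, y):
        # one SGD step on the log-loss gradient for this single example
        err = self._score(x) - y
        for f, v in x.items():
            self.weights[f] = self.weights.get(f, 0.0) - self.lr * err * v

    def predict_one(self, x):
        return int(self._score(x) > 0.5)

model = OnlineLogReg()
# data arrives as a stream; the model never sees the full dataset at once
stream = [({"a": 1.0}, 1), ({"b": 1.0}, 0)] * 50
for x, y in stream:
    model.learn_one(x, y)

print(model.predict_one({"a": 1.0}), model.predict_one({"b": 1.0}))
```

Batch training is the same loop with the updates averaged over a mini-batch, which is why Vincent notes that batch learners are technically stream learners too.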
@interesting2906 3 years ago
Is there any chance I can speed up the process of training my model from scratch on my laptop? I'm not talking about adding new data to an existing dataset, but about speeding up the very first training of the model. Is there anything I can do without buying a new PC with a large amount of RAM, GPUs, and so on? Thank you in advance.
@RasaHQ 3 years ago
Maybe, but it depends on why your training is slow. In general, I've found that the most common cause of a Rasa model training really slowly is a large number of intents, so reducing that might help. -R
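Besides trimming intents, lowering the epoch count in the pipeline configuration is another common lever for faster first-time training. A sketch of what that might look like in a Rasa `config.yml` (the component names are real Rasa components, but the epoch value here is illustrative and trades accuracy for speed):

```yaml
pipeline:
  - name: WhitespaceTokenizer
  - name: CountVectorsFeaturizer
  - name: DIETClassifier
    epochs: 50   # fewer epochs than the default means faster training
```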
@atinsood 4 years ago
Vincent, in addition to the reduced number of epochs, wouldn't it make sense to oversample the A and B datasets and undersample the original dataset when re-training?
@RasaHQ 4 years ago
(Vincent here) You could experiment with it, but you need to remain careful. You want to prevent "catastrophic forgetting".
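One standard way to hedge against catastrophic forgetting when fine-tuning on new data is rehearsal: mix the new examples with a random sample of the old training set so the model keeps seeing the old classes. This is a generic sketch of that idea, not Rasa's implementation, and the helper name and sampling fraction are made up for illustration:

```python
import random

def build_finetune_set(old_data, new_data, replay_fraction=0.5, seed=0):
    """Mix new examples with a replayed sample of the old training set."""
    rng = random.Random(seed)
    n_replay = int(len(old_data) * replay_fraction)
    replayed = rng.sample(old_data, n_replay)  # random subset of old data
    mixed = replayed + list(new_data)
    rng.shuffle(mixed)                          # avoid ordering effects
    return mixed

old = [("old example %d" % i, "intent_a") for i in range(100)]
new = [("new example %d" % i, "intent_b") for i in range(10)]
batch = build_finetune_set(old, new)
print(len(batch))  # 60: 50 replayed old examples + 10 new ones
```

Tuning `replay_fraction` is exactly the oversample/undersample trade-off from the question above: too little replay risks forgetting, too much drowns out the new data.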
@atinsood 4 years ago
@@RasaHQ Any chance the notebook/code and dataset you ran the numbers on are available on GitHub? I could try the experiment in the same notebook and report back the numbers.
@RasaHQ 4 years ago
@@atinsood The command-line arguments are described here: blog.rasa.com/rasa-new-incremental-training/ and the dataset we use can be found here: github.com/RasaHQ/rasa-demo
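For reference, the incremental-training workflow from that blog post boils down to CLI flags along these lines (check `rasa train --help` in your Rasa version for the exact options; the fraction value here is just an example):

```shell
# Fine-tune the previously trained model instead of training from scratch
rasa train --finetune

# Optionally run only a fraction of the configured epochs while fine-tuning
rasa train --finetune --epoch-fraction 0.2
```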
@avinashpaul1665 4 years ago
Do you also keep extra empty classes in the output? For example, if A is trained with 3 classes and B is trained with 2 classes, the final model should output 5 classes.
@RasaHQ 4 years ago
(Vincent here) Nope, as the video mentions, you *must* keep the size of the weights the same. So the current approach will not work if you add new labels.
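The constraint is easy to see if you think of fine-tuning as restoring saved weight matrices into a freshly built model: every shape must match, and adding a label changes the output layer's shape. A toy illustration (not Rasa internals; the function and layer names are made up):

```python
# Fine-tuning restores old weights into the new model; it only works
# when every weight matrix keeps exactly the same shape.
def load_weights(model_shapes, checkpoint_shapes):
    for name, shape in model_shapes.items():
        if checkpoint_shapes.get(name) != shape:
            raise ValueError(
                "shape mismatch for %s: checkpoint %s vs model %s"
                % (name, checkpoint_shapes.get(name), shape)
            )
    return "loaded"

ckpt = {"hidden": (64, 64), "output": (64, 3)}  # trained with 3 labels

# same label set: shapes match, fine-tuning can proceed
print(load_weights({"hidden": (64, 64), "output": (64, 3)}, ckpt))

# two new labels added: the output layer is now (64, 5) and loading fails
try:
    load_weights({"hidden": (64, 64), "output": (64, 5)}, ckpt)
except ValueError as e:
    print("cannot fine-tune:", e)
```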
@bofenghuang7763 3 years ago
Does the model get overfitted if we repeat this process each time we get new training examples? Should we set a limit on the maximum number of re-trainings? Thanks in advance.
@RasaHQ 3 years ago
(Vincent here) This is hard to say upfront. It will depend a lot on your dataset. On the datasets that I've tried, it seems like 5-10 epochs of fine-tuning is enough.
@TechWithUmraz 3 years ago
Hey, can I have some sample code for a Keras incremental learning model for binary image classification (I mean loading a previous model and adding new weights and features to it)? Thanks :)
@RasaHQ 3 years ago
(Vincent here) The incremental training feature shown in the video is designed to work for Rasa pipelines and isn't made to support general Keras models.
@nishantkumar3997 3 years ago
Hi Vincent, can you also link the research paper for this method of incremental learning? #RASA
@RasaHQ 3 years ago
(Vincent here) We never wrote a paper about this feature. To my knowledge it's also not that novel: spaCy does something very similar, and the partial_fit mechanic in scikit-learn can also be used for this.
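The partial_fit mechanic mentioned above looks roughly like this: train on one batch, then update the same fitted model with later batches instead of refitting from scratch. The data here is a made-up toy example; `SGDClassifier.partial_fit` itself is real scikit-learn API:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

clf = SGDClassifier(random_state=0)

X1 = np.array([[0.0, 1.0], [0.1, 0.9], [1.0, 0.0], [0.9, 0.1]])
y1 = np.array([0, 0, 1, 1])

# the first partial_fit call must list every class the model will ever see
clf.partial_fit(X1, y1, classes=np.array([0, 1]))
for _ in range(30):           # a few more passes over the first batch
    clf.partial_fit(X1, y1)

# later, a new batch arrives: update the same model incrementally
X2 = np.array([[0.05, 0.95], [0.95, 0.05]])
y2 = np.array([0, 1])
clf.partial_fit(X2, y2)

print(clf.predict(np.array([[0.0, 1.0], [1.0, 0.0]])))
```

Note the key limitation matches the one discussed earlier in this thread: the `classes` argument fixes the label set up front, so new labels can't be added later.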
@nishantkumar3997 3 years ago
@@RasaHQ Hi Vincent, thank you for the information. I'll also check out spaCy. Again, love your videos Vincent; they are really good. Thank you.