Thank you for making this package and sharing it with the community! It's awesome!⭐⭐⭐⭐⭐
@taraskywalker453 A year ago
omg, your videos aren't just simple to follow, they're incredibly impressive too! The sheer breadth of your coverage is so appreciated! Thank you for providing all of these resources
@doggodotjl A year ago
You're welcome! Thanks for watching!
@LasmaHan A year ago
Thanks as always. Lately I've been frustrated with Julia because of its slow debugging speed. Debugging heavily iterative code in particular drives me crazy. Could you cover this topic at some point?
@doggodotjl A year ago
Sorry for the frustration. I am planning on covering debugging at some point, but it probably won't be for a while. If you don't mind me asking, is the slow debugging in VS Code?
@LasmaHan A year ago
Oh, I hadn't realized that I never responded to you. As you guessed, I'm using VS Code as an IDE for Julia!
@TobeFreeman A year ago
Dear Doggo.jl, I love these presentations. I just wanted to point out that you are using a legacy pattern in Flux to set the optimisation parameters. The optimisers have moved out of Flux into the standalone Optimisers.jl package, and I wondered if you could update the repos to reflect this?
@doggodotjl A year ago
Thanks for letting me know. I didn't realize that Flux had changed. I'm happy to update my code on GitHub, but I need some help since I can't get the code to work. Here's what I did: I added OptimizationOptimisers in the package manager, added the lines "using OptimizationOptimisers" and "using OptimizationOptimisers.Optimisers: Adam" to my code, and removed the related Flux lines. But when I try to train the model, I get the following error message: "ERROR: Optimisers.jl cannot be used with Zygote.jl's implicit gradients, `Params` & `Grads`". If you'd prefer to submit a PR, I'd be happy to take a look at it. Thanks!
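For reference, here's my rough understanding of the explicit-gradient training pattern that newer Flux versions (backed by Optimisers.jl) expect. This is an untested sketch with a made-up toy model and random data, just to show the shape of the code, so corrections are welcome:

```julia
using Flux

# Toy model and data, made up purely to illustrate the pattern.
model = Chain(Dense(2 => 8, relu), Dense(8 => 1))
x = rand(Float32, 2, 100)   # 100 samples with 2 features each
y = rand(Float32, 1, 100)   # matching targets

# Flux.setup wraps an Optimisers.jl rule (Adam here) around the model,
# replacing the old implicit Flux.params(model) approach.
opt_state = Flux.setup(Adam(0.01), model)

for epoch in 1:10
    # Explicit gradients: differentiate with respect to the model itself,
    # not a global Params collection.
    loss, grads = Flux.withgradient(m -> Flux.mse(m(x), y), model)
    Flux.update!(opt_state, model, grads[1])
end
```

If that's roughly what the new API expects, I can update the repos accordingly.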