Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced
@interesting_vdos3 жыл бұрын
I have never seen any other video explaining the concepts of machine learning so clearly. Keep up the great work..!!
@sidduhedaginal4 жыл бұрын
Finally I got the perfect trainer for ML. Your skills are excellent, sir; we are very proud of you.
@codebasics4 жыл бұрын
Glad you liked it :)
@matinpathan51864 жыл бұрын
Yes, it is good. If you like his tutorials, tell your friends to subscribe to his channel and hit the like button; that's what we can do from our side.
@manusingh90074 жыл бұрын
On my first attempt, I took 'left' as the dependent variable and everything else, including salary and department, as independent variables, and got an accuracy score of 77%. Thanks for the wonderful video.
@codebasics4 жыл бұрын
Great job, Manu, that's a good score. The video description has a solution link; you can verify your code against mine.
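For anyone who wants to reproduce a result like this, here is a minimal sketch, assuming the HR dataset (HR_comma_sep.csv) and the column names that appear later in this thread; it is an illustration, not the official solution from the video description:

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("HR_comma_sep.csv")
# one-hot encode the categorical columns, keep 'left' as the target
X = pd.get_dummies(df.drop("left", axis="columns"), columns=["Department", "salary"])
y = df["left"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=10)
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # commenters here report roughly 0.77-0.80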
@riyamitra89012 жыл бұрын
For the first time, after so many courses, videos, whitepapers, GitHub repos, Kaggle notebooks, exercises, and wiki pages, I am genuinely enjoying machine learning, and I am doing all the coding and exercises by myself, obviously after learning and understanding it all. Thanks a lot!!!
@codebasics2 жыл бұрын
Glad you like them, Riya, and I wish you all the best! I have many playlists, and I recently left my job to focus on online teaching. My goal is to produce even better quality tutorials than this.
@riyamitra89012 жыл бұрын
@@codebasics I am trying to follow all of your videos to improve in my career. I am trying to get a job with clear concepts in hand.
@riyamitra89012 жыл бұрын
@@codebasics One question here. Why did we not remove one of the dummy variables created from the salary column in logistic regression, like we did for linear regression?
@09_samarpanbasu7 Жыл бұрын
@@riyamitra8901 I think it's because logistic regression can handle the multicollinearity between the dummy variables, so it's not strictly necessary to drop the last column.
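As a side note on the dummy-variable point: pandas can drop one level per categorical column with drop_first=True, which is the usual way to avoid the dummy variable trap. A small sketch with illustrative values only:

import pandas as pd

df = pd.DataFrame({"salary": ["low", "medium", "high", "low"]})

# keep all levels (what the exercise solution does)
all_levels = pd.get_dummies(df["salary"], prefix="sal")

# drop one level to avoid the dummy variable trap (as in the linear regression video)
fewer_levels = pd.get_dummies(df["salary"], prefix="sal", drop_first=True)

print(all_levels.columns.tolist())    # ['sal_high', 'sal_low', 'sal_medium']
print(fewer_levels.columns.tolist())  # ['sal_low', 'sal_medium']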
@ansh68483 жыл бұрын
This is the best course you can get for learning ML; the explanation is super awesome. Most books and courses show you complex-looking mathematical equations, but this guy made all of that easy for us.
@PollyMwangi-cp3jn9 ай бұрын
Actually, I fine-tuned my model and was able to achieve an accuracy of 1.0. Thank you so much, sir. This might just be the best channel I have seen.🥳
@kibs_neville9 ай бұрын
Hi, I have a slight problem. How can I plot the prediction curve after training my model? Would be glad if you reply. Thanks
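One way to do this (a rough sketch, assuming the age / bought_insurance data from the video; the CSV file name is a guess): score a range of ages with predict_proba and draw that curve over the scatter plot.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("insurance_data.csv")  # columns assumed: age, bought_insurance
model = LogisticRegression()
model.fit(df[["age"]], df["bought_insurance"])

ages = pd.DataFrame({"age": np.linspace(df["age"].min(), df["age"].max(), 200)})
prob_yes = model.predict_proba(ages)[:, 1]  # probability of class 1 at each age

plt.scatter(df["age"], df["bought_insurance"], color="red")  # the training points
plt.plot(ages["age"], prob_yes)                              # the S-shaped prediction curve
plt.xlabel("age")
plt.ylabel("P(bought_insurance = 1)")
plt.show()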
@anishagarwal716 ай бұрын
Could you please tell me what exactly you did to fine-tune it?
@bhawin1012836 жыл бұрын
Perfect explanation with proper examples. Great job.
I'm not afraid of learning things with complicated terms anymore! This teacher is the best at explaining.
@zerostudy75085 жыл бұрын
@@codebasics You are good at it. I thank you.
@MoreBalaji3 жыл бұрын
Perfectly balanced video. It makes you want to keep watching the other videos in this series. Very well explained in simple language. 👌
@Sarah-st7jp3 жыл бұрын
Sir, I know for sure that I can bank on your data science and Python videos whenever I need to gain an in-depth understanding. Your content gives me the hope and clarity that I needed. God bless you and your undying passion for making such useful content for us. Thank you so much for all your hard work, sir!!! :)
@codebasics4 жыл бұрын
Step by step roadmap to learn data science in 6 months: kzbin.info/www/bejne/fmW8lKSLgb5kY7M
How to learn coding for beginners | Learn coding for free: kzbin.info/www/bejne/eaHXo5-veZV_gJo
5 FREE data science projects for your resume with code: kzbin.info/www/bejne/b2aal4R5opqUetE
@hanselsamuel3 жыл бұрын
I really appreciate your tutorial videos, thank you. But how do I find the "bought_insurance" values?
@piyushjha88884 жыл бұрын
78 percent accuracy. I do all your exercises, but from this one I learned a lot. Thank you sir for such a great series @codebasics
@kalaipradeep2753 Жыл бұрын
Hi bro....
@kalaipradeep2753 Жыл бұрын
Now I am learning machine learning... What are you doing now, I mean studying or working?
@piyushjha8888 Жыл бұрын
I work in a bank as a software engineer. This channel is a gem, as it explains ML concepts in layman's terms. I was able to answer most of the ML-related questions because of codebasics and Andrew Ng's deep learning course.
@icyjump168410 ай бұрын
I could only get 77% with logistic regression, but then I used lazypredict to find a higher-accuracy model and used a decision tree classifier to get 98%. Thank you!
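For readers who want to try the same swap, a rough sketch of the decision tree variant (assuming the HR dataset and the same preprocessing as the logistic regression exercise; the exact score depends on your split and features):

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("HR_comma_sep.csv")
X = pd.get_dummies(df.drop("left", axis="columns"), columns=["Department", "salary"])
y = df["left"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=10)

tree = DecisionTreeClassifier(random_state=10)  # same API as LogisticRegression: fit / predict / score
tree.fit(X_train, y_train)
print(tree.score(X_test, y_test))  # tree models reportedly score much higher than ~0.77 here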
@ObianujuFlorenceOhawuchi6 ай бұрын
You make people feel so welcome in the data field with your teaching skills. You are always the best.
@sunnyveerpratapsingh11025 жыл бұрын
Bro, you are the best. I tried wading through other online videos, and then I ended up watching your videos and I understand better.
@codebasics5 жыл бұрын
Sunny Singh, I am happy this was helpful to you
@rutvikkumbhar6736 Жыл бұрын
Can you give an update on how your data science journey is going? I am aspiring to be a data scientist.
@mageshs57264 жыл бұрын
I got an accuracy of 78.833%. From this exercise I learned a lot of things, thank you bro.
@pranavnigam114 жыл бұрын
I also got around ~78.5% using the average monthly hours and satisfaction level. Did you use the same features??
@codebasics4 жыл бұрын
That’s the way to go Magesh, good job working on that exercise
@nilupulperera4 жыл бұрын
Dear Sir, what a beautiful dataset you have provided for practice with this video. I spent more than two days playing with it, and doing so opened another dimension of the learning curve. Thank you very much for providing relevant exercises like this as a challenge!
@codebasics4 жыл бұрын
Happy that this is helping you Nilupul.
@siddhantkaushik46063 жыл бұрын
amazing. astounding. bewildering. breathtaking. extraordinary. impressive. marvelous. miraculous. even all these adjectives are less to tell the quality of the video. Thanks a million.
@codebasics3 жыл бұрын
ha ha .. nice. you made my day with this shower of praise Siddhant. Thank you for your kind words :)
@adityabikramarandhara94772 жыл бұрын
Thanks a lot for the lucid explanation. In the exercise, I got an accuracy of 77.2% in my model prediction.
@kalaipradeep2753 Жыл бұрын
Hi bro... Now I am learning machine learning... What are you doing now, I mean studying or working?
@nguyentuanhung3858 Жыл бұрын
Started learning machine learning from your YouTube channel. Absolute masterclass; you are my real teacher, sir!!!
@pamp36572 жыл бұрын
One of the few videos that clearly shows the training data that the model is attempting to fit to. Thank you.
@michaelpeel37162 жыл бұрын
Many thanks. This is the first explanation that provides context and examples, making it so simple to understand.
@codebasics2 жыл бұрын
Glad you liked it Michael
@engihabit2 жыл бұрын
@@codebasics 15:35 I can't execute it. model.predict(57), and any other number like 25 or 60, gives the following ValueError: Expected 2D array, got 1D array instead: array=[57].
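That error comes from scikit-learn expecting a 2D input (rows = samples, columns = features), so the value has to be wrapped in a nested list. A tiny sketch with made-up ages, not the video's data:

import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({"age": [22, 25, 47, 52, 46, 56],
                   "bought_insurance": [0, 0, 1, 0, 1, 1]})  # toy values
model = LogisticRegression().fit(df[["age"]], df["bought_insurance"])

# model.predict(57)               # ValueError: Expected 2D array, got 1D array instead
print(model.predict([[57]]))        # OK: one row with one feature
print(model.predict([[25], [60]]))  # several ages at once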
@shivangitomar55574 жыл бұрын
You are the best teacher! I love the exercises at the end of each topic, which strengthens our understanding of what we learnt!!! Thank you so much! :)
@codebasics4 жыл бұрын
I am glad it was helpful. :)
@businnovate5 ай бұрын
Why did you only show the scatter plot and not plot the sigmoid curve as well?
@khalidhasan17932 жыл бұрын
I paused the video to comment: it's an excellent series to begin ML with.
@shreyasb.s38192 жыл бұрын
I have never seen anyone explain it as simply as this. Others make it complicated with the maths intuition. Thanks, codebasics.
@pratikghute23432 жыл бұрын
I have no other words to say; the comments from others have already conveyed my message to you! Lots of love, and thank you!
@mario1ua Жыл бұрын
Great explanation, I've understood everything, thanks!
@codebasics Жыл бұрын
Glad you found it helpful!
@GeorgeTrialonis5 жыл бұрын
Thank you very much for the videos on ML, AI, Python, etc. They help me learn a lot. Your explanations are clear and well understood. Thanks.
@codebasics5 жыл бұрын
George I am glad 😊
@MohamedAshraf-zs6nv4 жыл бұрын
At 14:00, why did you use model.score(x_test, y_test)? Don't you have to replace x_test with the prediction from the model, like this: prediction = model.predict(x_test); model.score(prediction, y_test)? Then you can compare the prediction with the real values and get the actual score.
@codebasics4 жыл бұрын
Score function will internally calculate y predicted from x test. After that it will compare y predicted with y test.
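In other words, the two routes give the same number for a classifier; a quick self-contained check on synthetic data, just to illustrate the API:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=200, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

acc_via_score = model.score(X_test, y_test)                      # predicts internally, then compares
acc_via_predict = accuracy_score(y_test, model.predict(X_test))  # the same two steps done by hand
print(acc_via_score == acc_via_predict)  # True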
@suleimanachimugu90482 жыл бұрын
0.84 accuracy score and 0.83 mean cross validation score. Thanks, Your tutorials have been helpful.
@antimuggle_ridhi2565 Жыл бұрын
how?
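For anyone wondering how a cross-validation score like that is computed, one common way is cross_val_score; a sketch, assuming the HR dataset and the exercise's preprocessing:

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("HR_comma_sep.csv")
X = pd.get_dummies(df.drop("left", axis="columns"), columns=["Department", "salary"])
y = df["left"]

scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)  # 5-fold accuracy scores
print(scores, scores.mean())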
@SandeepYadav-pm8yc5 жыл бұрын
Finally got the Python version of Andrew Ng's machine learning course, with a better explanation. Thanks.
@codebasics5 жыл бұрын
😊👍
@zestful144 жыл бұрын
This video is fantastic. I'm teaching myself machine learning and this was one of the most helpful resources I've found online. Excited to watch/work-through the rest of the videos! Thank you so much
Solution link for the exercise: github.com/codebasics/py/blob/master/ML/7_logistic_reg/Exercise/7_logistic_regression_exercise.ipynb
Step by step guide on how to learn data science for free: kzbin.info/www/bejne/jJ_CnqCFqraeiaM
Machine learning tutorials with exercises: kzbin.info/www/bejne/nZ7Zp5Sll9Jqm7M
@shreyaasthana7313 Жыл бұрын
Sir, I tried this method; it is very easy to understand and use. Thank you, sir.
@guneyfatma Жыл бұрын
Thanks a lot for this amazing content. I have just discovered your videos!
@abhishekdobliyal71784 жыл бұрын
Thank You Sir, I have learned a lot from your vids :). I was really perplexed by Logistic Regression and I am glad KZbin recommended this to me :)
@tariqahassan56927 ай бұрын
It is one of the most fantastic videos about logistic regression. Many thanks.
@Malayalam_learnerАй бұрын
Two videos a day from your ML playlist complete my day ❤
@mehmetkaya43306 жыл бұрын
Thank you again! Great explanation! Always great tutorials!
@shuaibalghazali3405 Жыл бұрын
Thanks a lot for this; I was finally able to implement logistic regression after so many tutorials.
@vinitasharma50252 жыл бұрын
very useful video.... you explain everything in a very simple manner. Thank you
@codebasics2 жыл бұрын
Glad it was helpful!
@flyingsalmon2 жыл бұрын
I love your tutorials. They're perfectly paced, with right amount of context and explanation, great examples, and patient but efficient delivery. I hope you continue to produce more. Subscribed here and also Liked all of the videos I've found so far from you. Best.
@codebasics2 жыл бұрын
👍🤗
@leooel46506 жыл бұрын
Awesome as always, thanks for everything! I got 77% model accuracy based on satisfaction_level.
@jsbean84154 жыл бұрын
How did you get the model prediction accuracy from the dependent variable? And does the 77% mean the probability that they will leave the company?
@nxbil23974 жыл бұрын
@@jsbean8415 model.score()
@jsbean84154 жыл бұрын
@@nxbil2397 That will show you the overall accuracy of your model. My question is, how will you get the probability (%) that an employee will leave, given the input variables, like the 'satisfaction level' you mentioned?
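That per-employee probability is what predict_proba returns (predict only gives the 0/1 class). A tiny sketch with made-up satisfaction levels:

import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({"satisfaction_level": [0.1, 0.2, 0.3, 0.7, 0.8, 0.9],
                   "left":               [1,   1,   1,   0,   0,   0]})  # toy data
model = LogisticRegression().fit(df[["satisfaction_level"]], df["left"])

print(model.predict([[0.25]]))        # the 0/1 class for a satisfaction level of 0.25
print(model.predict_proba([[0.25]]))  # columns are [P(left = 0), P(left = 1)]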
@asamadawais3 жыл бұрын
Dhaval, you are excellent at explaining difficult concepts in very simple language!
@codebasics3 жыл бұрын
I am happy this was helpful to you.
@sundayagu20783 жыл бұрын
God bless you, and may He provide angels to solve all your problems. Thank you
@userhandle-u7b6 ай бұрын
Thank you so much, sir. I got a score of 0.797 on the exercise. 🙂
@fahadreda30606 жыл бұрын
Another Great Tutorial, Thank you sir, Waiting for the next tutorial, keep up the good work
@boughrood Жыл бұрын
Very interesting and useful - well presented too
@mapa5000 Жыл бұрын
Thank you very much ! Your videos are always my best choice to learn ML
@flyingsalmon2 жыл бұрын
I have an interesting problem for you that I think you'll really enjoy covering in a video. We would like to know your approach to solving the following: we have a soccer (European football) game with penalty kicks. We have data on who the shooter/kicker is, who the goalie is, and which shots the goalie has historically saved or not saved, as follows (assume the Shooter and Goalie letters are player names and Saved is 0/1, not saved/saved):
Shooter Goalie Saved
A Z 1
B Z 0
A Z 0
B Y 0
A Y 1
.... and so on for various rows.
Challenge: what we want to know is, if shooter B faces goalie Z, will the goal be saved or not? We want to leverage machine learning (not just a probability calculation, which can be done manually from the data). How should we solve this? Many thanks in advance! If you make a video on this, I'll even donate :) Promise!
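One possible starting point (a sketch only, using the toy rows from the comment above): treat it as binary classification with two categorical features, one-hot encode shooter and goalie, and fit any classifier. With this little data the estimate will be weak; many rows per shooter/goalie pairing are needed.

import pandas as pd
from sklearn.linear_model import LogisticRegression

data = pd.DataFrame({
    "Shooter": ["A", "B", "A", "B", "A"],
    "Goalie":  ["Z", "Z", "Z", "Y", "Y"],
    "Saved":   [1, 0, 0, 0, 1],
})
X = pd.get_dummies(data[["Shooter", "Goalie"]])  # Shooter_A, Shooter_B, Goalie_Y, Goalie_Z
y = data["Saved"]
model = LogisticRegression().fit(X, y)

# will goalie Z save a penalty taken by shooter B?
new_kick = pd.get_dummies(pd.DataFrame({"Shooter": ["B"], "Goalie": ["Z"]}))
new_kick = new_kick.reindex(columns=X.columns, fill_value=0)  # align columns with the training data
print(model.predict(new_kick), model.predict_proba(new_kick))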
@0xN1nja2 жыл бұрын
one of the best explanation I've ever seen
@devanshgoel90703 жыл бұрын
Thank you sir for this amazing explanation of Logistic Regression.
@codebasics3 жыл бұрын
Glad you liked it
@mponcardas945 жыл бұрын
I love your series of videos as you are concerned with the student's learning! Thanks!
@harshvardhanpardeshi3584 Жыл бұрын
For anyone facing a problem with the df.groupby().mean() call, just drop the non-numeric columns first. It'll work :)
@JunaidAnsari-my2cx5 ай бұрын
Thank you so much, it worked.
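A small sketch of both ways around it (assuming the HR dataset; numeric_only=True needs a reasonably recent pandas):

import pandas as pd

df = pd.read_csv("HR_comma_sep.csv")

# either drop the text columns first, as suggested above...
print(df.drop(["Department", "salary"], axis="columns").groupby("left").mean())

# ...or ask pandas to ignore the non-numeric columns
print(df.groupby("left").mean(numeric_only=True))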
@jugabrat6797 Жыл бұрын
Your way of teaching is very good. Thanks for the video ❤❤❤
@pablu_74 жыл бұрын
Reasons behind employee retention are: 1. satisfaction level, 2. average monthly hours, 3. promotion in the last 5 years.
@nxbil23974 жыл бұрын
Noted that
@MrPrudhvisai4 жыл бұрын
why not work accident?
@arshdeepsingh82894 жыл бұрын
@@MrPrudhvisai work accident should be there
@VyNguyen-xy3il Жыл бұрын
Sir, I extremely appreciate your videos and efforts in teaching these things. Very helpful and great explanation!!
@codinghighlightswithsadra7343 Жыл бұрын
Thanks a bunch, Subscribed here and also Liked all of the videos I've found so far from you. Best.
@bestineouya57164 жыл бұрын
Actually you are the best explainer
@RiefvanAchmadMasrury5 жыл бұрын
Really nice and clear explanation, will be very useful for my students
@codebasics5 жыл бұрын
Sure riefvan, feel free to use this content in your classroom. May I know which country, university and school are you from?
@RiefvanAchmadMasrury5 жыл бұрын
@@codebasics It's in Indonesia, Telkom University :)
@anuk98463 жыл бұрын
Thank you for the video. As I am a teacher, it helped me a lot.
@codebasics3 жыл бұрын
👍🙏🙏
@noorameera263 жыл бұрын
This video is really really good. Love the way you teach, your pacing and all the things you mentioned are really useful. Thank u and may god bless u!
@codebasics3 жыл бұрын
Glad it was helpful!
@study_with_thor3 жыл бұрын
Perfect explanation!
@codebasics3 жыл бұрын
Glad it was helpful!
@namitakala3935 ай бұрын
Thank you for making these videos; your content is great.
@1bitmultiverse3 жыл бұрын
I got 80% accuracy in my model for the HR analytics exercise! I used satisfaction_level, number_project, average_montly_hours, promotion_last_5years, and salary as my independent variables.
@codebasics3 жыл бұрын
Good job Banzer, that’s a pretty good score. good job working on that exercise
@1bitmultiverse3 жыл бұрын
@@codebasics Thank you soo much I am working on it and following all of your playlists ♥😍 Thanks for this playlist ♥
@mridulahmed71865 жыл бұрын
Wow, so nice. Thanks for explaining it in such a nice way.
@codebasics5 жыл бұрын
Mridul I am happy it helped you 😊
@mridulahmed71865 жыл бұрын
@@codebasics Please make more and more tutorials of this kind. Your teaching methods are very nice. All good wishes to you. ☺️☺️
@codebasics5 жыл бұрын
Sure mridul, I am trying my best and I am actively working on adding many more tutorials 👍
@prakritreeeeeee2 жыл бұрын
Thank you so much for the graphical explanation...the concepts are crystal clear in my mind now.
@phaniauce4 жыл бұрын
Awesome explanation. I like this practical math and algorithmic explanation.
@bhavyanaik744 жыл бұрын
Thank you so much....very good explanation
@codebasics3 жыл бұрын
Glad it was helpful!
@abdulmuzakir4 жыл бұрын
Thanks, sir .. your explanation is really clear and so easy to understand 👍🏼
@SohelRana-eq4ib3 жыл бұрын
You are the best teacher
@MultiSpiros1235 жыл бұрын
Thanks, one of the best tutorials!
@codebasics5 жыл бұрын
Glad you liked it
@PolkadOfficial Жыл бұрын
Very well explained, thanks!
@codebasics Жыл бұрын
Glad you enjoyed it.
@sameer-verma3 жыл бұрын
I tried three different sets of features, and the accuracy scores are: 0.799 (keeping all features), 0.763 (keeping five features selected by chi-square) and 0.785 (keeping variables based on a correlation filter). Below is my entire script:

import pandas as pd
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SelectKBest
from sklearn.feature_selection import chi2
from matplotlib import pyplot as plt
import seaborn as sns
%matplotlib inline

pd.set_option('display.max_columns', 15)
hrd = pd.read_csv("D:/SelfLearning/python/data/HR_comma_sep.csv")
hrd.head()
hrd.describe()

hrd.pivot_table(values='salary', index=['satisfaction_level'], columns=['left'], aggfunc="count")
hrd.pivot_table(index='time_spend_company', columns='left', values='salary', aggfunc="count")
hrd.groupby('number_project').count()[['left']]
hrd.groupby('time_spend_company').count()[['left']]
hrd.groupby('salary').count()[['left']]
hrd.groupby('Department').count()[['left']]

# bin the continuous columns
sts_conditions = [(hrd['satisfaction_level'] <= 0.2),
                  (hrd['satisfaction_level'] > 0.2) & (hrd['satisfaction_level'] <= 0.4),
                  (hrd['satisfaction_level'] > 0.4) & (hrd['satisfaction_level'] <= 0.6),
                  (hrd['satisfaction_level'] > 0.6) & (hrd['satisfaction_level'] <= 0.8),
                  (hrd['satisfaction_level'] > 0.8)]
sts_values = [0.2, 0.4, 0.6, 0.8, 1]
last_evl_conditions = [(hrd['last_evaluation'] <= 0.2),
                       (hrd['last_evaluation'] > 0.2) & (hrd['last_evaluation'] <= 0.4),
                       (hrd['last_evaluation'] > 0.4) & (hrd['last_evaluation'] <= 0.6),
                       (hrd['last_evaluation'] > 0.6) & (hrd['last_evaluation'] <= 0.8),
                       (hrd['last_evaluation'] > 0.8)]
last_evl_values = [0.2, 0.4, 0.6, 0.8, 1]
hrs_condition = [(hrd['average_montly_hours'] <= 100),
                 (hrd['average_montly_hours'] > 100) & (hrd['average_montly_hours'] <= 200),
                 (hrd['average_montly_hours'] > 200) & (hrd['average_montly_hours'] <= 300),
                 (hrd['average_montly_hours'] > 300)]
hrs_values = [100, 200, 300, 400]
tm_spent_conditions = [(hrd['time_spend_company'] <= 2),
                       (hrd['time_spend_company'] > 2) & (hrd['time_spend_company'] <= 4),
                       (hrd['time_spend_company'] > 4) & (hrd['time_spend_company'] <= 6),
                       (hrd['time_spend_company'] > 6) & (hrd['time_spend_company'] <= 8),
                       (hrd['time_spend_company'] > 8)]
tm_spent_values = [2, 4, 6, 8, 10]
no_prjct_conditions = [(hrd['number_project'] <= 2),
                       (hrd['number_project'] > 2) & (hrd['number_project'] <= 4),
                       (hrd['number_project'] > 4) & (hrd['number_project'] <= 6),
                       (hrd['number_project'] > 6) & (hrd['number_project'] <= 8),
                       (hrd['number_project'] > 8)]
no_prjct_values = [2, 4, 6, 8, 10]

hrd['satisfaction_lvl'] = np.select(sts_conditions, sts_values, default=0)
hrd['last_eval'] = np.select(last_evl_conditions, last_evl_values, default=0)
hrd['avg_monthly_hrs'] = np.select(hrs_condition, hrs_values, default=0)
hrd['tm_spent_company'] = np.select(tm_spent_conditions, tm_spent_values, default=0)
hrd['no_projects_done'] = np.select(no_prjct_conditions, no_prjct_values, default=0)
hrd['departments_group'] = ["sales" if i == "sales" else "technical" if i == "technical" else "others" for i in hrd.Department]

hrd.pivot_table(index='last_eval', columns='left', values='salary', aggfunc="count")
hrd.pivot_table(index='avg_monthly_hrs', columns='left', values='salary', aggfunc="count")
hrd.pivot_table(index='tm_spent_company', columns='left', values='salary', aggfunc="count")
hrd.groupby('departments_group').count()[['left']]

sal_dummies = pd.get_dummies(hrd['salary'], prefix='sal')
dept_grps = pd.get_dummies(hrd['departments_group'], prefix='dpt')
sal_dummies.head()
dept_grps.head()
hrd_new = pd.concat([hrd, sal_dummies, dept_grps], axis=1)
hrd_new.shape
hrd_new.info()

hr_data = hrd_new.drop(['satisfaction_level', 'last_evaluation', 'number_project',
                        'average_montly_hours', 'time_spend_company', 'Department',
                        'departments_group', 'salary', 'sal_high', 'dpt_others'],
                       axis='columns')
#hrd2 = hrd[['satisfaction_bins','left']]
#hrd2.head()
#hrd2.plot(kind='bar', stacked=True)
#plt.show()
#sns.pairplot(hrd)
hr_data.info()
hr_data.shape
hr_data.pivot_table()
sns.pairplot(hr_data)
hr_data[['satisfaction_lvl', 'left']].plot(kind='bar', stacked=True)

eda = sns.barplot(x="promotion_last_5years", y="satisfaction_lvl", hue='left', data=hr_data).set_title('testing')
plt.show()

x = hr_data.drop(['left'], axis=1)
y = hr_data.left
y.head()
x_new = SelectKBest(chi2, k=5).fit_transform(x, y)
x_new
hr_data.var()

plt.figure(figsize=(10, 10))
sns.heatmap(hr_data.corr(), annot=True, fmt='.1g', vmin=-1, vmax=1, center=0, cmap='coolwarm', square=True)
plt.show()
cor = abs(hr_data.corr())
cor
plt.figure(figsize=(10, 10))
sns.heatmap(cor, annot=True, fmt='.1g', vmax=1, vmin=-1, center=0)
a = cor['left']
features = a[a > 0.5]
features

# model 1: keeping all features
x = hr_data.drop(['left'], axis=1)
y = hr_data.left
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=10)
model = LogisticRegression()
model.fit(x_train, y_train)
model.score(x_test, y_test)

# model 2: feature selection based on chi square
x_new = SelectKBest(chi2, k=5).fit_transform(x, y)
y = hr_data.left
x_train, x_test, y_train, y_test = train_test_split(x_new, y, test_size=0.2, random_state=10)
model = LogisticRegression()
model.fit(x_train, y_train)
model.score(x_test, y_test)

# model 3: features selected based on correlation
x = hr_data[['Work_accident', 'satisfaction_lvl', 'tm_spent_company',
             'last_eval', 'avg_monthly_hrs', 'sal_low', 'sal_medium',
             'dpt_sales', 'dpt_technical']]
y = hr_data.left
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=10)
model = LogisticRegression()
model.fit(x_train, y_train)
model.score(x_test, y_test)
@gusinthecloud3 жыл бұрын
The best explanation as always
@codebasics3 жыл бұрын
Glad you liked it
@shylashreedev26852 жыл бұрын
Thank u so much..it really helped to clear my concepts
@ravisrivastava29094 жыл бұрын
Hello Sir! Great work by you. There may be a problem in your code, maybe due to the Python version: only if we use X_train, X_test, y_train, y_test = train_test_split(df[['age']], df[['bought_insurance']], train_size=0.9) do we get a score of 1; otherwise, with your code, it is 0.66.
@Myanmartaeyang4 жыл бұрын
OMG. I was trying to run the code and did not know why mine was 66%. Thank you for this !!!
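A side note on why scores can jump around like this: without a random_state, train_test_split shuffles differently on every run, so a tiny test set can swing between ~0.66 and 1.0 regardless of the Python version. Fixing the seed makes runs comparable (sketch below; the CSV file name is assumed):

import pandas as pd
from sklearn.model_selection import train_test_split

df = pd.read_csv("insurance_data.csv")  # age / bought_insurance data from the video
X_train, X_test, y_train, y_test = train_test_split(
    df[["age"]], df["bought_insurance"], train_size=0.9, random_state=42  # fixed seed -> same split every run
)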
@paramjeetgill15585 жыл бұрын
Very nice, and you present it in the easiest way to understand. Thank you
@micro_Dots3 жыл бұрын
clearly understandable explanation. Thank you so much.
@shreyjoshi184 жыл бұрын
How do I plot the sigmoid function over the scatter plot in a single visualization?
import numpy as np
import pandas as pd
import math
import matplotlib.pyplot as plt
%matplotlib inline

x = np.arange(-10, 11, 1)
df1 = pd.DataFrame(data=x, columns=["X"])

def sigmoid(x):
    return 1 / (1 + pow(math.e, -x))

values = [sigmoid(i) for i in x]
df2 = pd.DataFrame(values, columns=["Y"])
df = pd.concat([df1, df2], axis=1)

X = df.X
Sigmoid_X = df.Y
plt.figure(figsize=(10, 10))
plt.xlabel("X")
plt.ylabel("Sigmoid_X")
plt.title("Sigmoid_X versus X")
plt.scatter(X, Sigmoid_X)
plt.plot(X, Sigmoid_X)
plt.show()
@laxpanwar115 ай бұрын
Thanks. It is really informative.
@guneyfatma Жыл бұрын
Why do we pass the x and y arguments to the split method? Is it because of the syntax?
@ChiTamNguyen-d9d Жыл бұрын
At 8:00, when converting the linear regression line to the logistic regression curve, I think the point where y = m*x + b = 0 (left corner) is also the point where y = 1 / (1 + e^-(m*x + b)) = 0.5 (the inflection point). Why do the two graphs look different? Or am I missing something?
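The arithmetic in the question does check out: wherever m*x + b = 0, the sigmoid of it is exactly 0.5, so the inflection point of the logistic curve sits where the linear score crosses zero; the two plots just show different quantities (a raw score versus a probability squashed into the 0-1 range). A quick check with made-up slope and intercept:

import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

m, b = 0.8, -4.0            # illustrative slope and intercept
x0 = -b / m                 # the x where m*x + b = 0
print(sigmoid(m * x0 + b))  # 0.5 -> the midpoint of the logistic curve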
@dongliu77393 жыл бұрын
Great! By 5:17, I understood something I hadn't been able to figure out in hours.
@khalidfahim10011 ай бұрын
I was doing the exercise and noticed something in the solution you provided. I think you need to drop one of the salary dummy columns to avoid the dummy variable trap; this was not done in the solution Python notebook provided. Btw, excellent tutorial!!!
@aashishshrestha81803 жыл бұрын
Very Clear Explanation
@codebasics3 жыл бұрын
Glad it was helpful!
@pagnonig4 жыл бұрын
Your videos are awesome. I'm learning so much!
@PlayingCode3 жыл бұрын
Hey, please tell me what to do when I have multiple columns, like age, weight, and BMI, that I need to consider for prediction.
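In scikit-learn you simply pass all the feature columns together as X; a minimal sketch with hypothetical values, using the columns from the question:

import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "age":    [25, 40, 52, 33, 60, 45],
    "weight": [60, 82, 90, 70, 88, 75],
    "bmi":    [21, 27, 31, 23, 30, 26],
    "target": [0, 1, 1, 0, 1, 0],
})  # made-up values, just to show the shape of the data

X = df[["age", "weight", "bmi"]]  # all feature columns at once
y = df["target"]
model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict(pd.DataFrame({"age": [35], "weight": [72], "bmi": [24]})))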
@danielnderitu58862 жыл бұрын
I like your tutorials very much, the explanation therein is superb and makes one understand even very hard to grasp concepts.
@Sparshchokra5 жыл бұрын
Hi, I find your course truly enhances the path toward machine learning concepts. Kindly continue this, sir, and complete the full set of this machine learning course, including all the kick-start algorithms. Thanks, Sparsh
@codebasics5 жыл бұрын
Thanks for the appreciation, Sparsh. I am continuing the series; it is just that, due to my schedule, I am not finding a lot of time to work on it, but I will try my best to speed up new tutorial additions.
@Hari9833 жыл бұрын
Very well done and explained even for beginners - thank you so much!
@REPATHATHYALA2 жыл бұрын
Hi, thank you for the video. How do we plot the output in a graph as you have shown in the initial part of the video? Can you please help?
@lakshmanans37602 жыл бұрын
I am also having the same doubt. I am getting a very weird graph with multiple lines if I try. Can you please help, @codebasics?
@REPATHATHYALA2 жыл бұрын
@@lakshmanans3760 Hi Lakshmanan, I am still learning; sorry, at this stage I may not be able to help you, but I will keep this problem in mind... thank you.
@ayushpant61903 жыл бұрын
I have a question: how is the slope value updated for logistic regression when finding the best-fit plane or line to separate the 2 classes? At first I thought gradient descent, but that doesn't work because it is used to find the global minimum.
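For context, gradient descent does in fact apply here: the log-loss for logistic regression is convex, so its one minimum is the global one, and the slope and intercept are nudged along the negative gradient each step (scikit-learn uses faster solvers such as lbfgs, but the idea is the same). A bare-bones sketch of the update rule, not scikit-learn's internal implementation:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=1000):
    """Plain batch gradient descent on the logistic log-loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)            # current predicted probabilities
        grad_w = X.T @ (p - y) / len(y)   # d(log-loss)/d(weights)
        grad_b = np.mean(p - y)           # d(log-loss)/d(intercept)
        w -= lr * grad_w                  # the "slope" update each iteration
        b -= lr * grad_b
    return w, b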
@ojassaxena28913 жыл бұрын
Maaeri aapahi hans di Maaeri aapahi rondi Maaeri yaad yaad wo aaeri
@shashikantpandit6634 жыл бұрын
awesome explanation....really
@codebasics4 жыл бұрын
😊👍
@anselmtuachi32982 жыл бұрын
Class example: 2 + 2 = 4. Homework: if 3 + 3 = 6, calculate the mass of the sun 😂😂 I'm really grateful for this tutorial though; it's been really great up to this point.
@Christian-mn8dh5 жыл бұрын
Just subscribed; you're very good at explaining. Thank you!
@codebasics5 жыл бұрын
I am glad you liked it Pablo
@vikramjitsingh676910 ай бұрын
The instructor is doing fabulous work teaching these concepts. But aspiring data scientists, don't just learn how to use the libraries; know the math behind them... you should know what a derivative is, what a gradient is, what the log and sigmoid functions are, etc., as you will need these in later stages. Avoiding the math cannot make you good at this.
@codebasics10 ай бұрын
Vikram, in this playlist I have a math video related to gradient descent. Also, I have a separate playlist called "math and statistics for data science" where I covered a few of those topics.
@bandhammanikanta16645 жыл бұрын
Perfect explanation on logistic regression. Loved it. Thanks a lot.