Amazing teacher. He explains each and every topic very nicely. I watch every video and have cleared up so many concepts.
@amarxkumar 5 years ago
Really nice tutorial. I was all confused at first; later I learned that it's the offset / line number that goes into the Map as the input key. Very nice video. Keep making more.
@sarveshkumaryadav6222 2 years ago
Nice! Your accent is wonderful.
@allmix5384 5 years ago
While running hadoop fs -put file.txt file I am getting the error below. Could you please assist?
19/02/20 14:45:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: /home/klu/big data pracice/file._COPYING_ (Permission denied)
@prajwaldeepkhokhar7416 4 months ago
Great explanation, bro. Just one question: what is that 36 in the second input line of the MapReduce example?
@sudipchatterjeetube 3 years ago
Can you give the link to the code?
@kiranugalmugale4347 a year ago
The lib folder under share/hadoop/mapreduce is missing. What should I do?
@the_sky 5 years ago
Now the time has come for you to make a detailed video series on all the Hadoop topics, like HDFS, YARN and the rest.
@ysirisha3412 a year ago
Implement a simple MapReduce job that builds an inverted index on the set of input documents (Hadoop). Please explain this program.
@ronaksengupta6174 5 years ago
If we want to calculate word length instead, how would we do it?
@karangupta6402 5 years ago
Hi Sandeep, you have really made very nice videos. I have thoroughly enjoyed them. Also, the examples are excellent and easy to understand.
@the_sweet_heaven 3 years ago
If the file is a petabyte in size, then how would you count the words?
@shubhamsen4572 4 years ago
Awesome attitude towards teaching.
@ravjotsingh537 6 years ago
From where can I find the ready-made jar file for word count?
@ManidipMandal 7 years ago
I have installed the latest Hadoop 3.0, but there is no lib folder, nor a jar file within the lib folder. What should I do?
@syedimran5264 7 years ago
Bro, can you make a step-by-step video on MapReduce? I can't figure out why each class is used and what the flow of the program should be. Please make it in detail. By the way, you're doing an awesome job for beginners, hats off bro. 🎩
@TechnologicalGeeksHindi 7 years ago
+Tech With Me Thank you, bro. Actually, the jar files we imported contain the MapReduce methods, and this program uses three methods: map, reduce and the main method. The flow of the program is laid out in the main method. This video is based on the MapReduce theory video I uploaded before it; I'd suggest watching that video, and the Java basics for Hadoop video, once. 😊 For any query you can contact me at sdp117@gmail.com
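To illustrate the map and reduce flow described in the reply above, here is a simplified single-machine sketch in Python. This is only an illustration of the word-count logic, not the Hadoop Java program from the video; the function names here are invented for the sketch, and real Hadoop runs the map and reduce phases distributed across nodes.

```python
# Simplified sketch of the word-count MapReduce flow (not Hadoop itself).
from collections import defaultdict

def map_phase(offset, line):
    # Hadoop's mapper receives (byte offset, line of text) and,
    # for word count, emits a (word, 1) pair for every word.
    return [(word, 1) for word in line.split()]

def reduce_phase(word, counts):
    # The reducer receives all values grouped by key and sums them.
    return (word, sum(counts))

def word_count(lines):
    grouped = defaultdict(list)
    offset = 0
    for line in lines:
        for word, one in map_phase(offset, line):
            grouped[word].append(one)
        offset += len(line) + 1  # +1 for the newline byte
    # The shuffle step (grouping by key) is modeled by the defaultdict.
    return dict(reduce_phase(w, c) for w, c in grouped.items())

print(word_count(["hello world", "hello hadoop"]))
# {'hello': 2, 'world': 1, 'hadoop': 1}
```

The main method of the real Java program plays the role of the driver here: it wires the map phase, the grouping, and the reduce phase together.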
@harshal3123 2 years ago
Thanks sir, it helps a lot.
@peoples_channel 5 years ago
I am getting an exception when I execute the hadoop jar command: Exception in thread "main" java.lang.ClassNotFoundException. Can you please help?
@dharmendranamdev4562 4 years ago
As you said, the mapper takes input in (IntWritable, Text) format, but you didn't specify any such format in your text file. Could you please help me understand?
@TechnologicalGeeksHindi 4 years ago
We don't have to specify anything in the text file; MapReduce is designed in such a way that it reads the file automatically using the InputFormat class.
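As a rough illustration of what the reply above means: for plain text input, Hadoop's default input format turns the file into (byte offset of the line's start, line contents) pairs on its own, which is also why numbers like 36 appear as keys in the mapper's input. The sketch below is a hypothetical Python model of that behavior, not Hadoop code.

```python
# Sketch of how a text input file becomes (offset, line) records.
# The offsets are computed from the file bytes; nothing extra is
# written in the text file itself.

def text_input_records(data: bytes):
    records = []
    offset = 0
    for line in data.split(b"\n"):
        if line:
            records.append((offset, line.decode()))
        offset += len(line) + 1  # account for the newline byte
    return records

sample = b"this is the first line\nthis is the second line\n"
print(text_input_records(sample))
# [(0, 'this is the first line'), (23, 'this is the second line')]
```

The first line starts at byte 0; the second starts at byte 23 because the first line is 22 bytes plus one newline.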
@jeskipurohit8555 6 years ago
My students enjoyed your video and have learned Hadoop faster because of it. Will you please make one video on load balancers?
@shaikhharoon6928 3 years ago
Can this be done on Hadoop 3?
@nishantkashyap67 6 years ago
Hello, could you please help me resolve the error below?
The package org.apache.hadoop.fs is accessible from more than one module: hadoop.common and hadoop.mapreduce.client.jobclient
@trickyvines2165 6 years ago
The full series isn't there, and how do I install it on Windows? Please cover the coding from the start.
@jayantapte9903 6 years ago
Hi sir, I have one question. Say we have two nodes with 4 GB RAM each, and we are executing 6 mapper tasks on datanode1 that need 6 GB in total. How would this happen? Does it transfer 2 GB of data to node 2 over the network and run 2 of the map tasks on node 2?
@pradeeppareek7433 6 years ago
Dear sir, I am new to Java and I am getting an ArrayIndexOutOfBounds error while running the program.
@achin4140 7 years ago
Dear sir, please explain any one of the analytical methods of big data that shows why statistics for model building and evaluation are important. Please explain, sir.
@piyush98182 7 years ago
Sir, please post more videos for practicing the HDFS command reference. Great work. Post more videos related to Hadoop.
@TechnologicalGeeksHindi 7 years ago
Sure brother ...
@vnansen 6 years ago
It is still showing the errors below. Can someone help?
Syntax error, parameterized types are only available if source level is 1.5 or greater (WordCount.java, /WordCount/src, line 66)
Syntax error, parameterized types are only available if source level is 1.5 or greater (WordCount.java, /WordCount/src, line 72)
The class file Mapper contains a signature '(Lorg/apache/hadoop/mapreduce/Mapper.Context;)V' ill-formed at position 74 (WordCount.java, /WordCount/src, line 34)
Syntax error, parameterized types are only available if source level is 1.5 or greater (WordCount.java, /WordCount/src, line 34)
@bigmugable 3 years ago
Simple and super!
@devilatgood1223 6 years ago
Can you elaborate with a program that uses a dataset's attributes?
@sachinvishwakarma4133 2 years ago
Where do I get the code?
@achin4140 7 years ago
I beg you, sir, please tell us: what is the Naive Bayes classifier in big data analytics?
@TechnologicalGeeksHindi 7 years ago
+Achin Will be working on it 😊
@recordingsp7833 5 years ago
You can refer to this link; it's a good explanation with examples: kzbin.info/www/bejne/noG0moiha5uVgq8
@abcdefghi9776 6 years ago
Hi sir, please make a video on Flume too.
@vamshikrishna198 4 years ago
👌👌👌 explanation
@yogeshupadhyay2836 7 years ago
Very nice video, sir. I hope you will upload the next video on Hive in depth.
@TechnologicalGeeksHindi 7 years ago
+Yogesh Upadhyay Sure brother 😊
@suraj8285 2 years ago
Very helpful, sir.
@poojachougule7162 6 years ago
Sir, how do I add a file in multi-node Hadoop?
@piyushchoudhary7759 6 years ago
Hello Sandeep sir, awesome videos; they help us a lot in understanding Hadoop. Could you please make detailed videos on Sqoop, Flume and ZooKeeper?
@prashantsharma-xk9sd 7 years ago
Sir, another nice video, nicely explained... but what about the video on bucketing in Hive?
@TechnologicalGeeksHindi 7 years ago
+prashant sharma Thank you bro 😊 The next video is on bucketing 😊
@syedimran5264 7 years ago
I formatted the namenode, and after that the namenode is not starting, bro. What do I have to do for that? Please tell me.
@TechnologicalGeeksHindi 7 years ago
+Tech With Me Execute the following commands:
stop-dfs.sh
stop-yarn.sh
sudo rm -rf /app/hadoop/tmp/
cd $HADOOP_HOME
bin/hadoop namenode -format
start-dfs.sh
start-yarn.sh
This may help 😊
@syedimran5264 7 years ago
Technological Geeks Hindi, thank you bro
@rahulshandilya880 6 years ago
Please show the input and output file formats in a MapReduce program, with an example.
@dnsvarun 6 years ago
Boss, where is part 2?
@unwrap_journey 4 years ago
Awesome explanation
@damnikjain8985 5 years ago
PPT link?
@ankitamorey1838 6 years ago
Sir, I am going to join a Hadoop developer course. I am interested in Pig, Hive etc., but not so much in MapReduce. When I get a job, will I have to work on MapReduce as well, or only on scripting languages like Pig and Hive? In an Edureka video I saw that Hive is used for structured data, Pig for structured and semi-structured data, and MapReduce for all data types. Please answer ASAP.
@TechnologicalGeeksHindi 6 years ago
+ankita morey Hive is recommended for structured data and Pig for structured and semi-structured data, that's right; however, we can use UDFs for processing unstructured data in both Pig and Hive. Good knowledge of MR is always a plus point.
@rahulshandilya880 6 years ago
Please give a detailed example video covering all the input file formats and output file formats.
@pranjalpathak4498 4 years ago
It all went over my head, bro!
@rudegal7331 7 years ago
I need to understand the HDFS file system namespace.
@TechnologicalGeeksHindi 7 years ago
+Rude Gal Please check the video on HDFS; you will find it in the Hadoop tutorial playlist on my channel 😊
@rudegal7331 7 years ago
I can't find it, and I have a presentation on Tuesday. Please link which of your videos it is; I need to understand the namespace in detail.
@TechnologicalGeeksHindi 7 years ago
I haven't made a very detailed video on the namespace; the namespace is just the directory structure on HDFS. HDFS architecture and read/write operations: kzbin.info/www/bejne/a5XVZamZr7aVqKM
@rudegal7331 7 years ago
OK, thanks
@achin4140 7 years ago
How do I read and write data in Hadoop using the Java interface? Please explain, sir, I touch your feet, please tell me.
@TechnologicalGeeksHindi 7 years ago
+Achin It is done by the InputFormat class; we don't have to specify it explicitly in the MapReduce program 😊
@achin4140 7 years ago
I beg you, sir, please explain just this read-and-write program in a video: hadooptutorial.info/java-interface-to-hdfs-file-read-write/