MapReduce Program - 1 | WordCount | Hindi

  68,750 views

TG117 Hindi


Comments: 74
@TechnologicalGeeksHindi 4 years ago
For online classes visit our website technologicalgeeks.com/ Course details: kzbin.info/www/bejne/gXOuaWh7ia1opKs
@deepalichoubey6903 3 years ago
Amazing teacher. He explains each and every topic very nicely. I watch every video and have cleared up so many concepts.
@amarxkumar 5 years ago
Really nice tutorial. I was all confused; later I got to know that it's the offset (the LN#) that the Map receives as input. Very nice video. Keep making more.
@sarveshkumaryadav6222 2 years ago
Nice! Your accent is wonderful.
@allmix5384 5 years ago
While running hadoop fs -put file.txt file I am getting the error below; could you please assist?
19/02/20 14:45:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: /home/klu/big data pracice/file._COPYING_ (Permission denied)
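A hedged guess, since the rest of the setup isn't shown: this usually means the destination directory either doesn't exist or isn't writable by the user running the command. Something along these lines often sorts it out (the /user/klu path is an assumption based on the username in the error):

hdfs dfs -mkdir -p /user/klu           # create the HDFS home directory for the user (assumed path)
hdfs dfs -chown klu /user/klu          # run as the HDFS superuser so the user owns it
hdfs dfs -put file.txt /user/klu/file  # put with an explicit HDFS destination path

If the path in the error really is a local path, it may also be that fs.defaultFS in core-site.xml isn't pointing at HDFS, so the put is writing to the local filesystem instead.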
@prajwaldeepkhokhar7416 4 months ago
Great explanation, bro. Just one question: what is that 36 in the second input line of the map phase?
@sudipchatterjeetube 3 years ago
Can you share a link to the code?
@kiranugalmugale4347 1 year ago
The lib folder under share/hadoop/mapreduce is missing; what should I do?
@the_sky 5 years ago
Now the time has come for you to make a detailed video series on all the Hadoop topics like HDFS, YARN, and so on.
@ysirisha3412 1 year ago
Implement a simple MapReduce job that builds an inverted index on the set of input documents (Hadoop). Please explain this program.
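A minimal sketch of such an inverted-index job (my own illustration, not code from the video), assuming the standard org.apache.hadoop.mapreduce API: the mapper emits (word, filename) pairs and the reducer joins the distinct filenames seen for each word.

import java.io.IOException;
import java.util.LinkedHashSet;
import java.util.Set;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class InvertedIndex {

  public static class IndexMapper extends Mapper<LongWritable, Text, Text, Text> {
    public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      // The name of the file this split came from becomes the posting.
      String fileName = ((FileSplit) context.getInputSplit()).getPath().getName();
      for (String word : value.toString().split("\\s+")) {
        if (!word.isEmpty()) {
          context.write(new Text(word.toLowerCase()), new Text(fileName));
        }
      }
    }
  }

  public static class IndexReducer extends Reducer<Text, Text, Text, Text> {
    public void reduce(Text key, Iterable<Text> values, Context context)
        throws IOException, InterruptedException {
      // Collect the distinct file names for this word.
      Set<String> files = new LinkedHashSet<>();
      for (Text v : values) {
        files.add(v.toString());
      }
      context.write(key, new Text(String.join(",", files)));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "inverted index");
    job.setJarByClass(InvertedIndex.class);
    job.setMapperClass(IndexMapper.class);
    job.setReducerClass(IndexReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}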
@ronaksengupta6174 5 years ago
If we want to calculate word length, how would we have to do it?
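One way to read the question (an assumption on my part) is emitting each word together with its character count; in that case only the map method really differs from WordCount, and the same driver and imports as the WordCount sketch further down this thread can be reused:

public static class WordLengthMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
  public void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    for (String word : value.toString().split("\\s+")) {
      if (!word.isEmpty()) {
        // Emit (word, its length); a reducer can then average, take the max, or just pass it through.
        context.write(new Text(word), new IntWritable(word.length()));
      }
    }
  }
}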
@karangupta6402 5 years ago
Hi Sandeep, you have really made very nice videos. I have thoroughly enjoyed your videos. Also, the examples are excellent and easy to understand.
@the_sweet_heaven 3 years ago
If the file is a petabyte in size, then how would you count the words?
@shubhamsen4572 4 years ago
Awesome style of teaching.
@ravjotsingh537 6 years ago
Where can I find a ready-made jar file of WordCount?
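If the goal is just a ready-made jar rather than building one, Hadoop ships a WordCount inside its examples jar; assuming a standard tarball install, something like this runs it (the input/output paths are placeholders):

hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar wordcount /input /output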
@ManidipMandal 7 years ago
I have installed the latest Hadoop 3.0, but there is no lib folder, nor any jar file within it. What should I do?
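On the missing lib folder (this and the similar comment above): in recent Hadoop 2.x/3.x builds the client jars needed to compile WordCount sit under share/hadoop rather than in one lib folder, and the layout shifted between versions, so the folder shown in the video may simply not exist in 3.0. A couple of ways to locate the jars, assuming a standard install under $HADOOP_HOME:

ls $HADOOP_HOME/share/hadoop/common/*.jar
ls $HADOOP_HOME/share/hadoop/mapreduce/*.jar
hadoop classpath   # prints every directory and jar Hadoop itself puts on the classpath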
@syedimran5264 7 years ago
Bro, can you make a video on MapReduce step by step? Actually I can't figure out why these classes are used and what the flow of the program should be. Please make it in detail. By the way, you're doing an awesome job for beginners, hats off bro. 🎩
@TechnologicalGeeksHindi 7 years ago
+Tech With Me Thank you bro. Actually, the jar files we imported contain the MapReduce methods, and this program uses three methods: map, reduce, and the main method. The flow of the program is specified in the main method. This video is actually based on the MapReduce theory video I uploaded before this one, so I'd suggest watching that video and the Java basics for Hadoop video once 😊 For any query you can contact me at sdp117@gmail.com
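Since several comments ask where to get the code: here is a minimal WordCount in the same shape the reply above describes (a map class, a reduce class, and a main method that wires them up). It follows the stock Apache example pattern and is a sketch, not necessarily line-for-line the code shown on screen:

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // map: (byte offset, line of text) -> (word, 1)
  public static class TokenizerMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // reduce: (word, [1, 1, ...]) -> (word, total count)
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      context.write(key, new IntWritable(sum));
    }
  }

  // main: wires the mapper, reducer and the input/output paths together
  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Compile it against the Hadoop jars (for example with javac -cp "$(hadoop classpath)"), pack it into a jar, and run it with hadoop jar <your-jar> WordCount <input> <output>.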
@harshal3123 2 years ago
Thanks sir, it helps a lot.
@peoples_channel 5 years ago
I am getting an exception when executing the hadoop jar command: Exception in thread "main" java.lang.ClassNotFoundException. Can you please help?
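A common cause, though only a guess since the exact command isn't shown: the class name passed to hadoop jar must be the fully qualified name (including the package) of a class that is actually inside the jar. Two quick checks, with wc.jar and WordCount as placeholder names:

jar tf wc.jar | grep -i wordcount            # see which class names the jar really contains
hadoop jar wc.jar WordCount /input /output   # the name here must match, package and all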
@dharmendranamdev4562 4 years ago
As you said, the mapper takes input in (IntWritable, Text) format, but you didn't specify any such format in your text file. Could you please help me understand?
@TechnologicalGeeksHindi 4 years ago
We don't have to specify anything in the text file; MapReduce is designed in such a way that it will automatically read the file using the InputFormat class.
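To make that concrete for the earlier questions about the mapper's input: with the default TextInputFormat, each record passed to map() is (byte offset of the line within the file, the line itself), with the offset as a LongWritable key. An illustration (not the actual file from the video):

(0,  "hello world hello")    <- the first line starts at byte 0
(18, "hadoop mapreduce")     <- the second line starts at byte 18

So the number shown beside the second input line in the demo is almost certainly that byte offset, not a line number.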
@jeskipurohit8555 6 years ago
My students enjoyed your video and have learned Hadoop faster through it. Will you please make one video on the load balancer?
@shaikhharoon6928 3 years ago
Can this be done on Hadoop 3?
@nishantkashyap67 6 years ago
Hello, could you please help me resolve the error below: The package org.apache.hadoop.fs is accessible from more than one module: hadoop.common and hadoop.mapreduce.client.jobclient
@trickyvines2165 6 years ago
The full series isn't there, and how does the installation work on Windows? Please explain the coding from the start.
@jayantapte9903 6 years ago
Hi sir, I have one question. Say we have two nodes with 4 GB of RAM each, and we are executing 6 mapper tasks on datanode1 that need 6 GB in total. How would this happen? Does it transfer 2 GB of data to node 2 over the network and perform the map operation for 2 of the maps on node 2?
@pradeeppareek7433 6 years ago
Dear sir, I am new to Java. I am getting an error while running the program: ArrayIndexOutOfBoundsException.
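One frequent cause of this in WordCount-style programs, though only an assumption since the stack trace isn't shown: the driver reads args[0] and args[1] for the input and output paths, and the exception fires when the hadoop jar command is run without both paths. A small guard at the top of the main method of the WordCount driver (the sketch above) makes the failure clearer:

if (args.length < 2) {
  System.err.println("Usage: WordCount <input path> <output path>");
  System.exit(2);
}
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));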
@achin4140 7 years ago
Dear sir, please explain any one of the analytical methods of big data, showing why statistics for model building and evaluation is important. Please explain, sir.
@piyush98182 7 years ago
Please post more videos for practising the HDFS command reference, sir. Great work. Post more videos related to Hadoop.
@TechnologicalGeeksHindi 7 years ago
Sure brother ...
@vnansen 6 years ago
It is still showing the errors below... can someone help? (From the Eclipse Problems view: description, resource, path, location, type)
Syntax error, parameterized types are only available if source level is 1.5 or greater | WordCount.java | /WordCount/src | line 66 | Java Problem
Syntax error, parameterized types are only available if source level is 1.5 or greater | WordCount.java | /WordCount/src | line 72 | Java Problem
The class file Mapper contains a signature '(Lorg/apache/hadoop/mapreduce/Mapper.Context;)V' ill-formed at position 74 | WordCount.java | /WordCount/src | line 34 | Java Problem
Syntax error, parameterized types are only available if source level is 1.5 or greater | WordCount.java | /WordCount/src | line 34 | Java Problem
@bigmugable 3 years ago
Simple and super!
@devilatgood1223 6 years ago
Can you elaborate with a program that uses a dataset's attributes?
@sachinvishwakarma4133 2 years ago
Where do I get the code?
@achin4140 7 years ago
Sir, please tell us what the Naive Bayes classifier is in big data analytics.
@TechnologicalGeeksHindi 7 years ago
+Achin Will be working on it 😊
@recordingsp7833 5 years ago
You can refer to this link; it's a good explanation with examples: kzbin.info/www/bejne/noG0moiha5uVgq8
@abcdefghi9776 6 years ago
Hi sir, please make a video for Flume too.
@vamshikrishna198 4 years ago
👌👌👌 explanation
@yogeshupadhyay2836 7 years ago
Very nice video, sir. I hope you will upload the next video covering Hive in depth.
@TechnologicalGeeksHindi 7 years ago
+Yogesh Upadhyay Sure brother 😊
@suraj8285 2 years ago
Very helpful, sir.
@poojachougule7162 6 years ago
Sir, how do I add a file in multi-node Hadoop?
@piyushchoudhary7759 6 years ago
Hello Sandeep sir, awesome videos; they help us a lot in understanding Hadoop. Could you please make detailed videos on Sqoop, Flume, and ZooKeeper?
@prashantsharma-xk9sd 7 years ago
Sir, another nice video, nicely explained... but what about the video on bucketing in Hive?
@TechnologicalGeeksHindi 7 years ago
+prashant sharma Thank you bro 😊 The next video is on bucketing 😊
@syedimran5264 7 years ago
I formatted the namenode, and after that the namenode won't start, bro. What should I do about it? Please tell me.
@TechnologicalGeeksHindi 7 years ago
+Tech With Me Execute the following commands:
stop-dfs.sh
stop-yarn.sh
sudo rm -rf /app/hadoop/tmp/
cd $HADOOP_HOME
bin/hadoop namenode -format
start-dfs.sh
start-yarn.sh
This may help 😊
@syedimran5264 7 years ago
Technological Geeks Hindi, thank you bro.
@rahulshandilya880 6 years ago
Please give an example of the input and output file formats in a MapReduce program.
@dnsvarun 6 years ago
Boss, where is part 2?
@unwrap_journey 4 years ago
Awesome explanation.
@damnikjain8985 5 years ago
PPT link?
@ankitamorey1838 6 years ago
Sir, I am going to join a Hadoop developer course. I have an interest in Pig, Hive, etc., but not so much in MapReduce. When I get a job, will I have to work on MapReduce as well, or only on scripting languages like Pig and Hive? In an Edureka video I saw that Hive is used for structured data, Pig for structured and semi-structured data, and MapReduce for all data types. Please answer ASAP.
@TechnologicalGeeksHindi 6 years ago
+ankita morey Hive is recommended for structured data and Pig for structured and semi-structured data, that's right; however, we can use UDFs for processing unstructured data in both Pig and Hive. Good knowledge of MapReduce is always a plus point.
@rahulshandilya880 6 years ago
Please make a detailed video with examples of all the input and output file formats.
@pranjalpathak4498 4 years ago
It all went over my head, bro!
@rudegal7331 7 years ago
I need to understand the HDFS file system namespace.
@TechnologicalGeeksHindi 7 years ago
+Rude Gal Please check the video on HDFS; you will find it in the Hadoop tutorial playlist on my channel 😊
@rudegal7331 7 years ago
I can't find it. I have a presentation on Tuesday; please share the link to whichever of your videos it is. I need to understand the namespace in detail.
@TechnologicalGeeksHindi 7 years ago
I haven't made a very detailed video on the namespace; the namespace is just the directory structure on HDFS. HDFS architecture and read/write operations: kzbin.info/www/bejne/a5XVZamZr7aVqKM
@rudegal7331 7 years ago
OK, thanks.
@achin4140 7 years ago
How do we read and write data in Hadoop using the Java interface? Please explain, sir.
@TechnologicalGeeksHindi 7 years ago
+Achin It is done by the InputFormat class; we don't have to explicitly specify it in the MapReduce program 😊
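For the Java interface question specifically (reading and writing HDFS outside of a MapReduce job), the usual route is the FileSystem API. A minimal sketch, with the file path as a placeholder:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsReadWrite {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();      // picks up core-site.xml from the classpath
    FileSystem fs = FileSystem.get(conf);

    // Write a small file to HDFS.
    Path out = new Path("/user/demo/hello.txt");   // placeholder path
    try (FSDataOutputStream os = fs.create(out, true)) {
      os.writeBytes("hello hdfs\n");
    }

    // Read it back line by line.
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(fs.open(out)))) {
      String line;
      while ((line = reader.readLine()) != null) {
        System.out.println(line);
      }
    }
  }
}

Run it with the Hadoop jars on the classpath (for example java -cp "$(hadoop classpath):." HdfsReadWrite) so it picks up core-site.xml and talks to the configured HDFS.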
@achin4140 7 years ago
Sir, please explain just this read and write program in a video: hadooptutorial.info/java-interface-to-hdfs-file-read-write/
@-BCA-Akhilsiby 1 year ago
nice
@aniketshinde4416 4 years ago
Please share the source code.
@ahmadali-cz4gu 7 years ago
thank you sir :-D
@TechnologicalGeeksHindi 7 years ago
+ahmad ali You're most welcome Ali Bhai 😊
@aditya_londhe 5 years ago
Thanks bro, it helped.
@manslayerbhanu8152 5 years ago
thanks bro
@malodevinod07 6 years ago
thanks sir :)
HIVE INTRODUCTION HINDI
6:25
TG117 Hindi
58K views
MapReduce tutorial hindi
18:08
TG117 Hindi
67K views
How to run first Map Reduce program in Apache Hadoop (in Hindi)
8:42
Unboxing Big Data
4.1K views
Map Reduce explained with example | System Design
9:09
ByteMonk
175K views
[Hindi] Bucketing in Hive , Map side join , Data Sampling
30:27
TG117 Hindi
54K views
Hive Practical - 2 | Partitioning in Hive | Hindi
20:07
TG117 Hindi
47K views
Hive Practical - 1 | Hindi | Internal and External Tables
26:23
TG117 Hindi
52K views
Run WordCount.java On Hadoop-2.9.0 on Ubuntu 18.04 step by step
17:16