• Category

    Q&A
  • Topic

    Data Engineering

  • Status

    Resolved

Question about a NullPointerException when running WordDriver

Posted 2021-07-12 19:05 · 153 views

1

Hello, instructor. I'm finding the lectures very helpful.

When I run WordDriver, an exception is thrown. I searched around on Google but couldn't find a complete answer, so I'm asking here: how can I fix this error?

2021-07-12 17:22:08,281 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(60)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2021-07-12 17:22:08,897 WARN  [main] impl.MetricsConfig (MetricsConfig.java:loadFirst(136)) - Cannot locate configuration: tried hadoop-metrics2-jobtracker.properties,hadoop-metrics2.properties
2021-07-12 17:22:08,973 INFO  [main] impl.MetricsSystemImpl (MetricsSystemImpl.java:startTimer(378)) - Scheduled Metric snapshot period at 10 second(s).
2021-07-12 17:22:08,974 INFO  [main] impl.MetricsSystemImpl (MetricsSystemImpl.java:start(191)) - JobTracker metrics system started
2021-07-12 17:22:09,282 WARN  [main] mapreduce.JobResourceUploader (JobResourceUploader.java:uploadResourcesInternal(149)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2021-07-12 17:22:09,290 WARN  [main] mapreduce.JobResourceUploader (JobResourceUploader.java:uploadJobJar(482)) - No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
2021-07-12 17:22:09,366 INFO  [main] input.FileInputFormat (FileInputFormat.java:listStatus(292)) - Total input files to process : 2
2021-07-12 17:22:09,411 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(202)) - number of splits:2
2021-07-12 17:22:09,549 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(298)) - Submitting tokens for job: job_local579481131_0001
2021-07-12 17:22:09,551 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(299)) - Executing with tokens: []
2021-07-12 17:22:09,688 INFO  [main] mapreduce.Job (Job.java:submit(1569)) - The url to track the job: http://localhost:8080/
2021-07-12 17:22:09,690 INFO  [Thread-23] mapred.LocalJobRunner (LocalJobRunner.java:createOutputCommitter(501)) - OutputCommitter set in config null
2021-07-12 17:22:09,700 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1614)) - Running job: job_local579481131_0001
2021-07-12 17:22:09,714 INFO  [Thread-23] output.FileOutputCommitter (FileOutputCommitter.java:<init>(141)) - File Output Committer Algorithm version is 2
2021-07-12 17:22:09,714 INFO  [Thread-23] output.FileOutputCommitter (FileOutputCommitter.java:<init>(156)) - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2021-07-12 17:22:09,715 INFO  [Thread-23] mapred.LocalJobRunner (LocalJobRunner.java:createOutputCommitter(519)) - OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2021-07-12 17:22:09,844 INFO  [Thread-23] mapred.LocalJobRunner (LocalJobRunner.java:runTasks(478)) - Waiting for map tasks
2021-07-12 17:22:09,845 INFO  [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner (LocalJobRunner.java:run(252)) - Starting task: attempt_local579481131_0001_m_000000_0
2021-07-12 17:22:09,886 INFO  [LocalJobRunner Map Task Executor #0] output.FileOutputCommitter (FileOutputCommitter.java:<init>(141)) - File Output Committer Algorithm version is 2
2021-07-12 17:22:09,886 INFO  [LocalJobRunner Map Task Executor #0] output.FileOutputCommitter (FileOutputCommitter.java:<init>(156)) - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2021-07-12 17:22:09,941 INFO  [LocalJobRunner Map Task Executor #0] mapred.Task (Task.java:initialize(626)) -  Using ResourceCalculatorProcessTree : [ ]
2021-07-12 17:22:09,948 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:runNewMapper(768)) - Processing split: hdfs://localhost:9000/user/mapreduce2/input/Ireland-And-The-home.txt:0+495572
2021-07-12 17:22:10,025 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:setEquator(1219)) - (EQUATOR) 0 kvi 26214396(104857584)
2021-07-12 17:22:10,026 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(1012)) - mapreduce.task.io.sort.mb: 100
2021-07-12 17:22:10,026 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(1013)) - soft limit at 83886080
2021-07-12 17:22:10,026 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(1014)) - bufstart = 0; bufvoid = 104857600
2021-07-12 17:22:10,026 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(1015)) - kvstart = 26214396; length = 6553600
2021-07-12 17:22:10,031 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:createSortingCollector(409)) - Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2021-07-12 17:22:11,334 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1635)) - Job job_local579481131_0001 running in uber mode : false
2021-07-12 17:22:11,336 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1642)) -  map 0% reduce 0%
2021-07-12 17:22:11,351 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:flush(1476)) - Starting flush of map output
2021-07-12 17:22:11,374 INFO  [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner (LocalJobRunner.java:run(252)) - Starting task: attempt_local579481131_0001_m_000001_0
2021-07-12 17:22:11,376 INFO  [LocalJobRunner Map Task Executor #0] output.FileOutputCommitter (FileOutputCommitter.java:<init>(141)) - File Output Committer Algorithm version is 2
2021-07-12 17:22:11,376 INFO  [LocalJobRunner Map Task Executor #0] output.FileOutputCommitter (FileOutputCommitter.java:<init>(156)) - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2021-07-12 17:22:11,376 INFO  [LocalJobRunner Map Task Executor #0] mapred.Task (Task.java:initialize(626)) -  Using ResourceCalculatorProcessTree : [ ]
2021-07-12 17:22:11,378 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:runNewMapper(768)) - Processing split: hdfs://localhost:9000/user/mapreduce2/input/The-Plain-book.txt:0+33520
2021-07-12 17:22:11,387 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:setEquator(1219)) - (EQUATOR) 0 kvi 26214396(104857584)
2021-07-12 17:22:11,387 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(1012)) - mapreduce.task.io.sort.mb: 100
2021-07-12 17:22:11,387 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(1013)) - soft limit at 83886080
2021-07-12 17:22:11,387 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(1014)) - bufstart = 0; bufvoid = 104857600
2021-07-12 17:22:11,388 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(1015)) - kvstart = 26214396; length = 6553600
2021-07-12 17:22:11,388 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:createSortingCollector(409)) - Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2021-07-12 17:22:11,402 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:flush(1476)) - Starting flush of map output
2021-07-12 17:22:11,427 INFO  [Thread-23] mapred.LocalJobRunner (LocalJobRunner.java:runTasks(486)) - map task executor complete.
2021-07-12 17:22:11,436 WARN  [Thread-23] mapred.LocalJobRunner (LocalJobRunner.java:run(590)) - job_local579481131_0001
java.lang.Exception: java.lang.NullPointerException
	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:492)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:552)
Caused by: java.lang.NullPointerException
	at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1090)
	at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:727)
	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
	at com.wonjun.jun.WordMapper.map(WordMapper.java:21)
	at com.wonjun.jun.WordMapper.map(WordMapper.java:1)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:271)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2021-07-12 17:22:12,339 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1655)) - Job job_local579481131_0001 failed with state FAILED due to: NA
2021-07-12 17:22:12,350 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1660)) - Counters: 0

2 Answers

1

wonjun1995

Question author

2021.07.13

It turns out the error was caused by using context.write incorrectly in the Mapper... I've fixed it now. Thank you!!

Hello, wonjun1995.

Thank you for studying so hard this late; I'm glad to hear it's resolved. As expected, in the map implementation the IntWritable object receiving the result was left empty, or there was a problem with how the Text object was used when the output was written.
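
For reference, here is a minimal sketch of a WordCount-style mapper in which the Text and IntWritable output objects are created before context.write is called; passing a null key or value to context.write is a common way to produce the NullPointerException in MapOutputBuffer.collect shown in the log. The package name follows the stack trace, and the tokenizing logic is only an assumption, not necessarily the course's exact code.

package com.wonjun.jun;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    // Reused output objects. If these are declared but never constructed and
    // then passed to context.write, the collector throws the NPE seen above.
    private final Text word = new Text();
    private final IntWritable one = new IntWritable(1);

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Split each input line into tokens and emit a (word, 1) pair per token.
        StringTokenizer tokenizer = new StringTokenizer(value.toString());
        while (tokenizer.hasMoreTokens()) {
            word.set(tokenizer.nextToken());
            context.write(word, one);   // both arguments must be non-null
        }
    }
}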

I hope this brings you good results.

By the way, I'd appreciate it if you could switch the question from "Unresolved" to "Resolved". I tried to do it myself, but it didn't work; it seems that button is only enabled for the student.

Once again, congratulations on keeping up with the Hadoop beginner course and growing another step as a programmer.

0

Hello, wonjun1995.

Thankfully, you've followed along very well, from encountering Hadoop for the first time all the way to installing the plugin, so I appreciate that.


Please show me your three source files: WordMapper, WordReducer, and WordDriver.

If you're getting a null (NullPointerException) while the map task runs, one of those three sources was most likely implemented incorrectly. The code probably compiles fine, but data was likely left empty somewhere in the map or reduce implementation, which then surfaced as the null when the driver ran the job.
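
For comparison, a typical WordCount-style driver looks roughly like the sketch below. The class names follow the three sources mentioned above, the input/output paths come from command-line arguments as placeholders, and this is only an illustration, not necessarily the course code. Note that setJarByClass also addresses the "No job jar file set" warning visible in the log.

package com.wonjun.jun;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordDriver {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");

        job.setJarByClass(WordDriver.class);   // avoids the "No job jar file set" warning
        job.setMapperClass(WordMapper.class);
        job.setReducerClass(WordReducer.class);

        // These must match the types the mapper and reducer actually emit via context.write.
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));    // e.g. the HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // must not already exist

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}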

This kind of error makes for good practice. Please attach the source files again so I can try implementing it myself, or send screenshots of the three sources.

Keep up the good work.