The Oozie installation directory ships a set of sample applications in oozie-examples.tar.gz.
Extract it into the current directory and inspect the contents, then create an oozie-apps directory to hold the test examples.

  $ tar -zxf oozie-examples.tar.gz
  $ cd examples/
  $ ls
  apps input-data src
  $ ls apps
  aggregator cron custom-main demo hadoop-el hive map-reduce pig sla sqoop-freeform streaming
  bundle cron-schedule datelist-java-main distcp hcatalog java-main no-op shell sqoop ssh subwf
  $ cd ..
  $ mkdir oozie-apps

Scheduling an MR job - the bundled Oozie example

Copy the bundled example examples/apps/map-reduce and the data files in examples/input-data into oozie-apps, then edit the configuration files.

  $ cp -r examples/apps/map-reduce oozie-apps/map-reduce
  $ cp -r examples/input-data oozie-apps/input-data
  $ ls oozie-apps/map-reduce/
  job.properties job-with-config-class.properties lib
  workflow-with-config-class.xml workflow.xml
  $ ls oozie-apps/map-reduce/lib
  oozie-examples-4.0.0-cdh5.3.6.jar

job.properties

  nameNode=hdfs://192.168.32.130:8020
  # yarn.resourcemanager.address = ${yarn.resourcemanager.hostname}:8032
  jobTracker=192.168.32.130:8032
  queueName=default
  examplesRoot=oozie-apps
  # workflow application path on HDFS: hdfs://192.168.32.130:8020/user/jack/oozie-apps/map-reduce/workflow.xml
  oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/map-reduce/workflow.xml
  outputDir=map-reduce
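The ${...} placeholders in job.properties are expanded by Oozie at submission time (user.name comes from the submitting user). A minimal sketch of that substitution in plain Java — not Oozie's actual EL resolver, just an illustration of how the commented application path is assembled:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ElDemo {
    // Expand ${var} placeholders from a map of property values; unknown
    // variables are left untouched. A rough stand-in for Oozie's resolver.
    static String resolve(String template, Map<String, String> vars) {
        Matcher m = Pattern.compile("\\$\\{([^}]+)\\}").matcher(template);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String value = vars.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(sb, Matcher.quoteReplacement(value));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> vars = new HashMap<>();
        vars.put("nameNode", "hdfs://192.168.32.130:8020");
        vars.put("user.name", "jack");
        vars.put("examplesRoot", "oozie-apps");
        // prints hdfs://192.168.32.130:8020/user/jack/oozie-apps/map-reduce/workflow.xml
        System.out.println(resolve(
                "${nameNode}/user/${user.name}/${examplesRoot}/map-reduce/workflow.xml", vars));
    }
}
```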

workflow.xml

  <workflow-app xmlns="uri:oozie:workflow:0.2" name="map-reduce-wf">
      <start to="mr-node"/>
      <action name="mr-node">
          <map-reduce>
              <job-tracker>${jobTracker}</job-tracker>
              <name-node>${nameNode}</name-node>
              <prepare>
                  <delete path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data/${outputDir}"/>
              </prepare>
              <configuration>
                  <property>
                      <name>mapred.job.queue.name</name>
                      <value>${queueName}</value>
                  </property>
                  <property>
                      <name>mapred.mapper.class</name>
                      <value>org.apache.oozie.example.SampleMapper</value>
                  </property>
                  <property>
                      <name>mapred.reducer.class</name>
                      <value>org.apache.oozie.example.SampleReducer</value>
                  </property>
                  <property>
                      <name>mapred.map.tasks</name>
                      <value>1</value>
                  </property>
                  <property>
                      <name>mapred.input.dir</name>
                      <value>/user/${wf:user()}/${examplesRoot}/input-data/text</value>
                  </property>
                  <property>
                      <name>mapred.output.dir</name>
                      <value>/user/${wf:user()}/${examplesRoot}/output-data/${outputDir}</value>
                  </property>
              </configuration>
          </map-reduce>
          <ok to="end"/>
          <error to="fail"/>
      </action>
      <kill name="fail">
          <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
      </kill>
      <end name="end"/>
  </workflow-app>

The prepare element declares setup steps that run before the action itself launches; here it deletes the previous output directory, so a rerun does not fail because the output path already exists.

The classes referenced by the mapred.mapper.class and mapred.reducer.class properties are packaged as a jar (oozie-examples-4.0.0-cdh5.3.6.jar) in the lib directory of the workflow application.

The source for these classes lives under the examples/src directory.

The mapred.input.dir and mapred.output.dir properties give the MapReduce job's input and output locations. The input data from examples/input-data must be uploaded to the corresponding HDFS location under /user/jack/oozie-apps/.

SampleMapper.java

It uses the old-style (org.apache.hadoop.mapred) API.

  /**
   * Licensed to the Apache Software Foundation (ASF) under one
   * or more contributor license agreements. See the NOTICE file
   * distributed with this work for additional information
   * regarding copyright ownership. The ASF licenses this file
   * to you under the Apache License, Version 2.0 (the
   * "License"); you may not use this file except in compliance
   * with the License. You may obtain a copy of the License at
   *
   * http://www.apache.org/licenses/LICENSE-2.0
   *
   * Unless required by applicable law or agreed to in writing, software
   * distributed under the License is distributed on an "AS IS" BASIS,
   * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   * See the License for the specific language governing permissions and
   * limitations under the License.
   */
  package org.apache.oozie.example;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapred.JobConf;
  import org.apache.hadoop.mapred.Mapper;
  import org.apache.hadoop.mapred.OutputCollector;
  import org.apache.hadoop.mapred.Reporter;

  import java.io.IOException;

  public class SampleMapper implements Mapper<LongWritable, Text, LongWritable, Text> {

      public void configure(JobConf jobConf) {
      }

      public void map(LongWritable key, Text value,
              OutputCollector<LongWritable, Text> collector, Reporter reporter)
              throws IOException {
          // key is the line's byte offset, value is the line text; emit both unchanged
          collector.collect(key, value);
      }

      public void close() throws IOException {
      }
  }
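The mapper's key is supplied by TextInputFormat: the byte offset of each line within the input file. Since SampleMapper just forwards every (key, value) pair, its visible effect is only those offsets. A self-contained sketch (illustrative only, not part of the example jar) reproduces the numbers seen in the job output below:

```java
public class OffsetDemo {
    // Compute the byte offset of each line in an in-memory "file",
    // mirroring the keys TextInputFormat hands to SampleMapper.
    static long[] offsets(String[] lines) {
        long[] out = new long[lines.length];
        long off = 0;
        for (int i = 0; i < lines.length; i++) {
            out[i] = off;
            off += lines[i].length() + 1; // +1 for the trailing '\n' byte
        }
        return out;
    }

    public static void main(String[] args) {
        String[] lines = {
            "To be or not to be, that is the question;",
            "Whether 'tis nobler in the mind to suffer"
        };
        long[] offs = offsets(lines);
        for (int i = 0; i < lines.length; i++) {
            System.out.println(offs[i] + "\t" + lines[i]); // 0, then 42 — as in the job output
        }
    }
}
```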

SampleReducer.java

  /**
   * Licensed to the Apache Software Foundation (ASF) under one
   * or more contributor license agreements. See the NOTICE file
   * distributed with this work for additional information
   * regarding copyright ownership. The ASF licenses this file
   * to you under the Apache License, Version 2.0 (the
   * "License"); you may not use this file except in compliance
   * with the License. You may obtain a copy of the License at
   *
   * http://www.apache.org/licenses/LICENSE-2.0
   *
   * Unless required by applicable law or agreed to in writing, software
   * distributed under the License is distributed on an "AS IS" BASIS,
   * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   * See the License for the specific language governing permissions and
   * limitations under the License.
   */
  package org.apache.oozie.example;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapred.JobConf;
  import org.apache.hadoop.mapred.OutputCollector;
  import org.apache.hadoop.mapred.Reducer;
  import org.apache.hadoop.mapred.Reporter;

  import java.io.IOException;
  import java.util.Iterator;

  public class SampleReducer implements Reducer<LongWritable, Text, LongWritable, Text> {

      public void configure(JobConf jobConf) {
      }

      public void reduce(LongWritable key, Iterator<Text> values,
              OutputCollector<LongWritable, Text> collector, Reporter reporter)
              throws IOException {
          // loop over the values and emit them unchanged
          while (values.hasNext()) {
              collector.collect(key, values.next());
          }
      }

      public void close() throws IOException {
      }
  }

Running job.properties

  $ ~/Documents/hadoop/bin/hadoop fs -put oozie-apps/input-data oozie-apps/input-data
  $ ~/Documents/hadoop/bin/hadoop fs -put oozie-apps/map-reduce/* oozie-apps/map-reduce
  $ ~/Documents/hadoop/bin/hadoop fs -ls oozie-apps/map-reduce
  Found 5 items
  -rw-r--r-- 1 jack supergroup 1028 2020-04-21 05:59 oozie-apps/map-reduce/job-with-config-class.properties
  -rw-r--r-- 1 jack supergroup 1019 2020-04-21 05:59 oozie-apps/map-reduce/job.properties
  drwxr-xr-x - jack supergroup 0 2020-04-21 05:59 oozie-apps/map-reduce/lib
  -rw-r--r-- 1 jack supergroup 2274 2020-04-21 05:59 oozie-apps/map-reduce/workflow-with-config-class.xml
  -rw-r--r-- 1 jack supergroup 2559 2020-04-21 05:59 oozie-apps/map-reduce/workflow.xml
  $ # run the job
  $ export OOZIE_URL="http://192.168.32.130:11000/oozie"
  $ bin/oozie job -config oozie-apps/map-reduce/job.properties -run
  job: 0000007-200420185350972-oozie-jack-W
  $ bin/oozie job -info 0000007-200420185350972-oozie-jack-W
  Job ID : 0000007-200420185350972-oozie-jack-W
  ------------------------------------------------------------------------------------------------------------------------------------
  Workflow Name : map-reduce-wf
  App Path : hdfs://192.168.32.130:8020/user/jack/oozie-apps/map-reduce/workflow.xml
  Status : SUCCEEDED
  Run : 0
  User : jack
  Group : -
  Created : 2020-04-21 13:00 GMT
  Started : 2020-04-21 13:00 GMT
  Last Modified : 2020-04-21 13:01 GMT
  Ended : 2020-04-21 13:01 GMT
  CoordAction ID: -
  Actions
  ------------------------------------------------------------------------------------------------------------------------------------
  ID Status Ext ID Ext Status Err Code
  ------------------------------------------------------------------------------------------------------------------------------------
  0000007-200420185350972-oozie-jack-W@:start: OK - OK -
  ------------------------------------------------------------------------------------------------------------------------------------
  0000007-200420185350972-oozie-jack-W@mr-node OK job_1586921478592_0029 SUCCEEDED -
  ------------------------------------------------------------------------------------------------------------------------------------
  0000007-200420185350972-oozie-jack-W@end OK - OK -
  ------------------------------------------------------------------------------------------------------------------------------------
  $ ~/Documents/hadoop/bin/hadoop fs -ls oozie-apps/output-data/map-reduce
  Found 2 items
  -rw-r--r-- 1 jack supergroup 0 2020-04-21 06:01 oozie-apps/output-data/map-reduce/_SUCCESS
  -rw-r--r-- 1 jack supergroup 1547 2020-04-21 06:01 oozie-apps/output-data/map-reduce/part-00000
  $ # the map output is written as-is; the reducer does no extra processing
  $ ~/Documents/hadoop/bin/hadoop fs -cat oozie-apps/output-data/map-reduce/*
  0 To be or not to be, that is the question;
  42 Whether 'tis nobler in the mind to suffer
  84 The slings and arrows of outrageous fortune,
  129 Or to take arms against a sea of troubles,
  172 And by opposing, end them. To die, to sleep;
  217 No more; and by a sleep to say we end
  255 The heart-ache and the thousand natural shocks
  302 That flesh is heir to ? 'tis a consummation
  346 Devoutly to be wish'd. To die, to sleep;
  387 To sleep, perchance to dream. Ay, there's the rub,
  438 For in that sleep of death what dreams may come,
  487 When we have shuffled off this mortal coil,
  531 Must give us pause. There's the respect
  571 That makes calamity of so long life,
  608 For who would bear the whips and scorns of time,
  657 Th'oppressor's wrong, the proud man's contumely,
  706 The pangs of despised love, the law's delay,
  751 The insolence of office, and the spurns
  791 That patient merit of th'unworthy takes,
  832 When he himself might his quietus make
  871 With a bare bodkin? who would fardels bear,
  915 To grunt and sweat under a weary life,
  954 But that the dread of something after death,
  999 The undiscovered country from whose bourn
  1041 No traveller returns, puzzles the will,
  1081 And makes us rather bear those ills we have
  1125 Than fly to others that we know not of?
  1165 Thus conscience does make cowards of us all,
  1210 And thus the native hue of resolution
  1248 Is sicklied o'er with the pale cast of thought,
  1296 And enterprises of great pitch and moment
  1338 With this regard their currents turn awry,
  1381 And lose the name of action.

job-with-config-class.properties

  nameNode=hdfs://192.168.32.130:8020
  jobTracker=192.168.32.130:8032
  queueName=default
  examplesRoot=oozie-apps
  oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/map-reduce/workflow-with-config-class.xml
  outputDir=map-reduce

workflow-with-config-class.xml

  <workflow-app xmlns="uri:oozie:workflow:0.5" name="map-reduce-wf">
      <start to="mr-node"/>
      <action name="mr-node">
          <map-reduce>
              <job-tracker>${jobTracker}</job-tracker>
              <name-node>${nameNode}</name-node>
              <prepare>
                  <delete path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data/${outputDir}"/>
              </prepare>
              <!-- most of the <configuration> properties are being set by SampleOozieActionConfigurator -->
              <configuration>
                  <property>
                      <name>mapred.job.queue.name</name>
                      <value>${queueName}</value>
                  </property>
                  <!-- These two are not Hadoop properties, but SampleOozieActionConfigurator can use them -->
                  <property>
                      <name>examples.root</name>
                      <value>${examplesRoot}</value>
                  </property>
                  <property>
                      <name>output.dir.name</name>
                      <value>${outputDir}</value>
                  </property>
              </configuration>
              <config-class>org.apache.oozie.example.SampleOozieActionConfigurator</config-class>
          </map-reduce>
          <ok to="end"/>
          <error to="fail"/>
      </action>
      <kill name="fail">
          <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
      </kill>
      <end name="end"/>
  </workflow-app>

The difference from workflow.xml is that most of the job configuration is supplied in one place by the class named in the config-class element, org.apache.oozie.example.SampleOozieActionConfigurator.

The source for this class is also under the examples/src directory.

SampleOozieActionConfigurator.java

  /**
   * Licensed to the Apache Software Foundation (ASF) under one
   * or more contributor license agreements. See the NOTICE file
   * distributed with this work for additional information
   * regarding copyright ownership. The ASF licenses this file
   * to you under the Apache License, Version 2.0 (the
   * "License"); you may not use this file except in compliance
   * with the License. You may obtain a copy of the License at
   *
   * http://www.apache.org/licenses/LICENSE-2.0
   *
   * Unless required by applicable law or agreed to in writing, software
   * distributed under the License is distributed on an "AS IS" BASIS,
   * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   * See the License for the specific language governing permissions and
   * limitations under the License.
   */
  package org.apache.oozie.example;

  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.mapred.FileInputFormat;
  import org.apache.hadoop.mapred.FileOutputFormat;
  import org.apache.hadoop.mapred.JobConf;
  import org.apache.oozie.action.hadoop.OozieActionConfigurator;
  import org.apache.oozie.action.hadoop.OozieActionConfiguratorException;

  public class SampleOozieActionConfigurator implements OozieActionConfigurator {

      @Override
      public void configure(JobConf actionConf) throws OozieActionConfiguratorException {
          if (actionConf.getUser() == null) {
              throw new OozieActionConfiguratorException("No user set");
          }
          if (actionConf.get("examples.root") == null) {
              throw new OozieActionConfiguratorException("examples.root not set");
          }
          if (actionConf.get("output.dir.name") == null) {
              throw new OozieActionConfiguratorException("output.dir.name not set");
          }
          actionConf.setMapperClass(SampleMapper.class);
          actionConf.setReducerClass(SampleReducer.class);
          actionConf.setNumMapTasks(1);
          FileInputFormat.setInputPaths(actionConf,
                  new Path("/user/" + actionConf.getUser() + "/"
                          + actionConf.get("examples.root") + "/input-data/text"));
          FileOutputFormat.setOutputPath(actionConf,
                  new Path("/user/" + actionConf.getUser() + "/"
                          + actionConf.get("examples.root") + "/output-data/"
                          + actionConf.get("output.dir.name")));
      }
  }

This implementation of the OozieActionConfigurator interface sets the mapper class, the reducer class, the number of map tasks, and the input and output directories.
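The pattern — validate the supplied properties, then derive the remaining job settings from them — can be sketched without Hadoop or Oozie on the classpath. The interface and the map standing in for JobConf below are illustrative stand-ins, not the real Oozie API:

```java
import java.util.HashMap;
import java.util.Map;

public class ConfiguratorDemo {
    // Stand-in for OozieActionConfigurator; a plain map stands in for JobConf.
    interface ActionConfigurator {
        void configure(Map<String, String> conf);
    }

    // Mirrors SampleOozieActionConfigurator: fail fast on missing inputs,
    // then build the input/output paths from the validated properties.
    static class SampleConfigurator implements ActionConfigurator {
        public void configure(Map<String, String> conf) {
            for (String key : new String[]{"user.name", "examples.root", "output.dir.name"}) {
                if (!conf.containsKey(key)) {
                    throw new IllegalStateException(key + " not set");
                }
            }
            conf.put("mapred.input.dir",
                    "/user/" + conf.get("user.name") + "/"
                    + conf.get("examples.root") + "/input-data/text");
            conf.put("mapred.output.dir",
                    "/user/" + conf.get("user.name") + "/"
                    + conf.get("examples.root") + "/output-data/"
                    + conf.get("output.dir.name"));
        }
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("user.name", "jack");
        conf.put("examples.root", "oozie-apps");
        conf.put("output.dir.name", "map-reduce");
        new SampleConfigurator().configure(conf);
        // prints /user/jack/oozie-apps/output-data/map-reduce
        System.out.println(conf.get("mapred.output.dir"));
    }
}
```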

Running job-with-config-class.properties

  $ # upload the updated job-with-config-class.properties
  $ ~/Documents/hadoop/bin/hadoop fs -put -f oozie-apps/map-reduce/job-with-config-class.properties oozie-apps/map-reduce/
  $ ~/Documents/hadoop/bin/hadoop fs -ls oozie-apps/map-reduce
  Found 5 items
  -rw-r--r-- 1 jack supergroup 1028 2020-04-21 18:14 oozie-apps/map-reduce/job-with-config-class.properties
  -rw-r--r-- 1 jack supergroup 1019 2020-04-21 05:59 oozie-apps/map-reduce/job.properties
  drwxr-xr-x - jack supergroup 0 2020-04-21 05:59 oozie-apps/map-reduce/lib
  -rw-r--r-- 1 jack supergroup 2274 2020-04-21 05:59 oozie-apps/map-reduce/workflow-with-config-class.xml
  -rw-r--r-- 1 jack supergroup 2559 2020-04-21 05:59 oozie-apps/map-reduce/workflow.xml
  $ # run the job
  $ export OOZIE_URL="http://192.168.32.130:11000/oozie"
  $ bin/oozie job -config oozie-apps/map-reduce/job-with-config-class.properties -run
  job: 0000008-200420185350972-oozie-jack-W
  $ bin/oozie job -info 0000008-200420185350972-oozie-jack-W
  Job ID : 0000008-200420185350972-oozie-jack-W
  ------------------------------------------------------------------------------------------------------------------------------------
  Workflow Name : map-reduce-wf
  App Path : hdfs://192.168.32.130:8020/user/jack/oozie-apps/map-reduce/workflow-with-config-class.xml
  Status : RUNNING
  Run : 0
  User : jack
  Group : -
  Created : 2020-04-22 01:18 GMT
  Started : 2020-04-22 01:18 GMT
  Last Modified : 2020-04-22 01:18 GMT
  Ended : -
  CoordAction ID: -
  Actions
  ------------------------------------------------------------------------------------------------------------------------------------
  ID Status Ext ID Ext Status Err Code
  ------------------------------------------------------------------------------------------------------------------------------------
  0000008-200420185350972-oozie-jack-W@:start: OK - OK -
  ------------------------------------------------------------------------------------------------------------------------------------
  0000008-200420185350972-oozie-jack-W@mr-node RUNNING job_1586921478592_0031 RUNNING -
  ------------------------------------------------------------------------------------------------------------------------------------
  $ bin/oozie job -info 0000008-200420185350972-oozie-jack-W
  Job ID : 0000008-200420185350972-oozie-jack-W
  ------------------------------------------------------------------------------------------------------------------------------------
  Workflow Name : map-reduce-wf
  App Path : hdfs://192.168.32.130:8020/user/jack/oozie-apps/map-reduce/workflow-with-config-class.xml
  Status : SUCCEEDED
  Run : 0
  User : jack
  Group : -
  Created : 2020-04-22 01:18 GMT
  Started : 2020-04-22 01:18 GMT
  Last Modified : 2020-04-22 01:19 GMT
  Ended : 2020-04-22 01:19 GMT
  CoordAction ID: -
  Actions
  ------------------------------------------------------------------------------------------------------------------------------------
  ID Status Ext ID Ext Status Err Code
  ------------------------------------------------------------------------------------------------------------------------------------
  0000008-200420185350972-oozie-jack-W@:start: OK - OK -
  ------------------------------------------------------------------------------------------------------------------------------------
  0000008-200420185350972-oozie-jack-W@mr-node OK job_1586921478592_0031 SUCCEEDED -
  ------------------------------------------------------------------------------------------------------------------------------------
  0000008-200420185350972-oozie-jack-W@end OK - OK -
  ------------------------------------------------------------------------------------------------------------------------------------
  $ ~/Documents/hadoop/bin/hadoop fs -ls oozie-apps/output-data/map-reduce
  Found 2 items
  -rw-r--r-- 1 jack supergroup 0 2020-04-21 18:18 oozie-apps/output-data/map-reduce/_SUCCESS
  -rw-r--r-- 1 jack supergroup 1547 2020-04-21 18:18 oozie-apps/output-data/map-reduce/part-00000
  $ # the map output is written as-is; the reducer does no extra processing
  $ ~/Documents/hadoop/bin/hadoop fs -cat oozie-apps/output-data/map-reduce/*
  0 To be or not to be, that is the question;
  42 Whether 'tis nobler in the mind to suffer
  84 The slings and arrows of outrageous fortune,
  129 Or to take arms against a sea of troubles,
  172 And by opposing, end them. To die, to sleep;
  217 No more; and by a sleep to say we end
  255 The heart-ache and the thousand natural shocks
  302 That flesh is heir to ? 'tis a consummation
  346 Devoutly to be wish'd. To die, to sleep;
  387 To sleep, perchance to dream. Ay, there's the rub,
  438 For in that sleep of death what dreams may come,
  487 When we have shuffled off this mortal coil,
  531 Must give us pause. There's the respect
  571 That makes calamity of so long life,
  608 For who would bear the whips and scorns of time,
  657 Th'oppressor's wrong, the proud man's contumely,
  706 The pangs of despised love, the law's delay,
  751 The insolence of office, and the spurns
  791 That patient merit of th'unworthy takes,
  832 When he himself might his quietus make
  871 With a bare bodkin? who would fardels bear,
  915 To grunt and sweat under a weary life,
  954 But that the dread of something after death,
  999 The undiscovered country from whose bourn
  1041 No traveller returns, puzzles the will,
  1081 And makes us rather bear those ills we have
  1125 Than fly to others that we know not of?
  1165 Thus conscience does make cowards of us all,
  1210 And thus the native hue of resolution
  1248 Is sicklied o'er with the pale cast of thought,
  1296 And enterprises of great pitch and moment
  1338 With this regard their currents turn awry,
  1381 And lose the name of action.

WordCount example

Copy hadoop/share/hadoop/mapreduce2/hadoop-mapreduce-examples-2.5.0-cdh5.3.6.jar into the lib directory, edit job.properties and workflow.xml accordingly, then run the job.

workflow.xml

  <workflow-app xmlns="uri:oozie:workflow:0.2" name="map-reduce-wf">
      <start to="mr-node"/>
      <action name="mr-node">
          <map-reduce>
              <job-tracker>${jobTracker}</job-tracker>
              <name-node>${nameNode}</name-node>
              <prepare>
                  <delete path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data/${outputDir}"/>
              </prepare>
              <configuration>
                  <property>
                      <name>mapred.job.queue.name</name>
                      <value>${queueName}</value>
                  </property>
                  <property>
                      <name>mapred.mapper.class</name>
                      <value>org.apache.hadoop.examples.WordCount$TokenizerMapper</value>
                  </property>
                  <property>
                      <name>mapred.reducer.class</name>
                      <value>org.apache.hadoop.examples.WordCount$IntSumReducer</value>
                  </property>
                  <property>
                      <name>mapred.map.tasks</name>
                      <value>1</value>
                  </property>
                  <property>
                      <name>mapred.input.dir</name>
                      <value>/user/${wf:user()}/${examplesRoot}/input-data/text</value>
                  </property>
                  <property>
                      <name>mapred.output.dir</name>
                      <value>/user/${wf:user()}/${examplesRoot}/output-data/${outputDir}</value>
                  </property>
              </configuration>
          </map-reduce>
          <ok to="end"/>
          <error to="fail"/>
      </action>
      <kill name="fail">
          <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
      </kill>
      <end name="end"/>
  </workflow-app>

TokenizerMapper and IntSumReducer are nested classes of WordCount. After compilation they are stored in the jar as org.apache.hadoop.examples.WordCount$TokenizerMapper.class and org.apache.hadoop.examples.WordCount$IntSumReducer.class; the $ separates the outer class from its nested class in the compiled (binary) class name, which is why the workflow properties reference them this way.
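The $ naming can be confirmed with a tiny self-contained example (the class names here are illustrative, not the Hadoop ones): a nested class's binary name, as reported by Class.getName(), uses the Outer$Inner form that must appear in mapred.mapper.class.

```java
public class InnerNameDemo {
    // Nested classes compile into separate class files named Outer$Inner.class,
    // and Class.getName() reports the same binary name.
    static class TokenizerMapper {
    }

    public static void main(String[] args) {
        // prints InnerNameDemo$TokenizerMapper
        System.out.println(TokenizerMapper.class.getName());
    }
}
```

This mirrors how WordCount's nested mapper and reducer appear inside hadoop-mapreduce-examples-2.5.0-cdh5.3.6.jar.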