
Importing data into HBase, part 3: reading data from HDFS or a local file with MapReduce and writing it to HBase (adding a Reduce phase for batched inserts)

In the previous posts in this series we covered importing data into HBase with a map-only MapReduce job.

To improve insert efficiency, this post adds a reduce phase on top of that map-only approach. The idea: the mapper emits each record keyed by its row key, the shuffle groups records with the same row key into the same reduce() call, and the reducer builds a single Put object for the whole group, so all columns for one row are inserted in one batch.

The test data is as follows (eleven lines, as echoed by the map tasks in the run log further down):
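1080x1920|Vivo X9|a8622b8a26ae679f0be82135f6529902|中國|湖南|益陽|wifi|2017-12-31 11:55:58
1080x1920|Vivo X9|1c4e5de968280d746667df16ea851fbc|中國|安徽|池州|wifi|2017-12-31 10:15:14
1080x1812|HUAWEI NEM-AL10|1c7fbee71125ff575dc8f8eee233e359|中國|貴州|貴陽|wifi|2017-12-31 10:14:22
720x1280|Vivo Y55A|ecaa9c2588bb551e666861b1bf9352fd|中國|江西|unknown|wifi|2017-12-31 20:13:40
1080x1776|HUAWEI KIW-CL00|a70132d9e71afe2842cb140c70cf8b92|中國|江蘇|南京|ctnet|2017-12-31 09:36:38
720x1280|Meizu M578CA|4f77fe8ebb9cdc08c5757e17de3d3dc0|中國|江西|南昌|ctnet|2017-12-31 15:24:41
1920x1080|OPPO R9sk|934263fff13ab52b523dcae794f2c09b|中國|遼寧|unknown|ctnet|2017-12-31 06:41:48
1184x720|HUAWEI JMM-AL10|511dccb14d1d28420940a4d7c1d781f0|中國|山東|威海|wifi|2017-12-31 14:37:33
1080x1920|Vivo V3Max A|c52869a5a84136aeaf8cd452fe7f0d44|中國|江蘇|南京|wifi|2017-12-31 09:03:44
1920x1080|OPPO A77|5a13b7c9cf5fd5057bd9af9506caef2e|中國|浙江|unknown|ctnet|2017-12-31 12:06:14
1080x1812|HUAWEI NEM-AL10|1c7fbee71125ff575dc8f8eee233e359|中國|貴州|貴陽|wifi|2017-12-31 00:00:00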


Note that two of the records are similar: they share the same user ID (1c7fbee71125ff575dc8f8eee233e359) and the same date (2017-12-31), so they will produce the same row key.

package cn.edu.shu.ces.chenjie.tianyi.hadoop;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;


/**
 * Import data into HBase with MapReduce, using a reduce phase to batch
 * all records sharing a row key into a single Put.
 * @author chenjie
 *
 */
public class HadoopConnectTest4
{
	public static class HBaseHFileMapper extends Mapper<LongWritable, Text, Text, Text>
	{
		@Override
		protected void map(LongWritable key, Text value, Context context)
				throws IOException, InterruptedException
		{
			// Convert the Text input to a String, e.g.:
			// 1080x1920|Vivo X9|a8622b8a26ae679f0be82135f6529902|中國|湖南|益陽|wifi|2017-12-31 11:55:58
			String value_str = value.toString();
			System.out.println("----map----" + value_str);
			// Split on "|"
			String[] values = value_str.split("\\|");
			// Filter out records that do not have the expected number of fields
			if (values.length < 8)
				return;
			// Extract the user ID, e.g. a8622b8a26ae679f0be82135f6529902
			String userID = values[2];
			// Extract the timestamp, e.g. 2017-12-31 11:55:58
			String time = values[7];
			// Take the date part, e.g. 2017-12-31
			String ymd = time.split(" ")[0];
			// Use "userID-date" as the HBase row key
			String rowkey = userID + "-" + ymd;
			context.write(new Text(rowkey), new Text(value_str));
		}
	}

	/**
	 * Note that this extends TableReducer, so its output is written directly to HBase.
	 */
	public static class HBaseHFileReducer2 extends TableReducer<Text, Text, NullWritable>
	{
		@Override
		protected void reduce(Text key, Iterable<Text> texts, Context context) throws IOException, InterruptedException
		{
			String rowkey = key.toString();
			System.out.println("--reduce:" + rowkey);
			// Build one Put object for this row key
			Put p1 = new Put(Bytes.toBytes(rowkey));
			for (Text value : texts)
			{
				String value_str = value.toString();
				System.out.println("\t" + value_str);
				// Split on "|"
				String[] values = value_str.split("\\|");
				// Skip records that do not have the expected number of fields
				if (values.length < 8)
					continue;
				// Extract the timestamp, e.g. 2017-12-31 11:55:58
				String time = values[7];
				// Take the time-of-day part and append ":000", e.g. 11:55:58:000
				String hms = time.split(" ")[1] + ":000";
				// Add one column per record: family "d", qualifier = time of day, value = the raw line
				p1.addColumn("d".getBytes(), hms.getBytes(), value_str.getBytes());
			}
			// One write per row key: all records sharing the row key go out in a single Put
			context.write(NullWritable.get(), p1);
		}
	}
	
	private static final String HDFS = "hdfs://192.168.1.112:9000";// HDFS URI
	private static final String INPATH = HDFS + "/tmp/clientdata10_3.txt";// input file path

	public int run() throws IOException, ClassNotFoundException, InterruptedException
	{
		// The Configuration object holds all settings for the job
		Configuration conf = HBaseConfiguration.create();
		// ZooKeeper quorum used by the HBase client
		conf.set("hbase.zookeeper.quorum", "pc2:2181,pc3:2181,pc4:2181");
		// HBase root directory on HDFS
		conf.set("hbase.rootdir", "hdfs://pc2:9000/hbase");
		conf.set("zookeeper.znode.parent", "/hbase");

		// Create a new job
		Job job = Job.getInstance(conf, "HFile bulk load test");
		// Set the driver class
		job.setJarByClass(HadoopConnectTest4.class);
		// Set the mapper; it emits Text keys (row keys) and Text values (raw lines)
		job.setMapperClass(HBaseHFileMapper.class);
		job.setMapOutputKeyClass(Text.class);
		job.setMapOutputValueClass(Text.class);
		// TableMapReduceUtil is an HBase helper class that wires up all the
		// configuration a MapReduce job needs in order to write to HBase:
		// here the output table is clientdata_test5 and the reducer is HBaseHFileReducer2
		TableMapReduceUtil.initTableReducerJob("clientdata_test5", HBaseHFileReducer2.class, job);
		// Unlike the map-only version, this job runs one reduce task so that
		// records sharing a row key are grouped into a single Put
		job.setNumReduceTasks(1);
		// Set the input file path
		FileInputFormat.addInputPath(job, new Path(INPATH));
		return (job.waitForCompletion(true) ? 0 : -1);
	}
}
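The listing above only defines run(); the entry point is not shown. A minimal main() inside HadoopConnectTest4, in the style of the earlier posts in this series, might look like the sketch below (the exact exception handling is an assumption, not part of the original listing). Note that the target table clientdata_test5 with column family d must already exist in HBase before the job runs.

	// Sketch of a driver entry point (not in the original listing):
	// run the job and exit with its status code.
	public static void main(String[] args) throws Exception
	{
		int status = new HadoopConnectTest4().run();
		System.exit(status);
	}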

Run output:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hadooplibs/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2018-03-16 22:43:42,360 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(1129)) - session.id is deprecated. Instead, use dfs.metrics.session-id
2018-03-16 22:43:42,366 INFO  [main] jvm.JvmMetrics (JvmMetrics.java:init(76)) - Initializing JVM Metrics with processName=JobTracker, sessionId=
2018-03-16 22:43:42,437 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(1129)) - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2018-03-16 22:43:42,717 WARN  [main] mapreduce.JobResourceUploader (JobResourceUploader.java:uploadFiles(64)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2018-03-16 22:43:42,729 WARN  [main] mapreduce.JobResourceUploader (JobResourceUploader.java:uploadFiles(171)) - No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
2018-03-16 22:43:44,094 INFO  [main] input.FileInputFormat (FileInputFormat.java:listStatus(281)) - Total input paths to process : 1
2018-03-16 22:43:44,181 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(199)) - number of splits:1
2018-03-16 22:43:44,192 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(1129)) - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2018-03-16 22:43:44,355 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(288)) - Submitting tokens for job: job_local597129759_0001
2018-03-16 22:43:46,410 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/zookeeper-3.4.6.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424485/zookeeper-3.4.6.jar
2018-03-16 22:43:46,411 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hadooplibs/hadoop-mapreduce-client-core-2.6.5.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424486/hadoop-mapreduce-client-core-2.6.5.jar
2018-03-16 22:43:46,412 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/hbase-client-1.2.6.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424487/hbase-client-1.2.6.jar
2018-03-16 22:43:46,413 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hadooplibs/protobuf-java-2.5.0.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424488/protobuf-java-2.5.0.jar
2018-03-16 22:43:46,416 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/htrace-core-3.1.0-incubating.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424489/htrace-core-3.1.0-incubating.jar
2018-03-16 22:43:46,518 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/metrics-core-2.2.0.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424490/metrics-core-2.2.0.jar
2018-03-16 22:43:46,687 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hadooplibs/hadoop-common-2.6.5.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424491/hadoop-common-2.6.5.jar
2018-03-16 22:43:46,691 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/hbase-common-1.2.6.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424492/hbase-common-1.2.6.jar
2018-03-16 22:43:46,692 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/hbase-hadoop-compat-1.2.6.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424493/hbase-hadoop-compat-1.2.6.jar
2018-03-16 22:43:46,694 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/netty-all-4.0.23.Final.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424494/netty-all-4.0.23.Final.jar
2018-03-16 22:43:46,696 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/hbase-prefix-tree-1.2.6.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424495/hbase-prefix-tree-1.2.6.jar
2018-03-16 22:43:46,698 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/hbase-protocol-1.2.6.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424496/hbase-protocol-1.2.6.jar
2018-03-16 22:43:46,707 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hbaselibs/hbase-server-1.2.6.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424497/hbase-server-1.2.6.jar
2018-03-16 22:43:46,708 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:setup(165)) - Localized file:/D:/SourceCode/MyEclipse2014Workspace/HDFS2HBaseByHadoopMR/hadooplibs/guava-11.0.2.jar as file:/tmp/hadoop-Administrator/mapred/local/1521211424498/guava-11.0.2.jar
2018-03-16 22:43:46,795 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424485/zookeeper-3.4.6.jar
2018-03-16 22:43:46,796 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424486/hadoop-mapreduce-client-core-2.6.5.jar
2018-03-16 22:43:46,796 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424487/hbase-client-1.2.6.jar
2018-03-16 22:43:46,796 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424488/protobuf-java-2.5.0.jar
2018-03-16 22:43:46,796 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424489/htrace-core-3.1.0-incubating.jar
2018-03-16 22:43:46,796 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424490/metrics-core-2.2.0.jar
2018-03-16 22:43:46,797 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424491/hadoop-common-2.6.5.jar
2018-03-16 22:43:46,797 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424492/hbase-common-1.2.6.jar
2018-03-16 22:43:46,797 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424493/hbase-hadoop-compat-1.2.6.jar
2018-03-16 22:43:46,799 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424494/netty-all-4.0.23.Final.jar
2018-03-16 22:43:46,799 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424495/hbase-prefix-tree-1.2.6.jar
2018-03-16 22:43:46,799 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424496/hbase-protocol-1.2.6.jar
2018-03-16 22:43:46,799 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424497/hbase-server-1.2.6.jar
2018-03-16 22:43:46,800 INFO  [main] mapred.LocalDistributedCacheManager (LocalDistributedCacheManager.java:makeClassLoader(234)) - file:/D:/tmp/hadoop-Administrator/mapred/local/1521211424498/guava-11.0.2.jar
2018-03-16 22:43:46,807 INFO  [main] mapreduce.Job (Job.java:submit(1301)) - The url to track the job: http://localhost:8080/
2018-03-16 22:43:46,808 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1346)) - Running job: job_local597129759_0001
2018-03-16 22:43:46,818 INFO  [Thread-24] mapred.LocalJobRunner (LocalJobRunner.java:createOutputCommitter(471)) - OutputCommitter set in config null
2018-03-16 22:43:46,896 INFO  [Thread-24] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(1129)) - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2018-03-16 22:43:46,898 INFO  [Thread-24] mapred.LocalJobRunner (LocalJobRunner.java:createOutputCommitter(489)) - OutputCommitter is org.apache.hadoop.hbase.mapreduce.TableOutputCommitter
2018-03-16 22:43:46,984 INFO  [Thread-24] mapred.LocalJobRunner (LocalJobRunner.java:runTasks(448)) - Waiting for map tasks
2018-03-16 22:43:46,987 INFO  [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner (LocalJobRunner.java:run(224)) - Starting task: attempt_local597129759_0001_m_000000_0
2018-03-16 22:43:47,079 INFO  [LocalJobRunner Map Task Executor #0] util.ProcfsBasedProcessTree (ProcfsBasedProcessTree.java:isAvailable(181)) - ProcfsBasedProcessTree currently is supported only on Linux.
2018-03-16 22:43:47,172 INFO  [LocalJobRunner Map Task Executor #0] mapred.Task (Task.java:initialize(587)) -  Using ResourceCalculatorProcessTree : 
[email protected]
2018-03-16 22:43:47,181 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:runNewMapper(753)) - Processing split: hdfs://192.168.1.112:9000/tmp/clientdata10_3.txt:0+1130
2018-03-16 22:43:47,237 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:setEquator(1202)) - (EQUATOR) 0 kvi 26214396(104857584)
2018-03-16 22:43:47,237 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(995)) - mapreduce.task.io.sort.mb: 100
2018-03-16 22:43:47,237 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(996)) - soft limit at 83886080
2018-03-16 22:43:47,238 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(997)) - bufstart = 0; bufvoid = 104857600
2018-03-16 22:43:47,238 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:init(998)) - kvstart = 26214396; length = 6553600
2018-03-16 22:43:47,241 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:createSortingCollector(402)) - Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2018-03-16 22:43:47,811 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1367)) - Job job_local597129759_0001 running in uber mode : false
2018-03-16 22:43:47,813 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1374)) - map 0% reduce 0%
2018-03-16 22:43:48,365 INFO  [LocalJobRunner Map Task Executor #0] input.LineRecordReader (LineRecordReader.java:skipUtfByteOrderMark(157)) - Found UTF-8 BOM and skipped it
----map----1080x1920|Vivo X9|a8622b8a26ae679f0be82135f6529902|中國|湖南|益陽|wifi|2017-12-31 11:55:58
----map----1080x1920|Vivo X9|1c4e5de968280d746667df16ea851fbc|中國|安徽|池州|wifi|2017-12-31 10:15:14
----map----1080x1812|HUAWEI NEM-AL10|1c7fbee71125ff575dc8f8eee233e359|中國|貴州|貴陽|wifi|2017-12-31 10:14:22
----map----720x1280|Vivo Y55A|ecaa9c2588bb551e666861b1bf9352fd|中國|江西|unknown|wifi|2017-12-31 20:13:40
----map----1080x1776|HUAWEI KIW-CL00|a70132d9e71afe2842cb140c70cf8b92|中國|江蘇|南京|ctnet|2017-12-31 09:36:38
----map----720x1280|Meizu M578CA|4f77fe8ebb9cdc08c5757e17de3d3dc0|中國|江西|南昌|ctnet|2017-12-31 15:24:41
----map----1920x1080|OPPO R9sk|934263fff13ab52b523dcae794f2c09b|中國|遼寧|unknown|ctnet|2017-12-31 06:41:48
----map----1184x720|HUAWEI JMM-AL10|511dccb14d1d28420940a4d7c1d781f0|中國|山東|威海|wifi|2017-12-31 14:37:33
----map----1080x1920|Vivo V3Max A|c52869a5a84136aeaf8cd452fe7f0d44|中國|江蘇|南京|wifi|2017-12-31 09:03:44
----map----1920x1080|OPPO A77|5a13b7c9cf5fd5057bd9af9506caef2e|中國|浙江|unknown|ctnet|2017-12-31 12:06:14
----map----1080x1812|HUAWEI NEM-AL10|1c7fbee71125ff575dc8f8eee233e359|中國|貴州|貴陽|wifi|2017-12-31 00:00:00
2018-03-16 22:43:48,372 INFO  [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner (LocalJobRunner.java:statusUpdate(591)) -
2018-03-16 22:43:48,375 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:flush(1457)) - Starting flush of map output
2018-03-16 22:43:48,375 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:flush(1475)) - Spilling map output
2018-03-16 22:43:48,375 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:flush(1476)) - bufstart = 0; bufend = 1602; bufvoid = 104857600
2018-03-16 22:43:48,375 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:flush(1478)) - kvstart = 26214396(104857584); kvend = 26214356(104857424); length = 41/6553600
2018-03-16 22:43:48,391 INFO  [LocalJobRunner Map Task Executor #0] mapred.MapTask (MapTask.java:sortAndSpill(1660)) - Finished spill 0
2018-03-16 22:43:48,402 INFO  [LocalJobRunner Map Task Executor #0] mapred.Task (Task.java:done(1001)) - Task:attempt_local597129759_0001_m_000000_0 is done. And is in the process of committing
2018-03-16 22:43:48,413 INFO  [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner (LocalJobRunner.java:statusUpdate(591)) - map
2018-03-16 22:43:48,413 INFO  [LocalJobRunner Map Task Executor #0] mapred.Task (Task.java:sendDone(1121)) - Task 'attempt_local597129759_0001_m_000000_0' done.
2018-03-16 22:43:48,413 INFO  [LocalJobRunner Map Task Executor #0] mapred.LocalJobRunner (LocalJobRunner.java:run(249)) - Finishing task: attempt_local597129759_0001_m_000000_0
2018-03-16 22:43:48,413 INFO  [Thread-24] mapred.LocalJobRunner (LocalJobRunner.java:runTasks(456)) - map task executor complete.
2018-03-16 22:43:48,416 INFO  [Thread-24] mapred.LocalJobRunner (LocalJobRunner.java:runTasks(448)) - Waiting for reduce tasks
2018-03-16 22:43:48,416 INFO  [pool-7-thread-1] mapred.LocalJobRunner (LocalJobRunner.java:run(302)) - Starting task: attempt_local597129759_0001_r_000000_0
2018-03-16 22:43:48,443 INFO  [pool-7-thread-1] util.ProcfsBasedProcessTree (ProcfsBasedProcessTree.java:isAvailable(181)) - ProcfsBasedProcessTree currently is supported only on Linux.
2018-03-16 22:43:48,645 INFO  [pool-7-thread-1] mapred.Task (Task.java:initialize(587)) - Using ResourceCalculatorProcessTree :
[email protected]
2018-03-16 22:43:48,648 INFO  [pool-7-thread-1] mapred.ReduceTask (ReduceTask.java:run(362)) - Using ShuffleConsumerPlugin: [email protected]
2018-03-16 22:43:48,665 INFO  [pool-7-thread-1] reduce.MergeManagerImpl (MergeManagerImpl.java:<init>(197)) - MergerManager: memoryLimit=1316801664, maxSingleShuffleLimit=329200416, mergeThreshold=869089152, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2018-03-16 22:43:48,668 INFO  [EventFetcher for fetching Map Completion Events] reduce.EventFetcher (EventFetcher.java:run(61)) - attempt_local597129759_0001_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
2018-03-16 22:43:48,704 INFO  [localfetcher#1] reduce.LocalFetcher (LocalFetcher.java:copyMapOutput(141)) - localfetcher#1 about to shuffle output of map attempt_local597129759_0001_m_000000_0 decomp: 1626 len: 1630 to MEMORY
2018-03-16 22:43:48,714 INFO  [localfetcher#1] reduce.InMemoryMapOutput (InMemoryMapOutput.java:shuffle(100)) - Read 1626 bytes from map-output for attempt_local597129759_0001_m_000000_0
2018-03-16 22:43:48,717 INFO  [localfetcher#1] reduce.MergeManagerImpl (MergeManagerImpl.java:closeInMemoryFile(315)) - closeInMemoryFile -> map-output of size: 1626, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->1626
2018-03-16 22:43:48,720 INFO  [EventFetcher for fetching Map Completion Events] reduce.EventFetcher (EventFetcher.java:run(76)) - EventFetcher is interrupted.. Returning
2018-03-16 22:43:48,721 INFO  [pool-7-thread-1] mapred.LocalJobRunner (LocalJobRunner.java:statusUpdate(591)) - 1 / 1 copied.
2018-03-16 22:43:48,721 INFO  [pool-7-thread-1] reduce.MergeManagerImpl (MergeManagerImpl.java:finalMerge(687)) - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2018-03-16 22:43:48,739 INFO  [pool-7-thread-1] mapred.Merger (Merger.java:merge(597)) - Merging 1 sorted segments
2018-03-16 22:43:48,740 INFO  [pool-7-thread-1] mapred.Merger (Merger.java:merge(696)) - Down to the last merge-pass, with 1 segments left of total size: 1580 bytes
2018-03-16 22:43:48,746 INFO  [pool-7-thread-1] reduce.MergeManagerImpl (MergeManagerImpl.java:finalMerge(754)) - Merged 1 segments, 1626 bytes to disk to satisfy reduce memory limit
2018-03-16 22:43:48,747 INFO  [pool-7-thread-1] reduce.MergeManagerImpl (MergeManagerImpl.java:finalMerge(784)) - Merging 1 files, 1630 bytes from disk
2018-03-16 22:43:48,748 INFO  [pool-7-thread-1] reduce.MergeManagerImpl (MergeManagerImpl.java:finalMerge(799)) - Merging 0 segments, 0 bytes from memory into reduce
2018-03-16 22:43:48,748 INFO  [pool-7-thread-1] mapred.Merger (Merger.java:merge(597)) - Merging 1 sorted segments
2018-03-16 22:43:48,752 INFO  [pool-7-thread-1] mapred.Merger (Merger.java:merge(696)) - Down to the last merge-pass, with 1 segments left of total size: 1580 bytes
2018-03-16 22:43:48,756 INFO  [pool-7-thread-1] mapred.LocalJobRunner (LocalJobRunner.java:statusUpdate(591)) - 1 / 1 copied.
2018-03-16 22:43:48,816 INFO [main] mapreduce.Job (Job.java:monitorAndPrintJob(1374)) - map 100% reduce 0% 2018-03-16 22:43:49,922 INFO [pool-7-thread-1] zookeeper.RecoverableZooKeeper (RecoverableZooKeeper.java:<init>(120)) - Process identifier=hconnection-0x478f1f48 connecting to ZooKeeper ensemble=pc2:2181,pc3:2181,pc4:2181 2018-03-16 22:43:49,930 INFO [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT 2018-03-16 22:43:49,931 INFO [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:host.name=chen 2018-03-16 22:43:49,931 INFO [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.version=1.8.0_74 2018-03-16 22:43:49,931 INFO [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.vendor=Oracle Corporation 2018-03-16 22:43:49,931 INFO [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.home=D:\Java\jdk1.8.0_74\jre 2018-03-16 22:43:49,931 INFO [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.class.path=D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\bin;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\activation-1.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\aopalliance-1.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\apacheds-i18n-2.0.0-M15.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\apacheds-kerberos-codec-2.0.0-M15.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\api-asn1-api-1.0.0-M20.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\api-util-1.0.0-M20.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\asm-3.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\aspectjweaver-1.6.11.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\avro-1.7.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-beanutils-core-1.8.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-cli-1.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-codec-1.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-collections-3.2.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-compress-1.4.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-configuration-1.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-dbcp-1.2.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-digester-1.8.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-el-1.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-fileupload-1.2.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-httpclient-3.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-io-2.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-lang-2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-logging-1.1.3.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-math3-3.1.1.jar;D:\SourceCode\MyEclipse2014Work
space\HDFS2HBaseByHadoopMR\hadooplibs\commons-net-3.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\commons-pool-1.5.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\curator-client-2.6.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\curator-framework-2.6.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\curator-recipes-2.6.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\fastjson-1.1.41.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\gson-2.3.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\guava-11.0.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-annotations-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-auth-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-common-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-hdfs-2.6.5-tests.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-hdfs-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-hdfs-nfs-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-client-app-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-client-common-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-client-core-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-client-hs-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-client-hs-plugins-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-client-jobclient-2.6.5-tests.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-client-jobclient-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-client-shuffle-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-mapreduce-examples-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-nfs-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-api-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-applications-distributedshell-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-applications-unmanaged-am-launcher-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-client-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-common-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-registry-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-server-applicationhistoryservice-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-server-common-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-server-nodemanager-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-server-resourcemanager-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-server-tests-2.6.5.jar;D:\SourceCode\MyE
clipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hadoop-yarn-server-web-proxy-2.6.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\hamcrest-core-1.3.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\htrace-core-3.0.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\httpclient-4.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\httpcore-4.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jackson-core-asl-1.9.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jackson-jaxrs-1.9.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jackson-mapper-asl-1.9.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jackson-xc-1.9.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\java-xmlbuilder-0.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\javax.mail-1.5.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jaxb-api-2.2.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jaxb-impl-2.2.3-1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jedis-2.1.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jersey-core-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jersey-json-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jersey-server-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jets3t-0.9.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jettison-1.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jetty-6.1.26.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jetty-util-6.1.26.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jsch-0.1.42.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\jsr305-1.3.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\junit-4.11.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\libfb303-0.9.3.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\log4j-1.2.17.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\mockito-all-1.8.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\netty-3.6.2.Final.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\paranamer-2.3.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\protobuf-java-2.5.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\protostuff-api-1.1.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\protostuff-collectionschema-1.1.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\protostuff-core-1.1.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\protostuff-runtime-1.1.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\servlet-api-2.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\slf4j-api-1.7.7.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\slf4j-log4j12-1.7.7.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\snappy-java-1.0.4.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\stax-api-1.0-2.jar;D:\SourceCode\MyEclipse20
14Workspace\HDFS2HBaseByHadoopMR\hadooplibs\xmlenc-0.52.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hadooplibs\xz-1.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\activation-1.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\aopalliance-1.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\apacheds-i18n-2.0.0-M15.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\apacheds-kerberos-codec-2.0.0-M15.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\api-asn1-api-1.0.0-M20.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\api-util-1.0.0-M20.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\asm-3.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\avro-1.7.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-beanutils-1.7.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-beanutils-core-1.8.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-cli-1.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-codec-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-collections-3.2.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-compress-1.4.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-configuration-1.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-daemon-1.0.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-digester-1.8.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-el-1.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-httpclient-3.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-io-2.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-lang-2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-logging-1.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-math-2.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-math3-3.1.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\commons-net-3.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\disruptor-3.3.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\findbugs-annotations-1.3.9-1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\guava-12.0.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\guice-3.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\guice-servlet-3.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-annotations-1.2.6-tests.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-annotations-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-client-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-common-1.2.6-tests.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-common-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-examples-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-external-blockcache-1.2.6.j
ar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-hadoop-compat-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-hadoop2-compat-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-it-1.2.6-tests.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-it-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-prefix-tree-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-procedure-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-protocol-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-resource-bundle-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-rest-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-server-1.2.6-tests.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-server-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-shell-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\hbase-thrift-1.2.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\htrace-core-3.1.0-incubating.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\httpclient-4.2.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\httpcore-4.4.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jackson-core-asl-1.9.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jackson-jaxrs-1.9.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jackson-mapper-asl-1.9.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jackson-xc-1.9.13.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jamon-runtime-2.4.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jasper-compiler-5.5.23.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jasper-runtime-5.5.23.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\java-xmlbuilder-0.4.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\javax.inject-1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jaxb-api-2.2.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jaxb-impl-2.2.3-1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jcodings-1.0.8.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jersey-client-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jersey-core-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jersey-guice-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jersey-json-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jersey-server-1.9.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jets3t-0.9.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jettison-1.3.3.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jetty-6.1.26.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jetty-sslengine-6.1.26.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jetty-util-6.1.26.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\j
oni-2.1.2.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jruby-complete-1.6.8.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jsch-0.1.42.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jsp-2.1-6.1.14.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\jsp-api-2.1-6.1.14.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\junit-4.12.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\leveldbjni-all-1.8.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\libthrift-0.9.3.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\log4j-1.2.17.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\metrics-core-2.2.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\netty-all-4.0.23.Final.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\paranamer-2.3.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\protobuf-java-2.5.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\slf4j-api-1.7.7.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\slf4j-log4j12-1.7.5.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\snappy-java-1.0.4.1.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\spymemcached-2.11.6.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\xmlenc-0.52.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\xz-1.0.jar;D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR\hbaselibs\zookeeper-3.4.6.jar 2018-03-16 22:43:49,932 INFO [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.library.path=D:\Java\jdk1.8.0_74\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\Program Files (x86)\ARM\ADSv1_2\bin;D:\Python36-32\Scripts\;D:\Python36-32\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files (x86)\Intel\OpenCL SDK\2.0\bin\x86;C:\Program Files (x86)\Intel\OpenCL SDK\2.0\bin\x64;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;D:\Java\jdk1.8.0_74\bin;D:\Java\jdk1.8.0_74\jre\bin;D:\mysql-5.6.24-winx64\bin;C:\Program Files (x86)\GtkSharp\2.12\bin;C:\Program Files\OpenVPN\bin;C:\Program Files\Git\cmd;D:\scala-2.12.4\bin;D:\jython2.7.0;D:\jython2.7.0\bin;D:\Program Files\nodejs\;D:\apache-maven-3.5.0\bin;D:\hadoop-2.6.5\bin;D:\spark-1.6.3-bin-hadoop2.6\spark-1.6.3-bin-hadoop2.6\bin;C:\strawberry\c\bin;C:\strawberry\perl\bin;C:\Program Files\TortoiseSVN\bin;D:\ldxfiles\apache-ant-1.10.1-bin\apache-ant-1.10.1\bin;C:\Program Files (x86)\Microsoft Visual Studio\Common\Tools\WinNT;C:\Program Files (x86)\Microsoft Visual Studio\Common\MSDev98\Bin;C:\Program Files (x86)\Microsoft Visual Studio\Common\Tools;C:\Program Files (x86)\Microsoft Visual Studio\VC98\bin;C:\Users\Administrator\AppData\Roaming\npm;C:\Users\Administrator\AppData\Local\Programs\Fiddler;. 
2018-03-16 22:43:49,932 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.io.tmpdir=C:\Users\ADMINI~1\AppData\Local\Temp\
2018-03-16 22:43:49,932 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:java.compiler=<NA>
2018-03-16 22:43:49,932 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:os.name=Windows 7
2018-03-16 22:43:49,932 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:os.arch=amd64
2018-03-16 22:43:49,933 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:os.version=6.1
2018-03-16 22:43:49,933 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:user.name=Administrator
2018-03-16 22:43:49,933 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:user.home=C:\Users\Administrator
2018-03-16 22:43:49,933 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (Environment.java:logEnv(100)) - Client environment:user.dir=D:\SourceCode\MyEclipse2014Workspace\HDFS2HBaseByHadoopMR
2018-03-16 22:43:49,934 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (ZooKeeper.java:<init>(438)) - Initiating client connection, connectString=pc2:2181,pc3:2181,pc4:2181 sessionTimeout=90000 watcher=hconnection-0x478f1f480x0, quorum=pc2:2181,pc3:2181,pc4:2181, baseZNode=/hbase
2018-03-16 22:43:49,960 INFO  [pool-7-thread-1-SendThread(pc3:2181)] zookeeper.ClientCnxn (ClientCnxn.java:logStartConnect(975)) - Opening socket connection to server pc3/192.168.1.81:2181. Will not attempt to authenticate using SASL (unknown error)
2018-03-16 22:43:49,966 INFO  [pool-7-thread-1-SendThread(pc3:2181)] zookeeper.ClientCnxn (ClientCnxn.java:primeConnection(852)) - Socket connection established to pc3/192.168.1.81:2181, initiating session
2018-03-16 22:43:49,984 INFO  [pool-7-thread-1-SendThread(pc3:2181)] zookeeper.ClientCnxn (ClientCnxn.java:onConnected(1235)) - Session establishment complete on server pc3/192.168.1.81:2181, sessionid = 0x200001082220017, negotiated timeout = 40000
2018-03-16 22:43:50,110 INFO  [pool-7-thread-1] mapreduce.TableOutputFormat (TableOutputFormat.java:<init>(108)) - Created table instance for clientdata_test5
2018-03-16 22:43:50,110 INFO  [pool-7-thread-1] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(1129)) - mapred.skip.on is deprecated. Instead, use mapreduce.job.skiprecords
--reduce:1c4e5de968280d746667df16ea851fbc-2017-12-31
	1080x1920|Vivo X9|1c4e5de968280d746667df16ea851fbc|中國|安徽|池州|wifi|2017-12-31 10:15:14
--reduce:1c7fbee71125ff575dc8f8eee233e359-2017-12-31
	1080x1812|HUAWEI NEM-AL10|1c7fbee71125ff575dc8f8eee233e359|中國|貴州|貴陽|wifi|2017-12-31 00:00:00
	1080x1812|HUAWEI NEM-AL10|1c7fbee71125ff575dc8f8eee233e359|中國|貴州|貴陽|wifi|2017-12-31 10:14:22
--reduce:4f77fe8ebb9cdc08c5757e17de3d3dc0-2017-12-31
	720x1280|Meizu M578CA|4f77fe8ebb9cdc08c5757e17de3d3dc0|中國|江西|南昌|ctnet|2017-12-31 15:24:41
--reduce:511dccb14d1d28420940a4d7c1d781f0-2017-12-31
	1184x720|HUAWEI JMM-AL10|511dccb14d1d28420940a4d7c1d781f0|中國|山東|威海|wifi|2017-12-31 14:37:33
--reduce:5a13b7c9cf5fd5057bd9af9506caef2e-2017-12-31
	1920x1080|OPPO A77|5a13b7c9cf5fd5057bd9af9506caef2e|中國|浙江|unknown|ctnet|2017-12-31 12:06:14
--reduce:934263fff13ab52b523dcae794f2c09b-2017-12-31
	1920x1080|OPPO R9sk|934263fff13ab52b523dcae794f2c09b|中國|遼寧|unknown|ctnet|2017-12-31 06:41:48
--reduce:a70132d9e71afe2842cb140c70cf8b92-2017-12-31
	1080x1776|HUAWEI KIW-CL00|a70132d9e71afe2842cb140c70cf8b92|中國|江蘇|南京|ctnet|2017-12-31 09:36:38
--reduce:a8622b8a26ae679f0be82135f6529902-2017-12-31
	1080x1920|Vivo X9|a8622b8a26ae679f0be82135f6529902|中國|湖南|益陽|wifi|2017-12-31 11:55:58
--reduce:c52869a5a84136aeaf8cd452fe7f0d44-2017-12-31
	1080x1920|Vivo V3Max A|c52869a5a84136aeaf8cd452fe7f0d44|中國|江蘇|南京|wifi|2017-12-31 09:03:44
--reduce:ecaa9c2588bb551e666861b1bf9352fd-2017-12-31
	720x1280|Vivo Y55A|ecaa9c2588bb551e666861b1bf9352fd|中國|江西|unknown|wifi|2017-12-31 20:13:40
2018-03-16 22:43:50,637 INFO  [pool-7-thread-1] client.ConnectionManager$HConnectionImplementation (ConnectionManager.java:closeZooKeeperWatcher(1710)) - Closing zookeeper sessionid=0x200001082220017
2018-03-16 22:43:50,646 INFO  [pool-7-thread-1] zookeeper.ZooKeeper (ZooKeeper.java:close(684)) - Session: 0x200001082220017 closed
2018-03-16 22:43:50,646 INFO  [pool-7-thread-1-EventThread] zookeeper.ClientCnxn (ClientCnxn.java:run(512)) - EventThread shut down
2018-03-16 22:43:50,656 INFO  [pool-7-thread-1] mapred.Task (Task.java:done(1001)) - Task:attempt_local597129759_0001_r_000000_0 is done. And is in the process of committing
2018-03-16 22:43:50,658 INFO  [pool-7-thread-1] mapred.LocalJobRunner (LocalJobRunner.java:statusUpdate(591)) - reduce > reduce
2018-03-16 22:43:50,658 INFO  [pool-7-thread-1] mapred.Task (Task.java:sendDone(1121)) - Task 'attempt_local597129759_0001_r_000000_0' done.
2018-03-16 22:43:50,658 INFO  [pool-7-thread-1] mapred.LocalJobRunner (LocalJobRunner.java:run(325)) - Finishing task: attempt_local597129759_0001_r_000000_0
2018-03-16 22:43:50,660 INFO  [Thread-24] mapred.LocalJobRunner (LocalJobRunner.java:runTasks(456)) - reduce task executor complete.
2018-03-16 22:43:50,816 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1374)) - map 100% reduce 100%
2018-03-16 22:43:50,817 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1385)) - Job job_local597129759_0001 completed successfully
2018-03-16 22:43:50,861 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1392)) - Counters: 38
	File System Counters
		FILE: Number of bytes read=43542990
		FILE: Number of bytes written=44562580
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=2260
		HDFS: Number of bytes written=0
		HDFS: Number of read operations=6
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=0
	Map-Reduce Framework
		Map input records=11
		Map output records=11
		Map output bytes=1602
		Map output materialized bytes=1630
		Input split bytes=113
		Combine input records=0
		Combine output records=0
		Reduce input groups=10
		Reduce shuffle bytes=1630
		Reduce input records=11
		Reduce output records=10
		Spilled Records=22
		Shuffled Maps =1
		Failed Shuffles=0
		Merged Map outputs=1
		GC time elapsed (ms)=18
		CPU time spent (ms)=0
		Physical memory (bytes) snapshot=0
		Virtual memory (bytes) snapshot=0
		Total committed heap usage (bytes)=457703424
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=1130
	File Output Format Counters
		Bytes Written=0


Note that rows with the same row key were grouped together: the two records for user 1c7fbee71125ff575dc8f8eee233e359 on 2017-12-31 arrive in a single reduce() call and are written as one Put with two columns. The counters confirm this: 11 reduce input records collapse into 10 reduce input groups and 10 output records.
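To double-check the merge from the client side, a small read-back program can fetch that row and list its columns. The sketch below is not from the original post; it assumes the same cluster settings as the job above and uses the standard HBase 1.2 client API:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class VerifyMergedRow
{
	public static void main(String[] args) throws IOException
	{
		// Sketch: verify that the two duplicate-rowkey records landed in one HBase row
		Configuration conf = HBaseConfiguration.create();
		conf.set("hbase.zookeeper.quorum", "pc2:2181,pc3:2181,pc4:2181");
		try (Connection conn = ConnectionFactory.createConnection(conf);
				Table table = conn.getTable(TableName.valueOf("clientdata_test5")))
		{
			// The row key shared by the two similar input records
			Get get = new Get(Bytes.toBytes("1c7fbee71125ff575dc8f8eee233e359-2017-12-31"));
			Result result = table.get(get);
			// Expect two cells in family "d", one per original record
			for (Cell cell : result.rawCells())
			{
				System.out.println(Bytes.toString(CellUtil.cloneQualifier(cell))
						+ " => " + Bytes.toString(CellUtil.cloneValue(cell)));
			}
		}
	}
}

With the test data above, this should print two qualifiers, 00:00:00:000 and 10:14:22:000, each mapped to its original raw line.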


Large-scale data test:


