Mahout Classification: Study Notes and Problems Encountered


Studying Mahout recently has brought both joys and frustrations. First of all, thanks to my teacher Fan Zhe for his guidance. Below I lay out what I learned about Mahout classification this time and the problems I ran into; suggestions are very welcome. (All file operations here are performed on HDFS.)

(My environment is Mahout 0.9 + Hadoop 2.2.0.)

1. Convert the pre-classified files to SequenceFile storage:

The screenshots below show the 20newsgroups data I used (I work in Eclipse on Linux, with the eclipse-hadoop plugin installed):



I then wrote Java code to convert the categorized 20newsgroups files into SequenceFile storage, as shown below:
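The code screenshot has not survived here. As a minimal sketch of this step (assuming Mahout's bundled SequenceFilesFromDirectory driver rather than the exact code in the screenshot; the HDFS paths are illustrative):

```java
// Sketch only: convert a directory tree of category folders into
// SequenceFile<Text, Text> storage using Mahout's seqdirectory driver.
// The HDFS paths are illustrative; adjust them to your cluster.
import org.apache.mahout.text.SequenceFilesFromDirectory;

public class SeqDirectoryStep {
    public static void main(String[] args) throws Exception {
        SequenceFilesFromDirectory.main(new String[] {
            "-i", "hdfs://h1:9000/classifier/20news-bydate-train", // one subdir per category
            "-o", "hdfs://h1:9000/classifier/20news-seq",          // SequenceFile output
            "-ow"                                                  // overwrite old output
        });
    }
}
```

As far as I can tell, seqdirectory keys each document by its path relative to the input root (e.g. /sci.med/58064), which is the key layout the later label-extraction step expects.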


Running the program produces the following SequenceFiles:


A small program to inspect the contents of the SequenceFiles shows the following:
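The viewer program itself is also only a screenshot in the original; a self-contained sketch of such a viewer (the path is illustrative) could look like this:

```java
// Sketch: dump the (key, value) pairs of a SequenceFile on HDFS,
// instantiating the key/value classes recorded in the file header.
// Because the classes come from the header, the same program can later
// read the tfidf-vectors output as well.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.util.ReflectionUtils;

public class SeqFileDump {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path path = new Path("hdfs://h1:9000/classifier/20news-seq/chunk-0");
        try (SequenceFile.Reader reader =
                 new SequenceFile.Reader(conf, SequenceFile.Reader.file(path))) {
            Writable key = (Writable) ReflectionUtils.newInstance(
                reader.getKeyClass(), conf);
            Writable value = (Writable) ReflectionUtils.newInstance(
                reader.getValueClass(), conf);
            while (reader.next(key, value)) {
                System.out.println("key= " + key + "   value= " + value);
            }
        }
    }
}
```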


2. Convert the SequenceFiles into vector files:

Using the SequenceFile output from the previous step as input, convert the SequenceFiles into vector files. The Java conversion code is as follows:
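Again, only a screenshot of the conversion code existed in the original. A minimal sketch (assuming Mahout's bundled SparseVectorsFromSequenceFiles driver; paths and flags are illustrative):

```java
// Sketch: turn the SequenceFiles into TF-IDF vectors with Mahout's
// seq2sparse driver. Paths and tuning flags are illustrative.
import org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles;

public class Seq2SparseStep {
    public static void main(String[] args) throws Exception {
        SparseVectorsFromSequenceFiles.main(new String[] {
            "-i",  "hdfs://h1:9000/classifier/20news-seq",
            "-o",  "hdfs://h1:9000/classifier/20news-vectors",
            "-wt", "tfidf",  // TF-IDF weighting
            "-lnorm",        // log-normalize the vectors
            "-nv"            // named vectors: keep the document key as the vector name
        });
    }
}
```

The tfidf-vectors directory mentioned in this section is one of the outputs this driver writes under its -o path.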


The resulting vector files are shown below:


Inspecting tfidf-vectors/part-m-00000 with a Java program gives the following (partial) output:

key= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/sci.med/58064   value= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/sci.med/58064:{40949:4.404207229614258,58885:6.966500282287598,52884:1.404096007347107,54411:6.043336868286133,25074:16.67474365234375,20175:6.222922325134277,32290:7.155742168426514,57777:6.696209907531738,59820:5.825411796569824,53261:5.4359564781188965,15068:9.235183715820312,68017:4.232685565948486,47050:9.235183715820312,39193:8.724358558654785,56670:3.387782096862793,69370:1.5702117681503296,21521:7.7688469886779785,54393:5.597597599029541,62286:4.329909324645996,11644:7.128331661224365,12511:1.7634414434432983,17861:4.079967498779297,39895:7.338063716888428,55445:7.62574577331543,50765:9.235183715820312,41274:7.338063716888428,19358:7.155742168426514,19837:9.235183715820312,46237:9.235183715820312,24355:2.2430875301361084,52769:7.289273738861084,24217:1.9966870546340942,19749:3.0007731914520264,28651:7.114920139312744,24284:2.4492197036743164,61132:6.2394514083862305,39146:5.458599090576172,34340:7.289273738861084,34936:7.443424224853516,10911:6.129103660583496,12647:8.254354476928711,47554:1.0402182340621948,40788:7.338063716888428,10482:7.037959098815918,62155:3.3696606159210205,33813:1.230707049369812,24044:8.947502136230469,63344:5.73867654800415,68080:6.2394514083862305,17029:5.118860244750977,65110:7.075699806213379,32127:5.7694478034973145,30288:3.71773099899292,13537:3.087428092956543,11545:8.254354476928711,47708:7.935900688171387,47276:2.321446418762207,24045:8.947502136230469,56652:3.911620616912842,10107:12.653678894042969,10233:3.475231170654297,34068:5.321562767028809,58884:7.338063716888428,68165:1.6761456727981567,40591:9.235183715820312,47733:11.100005149841309,58624:4.493154525756836,50783:11.100005149841309,44628:6.572596073150635}
key= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/talk.politics.guns/54390   value= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/talk.politics.guns/54390:{25902:9.235183715820312,47590:4.066595554351807,48033:3.2437193393707275,67021:2.7126009464263916,24472:8.947502136230469,44782:1.9297716617584229,42373:5.015676021575928,24015:2.4575374126434326,11106:1.9394488334655762,42273:2.070721387863159,62394:3.628157138824463,61226:7.848889350891113,44453:2.7429440021514893,21501:15.389477729797363,32332:3.7448697090148926,64726:1.9358370304107666,48742:7.499622821807861,60003:15.995806694030762,50448:4.258450031280518,14327:3.194135904312134,50798:7.7688469886779785,47387:3.7190706729888916,47554:1.0402182340621948,11201:3.9916746616363525,53791:2.5926971435546875,19329:7.785928249359131,52953:2.958540439605713,13970:5.588863849639893,46327:8.387886047363281,58697:4.107259273529053,49227:15.995806694030762,18911:6.383354663848877,50439:3.510509967803955,9861:1.9852583408355713,56602:3.647935152053833,50458:1.7995492219924927,36905:5.748828887939453,66718:5.264892101287842,33813:2.7519447803497314,68017:2.9929606914520264,23442:3.458564043045044,1890:3.5897369384765625,44013:5.357062339782715,35455:3.657973051071167,65123:2.995558023452759,56080:7.561207294464111,40309:4.490251541137695,26572:8.628969192504883,23439:3.0159199237823486,27894:5.060796737670898,46052:1.8622281551361084,18320:5.84535026550293,17803:3.757326602935791,33291:1.8489196300506592}
key= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/comp.windows.x/67312   value= /hdfs://h1:9000/classifier/classifier-src4/20news-bydate-train/comp.windows.x/67312:{47554:1.0402182340621948,6258:3.1925511360168457,33291:1.8489196300506592,788:5.197997570037842,32485:4.425988674163818,53061:4.7731146812438965,68774:6.114288330078125,4487:4.123196125030518,65155:4.61021089553833,65670:5.0815229415893555,5285:10.784433364868164,50458:1.7995492219924927,35455:5.173154830932617,46052:1.8622281551361084,27311:8.298446655273438,8749:3.7985548973083496,26321:6.401970386505127,4587:13.633935928344727,1109:3.6212668418884277,34867:6.085300922393799,41201:3.171398639678955,25533:13.633935928344727}
3. Generate the training model:

Using the part-m-00000 file under tfidf-vectors generated above as input, the code to generate the training model is as follows:
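A minimal sketch of the training call (assuming Mahout's TrainNaiveBayesJob driver; paths are illustrative):

```java
// Sketch: train a naive Bayes model with Mahout's trainnb driver.
// -el asks the job to extract the label index from the vector keys,
// which is the step where the indexlabel problem described in this
// section originates.
import org.apache.mahout.classifier.naivebayes.training.TrainNaiveBayesJob;

public class TrainNbStep {
    public static void main(String[] args) throws Exception {
        TrainNaiveBayesJob.main(new String[] {
            "-i",  "hdfs://h1:9000/classifier/20news-vectors/tfidf-vectors",
            "-o",  "hdfs://h1:9000/classifier/model",
            "-el",                                          // extract labels from keys
            "-li", "hdfs://h1:9000/classifier/labelindex",  // label index output path
            "-ow"
        });
    }
}
```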


The results are shown below:

(1) The trained model


(2) The indexlabel file


Inspecting the contents of the indexlabel file shows that it is incorrect.

The indexlabel contents are as follows (clearly wrong, and I don't know why; I searched a lot online but nothing fixed the error):

key= hdfs:   value= 0

The fix I tried was to modify the Mahout source: in org.apache.mahout.classifier.naivebayes.BayesUtils.java, change toString())[1] to toString())[7], as shown below:

The reason for changing this array index is that BayesUtils.java splits the key on "/", and with my keys element [7] of the resulting array is exactly the category label that is needed.


Changed to:


Now the indexlabel file contents are correct (all 20 category labels appear), as shown below:
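To see why index [1] yields hdfs: while [7] yields the category, one can split one of the keys from the part-m-00000 dump above by hand (a self-contained check; no Mahout classes needed):

```java
// Demonstrates how splitting a full HDFS-URI key on "/" shifts the
// category label from index 1 (where BayesUtils expects it) to index 7.
public class LabelSplit {
    // Mimics BayesUtils' SLASH.split(...)[index] label extraction.
    static String labelAt(String key, int index) {
        return key.split("/")[index];
    }

    public static void main(String[] args) {
        String key = "/hdfs://h1:9000/classifier/classifier-src4/"
                   + "20news-bydate-train/sci.med/58064";
        // The leading "/" makes element 0 empty; the "//" after "hdfs:"
        // makes element 2 empty, pushing the category out to element 7.
        System.out.println(labelAt(key, 1)); // prints "hdfs:"
        System.out.println(labelAt(key, 7)); // prints "sci.med"
    }
}
```

With keys of the usual /category/docid form that seqdirectory produces, index [1] is already correct, so the root cause here is likely that my keys carry the full hdfs:// URI, not a bug in BayesUtils. Note also that the training mapper (IndexInstancesMapper) appears to do the same [1] split, which would explain the SKIPPED_INSTANCES=11314 counter in the job output below.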


But when running the model training, a new error appeared (the run is shown below):

2014-07-21 20:45:27,738 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
2014-07-21 20:45:27,738 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.compress.map.output is deprecated. Instead, use mapreduce.map.output.compress
2014-07-21 20:45:27,738 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
2014-07-21 20:45:27,837 INFO  [main] client.RMProxy (RMProxy.java:createRMProxy(56)) - Connecting to ResourceManager at h1/192.168.1.130:8032
2014-07-21 20:45:28,085 WARN  [main] mapreduce.JobSubmitter (JobSubmitter.java:copyAndConfigureFiles(149)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2014-07-21 20:45:28,963 INFO  [main] input.FileInputFormat (FileInputFormat.java:listStatus(287)) - Total input paths to process : 1
2014-07-21 20:45:29,025 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(394)) - number of splits:1
2014-07-21 20:45:29,036 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - user.name is deprecated. Instead, use mapreduce.job.user.name
2014-07-21 20:45:29,037 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2014-07-21 20:45:29,037 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.cache.files.filesizes is deprecated. Instead, use mapreduce.job.cache.files.filesizes
2014-07-21 20:45:29,037 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.cache.files is deprecated. Instead, use mapreduce.job.cache.files
2014-07-21 20:45:29,038 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
2014-07-21 20:45:29,038 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
2014-07-21 20:45:29,038 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.combine.class is deprecated. Instead, use mapreduce.job.combine.class
2014-07-21 20:45:29,045 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
2014-07-21 20:45:29,045 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.job.name is deprecated. Instead, use mapreduce.job.name
2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.reduce.class is deprecated. Instead, use mapreduce.job.reduce.class
2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.cache.files.timestamps is deprecated. Instead, use mapreduce.job.cache.files.timestamps
2014-07-21 20:45:29,046 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
2014-07-21 20:45:29,047 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
2014-07-21 20:45:29,047 INFO  [main] Configuration.deprecation (Configuration.java:warnOnceIfDeprecated(840)) - mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
2014-07-21 20:45:29,110 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(477)) - Submitting tokens for job: job_1405911825122_0013
2014-07-21 20:45:29,310 INFO  [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(174)) - Submitted application application_1405911825122_0013 to ResourceManager at h1/192.168.1.130:8032
2014-07-21 20:45:29,349 INFO  [main] mapreduce.Job (Job.java:submit(1272)) - The url to track the job: http://h1:8088/proxy/application_1405911825122_0013/
2014-07-21 20:45:29,349 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1317)) - Running job: job_1405911825122_0013
2014-07-21 20:45:34,926 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1338)) - Job job_1405911825122_0013 running in uber mode : false
2014-07-21 20:45:34,928 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 0% reduce 0%
2014-07-21 20:45:44,229 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 100% reduce 0%
2014-07-21 20:45:49,803 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 100% reduce 100%
2014-07-21 20:45:49,818 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1356)) - Job job_1405911825122_0013 completed successfully
2014-07-21 20:45:49,910 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1363)) - Counters: 44
    File System Counters
        FILE: Number of bytes read=680
        FILE: Number of bytes written=161999
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=18538518
        HDFS: Number of bytes written=97
        HDFS: Number of read operations=7
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Launched reduce tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=42768
        Total time spent by all reduces in occupied slots (ms)=2352
    Map-Reduce Framework
        Map input records=11314
        Map output records=0
        Map output bytes=0
        Map output materialized bytes=14
        Input split bytes=151
        Combine input records=0
        Combine output records=0
        Reduce input groups=0
        Reduce shuffle bytes=14
        Reduce input records=0
        Reduce output records=0
        Spilled Records=0
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=427
        CPU time spent (ms)=3470
        Physical memory (bytes) snapshot=270143488
        Virtual memory (bytes) snapshot=808521728
        Total committed heap usage (bytes)=201457664
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=18538367
    File Output Format Counters
        Bytes Written=97
    org.apache.mahout.classifier.naivebayes.training.IndexInstancesMapper$Counter
        SKIPPED_INSTANCES=11314
2014-07-21 20:45:49,934 INFO  [main] client.RMProxy (RMProxy.java:createRMProxy(56)) - Connecting to ResourceManager at h1/192.168.1.130:8032
2014-07-21 20:45:49,961 WARN  [main] mapreduce.JobSubmitter (JobSubmitter.java:copyAndConfigureFiles(149)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2014-07-21 20:45:50,218 INFO  [main] input.FileInputFormat (FileInputFormat.java:listStatus(287)) - Total input paths to process : 1
2014-07-21 20:45:50,245 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(394)) - number of splits:1
2014-07-21 20:45:50,853 INFO  [main] mapreduce.JobSubmitter (JobSubmitter.java:printTokens(477)) - Submitting tokens for job: job_1405911825122_0014
2014-07-21 20:45:50,905 INFO  [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(174)) - Submitted application application_1405911825122_0014 to ResourceManager at h1/192.168.1.130:8032
2014-07-21 20:45:50,908 INFO  [main] mapreduce.Job (Job.java:submit(1272)) - The url to track the job: http://h1:8088/proxy/application_1405911825122_0014/
2014-07-21 20:45:50,908 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1317)) - Running job: job_1405911825122_0014
2014-07-21 20:46:01,907 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1338)) - Job job_1405911825122_0014 running in uber mode : false
2014-07-21 20:46:01,907 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 0% reduce 0%
2014-07-21 20:46:06,258 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 100% reduce 0%
2014-07-21 20:46:11,815 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1345)) -  map 100% reduce 100%
2014-07-21 20:46:11,824 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1356)) - Job job_1405911825122_0014 completed successfully
2014-07-21 20:46:11,852 INFO  [main] mapreduce.Job (Job.java:monitorAndPrintJob(1363)) - Counters: 43
    File System Counters
        FILE: Number of bytes read=22
        FILE: Number of bytes written=162241
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=237
        HDFS: Number of bytes written=90
        HDFS: Number of read operations=7
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Launched reduce tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=12960
        Total time spent by all reduces in occupied slots (ms)=2214
    Map-Reduce Framework
        Map input records=0
        Map output records=0
        Map output bytes=0
        Map output materialized bytes=14
        Input split bytes=140
        Combine input records=0
        Combine output records=0
        Reduce input groups=0
        Reduce shuffle bytes=14
        Reduce input records=0
        Reduce output records=0
        Spilled Records=0
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=31
        CPU time spent (ms)=1390
        Physical memory (bytes) snapshot=300437504
        Virtual memory (bytes) snapshot=809414656
        Total committed heap usage (bytes)=225509376
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=97
    File Output Format Counters
        Bytes Written=90
Exception in thread "main" java.lang.NullPointerException
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:187)
    at org.apache.mahout.classifier.naivebayes.BayesUtils.readModelFromDir(BayesUtils.java:81)
    at org.apache.mahout.classifier.naivebayes.training.TrainNaiveBayesJob.run(TrainNaiveBayesJob.java:162)
    at com.redhadoop.trainnb.TrainnbTest.main(TrainnbTest.java:48)



The code at line 81 of BayesUtils.java referenced by the stack trace is as follows:



4. Test with the trained model

The Java test code is as follows:
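A minimal sketch of the test call (assuming Mahout's TestNaiveBayesDriver; paths are illustrative):

```java
// Sketch: evaluate the trained model with Mahout's testnb driver.
// Paths are illustrative; -l must point at the label index from training.
import org.apache.mahout.classifier.naivebayes.test.TestNaiveBayesDriver;

public class TestNbStep {
    public static void main(String[] args) throws Exception {
        TestNaiveBayesDriver.main(new String[] {
            "-i", "hdfs://h1:9000/classifier/20news-test-vectors",
            "-m", "hdfs://h1:9000/classifier/model",       // trained model directory
            "-l", "hdfs://h1:9000/classifier/labelindex",  // label index from trainnb
            "-o", "hdfs://h1:9000/classifier/testing",
            "-ow"
        });
    }
}
```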


The test results are clearly incorrect.



I have fought with this part for days and nights on end, but the error above (originally highlighted in red) is still unsolved. At a loss... hoping some expert can help o(╯□╰)o!



