Handling a Druid exception: Types.collectionOf(Ljava/lang/reflect/Type;)Ljava/lang/reflect/ParameterizedType

Copyright notice: This is the blogger's original article and may not be reproduced without permission. https://lidong.blog.csdn.net/article/details/83932759
2018-11-10T19:54:35,072 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Job wikiticker-determine_partitions_hashed-Optional.of([2015-09-12T00:00:00.000Z/2015-09-13T00:00:00.000Z]) submitted, status available at: http://stone.lan:8088/proxy/application_1541847157114_0006/
2018-11-10T19:54:35,073 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Running job: job_1541847157114_0006
2018-11-10T19:54:41,142 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Job job_1541847157114_0006 running in uber mode : false
2018-11-10T19:54:41,143 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job -  map 0% reduce 0%
2018-11-10T19:54:46,592 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1541847157114_0006_m_000000_0, Status : FAILED
Error: com.google.inject.util.Types.collectionOf(Ljava/lang/reflect/Type;)Ljava/lang/reflect/ParameterizedType;
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

2018-11-10T19:54:50,636 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1541847157114_0006_m_000000_1, Status : FAILED
Error: com.google.inject.util.Types.collectionOf(Ljava/lang/reflect/Type;)Ljava/lang/reflect/ParameterizedType;
2018-11-10T19:54:56,679 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Task Id : attempt_1541847157114_0006_m_000000_2, Status : FAILED
Error: com.google.inject.util.Types.collectionOf(Ljava/lang/reflect/Type;)Ljava/lang/reflect/ParameterizedType;
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

2018-11-10T19:55:02,714 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job -  map 100% reduce 100%
2018-11-10T19:55:03,728 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Job job_1541847157114_0006 failed with state FAILED due to: Task failed task_1541847157114_0006_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

2018-11-10T19:55:03,834 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Counters: 16
   Job Counters 
      Failed map tasks=4
      Killed reduce tasks=1
      Launched map tasks=4
      Other local map tasks=3
      Data-local map tasks=1
      Total time spent by all maps in occupied slots (ms)=13567
      Total time spent by all reduces in occupied slots (ms)=0
      Total time spent by all map tasks (ms)=13567
      Total time spent by all reduce tasks (ms)=0
      Total vcore-milliseconds taken by all map tasks=13567
      Total vcore-milliseconds taken by all reduce tasks=0
      Total megabyte-milliseconds taken by all map tasks=13892608
      Total megabyte-milliseconds taken by all reduce tasks=0
   Map-Reduce Framework
      CPU time spent (ms)=0
      Physical memory (bytes) snapshot=0
      Virtual memory (bytes) snapshot=0
2018-11-10T19:55:03,836 ERROR [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Job failed: job_1541847157114_0006
2018-11-10T19:55:03,836 INFO [task-runner-0-priority-0] io.druid.indexer.JobHelper - Deleting path[var/druid/hadoop-tmp/wikiticker/2018-11-10T115425.566Z_24d5a38f4b164508a3efbc594d03433e]
2018-11-10T19:55:03,884 ERROR [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[AbstractTask{id='index_hadoop_wikiticker_2018-11-10T11:54:25.566Z', groupId='index_hadoop_wikiticker_2018-11-10T11:54:25.566Z', taskResource=TaskResource{availabilityGroup='index_hadoop_wikiticker_2018-11-10T11:54:25.566Z', requiredCapacity=1}, dataSource='wikiticker', context={}}]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
   at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
   at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:222) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:184) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:444) [druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:416) [druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_171]
   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_171]
   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_171]
   at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]
Caused by: java.lang.reflect.InvocationTargetException
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_171]
   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_171]
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_171]
   at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_171]
   at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   ... 7 more
Caused by: io.druid.java.util.common.ISE: Job[class io.druid.indexer.DetermineHashedPartitionsJob] failed!
   at io.druid.indexer.JobHelper.runJobs(JobHelper.java:391) ~[druid-indexing-hadoop-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:91) ~[druid-indexing-hadoop-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:325) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_171]
   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_171]
   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_171]
   at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_171]
   at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:219) ~[druid-indexing-service-0.12.4-SNAPSHOT.jar:0.12.4-SNAPSHOT]
   ... 7 more
2018-11-10T19:55:03,895 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_wikiticker_2018-11-10T11:54:25.566Z] status changed to [FAILED].
2018-11-10T19:55:03,898 INFO [task-runner-0-priority-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_hadoop_wikiticker_2018-11-10T11:54:25.566Z",
  "status" : "FAILED",
  "duration" : 33977
}

Solution:

The `NoSuchMethodError` on `com.google.inject.util.Types.collectionOf` typically indicates a Guice version conflict: the Hadoop cluster puts an older Guice on the task classpath, which shadows the newer Guice that Druid requires. Adding `"mapreduce.job.user.classpath.first": "true"` to the task's `jobProperties` tells MapReduce to load the user (Druid) jars ahead of Hadoop's own, which resolves the error.

"jobProperties": {
  "mapreduce.job.user.classpath.first": "true"
}
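For context, `jobProperties` belongs under `tuningConfig` in the Hadoop index task spec. A minimal sketch of where it goes, with the `dataSchema` and input path abbreviated to illustrative values from the wikiticker quickstart (your actual spec will have more fields):

```json
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "wikiticker"
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "quickstart/wikiticker-2015-09-12-sampled.json.gz"
      }
    },
    "tuningConfig": {
      "type": "hadoop",
      "jobProperties": {
        "mapreduce.job.user.classpath.first": "true"
      }
    }
  }
}
```

After updating the spec, resubmit the task to the overlord (e.g. `POST` the JSON to `/druid/indexer/v1/task`) and the MapReduce attempts should no longer fail with the `NoSuchMethodError`.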

 

 
