Help-Desk / HELP-4251

FIWARE.Request.Tech.Data.BigData-Analysis.SharkError

    Details

    • Type: extRequest
    • Status: Closed
    • Priority: Major
    • Resolution: Done
    • Fix Version/s: 2021
    • Component/s: FIWARE-TECH-HELP
    • Labels:
      None

      Description

      To whom it may concern,
      from yesterday our applications have been getting the following error
      when trying to connect to Hive/Shark:

      > INFO: Getting Historical Load Data by Sector
      > Sep 21, 2015 10:16:31 AM eu.finesce.emarketplace.client.HiveClient getHiveConnection
      > SEVERE: HIVE Connection Error
      > java.sql.SQLException: Could not establish connection to 130.206.80.46:9999/default?user=FINESCE-WP4&password=******************: java.net.ConnectException: Connection refused
      > at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:117)
      > at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
      > at java.sql.DriverManager.getConnection(DriverManager.java:571)
      > at java.sql.DriverManager.getConnection(DriverManager.java:233)
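A hypothetical first check for such a failure (not part of the original report): a JDBC "Connection refused" means nothing accepted the TCP connection at all, so probing the Shark port separates a server-side outage from a client misconfiguration. The host/port below are taken from the error message; the helper itself is illustrative.

```shell
# Hypothetical diagnostic, not from the ticket: probe whether anything
# is listening on the Shark port before debugging the client.
port_open() {
  # $1 = host, $2 = port; relies on bash's /dev/tcp pseudo-device
  timeout 5 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

if port_open 130.206.80.46 9999; then
  echo "Shark port is reachable; suspect the client configuration"
else
  echo "nothing listening on TCP/9999; the Shark server is likely down"
fi
```

In this case the probe would have failed, matching the resolution below: the Shark server was simply down.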

      Just to give you more details: when we try to launch hive from the CLI,
      we get the following (although the CLI does start in the end):
      > login as: FINESCE-WP4
      > FINESCE-WP4@130.206.80.46's password:
      > Last login: Mon Sep 21 10:15:40 2015 from 89-97-237-254.ip19.fastwebnet.it
      > -bash-4.1$ hive
      > log4j:ERROR Could not instantiate class [org.apache.hadoop.hive.shims.HiveEventCounter].
      > java.lang.RuntimeException: Could not load shims in class org.apache.hadoop.log.metrics.EventCounter
      > at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:123)
      > at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:115)
      > at org.apache.hadoop.hive.shims.ShimLoader.getEventCounter(ShimLoader.java:98)
      > at org.apache.hadoop.hive.shims.HiveEventCounter.<init>(HiveEventCounter.java:34)
      > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
      > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
      > at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
      > at java.lang.Class.newInstance0(Class.java:357)
      > at java.lang.Class.newInstance(Class.java:310)
      > at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:330)
      > at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:121)
      > at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:664)
      > at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:647)
      > at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:544)
      > at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:440)
      > at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:476)
      > at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:354)
      > at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:127)
      > at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:77)
      > at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:58)
      > at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:641)
      > at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
      > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      > at java.lang.reflect.Method.invoke(Method.java:597)
      > at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
      > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.log.metrics.EventCounter
      > at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
      > at java.security.AccessController.doPrivileged(Native Method)
      > at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
      > at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
      > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
      > at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
      > at java.lang.Class.forName0(Native Method)
      > at java.lang.Class.forName(Class.java:171)
      > at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:120)
      > ... 27 more
      > log4j:ERROR Could not instantiate appender named "EventCounter".
      > Logging initialized using configuration in jar:file:/usr/local/apache-hive-0.13.0-bin/lib/hive-common-0.13.0.jar!/hive-log4j.properties

      Could someone from the COSMOS team help us sort the issue out, please?

      Regards,

      Massimiliano


      ===============================================================

      Massimiliano Nigrelli
      Direzione Ricerca e Innovazione - R&D Lab
      massimiliano.nigrelli@eng.it

      Engineering Ingegneria Informatica S.p.A.
      Viale Regione Siciliana, 7275 - 90146 Palermo (Italy)

      Phone: +39 091.75.11.847

      ============================================================

      _______________________________________________
      Fiware-lab-help mailing list
      Fiware-lab-help@lists.fiware.org
      https://lists.fiware.org/listinfo/fiware-lab-help-new
      [Created via e-mail received from: Massimiliano Nigrelli <massimiliano.nigrelli@eng.it>]

        Activity

        mev Manuel Escriche added a comment -

        Dear Massimiliano,

        This issue is assigned to Francisco, who is the COSMOS GEi owner, for resolution.

        Kind regards,
        Manuel

        fw.ext.user FW External User added a comment -

        Hi Massimiliano,

        Regarding the Shark server, it was down. Now it should be up and running.

        Regarding the Hive CLI issue, as explained to Dario some days ago, your user (FINESCE_WP4) should be using Hive 0.9.0 instead of Hive 0.13.0. The reason is that Hive 0.9.0 is the version used by Shark on port TCP/9999, the service you prefer over the default HiveServer2 (which uses Hive 0.13.0) on port TCP/10000 that any other user may access.

        Thus, your PATH and your HIVE_HOME should be pointing to:

        • export HIVE_HOME=/usr/local/hive-0.9.0-shark-0.8.0-bin/
        • export PATH=/usr/local/hive-0.9.0-shark-0.8.0-bin/bin/:/usr/local/shark-0.8/bin/:/usr/local/node-v0.12.4-linux-x64/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin

        If the above exports are set only per SSH session rather than permanently, new sessions will point to Hive 0.13.0 (as for any other user) instead of the Hive 0.9.0 you need.
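As a minimal sketch, the permanent fix described above is two lines appended to the user's login profile (the paths are copied from the comment; the sanity check with `command -v` is an assumption of mine, not from the ticket):

```shell
# Lines to add to ~/.bash_profile so that EVERY SSH session picks up
# Hive 0.9.0 (the Shark-compatible version), not just the current one.
export HIVE_HOME=/usr/local/hive-0.9.0-shark-0.8.0-bin/
export PATH=/usr/local/hive-0.9.0-shark-0.8.0-bin/bin/:/usr/local/shark-0.8/bin/:/usr/local/node-v0.12.4-linux-x64/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin

# Sanity check: on the Cosmos instance this should resolve to
# /usr/local/hive-0.9.0-shark-0.8.0-bin/bin/hive
command -v hive || true
```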

        I’m copying you the email I originally sent to Dario:

        “… As promised, I’ve set up a Spark/Shark deployment just for you. This required the installation of a whole new Hive metastore, since the existing one was recently tuned for Hive 0.13.0 (due to HiveServer2) and the Shark we had installed was compiled for Hive 0.9.0 (in any case, I’ve looked for a more recent version of Shark, and the latest one, released before the project was discontinued, was designed to work with Hive 0.11.0, so installing a newer version would not solve the problem).

        A couple of remarks:

        • Shark server now runs on port TCP/9999, don’t forget to change this in your client.
        • As for any other user, your default Hive home within the Cosmos instance is /usr/local/apache-hive-0.13.0-bin . Nevertheless, your Hive metastore is tied to Hive 0.9.0, so my recommendation is that you locally change both your PATH and your HIVE_HOME so that you always refer to Hive 0.9.0, and not Hive 0.13.0, when using, for instance, the CLI. Basically, add these lines to your /<your_user>/.bash_profile file:
        • export HIVE_HOME=/usr/local/hive-0.9.0-shark-0.8.0-bin/
        • export PATH=/usr/local/hive-0.9.0-shark-0.8.0-bin/bin/:/usr/local/shark-0.8/bin/:/usr/local/node-v0.12.4-linux-x64/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin
        • Finally, the new metastore specifically created for you is empty: there are no tables or databases (except for the default one and my personal db, named “frb”). Don’t panic! As its name denotes, it is a store for metadata; it does not contain the real data, which remains stored in your HDFS space. So, you just need to recreate your tables by executing the command "create external table etc etc location ‘/path/to/data/in/hdfs/…’"; I guess you know the command, because you created the old tables on your own. If you don’t remember some detail of a table, you can ask Hive (0.13.0): “describe extended|formatted <table_name>”

        …”
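The table-recreation step above could be sketched as follows. The table name, columns, delimiter, and HDFS path are all hypothetical placeholders (the real definitions are known only to the FINESCE-WP4 team); the point is that only the metastore entry is recreated, while the data files referenced by LOCATION are untouched in HDFS.

```shell
# Hypothetical DDL sketch: redeclare a table's metadata over existing
# HDFS data. Names, columns and path below are made-up examples.
ddl="CREATE EXTERNAL TABLE IF NOT EXISTS sector_load (
       sector   STRING,
       load_kw  DOUBLE
     )
     ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
     LOCATION '/user/FINESCE-WP4/sector_load'"

echo "$ddl"            # review the statement first
# hive -e "$ddl"       # then run it with the Hive 0.9.0 CLI on PATH
```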

        Regards,
        Francisco

        From: Massimiliano Nigrelli <massimiliano.nigrelli@eng.it>
        Date: Monday, 21 September 2015, 10:28
        CC: Pellegrino Dario <dario.pellegrino@eng.it>, Pasquale Andriani <pasquale.andriani@eng.it>, Leandro Lombardo <Leandro.Lombardo@eng.it>, "fiware-lab-help@lists.fi-ware.org" <fiware-lab-help@lists.fi-ware.org>, Francisco Romero Bueno <francisco.romerobueno@telefonica.com>, SERGIO GARCIA GOMEZ <sergio.garciagomez@telefonica.com>, MIGUEL CARRILLO PACHECO <miguel.carrillopacheco@telefonica.com>
        Subject: [FINESCE] COSMOS : Shark error


        fw.ext.user FW External User added a comment -

        First, thank you for being so quick!

        Dario is aware of what you stated at the end of your email and did
        everything you suggested before our review in Berlin last week.

        I am writing because he is travelling for work and is not available
        this morning.

        Anyway, our UI application now retrieves data from COSMOS.

        Thank you.

        Regards,

        Massimiliano




          People

          • Assignee:
            frb Francisco Romero
            Reporter:
            fw.ext.user FW External User
          • Votes:
            0 Vote for this issue
            Watchers:
            2 Start watching this issue

            Dates

            • Created:
              Updated:
              Resolved: