Help-Desk / HELP-3894

FIWARE.Request.Tech.Data.BigData-Analysis.HiveServer2Error

    Details

    • Type: extRequest
    • Status: Closed
    • Priority: Major
    • Resolution: Done
    • Fix Version/s: 2021
    • Component/s: FIWARE-TECH-HELP
    • Labels:
      None

      Description

      Dear all,
a few days ago we received your mail about the Hive Server upgrade (HiveServer2 instead of Shark) and we modified our Java code as you recommended.
In particular, we loaded the new driver "org.apache.hive.jdbc.HiveDriver", modified the JDBC connection to "return DriverManager.getConnection("jdbc:hive2://" + hiveServer + ":" + hivePort + "/default", hdfsUser, hdfsPwd);" and updated the pom.xml (Hive 0.13.0 dependencies).
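For reference, a minimal sketch of the change described above (the class and method names here are assumptions, not the actual FINESCE code) could look like this:

// Hypothetical sketch of the HiveServer2 connection change described in this request.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class Hive2ConnectionSketch {
    public static Connection open(String hiveServer, String hivePort, String hdfsUser, String hdfsPwd)
            throws ClassNotFoundException, SQLException {
        // load the new HiveServer2 JDBC driver
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // use the jdbc:hive2:// URL scheme required by HiveServer2
        return DriverManager.getConnection(
                "jdbc:hive2://" + hiveServer + ":" + hivePort + "/default", hdfsUser, hdfsPwd);
    }
}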
Unfortunately, after these changes our application no longer works.
      You can find our error message below:

      25-ago-2015 15.34.01 org.apache.catalina.core.StandardWrapperValve invoke
      GRAVE: Servlet.service() for servlet [eu.finesce.emarketplace.RestHiveInputApplication] in context with path [/rest2cosmos] threw exception [java.lang.IllegalMonitorStateException] with root cause
      java.lang.IllegalMonitorStateException
      at java.util.concurrent.locks.ReentrantLock$Sync.tryRelease(Unknown Source)
      at java.util.concurrent.locks.AbstractQueuedSynchronizer.release(Unknown Source)
      at java.util.concurrent.locks.ReentrantLock.unlock(Unknown Source)
      at org.apache.hive.jdbc.HiveStatement.closeClientOperation(HiveStatement.java:175)
      at org.apache.hive.jdbc.HiveQueryResultSet.close(HiveQueryResultSet.java:293)
      at eu.finesce.emarketplace.client.HiveClient.getmeterDetails(HiveClient.java:1386)
      at eu.finesce.emarketplace.RestHive2Cosmos.getMeterDetails(RestHive2Cosmos.java:299)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
      at java.lang.reflect.Method.invoke(Unknown Source)
      at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
      at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:151)
      at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:171)
      at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:195)
      at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:104)
      at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:402)
      at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:349)
      at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:106)
      at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:259)
      at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
      at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
      at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
      at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
      at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
      at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:318)
      at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:236)
      at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1010)
      at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:373)
      at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:382)
      at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:345)
      at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:220)
      at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
      at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
      at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
      at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
      at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
      at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
      at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
      at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
      at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
      at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
      at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1040)
      at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:607)
      at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:313)
      at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
      at java.lang.Thread.run(Unknown Source)

      Waiting for your feedback, thank you in advance.
      Best regards,
      Dario Pellegrino

      Dario Pellegrino
      Direzione Ricerca e Innovazione - R&D Lab
dario.pellegrino@eng.it

      Engineering Ingegneria Informatica spa
      Viale Regione Siciliana, 7275 - 90146 Palermo
      Tel. +39-091.7511847
      Mob. +39-346.5325257
www.eng.it

      _______________________________________________
      Fiware-lab-help mailing list
      Fiware-lab-help@lists.fi-ware.org
      https://lists.fi-ware.org/listinfo/fiware-lab-help

      [Created via e-mail received from: Pellegrino Dario <dario.pellegrino@eng.it>]

        Activity

Transition | Time In Source Status | Execution Times | Last Executer | Last Execution Date
Open → In Progress | 20h 19m | 1 | Francisco Romero | 26/Aug/15 12:37 PM
In Progress → Answered | 27m 33s | 1 | Francisco Romero | 26/Aug/15 1:04 PM
Answered → Closed | 1d 3h 1m | 1 | Francisco Romero | 27/Aug/15 4:06 PM
        fla Fernando Lopez made changes -
        Fix Version/s 2021 [ 12600 ]
        mev Manuel Escriche made changes -
        HD-Enabler Cosmos [ 10872 ]
        HD-Chapter Data [ 10838 ]
mev Manuel Escriche made changes -
Summary FIWARE.Request.Lab.Data.BigData-Analysis.HiveServer2Error → FIWARE.Request.Tech.Data.BigData-Analysis.HiveServer2Error
        mev Manuel Escriche made changes -
        Component/s FIWARE-TECH-HELP [ 10278 ]
        Component/s FIWARE-LAB-HELP [ 10279 ]
        frb Francisco Romero added a comment -

        Last emails exchanged:

        Hi Dario,

I just came back from my holidays. Great to hear that. Yes, I'll be especially checking on it throughout those days.

        Regards,
        Francisco

From: Pellegrino Dario <dario.pellegrino@eng.it>
Date: Thursday, 10 September 2015, 11:37
To: Francisco Romero Bueno <francisco.romerobueno@telefonica.com>
CC: Leandro Lombardo <Leandro.Lombardo@eng.it>, Massimiliano Nigrelli <massimiliano.nigrelli@eng.it>, Luigi Briguglio <Luigi.Briguglio@eng.it>, "fiware-lab-help@lists.fi-ware.org" <fiware-lab-help@lists.fi-ware.org>, SERGIO GARCIA GOMEZ <sergio.garciagomez@telefonica.com>, SANTIAGO MARTINEZ GARCIA <santiago.martinezgarcia@telefonica.com>, "stefano.depanfilis@eng.it" <stefano.depanfilis@eng.it>, JUAN JOSE HIERRO SUREDA <juanjose.hierro@telefonica.com>, Pasquale Andriani <pasquale.andriani@eng.it>
Subject: RE: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi Francisco,
I confirm that Hive 0.9 with Spark/Shark works properly, with reasonable query response times.
Please, could you keep a "special" eye on the Shark/Spark system from Monday 14th to Wednesday 16th, during the FINESCE final event?
        Thank you again for your kind support.
        Best regards,
        Dario

        fw.ext.user FW External User added a comment -

        Hi Francisco,
I confirm that Hive 0.9 with Spark/Shark works properly, with reasonable query response times.
Please, could you keep a "special" eye on the Shark/Spark system from Monday 14th to Wednesday 16th, during the FINESCE final event?
        Thank you again for your kind support.
        Best regards,
        Dario
        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
www.eng.it

From: Pellegrino Dario
Sent: Tuesday, 8 September 2015, 11:07
To: 'FRANCISCO ROMERO BUENO'
Cc: Leandro Lombardo; Massimiliano Nigrelli; Luigi Briguglio; fiware-lab-help@lists.fi-ware.org; SERGIO GARCIA GOMEZ; SANTIAGO MARTINEZ GARCIA; Stefano De Panfilis; JUAN JOSE HIERRO SUREDA; Pasquale Andriani
Subject: RE: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi Francisco,
        thanks for your support.
        I am going to implement your solution.
        I will let you know.
        Best regards,
        Dario
        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
www.eng.it

From: FRANCISCO ROMERO BUENO francisco.romerobueno@telefonica.com
Sent: Sunday, 6 September 2015, 19:16
To: Pasquale Andriani
Cc: Pellegrino Dario; Leandro Lombardo; Massimiliano Nigrelli; Luigi Briguglio; fiware-lab-help@lists.fi-ware.org; SERGIO GARCIA GOMEZ; SANTIAGO MARTINEZ GARCIA; Stefano De Panfilis; JUAN JOSE HIERRO SUREDA
Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi all,

As promised, I've set up a Spark/Shark deployment just for you. This required installing a whole new Hive metastore, since the existing one was recently tuned for Hive 0.13.0 (due to HiveServer2) while the Shark we had installed was compiled for Hive 0.9.0 (in any case, I looked for a more recent version of Shark, and the latest one released before the project was discontinued was designed to work with Hive 0.11.0, so installing a newer version would not solve the problem).

        A couple of remarks:

• The Shark server now runs on port TCP/9999; don't forget to change this in your client.
• As for any other user, your default Hive home within the Cosmos instance is /usr/local/apache-hive-0.13.0-bin. Nevertheless, your Hive metastore is tied to Hive 0.9.0, so my recommendation is that you locally change both your PATH and your HIVE_HOME so that you always refer to Hive 0.9.0, not Hive 0.13.0, when using, for instance, the CLI. Basically, add these lines to your /<your_user>/.bash_profile file:
        • export HIVE_HOME=/usr/local/hive-0.9.0-shark-0.8.0-bin/
        • export PATH=/usr/local/hive-0.9.0-shark-0.8.0-bin/bin/:/usr/local/shark-0.8/bin/:/usr/local/node-v0.12.4-linux-x64/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin
• Finally, the new metastore specifically created for you is empty: there are no tables or databases (except for the default one and my personal db, named "frb"). Don't panic! As its name denotes, it is a store for metadata; it does not contain real data, and the real data remains stored in your HDFS space. So you just need to recreate your tables by executing "create external table etc etc location '/path/to/data/in/hdfs/...'"; I guess you know the command because you already created the old tables on your own. If you don't remember some detail regarding the tables, you can ask Hive (0.13.0) for it: "describe extended|formatted <table_name>"
  I'll keep an eye on the email in case you have any doubts.

        Regards,
        Francisco

From: Pasquale Andriani <pasquale.andriani@eng.it>
Date: Monday, 31 August 2015, 14:55
To: Francisco Romero Bueno <francisco.romerobueno@telefonica.com>
CC: Pellegrino Dario <dario.pellegrino@eng.it>, Leandro Lombardo <Leandro.Lombardo@eng.it>, Massimiliano Nigrelli <massimiliano.nigrelli@eng.it>, Luigi Briguglio <Luigi.Briguglio@eng.it>, "fiware-lab-help@lists.fi-ware.org" <fiware-lab-help@lists.fi-ware.org>, SERGIO GARCIA GOMEZ <sergio.garciagomez@telefonica.com>, SANTIAGO MARTINEZ GARCIA <santiago.martinezgarcia@telefonica.com>, "stefano.depanfilis@eng.it" <stefano.depanfilis@eng.it>, JUAN JOSE HIERRO SUREDA <juanjose.hierro@telefonica.com>
Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Thanks. Waiting for your news about Spark/Shark on Monday 7th.

        Kind regards,
        P.

        Pasquale Andriani
        Direzione Ricerca e Innovazione - Research & Development Lab
pasquale.andriani@eng.it

        Engineering Ingegneria Informatica spa
        Via Riccardo Morandi, 32 - 00148 Roma
        Tel. +39-06.87594138
        Mob. +39 3924698746
        Fax. +39-06.83074408
www.eng.it

On Mon, Aug 31, 2015 at 2:44 PM, FRANCISCO ROMERO BUENO <francisco.romerobueno@telefonica.com> wrote:
Dear Pasquale, did you see my last email? I was saying I can set up a Spark/Shark deployment, exclusively for you, next Monday the 7th (although I will be on holidays). Currently I am out of Spain with limited access to the Internet. Regards, Francisco

        fw.ext.user FW External User added a comment -

        Hi Francisco,
        thanks for your support.
        I am going to implement your solution.
        I will let you know.
        Best regards,
        Dario
        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
www.eng.it

From: FRANCISCO ROMERO BUENO francisco.romerobueno@telefonica.com
Sent: Sunday, 6 September 2015, 19:16
To: Pasquale Andriani
Cc: Pellegrino Dario; Leandro Lombardo; Massimiliano Nigrelli; Luigi Briguglio; fiware-lab-help@lists.fi-ware.org; SERGIO GARCIA GOMEZ; SANTIAGO MARTINEZ GARCIA; Stefano De Panfilis; JUAN JOSE HIERRO SUREDA
Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi all,

As promised, I've set up a Spark/Shark deployment just for you. This required installing a whole new Hive metastore, since the existing one was recently tuned for Hive 0.13.0 (due to HiveServer2) while the Shark we had installed was compiled for Hive 0.9.0 (in any case, I looked for a more recent version of Shark, and the latest one released before the project was discontinued was designed to work with Hive 0.11.0, so installing a newer version would not solve the problem).

        A couple of remarks:

• The Shark server now runs on port TCP/9999; don't forget to change this in your client.
• As for any other user, your default Hive home within the Cosmos instance is /usr/local/apache-hive-0.13.0-bin. Nevertheless, your Hive metastore is tied to Hive 0.9.0, so my recommendation is that you locally change both your PATH and your HIVE_HOME so that you always refer to Hive 0.9.0, not Hive 0.13.0, when using, for instance, the CLI. Basically, add these lines to your /<your_user>/.bash_profile file:
        • export HIVE_HOME=/usr/local/hive-0.9.0-shark-0.8.0-bin/
        • export PATH=/usr/local/hive-0.9.0-shark-0.8.0-bin/bin/:/usr/local/shark-0.8/bin/:/usr/local/node-v0.12.4-linux-x64/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin
• Finally, the new metastore specifically created for you is empty: there are no tables or databases (except for the default one and my personal db, named "frb"). Don't panic! As its name denotes, it is a store for metadata; it does not contain real data, and the real data remains stored in your HDFS space. So you just need to recreate your tables by executing "create external table etc etc location '/path/to/data/in/hdfs/...'"; I guess you know the command because you already created the old tables on your own. If you don't remember some detail regarding the tables, you can ask Hive (0.13.0) for it: "describe extended|formatted <table_name>"
  I'll keep an eye on the email in case you have any doubts.

        Regards,
        Francisco

From: Pasquale Andriani <pasquale.andriani@eng.it>
Date: Monday, 31 August 2015, 14:55
To: Francisco Romero Bueno <francisco.romerobueno@telefonica.com>
CC: Pellegrino Dario <dario.pellegrino@eng.it>, Leandro Lombardo <Leandro.Lombardo@eng.it>, Massimiliano Nigrelli <massimiliano.nigrelli@eng.it>, Luigi Briguglio <Luigi.Briguglio@eng.it>, "fiware-lab-help@lists.fi-ware.org" <fiware-lab-help@lists.fi-ware.org>, SERGIO GARCIA GOMEZ <sergio.garciagomez@telefonica.com>, SANTIAGO MARTINEZ GARCIA <santiago.martinezgarcia@telefonica.com>, "stefano.depanfilis@eng.it" <stefano.depanfilis@eng.it>, JUAN JOSE HIERRO SUREDA <juanjose.hierro@telefonica.com>
Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Thanks. Waiting for your news about Spark/Shark on Monday 7th.

        Kind regards,
        P.

        Pasquale Andriani
        Direzione Ricerca e Innovazione - Research & Development Lab
pasquale.andriani@eng.it

        Engineering Ingegneria Informatica spa
        Via Riccardo Morandi, 32 - 00148 Roma
        Tel. +39-06.87594138
        Mob. +39 3924698746
        Fax. +39-06.83074408
www.eng.it

On Mon, Aug 31, 2015 at 2:44 PM, FRANCISCO ROMERO BUENO <francisco.romerobueno@telefonica.com> wrote:
Dear Pasquale, did you see my last email? I was saying I can set up a Spark/Shark deployment, exclusively for you, next Monday the 7th (although I will be on holidays). Currently I am out of Spain with limited access to the Internet. Regards, Francisco

        fw.ext.user FW External User added a comment -

        Hi all,

As promised, I've set up a Spark/Shark deployment just for you. This required installing a whole new Hive metastore, since the existing one was recently tuned for Hive 0.13.0 (due to HiveServer2) while the Shark we had installed was compiled for Hive 0.9.0 (in any case, I looked for a more recent version of Shark, and the latest one released before the project was discontinued was designed to work with Hive 0.11.0, so installing a newer version would not solve the problem).

        A couple of remarks:

• The Shark server now runs on port TCP/9999; don't forget to change this in your client.
• As for any other user, your default Hive home within the Cosmos instance is /usr/local/apache-hive-0.13.0-bin. Nevertheless, your Hive metastore is tied to Hive 0.9.0, so my recommendation is that you locally change both your PATH and your HIVE_HOME so that you always refer to Hive 0.9.0, not Hive 0.13.0, when using, for instance, the CLI. Basically, add these lines to your /<your_user>/.bash_profile file:
        • export HIVE_HOME=/usr/local/hive-0.9.0-shark-0.8.0-bin/
        • export PATH=/usr/local/hive-0.9.0-shark-0.8.0-bin/bin/:/usr/local/shark-0.8/bin/:/usr/local/node-v0.12.4-linux-x64/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin
• Finally, the new metastore specifically created for you is empty: there are no tables or databases (except for the default one and my personal db, named "frb"). Don't panic! As its name denotes, it is a store for metadata; it does not contain real data, and the real data remains stored in your HDFS space. So you just need to recreate your tables by executing "create external table etc etc location '/path/to/data/in/hdfs/...'"; I guess you know the command because you already created the old tables on your own. If you don't remember some detail regarding the tables, you can ask Hive (0.13.0) for it: "describe extended|formatted <table_name>" (a JDBC sketch of this recreation step follows below).
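A minimal JDBC sketch of the recreation step (the table name, columns and HDFS path below are placeholders, not the actual FINESCE tables):

// Hypothetical sketch: recreating an external Hive table over data already stored in HDFS.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RecreateTableSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection con = DriverManager.getConnection(
                     "jdbc:hive2://<cosmos-host>:10000/default", "<user>", "<password>");
             Statement stmt = con.createStatement()) {
            // metadata only: the data already sitting in HDFS is neither copied nor modified
            stmt.execute("create external table my_table (ts bigint, val string) "
                    + "row format delimited fields terminated by ',' "
                    + "location '/path/to/data/in/hdfs/'");
            // ask Hive for the full definition of an existing table
            try (ResultSet res = stmt.executeQuery("describe formatted my_table")) {
                while (res.next()) {
                    System.out.println(res.getString(1));
                }
            }
        }
    }
}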

I'll keep an eye on the email in case you have any doubts.

        Regards,
        Francisco

From: Pasquale Andriani <pasquale.andriani@eng.it>
Date: Monday, 31 August 2015, 14:55
To: Francisco Romero Bueno <francisco.romerobueno@telefonica.com>
CC: Pellegrino Dario <dario.pellegrino@eng.it>, Leandro Lombardo <Leandro.Lombardo@eng.it>, Massimiliano Nigrelli <massimiliano.nigrelli@eng.it>, Luigi Briguglio <Luigi.Briguglio@eng.it>, "fiware-lab-help@lists.fi-ware.org" <fiware-lab-help@lists.fi-ware.org>, SERGIO GARCIA GOMEZ <sergio.garciagomez@telefonica.com>, SANTIAGO MARTINEZ GARCIA <santiago.martinezgarcia@telefonica.com>, "stefano.depanfilis@eng.it" <stefano.depanfilis@eng.it>, JUAN JOSE HIERRO SUREDA <juanjose.hierro@telefonica.com>
Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Thanks. Waiting for your news about Spark/Shark on Monday 7th.

        Kind regards,
        P.

        Pasquale Andriani
        Direzione Ricerca e Innovazione - Research & Development Lab
pasquale.andriani@eng.it

        Engineering Ingegneria Informatica spa
        Via Riccardo Morandi, 32 - 00148 Roma
        Tel. +39-06.87594138
        Mob. +39 3924698746
        Fax. +39-06.83074408
www.eng.it

On Mon, Aug 31, 2015 at 2:44 PM, FRANCISCO ROMERO BUENO <francisco.romerobueno@telefonica.com> wrote:
Dear Pasquale, did you see my last email? I was saying I can set up a Spark/Shark deployment, exclusively for you, next Monday the 7th (although I will be on holidays). Currently I am out of Spain with limited access to the Internet. Regards, Francisco

        fw.ext.user FW External User added a comment -

        Hi Francisco,
        the Hive Server does not seem to be working for me. Could you check it?
        Regards,
        Dario
        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
www.eng.it

From: FRANCISCO ROMERO BUENO francisco.romerobueno@telefonica.com
Sent: Monday, 31 August 2015, 14:45
To: Pasquale Andriani
Cc: Pellegrino Dario; Leandro Lombardo; Massimiliano Nigrelli; Luigi Briguglio; fiware-lab-help@lists.fi-ware.org; SERGIO GARCIA GOMEZ; SANTIAGO MARTINEZ GARCIA; Stefano De Panfilis; JUAN JOSE HIERRO SUREDA
Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

Dear Pasquale, did you see my last email? I was saying I can set up a Spark/Shark deployment, exclusively for you, next Monday the 7th (although I will be on holidays). Currently I am out of Spain with limited access to the Internet. Regards, Francisco

        pandriani Pasquale Andriani added a comment -

        Thanks. Waiting for your news about Spark/Shark on Monday 7th.

        Kind regards,
        P.

        Pasquale Andriani
        Direzione Ricerca e Innovazione - Research & Development Lab
        pasquale.andriani@eng.it

        Engineering Ingegneria Informatica spa
        Via Riccardo Morandi, 32 - 00148 Roma
        Tel. +39-06.87594138
        Mob. +39 3924698746
        Fax. +39-06.83074408
        www.eng.it

        On Mon, Aug 31, 2015 at 2:44 PM, FRANCISCO ROMERO BUENO <

        _______________________________________________
        Fiware-lab-help mailing list
        Fiware-lab-help@lists.fi-ware.org
        https://lists.fi-ware.org/listinfo/fiware-lab-help

        fw.ext.user FW External User added a comment -

Dear Pasquale, did you see my last email? I was saying I can set up a Spark/Shark deployment, exclusively for you, next Monday the 7th (although I will be on holidays). Currently I am out of Spain with limited access to the Internet. Regards, Francisco

        pandriani Pasquale Andriani added a comment -

        Dear Francisco,
I'm quite disappointed with the current status.

I understand the need to upgrade your infrastructure, but you should bear
in mind that FINESCE is one of your users and right now, after the change to
HiveServer2 and without any in-memory computation, we are seeing only
performance issues without any other advantage, which makes our
application unusable due to the very high query response times.

We have to show a working demo (as it worked until just before the change)
during the FINESCE final event on September 14th. How can you help us
solve the performance issue in a reasonable time?

For example, would it be possible to upgrade to the latest version of Spark,
with the Thrift JDBC/ODBC server on SparkSQL, instead of testing the oldest
version of Shark/Spark with Hive 0.13?

        Kind regards,
        P.

        Pasquale Andriani
        Direzione Ricerca e Innovazione - Research & Development Lab
        pasquale.andriani@eng.it

        Engineering Ingegneria Informatica spa
        Via Riccardo Morandi, 32 - 00148 Roma
        Tel. +39-06.87594138
        Mob. +39 3924698746
        Fax. +39-06.83074408
        www.eng.it

        On Fri, Aug 28, 2015 at 3:56 PM, FRANCISCO ROMERO BUENO <

        _______________________________________________
        Fiware-lab-help mailing list
        Fiware-lab-help@lists.fi-ware.org
        https://lists.fi-ware.org/listinfo/fiware-lab-help

        fw.ext.user FW External User added a comment -

        Hi Dario,

The performance of the server is what it is; it depends on the available infrastructure (not as "big" as we would like) and on the number of users doing analysis at the same time (not only Hive queries, but MapReduce jobs and many other kinds of scripts running at the same time).

We decided to stop using Shark because it has been discontinued by its community, and because the version of Hive that Shark was compiled against was pretty old (Hive 0.9.0 for Spark/Shark 0.8), so a lot of interesting features in terms of functionality, security, etc. were missing. In fact, our aim was to upgrade to a more recent version of Hive, but we found Hive 0.13.0 was the latest one working with our also pretty old Hadoop.

We are losing the in-memory computations, OK, but HiveServer2 is supposed to be faster than the old Hive server because it accepts concurrent queries; is there any way to take advantage of this feature by modifying your code? (A sketch of what this could look like follows below.)
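As a rough illustration of what is being hinted at here, a hypothetical sketch (host, credentials, table and queries are assumptions, not FINESCE code) could submit independent queries over separate connections so that HiveServer2 serves them concurrently:

// Hypothetical sketch: running several Hive queries concurrently against HiveServer2.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ConcurrentQueriesSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        List<String> queries = Arrays.asList(
                "select count(*) from my_table",
                "select max(ts) from my_table");
        ExecutorService pool = Executors.newFixedThreadPool(queries.size());
        for (String query : queries) {
            pool.submit(() -> {
                // one connection per task: HiveServer2 accepts the sessions concurrently
                try (Connection con = DriverManager.getConnection(
                             "jdbc:hive2://<cosmos-host>:10000/default", "<user>", "<password>");
                     Statement stmt = con.createStatement();
                     ResultSet res = stmt.executeQuery(query)) {
                    while (res.next()) {
                        System.out.println(query + " -> " + res.getString(1));
                    }
                } catch (Exception e) {
                    System.out.println(query + " failed: " + e.getMessage());
                }
            });
        }
        pool.shutdown();
    }
}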

That being said, I could start the old Shark server just for you, listening on a different port than TCP/10000 and only accepting queries from your machine. However, the already deployed version of Spark/Shark is, as already said, 0.8, and it was compiled to work with Hive 0.9.0. Now Hive is 0.13.0 and I think the Hive metastore is no longer compatible with Spark/Shark. I would have to check it and run some tests; a parallel metastore could even be created just for you. But this requires time.

The problem is that I am supposed to be on holidays until September 14th. This week I was attending to email and supporting FIWARE users because I had a broadband connection available, but this will change from tomorrow until the 7th of September, since I'll be out of Spain. From the 7th to the 14th I'll still be on holidays... but I'll be at home. Well, my wife will kill me, but I think I can do those tests then.

        Best regards,
        Francisco

        fw.ext.user FW External User added a comment -

        Hi Francisco,
at the moment the beeline connection works properly, but we are still having the performance issues (as I wrote yesterday in my email below) that do not allow regular usage of our application.
In mid-September we are going to prepare the final event session and the final review of our research project, and we will be presenting a live demo to the review panel.
As you can surely understand, it is very important for us that our application works properly, as it did before the Hive Server update.
        Are you working to solve this issue?
        Best regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
        dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
        www.eng.it

----Original Message----
From: Pellegrino Dario
Sent: Thursday, 27 August 2015, 11:08
To: 'FRANCISCO ROMERO BUENO'
Cc: 'Leandro Lombardo'; 'Massimiliano Nigrelli'; 'Luigi Briguglio'; 'Pasquale Andriani'; 'fiware-lab-help@lists.fi-ware.org'
Subject: RE: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi Francisco,
right now I am facing a connection issue with hive2.
        Is Hive Server down?
        Regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo Tel. +39-091.7511847 Mob. +39-346.5325257 www.eng.it

----Original Message----
From: Pellegrino Dario
Sent: Wednesday, 26 August 2015, 17:57
To: 'FRANCISCO ROMERO BUENO'
Cc: Leandro Lombardo; Massimiliano Nigrelli; Luigi Briguglio; Pasquale Andriani; fiware-lab-help@lists.fi-ware.org
Subject: RE: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi Francisco,
        first of all thanks for your reply.
The problem was not in our connection string; it arises because, with hive2, the query response time has increased significantly.
I solved the fatal error in my application, but performance is not yet acceptable. For example, the response time for a simple query is now 40 seconds, while before the upgrade to hive2 it was only 5 seconds.
Could you verify why performance has decreased?
        Best regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo Tel. +39-091.7511847 Mob. +39-346.5325257 www.eng.it

----Original Message----
From: FRANCISCO ROMERO BUENO francisco.romerobueno@telefonica.com
Sent: Wednesday, 26 August 2015, 13:02
To: Pellegrino Dario; agalani@unipi.gr; fiware-lab-help@lists.fi-ware.org
Cc: Leandro Lombardo; Massimiliano Nigrelli; Luigi Briguglio; Pasquale Andriani
Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Dear Dario,

        I will need to know the code around this trace:
        at eu.finesce.emarketplace.client.HiveClient.getHiveConnection(HiveClient.java:102)

It should be something similar to, or inspired by, the attached code, which is working for me:

        Connecting to jdbc:hive2://130.206.80.46:10000/frb?user=frb&password=XXXXXXXXXX
        remotehive> select * from frb_one;
        1431949600,2015-05-18T11:46:40.171Z,Room1,Room,temperature,centigrade,26.5
        1431949749,2015-05-18T11:49:09.506Z,Room1,Room,temperature,centigrade,26.5
        1432014378,2015-05-19T05:46:18.361Z,Room1,Room,temperature,centigrade,26.5
        1432221197,2015-05-21T15:13:17.979Z,Room1,Room,temperature,centigrade,26.5
        remotehive>

        Regarding beeline, the connection must be done without the comma character:

        $ beeline
        Beeline version 0.13.0 by Apache Hive
        beeline> !connect jdbc:hive2://localhost:10000 frb XXXXXXXX
        beeline> org.apache.hive.jdbc.HiveDriver
Connecting to jdbc:hive2://localhost:10000
Connected to: Apache Hive (version 0.13.0)
        Driver: Hive JDBC (version 0.13.0)
        Transaction isolation: TRANSACTION_REPEATABLE_READ
        0: jdbc:hive2://localhost:10000> select * from frb.frb_one;
[result rows not preserved in the ticket transcript]
        4 rows selected (0.368 seconds)
        0: jdbc:hive2://localhost:10000>
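
        For the programmatic (JDBC) case the same rule applies: the connection URL carries no quote or comma characters, and the user and password are passed as separate arguments instead of being pasted into the URL. A minimal sketch of that point only (hypothetical host and credentials, same Hive JDBC 0.13 driver as above):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.SQLException;

        public final class Hive2UrlExample {
            public static void main(String[] args) throws ClassNotFoundException, SQLException {
                // dynamically load the Hive JDBC driver
                Class.forName("org.apache.hive.jdbc.HiveDriver");
                // the URL itself contains no quotes and no commas
                String url = "jdbc:hive2://localhost:10000/default";
                // credentials go as separate arguments (placeholders here)
                try (Connection con = DriverManager.getConnection(url, "frb", "XXXXXXXX")) {
                    System.out.println("Connected: " + !con.isClosed());
                }
            }
        }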

        Attached code:

        package com.telefonica.iot.hivebasicclient;

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStreamReader;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;

        /**
         * @author Francisco Romero Bueno frb@tid.es
         *
         * Basic remote client for Hive mimicking the native Hive CLI behaviour.
         *
         * Can be used as the base for more complex clients, interactive or not interactive.
         */
        public final class HiveBasicClient {
            // JDBC driver required for Hive connections
            private static final String DRIVERNAME = "org.apache.hive.jdbc.HiveDriver";
            private static Connection con;

            /**
             * Constructor.
             */
            private HiveBasicClient() {
            } // HiveBasicClient

            /**
             * @param hiveServer
             * @param hivePort
             * @param dbName
             * @param hadoopUser
             * @param hadoopPassword
             * @return a JDBC connection to the given HiveServer2 instance, or null on error
             */
            private static Connection getConnection(String hiveServer, String hivePort, String dbName,
                    String hadoopUser, String hadoopPassword) {
                try {
                    // dynamically load the Hive JDBC driver
                    Class.forName(DRIVERNAME);
                } catch (ClassNotFoundException e) {
                    System.out.println(e.getMessage());
                    return null;
                } // try catch

                try {
                    System.out.println("Connecting to jdbc:hive2://" + hiveServer + ":" + hivePort + "/" + dbName
                            + "?user=" + hadoopUser + "&password=XXXXXXXXXX");
                    // return a connection based on the Hive JDBC driver
                    return DriverManager.getConnection("jdbc:hive2://" + hiveServer + ":" + hivePort + "/" + dbName,
                            hadoopUser, hadoopPassword);
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                    return null;
                } // try catch
            } // getConnection

            /**
             * @param query
             */
            private static void doExecute(String query) {
                try {
                    // from here on, everything is SQL!
                    Statement stmt = con.createStatement();
                    ResultSet res = stmt.executeQuery(query);

                    // iterate on the result
                    while (res.next()) {
                        String s = "";

                        for (int i = 1; i < res.getMetaData().getColumnCount(); i++) {
                            s += res.getString(i) + ",";
                        } // for

                        s += res.getString(res.getMetaData().getColumnCount());
                        System.out.println(s);
                    } // while

                    // close everything
                    res.close();
                    stmt.close();
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                } // try catch
            } // doExecute

            /**
             * @param query
             */
            private static void doUpdate(String query) {
                try {
                    // from here on, everything is SQL!
                    Statement stmt = con.createStatement();
                    stmt.executeUpdate(query);
                    // close everything
                    stmt.close();
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                } // try catch
            } // doUpdate

            /**
             * @param args
             */
            public static void main(String[] args) {
                // get the arguments
                String hiveServer = args[0];
                String hivePort = args[1];
                String dbName = args[2];
                String cosmosUser = args[3];
                String cosmosPassword = args[4];

                // get a connection to the Hive server running on the specified IP address, listening on 10000/TCP port
                // authenticate using my credentials
                con = getConnection(hiveServer, hivePort, dbName, cosmosUser, cosmosPassword);

                if (con == null) {
                    System.out.println("Could not connect to the Hive server!");
                    System.exit(-1);
                } // if

                // add JSON serde
                doUpdate("add JAR /usr/local/apache-hive-0.13.0-bin/lib/json-serde-1.3.1-SNAPSHOT-jar-with-dependencies.jar");

                // use the database
                doUpdate("use " + dbName);

                while (true) {
                    // prompt the user for a set of HiveQL sentences (';' separated)
                    System.out.print("remotehive> ");

                    // open the standard input
                    BufferedReader br = new BufferedReader(new InputStreamReader(System.in));

                    // read the HiveQL sentences from the standard input
                    String hiveqlSentences = null;

                    try {
                        hiveqlSentences = br.readLine();
                    } catch (IOException e) {
                        System.out.println("IO error trying to read a HiveQL query: " + e.getMessage());
                        System.exit(1);
                    } // try catch

                    if (hiveqlSentences != null) {
                        // get all the queries within the input HiveQL sentences
                        String[] queries = hiveqlSentences.split(";");

                        // for each query, execute it
                        for (String querie : queries) {
                            doExecute(querie);
                        } // for
                    } // if
                } // while
            } // main
        } // HiveBasicClient
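
        As a side note on resource handling, the doExecute() method above can also be written with try-with-resources, so that the Statement and the ResultSet are closed exactly once even when executeQuery() throws. This is only a sketch layered on the attached code (it drops into the same class and reuses its con field), not part of the original attachment:

            // Sketch only: same behaviour as doExecute(), but the statement and the
            // result set are closed automatically and exactly once.
            private static void doExecuteSafely(String query) {
                try (Statement stmt = con.createStatement();
                     ResultSet res = stmt.executeQuery(query)) {
                    int cols = res.getMetaData().getColumnCount();

                    // iterate on the result, printing each row as a comma-separated line
                    while (res.next()) {
                        StringBuilder row = new StringBuilder();

                        for (int i = 1; i <= cols; i++) {
                            row.append(res.getString(i));

                            if (i < cols) {
                                row.append(",");
                            } // if
                        } // for

                        System.out.println(row);
                    } // while
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                } // try catch
            } // doExecuteSafely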

        Best regards,
        Francisco

        ________________________________________
        From: Pellegrino Dario <dario.pellegrino@eng.it>
        Sent: Wednesday, 26 August 2015 12:09
        To: agalani@unipi.gr; fiware-lab-help@lists.fi-ware.org
        Cc: Leandro Lombardo; FRANCISCO ROMERO BUENO; Massimiliano Nigrelli; Luigi Briguglio; Pasquale Andriani
        Subject: R: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi Aristi,
        I have done some further tests. Could you please forward the information below to second-level support?

        1) TOMCAT Log
        Aug 26, 2015 11:29:43 AM eu.finesce.emarketplace.client.HiveClient getHiveConnection
        SEVERE: HIVE Connection Error
        java.sql.SQLException: Could not open connection to jdbc:hive2://130.206.80.46:10000: java.net.SocketException: Connection reset
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206)
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:178)
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at eu.finesce.emarketplace.client.HiveClient.getHiveConnection(HiveClient.java:102)
        at eu.finesce.emarketplace.client.HiveClient.getloadpredictionBySector(HiveClient.java:1017)
        at eu.finesce.emarketplace.RestHive2Cosmos.getLoadPredictionbySector(RestHive2Cosmos.java:251)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:151)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:171)
        at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:195)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:104)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:402)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:349)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:106)
        at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:259)
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
        at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:318)
        at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:236)
        at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1010)
        at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:373)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:382)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:345)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:220)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:224)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:169)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
        at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:987)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:579)
        at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:307)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
        Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:288)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:203)
        ... 48 more
        Caused by: java.net.SocketException: Connection reset
        at java.net.SocketInputStream.read(SocketInputStream.java:196)
        at java.net.SocketInputStream.read(SocketInputStream.java:122)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
        ... 53 more

        2) BEELINE TEST
        I tried to use the JDBC HiveServer2 connection with the Beeline Hive client on COSMOS:

        -bash-4.1$ beeline
        Beeline version 0.13.0 by Apache Hive
        beeline> !connect jdbc:hive2://130.206.80.46:10000", "FINESCE-WP4", "password"
        scan complete in 24ms
        Connecting to jdbc:hive2://130.206.80.46:10000",
        Error: Invalid URL: jdbc:hive2://130.206.80.46:10000", (state=08S01,code=0)

        Best regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo Tel. +39-091.7511847 Mob. +39-346.5325257 www.eng.it

        ----Original message----
        From: Aristi Galani agalani@unipi.gr
        Sent: Tuesday, 25 August 2015 18:29
        To: Pellegrino Dario
        Cc: fiware-lab-help@lists.fi-ware.org; Leandro Lombardo; FRANCISCO ROMERO BUENO (francisco.romerobueno@telefonica.com); Massimiliano Nigrelli; Luigi Briguglio; SERGIO GARCIA GOMEZ (sergio.garciagomez@telefonica.com); Pasquale Andriani
        Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Dear Dario,

        We forwarded your request to second level support.

        Kind regards
        IWAVE team, on behalf of helpdesk team

        _______________________________________________
        Fiware-lab-help mailing list
        Fiware-lab-help@lists.fi-ware.org
        https://lists.fi-ware.org/listinfo/fiware-lab-help

        frb Francisco Romero made changes -
        Resolution: Done [ 10000 ]
        Status: Answered [ 10104 ] → Closed [ 6 ]
        frb Francisco Romero made changes -
        Summary: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2 → FIWARE.Request.Lab.Data.BigData-Analysis.HiveServer2Error
        fw.ext.user FW External User added a comment -

        Hi Francisco,
        right now I am facing a connection issue with HiveServer2.
        Is the Hive server down?
        Regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
        dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
        www.eng.it


        Show
        fw.ext.user FW External User added a comment -

        Hi Francisco,
        I am currently facing a connection issue with hive2. Is Hive Server down?
        Regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
        dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
        www.eng.it
        fw.ext.user FW External User added a comment -

        Hi Francisco,
        first of all, thanks for your reply.
        The problem was not in our connection string; it arises because, with hive2, the query response time has increased significantly.
        I solved the fatal error in my application, but the performance is not yet acceptable. For example, the response time for a simple query is now 40 seconds, while before the upgrade to hive2 it was only 5 seconds.
        Could you verify why the performance has decreased?
        Best regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
        dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
        www.eng.it
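
        (A minimal timing sketch, not part of the original exchange: HIVE_HOST, some_table, USER and PASSWORD are placeholders. It only illustrates how the 5-second vs. 40-second round-trip times reported above could be measured consistently from the same kind of JDBC client.)

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public final class HiveQueryTimer {
            public static void main(String[] args) throws Exception {
                // placeholder values: replace with the real HiveServer2 host, database and credentials
                String url = "jdbc:hive2://HIVE_HOST:10000/default";
                Class.forName("org.apache.hive.jdbc.HiveDriver");

                try (Connection con = DriverManager.getConnection(url, "USER", "PASSWORD");
                     Statement stmt = con.createStatement()) {
                    long start = System.nanoTime();
                    try (ResultSet res = stmt.executeQuery("select * from some_table limit 10")) {
                        while (res.next()) {
                            // drain the result set so the full round trip is measured
                        }
                    }
                    long elapsedMs = (System.nanoTime() - start) / 1000000L;
                    System.out.println("Query round trip: " + elapsedMs + " ms");
                }
            }
        }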

        ----Messaggio originale----
        Da: FRANCISCO ROMERO BUENO francisco.romerobueno@telefonica.com
        Inviato: mercoledì 26 agosto 2015 13:02
        A: Pellegrino Dario; agalani@unipi.gr; fiware-lab-help@lists.fi-ware.org
        Cc: Leandro Lombardo; Massimiliano Nigrelli; Luigi Briguglio; Pasquale Andriani
        Oggetto: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Dear Dario,

        I will need to know the code around this trace:
        at eu.finesce.emarketplace.client.HiveClient.getHiveConnection(HiveClient.java:102)

        It should be something similar to/inspired by the attached code that is working for me:

        Connecting to jdbc:hive2://130.206.80.46:10000/frb?user=frb&password=XXXXXXXXXX
        remotehive> select * from frb_one;
        1431949600,2015-05-18T11:46:40.171Z,Room1,Room,temperature,centigrade,26.5
        1431949749,2015-05-18T11:49:09.506Z,Room1,Room,temperature,centigrade,26.5
        1432014378,2015-05-19T05:46:18.361Z,Room1,Room,temperature,centigrade,26.5
        1432221197,2015-05-21T15:13:17.979Z,Room1,Room,temperature,centigrade,26.5
        remotehive>

        Regarding beeline, the connection must be done without the comma character:

        $ beeline
        Beeline version 0.13.0 by Apache Hive
        beeline> !connect jdbc:hive2://localhost:10000 frb XXXXXXXX org.apache.hive.jdbc.HiveDriver
        Connecting to jdbc:hive2://localhost:10000
        Connected to: Apache Hive (version 0.13.0)
        Driver: Hive JDBC (version 0.13.0)
        Transaction isolation: TRANSACTION_REPEATABLE_READ
        0: jdbc:hive2://localhost:10000> select * from frb.frb_one;
        4 rows selected (0.368 seconds)
        0: jdbc:hive2://localhost:10000>
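
        (The same rule applies when the URL is built programmatically; a brief sketch, with HIVE_HOST and the credentials as placeholders, of a well-formed jdbc:hive2 URL with no quote or comma characters appended:)

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.SQLException;

        public final class Hive2UrlExample {
            // placeholder host; the point is only the shape of the URL
            private static final String URL = "jdbc:hive2://HIVE_HOST:10000/default";

            public static Connection connect(String user, String password)
                    throws ClassNotFoundException, SQLException {
                Class.forName("org.apache.hive.jdbc.HiveDriver");
                // note: no trailing quote or comma characters in the URL, and the
                // user/password are passed as separate arguments, not appended to it
                return DriverManager.getConnection(URL, user, password);
            }
        }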

        Attached code:

        package com.telefonica.iot.hivebasicclient;

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStreamReader;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;

        /**
         * @author Francisco Romero Bueno frb@tid.es
         *
         * Basic remote client for Hive mimicking the native Hive CLI behaviour.
         *
         * Can be used as the base for more complex clients, interactive or not interactive.
         */
        public final class HiveBasicClient {
            // JDBC driver required for Hive connections
            private static final String DRIVERNAME = "org.apache.hive.jdbc.HiveDriver";
            private static Connection con;

            /**
             * Constructor.
             */
            private HiveBasicClient() {
            } // HiveBasicClient

            /**
             * @param hiveServer
             * @param hivePort
             * @param dbName
             * @param hadoopUser
             * @param hadoopPassword
             * @return
             */
            private static Connection getConnection(String hiveServer, String hivePort, String dbName,
                    String hadoopUser, String hadoopPassword) {
                try {
                    // dynamically load the Hive JDBC driver
                    Class.forName(DRIVERNAME);
                } catch (ClassNotFoundException e) {
                    System.out.println(e.getMessage());
                    return null;
                } // try catch

                try {
                    System.out.println("Connecting to jdbc:hive2://" + hiveServer + ":" + hivePort + "/" + dbName
                            + "?user=" + hadoopUser + "&password=XXXXXXXXXX");
                    // return a connection based on the Hive JDBC driver
                    return DriverManager.getConnection("jdbc:hive2://" + hiveServer + ":" + hivePort + "/" + dbName,
                            hadoopUser, hadoopPassword);
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                    return null;
                } // try catch
            } // getConnection

            /**
             * @param query
             */
            private static void doExecute(String query) {
                try {
                    // from here on, everything is SQL!
                    Statement stmt = con.createStatement();
                    ResultSet res = stmt.executeQuery(query);

                    // iterate on the result
                    while (res.next()) {
                        String s = "";

                        for (int i = 1; i < res.getMetaData().getColumnCount(); i++) {
                            s += res.getString(i) + ",";
                        } // for

                        s += res.getString(res.getMetaData().getColumnCount());
                        System.out.println(s);
                    } // while

                    // close everything
                    res.close();
                    stmt.close();
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                } // try catch
            } // doExecute

            /**
             * @param query
             */
            private static void doUpdate(String query) {
                try {
                    // from here on, everything is SQL!
                    Statement stmt = con.createStatement();
                    stmt.executeUpdate(query);

                    // close everything
                    stmt.close();
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                } // try catch
            } // doUpdate

            /**
             * @param args
             */
            public static void main(String[] args) {
                // get the arguments
                String hiveServer = args[0];
                String hivePort = args[1];
                String dbName = args[2];
                String cosmosUser = args[3];
                String cosmosPassword = args[4];

                // get a connection to the Hive server running on the specified IP address, listening on 10000/TCP port
                // authenticate using my credentials
                con = getConnection(hiveServer, hivePort, dbName, cosmosUser, cosmosPassword);

                if (con == null) {
                    System.out.println("Could not connect to the Hive server!");
                    System.exit(-1);
                } // if

                // add JSON serde
                doUpdate("add JAR /usr/local/apache-hive-0.13.0-bin/lib/json-serde-1.3.1-SNAPSHOT-jar-with-dependencies.jar");

                // use the database
                doUpdate("use " + dbName);

                while (true) {
                    // prompt the user for a set of HiveQL sentences (';' separated)
                    System.out.print("remotehive> ");

                    // open the standard input
                    BufferedReader br = new BufferedReader(new InputStreamReader(System.in));

                    // read the HiveQL sentences from the standard input
                    String hiveqlSentences = null;

                    try {
                        hiveqlSentences = br.readLine();
                    } catch (IOException e) {
                        System.out.println("IO error trying to read a HiveQL query: " + e.getMessage());
                        System.exit(1);
                    } // try catch

                    if (hiveqlSentences != null) {
                        // get all the queries within the input HiveQL sentences
                        String[] queries = hiveqlSentences.split(";");

                        // for each query, execute it
                        for (String querie : queries) {
                            doExecute(querie);
                        } // for
                    } // if
                } // while
            } // main
        } // HiveClientTest

        Best regards,
        Francisco

        ________________________________________
        De: Pellegrino Dario <dario.pellegrino@eng.it>
        Enviado: miércoles, 26 de agosto de 2015 12:09
        Para: agalani@unipi.gr; fiware-lab-help@lists.fi-ware.org
        Cc: Leandro Lombardo; FRANCISCO ROMERO BUENO; Massimiliano Nigrelli; Luigi Briguglio; Pasquale Andriani
        Asunto: R: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi Aristi,
        I have done some further tests. Could you please forward the information below to second-level support?

        1) TOMCAT Log
        Aug 26, 2015 11:29:43 AM eu.finesce.emarketplace.client.HiveClient getHiveConnection
        SEVERE: HIVE Connection Error
        java.sql.SQLException: Could not open connection to jdbc:hive2://130.206.80.46:10000: java.net.SocketException: Connection reset
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206)
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:178)
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at eu.finesce.emarketplace.client.HiveClient.getHiveConnection(HiveClient.java:102)
        at eu.finesce.emarketplace.client.HiveClient.getloadpredictionBySector(HiveClient.java:1017)
        at eu.finesce.emarketplace.RestHive2Cosmos.getLoadPredictionbySector(RestHive2Cosmos.java:251)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:151)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:171)
        at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:195)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:104)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:402)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:349)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:106)
        at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:259)
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
        at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:318)
        at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:236)
        at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1010)
        at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:373)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:382)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:345)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:220)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:224)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:169)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
        at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:987)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:579)
        at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:307)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
        Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:288)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:203)
        ... 48 more
        Caused by: java.net.SocketException: Connection reset
        at java.net.SocketInputStream.read(SocketInputStream.java:196)
        at java.net.SocketInputStream.read(SocketInputStream.java:122)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
        ... 53 more

        2) BEELINE TEST
        I tried to use the JDBC hive2 connection in the "Beeline Hive Client" on COSMOS:

        -bash-4.1$ beeline
        Beeline version 0.13.0 by Apache Hive
        beeline> !connect jdbc:hive2://130.206.80.46:10000", "FINESCE-WP4", "password"
        scan complete in 24ms
        Connecting to jdbc:hive2://130.206.80.46:10000",
        Error: Invalid URL: jdbc:hive2://130.206.80.46:10000", (state=08S01,code=0)

        Best regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo Tel. +39-091.7511847 Mob. +39-346.5325257 www.eng.it

        ----Messaggio originale----
        Da: Aristi Galani agalani@unipi.gr
        Inviato: martedì 25 agosto 2015 18:29
        A: Pellegrino Dario
        Cc: fiware-lab-help@lists.fi-ware.org; Leandro Lombardo; FRANCISCO ROMERO BUENO (francisco.romerobueno@telefonica.com); Massimiliano Nigrelli; Luigi Briguglio; SERGIO GARCIA GOMEZ (sergio.garciagomez@telefonica.com); Pasquale Andriani
        Oggetto: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Dear Dario,

        We forwarded your request to second level support.

        Kind regards
        IWAVE team, on behalf of helpdesk team

        ________________________________

        Este mensaje y sus adjuntos se dirigen exclusivamente a su destinatario, puede contener información privilegiada o confidencial y es para uso exclusivo de la persona o entidad de destino. Si no es usted. el destinatario indicado, queda notificado de que la lectura, utilización, divulgación y/o copia sin autorización puede estar prohibida en virtud de la legislación vigente. Si ha recibido este mensaje por error, le rogamos que nos lo comunique inmediatamente por esta misma vía y proceda a su destrucción.

        The information contained in this transmission is privileged and confidential information intended only for the use of the individual or entity named above. If the reader of this message is not the intended recipient, you are hereby notified that any dissemination, distribution or copying of this communication is strictly prohibited. If you have received this transmission in error, do not read it. Please immediately reply to the sender that you have received this communication in error and then delete it.

        Esta mensagem e seus anexos se dirigem exclusivamente ao seu destinatário, pode conter informação privilegiada ou confidencial e é para uso exclusivo da pessoa ou entidade de destino. Se não é vossa senhoria o destinatário indicado, fica notificado de que a leitura, utilização, divulgação e/ou cópia sem autorização pode estar proibida em virtude da legislação vigente. Se recebeu esta mensagem por erro, rogamos-lhe que nos o comunique imediatamente por esta mesma via e proceda a sua destruição
        _______________________________________________
        Fiware-lab-help mailing list
        Fiware-lab-help@lists.fi-ware.org
        https://lists.fi-ware.org/listinfo/fiware-lab-help

        frb Francisco Romero made changes -
        Status In Progress [ 3 ] Answered [ 10104 ]
        fw.ext.user FW External User added a comment -

        Dear Dario,

        I will need to know the code around this trace:
        at eu.finesce.emarketplace.client.HiveClient.getHiveConnection(HiveClient.java:102)

        It should be something similar to/inspired by the attached code that is working for me:

        Connecting to jdbc:hive2://130.206.80.46:10000/frb?user=frb&password=XXXXXXXXXX
        remotehive> select * from frb_one;
        1431949600,2015-05-18T11:46:40.171Z,Room1,Room,temperature,centigrade,26.5
        1431949749,2015-05-18T11:49:09.506Z,Room1,Room,temperature,centigrade,26.5
        1432014378,2015-05-19T05:46:18.361Z,Room1,Room,temperature,centigrade,26.5
        1432221197,2015-05-21T15:13:17.979Z,Room1,Room,temperature,centigrade,26.5
        remotehive>

        Regarding beeline, the connection must be done without the comma character:

        $ beeline
        Beeline version 0.13.0 by Apache Hive
        beeline> !connect jdbc:hive2://localhost:10000 frb XXXXXXXX org.apache.hive.jdbc.HiveDriver
        Connecting to jdbc:hive2://localhost:10000
        Connected to: Apache Hive (version 0.13.0)
        Driver: Hive JDBC (version 0.13.0)
        Transaction isolation: TRANSACTION_REPEATABLE_READ
        0: jdbc:hive2://localhost:10000> select * from frb.frb_one;
        4 rows selected (0.368 seconds)
        0: jdbc:hive2://localhost:10000>

        Attached code:

        package com.telefonica.iot.hivebasicclient;

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStreamReader;
        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;

        /**
         * @author Francisco Romero Bueno frb@tid.es
         *
         * Basic remote client for Hive mimicking the native Hive CLI behaviour.
         *
         * Can be used as the base for more complex clients, interactive or not interactive.
         */
        public final class HiveBasicClient {
            // JDBC driver required for Hive connections
            private static final String DRIVERNAME = "org.apache.hive.jdbc.HiveDriver";
            private static Connection con;

            /**
             * Constructor.
             */
            private HiveBasicClient() {
            } // HiveBasicClient

            /**
             * @param hiveServer
             * @param hivePort
             * @param dbName
             * @param hadoopUser
             * @param hadoopPassword
             * @return
             */
            private static Connection getConnection(String hiveServer, String hivePort, String dbName,
                    String hadoopUser, String hadoopPassword) {
                try {
                    // dynamically load the Hive JDBC driver
                    Class.forName(DRIVERNAME);
                } catch (ClassNotFoundException e) {
                    System.out.println(e.getMessage());
                    return null;
                } // try catch

                try {
                    System.out.println("Connecting to jdbc:hive2://" + hiveServer + ":" + hivePort + "/" + dbName
                            + "?user=" + hadoopUser + "&password=XXXXXXXXXX");
                    // return a connection based on the Hive JDBC driver
                    return DriverManager.getConnection("jdbc:hive2://" + hiveServer + ":" + hivePort + "/" + dbName,
                            hadoopUser, hadoopPassword);
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                    return null;
                } // try catch
            } // getConnection

            /**
             * @param query
             */
            private static void doExecute(String query) {
                try {
                    // from here on, everything is SQL!
                    Statement stmt = con.createStatement();
                    ResultSet res = stmt.executeQuery(query);

                    // iterate on the result
                    while (res.next()) {
                        String s = "";

                        for (int i = 1; i < res.getMetaData().getColumnCount(); i++) {
                            s += res.getString(i) + ",";
                        } // for

                        s += res.getString(res.getMetaData().getColumnCount());
                        System.out.println(s);
                    } // while

                    // close everything
                    res.close();
                    stmt.close();
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                } // try catch
            } // doExecute

            /**
             * @param query
             */
            private static void doUpdate(String query) {
                try {
                    // from here on, everything is SQL!
                    Statement stmt = con.createStatement();
                    stmt.executeUpdate(query);

                    // close everything
                    stmt.close();
                } catch (SQLException e) {
                    System.out.println(e.getMessage());
                } // try catch
            } // doUpdate

            /**
             * @param args
             */
            public static void main(String[] args) {
                // get the arguments
                String hiveServer = args[0];
                String hivePort = args[1];
                String dbName = args[2];
                String cosmosUser = args[3];
                String cosmosPassword = args[4];

                // get a connection to the Hive server running on the specified IP address, listening on 10000/TCP port
                // authenticate using my credentials
                con = getConnection(hiveServer, hivePort, dbName, cosmosUser, cosmosPassword);

                if (con == null) {
                    System.out.println("Could not connect to the Hive server!");
                    System.exit(-1);
                } // if

                // add JSON serde
                doUpdate("add JAR /usr/local/apache-hive-0.13.0-bin/lib/json-serde-1.3.1-SNAPSHOT-jar-with-dependencies.jar");

                // use the database
                doUpdate("use " + dbName);

                while (true) {
                    // prompt the user for a set of HiveQL sentences (';' separated)
                    System.out.print("remotehive> ");

                    // open the standard input
                    BufferedReader br = new BufferedReader(new InputStreamReader(System.in));

                    // read the HiveQL sentences from the standard input
                    String hiveqlSentences = null;

                    try {
                        hiveqlSentences = br.readLine();
                    } catch (IOException e) {
                        System.out.println("IO error trying to read a HiveQL query: " + e.getMessage());
                        System.exit(1);
                    } // try catch

                    if (hiveqlSentences != null) {
                        // get all the queries within the input HiveQL sentences
                        String[] queries = hiveqlSentences.split(";");

                        // for each query, execute it
                        for (String querie : queries) {
                            doExecute(querie);
                        } // for
                    } // if
                } // while
            } // main
        } // HiveClientTest
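
        (For comparison only, a hedged sketch of the kind of code requested above; HiveClientSketch, countRows and the placeholder URL/credentials are illustrative, not taken from the eu.finesce.emarketplace sources. It follows the same connection pattern but uses try-with-resources so the connection, statement and result set are each closed exactly once per call:)

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Statement;

        public final class HiveClientSketch {

            private final String url;      // e.g. "jdbc:hive2://HIVE_HOST:10000/default" (placeholder host)
            private final String user;
            private final String password;

            public HiveClientSketch(String url, String user, String password) throws ClassNotFoundException {
                Class.forName("org.apache.hive.jdbc.HiveDriver");
                this.url = url;
                this.user = user;
                this.password = password;
            }

            // one connection per request; try-with-resources guarantees a single close per object
            public int countRows(String table) throws SQLException {
                try (Connection con = DriverManager.getConnection(url, user, password);
                     Statement stmt = con.createStatement();
                     ResultSet res = stmt.executeQuery("select * from " + table)) {
                    int rows = 0;
                    while (res.next()) {
                        rows++;
                    }
                    return rows;
                }
            }
        }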

        Best regards,
        Francisco

        ________________________________________
        De: Pellegrino Dario <dario.pellegrino@eng.it>
        Enviado: miércoles, 26 de agosto de 2015 12:09
        Para: agalani@unipi.gr; fiware-lab-help@lists.fi-ware.org
        Cc: Leandro Lombardo; FRANCISCO ROMERO BUENO; Massimiliano Nigrelli; Luigi Briguglio; Pasquale Andriani
        Asunto: R: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Hi Aristi,
        I have done some further tests. Could you please forward the information below to second-level support?

        1) TOMCAT Log
        Aug 26, 2015 11:29:43 AM eu.finesce.emarketplace.client.HiveClient getHiveConnection
        SEVERE: HIVE Connection Error
        java.sql.SQLException: Could not open connection to jdbc:hive2://130.206.80.46:10000: java.net.SocketException: Connection reset
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206)
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:178)
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at eu.finesce.emarketplace.client.HiveClient.getHiveConnection(HiveClient.java:102)
        at eu.finesce.emarketplace.client.HiveClient.getloadpredictionBySector(HiveClient.java:1017)
        at eu.finesce.emarketplace.RestHive2Cosmos.getLoadPredictionbySector(RestHive2Cosmos.java:251)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:151)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:171)
        at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:195)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:104)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:402)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:349)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:106)
        at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:259)
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
        at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:318)
        at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:236)
        at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1010)
        at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:373)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:382)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:345)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:220)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:224)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:169)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
        at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:987)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:579)
        at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:307)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
        Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:288)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:203)
        ... 48 more
        Caused by: java.net.SocketException: Connection reset
        at java.net.SocketInputStream.read(SocketInputStream.java:196)
        at java.net.SocketInputStream.read(SocketInputStream.java:122)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
        ... 53 more
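
        The "Connection reset" above is raised while the driver is still opening the Thrift/SASL transport, i.e. before any HiveQL is sent. A minimal standalone check along the following lines (a sketch assuming the same endpoint and placeholder credentials, not the project's actual HiveClient code) exercises only the connection step:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.SQLException;

        public final class HiveConnectionCheck {
            public static void main(String[] args) throws ClassNotFoundException {
                // Placeholder endpoint and credentials, for illustration only
                String url = "jdbc:hive2://130.206.80.46:10000/default";
                String user = "FINESCE-WP4";
                String password = "XXXXXXXX";
                // Load the HiveServer2 JDBC driver explicitly
                Class.forName("org.apache.hive.jdbc.HiveDriver");
                try (Connection con = DriverManager.getConnection(url, user, password)) {
                    // Reaching this point means the Thrift transport and SASL handshake succeeded
                    System.out.println("Connected, closed=" + con.isClosed());
                } catch (SQLException e) {
                    // A "Connection reset" here means the socket was dropped during the
                    // handshake, before any query was submitted
                    System.out.println("Connection failed: " + e.getMessage());
                }
            }
        }

        If this isolated check also fails with a connection reset, the problem most likely lies with the HiveServer2 endpoint or the network path rather than with the application code.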

        2) BEELINE TEST
        I tried to use the JDBC HiveServer2 connection from the Beeline Hive client on COSMOS:

        -bash-4.1$ beeline
        Beeline version 0.13.0 by Apache Hive
        beeline> !connect jdbc:hive2://130.206.80.46:10000", "FINESCE-WP4", "password"
        scan complete in 24ms
        Connecting to jdbc:hive2://130.206.80.46:10000",
        Error: Invalid URL: jdbc:hive2://130.206.80.46:10000", (state=08S01,code=0)
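
        The "Invalid URL" error comes from the stray double quote and comma that end up inside the URL string: beeline's !connect expects the URL, user, password and (optionally) driver class as whitespace-separated arguments. A corrected invocation would look roughly as follows (same host and credentials assumed):

        beeline> !connect jdbc:hive2://130.206.80.46:10000 FINESCE-WP4 password org.apache.hive.jdbc.HiveDriver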

        Best regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
        dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
        www.eng.it

        ----Original message----
        From: Aristi Galani agalani@unipi.gr
        Sent: Tuesday, 25 August 2015, 18:29
        To: Pellegrino Dario
        Cc: fiware-lab-help@lists.fi-ware.org; Leandro Lombardo; FRANCISCO ROMERO BUENO (francisco.romerobueno@telefonica.com); Massimiliano Nigrelli; Luigi Briguglio; SERGIO GARCIA GOMEZ (sergio.garciagomez@telefonica.com); Pasquale Andriani
        Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Dear Dario,

        We forwarded your request to second level support.

        Kind regards
        IWAVE team, on behalf of helpdesk team

        _______________________________________________
        Fiware-lab-help mailing list
        Fiware-lab-help@lists.fi-ware.org
        https://lists.fi-ware.org/listinfo/fiware-lab-help

        frb Francisco Romero made changes -
        Status Open [ 1 ] In Progress [ 3 ]
        fw.ext.user FW External User added a comment -

        Hi Aristi,
        I have run some further tests. Could you please forward the information below to second-level support?

        1) TOMCAT Log
        Aug 26, 2015 11:29:43 AM eu.finesce.emarketplace.client.HiveClient getHiveConnection
        SEVERE: HIVE Connection Error
        java.sql.SQLException: Could not open connection to jdbc:hive2://130.206.80.46:10000: java.net.SocketException: Connection reset
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206)
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:178)
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at eu.finesce.emarketplace.client.HiveClient.getHiveConnection(HiveClient.java:102)
        at eu.finesce.emarketplace.client.HiveClient.getloadpredictionBySector(HiveClient.java:1017)
        at eu.finesce.emarketplace.RestHive2Cosmos.getLoadPredictionbySector(RestHive2Cosmos.java:251)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:151)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:171)
        at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:195)
        at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:104)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:402)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:349)
        at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:106)
        at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:259)
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
        at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
        at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
        at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:318)
        at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:236)
        at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1010)
        at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:373)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:382)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:345)
        at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:220)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:224)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:169)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
        at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:987)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:579)
        at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:307)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
        Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:288)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:203)
        ... 48 more
        Caused by: java.net.SocketException: Connection reset
        at java.net.SocketInputStream.read(SocketInputStream.java:196)
        at java.net.SocketInputStream.read(SocketInputStream.java:122)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
        ... 53 more

        2) BEELINE TEST
        I tried to use the JDBC HiveServer2 connection from the Beeline Hive client on COSMOS:

        -bash-4.1$ beeline
        Beeline version 0.13.0 by Apache Hive
        beeline> !connect jdbc:hive2://130.206.80.46:10000", "FINESCE-WP4", "password"
        scan complete in 24ms
        Connecting to jdbc:hive2://130.206.80.46:10000",
        Error: Invalid URL: jdbc:hive2://130.206.80.46:10000", (state=08S01,code=0)

        Best regards,
        Dario

        Dario Pellegrino
        Direzione Ricerca e Innovazione - R&D Lab
        dario.pellegrino@eng.it

        Engineering Ingegneria Informatica spa
        Viale Regione Siciliana, 7275 - 90146 Palermo
        Tel. +39-091.7511847
        Mob. +39-346.5325257
        www.eng.it

        ----Original message----
        From: Aristi Galani agalani@unipi.gr
        Sent: Tuesday, 25 August 2015, 18:29
        To: Pellegrino Dario
        Cc: fiware-lab-help@lists.fi-ware.org; Leandro Lombardo; FRANCISCO ROMERO BUENO (francisco.romerobueno@telefonica.com); Massimiliano Nigrelli; Luigi Briguglio; SERGIO GARCIA GOMEZ (sergio.garciagomez@telefonica.com); Pasquale Andriani
        Subject: Re: [Fiware-lab-help] [FINESCE-WP4] COSMOS : Error after upgrading to HiveServer2

        Dear Dario,

        We forwarded your request to second level support.

        Kind regards
        IWAVE team, on behalf of helpdesk team

        _______________________________________________
        Fiware-lab-help mailing list
        Fiware-lab-help@lists.fi-ware.org
        https://lists.fi-ware.org/listinfo/fiware-lab-help

        aristi Aristi Galani (Inactive) added a comment -

        Dear Dario,

        We forwarded your request to second level support.

        Kind regards
        IWAVE team, on behalf of helpdesk team

        _______________________________________________
        Fiware-lab-help mailing list
        Fiware-lab-help@lists.fi-ware.org
        https://lists.fi-ware.org/listinfo/fiware-lab-help

        UPRC PiraeusU Node Helpdesk made changes -
        Assignee Francisco Romero [ frb ]
        UPRC PiraeusU Node Helpdesk made changes -
        Field Original Value New Value
        Component/s FIWARE-LAB-HELP [ 10279 ]
        fw.ext.user FW External User created issue -

          People

          • Assignee: frb Francisco Romero
          • Reporter: fw.ext.user FW External User
          • Votes: 0
          • Watchers: 4

          Dates

          • Created:
          • Updated:
          • Resolved: