Help-Desk / HELP-8989

[fiware-stackoverflow] Unable to use custom Mapreduce jar files in Cosmos

    Details

      Description

      Question created in the FIWARE Q/A platform on 31-05-2016 at 15:05
      Please answer this question at https://stackoverflow.com/questions/37546569/unable-to-use-custom-mapreduce-jar-files-in-cosmos

      Question:
      Unable to use custom Mapreduce jar files in Cosmos

      Description:
      I created my own MapReduce JAR file and tested it successfully on Cosmos' old Hadoop cluster using the HDFS shell commands. The next step was to test the same JAR on the new cluster, so I uploaded it to the new cluster's HDFS, into my home folder (user/my.username).

      When I try to start a MapReduce job using the curl POST below:

      curl -X POST "http://computing.cosmos.lab.fiware.org:12000/tidoop/v1/user/my.username/jobs" \
        -d '{"jar":"dt.jar","class_name":"DistanceTest","lib_jars":"dt.jar","input":"input","output":"output"}' \
        -H "Content-Type: application/json" \
        -H "X-Auth-Token: xxxxxxxxxxxxxxxxxxx"

      I get:

      {"success":"false","error":255}

      I have tried different path values for the jar and get the same result. Do I have to upload my JAR somewhere else, or am I missing a necessary step?
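      The same Tidoop job submission can be assembled from Python instead of curl. This is a minimal sketch based only on the endpoint, headers, and JSON fields shown in the question above; the username and token are placeholders, and the request is built but not sent here (sending it would use `urllib.request.urlopen` with the same URL, headers, and body).

      ```python
      import json

      # Base URL of the Tidoop REST API, as given in the question.
      TIDOOP_BASE = "http://computing.cosmos.lab.fiware.org:12000/tidoop/v1"

      def build_job_request(username, token, jar, class_name, lib_jars,
                            input_dir, output_dir):
          """Assemble URL, headers, and JSON body for a Tidoop MapReduce job POST."""
          url = f"{TIDOOP_BASE}/user/{username}/jobs"
          headers = {
              "Content-Type": "application/json",
              "X-Auth-Token": token,  # placeholder OAuth2 token
          }
          body = json.dumps({
              "jar": jar,
              "class_name": class_name,
              "lib_jars": lib_jars,
              "input": input_dir,
              "output": output_dir,
          })
          return url, headers, body

      # Mirror the curl command from the question with placeholder credentials.
      url, headers, body = build_job_request(
          "my.username", "xxxxxxxxxxxxxxxxxxx",
          jar="dt.jar", class_name="DistanceTest", lib_jars="dt.jar",
          input_dir="input", output_dir="output",
      )
      print(url)
      ```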

        Activity

        Transition: Open → In Progress
          Time in source status: 2h 55m | Executions: 1 | Last executer: Backlog Manager | Last execution: 22/May/17 6:08 PM
        Transition: In Progress → Closed
          Time in source status: 3h | Executions: 1 | Last executer: Backlog Manager | Last execution: 22/May/17 9:08 PM

          People

          • Assignee: Backlog Manager (backlogmanager)
          • Reporter: Backlog Manager (backlogmanager)
          • Votes: 0
          • Watchers: 1
