Created question in FIWARE Q/A platform on 31-05-2016 at 15:05
Please, ANSWER this question AT http://stackoverflow.com/questions/37546569/unable-to-use-custom-mapreduce-jar-files-in-cosmos
Question:
Unable to use custom MapReduce jar files in Cosmos
Description:
I created my own MapReduce jar file and tested it successfully on Cosmos' old Hadoop cluster using the HDFS shell commands. The next step was to test the same jar on the new cluster, so I uploaded it to the new cluster's HDFS, into my home folder (user/my.username).
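For reference, the upload step above can be sketched with the WebHDFS/HttpFS REST interface that the new Cosmos cluster exposes. This is a hypothetical sketch: the storage endpoint `storage.cosmos.lab.fiware.org:14000` and the target path are assumptions based on the usual FIWARE Cosmos setup, and the token placeholder must be replaced with a real OAuth2 token.

```shell
# Hypothetical sketch: upload dt.jar into the HDFS home folder via HttpFS
# (op=CREATE with data=true writes the file in a single request).
# Endpoint and path are assumptions; adjust to your actual cluster settings.
curl -X PUT -T dt.jar \
  "http://storage.cosmos.lab.fiware.org:14000/webhdfs/v1/user/my.username/dt.jar?op=CREATE&data=true" \
  -H "Content-Type: application/octet-stream" \
  -H "X-Auth-Token: xxxxxxxxxxxxxxxxxxx"
```

Listing the folder afterwards (`op=LISTSTATUS` on the same path) is a quick way to confirm the jar actually landed in the home directory before submitting the job.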
When I try to start a MapReduce job using the curl POST below,
curl -X POST "http://computing.cosmos.lab.fiware.org:12000/tidoop/v1/user/my.username/jobs" -d '
{"jar":"dt.jar","class_name":"DistanceTest","lib_jars":"dt.jar","input":"input","output":"output"}
' -H "Content-Type: application/json" -H "X-Auth-Token: xxxxxxxxxxxxxxxxxxx"
I get:
{"success":"false","error":255}
I tried different path values for the jar and I get the same result. Do I have to upload my jar somewhere else, or am I missing some necessary steps?