ooyala / spark-jobserver

REST job server for Spark. Note that this is *not* the mainline open source version. For that, go to https://github.com/spark-jobserver/spark-jobserver. This fork now serves as a semi-private repo for Ooyala.

Get detailed information by jobId

yyuzhong opened this issue · comments

Hi,
I can get detailed information with curl localhost:8090/jobs, which lists info for every job I have run:
[{
  "duration": "45.913 secs",
  "classPath": "org.pvamu.hadoop.image.SparkJobDriver",
  "startTime": "2014-10-20T19:28:15.927-05:00",
  "context": "pixel-job91",
  "status": "FINISHED",
  "jobId": "712c0bc9-d560-419e-89c5-658864246a77"
}, {
  "duration": "43.22 secs",
  "classPath": "org.pvamu.hadoop.image.SparkJobDriver",
  "startTime": "2014-10-24T15:58:12.134-05:00",
  "context": "pixel-job92",
  "status": "FINISHED",
  "jobId": "a394d0ce-f65e-4f55-b836-726597261f6d"
}]
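Until the per-jobId route returns the full record, one client-side workaround is to fetch the whole listing and filter it by jobId. A minimal Python sketch, using the sample response above (re-wrapped as the JSON array the server returns; the jobIds are just the ones from this listing):

```python
import json

# Sample /jobs response, taken from the listing above.
jobs_json = """[
  {"duration": "45.913 secs", "classPath": "org.pvamu.hadoop.image.SparkJobDriver",
   "startTime": "2014-10-20T19:28:15.927-05:00", "context": "pixel-job91",
   "status": "FINISHED", "jobId": "712c0bc9-d560-419e-89c5-658864246a77"},
  {"duration": "43.22 secs", "classPath": "org.pvamu.hadoop.image.SparkJobDriver",
   "startTime": "2014-10-24T15:58:12.134-05:00", "context": "pixel-job92",
   "status": "FINISHED", "jobId": "a394d0ce-f65e-4f55-b836-726597261f6d"}
]"""

def find_job(jobs, job_id):
    """Return the first job record whose jobId matches, else None."""
    return next((j for j in jobs if j.get("jobId") == job_id), None)

job = find_job(json.loads(jobs_json), "a394d0ce-f65e-4f55-b836-726597261f6d")
print(job["duration"])  # -> 43.22 secs
```

This still transfers the full listing over the wire on each call, which is exactly the overhead the question is about; it only avoids searching the output by hand.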

But when I want to query the status of one specific job (polling every 5 s) with curl localhost:8090/jobs/a394d0ce-f65e-4f55-b836-726597261f6d, I only get the status and result:
{
"status": "OK",
"result": "()"
}

So my question is: is there any way to get detailed information by jobId? I want to query the status and duration of a single job by its jobId, but I do not want to fetch all jobs' information and search through it, since that takes more time and returns information about jobs I do not care about.
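For the poll-every-5-seconds use case, the list-and-filter workaround can be wrapped in a loop that stops once the job reaches a terminal state. A sketch under the question's assumptions (server at localhost:8090, 5 s interval; treating FINISHED and ERROR as terminal statuses is an assumption, since only FINISHED appears in the listing above):

```python
import json
import time
import urllib.request

BASE_URL = "http://localhost:8090"  # job server address from the question

def is_terminal(job):
    """A job record is terminal once the server reports FINISHED or ERROR
    (ERROR is an assumed terminal status, not shown in the listing above)."""
    return job is not None and job.get("status") in ("FINISHED", "ERROR")

def poll_job(job_id, interval=5.0, timeout=300.0):
    """Poll the full /jobs listing every `interval` seconds and return the
    matching record (with duration, startTime, etc.) once it is terminal."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        with urllib.request.urlopen(f"{BASE_URL}/jobs") as resp:
            jobs = json.load(resp)
        job = next((j for j in jobs if j.get("jobId") == job_id), None)
        if is_terminal(job):
            return job
        time.sleep(interval)
    raise TimeoutError(f"job {job_id} did not finish within {timeout}s")
```

This is only a stopgap: each tick still downloads every job's record, so a server-side GET /jobs/&lt;jobId&gt; that returns the full record would remain the better fix.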

Thanks!