ogrodnek / csv-serde

Hive SerDe for CSV


Add support for 0.9.0

StuHorsman-zz opened this issue

Hi Larry,

Can you add support for Hive 0.9.0?

I have tried the SerDe against 0.9.0 and it throws an NPE.

Simple example: load some CSV files into HDFS (staging sketched below) and execute the following script:
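
Roughly, the staging step looks like this (file names are illustrative):

hadoop fs -mkdir /user/oracle/dividends
hadoop fs -put dividends.csv /user/oracle/dividends/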

cat create_dividends_serde.hql

DROP TABLE dividends_serde;
CREATE EXTERNAL TABLE IF NOT EXISTS dividends_serde (
exchange STRING,
symbol STRING,
ymd STRING,
dividend FLOAT)
ROW FORMAT SERDE 'com.bizo.hive.serde.csv.CSVSerde'
WITH SERDEPROPERTIES (
"separatorChar" = ",",
"quoteChar" = "'",
"escapeChar" = ""
)
STORED AS TEXTFILE LOCATION '/user/oracle/dividends';
select * from dividends_serde where symbol = 'ZION';
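
Not shown in the script: the csv-serde jar also needs to be on Hive's classpath, for example via an ADD JAR statement at the top of the script or through HIVE_AUX_JARS_PATH. The path below is illustrative:

ADD JAR /path/to/csv-serde.jar;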

hive -f ./create_dividends_serde.hql

Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties
Hive history file=/tmp/oracle/hive_job_log_oracle_201302131303_356127342.txt
OK
Time taken: 4.553 seconds
OK
Time taken: 0.458 seconds
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201302111400_0008, Tracking URL = http://localhost.localdomain:50030/jobdetails.jsp?jobid=job_201302111400_0008
Kill Command = /usr/lib/hadoop/bin/hadoop job -Dmapred.job.tracker=localhost.localdomain:8021 -kill job_201302111400_0008
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2013-02-13 13:03:21,604 Stage-1 map = 0%, reduce = 0%
2013-02-13 13:03:52,953 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201302111400_0008 with errors
Error during job, obtaining debugging information...
Examining task ID: task_201302111400_0008_m_000002 (and more) from job job_201302111400_0008
Exception in thread "Thread-32" java.lang.NullPointerException
at org.apache.hadoop.hive.shims.Hadoop23Shims.getTaskAttemptLogUrl(Hadoop23Shims.java:44)
at org.apache.hadoop.hive.ql.exec.JobDebugger$TaskInfoGrabber.getTaskInfos(JobDebugger.java:186)
at org.apache.hadoop.hive.ql.exec.JobDebugger$TaskInfoGrabber.run(JobDebugger.java:142)
at java.lang.Thread.run(Thread.java:662)
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

The current version should be compatible... Tested against Hive 0.11.0
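
The NPE in the log is thrown from Hive's JobDebugger (Hadoop23Shims.getTaskAttemptLogUrl) while it gathers debugging information after the job has already failed, so the underlying error is probably in the failed map task's own log (reachable from the Tracking URL). As a quick check that exercises the SerDe without launching a MapReduce job at all, a plain SELECT with no WHERE clause runs as a fetch task, something like:

SELECT * FROM dividends_serde LIMIT 10;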