hbutani / spark-druid-olap

Sparkline BI Accelerator provides fast ad-hoc query capability over Logical Cubes. This has been folded into our SNAP Platform (http://bit.ly/2oBJSpP), an integrated BI platform on Apache Spark.

Home Page: http://sparklinedata.com/

handle generic grouping expressions on a single dimension

hbutani opened this issue

The query below groups on derived expressions of a single date dimension, lineitem.l_commitdate: a yyyymmdd numeric key and a quarter number.

SELECT avg(lineitem.l_extendedprice) AS avg_l_extendedprice_ok,
customers.c_mktsegment AS c_mktsegment,
custnation.cn_name AS cn_name,
custregion.cr_name AS cr_name,
(((year(cast(lineitem.l_commitdate AS timestamp)) * 10000) + (month(cast(lineitem.l_commitdate AS timestamp)) * 100)) + day(cast(lineitem.l_commitdate AS timestamp))) AS md_l_commitdate_ok,
cast((month(cast(lineitem.l_commitdate AS timestamp)) - 1) / 3 + 1 AS BIGINT) AS qr_l_commitdate_ok
FROM (SELECT * FROM lineitemindexed) lineitem
JOIN (SELECT * FROM orders) orders
  ON (lineitem.l_orderkey = orders.o_orderkey)
JOIN (SELECT * FROM customer) customers
  ON (orders.o_custkey = customers.c_custkey)
JOIN (SELECT * FROM custnation) custnation
  ON (customers.c_nationkey = custnation.cn_nationkey)
JOIN (SELECT * FROM custregion) custregion
  ON (custnation.cn_regionkey = custregion.cr_regionkey)
GROUP BY customers.c_mktsegment,
custnation.cn_name,
custregion.cr_name,
(((year(cast(lineitem.l_commitdate AS timestamp)) * 10000) + (month(cast(lineitem.l_commitdate AS timestamp)) * 100)) + day(cast(lineitem.l_commitdate AS timestamp))),
cast((month(cast(lineitem.l_commitdate AS timestamp)) - 1) / 3 + 1 AS BIGINT)
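A minimal sketch of the kind of check this issue asks for, assuming Spark Catalyst expressions: a grouping expression such as the yyyymmdd or quarter expression above can still be treated as a grouping on one dimension if it references exactly one underlying dimension column. The object name GroupingExprUtils, the helper singleDimension, and the dimensionColumns parameter are illustrative assumptions, not code from this repository.

import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, Expression}

object GroupingExprUtils {

  // Returns the single dimension attribute a grouping expression refers to,
  // if it refers to exactly one such attribute; otherwise None.
  // Example: year(cast(l_commitdate AS timestamp)) * 10000 +
  //          month(cast(l_commitdate AS timestamp)) * 100 +
  //          day(cast(l_commitdate AS timestamp))
  // references only l_commitdate, so that one attribute is returned.
  def singleDimension(expr: Expression,
                      dimensionColumns: Set[String]): Option[Attribute] = {
    val refs = expr.collect {
      case a: AttributeReference if dimensionColumns.contains(a.name) => a
    }.distinct
    refs match {
      case Seq(single) => Some(single)
      case _           => None
    }
  }
}

With such a check, both md_l_commitdate_ok and qr_l_commitdate_ok in the query above would resolve to the single dimension l_commitdate, while an expression mixing two dimension columns would not qualify.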