vlaaad / reveal

Read Eval Visualize Loop for Clojure

Home Page: https://vlaaad.github.io/reveal/


view:table max columns number is restricted to 16

ikappaki opened this issue · comments

Hi,

the maximum number of columns that view:table currently supports is restricted to 16. This limit seems arbitrary; my expectation would be either no maximum at all or a much larger one.

To reproduce

  1. Eval an artificial table with 1 row and 17 columns:
[(apply array-map (interleave (map (comp keyword str) (range 17)) (range)))]
  2. In the Reveal window, right-click on the opening square bracket and select view:table.

The table is displayed as a single column item, instead of a 17 column table:

item
{:0 0 :1 1 :2 2 :3 3 :4 4 :5 5 :6 6 :7 7 :8 8 :9 9 :10 10 :11 11 :12 12 :13 13 :14 14 :15 15 :16 16}

The max columns value is hardcoded in vlaaad.reveal.view/infer-columns:

(defn- infer-columns [sample]
  (and (seq sample)
       (condp every? sample
         (some-fn map? nil?)
         (let [all-keys (mapcat keys sample)
               columns (distinct all-keys)
               column-count (count columns)
               cells (* (count sample) column-count)]
           (when (and (<= (/ cells 2) (count all-keys))
                      (<= 1 column-count 16))
             (for [k columns]
               {:header k :fn #(get % k no-val)})))

         map-entry?
         [{:header 'key :fn key} {:header 'val :fn val}]

         sequential?
         (let [counts (map count sample)
               column-count (apply max counts)
               cell-count (* (count sample) column-count)]
           (when (and (<= (/ cell-count 2) (reduce + counts))
                      (<= 1 column-count 16))
             (for [i (range column-count)]
               {:header i :fn #(nth % i no-val)})))

         nil)))
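To see why the repro above falls back to a single column, here is a minimal sketch of the map branch's two checks applied to the 17-column sample (the local names mirror the listing above; this is not Reveal's public API):

```clojure
;; The 1-row, 17-column value from the repro above.
(def sample
  [(apply array-map (interleave (map (comp keyword str) (range 17)) (range)))])

(let [all-keys     (mapcat keys sample)
      columns      (distinct all-keys)
      column-count (count columns)
      cells        (* (count sample) column-count)]
  ;; The density check passes: 17 cells, 17 present keys.
  (println (<= (/ cells 2) (count all-keys)))   ;; true
  ;; ...but the hardcoded upper bound fails (17 > 16), so infer-columns
  ;; returns nil and the value renders as a single "item" column.
  (println (<= 1 column-count 16)))             ;; false
```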

It is not obvious why a limit is imposed here (perhaps there are performance implications). Possible improvements I can think of:

  1. Remove the max limit.
  2. Double the max limit to 32 (satisfies my use case).
  3. Increase the max limit considerably, just enough to catch abuse cases, e.g. to 1024.
  4. Make it a user option.
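Option 4 could look roughly like the following. Note that `*max-table-columns*` is a hypothetical dynamic var, not an existing Reveal setting; this is a sketch only:

```clojure
;; Hypothetical: a rebindable limit replacing the hardcoded 16.
;; `*max-table-columns*` does not exist in Reveal today.
(def ^:dynamic *max-table-columns* 16)

;; Inside infer-columns, both hardcoded checks would become:
;;   (<= 1 column-count *max-table-columns*)

;; A user could then widen the limit around a call site:
(binding [*max-table-columns* 32]
  ;; ... build the table view here; inference sees the wider limit ...
  *max-table-columns*)
;; => 32
```

A dynamic var keeps the default behavior unchanged while letting callers opt in per scope; an `alter-var-root` or an options map passed to the view would be alternative designs.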

Happy to submit a PR.

Thanks

Hi @ikappaki!

perhaps there are performance implications

Exactly! Column inference is recursive (to a depth of 4), so a worst-case scenario could be very bad (e.g. 16^4 = 65536 columns). Here is an example that already has abysmal performance:

(->> (range 16)
     (map (fn [_]
            (->> (range 16)
                 (map (fn [_]
                        (into {} (map-indexed vector (range 16)))))))))
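To ground the arithmetic behind the worst case: the structure above nests 16 rows of 16 seqs of 16-entry maps, and with inference recursing up to 4 levels deep the column candidates multiply at each level:

```clojure
;; Worst case with a limit of 16 at each of 4 recursion levels.
(* 16 16 16 16) ;; => 65536
;; Bumping the per-level limit to 32 raises this to 32^4 = 1048576,
;; which is why the limit can't grow unboundedly without a cost.
(* 32 32 32 32) ;; => 1048576
```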

That said, I think it should be pretty rare to have such a big nested structure, so it's probably fine to bump the limit to 32. I released a new version (Free: 1.3.272, Pro: 1.3.342) with the bump.