lmb-freiburg / flownet2

FlowNet 2.0: Evolution of Optical Flow Estimation with Deep Networks

Home Page: https://lmb.informatik.uni-freiburg.de/Publications/2017/IMKDB17/

The process of online data augmentation

adhara123007 opened this issue

I wanted to understand the online data augmentation process.
I feed the network inputs in batches of 4, and it applies transformations to them. The augmentation parameters are shown below. Since prob is 1, does that mean the network always applies a transformation, and never lets the original data pass through untransformed?

Also, judging from the code below, uniform_bernoulli seems to behave the same as bernoulli.

translate {
rand_type: "uniform_bernoulli"
exp: false
mean: 0
spread: 0.4
prob: 1.0
}

if (rand_type.compare("uniform_bernoulli") == 0) {
  float tmp1;
  int tmp2;

  // Eddy:
  // modified: a probability value of 0 will always return the default of prob0_value

  tmp2 = 1;
  if (param.prob() > 0.)
    caffe_rng_bernoulli(1, param.prob(), &tmp2);  // Bernoulli gate: transform or not?
  else
    tmp2 = 0;

  if (!tmp2) {
    // Gate off: return prob0_value if it is set, otherwise fall back to 0
    if (!isnan(prob0_value))
      return prob0_value;
    else
      tmp1 = 0;
  } else {
    // Gate on: draw the coefficient uniformly from [mean - spread, mean + spread]
    if (spread > 0.)
      caffe_rng_uniform(1, param.mean() - spread, param.mean() + spread, &tmp1);
    else
      tmp1 = param.mean();
  }

  if (param.exp())
    tmp1 = exp(tmp1);

  rand = static_cast<Randtype>(tmp1);
}
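For reference, the sampling logic above can be reproduced as a small standalone sketch using the C++ standard library instead of caffe's RNG helpers. The names (`Params`, `sample_uniform_bernoulli`) are illustrative, not from the caffe code, and the `prob0_value` early-return is simplified to a plain fallback of 0:

```cpp
// Standalone sketch of the "uniform_bernoulli" sampling logic: a Bernoulli(prob)
// gate decides whether to draw at all; if it fires, the coefficient is drawn
// from Uniform(mean - spread, mean + spread). All names are illustrative.
#include <cassert>
#include <cmath>
#include <random>

struct Params {
    float mean = 0.0f;
    float spread = 0.4f;
    float prob = 1.0f;
    bool exp_ = false;
};

float sample_uniform_bernoulli(const Params& p, std::mt19937& rng) {
    float v;
    std::bernoulli_distribution gate(p.prob);
    if (p.prob > 0.f && gate(rng)) {
        // Gate on: uniform draw around the mean
        if (p.spread > 0.f) {
            std::uniform_real_distribution<float> u(p.mean - p.spread,
                                                    p.mean + p.spread);
            v = u(rng);
        } else {
            v = p.mean;
        }
    } else {
        // Gate off: "no transform" value (the caffe code returns prob0_value here)
        v = 0.0f;
    }
    return p.exp_ ? std::exp(v) : v;
}
```

With `prob: 1.0` the gate always fires, so a translation coefficient is drawn for every sample; the coefficient lands on exactly 0 (no translation) only with essentially zero probability.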

Does it mean that the network always applies a transformation [...] ?

I think so, yes. I suggest testing it, though: just run 10k samples through the augmentation and check how many are the same before and after.
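The suggested check can also be sketched against the sampler alone rather than the full network: draw 10k translation coefficients and count the "identity" draws (a coefficient of exactly 0 means no translation). This is a minimal sketch with illustrative names, not part of the flownet2 code:

```cpp
// Count how many of n draws leave the sample untransformed, given the
// Bernoulli gate probability `prob` and Uniform(mean - spread, mean + spread).
#include <cassert>
#include <random>

int count_identity_draws(int n, float prob, float mean, float spread,
                         unsigned seed) {
    std::mt19937 rng(seed);
    std::bernoulli_distribution gate(prob);
    std::uniform_real_distribution<float> u(mean - spread, mean + spread);
    int identity = 0;
    for (int i = 0; i < n; ++i) {
        // Gate off -> coefficient 0 (no transform); gate on -> uniform draw
        float coeff = (prob > 0.f && gate(rng)) ? u(rng) : 0.0f;
        if (coeff == 0.0f)
            ++identity;
    }
    return identity;
}
```

With `prob = 1.0` essentially no draw is the identity, while with `prob = 0.5` about half the samples pass through untransformed, which matches the reading that `prob: 1.0` means a transformation is always applied.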