derekkraan / delta_crdt_ex

Use DeltaCrdt to build distributed applications in Elixir

Lost additions?

kevinbader opened this issue

Hi, thanks for the library! This is probably due more to my lack of understanding than to an issue with your code ^^

I've noticed the following:

iex> alias DeltaCrdt.AWLWWMap
iex> {:ok, c1} = DeltaCrdt.start_link(AWLWWMap)
iex> {:ok, c2} = DeltaCrdt.start_link(AWLWWMap)
iex> DeltaCrdt.set_neighbours(c1, [c2])
iex> DeltaCrdt.set_neighbours(c2, [c1])
iex> for id <- 1..100_000, do: DeltaCrdt.mutate(c1, :add, [id, %{}])
iex> DeltaCrdt.read(c1) |> map_size()
18429

I would have expected 100_000. I waited until scheduler utilization was low, but the number didn't change after that.
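(For reference, this is roughly how I checked that the schedulers had gone idle; the :scheduler module ships with OTP 21+, and the one-second sampling window is just what I happened to use.)

iex> :scheduler.utilization(1) |> List.keyfind(:total, 0)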

Then I tried it without setting up a neighbor:

iex> # Recreating c2:
iex> {:ok, c2} = DeltaCrdt.start_link(AWLWWMap)
iex> for id <- 1..100_000, do: DeltaCrdt.mutate(c2, :add, [id, %{}])
iex> DeltaCrdt.read(c2) |> map_size()
100000

That was fast and all elements are there. When connecting this new c2 instance to c1, I expected those 100_000 entries to propagate from c2 to c1, but instead:

iex(26)> DeltaCrdt.read(c2) |> map_size()
100000
iex(27)> DeltaCrdt.read(c1) |> map_size()
18429
iex(28)> DeltaCrdt.set_neighbours c2, [c1]
:ok
iex(29)> DeltaCrdt.set_neighbours c1, [c2]
:ok
iex(30)> DeltaCrdt.read(c1) |> map_size()
17629
iex(31)> DeltaCrdt.read(c2) |> map_size()
98400
iex(32)> DeltaCrdt.read(c2) |> map_size()
97400
...

So the new entries in c2 are actually removed from the map and only the 18429 items in c1 remain.
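For what it's worth, a quick way to distinguish a sync that is merely slow from one that has settled at the wrong size would be to poll the map size until it stops changing between reads (the SyncWatch module and the 500 ms pause below are just illustrative throwaway code, not part of the library):

defmodule SyncWatch do
  # Repeatedly read the CRDT and stop once two consecutive reads report the same size.
  def wait_until_stable(crdt, last \\ nil) do
    size = DeltaCrdt.read(crdt) |> map_size()

    if size == last do
      size
    else
      Process.sleep(500)
      wait_until_stable(crdt, size)
    end
  end
end

SyncWatch.wait_until_stable(c1)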

What's going on here? :)

Hi @derekkraan - sure, see #46

For the first test case I've played around with the sync interval as well as with how long to wait before reading the contents. I seem to get a different number every time, but so far I haven't been lucky enough to hit the full 100_000 :)
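Roughly what that looked like, assuming I'm remembering the option name right (a larger sync_interval passed to start_link, plus an arbitrary sleep before reading):

iex> {:ok, c1} = DeltaCrdt.start_link(AWLWWMap, sync_interval: 1_000)
iex> {:ok, c2} = DeltaCrdt.start_link(AWLWWMap, sync_interval: 1_000)
iex> DeltaCrdt.set_neighbours(c1, [c2])
iex> DeltaCrdt.set_neighbours(c2, [c1])
iex> for id <- 1..100_000, do: DeltaCrdt.mutate(c1, :add, [id, %{}])
iex> Process.sleep(5_000)
iex> DeltaCrdt.read(c1) |> map_size()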

Hi @kevinbader, please let me know if you find any other issues with the library. I'll close this issue for now, but feel free to re-open if you would like to discuss further.