akka / akka-persistence-jdbc

Asynchronously writes journal and snapshot entries to configured JDBC databases so that Akka Actors can recover state

Home Page: https://doc.akka.io/docs/akka-persistence-jdbc/


Migration tool - 3.5.x => 5.0.0

octonato opened this issue · comments

The initial goal is to provide a tool based on Flyway. Enno has done some initial experiments and we are confident that it's flexible and powerful enough to run all the migrations we need.

This will be a one-shot tool that executes all the migration steps from 3.5.x to 5.0.0.

We are not using Flyway in order to keep a table of all applied migrations around (users may delete that table if they want); the reason for using it is its migration functions.

The current migrations are:

  • The first migration is the creation of the new tables: journal_messages, tags and snapshots. Users should be able to tweak the name of those tables. This is an existing feature in the plugin and we need to keep it.
    Note: This is for users coming from 3.5.x. New users create the tables by themselves.
  • Migrate snapshots to the new table (i.e. snapshots) and unwrap the payload.
  • Migrate events to the new table (i.e. journal_messages) and unwrap the payload. When migrating the data, the timestamp column must be filled with 0 (the beginning of the epoch).
  • Move tags to their own table (i.e. tags, one-to-many with the events table) and split the content (currently comma-separated values).
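The tag step above is the only purely structural one: the legacy journal stores tags as a single comma-separated column, while the new schema uses a one-to-many tag table. A minimal sketch of that transformation (field and type names here are illustrative stand-ins, not the plugin's actual row classes):

```scala
object TagMigrationSketch {
  // Stand-in for a legacy journal row: its ordering key plus the
  // comma-separated tags column (nullable in the old schema).
  final case class LegacyRow(ordering: Long, tags: Option[String])

  // Stand-in for a row of the new one-to-many tag table.
  final case class TagRow(eventId: Long, tag: String)

  // Split the comma-separated column into one TagRow per tag,
  // dropping blanks and surrounding whitespace.
  def splitTags(row: LegacyRow): Seq[TagRow] =
    row.tags.toSeq
      .flatMap(_.split(","))
      .map(_.trim)
      .filter(_.nonEmpty)
      .map(TagRow(row.ordering, _))
}
```

The event step's "timestamp filled with 0" would just be a constant `0L` written alongside each migrated row; it is omitted here to keep the sketch focused.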

Ideally, we should be able to run the migration tool without adding custom serializers. We should be able to read the byte array, remove the current header, and save only the snapshot/event payload back. This needs to be confirmed, though.

Release 5.0.0 changes the database schema and requires users to either migrate their data or set the configuration to use the old schema.
New applications should use the new schema.
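For reference, staying on the old schema is a configuration choice: if I recall the 5.0.0 migration notes correctly, it is done by pointing the DAO settings at the legacy implementations. Verify the exact class names against the migration guide for your plugin version before relying on this sketch:

```hocon
# Keep using the 3.5.x schema with the 5.x plugin
# (class names taken from the migration guide; double-check for your version).
jdbc-journal {
  dao = "akka.persistence.jdbc.journal.dao.legacy.ByteArrayJournalDao"
}
jdbc-snapshot-store {
  dao = "akka.persistence.jdbc.snapshot.dao.legacy.ByteArraySnapshotDao"
}
jdbc-read-journal {
  dao = "akka.persistence.jdbc.query.dao.legacy.ByteArrayReadJournalDao"
}
```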

@ennru How long will the old schema be viable? I think a migration tool is paramount, and I don't think we should leave it to users to do it themselves.

We agree that a migration tool is important, but we don't have the bandwidth to create it right now. There is value in releasing now, as new projects can benefit from the new schema.

A community contribution of a migration tool would be very welcome.

Hello,

I would like to know how to perform the unwrapping here:

- Migrate snapshots to the new table (i.e. snapshots) and unwrap the payload.
- Migrate events to the new table (i.e. journal_messages) and unwrap the payload. When migrating the data, the timestamp column must be filled with 0 (the beginning of the epoch).

especially when, looking at the new schema, I have to deal with fields like:

  • ser_id
  • ser_manifest

In a nutshell, should I unpack the stored BYTEA using the PersistentMessage proto message defined at https://github.com/akka/akka/blob/146944f99934557eac72e6dc7fa25fc6b2f0f11c/akka-persistence/src/main/protobuf/MessageFormats.proto#L10?

@octonato @patriknw kindly provide some pointers on the above comment. Thanks

@Tochemey, you may not need to deal with the internals of serialization. At least not at that level.

Instead, you can use the existing DAOs to read from one table and write to the other. The DAOs take care of the deserialization and serialization.

For instance, you can use the legacy query DAO to get all messages. It will deserialize for you and return a PersistentRepr.

Then you can use the new DAO to write it back into the new table, and again it will ensure that the payload is serialized the way it should be.

We may need to hack a few things around to get it initialized correctly, but the mechanics for deserializing and serializing are already in place.
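Structurally, the approach described above boils down to a read-everything/write-everything pipeline between the two DAOs. The sketch below uses stand-in traits and a stand-in `Repr` type; the real DAOs live in `akka.persistence.jdbc` with richer, streaming/async signatures, and the real payload type is Akka's `PersistentRepr`:

```scala
object DaoMigrationSketch {
  // Stand-in for akka.persistence.PersistentRepr; illustrative only.
  final case class Repr(persistenceId: String, sequenceNr: Long, payload: Any)

  // Stand-ins for the legacy query DAO and the new journal DAO.
  // The real DAOs handle deserialization/serialization internally,
  // which is exactly why no custom serializer work is needed here.
  trait LegacyReadDao { def allMessages(): Iterator[Repr] }
  trait NewWriteDao   { def write(repr: Repr): Unit }

  // Copy every event from the legacy table to the new one,
  // returning how many events were migrated.
  def migrate(from: LegacyReadDao, to: NewWriteDao): Long = {
    var count = 0L
    from.allMessages().foreach { repr =>
      to.write(repr)
      count += 1
    }
    count
  }
}
```

The actual migrator merged later (see the `migrator` module's tests) follows this read-and-rewrite shape, but with Akka Streams rather than a plain iterator.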

Hi,

We need to migrate an application currently in production for several clients to version 5, since it'd solve some pretty big performance problems we are having. What is missing to complete this tool? Can we help in some way?

It would be great if you can help to revive the work in PR #501.

Hi

What is the state regarding migration? I'm a bit confused because there are different branches (which all seem merged), and the release notes for 5.1.0 state that there is a migration tool. Then there is this issue here, which says "help wanted". However, the current Akka documentation for 5.2.0 states that this "tool doesn't exist yet": https://doc.akka.io/docs/akka-persistence-jdbc/current/migration.html#migrating-to-version-6-0-0

Thanks and regards
Oliver

As far as I understand, it was merged for 5.2.0 (c56753d).

The docs still saying it's not ready may be an oversight. There are also no docs on actually using it, so you'd have to figure that out on your own; the tests probably show how it is intended to be used: https://github.com/akka/akka-persistence-jdbc/tree/master/migrator/src/test/scala/akka/persistence/jdbc/migrator

Great, thanks! I will give it a try then.