JasperFx / marten

.NET Transactional Document DB and Event Store on PostgreSQL

Home Page: https://martendb.io

Cannot deserialize byte[] when CollectionStorage.AsArray option is set

blusius opened this issue

By default, Newtonsoft.Json serializes byte[] as a base64 string, and both storing and loading documents work. But:

  • when _.UseNewtonsoftForSerialization(collectionStorage: CollectionStorage.AsArray); is configured,
  • then storing works as expected, but loading the document fails with
  •   Newtonsoft.Json.JsonSerializationException : Cannot deserialize the current JSON array (e.g. [1,2,3]) into type 'System.Byte[]' because the type requires a JSON primitive value (e.g. string, number, boolean, null) to deserialize correctly...

Document load can be fixed by changing byte[] to ICollection<byte>. But:

  • when using session.Patch<Document>(document.Id).Set(x => x.DocumentData, new byte[] { 3, 2, 1 });,
  • then the value is stored as a base64 string, and loading the document afterwards fails with
  •   Newtonsoft.Json.JsonSerializationException : Error converting value "AwIB" to type 'System.Collections.Generic.ICollection`1[System.Byte]'.
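As a side note, the "AwIB" in that error message is just the base64 encoding of the patched bytes { 3, 2, 1 }; the default byte[] handling encodes exactly this way, which can be confirmed with the BCL alone:

```csharp
using System;

class Base64Check
{
    static void Main()
    {
        // The base64 text that default serializer settings store for each byte[] value:
        Console.WriteLine(Convert.ToBase64String(new byte[] { 1, 2, 3 })); // AQID
        Console.WriteLine(Convert.ToBase64String(new byte[] { 3, 2, 1 })); // AwIB
    }
}
```

So the patch path writes the base64 (primitive) form while AsArray loading expects a JSON array, which is why the two paths disagree.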

So I see multiple issues here:

  1. A byte[] that was written should be loadable regardless of which collection storage option is set.
  2. A patch should serialize the array the same way session.Store does.

Am I missing something in the configuration? Or is there a workaround I haven't found?
A simple example to reproduce:

using System;
using System.Threading.Tasks;
using Marten;
using Marten.Patching; // Marten v7+; earlier versions expose Patch via a different namespace
using Npgsql;
using Xunit;

public record Document
{
    public string Id { get; init; } = default!;
    public byte[] DocumentData { get; init; } = Array.Empty<byte>();
    //public ICollection<byte> DocumentData { get; init; } = Array.Empty<byte>();
}

[Fact]
public async Task PatchFailure()
{
    var document = new Document() { Id = "111", DocumentData = new byte[] { 1, 2, 3 } };

    var connectionString = "Host=localhost;Database=Demo;Username=postgres";

    // Setup store
    var options = new StoreOptions();
    options.DatabaseSchemaName = "test";
    options.UseNewtonsoftForSerialization(collectionStorage: CollectionStorage.AsArray);
    options.Connection(connectionString);

    // Cleanup
    using var conn = new NpgsqlConnection(connectionString);
    conn.Open();
    using var cleanup = conn.CreateCommand();
    cleanup.CommandText = $"drop schema if exists {options.DatabaseSchemaName} cascade";
    cleanup.ExecuteNonQuery();

    var localStore = new DocumentStore(options);

    // Store
    using var session = localStore.LightweightSession();
    session.Store(document);
    await session.SaveChangesAsync();

    // To check if load fails
    var loadedDocument = await session.LoadAsync<Document>(document.Id);

    // Patch
    session.Patch<Document>(document.Id).Set(x => x.DocumentData, new byte[] { 3, 2, 1 });
    await session.SaveChangesAsync();

    // Assert
    loadedDocument = await session.LoadAsync<Document>(document.Id);
    Assert.NotNull(loadedDocument);
    Assert.Equal(new byte[] { 3, 2, 1 }, loadedDocument.DocumentData);
}

Hey, my consistent advice is to not use Marten documents for storing binary data. I don't think this is likely to be addressed anytime soon, and I'd strongly advise using just plain Postgresql tables for this kind of thing anyway.
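For anyone following that advice, here is a minimal sketch of storing the binary payload in a plain bytea column with Npgsql, next to the Marten documents. The table and column names (test.document_blobs, id, data) are made up for illustration:

```csharp
using Npgsql;

class BlobStorageSketch
{
    // Hypothetical helper: upsert a binary payload keyed by document id
    // into a plain Postgres table instead of embedding it in the JSON document.
    public static void SaveBlob(string connectionString, string id, byte[] data)
    {
        using var conn = new NpgsqlConnection(connectionString);
        conn.Open();

        // One-time setup; in a real app this would live in a migration.
        using (var create = new NpgsqlCommand(
            "create table if not exists test.document_blobs (id text primary key, data bytea)", conn))
        {
            create.ExecuteNonQuery();
        }

        // Upsert the raw bytes; bytea round-trips them exactly,
        // so no JSON serialization is involved at all.
        using var upsert = new NpgsqlCommand(
            "insert into test.document_blobs (id, data) values (@id, @data) " +
            "on conflict (id) do update set data = excluded.data", conn);
        upsert.Parameters.AddWithValue("id", id);
        upsert.Parameters.AddWithValue("data", data);
        upsert.ExecuteNonQuery();
    }
}
```

Reading the blob back is a single select returning byte[] directly from the bytea column.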

I'm converting this to a discussion