nats-io / nats.net

The official C# Client for NATS

NATS client has a memory leak

sspates opened this issue

Observed behavior

Based on a performance test result from the dotMemory tool, we are observing that the NATS .NET client library has high memory usage when the client does not unsubscribe.

Memory usage resets if the client reconnects.

The service is run for 1 hour under constant load, during which time memory usage progressively increases.

Here is a simplified version of our connection and subscription code:

using NATS.Client;
using NATS.Client.JetStream;

SimplifiedClient client = new("localhost");
client.Connect();
client.Subscribe("foo.bar", "test", (msg) => Console.WriteLine(msg.Subject));
client.SubscribeAsync("foo.baz", "test", (msg) => Console.WriteLine(msg.Subject));

while (true)
{
    await Task.Delay(200);
}

public class SimplifiedClient : IDisposable
{
    private IConnection? _connection;
    private IJetStreamManagement? _jetStreamManagement;
    private IJetStream? _jetStream;
    private readonly string[] _hosts;

    public SimplifiedClient(params string[] hosts)
    {
        _hosts = hosts;
    }

    public void Connect()
    {
        Options? options = ConnectionFactory.GetDefaultOptions();
        options.Servers = _hosts;
        options.MaxReconnect = Options.ReconnectForever;

        ConnectionFactory connectionFactory = new();

        _connection = connectionFactory.CreateConnection(options);

        _jetStreamManagement = _connection.CreateJetStreamManagementContext();

        _jetStream = _connection.CreateJetStreamContext();
    }

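    // Core NATS queue subscription; the returned IAsyncSubscription handle is not retained.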
    public void Subscribe(string subject, string queueName, Action<Msg> action)
    {
        EventHandler<MsgHandlerEventArgs> eventHandler = (_, args) => HandleMessage(args, action);

        _connection!.SubscribeAsync(subject, queueName, eventHandler);
    }

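    // JetStream push subscription bound to a durable consumer and deliver group; the handle is not retained.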
    public void SubscribeAsync(string subject, string queueName, Action<Msg> action)
    {
        EventHandler<MsgHandlerEventArgs> eventHandler = (_, args) => HandleMessage(args, action);

        PushSubscribeOptions options = BuildPushSubscribeOptions(queueName);

        _jetStream!.PushSubscribeAsync(subject, eventHandler, false, options);
    }

    private void HandleMessage(MsgHandlerEventArgs args, Action<Msg> handler)
    {
        try
        {
            handler(args.Message);

            args.Message.Ack();
        }
        catch
        {
            args.Message.Nak();
        }
    }

    private PushSubscribeOptions BuildPushSubscribeOptions(string queueName)
    {
        ConsumerConfiguration consumerConfiguration = ConsumerConfiguration.Builder().Build();

        PushSubscribeOptions.PushSubscribeOptionsBuilder? builder = PushSubscribeOptions.Builder();

        builder
            .WithStream("TestStream")
            .WithConfiguration(consumerConfiguration)
            .WithDurable(queueName)
            .WithDeliverGroup(queueName);

        return builder.Build();
    }

    public void Dispose()
    {
        _connection?.Drain();
        _connection?.Close();
        _connection?.Dispose();
    }
}
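
For reference, the simplified client never keeps the subscription handles it creates. A minimal sketch of an explicit unsubscribe, assuming the legacy NATS.Client API (the _subscriptions field and the revised Dispose are illustrative additions, not part of our actual code):

// Illustrative only: keep the handles returned by the subscribe calls so they
// can be unsubscribed (or drained) before the connection is closed.
private readonly List<IAsyncSubscription> _subscriptions = new();

public void Subscribe(string subject, string queueName, Action<Msg> action)
{
    EventHandler<MsgHandlerEventArgs> eventHandler = (_, args) => HandleMessage(args, action);

    // SubscribeAsync returns an IAsyncSubscription; retain it instead of discarding it.
    _subscriptions.Add(_connection!.SubscribeAsync(subject, queueName, eventHandler));
}

public void Dispose()
{
    foreach (IAsyncSubscription subscription in _subscriptions)
    {
        subscription.Unsubscribe();   // or Drain() to let in-flight messages complete
    }

    _connection?.Drain();
    _connection?.Close();
    _connection?.Dispose();
}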

Expected behavior

Service memory usage remains at a constant level over the hour.

Server and client version

Server: 2.10.9
Library: 1.1.1

Host environment

Windows 10 Enterprise LTSC
x86 CPU (various models)
Minimum 8 GB memory
NATS hosted as a Windows service

Steps to reproduce

See the code in the observed behavior section.

Run for around 1 hour and monitor memory utilization.

I'm unable to reproduce it:

[attached screenshot]

In the original code, is there perhaps an error where subscriptions are repeatedly created in a loop? That is my only speculation at the moment for what might cause an increase in memory allocations.
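
Purely as an illustration of that speculation (this is hypothetical, not taken from the reporter's code), repeatedly subscribing inside the polling loop would keep allocating new subscriptions that are never released:

// Hypothetical anti-pattern: a new subscription is created on every iteration
// and never unsubscribed, so subscriptions and their buffers accumulate.
while (true)
{
    client.Subscribe("foo.bar", "test", msg => Console.WriteLine(msg.Subject));
    await Task.Delay(200);
}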

P.S. I ran the above code alongside nats pub:

nats pub foo.bar 01234567890123456789012345678901 --count 1000000000 --sleep 100ms

We think this is the result of the push consumer getting backed up, with messages piling up in memory. We will make adjustments on our side to handle this.
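
As a rough sketch of the kind of adjustment meant here (the MaxAckPending value and names are illustrative, not our final configuration), the durable consumer can be configured to cap how many unacknowledged messages the server delivers, so a slow handler causes the server to pause rather than the client buffering an unbounded backlog:

// Illustrative only: limit outstanding unacknowledged deliveries for the push consumer.
ConsumerConfiguration consumerConfiguration = ConsumerConfiguration.Builder()
    .WithMaxAckPending(1000)   // hypothetical limit; tune to the handler's throughput
    .Build();

PushSubscribeOptions pushOptions = PushSubscribeOptions.Builder()
    .WithStream("TestStream")
    .WithConfiguration(consumerConfiguration)
    .WithDurable("test")
    .WithDeliverGroup("test")
    .Build();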