blehnen / DotNetWorkQueue

A work queue for .NET with SQL Server, SQLite, Redis and PostgreSQL backends


Issue with Redis on .NET Core

sabatmonk opened this issue · comments

Hi, I've tried to run this on .NET Core (Linux and all) using a Redis Docker container. I can write to the queue (I did have to change your code, since it is SimpleMessage and not SimpleMessage.SimpleMessage), but for some reason my consumer never triggers and never consumes the item.

Producer:

public void Run()
{
    var queueName = "OSO";
    var connectionString = "localhost:6379";
    using (var queueContainer = new QueueContainer<RedisQueueInit>())
    {
        using (var queue = queueContainer.CreateProducer<SimpleMessage>(queueName, connectionString))
        {
            queue.Send(new SimpleMessage { Message = "hello world" });
        }
    }
}

and the consumer:

public void Run()
{
    using (var queueContainer = new QueueContainer<RedisQueueInit>())
    {
        using (var queue = queueContainer.CreateConsumer("OSO", "localhost:6379"))
        {
            queue.Start<SimpleMessage>(HandleMessages);
            Console.WriteLine("Processing messages - press any key to stop");
            Console.ReadKey(true);
        }
    }
}

private void HandleMessages(IReceivedMessage<SimpleMessage> message, IWorkerNotification notifications)
{
    Console.WriteLine(message.Body.Message);
}

Hi,

There are two things to check to start with.

  1. Did the record make it into redis? You can check this using the redis CLI.
  2. The queue.Send command returns the status of the enqueue -

i.e.

var result = queue.Send(new SimpleMessage{Message = "hello world"});

If an exception occurred (e.g. a security issue logging in to redis), it will be on the result object.
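As a rough sketch, checking that result might look like the following; the property names (HasError, SendingException) are how I recall the output message interface and may differ slightly in your version:

var result = queue.Send(new SimpleMessage { Message = "hello world" });
// If the enqueue failed (bad connection string, auth failure, etc.),
// the exception is attached to the result rather than thrown here.
if (result.HasError)
{
    Console.WriteLine(result.SendingException);
}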

I know it did make it to the queue (checked with the CLI). I will have to check #2 when I get back.

One issue is that you probably aren't getting any information from the log, as it isn't configured by default; LibLog is used internally.

You can pretty easily use Serilog and log to the console by doing this:

var log = new LoggerConfiguration()
    .WriteTo.ColoredConsole(outputTemplate: "{Timestamp:HH:mm} [{Level}] ({Name:l}) {Message}{NewLine}{Exception}")
    .CreateLogger();
Log.Logger = log;

This should be done before creating the queue container.

This would require NuGet references to:

Serilog
Serilog.Sinks.Console
Serilog.Sinks.ColoredConsole
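Roughly, the wiring order looks like this (a sketch reusing the consumer code above; the key point is that Log.Logger is assigned before the QueueContainer is created):

// Configure Serilog first so the internal LibLog abstraction can pick it up.
var log = new LoggerConfiguration()
    .WriteTo.ColoredConsole(outputTemplate: "{Timestamp:HH:mm} [{Level}] ({Name:l}) {Message}{NewLine}{Exception}")
    .CreateLogger();
Log.Logger = log;

// Only now create the container; the queue internals will log to the console.
using (var queueContainer = new QueueContainer<RedisQueueInit>())
using (var queue = queueContainer.CreateConsumer("OSO", "localhost:6379"))
{
    queue.Start<SimpleMessage>(HandleMessages);
    Console.ReadKey(true);
}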

[Screenshot: logging output]

@blehnen Random question, but it could be related. Does OP have to check whether the queue exists before creating it?

@zimmermanw84 I've not had a chance to actually run this on Linux, as I mostly do stuff in Windows still. It's on my list of things to try, but the list seems to do nothing but grow.

That being said, the redis transport doesn't require that the queue be created first, as redis itself doesn't work that way. All of the other transports, however, do, and will throw various exceptions if the tables don't exist yet.
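For the transports that do need the schema up front, creation is a separate step before producing or consuming. A minimal sketch for the PostgreSQL transport, assuming the usual creation API shape (the exact type names and overloads may differ between versions):

using (var createContainer = new QueueCreationContainer<PostgreSqlMessageQueueInit>())
{
    // GetQueueCreation returns an object that can test for and create the backing tables.
    using (var createQueue = createContainer.GetQueueCreation<PostgreSqlMessageQueueCreation>(queueName, connectionString))
    {
        if (!createQueue.QueueExists)
        {
            createQueue.CreateQueue(); // creates the queue tables if they are missing
        }
    }
}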

@blehnen I can confirm it works on Mac (with the Redis transport). I also plan on wrapping this in a Linux container, so I'll let you know the outcome. I am familiar with similar lists of my own that only grow lol.

That's interesting, but it makes sense given how redis works. Side note: I was calling createQueue.CreateQueue(); while testing the Postgres transport and was getting a SQL syntax error from the database while trying to create the table. Postgres version: psql (PostgreSQL) 10.5.

@zimmermanw84 Do you happen to know what the exact error message was? We use the psql transport in production, though we might still be using server version 9.X.

@blehnen So I realized why it was failing, and it has nothing to do with this project. I named the queue Primary, which is a reserved keyword in Postgres, so the CREATE TABLE query fails. I updated the name to something else and it works as expected. Sorry for the scare.

@zimmermanw84 Gotcha - thanks for getting back to me. This is the transport we use in production - so I was hopeful that it was just a configuration issue.