subzerocloud / subzero-starter-kit

Starter Kit and tooling for authoring GraphQL/REST API backends with subZero

Home Page: https://subzero.cloud

Unable to receive messages using the starter kit

riccardodivirgilio opened this issue · comments

Hi there, I'm testing this starter kit to see if it fits my requirements.

What I want to do is receive a notification over a WebSocket whenever a "todo" changes.
To test that I'm using two very simple Python programs: the first authenticates as alice, connects to /rabbitmq/ws and listens for incoming messages; the second authenticates as alice and sends POST requests to /rest/todos to add new data.
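
Stripped down, the addtodo side is essentially this (simplified sketch; the exact payload and column name may differ in my actual script):

import asyncio
import aiohttp

async def add_todo():
    async with aiohttp.ClientSession() as session:
        # log in as alice; the session keeps the returned JWT cookie
        await session.post('http://localhost:8080/rest/rpc/login',
                           data={'email': 'alice@email.com', 'password': 'pass'})
        # insert a row through the rest api (column name is a guess)
        async with session.post('http://localhost:8080/rest/todos',
                                json={'todo': 'buy milk'}) as resp:
            print(resp.status)

asyncio.get_event_loop().run_until_complete(add_todo())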

The problem is that after running those 2 files

python3 subscribe.py
python3 addtodo.py

I don't receive any messages on the WebSocket connection.
I'm sure I'm authenticated in the subscribe code, because I manually added logging to the nginx Lua code that prints AUTHENTICATED when execution reaches this point:

https://github.com/subzerocloud/subzero-starter-kit/blob/master/openresty/nginx/rabbitmq.conf#L159

I created a gist with the file I was using. Can you please tell me what I'm doing wrong, or whether I need to do something extra in order to receive RabbitMQ messages over the WebSocket connection?

https://gist.github.com/riccardodivirgilio/83025d11f706c546151c819e6640d69a

Thank you very much

Right, I spent some time trying to understand the code and I think I get it now. What I don't understand is how to authenticate against the /rabbitmq/ws endpoint (using the auth backend exposed by PostgREST) and listen for messages from JavaScript/Python.

your api client authenticates first with the rest api and gets back a jwt token.
save it as a cookie.
all the following requests to /rabbitmq/ws will send that cookie along
with your client, send a "login/connect" frame with a dummy username/password (but make sure the cookie is also sent)
the ws backend will inspect that cookie https://github.com/subzerocloud/subzero-starter-kit/blob/master/openresty/nginx/rabbitmq.conf#L143
and turn it into an authorization header https://github.com/subzerocloud/subzero-starter-kit/blob/master/openresty/nginx/rabbitmq.conf#L159 which rabbitmq understands
when the request reaches rabbitmq, it's configured to ask this endpoint https://github.com/subzerocloud/subzero-starter-kit/blob/master/openresty/nginx/rabbitmq.conf#L60 whether the user (sent earlier as the authorization header) is allowed to perform that operation

all of the above is implemented so that once you log in to the rest api, receive the jwt and save it as a cookie, the communication with the ws backend is automatically authenticated based on that cookie
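
just to make the header step concrete: what the lua code builds is plain http basic auth over "user_<id>:<jwt>". in python terms it's roughly this (sketch, placeholder values):

import base64

# sketch of what the openresty/lua layer derives from the SESSIONID cookie:
# username is 'user_' + the user_id claim from the jwt, password is the raw jwt
user_id = 1                          # read from the jwt payload by the lua code
jwt = '<the token from the cookie>'  # placeholder
header = 'Basic ' + base64.b64encode(('user_%s:%s' % (user_id, jwt)).encode()).decode()
# rabbitmq decodes this and forwards user_1 / <jwt> to the auth endpoint above
print(header)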

Right, and I think my Python script is doing all of that. It is correctly authenticating against /rabbitmq/ws; the problem is that once I connect to the endpoint and start listening for messages, I don't receive anything. It is unclear to me what I need to send to that endpoint in order to receive notifications.

does the table (where you make the changes) have the trigger attached (as i linked above) so that each insert/update generates a NOTIFY which is then picked up by pg_amqp_bridge?
have you looked at the pg_amqp_bridge logs to see if it's actually receiving/sending the events?

yes, yes, I'm using docker-compose up. pg_amqp_bridge is sending the events correctly, and I'm able to receive them with a small Python script that connects to RabbitMQ directly, so the message propagation works correctly.

I haven't made any changes to the app yet; I'm trying to listen for events on the todos table, which is already sending events.
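
Stripped down, that direct-connection check looks more or less like this (sketch; adjust the exchange name and the RabbitMQ credentials to whatever docker-compose sets up):

import pika

# connect straight to rabbitmq (not through the websocket/openresty path)
connection = pika.BlockingConnection(pika.ConnectionParameters(
    host='localhost',
    credentials=pika.PlainCredentials('admin', 'adminpass')))
channel = connection.channel()

# bind a throwaway queue to the exchange pg_amqp_bridge publishes to
result = channel.queue_declare(queue='', exclusive=True)
channel.queue_bind(exchange='app_events', queue=result.method.queue, routing_key='#')

def on_message(ch, method, properties, body):
    print(method.routing_key, body)

channel.basic_consume(queue=result.method.queue, on_message_callback=on_message, auto_ack=True)
channel.start_consuming()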

here is some "pseudo code"; i haven't used python much so you'll have to correct the syntax and call the correct functions

#!/usr/bin/env python

import asyncio
import aiohttp


async def hello():

    async with aiohttp.ClientSession() as session:
        # authenticate with the rest api; the response sets the jwt cookie
        async with session.post('http://localhost:8080/rest/rpc/login',
                                data={'email': 'alice@email.com', 'password': 'pass'}) as resp:
            jwt = resp.headers['Set-Cookie'].split('=')[1].split(';')[0]

        print(jwt)
        # store the jwt as a cookie and check it's sent on every following request
        session.cookie_jar.update_cookies({'SESSIONID': jwt})

        # this endpoint is called by rabbitmq, not by the api users
        # async with session.post('http://localhost:8080/rabbitmq/auth/user', data={'username': 'user_1', 'password': jwt}) as resp:
        #     print(resp)
        #     print(await resp.text())

        # not sure if the auth parameter is needed to trigger authentication, but the
        # values don't matter since the login is checked based on the cookie
        async with session.ws_connect('ws://localhost:8080/rabbitmq/ws',
                                      auth=aiohttp.BasicAuth('dummy', 'dummy')) as ws:
            async for msg in ws:
                print(msg)

asyncio.get_event_loop().run_until_complete(hello())

Thanks for the help, but maybe I'm not making myself clear. The Python code works: of the two calls I wrote, the first request to /rest/rpc/login sets a cookie via a Set-Cookie header, and there is no need to save the cookie manually because the aiohttp session object saves it automatically and forwards it on the next request (just like a browser).
I'm sure it is doing that because I manually added some logging to the Lua hooks that prints a message when the JWT token is set and valid, and I can confirm the JWT token is set and valid on the second request. However, no messages are forwarded to the WebSocket session when I run the other Python script that adds a todo using the REST API.

Basically I have changed this code

https://github.com/subzerocloud/subzero-starter-kit/blob/master/openresty/nginx/rabbitmq.conf#L157

        then
            local user_id = jwt_obj.payload.user_id
            ngx.req.set_header('Authorization', 'Basic ' .. ngx.encode_base64('user_' .. user_id .. ':'..token))

            ngx.log(ngx.INFO, 'AUTHENTICATED')

            ngx.log(ngx.INFO, 'Basic ' .. ngx.encode_base64('user_' .. user_id .. ':'..token))
        end

and the docker-compose logs show AUTHENTICATED and the Basic header correctly on the second request.

I'm very grateful for your help and I don't want to waste your time; if you are not familiar with Python, maybe you have some JavaScript code that is able to listen for messages on /rabbitmq/ws?

oh, if AUTHENTICATED is being printed then everything is right.
i looked at the code again and just noticed: you are not doing any "subscribe" call, so that is why you are not getting any messages

basically this part, from an old file

var exchange = 'app_events'
var routing_key = '#'
id = client.subscribe("/exchange/" + exchange + "/" + routing_key, function(message) {
    console.log(message)
    print_first(message.headers.destination.replace('/exchange/app_events/',''))
    print_first(message.body);
});

look in the amqp bridge logs to see the routing key of the events being sent, and also look here https://www.rabbitmq.com/stomp.html for how to form the correct "topic"/destination name
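
with the aiohttp connection from the pseudo code above, sending the CONNECT and SUBSCRIBE frames by hand would be roughly this (sketch; the app_events exchange and the '#' routing key are the ones from the old snippet above):

def stomp_frame(command, headers, body=''):
    # a stomp frame is the command, "key:value" header lines, a blank line,
    # the body and a NUL terminator
    lines = [command] + ['%s:%s' % (k, v) for k, v in headers.items()]
    return '\n'.join(lines) + '\n\n' + body + '\x00'

async def listen(ws):
    # the cookie does the real authentication; login/passcode are dummies
    await ws.send_str(stomp_frame('CONNECT', {
        'accept-version': '1.2',
        'host': '/',
        'login': 'dummy',
        'passcode': 'dummy',
    }))
    # without a SUBSCRIBE frame the server has nothing to deliver
    await ws.send_str(stomp_frame('SUBSCRIBE', {
        'id': 'sub-0',
        'destination': '/exchange/app_events/#',
    }))
    async for msg in ws:
        print(msg.data)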

Thank you so much, I have it working now! For the record, this library https://www.npmjs.com/package/@stomp/stompjs does not work with this setting turned on:

{rabbitmq_web_stomp, [
    {use_http_auth, true}
] }

I needed to turn it off, otherwise the login user was always guest.

if you disabled that, it means you are authenticating to rabbitmq with a rabbitmq user (you probably used admin/adminpass) and not with an "application user".
prior to connecting to the ws endpoint, did you call the login endpoint to "save the cookie" so that when you connect to the ws the cookie is sent? your python code was only missing the "subscribe" call. post the js code that did not work without that setting disabled

No no, I'm authenticating using user_1 and the JWT. I think the auth backend is defined with

{ rabbitmq_auth_backend_http, [
    {http_method, post},
    {user_path,     "${RABBITMQ_AUTH_ENDPOINT}/user"},
    {vhost_path,    "${RABBITMQ_AUTH_ENDPOINT}/vhost"},
    {resource_path, "${RABBITMQ_AUTH_ENDPOINT}/resource"},
    {topic_path,    "${RABBITMQ_AUTH_ENDPOINT}/topic"}
] }

That one is still there. Basically, I have refactored your code into a standalone microservice built from a vanilla OpenResty image that only responds to the RabbitMQ instance; it is available only on the local network and can't be reached from external IPs. It should be more encapsulated and a bit more secure this way, I think.

https://www.rabbitmq.com/web-stomp.html

The use_http_auth option extends the authentication by allowing clients to send the login and passcode in the HTTP authorisation header (using HTTP Basic Auth). If present, these credentials will be used. Otherwise, the default STOMP credentials are used. The credentials found in the CONNECT frame, if any, are ignored.

So basically, by disabling it you can send the connect information using STOMP commands; otherwise you need to send it using HTTP Basic auth.
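
In aiohttp terms (with session and jwt as in the earlier script), the difference between the two modes is roughly this (sketch):

# with use_http_auth enabled (the starter kit setting): the credentials come from
# the http layer, i.e. the jwt cookie that the lua code turns into a Basic
# authorization header, so the values in the STOMP CONNECT frame are ignored
ws = await session.ws_connect('ws://localhost:8080/rabbitmq/ws',
                              auth=aiohttp.BasicAuth('dummy', 'dummy'))

# with use_http_auth disabled (my setup): the handshake carries no credentials,
# so the login/passcode headers of the STOMP CONNECT frame are what rabbitmq
# passes to the auth backend, i.e. user_1 and the jwt
ws = await session.ws_connect('ws://localhost:8080/rabbitmq/ws')
await ws.send_str('CONNECT\naccept-version:1.2\nhost:/\n'
                  'login:user_1\npasscode:' + jwt + '\n\n\x00')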

This is the code I'm using now; it works and receives messages from RabbitMQ (but you need to disable http auth):

https://gist.github.com/riccardodivirgilio/b92c367e790d525d12bb9ecacf856f99