Save message to S3 before DeleteMessage
k0kubun opened this issue
This TODO should be fixed before v1: we need to either delay DeleteMessage or perform PutObject earlier.
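To illustrate the ordering fix, here is a minimal sketch in plain Ruby (no AWS SDK). `FakeS3` and `FakeQueue` are hypothetical stand-ins for S3 and SQS; the point is only that DeleteMessage runs after PutObject, so a failed save leaves the message in the queue for retry:

```ruby
class FakeS3
  attr_reader :objects

  def initialize(fail_put: false)
    @objects = {}
    @fail_put = fail_put
  end

  def put_object(key, body)
    raise "PutObject failed" if @fail_put
    @objects[key] = body
  end
end

class FakeQueue
  attr_reader :messages

  def initialize(messages)
    @messages = messages
  end

  def receive_message
    @messages.first
  end

  def delete_message(message)
    @messages.delete(message)
  end
end

# PutObject first; DeleteMessage only once the body is durably stored.
def process(queue, s3)
  message = queue.receive_message
  s3.put_object("messages/#{message[:id]}.json", message[:body])
  queue.delete_message(message) # reached only if PutObject succeeded
end
```

If PutObject raises, the message stays visible in the queue and will be redelivered, so the body is never lost.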
Yeah, I think the best solution is to save the message body just after enqueueing.
SQS is good for queueing and long polling, but not for storing data. Any data should always be stored in a database or S3, in my opinion.
```ruby
queue = Barbeque::JobQueue.find_by!(name: @queue)
message = build_message
response = Barbeque::MessageEnqueuingService.sqs_client.send_message(
  queue_url: queue.queue_url,
  message_body: message.to_json,
)
# Save the message body to S3 right after SendMessage succeeds
Barbeque::ExecutionLog.save_message(response.message_id, message)
response.message_id
```
We can determine the S3 key for message.json before inserting the JobExecution record, since the S3 key consists of the application name, job name, and message_id, all of which are available in the message body.
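The key derivation could be sketched as follows; the exact key layout here is an assumption, but it only uses the three values named above, all of which are known before the JobExecution record exists:

```ruby
# Hypothetical key layout: <application>/<job>/<message_id>/message.json
def s3_key_for(application, job, message_id)
  "#{application}/#{job}/#{message_id}/message.json"
end
```

For example, `s3_key_for("my-app", "NotifyJob", "abc-123")` yields `"my-app/NotifyJob/abc-123/message.json"`.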
My idea above breaks SNS integration, because SNS-delivered messages don't go through /v2/job_executions or MessageEnqueuingService. Thus we cannot add an extra step before ReceiveMessage for those messages.
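One hedged workaround for that gap: save the body idempotently at ReceiveMessage time, before any DeleteMessage. Messages enqueued via MessageEnqueuingService were already saved, so the save is a no-op for them; only SNS-delivered messages actually write. This sketch uses a plain Hash as a hypothetical stand-in for the S3-backed log:

```ruby
# `storage` stands in for S3; keys are message IDs.
def ensure_message_saved(storage, message)
  # ||= makes this idempotent: skip if an enqueue-time save already ran.
  storage[message[:id]] ||= message[:body]
end
```

The idempotence matters because SQS delivery is at-least-once, so the same message can be received (and saved) more than once.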