How to send bigger payloads?
ITler opened this issue · comments
Disclaimer
Probably not the right place to ask, but I don't know a better one.
What happened
I have a simple client implementation which is supposed to send a bigger payload using the PostMessage function from chat.go.
My message structure looks like this
Of course these are test data, but in reality I would have quite a lot of real data (queried via the GitHub API). When sending the bigger amount of data, I get invalid_blocks returned.
It seems that, due to the number of datasets I want to process, the payload becomes too large.
If so, I would have wished for a more precise error message than invalid_blocks.
Expected behavior
However, more importantly: I would like to send my data.
How can I overcome the limitation I am facing here?
I could send multiple messages (one per repo dataset that I want to send), but that also wouldn't scale if the number of pull requests gets too high.
Steps to reproduce
api := slack.New("YOUR_TOKEN_HERE")
var blocks []slack.Block = blocksComposeMagic() // see block kit link above to see what would get composed here
channelID, timestamp, err := api.PostMessage(
	"CHANNEL_ID",
	slack.MsgOptionText("Some text", false),
	slack.MsgOptionBlocks(blocks...), // MsgOptionBlocks is variadic, so the slice must be expanded
	slack.MsgOptionAsUser(true),
)
if err != nil {
	fmt.Printf("%s\n", err)
	return
}
fmt.Printf("Message successfully sent to channel %s at %s", channelID, timestamp)
reproducible code
manifest.yaml: none
Versions
- Go: 1.20.3
- slack-go/slack: 0.12.2
Not a maintainer, but maybe I can help. This is probably the limitation you are running into: you would hit the 50-block limit pretty quickly. Are users pulling the data, like "Hey chatbot, give me all my PRs", and then they get all their open PRs? Or is this more of a push, like a cron job that executes and then publishes all the results to Slack at once?
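If splitting across messages is acceptable, one way around the 50-block cap is to batch the blocks and post one message per batch. Here is a minimal sketch; the generic chunk helper is something I made up (it is not part of slack-go), and in the real client you would call api.PostMessage once per batch with slack.MsgOptionBlocks(batch...).

```go
package main

import "fmt"

// chunk splits items into consecutive slices of at most size elements.
// Applied to a []slack.Block, each batch stays under the
// 50-blocks-per-message limit.
func chunk[T any](items []T, size int) [][]T {
	var out [][]T
	for size > 0 && len(items) > 0 {
		n := size
		if len(items) < n {
			n = len(items)
		}
		out = append(out, items[:n])
		items = items[n:]
	}
	return out
}

func main() {
	// 120 placeholder values stand in for real slack.Block values.
	blocks := make([]int, 120)
	batches := chunk(blocks, 50)
	fmt.Println(len(batches))    // 3
	fmt.Println(len(batches[0])) // 50
	fmt.Println(len(batches[2])) // 20
	// Each batch would then go out in its own call, roughly:
	//   api.PostMessage(channelID, slack.MsgOptionBlocks(batch...))
}
```

As noted below, many small messages get noisy, so this only makes sense when the batch count stays small.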
If it is the former, you could ask them whether they want everything or just a specific repo using a combo box. Or, if you are going to spam them anyway (breaking it up by repo probably makes sense), at least they get it all because they chose that (maybe they get a DM with all the info at that point).
If it is just a bulk process, you could also shove everything into a file and publish that instead. I would say that chat schemes that get too chatty are a bad pattern, since people tend to start ignoring the info. It really all depends on what is pulling the data, who it's going to, how many people need to see it, etc.
I had a thing a while ago where you could ask the bot for all the PRs for a team. One team had dozens of repos and it got big quickly, so they always got theirs as a spreadsheet or a text file (can't remember exactly). You've got some options; not always pretty, but there are some.
Thank you Mike (@mtintes)
I simply overlooked the 50-block limitation. My bad. Thanks for pointing it out.
I had a thing a while ago where you could ask the bot for all the PRs for a team. One team had dozens of repos and it got big quickly, so they always got theirs as a spreadsheet or a text file (can't remember exactly). You've got some options; not always pretty, but there are some.
Exactly our use case, though ours is not a chatbot implementation. It lists all PRs above a certain age across a growing set of repos, and we want to use it to inform a 'person in charge per day' who then tackles the findings.
Also, thanks for sharing your view on the 'bad pattern'.
I tend to agree, and I think that assembling a file might indeed be the better approach here. I will pursue that.