t3-oss / create-t3-turbo

Clean and simple starter repo using the T3 Stack along with Expo React Native

Home Page: https://turbo.t3.gg


bug: hosted demo crashes

hichemfantar opened this issue · comments

Provide environment information

This is a problem on the hosted demo

Describe the bug

The get_posts request retries and then fails completely with a db error.
Auto-clearing the db when this error occurs, or adding pagination, are two viable solutions.
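Pagination here could follow the cursor-based pattern tRPC recommends for infinite queries. A minimal self-contained sketch of the idea (the `Post` shape, in-memory array, and `getPostsPage` helper are illustrative assumptions, not the repo's actual schema or router):

```typescript
// Hypothetical Post shape; the real schema lives in the repo's db package.
type Post = { id: number; title: string };

// Cursor-based page: return at most `limit` posts after `cursor`,
// plus the cursor to pass for the next page (undefined when exhausted).
function getPostsPage(
  posts: Post[],
  limit: number,
  cursor?: number,
): { items: Post[]; nextCursor?: number } {
  const start =
    cursor === undefined ? 0 : posts.findIndex((p) => p.id === cursor) + 1;
  const items = posts.slice(start, start + limit);
  const nextCursor =
    start + limit < posts.length ? items[items.length - 1]?.id : undefined;
  return { items, nextCursor };
}
```

In a real router this would translate to a `WHERE id > cursor LIMIT n` query, so no request can ever touch more than `n` rows regardless of table size.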

Link to reproduction

https://turbo.t3.gg/

To reproduce

wait for the get_posts request to finish

Additional information

{
    "0": {
        "error": {
            "json": {
                "message": "target: personal.-.primary: vttablet: rpc error: code = Aborted desc = Row count exceeded 100000 (CallerID: 841gdsku89u3w5ipbeh8)",
                "code": -32603,
                "data": {
                    "code": "INTERNAL_SERVER_ERROR",
                    "httpStatus": 500,
                    "stack": "DatabaseError: target: personal.-.primary: vttablet: rpc error: code = Aborted desc = Row count exceeded 100000 (CallerID: 841gdsku89u3w5ipbeh8)\n    at (vc/edge/function:512:1684)\n    at (vc/edge/function:512:5377)\n    at (vc/edge/function:123:2849)\n    at (vc/edge/function:123:3068)\n    at (vc/edge/function:123:3419)\n    at (vc/edge/function:122:8864)\n    at (vc/edge/function:122:11079)",
                    "path": "post.all",
                    "zodError": null
                }
            }
        }
    }
}

This is a limit in PlanetScale's HTTP API: it returns at most 100k rows per request. Who spammed so hard to get the demo over 100k posts lol 😅
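A check for this specific error could just match on the driver's message string. A hedged sketch (the message pattern comes from the payload above; `withRowLimitRecovery` and its `clearPosts` callback are hypothetical names, not code from this repo):

```typescript
// Matches the PlanetScale/Vitess row-count abort seen in the error payload.
const ROW_LIMIT_PATTERN = /Row count exceeded \d+/;

function isRowLimitError(message: string): boolean {
  return ROW_LIMIT_PATTERN.test(message);
}

// Hypothetical wrapper: run the query, and if the row-count limit is hit,
// invoke a cleanup callback (e.g. truncating the posts table) before rethrowing.
async function withRowLimitRecovery<T>(
  query: () => Promise<T>,
  clearPosts: () => Promise<void>,
): Promise<T> {
  try {
    return await query();
  } catch (err) {
    if (err instanceof Error && isRowLimitError(err.message)) {
      await clearPosts();
    }
    throw err;
  }
}
```

Matching on a message string is brittle (the wording is Vitess-internal and could change), which is part of why pagination is the sturdier fix.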

Crazy that the limit was reached so quickly 😅 I guess we could add a simple check for this specific error and clear the db when it happens.

Or you could just clear the db, no need for a check.

Haha lmao

[screenshots: CleanShot 2023-12-11 at 13 25 37, CleanShot 2023-12-11 at 13 26 42]

Wiped them :)
[screenshot: CleanShot 2023-12-11 at 13 31 00]

Real apps should implement rate limiting etc. and probably have paginated queries anyway. Thanks for notifying me of the issue!
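The rate limiting mentioned above could be as simple as a fixed-window counter per caller. A minimal in-memory sketch (the class name, window size, and limit are arbitrary assumptions; a deployed app would more likely reach for a hosted solution such as Upstash Ratelimit):

```typescript
// Fixed-window rate limiter: allow at most `limit` calls per `windowMs` per key.
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(
    private limit: number,
    private windowMs: number,
  ) {}

  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    // New key, or the previous window has elapsed: start a fresh window.
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    if (entry.count < this.limit) {
      entry.count += 1;
      return true;
    }
    return false; // over the limit for this window
  }
}
```

An in-memory map only works for a single server process; on serverless or edge runtimes the counter has to live in shared storage (Redis, a DB row, etc.).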

I set up a PR that should help avoid this error 🚀