Filter not working with Python grpc servers
legau opened this issue
Hi,
I have been trying to set up this filter to handle Connect web messages in a Python gRPC server and it looks like it doesn't work.
The buf curl command does work, but making a call with a Connect transport from a web browser always returns a 503 UC (unavailable). Using the GrpcWeb transport works. The Go example works for both grpcWeb and connect transports as expected.
The errors on the Python side look like this:
DEBUG:grpclib.protocol:Protocol error
Traceback (most recent call last):
  File "/root/.cache/pypoetry/virtualenvs/pyserv-MATOk_fk-py3.10/lib/python3.10/site-packages/grpclib/protocol.py", line 714, in data_received
    events = self.connection.feed(data)
  File "/root/.cache/pypoetry/virtualenvs/pyserv-MATOk_fk-py3.10/lib/python3.10/site-packages/grpclib/protocol.py", line 189, in feed
    return self._connection.receive_data(data)  # type: ignore
  File "/root/.cache/pypoetry/virtualenvs/pyserv-MATOk_fk-py3.10/lib/python3.10/site-packages/h2/connection.py", line 1463, in receive_data
    events.extend(self._receive_frame(frame))
  File "/root/.cache/pypoetry/virtualenvs/pyserv-MATOk_fk-py3.10/lib/python3.10/site-packages/h2/connection.py", line 1487, in _receive_frame
    frames, events = self._frame_dispatch_table[frame.__class__](frame)
  File "/root/.cache/pypoetry/virtualenvs/pyserv-MATOk_fk-py3.10/lib/python3.10/site-packages/h2/connection.py", line 1682, in _receive_data_frame
    frames, stream_events = stream.receive_data(
  File "/root/.cache/pypoetry/virtualenvs/pyserv-MATOk_fk-py3.10/lib/python3.10/site-packages/h2/stream.py", line 1073, in receive_data
    self._track_content_length(len(data), end_stream)
  File "/root/.cache/pypoetry/virtualenvs/pyserv-MATOk_fk-py3.10/lib/python3.10/site-packages/h2/stream.py", line 1340, in _track_content_length
    raise InvalidBodyLengthError(expected, actual)
h2.exceptions.InvalidBodyLengthError: InvalidBodyLengthError: Expected 2 bytes, received 7
I use a https://github.com/vmagamedov/grpclib server (which uses https://github.com/python-hyper/h2 under the hood) but I also tried with the official grpc lib with the same results.
Are you aware of any quirk in the Python gRPC implementations that would cause this?
Here is my fork with a Python server plus a connect-example page with buttons to make calls: https://github.com/legau/envoy-demo
Thanks
That error message, InvalidBodyLengthError: Expected 2 bytes, received 7, sounds like the filter is not correctly rewriting the Content-Length header. A unary RPC sent via the Connect protocol has no message prefix, but messages in the gRPC protocol always carry a five-byte prefix: one flags byte (indicating, for example, whether the message is compressed) followed by four bytes giving the message length.
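To make the byte arithmetic concrete, here is a minimal sketch (not Envoy's actual code) of the gRPC length-prefix framing described above. Note how a 2-byte unary body becomes 7 bytes once framed, exactly matching the "Expected 2 bytes, received 7" mismatch in the traceback:

```python
import struct

def add_grpc_prefix(message: bytes, compressed: bool = False) -> bytes:
    """Wrap a serialized message in the gRPC 5-byte prefix:
    1 flags byte (bit 0 = compressed) + 4-byte big-endian message length."""
    return struct.pack(">BI", 1 if compressed else 0, len(message)) + message

# A 2-byte Connect unary body grows to 7 bytes after framing, so a stale
# Content-Length: 2 header would trigger InvalidBodyLengthError downstream.
body = b"\x08\x01"                      # 2-byte serialized message
framed = add_grpc_prefix(body)
assert len(framed) == 7
assert framed == b"\x00\x00\x00\x00\x02\x08\x01"
```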
@jchadwick-buf, any idea if the suspicion above sounds like it's on the right track?
Hmm, this is weird. I suspect it's not going to be (or at least shouldn't be) related to Content-Length, because the filter should already be erasing the Content-Length header:
- For Connect Streaming requests: https://github.com/envoyproxy/envoy/blob/main/source/extensions/filters/http/connect_grpc_bridge/filter.cc#L262
- For Connect Unary requests: https://github.com/envoyproxy/envoy/blob/main/source/extensions/filters/http/connect_grpc_bridge/filter.cc#L293
- For Connect Unary Get requests: https://github.com/envoyproxy/envoy/blob/main/source/extensions/filters/http/connect_grpc_bridge/filter.cc#L347
- And just for good measure, also on the response too: https://github.com/envoyproxy/envoy/blob/main/source/extensions/filters/http/connect_grpc_bridge/filter.cc#L399
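Conceptually, what those linked lines are meant to guarantee can be sketched as follows (hypothetical helper, not Envoy's actual API): since reframing changes the body length, any Content-Length copied through from the Connect request would be stale and must be dropped.

```python
def reframe_connect_unary_to_grpc(headers: dict, body: bytes) -> tuple[dict, bytes]:
    """Hypothetical sketch of Connect-unary -> gRPC reframing.

    Drops Content-Length (the framed body is longer than the original,
    so a copied-through value would be wrong) and adds the 5-byte prefix."""
    headers = {k: v for k, v in headers.items() if k.lower() != "content-length"}
    prefix = bytes([0]) + len(body).to_bytes(4, "big")  # flags=0, big-endian length
    return headers, prefix + body

h, b = reframe_connect_unary_to_grpc(
    {"Content-Length": "2", "content-type": "application/proto"}, b"\x08\x01"
)
assert "Content-Length" not in h
assert len(b) == 7
```

If the header were left in place, an HTTP/2 stack that validates body length against Content-Length (as h2 does) would raise exactly the error seen in the traceback.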
So in theory it shouldn't be a Content-Length issue, unless I'm missing something. And yet that is exactly what the error message suggests.
I think I need to take a deeper look at exactly what's happening under the hood to figure this out. It seems like the filter must have gotten into an unexpected state.
Looping back around: I'm pretty sure I know what is happening here. I did indeed link the code where this was fixed, but the Envoy release containing the fix hadn't actually shipped at that point. The fix, envoyproxy/envoy@c18432e, was released in Envoy v1.29.0. Sorry for the confusion. Please update to Envoy v1.29.0 or later and you should be all set. Feel free to re-open this issue if that doesn't fix the problem.