eProsima / Integration-Service


WebSocket Handle bottleneck due to the JSON message conversion to eProsima xtypes by the Integration Service

gedeon1976 opened this issue

Another issue I'm experiencing is a WebSocket Handle bottleneck due to the JSON message conversion to eProsima xtypes by the Integration Service. This happens with compressed images.

This seems to happen only when the WebSocket handle is used, at least in what I have tested so far.
I have added some time measurements to the convert__msg.cpp.em template to show the elapsed time between received frames:

    // Called for every ROS 2 message received on the topic.
    void subscription_callback(
            const Ros2_Msg& msg)
    {
        // Timestamp of the current frame, used to measure the inter-frame interval.
        auto start = std::chrono::steady_clock::now();
        logger << utils::Logger::Level::INFO
               << "Receiving message from ROS 2 for topic '" << _topic_name << "' "
               << "time from last frame: "
               << std::chrono::duration_cast<std::chrono::milliseconds>(start - t_last).count() << "ms "
               << std::endl;

        // Convert the ROS 2 message into an xtypes DynamicData instance.
        xtypes::DynamicData data(_message_type);
        convert_to_xtype(msg, data);

        // Full message printing kept disabled to avoid flooding the log:
        //logger << utils::Logger::Level::INFO
        //       << "Received message: [[ " << data << " ]]" << std::endl;

        // Forward the converted data to the Integration Service callback.
        (*_callback)(data, nullptr);
        t_last = start;
    }

and added the following member variable to the class:

    // Timestamp of the previously received frame.
    std::chrono::time_point<std::chrono::steady_clock> t_last;

For example, the Integration Service is able to connect to the ROS 2 topics with acceptable processing time over a Wi-Fi connection:

[Screenshot: integration-service]

But when I connect from a web page using roslibjs over a Wi-Fi connection, the processing time increases by roughly 10x and the video reception on the web page becomes very slow, around 1-2 fps.
You can see the difference after connecting in the image below:

[Screenshot: integration-service-websocket-sh]

I performed some profiling, and the issue seems to be related to the message conversion in convert_to_xtype(msg, data):

[Screenshot: profiling]
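To see where the time goes inside the callback itself, the conversion could be timed separately from the downstream dispatch. This is only a sketch built on the template snippet above (Ros2_Msg, _message_type, convert_to_xtype and _callback are the names used there; the split timing is my own addition, not existing Integration Service instrumentation):

    void subscription_callback(
            const Ros2_Msg& msg)
    {
        using clock = std::chrono::steady_clock;

        // Stage 1: ROS 2 message -> xtypes DynamicData conversion.
        auto t0 = clock::now();
        xtypes::DynamicData data(_message_type);
        convert_to_xtype(msg, data);
        auto t1 = clock::now();

        // Stage 2: downstream dispatch; for the WebSocket handle this ends up
        // in the JSON encoding and publication path.
        (*_callback)(data, nullptr);
        auto t2 = clock::now();

        logger << utils::Logger::Level::INFO
               << "convert_to_xtype: "
               << std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count() << "ms, "
               << "callback dispatch: "
               << std::chrono::duration_cast<std::chrono::milliseconds>(t2 - t1).count() << "ms"
               << std::endl;
    }

If the second number dominates, the bottleneck would be in the WebSocket handle's encoding and publication path rather than in the ROS 2 to xtypes conversion itself.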

From the profiling we also see that most of the time is spent in the following functions (a rough illustration of this cost follows the list).

Inside websocket::EndPoint::publish:

- Websocket::JsonEncoding::encode_publication_msg()
- Eprosima::is::json_xtypes_to_json
- Eprosima::is::json_xtypes::add_json_node
- nlohmann::is::basic_json<std::map, std::vector, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>>, bool, long, unsigned long, double, std::allocator, nlohmann::is::adl_serializer>::operator[]
- std::vector<nlohmann::is::basic_json<std::map, std::vector, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>>, bool, long, unsigned long, double, std::allocator, nlohmann::is::adl_serializer>, std::allocator<nlohmann::is::basic_json<std::map, std::vector, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>>, bool, long, unsigned long, double, std::allocator, nlohmann::is::adl_serializer>>>::insert
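The last few entries suggest that much of the time goes into building and serializing the JSON representation of the image's byte array element by element. As a rough illustration only (this is a standalone sketch, not Integration Service code; it uses the upstream nlohmann::json library rather than the vendored copy, and the 200 KiB payload size is an arbitrary stand-in for a compressed frame), the following compares per-element JSON array construction with carrying the same bytes as a single string field:

    #include <chrono>
    #include <cstdint>
    #include <iostream>
    #include <string>
    #include <vector>

    #include <nlohmann/json.hpp>

    int main()
    {
        using clock = std::chrono::steady_clock;

        // Arbitrary stand-in for one compressed video frame (~200 KiB).
        const std::size_t payload_size = 200 * 1024;
        const std::vector<std::uint8_t> payload(payload_size, 0x42);

        // 1) Element-by-element insertion, similar in spirit to what the profile
        //    shows (add_json_node -> basic_json::operator[] / vector::insert).
        auto t0 = clock::now();
        nlohmann::json per_element = nlohmann::json::array();
        for (std::uint8_t byte : payload)
        {
            per_element.push_back(byte);
        }
        const std::string dumped_array = per_element.dump();
        auto t1 = clock::now();

        // 2) The same bytes carried as a single string field. The fill byte here
        //    is ASCII, so it serializes directly; real binary data would need
        //    e.g. base64 encoding first.
        auto t2 = clock::now();
        nlohmann::json as_string;
        as_string["data"] = std::string(payload.begin(), payload.end());
        const std::string dumped_string = as_string.dump();
        auto t3 = clock::now();

        auto ms = [](auto d)
        {
            return std::chrono::duration_cast<std::chrono::milliseconds>(d).count();
        };
        std::cout << "per-element array: " << ms(t1 - t0) << " ms, "
                  << dumped_array.size() << " bytes\n"
                  << "single string:     " << ms(t3 - t2) << " ms, "
                  << dumped_string.size() << " bytes" << std::endl;
    }

If a per-element path like this is where the WebSocket handle spends its time on every frame, that would be consistent with the 1-2 fps observed once a roslibjs client connects.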

So, could somebody from eProsima check this behavior when using the WebSocket-SH handler?

Originally posted by @gedeon1976 in #169 (comment)