Monday, October 4, 2021

Should I use Apache Kafka or gRPC to communicate between robot fleet and cloud?

Hey, people! I'm currently working on a project at my company where I'm trying to stream data between our robots in the field and the cloud. The robots connect to the cloud over either 4G or Wi-Fi, depending on whether they're deployed indoors or outdoors, but the network connection tends to be very poor in certain areas. I'm trying to decide between PubSub (publish-subscribe pattern) and RPC (request-response / bidirectional streaming pattern) for communicating with the cloud. Two obvious candidates are gRPC for RPC and Kafka for PubSub. However, I'm undecided on which of the two is the better fit, and I could use some expert advice from the Reddit community.

What data are we sending?

  • zipped files; streamed sensor data such as robot position, battery levels, and point clouds
  • streamed mission commands such as forward/backward gain (robot telepresence)
  • unary requests such as mission plans and occupancy grid maps
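Either way you go, these payload categories can share one message envelope that is published to a Kafka topic or sent over a gRPC stream. Here is a minimal, library-agnostic sketch of such an envelope; the names (`Envelope`, `Kind`) and the JSON wire format are assumptions made up for illustration, not tied to either technology:

```python
import json
from dataclasses import dataclass, asdict
from enum import Enum

class Kind(str, Enum):
    # Mirrors the payload categories listed above (hypothetical names)
    FILE = "file"            # zipped files
    TELEMETRY = "telemetry"  # position, battery levels, point clouds
    COMMAND = "command"      # telepresence gains
    REQUEST = "request"      # mission plans, occupancy grid maps

@dataclass
class Envelope:
    robot_id: str
    kind: Kind
    payload: bytes
    timestamp: float

    def to_json(self) -> str:
        d = asdict(self)
        d["payload"] = self.payload.hex()  # raw bytes are not JSON-serializable
        return json.dumps(d)

    @classmethod
    def from_json(cls, s: str) -> "Envelope":
        d = json.loads(s)
        d["payload"] = bytes.fromhex(d["payload"])
        d["kind"] = Kind(d["kind"])
        return cls(**d)
```

In practice you would likely use Protobuf (which gRPC requires anyway, and which Kafka also supports via a schema registry) rather than JSON, but the shape of the envelope stays the same.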

Some requirements:

  • Data encryption
  • Authentication and authorization
  • Ability to prioritize data persistence over low latency, and vice versa

Some limitations to be aware of:

  • We generally have a poor network while driving/flying around. A stable network connection is only assured at the robot's docking station.
  • When the connection is poor, we need to be able to persist data such as sensor readings to disk (or memory) so that it can be uploaded/streamed once a stable connection is re-established.
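Whichever transport wins, the store-and-forward requirement above usually ends up as a small durable "outbox" on the robot that an uplink task drains opportunistically. A minimal sketch using only the standard library, with `sqlite3` standing in for whatever durable store you pick (the `Outbox` name and `send` callback are assumptions for illustration):

```python
import sqlite3

class Outbox:
    """Durable FIFO buffer: enqueue while offline, drain when connected."""

    def __init__(self, path: str):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS outbox ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, payload BLOB NOT NULL)"
        )
        self.db.commit()

    def enqueue(self, payload: bytes) -> None:
        # Committed to disk before we report success, so a crash loses nothing
        self.db.execute("INSERT INTO outbox (payload) VALUES (?)", (payload,))
        self.db.commit()

    def drain(self, send) -> int:
        """Try to send each queued payload in order; delete only after success."""
        sent = 0
        rows = self.db.execute(
            "SELECT id, payload FROM outbox ORDER BY id"
        ).fetchall()
        for row_id, payload in rows:
            try:
                send(payload)
            except OSError:
                break  # link dropped again; keep the remainder for later
            self.db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
            self.db.commit()
            sent += 1
        return sent
```

Note that this gives at-least-once delivery (a crash between `send` and the delete re-sends one message), so the cloud side should deduplicate, e.g. on a message ID. Kafka producers give you a similar buffer in memory out of the box, but not a disk-backed one on the client, so something like this tends to be needed in both designs.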

Any good advice on which of the two - gRPC or Kafka - I should choose, and why?

Are there limitations or drawbacks of either that I should be aware of?

Any useful experiences people have had with either that I should know about?

THANK YOU IN ADVANCE!!


