gRPC-Mate - An enterprise-ready microservice project based on gRPC-Java

gRPC-Mate demonstrates best practices for gRPC-based microservices.


Demo Script

The project demonstrates an online store search service, including:

  • Creating an Elasticsearch index with an alias
  • Uploading products into Elasticsearch (client streaming)
  • Downloading products from Elasticsearch (server streaming)
  • Searching products in Elasticsearch (simple RPC)
  • Calculating product scores (bi-directional streaming)
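The index-with-alias step is typically done through Elasticsearch's _aliases endpoint. A sketch of the request body, with illustrative index and alias names (products_v1 and products are assumptions, not taken from the project):

```json
{
  "actions": [
    { "add": { "index": "products_v1", "alias": "products" } }
  ]
}
```

POSTing this to /_aliases points the stable products alias at the versioned index, so a later reindex can swap the alias to a new index atomically while clients keep searching the same alias name.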

gRPC best practices

  • communicating with Elasticsearch
    • use JsonFormat.Printer to convert a protobuf message into JSON
    • use JsonFormat.Parser to parse JSON into a protobuf message

Simple RPC

  • sample
  • we can use JsonFormat.Parser to convert an Elasticsearch document into a protobuf message
      Product.Builder builder = Product.newBuilder();
      jsonParser.merge(hit.getSourceAsString(), builder);
      responseBuilder.addProducts(builder.build());

Server streaming

  • sample
  • with server streaming, the user can pass a PublishSubject to the DAO layer to connect the real data source with the ResponseObserver
    PublishSubject<Product> productPublishSubject = PublishSubject.create();
    Disposable disposable = productPublishSubject
        .doOnNext(product -> responseObserver.onNext(product))
        .doOnComplete(() -> responseObserver.onCompleted())
        .doOnError(t -> responseObserver.onError(t))
        .subscribe();
    productDao.downloadProducts(request, productPublishSubject);
    disposable.dispose();
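As a library-free sketch of the same bridge, the JDK's Flow API (java.util.concurrent) can play the PublishSubject role: items are published on one side and forwarded to an observer on the other. ServerStreamDemo and the event strings below are illustrative, not part of the project:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class ServerStreamDemo {

  /** Runs the publish/forward pipeline and returns the events the observer saw. */
  static List<String> runDemo() throws InterruptedException {
    List<String> events = new ArrayList<>();
    CountDownLatch done = new CountDownLatch(1);

    try (SubmissionPublisher<String> publisher = new SubmissionPublisher<>()) {
      // Forward every published item to the "response observer", as the
      // PublishSubject does in the DAO-layer example above.
      publisher.subscribe(new Flow.Subscriber<String>() {
        public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
        public void onNext(String item) { events.add("sent " + item); }
        public void onError(Throwable t) { done.countDown(); }
        public void onComplete() { events.add("completed"); done.countDown(); }
      });
      publisher.submit("product-1");
      publisher.submit("product-2");
    } // close() delivers onComplete once the pending items have been consumed
    done.await();
    return events;
  }

  public static void main(String[] args) throws InterruptedException {
    runDemo().forEach(System.out::println);
  }
}
```

The CountDownLatch stands in for the synchronous downloadProducts call in the snippet above: it makes sure the stream has completed before the method returns.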

Client streaming

    PublishSubject<Product> publishSubject = PublishSubject.create();
    publishSubject
        .doOnNext(product -> {
          log.info("saving product - {} ", product);
          productDao.upsertProduct(product);
        })
        .doOnError(t -> responseObserver.onError(t))
        .doOnComplete(() -> {
          responseObserver.onNext(UploadProductResponse.newBuilder().build());
          responseObserver.onCompleted();
        })
        .subscribe();

Bi-directional streaming

  • sample
  • use gRPC's InProcessServer to test gRPC services

Interceptors

Transfer Large File

  • gRPC is not designed for transferring large files, but we can leverage the streaming API to transfer data of any size as a binary stream
  • with the protobuf definition below, the streaming API can transfer any amount of data in any format
message DataChunk {
    bytes data = 1;
}
rpc DownloadProductImage(DownloadProductImageRequest) returns (stream DataChunk) {
}
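On the server side, the payload has to be split into DataChunk-sized pieces before being written to the stream. A plain-Java sketch of that chunking (the ImageChunker name and the 64 KB chunk size are illustrative, not from the project):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

/** Splits a byte payload into fixed-size pieces, mirroring the DataChunk stream. */
public class ImageChunker {
  static final int CHUNK_SIZE = 64 * 1024; // 64 KB per DataChunk message

  /** Returns the payload as a list of chunks of at most CHUNK_SIZE bytes each. */
  public static List<byte[]> chunk(byte[] payload) {
    List<byte[]> chunks = new ArrayList<>();
    for (int offset = 0; offset < payload.length; offset += CHUNK_SIZE) {
      int end = Math.min(payload.length, offset + CHUNK_SIZE);
      chunks.add(Arrays.copyOfRange(payload, offset, end));
    }
    return chunks;
  }

  public static void main(String[] args) {
    byte[] payload = new byte[150_000]; // ~146 KB sample payload
    List<byte[]> chunks = chunk(payload);
    System.out.println("chunks=" + chunks.size());
    System.out.println("lastChunkBytes=" + chunks.get(chunks.size() - 1).length);
  }
}
```

In the real service each chunk would be wrapped in a DataChunk message and passed to responseObserver.onNext(), with onCompleted() called after the last piece.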

RESTful endpoint

  • use grpc-gateway to bridge a gRPC service to a RESTful endpoint
  • streaming is not supported over HTTP/1.1
  • define a sample gRPC service like the one below
service EchoService {
    rpc Echo (EchoRequest) returns (EchoResponse) {
        option (google.api.http) = {
          post: "/grpc/api/v1/echo"
          body: "*"
        };
    }
}

message EchoRequest {
    string ping = 1;
}

message EchoResponse {
    string pong = 2;
}
curl -XPOST localhost:7070/grpc/api/v1/echo -d '{"ping":"hello"}'
{"pong":"hello"}

Prometheus integration

Kubernetes Deployment

  • sample
  • use a property file to manage system properties and add it to a ConfigMap, so it's easy to debug the program locally by specifying the property file via an environment variable
kubectl create configmap cluster-config --from-file=data_nerd.properties --namespace=prod
  • mount the properties from the ConfigMap in the deployment yaml file
volumes:
      - name: config-volume
        configMap:
          name: cluster-config
          items:
          - key: data_nerd.properties
            path: data_nerd.properties
  • with configuration externalized this way, the service seldom needs to be redeployed after the first deployment
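Loading the mounted property file needs only the JDK. A sketch, assuming the file path and the es.cluster.host key are illustrative (the real keys live in data_nerd.properties):

```java
import java.io.IOException;
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

/** Loads a properties file and mirrors each entry into the JVM system properties. */
public class PropertyLoader {

  public static Properties load(Path propertyFile) throws IOException {
    Properties props = new Properties();
    try (Reader reader = Files.newBufferedReader(propertyFile)) {
      props.load(reader);
    }
    // Copy every entry into System properties so the rest of the code reads
    // configuration the same way locally and in the cluster.
    props.forEach((k, v) -> System.setProperty((String) k, (String) v));
    return props;
  }

  public static void main(String[] args) throws IOException {
    // In the cluster the path would point at the mounted ConfigMap volume
    // (e.g. the data_nerd.properties item); here we demo with a temp file.
    Path demo = Files.createTempFile("data_nerd", ".properties");
    Files.writeString(demo, "es.cluster.host=localhost\n");
    load(demo);
    System.out.println("es.cluster.host=" + System.getProperty("es.cluster.host"));
  }
}
```

Locally, pointing the loader at a checked-in properties file reproduces the cluster configuration without any Kubernetes machinery.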

Gradle Best Practice

  • add the gradle wrapper, so that the build can run anywhere
task wrapper(type: Wrapper) {
    gradleVersion = '4.0'
}

> gradle wrapper
  • remove auto-generated classes in the clean task
clean {
    doLast {
        // remove auto-generated files on clean
        delete "${projectDir}/src/generated"
    }
}
  • force gradle to detect version conflicts during the build
subprojects {
    apply plugin: 'java'

    configurations.all {
        resolutionStrategy {
            // fail eagerly on version conflict (includes transitive dependencies)
            // e.g. multiple different versions of the same dependency (group and name are equal)
            failOnVersionConflict()
        }
    }
}
  • show error logs in the console to make it easier to debug build failures in travis-ci
test {
    testLogging {
        // set options for log level LIFECYCLE
        events "failed"
        exceptionFormat "full"

        // remove standard output/error logging from --info builds
        // by assigning only 'failed' and 'skipped' events
        info.events = ["failed", "skipped"]
    }
}

Mockito best practice

  • use Mockito to mock DAO methods in service tests, so that we do not need to launch a docker container to provide an ES environment
  • use Guice to inject mocked instances into the dependency graph in unit tests
    productDao = mock(ProductDao.class);
    injector = Guice.createInjector(
        Modules.override(new ElasticSearchModule())
            .with(binder -> {
              binder.bind(ProductDao.class).toInstance(productDao);
            })
    );

Junit best practice

  • use Testcontainers' GenericContainer with a JUnit @ClassRule to launch an Elasticsearch container for integration tests
  @ClassRule
  public static final GenericContainer esContainer =
      new GenericContainer("email2liyang/elasticsearch-unit-image:5.4.3")
          .withExposedPorts(9200, 9300);
  • users can use Guice's Modules.override() method to override any default configuration in tests
    MapConfiguration memoryParams = new MapConfiguration(new HashMap<>());
    memoryParams.setProperty(CONFIG_ES_CLUSTER_HOST, ip);
    memoryParams.setProperty(CONFIG_ES_CLUSTER_PORT, transportPort);
    memoryParams.setProperty(CONFIG_ES_CLUSTER_NAME, "elasticsearch");
    Injector injector = Guice.createInjector(
        Modules.override(new ElasticSearchModule()).with(
            binder -> {
              binder.bind(Configuration.class).toProvider(() -> memoryParams);
            }
        )
    );
  • use toProvider(() -> xxx) to supply an instance without writing a dedicated Provider class
  • use GrpcServerRule as a JUnit Rule to start an in-process gRPC server for tests; see EchoServiceTest

Proto buffer best practice

  • define all proto files at the top level of the project; for a larger organization it's a good idea to store all protobuf files in a dedicated git repository and check it out as a git submodule, giving a single place to define all the gRPC services and messages shared across projects
  • define a Makefile to generate java code, so it's easy to detect any issue in the protobuf definitions
clean:
	mkdir -p java_generated && rm -rf java_generated/*
gen: clean
	protoc --java_out=java_generated *.proto
> make gen
  • it's a good idea to use protobuf messages as value objects passed between the different layers of the application; developers then don't need to care about marshalling/unmarshalling in each layer, and protobuf handles it reliably and fast
  • we can use JsonFormat.Printer and JsonFormat.Parser to serialize/deserialize protobuf messages to/from JSON when communicating with Elasticsearch, since Elasticsearch only supports JSON documents
  • it's a good idea to define common messages in a separate proto file, so they can be imported by multiple proto files
  • it's a good idea to set the java package name and java_multiple_files to true, so the generated java files have better package names
option java_package = "io.datanerd.generated.common";
option java_multiple_files = true;
  • protobuf naming best practices
    • use CamelCase (with an initial capital) for message names
    • use CamelCase (with an initial capital) for gRPC service names
    • use underscore_separated_names for field names
    • use CamelCase (with an initial capital) for enum type names and CAPITALS_WITH_UNDERSCORES for value names
service ProductUpdateService {
    // upload products into Elasticsearch so that we can search on them
    // used to demo client-side streaming
    rpc UploadProduct (stream Product) returns (UploadProductResponse) {
    }
}


message UploadProductResponse {
    enum ResultStatus {
        SUCCESS = 0;
        FAILED = 1;
    }
    ResultStatus result_status = 1;
}

Docker best practice

  • we can use docker to simulate external services (e.g. Elasticsearch) in unit tests
    • in this demo project, we use an Elasticsearch image for unit-test purposes only
    • users can download it with the command make pull_image to get the latest test image
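A sketch of what that Makefile target might look like, using the test image tag shown in the Junit section above (the target body is an assumption, not copied from the project):

```makefile
pull_image:
	docker pull email2liyang/elasticsearch-unit-image:5.4.3
```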

Quality control best practice

CheckStyle

FindBugs

  • users can exclude any file from findbugs (e.g. grpc-generated java files) by adding it to findbugs_exclude_filter.xml

Jacoco

  • Jacoco-related tasks are not bound to the check and test tasks; we can bind them to test by
    test.finalizedBy(jacocoTestReport, jacocoTestCoverageVerification)
  • users can add multiple rules in jacocoTestCoverageVerification
  • users can exclude any package from the jacoco report in the afterEvaluate config
    afterEvaluate {
        classDirectories = files(classDirectories.files.collect {
            fileTree(dir: it,
                     exclude: ['**/generated/**',
                               'com/google/**'])
        })
    }
  • line coverage ratio at the package level is the most meaningful standard from a code-coverage perspective
  • Jacoco works with Junit out of the box; TestNG needs extra configuration to make jacoco work

About

An enterprise-ready microservice project based on gRPC-Java

https://github.com/email2liyang/grpc-mate

License:Apache License 2.0

