tensorflow / serving

A flexible, high-performance serving system for machine learning models

Home Page: https://www.tensorflow.org/serving

File system scheme 'hdfs' not implemented

KeithTt opened this issue

OS version: CentOS7
tfserving version: tensorflow/serving:2.10.0-gpu
kube version: v1.27.1
containerd version: v1.6.6

I am trying to run tfserving in a bare-metal kubernetes cluster.

Since the official images do not include HDFS support, I am trying to build an image manually.

Here is the Dockerfile:

FROM tensorflow/serving:2.10.0

RUN apt update && apt install -y openjdk-8-jre && apt-get clean

COPY hadoop-2.10.2 /root/hadoop

ENV JAVA_HOME /usr/lib/jvm/java-8-openjdk-amd64
ENV HADOOP_HDFS_HOME /root/hadoop
ENV LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:${JAVA_HOME}/jre/lib/amd64/server

EXPOSE 8500
EXPOSE 8501

RUN echo '#!/bin/bash \n\n\
tensorflow_model_server --port=8500 --rest_api_port=8501 \
--model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}/${MODEL_NAME} \
"$@"' > /usr/bin/tf_serving_entrypoint.sh \
&& chmod +x /usr/bin/tf_serving_entrypoint.sh

ENTRYPOINT ["/usr/bin/tf_serving_entrypoint.sh"]

This image built on tensorflow/serving:2.10.0 (without GPU) works fine in Kubernetes. But when I switch to the GPU base image with FROM tensorflow/serving:2.10.0-gpu, it does not work and raises the exception File system scheme 'hdfs' not implemented.

[screenshot: error log showing "File system scheme 'hdfs' not implemented"]

I referred to the Dockerfile https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile.gpu, but I cannot figure out the issue.

@KeithTt,

The HDFS file system scheme has been moved to tensorflow-io. You only need to install the tensorflow-io pip package and do import tensorflow_io as tfio. There is no need for any other code change, as the filesystem plugin is loaded behind the scenes.
Alternatively, exporting TF_USE_MODULAR_FILESYSTEM=1 in all hosts/environments that need to access HDFS should work (see the sketch below).
In your case, since you are trying to build an image manually, you can follow the Serving TensorFlow models with custom ops guide to create a custom TF Serving build with tensorflow-io support.
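
(For illustration, here is a sketch of how that environment variable could be baked into a GPU image, reusing the Hadoop/JDK setup from the Dockerfile above. Whether the 2.10.0-gpu binary honours it for hdfs:// is exactly what is in question in this issue, so treat it as an experiment rather than a confirmed fix:)

FROM tensorflow/serving:2.10.0-gpu

RUN apt update && apt install -y openjdk-8-jre && apt-get clean

COPY hadoop-2.10.2 /root/hadoop

ENV JAVA_HOME /usr/lib/jvm/java-8-openjdk-amd64
ENV HADOOP_HDFS_HOME /root/hadoop
ENV LD_LIBRARY_PATH ${LD_LIBRARY_PATH}:${JAVA_HOME}/jre/lib/amd64/server
# Ask TF Serving to route filesystem access through modular filesystem plugins.
ENV TF_USE_MODULAR_FILESYSTEM 1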

Thank you!

@singhniraj08,

I am sorry, but I don't quite follow. In fact, I am not writing any code, just trying to run TF Serving with GPU in Kubernetes.

And unfortunately, I am not very familiar with TF Serving, so I don't know how to compile a new Docker image with a custom op.

Essentially, I just need a GPU Docker image that can load models from HDFS.
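
(For anyone following along, a quick local sanity check of such an image — hypothetical image tag and NameNode address, assuming the NVIDIA container runtime is installed — could look like:)

docker run --rm --gpus all \
  -p 8500:8500 -p 8501:8501 \
  -e MODEL_NAME=my_model \
  -e MODEL_BASE_PATH=hdfs://namenode:9000/models \
  my-registry/tf-serving-gpu-hdfs:2.10.0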