tensorflow / serving

A flexible, high-performance serving system for machine learning models

Home Page: https://www.tensorflow.org/serving


tf-serving:2.6.2 can't load the AWS config file properly

heyuanzhen opened this issue · comments

Bug Report

The TF Serving server can't load the AWS config file correctly and fails to load the model from an S3 bucket.

System information

  • OS: macOS 13.2.1
  • TensorFlow Serving installed from: Docker
  • TensorFlow Serving version: 2.6.2

Describe the problem

I have an AWS config file like this:

[profile default]
role_arn       = arn:aws:iam::accid:role/role-name
source_profile = cred

[profile cred]
aws_access_key_id = <my-access-key-id>
aws_secret_access_key = <my-secret-access-key>

I can use this config file to access S3 resources through the AWS CLI, and I want TensorFlow Serving to use it as well when reading from S3, but it doesn't work and reports this error:

2023-03-23 08:18:52.221767: I tensorflow_serving/model_servers/server_core.cc:465] Adding/updating models.
2023-03-23 08:18:52.221826: I tensorflow_serving/model_servers/server_core.cc:591]  (Re-)adding model: test_model
2023-03-23 08:18:57.472903: E tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:365] FileSystemStoragePathSource encountered a filesystem access error: Could not find base path s3://my-bucket/test/test_model/ for servable test_model with error Failed precondition: AWS Credentials have not been set properly. Unable to access the specified S3 location
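For reference, the same config file works from the host through the AWS CLI; a command along these lines succeeds for me (exact invocation illustrative):

AWS_CONFIG_FILE=./aws_config aws s3 ls s3://my-bucket/test/test_model/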

Exact Steps to Reproduce

I build my image on top of the tf-serving image:

FROM tensorflow/serving:2.6.2

WORKDIR /app

# Ship the model config and the AWS config inside the image
COPY ./models.config /app/models/
COPY ./aws_config /app/models/

# These flags are forwarded to tensorflow_model_server by the image's entrypoint
CMD ["--model_config_file=./models/models.config", "--model_config_file_poll_wait_seconds=60", "--rest_api_port=9000"]

First I build the image:

docker build -t tf-model-service:0.0.1-xxx .

Then I run it:

docker run -p 8501:8501 -e AWS_SDK_LOAD_CONFIG=1 -e AWS_CONFIG_FILE='./models/aws_config' -t tf-model-service:0.0.1-xxx

Then I get the error above. There is very little logging, so I can't tell where the failure actually occurs, but it may be worth testing whether the AWS config file loading works at all.

@heyuanzhen,

The error indicates that AWS credentials are not properly configured for accessing the S3 bucket. Please refer to this article to set up your AWS credentials properly, then try running the model server again.

You can use the Docker command below to run the model server and load the model from the S3 bucket.
Kindly let us know if it works. Thank you!

docker run -p 8501:8501 \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e MODEL_BASE_PATH=s3://path/bucket/models \
  -e MODEL_NAME=model_name \
  -e S3_ENDPOINT=s3.us-west-1.amazonaws.com \
  -e AWS_REGION=us-west-1 \
  -e TF_CPP_MIN_LOG_LEVEL=3 \
  -t tensorflow/serving
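Once the container is up, you can check that the model loaded by querying the REST model-status endpoint (model_name should match the MODEL_NAME above):

curl http://localhost:8501/v1/models/model_name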

Is hard-coding the access key on the command line the only way to access AWS resources? Can we assume a role or use a config file instead?

@heyuanzhen,

You can export the access key as an environment variable and pass that variable in the docker run command. I couldn't find any documentation or tutorial for supplying the access key via a config file in TF Serving.
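If you need role assumption, one possible workaround is a sketch like the following, not something we have verified: assume the role on the host with the AWS CLI and forward the temporary credentials as environment variables. It assumes jq is installed, the region is illustrative, and whether the S3 client inside the serving image honors AWS_SESSION_TOKEN has not been confirmed here.

export AWS_CONFIG_FILE=./aws_config   # so the CLI can find the "cred" profile
CREDS=$(aws sts assume-role \
  --role-arn arn:aws:iam::accid:role/role-name \
  --role-session-name tf-serving \
  --profile cred \
  --query 'Credentials' --output json)
export AWS_ACCESS_KEY_ID=$(echo "$CREDS" | jq -r .AccessKeyId)
export AWS_SECRET_ACCESS_KEY=$(echo "$CREDS" | jq -r .SecretAccessKey)
export AWS_SESSION_TOKEN=$(echo "$CREDS" | jq -r .SessionToken)

docker run -p 8501:8501 \
  -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_SESSION_TOKEN \
  -e MODEL_BASE_PATH=s3://my-bucket/test -e MODEL_NAME=test_model \
  -e AWS_REGION=us-west-1 \
  -t tensorflow/serving:2.6.2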

Hope this helps. Thank you!

This issue has been marked stale because it has had no recent activity for 7 days. It will be closed if no further activity occurs. Thank you.

This issue was closed due to lack of activity after being marked stale for the past 7 days.
