tf-encrypted / tf-encrypted

A Framework for Encrypted Machine Learning in TensorFlow

Home Page: https://tf-encrypted.io/


Encrypted models

zned45 opened this issue · comments

Hi,
From the perspective of protecting the IP of models running on the edge, does TF Encrypted allow the following (or is this even possible today?):

  1. The encrypted model is served on the edge
  2. The client has access to the model, encrypts the data and performs inference
  3. Can the inference result be readable only with a third-party key or module on a centralized server, so that the edge service (client) is unable to decrypt it? In other words, an encryption process that is separate from the decryption process?

Thank you.

@zned45 what you're suggesting is possible. Is the goal to protect the model from the client, and protect the input from the server, while revealing the result to the server?

I would think of it as:

  1. Encrypt the model on the server.
  2. Encrypt the input data on the client.
  3. Run the encrypted inference.
  4. Decrypt the result on the server.

With an MPC protocol, the encrypted inference runs jointly between the client and the server.
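To make the four steps above concrete, here is a minimal, self-contained sketch of the underlying idea using additive secret sharing over a prime field. This is a toy illustration, not the actual TF Encrypted API: the field size, the single linear layer, and the assumption that the input is public (so no Beaver triples are needed for multiplication) are all simplifications for clarity.

```python
import random

P = 2**61 - 1  # prime modulus for the toy secret-sharing field (assumption)

def share(x):
    """Split integer x into two additive shares mod P."""
    r = random.randrange(P)
    return r, (x - r) % P

def reconstruct(a, b):
    """Recombine two additive shares into the original value."""
    return (a + b) % P

# Toy "model": one linear layer's weights, secret-shared by the server.
weights = [3, 5, 2]
inputs = [10, 20, 30]  # public input, per the scenario in this thread
w_shares = [share(w) for w in weights]

# Each compute party locally derives its share of the dot product
# (public input times its weight share); neither sees the weights.
party0 = sum(x * w[0] for x, w in zip(inputs, w_shares)) % P
party1 = sum(x * w[1] for x, w in zip(inputs, w_shares)) % P

# Only whoever holds BOTH result shares can decrypt the prediction.
result = reconstruct(party0, party1)
print(result)  # 3*10 + 5*20 + 2*30 = 190
```

Multiplying two secret-shared values (encrypted model *and* encrypted input) additionally requires multiplication triples, which TF Encrypted's protocols handle internally.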

@gavinuhma thank you for the help.
The main goal is:

  • You have an edge inference server that is available to everyone. It should run inference on an input. Since it is available to everyone, the key requirement is to encrypt the model. The input does not need encryption.
  • Then another server should be able to decrypt the result of the inference, which comes back encrypted.

I'm a newbie, so could you please elaborate a little bit more?

Thank you.

Please check the examples of ML inference and training on secret-shared images: https://github.com/tf-encrypted/tf-encrypted/tree/master/examples/benchmark
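The "only a central server can decrypt" requirement maps naturally onto secret sharing: the compute parties each end up with a *share* of the result and forward those shares to the central server, which is the only party holding both. A hedged, stdlib-only sketch of that routing (toy field, one linear layer, public input; not the TF Encrypted API):

```python
import random

P = 2**61 - 1  # toy prime field (assumption)

def share(x):
    """Split x into two additive shares mod P."""
    r = random.randrange(P)
    return r, (x - r) % P

# Edge setup: the model owner secret-shares the weights between
# the edge service and the client; the input stays public.
weights = [3, 5, 2]
inputs = [10, 20, 30]
w_shares = [share(w) for w in weights]

# Each compute party produces only its share of the prediction.
edge_share = sum(x * w[0] for x, w in zip(inputs, w_shares)) % P
client_share = sum(x * w[1] for x, w in zip(inputs, w_shares)) % P

# Both shares are sent to the central server. Neither the edge
# service nor the client alone can reconstruct the result.
central_result = (edge_share + client_share) % P
print(central_result)  # 190
```

Here a share plays the role of the "third-party key" from the question: decryption is simply reconstruction, and only the party that receives every share can perform it.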