Secure and dynamic outsourcing computation of machine learning in cloud computing
Abstract
This paper presents a novel, secure and dynamic mechanism for training machine learning models that achieves membership privacy. Our protocol falls within the two-server-aided model: one server performs most of the computation while the other provides auxiliary computation. Users distribute their private data between two non-colluding but untrusted servers, which train a neural network on the adaptively selected data using secure multi-party computation (MPC). In our protocol, only the selected members can reconstruct a predefined secret and jointly decrypt the computation result. The protocol is proven secure in the semi-honest model against passive adversaries, and we show that security is maintained even if unselected users drop out at any time.
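As an illustration of the data-distribution step described above, the sketch below shows one standard way a user could additively secret-share a private value between two non-colluding servers, as is common in two-server MPC. The concrete sharing scheme, ring size, and reconstruction rule used in the paper are not given in this abstract; the values here are assumptions for illustration only.

```python
# Minimal sketch of additive secret sharing between two servers (not the
# paper's exact scheme). MODULUS is an assumed ring size.
import secrets

MODULUS = 2**64  # assumed ring; the paper may use a different field or ring


def share(value: int) -> tuple[int, int]:
    """Split `value` into two additive shares, one per server."""
    share_0 = secrets.randbelow(MODULUS)       # uniformly random share for server 0
    share_1 = (value - share_0) % MODULUS      # complementary share for server 1
    return share_0, share_1


def reconstruct(share_0: int, share_1: int) -> int:
    """Recombine the shares; requires input from both (non-colluding) servers."""
    return (share_0 + share_1) % MODULUS


# Example: a user's private training feature is hidden from each server alone,
# since either share by itself is uniformly random.
x = 42
s0, s1 = share(x)
assert reconstruct(s0, s1) == x
```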