Article
PPDL - privacy preserving deep learning using homomorphic encryption
ACM International Conference Proceeding Series
  • Nayna Jain, IIIT Bangalore & IBM Systems
  • Karthik Nandakumar, Mohamed Bin Zayed University of Artificial Intelligence
  • Nalini Ratha, University at Buffalo, The State University of New York
  • Sharath Pankanti, Microsoft Corporation
  • Uttam Kumar, IIIT Bangalore
Document Type
Conference Proceeding
Abstract

Deep learning models such as Convolutional Neural Networks (CNNs) have shown great potential in various applications. However, these techniques face regulatory compliance challenges related to the privacy of user data, especially when they are deployed as a service on a cloud platform. Such concerns can be mitigated by using privacy-preserving machine learning techniques. The purpose of our work is to explore one class of privacy-preserving machine learning techniques, Fully Homomorphic Encryption (FHE), for enabling CNN inference on an encrypted real-world dataset. Fully homomorphic encryption is limited in the computational depth it can support, and its operations are resource intensive. We run our experiments on the MNIST dataset to understand the challenges and identify optimization techniques. We used these insights to achieve the end goal of enabling encrypted inference for binary classification on a melanoma dataset using the Cheon-Kim-Kim-Song (CKKS) encryption scheme available in the open-source HElib library.
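
As a rough illustration of the workflow described in the abstract (not code from the paper), the sketch below packs an input vector into a single CKKS ciphertext with HElib, applies a depth-1 square activation as an HE-friendly stand-in for a non-linear function such as ReLU, and decrypts the result. The parameter values (m, bits, precision, c) and variable names are illustrative assumptions only, not the settings used by the authors.

    // Minimal CKKS sketch with HElib: ciphertext packing, encryption,
    // a square activation, and decryption. Parameters are placeholders.
    #include <helib/helib.h>
    #include <vector>
    #include <iostream>

    int main() {
      // Build a CKKS context; larger `bits` buys more multiplicative depth
      // at the cost of bigger ciphertexts and slower operations.
      helib::Context context = helib::ContextBuilder<helib::CKKS>()
                                   .m(16 * 1024)   // cyclotomic index (8192 slots)
                                   .bits(119)      // ciphertext modulus size
                                   .precision(20)  // encoding precision in bits
                                   .c(2)           // key-switching matrix columns
                                   .build();

      helib::SecKey secretKey(context);
      secretKey.GenSecKey();
      const helib::PubKey& publicKey = secretKey;

      // Ciphertext packing: one plaintext value per slot, so a single
      // ciphertext carries getNSlots() pixels or activations at once.
      long n = context.getNSlots();
      std::vector<double> pixels(n, 0.5);          // stand-in for normalized pixels
      helib::PtxtArray ptxt(context, pixels);
      helib::Ctxt ctxt(publicKey);
      ptxt.encrypt(ctxt);

      // HE supports only additions and multiplications, so a non-linear
      // activation is replaced by a low-degree polynomial; the simplest
      // choice, f(x) = x^2, consumes one level of multiplicative depth.
      ctxt *= ctxt;

      // Decrypt (only the data owner holds secretKey) and inspect the result.
      helib::PtxtArray decrypted(context);
      decrypted.decrypt(ctxt, secretKey);
      std::vector<double> out;
      decrypted.store(out);
      std::cout << "slot 0 after square activation: " << out[0] << std::endl;
      return 0;
    }

Packing many values per ciphertext and multi-threading across ciphertexts are the kinds of optimizations the keywords below refer to; the exact choices made in the paper are not reproduced here.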

DOI
10.1145/3493700.3493760
Publication Date
1-8-2022
Keywords
  • Ciphertext packing
  • Convolutional neural network
  • Homomorphic encryption
  • Multi-threading
  • Non-linear activation function
  • Optimization
Comments

IR Deposit conditions: non-described

Citation Information
N. Jain, K. Nandakumar, N. Ratha, S. Pankanti and U. Kumar, "PPDL - privacy preserving deep learning using homomorphic encryption", in 5th Joint International Conference on Data Science & Management of Data (9th ACM IKDD CODS and 27th COMAD), CODS-COMAD 2022, Association for Computing Machinery, 2022, pp. 318–319. doi: 10.1145/3493700.3493760