Recently, researchers from Microsoft, Technion, Princeton University, and the Algorand Foundation unveiled a new framework called Falcon. Built as a comprehensive 3-party protocol, the framework can be deployed for fast and secure computation of deep learning algorithms on large networks.

Today, as more and more businesses depend heavily on data, a massive amount of private and sensitive data is continually being generated. According to the researchers, combining such data with deep learning algorithms can play an essential role in transforming the current social and technological landscape.

The idea behind Falcon

Falcon is a deep learning framework built to support both training and inference with a malicious security guarantee. It combines ideas from SecureNN and ABY3 with a new set of constructions for privacy-preserving deep learning.

Written in C++, Falcon is engineered on the communication backend of SecureNN and provides a cryptographically secure framework in which client data is kept secure by splitting it into unrecognizable parts (shares) among non-colluding parties.
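To illustrate the splitting idea, here is a minimal C++ sketch of 3-party additive secret sharing over a 64-bit ring. It is not Falcon's actual code (Falcon uses a 2-out-of-3 replicated sharing scheme and its own share types), and the names are illustrative, but it shows how a value can be broken into shares that individually reveal nothing about the original data.

```cpp
// Minimal sketch of 3-party additive secret sharing (illustrative only,
// not Falcon's API). Arithmetic is modulo 2^64 via unsigned wraparound.
#include <array>
#include <cstdint>
#include <iostream>
#include <random>

// Split a secret x into three random-looking shares that sum to x mod 2^64.
std::array<uint64_t, 3> share(uint64_t x, std::mt19937_64 &rng) {
    uint64_t r0 = rng();
    uint64_t r1 = rng();
    return {r0, r1, x - r0 - r1};  // uint64_t wraparound gives the ring arithmetic
}

// No single share reveals the secret; reconstruction needs all of them.
uint64_t reconstruct(const std::array<uint64_t, 3> &shares) {
    return shares[0] + shares[1] + shares[2];
}

int main() {
    std::mt19937_64 rng(42);          // stand-in for a cryptographic RNG
    uint64_t secret = 123456789;
    auto shares = share(secret, rng);
    std::cout << "reconstructed: " << reconstruct(shares) << "\n";  // 123456789
}
```

In a real deployment, each of the three computation servers would receive only its own share(s), and all neural-network operations would be carried out directly on the shared values.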

Below are the three primary advantages of Falcon:

  • Falcon is highly expressive: it is the first secure framework to support high-capacity networks with millions of parameters, such as VGG16, and the first of its kind to support batch normalization.
  • Falcon guarantees security with abort against malicious adversaries, assuming an honest majority. The protocol ensures that it either completes with correct results for the honest participants or aborts when it detects the presence of a malicious adversary (a minimal sketch of this abort idea follows this list).
  • Falcon introduces new theoretical insights into protocol design that improve efficiency and allow it to outperform existing secure deep learning solutions.
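The sketch below illustrates the abort-on-inconsistency behaviour mentioned above. It is not Falcon's code; the check and the transcripts are hypothetical, and a production protocol would use a cryptographic hash rather than std::hash. The point is only that honest parties compare their views of the protocol and stop, rather than output a wrong result, if the views disagree.

```cpp
// Illustrative sketch of "security with abort": parties cross-check digests of
// the messages they exchanged and abort the whole protocol on any mismatch.
#include <cstdlib>
#include <functional>
#include <iostream>
#include <string>

// Hypothetical consistency check. std::hash is a stand-in for a real
// cryptographic hash; a mismatch indicates a tampered or inconsistent view.
void check_or_abort(const std::string &view_party_a, const std::string &view_party_b) {
    std::hash<std::string> h;
    if (h(view_party_a) != h(view_party_b)) {
        std::cerr << "inconsistent views detected, aborting protocol\n";
        std::exit(EXIT_FAILURE);  // honest parties halt instead of returning a wrong result
    }
}

int main() {
    check_or_abort("share-update-round-1", "share-update-round-1");  // consistent: continue
    check_or_abort("share-update-round-2", "tampered-round-2");      // mismatch: abort here
}
```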

Exploring Falcon

To evaluate and test Falcon, the researchers deployed six diverse networks, ranging from a simple 3-layer multi-layer perceptron (MLP) with approximately 118,000 parameters to large 16-layer networks with roughly 138 million parameters.

The networks were trained on popular datasets, MNIST, CIFAR-10, and Tiny ImageNet, chosen as appropriate for the network size.

According to the researchers, Falcon is the first secure machine learning framework capable of supporting the training of high-capacity networks, namely AlexNet and VGG16, on the Tiny ImageNet dataset.

Additionally, the framework was evaluated in both LAN and WAN settings, under both semi-honest and malicious adversarial models, and showed performance improvements over SecureNN, an earlier 3-party secure computation framework for neural networks.

Falcon is also claimed to be a 3-PC framework optimized for communication, which is often considered the main bottleneck.

More about this project

According to the researchers, Falcon has the potential to enable secure deep learning through (a) malicious security, (b) improved protocols, and (c) expressiveness.