ReLU, CIFAR, Parity, Lookup readmes based on challenges and articles from the winners #13

Open: wants to merge 7 commits into base `main`
145 changes: 67 additions & 78 deletions include/polycircuit/component/README/cifar10_README.md
# CIFAR-10 classification

☀️ *This component was developed during the FHERMA CIFAR-10 image classification challenge in Mar - Jun 2024.*

---
This README describes the winning solution for the [FHERMA](https://fherma.io) [CIFAR-10 classification challenge](https://fherma.io/challenges/652bf663485c878710fd0209).

## Overview
For a more comprehensive analysis of the solution, check out the blog posts by the challenge winners: **hita** (Hieu Nguyen from the University of Technology Sydney) and [the team of **Valentina Kononova**, **osmenojka** (Dmitry Tronin), and **Dmitrii Lekomtsev**](https://fherma.io/content/66d86ed4e4477f9e186fa08f).

## Image classification on an encrypted dataset via the CKKS scheme

[CIFAR-10](https://www.cs.toronto.edu/~kriz/cifar.html) is a widely recognized dataset comprising 60,000 color images of size 32x32 pixels, categorized into 10 classes such as automobiles, airplanes, dogs, etc.
This dataset serves as a standard benchmark for machine learning algorithms in computer vision.

The goal of the challenge was to develop and implement a machine-learning model capable of efficiently classifying CIFAR-10 images encrypted with the **CKKS homomorphic encryption scheme** without decrypting them, showcasing the intersection of cryptography and machine learning, particularly in privacy-preserving computations.

## Challenge requirements
### Input and encoding technique

The following class indexing was utilized for the CIFAR-10 dataset:

| Index | Class |
| -------- | -------- |
| 0 | Airplane |
| 1 | Automobile |
| 2 | Bird |
| 3 | Cat |
| 4 | Deer |
| 5 | Dog |
| 6 | Frog |
| 7 | Horse |
| 8 | Ship |
| 9 | Truck |


Each input image is encoded as a real vector of dimension 3072 = 3×1024.
The initial 1024 slots denote the red channel, the subsequent ones denote green, and the final segment denotes blue.

Each slot stores a value in the range of [0, 255].
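A minimal plaintext sketch of this packing, using NumPy, might look as follows. The row-major flattening order within each channel is an assumption for illustration; the challenge text only fixes the channel order.

```python
import numpy as np

def pack_cifar_image(image: np.ndarray) -> np.ndarray:
    """Pack a 32x32x3 (H, W, RGB) uint8 image into a 3072-slot vector:
    slots 0-1023 hold red, 1024-2047 green, 2048-3071 blue."""
    assert image.shape == (32, 32, 3)
    # Row-major flattening of each 32x32 channel into 1024 slots is an
    # assumption; the challenge only specifies the channel segments.
    channels = [image[:, :, c].reshape(1024) for c in range(3)]
    return np.concatenate(channels).astype(np.float64)

# A synthetic image whose red channel is saturated and the rest is zero.
img = np.zeros((32, 32, 3), dtype=np.uint8)
img[:, :, 0] = 255
vec = pack_cifar_image(img)  # vec[:1024] == 255, vec[1024:] == 0
```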

### Output

The outcome of the computation is governed by the initial 10 slots in the resultant ciphertext.
If the input image belongs to class `i`, then within the first 10 slots of the resultant vector, the maximum value must be located in slot `i`.

### Example

For an image classified as an airplane (class 0), correct outputs might look like:

| 0.78 | 0.23 | 0.56 | 0.75 | 0 | 0.1 | 0.23 | 0.56 | 0.43 | 0.3 | ... |
|---|---|---|---|---|---|---|---|---|---|---|

| 0.98 | 0.23 | 0.26 | 0.93 | 0 | 0.1 | 0.23 | 0.56 | 0.43 | 0.3 | ... |
|---|---|---|---|---|---|---|---|---|---|---|

| 0.48 | 0.23 | 0.16 | 0.17 | 0 | 0.1 | 0.23 | 0.26 | 0.43 | 0.3 | ... |
|---|---|---|---|---|---|---|---|---|---|---|
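After decryption, the predicted class can be read off with a simple argmax over the first 10 slots; a minimal sketch:

```python
# Decode a decrypted result vector: the predicted class is the index of
# the maximum value among the first 10 slots.
def decode_class(slots):
    scores = slots[:10]
    return max(range(10), key=lambda i: scores[i])

# First example row above: the maximum (0.78) sits in slot 0, airplane.
row = [0.78, 0.23, 0.56, 0.75, 0, 0.1, 0.23, 0.56, 0.43, 0.3]
predicted = decode_class(row)
```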

## Machine Learning Model

A neural network with one hidden layer was employed for image classification. The chosen model architecture was intentionally simple to align with the encrypted nature of the data and the constraints of homomorphic encryption.
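As a rough illustration of such an architecture, the sketch below runs a one-hidden-layer forward pass in plaintext. The hidden width (128) and the squaring activation are assumptions, not the winners' published choices; squaring is a popular CKKS-friendly nonlinearity because it costs a single ciphertext multiplication.

```python
import numpy as np

# Illustrative one-hidden-layer classifier. Hidden width 128 and the
# square activation are assumptions for the sketch, not the actual model.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3072, 128)) * 0.01
b1 = np.zeros(128)
W2 = rng.normal(size=(128, 10)) * 0.01
b2 = np.zeros(10)

def forward(x: np.ndarray) -> np.ndarray:
    h = (x @ W1 + b1) ** 2   # square activation instead of ReLU
    return h @ W2 + b2       # 10 class scores, one per CIFAR-10 class

scores = forward(rng.uniform(0, 255, size=3072))
```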

## Operations on Encrypted Data

Given the homomorphically encrypted nature of the CIFAR-10 images, several custom operations were necessary to handle the data without decryption:

- **EvalSum**: Implemented manually as sum keys were not provided.
- **DotProduct**: Custom implementation of EvalInnerProduct was necessary for encrypted operations.

These operations were critical to ensuring that the model could process the encrypted data correctly and efficiently.
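The rotate-and-add pattern behind such an EvalSum, and the inner product built on top of it, can be sketched in plaintext as follows, with `np.roll` standing in for a homomorphic slot rotation:

```python
import numpy as np

def eval_sum(ct: np.ndarray) -> np.ndarray:
    """Rotate-and-add: after log2(n) rotations, every slot holds the sum
    of all n slots (n must be a power of two)."""
    acc = ct.copy()
    shift = 1
    while shift < ct.size:
        acc = acc + np.roll(acc, -shift)  # np.roll mimics a slot rotation
        shift *= 2
    return acc

def dot_product(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    # EvalInnerProduct: slot-wise multiply, then sum across all slots.
    return eval_sum(a * b)

v = np.array([1.0, 2.0, 3.0, 4.0])
total = eval_sum(v)      # every slot holds 10.0
dp = dot_product(v, v)   # every slot holds 30.0
```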

## Optimizations

While neural networks typically benefit from parallelism, initial attempts to accelerate computations were hindered by Python’s pickle serialization limitations. To overcome this:

- **Parameter Adjustments**: We optimized `log_q` and `log_n` parameters and further reduced the number of multiplications.
- **Performance Testing**: Comprehensive unit, integration, and performance tests were implemented to track improvements and ensure accuracy.

These optimizations led to an additional 30% improvement in performance.

## Running and testing
### Test environment

The solution was tested inside a Docker container using the following libraries/packages:

- **OpenFHE**: v1.1.4
- **OpenFHE-Python**: v0.8.6

### Hardware

- **CPU**: 12 cores
- **RAM**: 54 GB

### Command-line interface

The component supports the following CLI options:

- **--input** [path]: path to the encrypted test image file.
- **--output** [path]: path to the output file for results.
- **--cc** [path]: path to the **cryptocontext** file in **BINARY** form.
- **--key_public** [path]: path to the **public key** file.
- **--key_mult** [path]: path to the **evaluation (multiplication) key** file.
- **--key_rot** [path]: path to the **rotation key** file.

### Parameters

Parameters that are used to generate crypto context, keys, and ciphertext are specified in the config file **config.json** located in the project root.
See the **Parameters** section in our [FHERMA participant guide](https://fherma.io/how_it_works) for more info on what can be configured.

## Useful Links

* [FHERMA participation guide](https://fherma.io/how_it_works)—more on FHERMA challenges
* [OpenFHE](https://github.com/openfheorg/openfhe-development) repository, README, and installation guide
* [OpenFHE Python](https://github.com/openfheorg/openfhe-python)
* [OpenFHE Rust](https://github.com/fairmath/openfhe-rs), its tutorial and documentation
* [OpenFHE AAAI 2024 Tutorial](https://openfheorg.github.io/aaai-2024-lab-materials/)—Fully Homomorphic Encryption for Privacy-Preserving Machine Learning Using the OpenFHE Library
* [A vast collection of resources](https://fhe.org/resources) collected by [FHE.org](http://FHE.org), including tutorials and walk-throughs, use-cases and demos.
76 changes: 76 additions & 0 deletions include/polycircuit/component/README/lookup_README
# Lookup table component

☀️ *This component was developed during the FHERMA Lookup challenge in Apr - Aug 2024.*

---
This README describes the winning solution for the [FHERMA](https://fherma.io) [Lookup table challenge](https://fherma.io/challenges/665efcf8bad7bdd77d182111).

For a more comprehensive analysis, check out the [blog post](https://fherma.io/content/66d9c84af6ea18c58bf5e97a) by the challenge winner, [Jules Dumezy](https://www.linkedin.com/in/jules-dumezy/), MSc Student at the Ecole Centrale de Lille.

## Overview

The objective of the challenge is to retrieve the $i$-th element of an array encrypted under BGV/BFV, with both the array and the element's position being encrypted.

## Challenge requirements
### Parameters of the input

1. **Vector and its packing**: the values of the vector are written in slots $0,\ldots,(n-1)$ of the ciphertext, where $n$ is the size of the vector; slot $i$ contains the value $x_{i+1}$.
2. **Input range**: each element $x_i$ is an integer in $[0,255]$.

You will find an example input below in the **Example** section.

### Requirements of the output

1. **Packing**: the result should contain one value: the element of the encrypted vector located at the encrypted index.
2. **Accuracy**: the returned value must not incur an error of more than $0.01$.

You will find an example output below in the **Example** section.

### Example

The executable will be run as follows:
```
./app --cc cc.bin --key_public pub.bin --key_mult mult.bin --array array.bin --idx index.bin --output out.bin
```
An example for the message encrypted in `array.bin`: `Input = [1, 2, 3, 4, 5]`. An example for the message encrypted in `index.bin`: `Index = [2]`.

An example output for this input: `Output = [3]`
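The README does not spell out the winning circuit, but one textbook way to perform such a lookup under BGV/BFV is to build a one-hot selector from the encrypted index via Fermat's little theorem and take an inner product with the array. A plaintext sketch follows; the modulus $p = 257$ is an illustrative prime covering the $[0,255]$ range, not a parameter taken from the solution:

```python
# One-hot lookup sketch: eq(a, b) is 1 iff a == b, built only from
# additions, multiplications, and exponentiation (Fermat's little
# theorem: a^(p-1) = 1 mod p for a != 0), all BFV/BGV-friendly ops.
p = 257  # illustrative prime, not from the winning solution

def eq(a: int, b: int) -> int:
    return (1 - pow(a - b, p - 1, p)) % p

def lookup(arr, idx):
    # Inner product of the one-hot selector [idx == i] with the array.
    return sum(eq(idx, i) * x for i, x in enumerate(arr)) % p

# The example from the section above: array [1, 2, 3, 4, 5], index 2.
result = lookup([1, 2, 3, 4, 5], 2)  # slot 2 holds the value 3
```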

## Running and testing
### Test environment

The solution was tested inside a Docker container using the following libraries/packages:

- **OpenFHE**: v1.1.4
- **OpenFHE-Python**: v0.8.6

### Hardware

- **CPU**: 12 cores
- **RAM**: 54 GB

### Command-line interface

The component supports the following CLI options:

- **--array** [path]: specifies the path to the file containing the encrypted vector/array.
- **--idx** [path]: specifies the path to the file containing the vector/array with the first element specifying the encrypted index.
- **--output** [path]: specifies the path to the file where the result should be written.
- **--cc** [path]: indicates the path to the crypto context file serialized in **BINARY** form.
- **--key_public** [path]: the path to the Public Key file.
- **--key_mult** [path]: the path to the Evaluation (Multiplication) Key file.
- **--key_rot** [path]: the path to the Rotation Key file.

### Parameters

Parameters that are used to generate crypto context, keys, and ciphertext are specified in the config file **config.json** located in the project root.
See the **Parameters** section in our [FHERMA participant guide](https://fherma.io/how_it_works) for more info on what can be configured.

## Useful links

* [FHERMA participation guide](https://fherma.io/how_it_works)—more on FHERMA challenges
* [OpenFHE](https://github.com/openfheorg/openfhe-development) repository, README, and installation guide
* [OpenFHE Python](https://github.com/openfheorg/openfhe-python)
* [OpenFHE Rust](https://github.com/fairmath/openfhe-rs), its tutorial and documentation
* [OpenFHE AAAI 2024 Tutorial](https://openfheorg.github.io/aaai-2024-lab-materials/)—Fully Homomorphic Encryption for Privacy-Preserving Machine Learning Using the OpenFHE Library
* [A vast collection of resources](https://fhe.org/resources) collected by [FHE.org](http://FHE.org), including tutorials and walk-throughs, use-cases and demos.
95 changes: 41 additions & 54 deletions include/polycircuit/component/README/parity_README.md
# Parity component

☀️ *This component was developed during the FHERMA Parity challenge by [IBM Research](https://research.ibm.com) in Mar - Jun 2024.*

---
This README describes the winning solution for the [FHERMA](https://fherma.io) [Parity challenge](https://fherma.io/challenges/652bf669485c878710fd020b).

## Overview

The challenge was to homomorphically compute the function $parity(x) = x \bmod 2$, where $x \in \mathbb{Z}$.
The `parity` function is closely related to the bit extraction problem: given an integer $x$, the goal is to find its bit representation $x = \sum 2^i b_i$, which is useful in many cases, e.g., comparisons.
Thus, an efficient implementation of `parity(x)` would lead to an efficient implementation of bit extraction.

## Challenge requirements
### Parameters of the input

1. **Packing:** Each slot will contain one value $x_i$.
2. **Input range:** For each element $x_i=n_i + e_i$, where $n_i\in [0,255]$ is an integer, and $|e_i| < 10^{-5}$ is a small noise.

### Requirements of the output

1. **Packing:** Each slot will contain one value $y_i = parity(x_i)$, the parity of the corresponding slot in the input.
2. **Output range:** For each element $y_i = b_i + E_i$, where $b_i \in \{0,1\}$ is a bit, and $|E_i| < 10^{-2}$ is a small noise.

### Example

The executable runs as follows:

```
./app --cc cc.bin --key_public pub.bin --key_mult mult.bin --input in.bin --output out.bin
```

An example output for this input:

`Output = [0.995, 0.008, 1.002, 1.004, -0.003, ...]`
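The winning circuit is not reproduced here, but one standard CKKS route to parity is the identity $parity(x) = (1 - \cos(\pi x))/2$, which is exact on integers, tolerant of the small input noise, and implementable with a polynomial (e.g., Chebyshev) approximation of cosine. A plaintext sketch:

```python
import math

# parity(x) = (1 - cos(pi * x)) / 2: exactly 0 on even integers and 1 on
# odd ones. Under CKKS, cos would be evaluated via a Chebyshev polynomial
# approximation; this plaintext sketch uses math.cos directly.
def parity(x: float) -> float:
    return (1 - math.cos(math.pi * x)) / 2

# Noisy inputs within the challenge's |e_i| < 1e-5 bound.
vals = [4.0000013, 7.0000002, 0.0000004, 255.0]
outs = [parity(v) for v in vals]
```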

## Running and testing
### Test environment

The solution was tested inside a Docker container using the following libraries/packages:

- **OpenFHE**: v1.1.4
- **OpenFHE-Python**: v0.8.6

### Hardware

- **CPU**: 12 cores
- **RAM**: 54 GB

### Command-line interface

The component supports the following CLI options:

- **--input** [path]: path to the file containing the encrypted vector.
- **--output** [path]: path to the output file for results.
- **--cc** [path]: path to the **cryptocontext** file serialized in **BINARY** form.
- **--key_public** [path]: path to the **public key** file.
- **--key_mult** [path]: path to the **evaluation (multiplication) key** file.
- **--key_rot** [path]: path to the **rotation key** file.

### Parameters

Parameters that are used to generate crypto context, keys, and ciphertext are specified in the config file **config.json** located in the project root.
See the **Parameters** section in our [FHERMA participant guide](https://fherma.io/how_it_works) for more info on what can be configured.

## Useful links

* [FHERMA participation guide](https://fherma.io/how_it_works)—more on FHERMA challenges
* [OpenFHE](https://github.com/openfheorg/openfhe-development) repository, README, and installation guide
* [OpenFHE Python](https://github.com/openfheorg/openfhe-python)
* [OpenFHE Rust](https://github.com/fairmath/openfhe-rs), its tutorial and documentation
* [OpenFHE AAAI 2024 Tutorial](https://openfheorg.github.io/aaai-2024-lab-materials/)—Fully Homomorphic Encryption for Privacy-Preserving Machine Learning Using the OpenFHE Library
* [A vast collection of resources](https://fhe.org/resources) collected by [FHE.org](http://FHE.org), including tutorials and walk-throughs, use-cases and demos.