
Cross-Silo Vertical Learning? #1

Open
kylezhaoxc opened this issue Dec 22, 2020 · 0 comments

Comments


kylezhaoxc commented Dec 22, 2020

I understand that in cross-silo horizontal learning settings, what needs to be encrypted is the gradients from the different data owners. Performing BatchCrypt then amounts to HE plus "local batch quantization/packing", which holds for an acceptable loss of precision and is unlikely to harm the convergence or performance of the model.
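To make the horizontal case concrete, here is a minimal plaintext sketch of the BatchCrypt-style quantize-and-pack step described above. The bit widths, padding, and helper names are illustrative assumptions, not the paper's actual encoding, and the real scheme handles signs and carries across slots more carefully so that ciphertext additions stay correct:

```python
import numpy as np

BITS = 16  # assumed per-gradient quantization width
PAD = 2    # headroom bits so a few additions do not overflow a slot

def quantize(grads, alpha):
    """Clip gradients to [-alpha, alpha] and map to signed 16-bit integers."""
    clipped = np.clip(grads, -alpha, alpha)
    scale = (2 ** (BITS - 1) - 1) / alpha
    return np.round(clipped * scale).astype(np.int64)

def pack(q):
    """Pack quantized values into one big integer (two's-complement slots).

    The packed integer is what would be encrypted once under additive HE,
    instead of encrypting each gradient element separately.
    """
    slot = BITS + PAD
    packed = 0
    for v in q:
        packed = (packed << slot) | (int(v) & ((1 << slot) - 1))
    return packed

def unpack(packed, n):
    """Recover n signed values from a packed integer."""
    slot = BITS + PAD
    mask = (1 << slot) - 1
    out = []
    for _ in range(n):
        v = packed & mask
        if v >= 1 << (slot - 1):  # restore sign from two's complement
            v -= 1 << slot
        out.append(v)
        packed >>= slot
    return list(reversed(out))

def dequantize(q, alpha):
    scale = (2 ** (BITS - 1) - 1) / alpha
    return np.array(q) / scale
```

Round-tripping a gradient vector through `quantize` → `pack` → `unpack` → `dequantize` shows the bounded precision loss the comment refers to.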

Do you have any idea how this could be done in a cross-silo vertical learning setting? There the intermediate results are not only gradients, but also linear computation results and components used for computing the gradients. Applying "BatchNorm"-style batching to these doesn't seem to make sense. Any suggestions?
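For reference, a plaintext sketch of the intermediates in a typical two-party vertical logistic-regression round (Hardy-et-al.-style protocol); the variable names and the Taylor-approximated residual are illustrative assumptions, not tied to any specific library. In the real protocol the exchanged values below would travel under homomorphic encryption, which is why they are harder to batch than plain gradients:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
x_a = rng.normal(size=(n, 3))   # party A's features
x_b = rng.normal(size=(n, 2))   # party B's features
y = rng.integers(0, 2, size=n)  # labels, held by party B
w_a, w_b = np.zeros(3), np.zeros(2)

u_a = x_a @ w_a                    # A's linear partial score -> exchanged
u_b = x_b @ w_b                    # B's linear partial score
u = u_a + u_b                      # combined linear computation result
d = 0.25 * u - 0.5 * (2 * y - 1)   # residual (Taylor-approx. of sigmoid loss)
grad_a = x_a.T @ d / n             # A's gradient needs d, which mixes B's data
grad_b = x_b.T @ d / n
```

Note that `u_a`, `u`, and `d` are exactly the "linear computation result" and "component used for computing the gradient" mentioned above: each party needs values derived from the other party's private data before any gradient exists.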
