Here, we provide a deep dive into Confidential Computing: how it can protect data privacy and where it comes from.
This vulnerability can be used to mount a Man-in-the-Middle attack. Teaclave implemented a fix following the release of this article.
How we partnered with Avian to deploy sensitive forensic services thanks to Zero Trust Elasticsearch.
If you’re wondering about the benefits and weaknesses of differential privacy, confidential computing, federated learning, and other privacy-enhancing technologies, and how they can be combined to improve artificial intelligence and data privacy, you’ve come to the right place.
Hackers can easily hijack the data science libraries you use every day and gain full access to the datasets you are working with. Data owners need tools to prevent this from happening.
BastionLab is a simple privacy framework for data science collaboration. It lets data owners protect the privacy of their datasets by enforcing that only privacy-friendly operations are allowed on the data and that only anonymized outputs are shown to the data scientist.
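To make that access-control idea concrete, here is a minimal conceptual sketch of the kind of policy a data owner might enforce: only aggregate queries over sufficiently large groups are answered, so the data scientist never sees row-level data. This is an illustration only, not BastionLab’s actual API; the names (`run_aggregate_query`, `MIN_GROUP_SIZE`) and the threshold value are assumptions made for the example.

```python
# Conceptual sketch of a data-owner-enforced query policy (not BastionLab's API).
import pandas as pd

MIN_GROUP_SIZE = 10  # hypothetical k-anonymity-style threshold chosen by the data owner


def run_aggregate_query(df: pd.DataFrame, group_col: str, value_col: str) -> pd.DataFrame:
    """Return per-group means, but only for groups large enough to be non-identifying."""
    stats = df.groupby(group_col)[value_col].agg(["mean", "count"])
    # Suppress small groups so no individual can be singled out from the output.
    safe = stats[stats["count"] >= MIN_GROUP_SIZE]
    return safe[["mean"]]


if __name__ == "__main__":
    # Toy dataset standing in for a sensitive one held by the data owner.
    df = pd.DataFrame({
        "region": ["north"] * 12 + ["south"] * 3,
        "income": [30_000 + 1_000 * i for i in range(15)],
    })
    # The data scientist only ever receives the aggregated, filtered result.
    print(run_aggregate_query(df, "region", "income"))
```

In this sketch, the "south" group is dropped from the output because it is too small to be safely disclosed; a real framework would enforce such rules server-side, out of the data scientist's reach.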
Discover BastionAI, a Rust project for confidential deep learning training. BastionAI leverages Confidential Computing and Differential Privacy to make AI training between multiple parties more privacy-friendly.
Deploy confidential medical image analysis thanks to BlindAI
Deep dive into the data-in-use protection mechanisms of secure enclaves
An introduction to remote attestation, which is the key to trusting a remote enclave.
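As a rough intuition for what attestation involves, the toy sketch below simulates the core check: before sending any data, the client verifies that a report is signed by a key it trusts and that the reported code measurement matches the build it expects. Real attestation (e.g., Intel SGX quotes) relies on a hardware-rooted certificate chain rather than a pre-shared key; the shared key, `EXPECTED_MEASUREMENT`, and the report format here are simplifying assumptions for illustration.

```python
# Toy simulation of the remote-attestation check (real schemes use hardware-rooted
# certificate chains, e.g. Intel SGX quotes, not a pre-shared HMAC key).
import hashlib
import hmac
import os

SHARED_KEY = b"stand-in-for-hardware-root-of-trust"  # assumption for this sketch
EXPECTED_MEASUREMENT = hashlib.sha256(b"enclave-binary-v1").hexdigest()


def enclave_generate_report(nonce: bytes) -> dict:
    """What the 'enclave' returns: its code measurement, plus a tag binding it to the nonce."""
    measurement = hashlib.sha256(b"enclave-binary-v1").hexdigest()
    tag = hmac.new(SHARED_KEY, measurement.encode() + nonce, hashlib.sha256).hexdigest()
    return {"measurement": measurement, "tag": tag}


def client_verify(report: dict, nonce: bytes) -> bool:
    """The client checks the tag (freshness via the nonce) and the expected measurement."""
    expected_tag = hmac.new(
        SHARED_KEY, report["measurement"].encode() + nonce, hashlib.sha256
    ).hexdigest()
    return (
        hmac.compare_digest(report["tag"], expected_tag)
        and report["measurement"] == EXPECTED_MEASUREMENT
    )


if __name__ == "__main__":
    nonce = os.urandom(16)  # prevents replay of an old report
    report = enclave_generate_report(nonce)
    # Only if this check passes would the client send sensitive data to the enclave.
    print("Enclave trusted:", client_verify(report, nonce))
```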
An introduction to Confidential Computing and the problems it solves