We could have built our privacy framework BastionLab in any language - Python, for instance, data science's favorite. But we chose Rust for its efficiency and its security guarantees. Here are the reasons we're glad we did, along with some of the challenges we ran into along the way.
Hackers can easily hijack the data science libraries you use every day and gain full access to the datasets you are working with. Data owners need tools to prevent that from happening.
When teams collaborate remotely on sensitive data, the interactivity and flexibility that make these libraries so appealing need safeguards, or an entire dataset can be extracted in just a few lines of code.
BastionLab is a simple privacy framework for data science collaboration. It lets data owners protect their datasets by enforcing that only privacy-friendly operations can be run on the data, and that only anonymized outputs are shown to the data scientist.
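To make that idea concrete, here is a minimal, hypothetical Rust sketch - not BastionLab's actual API - of the kind of server-side policy check this implies: the data owner sets a policy, every query is evaluated against it, and only aggregated results are released. The `Query`, `Policy`, and `run_query` names are illustrative assumptions.

```rust
// Hypothetical sketch of policy-checked data access (not BastionLab's real API).

#[derive(Debug)]
enum Query {
    /// Asks for raw rows: always rejected by a privacy policy.
    FetchRows,
    /// Asks for a single aggregate (here, a mean) over the dataset.
    Mean,
}

struct Policy {
    /// Smallest group size for which an aggregate may be released.
    min_aggregation_size: usize,
}

fn run_query(data: &[f64], query: Query, policy: &Policy) -> Result<f64, String> {
    match query {
        // Raw access would reveal individual records, so it is denied outright.
        Query::FetchRows => Err("policy violation: raw rows are never released".into()),
        // Aggregates are only released if enough rows are involved.
        Query::Mean => {
            if data.len() < policy.min_aggregation_size {
                Err(format!(
                    "policy violation: aggregation over {} rows (minimum is {})",
                    data.len(),
                    policy.min_aggregation_size
                ))
            } else {
                Ok(data.iter().sum::<f64>() / data.len() as f64)
            }
        }
    }
}

fn main() {
    let salaries = vec![42_000.0, 51_500.0, 38_250.0, 61_000.0, 47_800.0];
    let policy = Policy { min_aggregation_size: 5 };

    // The data scientist only ever sees the outcome of policy-checked queries.
    println!("{:?}", run_query(&salaries, Query::Mean, &policy));      // Ok(48110.0)
    println!("{:?}", run_query(&salaries, Query::FetchRows, &policy)); // Err(...)
}
```

Even in this toy version, Rust's enums and `Result` type make the deny-by-default logic explicit and checked at compile time, which is part of why the language appealed to us for this kind of enforcement code.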