On June 14th, the EU Parliament passed the AI Act. We gathered information on this complex piece of legislation for you. Let’s break down how the EU wants to regulate Artificial Intelligence in 10 questions.
In this article, we'll demonstrate how you can efficiently analyze code at scale while maintaining privacy. We'll use BlindBox, our open-source secure enclave tooling, to serve StarCoder with privacy guarantees on Azure.
Discover confidential computing with our tutorials. Fill the knowledge gap, become proficient in secure enclaves, and craft applications with their strengths. Join us to become a Confidential Computing wizard! Dive into our content and start your journey today.
With BlindBox, you can use Large Language Models without any intermediary or model owner seeing the data sent to the models. This type of solution is critical today, as the newfound ease of use of generative AI (GPT-4, MidJourney, GitHub Copilot…) is already revolutionizing the tech industry.
We are excited to introduce BlindBox, our latest open-source solution designed to enhance SaaS deployment security. Our tooling enables developers to wrap any Docker image with isolation layers and deploy it inside Confidential Containers.
We could have built our privacy framework BastionLab in any language - Python, for example, data science’s language of choice. But we chose Rust because of its efficiency and security features. Here’s why we loved that choice, along with some challenges we encountered along the way.
If you’re wondering about the benefits and weaknesses of differential privacy, confidential computing, federated learning, and similar technologies, and how they can be combined to improve artificial intelligence and data privacy, you’ve come to the right place.
Hackers can easily hijack the data science libraries you use every day and get full access to the datasets you are working with. Data owners need tools to prevent this from happening.
Data science tools are prized for their interactivity and flexibility, but when collaborating remotely on sensitive data, those strengths need safeguards - otherwise whole datasets can be extracted in a few lines of code.
BastionLab is a simple privacy framework for data science collaboration. It lets data owners protect the privacy of their datasets by enforcing that only privacy-friendly operations are allowed on the data, so the data scientist only ever sees anonymized outputs.
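To make the idea of "only privacy-friendly operations" concrete, here is a minimal conceptual sketch in Python. It is not BastionLab's actual API - the `GuardedDataset` class, its method names, and the minimum-group-size rule are all hypothetical - but it illustrates the access pattern: the data owner keeps the raw rows, and collaborators can only request whitelisted aggregates over sufficiently large groups.

```python
# Conceptual sketch (NOT BastionLab's real API): a data owner wraps a dataset
# so that collaborators can only run whitelisted aggregate queries on it.
class GuardedDataset:
    AGGREGATES = {"mean", "count", "sum"}  # operations considered privacy-friendly

    def __init__(self, rows, min_group_size=5):
        self._rows = rows        # raw data never leaves this object
        self._min = min_group_size  # refuse tiny, re-identifiable groups

    def query(self, op, column):
        if op not in self.AGGREGATES:
            raise PermissionError(f"operation {op!r} is not privacy-friendly")
        values = [row[column] for row in self._rows]
        if len(values) < self._min:
            raise PermissionError("group too small: result could identify individuals")
        if op == "count":
            return len(values)
        if op == "sum":
            return sum(values)
        return sum(values) / len(values)  # mean


rows = [{"age": a} for a in (23, 31, 35, 41, 52, 64)]
ds = GuardedDataset(rows)
print(ds.query("mean", "age"))  # aggregates are allowed; raw rows are not exposed
```

A real framework adds much more on top of this pattern (remote execution, access policies, differential-privacy noise on the outputs), but the core contract is the same: the data scientist interacts with a gatekeeper, never with the rows themselves.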
One and a half years later, Mithril Security’s roadmap has transformed significantly, but our initial goal has stayed the same: democratizing privacy in data science.
Discover BastionAI, a Rust project for confidential deep learning training. BastionAI leverages Confidential Computing and Differential Privacy to make AI training between multiple parties more privacy-friendly.