Ethical AI with (homomorphic) encryption in healthcare, finance and IT
Artificial intelligence is revolutionising industries from healthcare to finance, bringing opportunities for innovation and growth. But as AI systems become more sophisticated, the ethical challenges surrounding data privacy and security grow with them. Under regulations such as the EU's General Data Protection Regulation (GDPR), navigating these challenges is not a task to take lightly. This is where the expertise of LegalAIR comes into play, offering a way to balance innovation and privacy with technologies like encryption.
Legal and ethical implications
In sectors with strict regulatory requirements, such as finance, IT and healthcare, safeguarding data privacy is of great importance. Large and diverse datasets are crucial for improving the accuracy and applicability of AI models, yet privacy regulations like the GDPR make collecting such data a challenge. On top of that, AI systems must be trustworthy: lawful, ethical and robust.
Encryption helps address this. However, powerful as it is, it is not a silver bullet: its use must be carefully regulated to ensure that it truly protects data privacy without creating new vulnerabilities. For businesses and institutions looking to adopt encryption methods, expert legal and technological advice is crucial. This is where LegalAIR can help.
New study on homomorphic encryption
Homomorphic encryption (HE) is an encryption method that allows computations to be performed directly on encrypted data. The data remains encrypted while it is being used in computations or analyses: mathematical operations and other forms of processing run on the ciphertext, without the data ever needing to be decrypted first. This keeps the data secure and private throughout the entire process, even while it is actively being used for calculations, such as in AI models. Only the result of the computation is decrypted at the end; the underlying data itself remains protected.
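To make this concrete, here is a minimal sketch using the Paillier cryptosystem, a partially homomorphic scheme that supports addition on ciphertexts. This is an illustrative toy with deliberately tiny, insecure parameters of our own choosing, not the fully homomorphic schemes typically used for AI workloads, but the core idea is the same: the sum is computed without ever decrypting the inputs.

```python
# Toy demonstration of homomorphic encryption using the Paillier
# cryptosystem: ciphertexts can be *added* without ever being decrypted.
# Insecure demo parameters only: real systems use ~2048-bit keys and
# audited libraries rather than hand-rolled crypto.
import random
from math import gcd

def keygen(p=47, q=59):                    # toy primes, far too small for real use
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)                   # modular inverse of lambda mod n
    return n, (lam, mu, n)                 # public key, private key

def encrypt(n, m):
    n2 = n * n
    r = random.randrange(1, n)             # fresh randomness per ciphertext
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2   # g = n + 1

def decrypt(priv, c):
    lam, mu, n = priv
    u = pow(c, lam, n * n)
    return ((u - 1) // n) * mu % n         # L(u) * mu mod n

def add_encrypted(n, c1, c2):
    return (c1 * c2) % (n * n)             # multiplying ciphertexts adds plaintexts

n, priv = keygen()
c1, c2 = encrypt(n, 20), encrypt(n, 22)
c_sum = add_encrypted(n, c1, c2)           # computed on encrypted data only
print(decrypt(priv, c_sum))                # -> 42
```

Notice that add_encrypted only ever touches ciphertexts: whoever runs it learns nothing about the values 20 and 22, yet the key holder can decrypt the correct sum.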
A recent study in South Korea illustrated the potential of HE by applying it to medical data from over 341,000 patients across three hospitals. The data, encrypted with HE, was used to train an AI model for predicting the risk of mortality within 30 days after surgery. The research confirmed that HE can be practically applied to securely combine extensive datasets from multiple sources while ensuring privacy.
The study further suggests that smaller hospitals can benefit from this encryption technique by safely accessing data from larger institutions to develop their own AI models. These models are likely to exhibit enhanced predictive accuracy compared to those built solely on data from a single institution.
What does this mean in practice?
In the study, prediction models trained on multi-institution datasets processed with HE performed better than models trained on single-institution datasets. Securely combining data from multiple sources can thus significantly improve the accuracy and reliability of an AI model, and homomorphic encryption offers a way to integrate data from multiple institutions: an important strategy for developing more robust and effective predictive models, especially in sectors such as healthcare and finance, where data privacy is of great importance.
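As a rough illustration of this pattern (reusing the toy Paillier scheme from the sketch above), the example below shows three hypothetical hospitals each encrypting a local statistic, say their 30-day post-surgery mortality count, so that an aggregator can combine the figures without ever seeing any single hospital's data. The hospital names and numbers are invented for illustration; a real deployment would rely on a production-grade HE library, realistic key sizes and proper key management.

```python
# Sketch: three hospitals encrypt local counts; an untrusted aggregator
# combines the ciphertexts; only the key holder sees the joint total.
# Toy Paillier scheme as above; hospital names and figures are invented.
import random
from math import gcd

def keygen(p=47, q=59):
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
    return n, (lam, pow(lam, -1, n), n)

def encrypt(n, m):
    n2 = n * n
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

n, priv = keygen()                         # key pair held by a trusted party

# Each hospital encrypts its local 30-day mortality count before sharing.
local_counts = {"hospital_A": 12, "hospital_B": 7, "hospital_C": 23}
ciphertexts = [encrypt(n, count) for count in local_counts.values()]

# The aggregator multiplies ciphertexts (adding the plaintexts underneath)
# and never learns any individual hospital's count.
combined = 1
for c in ciphertexts:
    combined = (combined * c) % (n * n)

print(decrypt(priv, combined))             # -> 42, the joint total
```

The aggregator here is fully untrusted with respect to the raw counts: it only manipulates ciphertexts, while the holder of the private key decrypts just the combined result.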
As attractive as homomorphic encryption sounds, its implementation still raises important legal and ethical questions. For example, how can this technology align with existing data protection laws? What are the potential liabilities if something goes wrong? And how can companies ensure that their use of HE is not only compliant with the law, but also ethical? Our team combines legal, ethical and technical expertise and can guide you through the complexities of data protection laws, AI ethics and the latest encryption technologies. We can help you implement encryption in a way that maximises benefits while minimising risks.