In the modern world, it has become increasingly obvious to Massachusetts Institute of Technology futurist David Shrier that no one is safe and that it’s time for a new model for security and data privacy.
In a keynote address at the SecTor security conference here on Nov. 14, Shrier outlined efforts underway by the MIT Trust::Data Consortium, which he helps to lead, to build new systems that redefine how data is shared and secured.
While there is a need for privacy and encryption, there is also a need to be able to access private data in a secure manner for such use cases as counterterrorism, medical emergencies and in times of humanitarian crisis, he said.
“How do we share data safely, and how do we secure it better and then shift the whole model?” Shrier asked the capacity keynote audience.
“We need a policy around us owning our own personal data and not Facebook, and we should also have an audit trail of access,” he said. “And we want to be able to share information when we want.”
Shrier is also the editor of the book “Trust::Data: A New Framework for Identity and Data Sharing,” which outlines the approach that the MIT-led effort wants to see take shape. The key insight in the book is that what is really needed is the ability to share answers to a query rather than full data sets, he said. For example, a bank doesn’t necessarily need all of a person’s data to decide whether to authorize a car loan; it should simply be able to query the data set for an answer.
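Shrier did not describe a concrete interface, but the idea can be sketched in a few lines of Python. In this hypothetical example (the class and method names are illustrative assumptions, not part of any Trust::Data API), the bank submits a question and receives only a yes-or-no answer; the underlying income records never leave the data holder.

```python
# Hypothetical sketch: the data holder answers a narrow question instead of
# handing over raw records. Names here are illustrative, not from any
# Trust::Data code base.
from dataclasses import dataclass

@dataclass
class IncomeRecords:
    """Private financial data held in the individual's own data store."""
    monthly_income: float
    monthly_debt: float

    def can_afford_payment(self, monthly_payment: float) -> bool:
        # Only this boolean leaves the data store; the income and debt
        # figures themselves are never shared with the bank.
        return (self.monthly_income - self.monthly_debt) >= 3 * monthly_payment

applicant = IncomeRecords(monthly_income=5200.0, monthly_debt=900.0)
print(applicant.can_afford_payment(monthly_payment=450.0))  # the bank sees only True/False
```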
In the current model of data collection, Shrier said private data is collected and stored by companies in large data lakes. In the model that he and his colleagues are proposing, there is a query function that talks to different pieces of code to help ensure that only authorized queries are accessing the data system.
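He did not go into implementation detail, but one way to picture that gatekeeping layer is a small query broker that refuses anything outside a vetted whitelist. The sketch below is hypothetical; the query names, permission table and broker function are illustrative assumptions, not drawn from the Trust::Data or OPAL code.

```python
# Hypothetical query broker: only vetted queries from authorized requesters
# ever touch the data store, and only the answer is returned.

APPROVED_QUERIES = {
    "loan_affordability": lambda record, payment: (
        record["monthly_income"] - record["monthly_debt"] >= 3 * payment
    ),
}

def run_query(query_name, data_store, requester, permissions, **params):
    """Run a query only if it is vetted and the requester is authorized."""
    if query_name not in APPROVED_QUERIES:
        raise PermissionError(f"query '{query_name}' has not been vetted")
    if query_name not in permissions.get(requester, set()):
        raise PermissionError(f"'{requester}' is not authorized for '{query_name}'")
    # The raw records stay behind this function; only the result leaves it.
    return APPROVED_QUERIES[query_name](data_store, **params)

permissions = {"acme_bank": {"loan_affordability"}}
data_store = {"monthly_income": 5200.0, "monthly_debt": 900.0}
print(run_query("loan_affordability", data_store, "acme_bank", permissions, payment=450.0))
```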
The query part is being developed as an open-source effort known as the Open Algorithms (OPAL) Project. The OPAL Project site defines the effort as a “collaborative project being developed by a group of partners committed to leveraging the power of platforms, big data and advanced analytics for the public good in a privacy-preserving, commercially sensible, stable, scalable, and sustainable manner.”
Shrier said the overall trust model can make use of blockchain to provide a cryptographically verified distributed system for audit. The computation and query can be performed in a sandbox, providing further data security protections. He added that with the Trust::Data model, individuals give permission to who can see their data and that permission can be rescinded.
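The talk gave no architectural specifics, but the audit-and-permission piece can be illustrated with a toy hash-chained log. A real deployment would sit on a distributed ledger; this single-process sketch (all names are hypothetical) only shows how grants, queries and revocations become an append-only, tamper-evident record.

```python
# Toy hash-chained audit log for permission grants, queries and revocations.
import hashlib
import json
import time

class AuditChain:
    def __init__(self):
        self.entries = []

    def append(self, event: dict) -> None:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"event": event, "time": time.time(), "prev_hash": prev_hash}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)

    def verify(self) -> bool:
        """Recompute every hash; any retroactive edit breaks the chain."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: entry[k] for k in ("event", "time", "prev_hash")}
            expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev_hash"] != prev or entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True

chain = AuditChain()
chain.append({"action": "grant", "who": "acme_bank", "query": "loan_affordability"})
chain.append({"action": "query", "who": "acme_bank", "query": "loan_affordability"})
chain.append({"action": "revoke", "who": "acme_bank", "query": "loan_affordability"})
print(chain.verify())  # True; altering an earlier entry would make this False
```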
“The model lets you perform useful calculations on the useful data in its encrypted form, so you can share critical insights without leaking data,” Shrier said.
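Shrier did not name a particular scheme, but the Paillier cryptosystem is one well-known example of this kind of encrypted computation: two ciphertexts can be combined so that the result decrypts to the sum of the hidden values. The toy below uses tiny, fixed primes purely to demonstrate the property and is nowhere near secure.

```python
# Toy Paillier encryption with tiny fixed primes (illustration only, NOT secure).
# Multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts,
# i.e. a calculation performed on data while it stays encrypted.
import random
from math import gcd

p, q = 61, 53                       # real keys use primes hundreds of digits long
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1)
mu = pow(lam, -1, n)                # modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while gcd(r, n) != 1:           # r must be coprime with n
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    x = pow(c, lam, n2)
    return ((x - 1) // n) * mu % n

c1, c2 = encrypt(1200), encrypt(345)
c_sum = (c1 * c2) % n2              # multiplying ciphertexts adds the plaintexts
print(decrypt(c_sum))               # 1545, obtained without ever decrypting c1 or c2
```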
The Trust::Data framework is still a work in progress, though Shrier is optimistic about its chances for success. “We made it an open-source project for a reason,” he said. “We want people to help make it highly resilient.”
Originally published on eWeek