Could Interoperable Decentralization of Data Help Its Integrity Problems?

Ivermectin research is plagued with data integrity failures, raising an important question for security and privacy professionals: what better data control options are available?

The latest news seems right on track to demand interoperability from technology that enables more individually controlled patient data stores:

"…calling for scientists to adopt a new standard for meta-analyses, where individual patient data, not just a summary of that data, is provided by scientists who conducted the original trials and subsequently collected for analysis."

In other words, using the Solid protocol would enable patients to participate in a consent-based study by opening access to their own data for research, while still preserving the highest possible data integrity.
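To make that concrete, here is a minimal sketch of what consent-scoped sharing could look like, assuming Inrupt's @inrupt/solid-client and @inrupt/solid-client-authn-browser libraries; the Pod URL and WebIDs are hypothetical placeholders, not a definitive implementation.

```typescript
// Sketch: a patient grants a research group read-only access to a single
// health-record resource in their Solid Pod; the researcher then fetches
// the individual patient data directly from the source.
import { Session } from "@inrupt/solid-client-authn-browser";
import { getSolidDataset, universalAccess } from "@inrupt/solid-client";

const RECORD_URL = "https://patient.example.pod/health/trial-data.ttl"; // hypothetical
const RESEARCHER_WEBID = "https://lab.example.org/profile/card#me";     // hypothetical

// Patient side: after logging in to their own Pod, grant read-only access
// to the researcher's WebID for this one resource.
async function grantConsent(patientSession: Session): Promise<void> {
  const granted = await universalAccess.setAgentAccess(
    RECORD_URL,
    RESEARCHER_WEBID,
    { read: true, write: false }, // consent is scoped: read-only, one resource
    { fetch: patientSession.fetch }
  );
  console.log("Access granted:", granted);
}

// Researcher side: read the individual patient data straight from the
// patient's Pod, rather than a summary supplied by the original trial team.
async function fetchPatientData(researcherSession: Session) {
  return getSolidDataset(RECORD_URL, { fetch: researcherSession.fetch });
}
```

Because the data never leaves the patient's Pod except through access the patient explicitly set, consent can be revoked the same way it was granted, and the analysis always works from source records rather than second-hand summaries.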

"Accuracy is still bad" is the defining security story of the 2010s, and now the 2020s as well; it seriously holds back the usefulness of technology by undermining knowledge.

Integrity controls are starved of innovation and need a completely new approach; they lag far behind where we are in confidentiality and availability control engineering.
