ANONYMIZATION-BASED PRIVACY PROTECTION
Advances in information technology, and its use in research, are increasing both the need for anonymized data and the risks of poor anonymization. In this thesis, we point out some questions raised by current anonymization techniques, namely a) support for additional adversary models and the difficulty of measuring the privacy provided, b) the flexibility of algorithms and generalizations with respect to a utility cost metric, and c) working with complex data. To address these issues, a) we propose a human-understandable privacy notion, δ-presence; b) we increase flexibility by introducing a new family of algorithms, clustering-based anonymity algorithms, and two new types of generalizations: natural domain generalizations and generalizations with probability distributions. We also point out weaknesses such as metric-utility anomalies; c) we extend the definitions of current anonymization techniques to multirelational and spatio-temporal settings by presenting multirelational k-anonymity and trajectory anonymity.
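As background for the extensions above, the following is a minimal sketch of the standard (single-table) k-anonymity property that the thesis generalizes, not the multirelational or trajectory variants it contributes. The records and quasi-identifier values are hypothetical; a table is k-anonymous when every combination of quasi-identifier values is shared by at least k records.

```python
from collections import Counter

# Hypothetical generalized records; each tuple is a quasi-identifier
# combination (age range, ZIP code prefix).
records = [
    ("20-29", "479**"),
    ("20-29", "479**"),
    ("30-39", "478**"),
    ("30-39", "478**"),
]

def is_k_anonymous(rows, k):
    """Return True if every quasi-identifier combination occurs
    in at least k rows (each equivalence class has size >= k)."""
    class_sizes = Counter(rows)
    return all(size >= k for size in class_sizes.values())

print(is_k_anonymous(records, 2))  # each class holds 2 rows -> True
print(is_k_anonymous(records, 3))  # classes are too small -> False
```

The multirelational setting studied in the thesis must instead enforce this indistinguishability across joins of several tables, which a per-table check like this cannot capture.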
2008-12-01