Missouri S&T

Replicated Data Integrity Verification in Cloud

Cloud computing is an emerging model in which computing infrastructure resources are provided as a service over the Internet. Data owners can outsource their data by storing it remotely in the cloud and enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources. However, because data owners and cloud servers are not in the same trusted domain, the outsourced data may be at risk: the cloud server can no longer be fully trusted. Data integrity is therefore of critical importance in such a scenario. The cloud should allow either the owners or a trusted third party to audit the stored data without demanding a local copy from the owners.

Replicating data on cloud servers across multiple data centers provides a higher level of scalability, availability, and durability. When data owners ask the Cloud Service Provider (CSP) to replicate copies of their data on different servers, the CSP charges them a higher fee. The owners therefore need strong assurance that the CSP is storing all the copies agreed upon in the service-level contract, and that the data-update requests they issue are correctly executed on all the remotely stored copies. Previous multi-copy verification schemes either focused on static files or incurred huge update costs in a dynamic-file scenario.

In this project, we identify shortcomings of these earlier approaches and propose a Dynamic Multi-Replica Provable Data Possession (DMR-PDP) scheme that prevents the CSP from cheating, for example by maintaining fewer copies than paid for. DMR-PDP also supports efficient dynamic operations such as block modification, insertion, deletion, and append on data replicas across cloud servers.
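The idea of auditing distinct replicas can be illustrated with a toy challenge-response sketch. This is *not* the DMR-PDP construction (which uses homomorphic tags so the auditor never retrieves blocks); it is a simplified stand-in, assuming a keyed SHA-256 tag per block and an XOR mask (`make_replica`, `rid`) to make each stored copy distinct, so a CSP that keeps only one copy fails audits for the others.

```python
import hashlib
import secrets

BLOCK = 4  # toy block size in bytes

def make_replica(data: bytes, rid: int) -> bytes:
    """XOR-mask the file per replica id so each stored copy is distinct.
    (A stand-in for the randomized replicas in multi-copy PDP schemes.)"""
    return bytes(b ^ rid for b in data)

def blocks(data: bytes) -> list[bytes]:
    """Split a file into fixed-size blocks."""
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def make_tags(key: bytes, replica: bytes, rid: int) -> list[bytes]:
    """Owner-side: keyed per-block tags. The owner keeps only these
    small tags, not the file itself."""
    return [hashlib.sha256(key + bytes([rid, i]) + blk).digest()
            for i, blk in enumerate(blocks(replica))]

def audit(key: bytes, tags: list[bytes], rid: int,
          challenged: list[int], response: list[bytes]) -> bool:
    """Verifier-side: check the blocks the CSP returned for a random
    challenge against the stored tags."""
    return all(
        hashlib.sha256(key + bytes([rid, i]) + blk).digest() == tags[i]
        for i, blk in zip(challenged, response)
    )

# Owner prepares replica 1 and its tags, then outsources the replica.
key = secrets.token_bytes(16)
data = b"outsourced file data"
rid = 1
replica = make_replica(data, rid)
tags = make_tags(key, replica, rid)

# Honest CSP answers a challenge on blocks 0 and 3 with the real replica.
challenged = [0, 3]
honest = [blocks(replica)[i] for i in challenged]
assert audit(key, tags, rid, challenged, honest)

# A cheating CSP that stored only the unmasked original fails the audit.
cheating = [blocks(data)[i] for i in challenged]
assert not audit(key, tags, rid, challenged, cheating)
```

In the real scheme, homomorphic tags let the CSP aggregate the challenged blocks into a single short proof instead of shipping the blocks back, which is what keeps audit bandwidth small.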

Researcher

Raghul Mukundan

Advisor

Dr. Sanjay Madria