With data volumes proliferating, global organizations, including universities, are using the cloud to grow their data footprint. Data storage used to be expensive. Now it is not uncommon for an organization to consume petabytes or even exabytes of data yearly across its employees and customers (think content-heavy organizations such as Netflix or Amazon). For each of these organizations, however, there is often a need to develop strategic data warehousing systems, proprietary data lakes, and unique security and privacy protocols, rules, policies, and workflows to manage these mega-enterprises. What challenges do you think these organizations face in maintaining data quality, consistency, currency, and reliability, as well as overall system reliability? And how can these organizations maintain strong assurance in their privacy and security controls at the same time?