At this year’s World Economic Forum, the issues surrounding the gathering and use of big data by corporations and governments are discussed by an expert, multistakeholder panel. They discuss how failure to properly manage data collection and use threatens the basic human right to privacy, makes societies less safe, and erodes customer trust. The panel calls for respect for the rule of law, transparency and clear guidelines on the use of data to be central to data collection and its protocols, and for subpoenas and clear legislation to be used to ensure data privacy and data integrity. They also consider how the legal contexts for gathering and using data in the United States and Europe differ from those in countries with low freedom of information and low levels of Internet penetration. These are just some of the many issues discussed in the context of our rapidly developing, big data oriented world.
In this Privacy International post Alexandrine Pirlot explains how private sector data can fall into the wrong hands, enabling the monitoring, identification and surveillance of individuals. Despite anonymisation, connecting separate pieces of data can (re)identify an individual and reveal information about them that is even more private than the data they consented to share. Moreover, since big data is the result of aggregating data from various sources (some of which is unidentifiable), there is no process for requesting a person’s consent for the resulting data that emerges. She also explains that while such data is used for developing policies and programmes, it may not necessarily represent the people those policies and programmes target. Suggestions on how to combat this are given.
In this McKinsey interview, Tim O’Reilly discusses key areas to consider in opening up data. For instance, he argues that secrecy is a poor way of being secure, since it prevents understanding of the rules that guide behaviour. He explains the importance of common standards and of building a powerful, effective platform, which he says can enable a market. He also argues that the definition of “open” needs to be better explained, because it is not always necessary for anybody to be able to take the data and reuse it however they want. Finally, he explains why technology regulation is important and how a government that operates more like a lean startup can increase productivity.
In this Gartner post Andrea Di Maio discusses the unintended consequences of using open data and scraping data, citing the case of two Yale students who used university data to build a website that let students plan their schedules and compare class evaluations and teacher ratings. He argues that the growing availability of open data, and individuals’ ability to exchange their own data via social networks, is challenging notions of data ownership, and that trust is shifting from established organizations to people.
This post discusses the inevitability of open data for government and the need for a more deliberate effort by government agencies to ensure data is harnessed effectively. These were two of the most discussed issues at the U.S. Department of Health and Human Services’ Open DataFest, and highlights of some of the key points from the presentations are given.