Chapter 9

The Future

The unstoppable march of technology will continue to spawn new privacy challenges. Innovations once so complex or expensive that they were confined to the laboratory or university setting have become readily accessible and commercialized. Many of these advances present risks for what may be the final frontier of privacy: the human body.

1. Genetic Testing

It still seems like the stuff of science fiction, but for about $200 you can explore your own DNA through genetic testing.

One company, 23andMe — named for the 23 pairs of chromosomes in the nucleus of most cells — provides you with reports on more than 100 health conditions and traits, data about inherited risk factors, and how you might respond to certain medications, as well as information on your ancestral lineage.

While the company promises to tell you about traits such as male pattern baldness and lactose intolerance, it acknowledges that life insurance companies or employers might ask for your genetic information, or ask whether you have had a test.

The Canadian Life and Health Insurance Association code for member companies says that while insurers will not ask that tests be done, they may request that existing genetic test results be made available to the insurer for the purposes of classifying risk. The association maintains that an insurance contract is a “good faith” agreement and that both the applicant and insurer have an obligation to disclose any information that may be relevant.[1]

The federal privacy commissioner has urged the industry association to go further and refrain from requesting access to existing genetic test results “until such time as they can be shown to be demonstrably necessary and effective.”[2] For the vast majority of genetic tests, the ability to predict health and life outcomes with any reasonable degree of certainty was still relatively low, the watchdog said.

The commissioner has supported proposed legislation to prohibit any person from requiring an individual to undergo a genetic test or disclose the results of a genetic test as a condition of providing goods or services to, or entering into a contract with, the individual. As of mid-2016, the latest version of the bill was making its way through Parliament.

In the United States, the Genetic Information Nondiscrimination Act of 2008 prohibits a health insurer from requiring a genetic test or denying benefits as a result of a pre-existing condition stemming from a genetic test result.

2. Wearable Devices

You can now wear a wide range of small and smart-looking electronic gizmos that track your mood, movements, and health status as well as details about your immediate environment, including location, images, sounds, temperature, and humidity. For instance, the popular Fitbit can monitor steps taken and calories burned during the day and measure how well you sleep at night — data that can be viewed and analyzed through a smartphone app.

As more and more information about our bodies is collected and digitized through wearable computing devices and connected with other online and offline information about us, the impacts can be profound, Canadian Privacy Commissioner Daniel Therrien’s office said in a report on key priorities.

“The integration of technology in the human body is certainly something to watch for in the future,” Therrien said in a recent interview with The Canadian Press. “It’s a growing phenomenon, obviously, and it’s something we’ll be trying to monitor as best we can.”

3. Big Data

Number crunching has clearly caught the eye of senior decision-makers. The following examples were highlighted in a May 2015 presentation, Big Data Analytics in the World of Safety and Security, for Public Safety Canada’s internal policy committee:[3]

• The Philadelphia police mined data to predict a parolee’s risk of reoffending to determine the necessary level of supervision.

• US researchers could not detect a genetic variant related to schizophrenia when reviewing 3,500 cases, but were able to pinpoint a trend by looking at 35,000 cases.

• In Guatemala, a pilot project revealed how mobile phone movement patterns could be used to predict socioeconomic status.

Big data analytics generally refers to the process of gathering and systematically sifting through millions or even billions of pieces of data (e.g., numbers, text, graphics, videos, sensor information) to glean insights that can’t be detected through standard methods.
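The schizophrenia example above hints at why scale matters: a weak statistical signal that is invisible in a few thousand records can surface reliably in hundreds of thousands. The following Python simulation is a hypothetical sketch of that effect, using synthetic numbers only (no real genetic or personal data); the function name and effect size are illustrative assumptions, not drawn from the studies cited here.

```python
import random
import statistics

random.seed(1)

def weak_signal_sample(n):
    """Return the observed mean difference between two simulated groups of size n // 2.

    Carriers of a hypothetical trait score 0.05 higher on average, but
    individual scores vary with a standard deviation of 1 — a signal far
    smaller than the noise around any one record.
    """
    carriers = [random.gauss(0.05, 1) for _ in range(n // 2)]
    controls = [random.gauss(0.0, 1) for _ in range(n // 2)]
    return statistics.mean(carriers) - statistics.mean(controls)

# With a few thousand records, sampling noise can swamp the true 0.05 gap;
# with hundreds of thousands, the observed difference stabilizes near it.
print(round(weak_signal_sample(3_500), 3))
print(round(weak_signal_sample(350_000), 3))
```

The same statistical logic is what makes large-scale data collection attractive to analysts — and, as the privacy advocates quoted below argue, what creates the incentive to capture personal data first and find a use for it later.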

In the United States, the White House has championed big data as an opportunity to boost economic productivity, drive improved consumer and government services, thwart terrorism, and save lives — all while protecting core American values.

Years ago, Jennifer Stoddart, Canadian privacy commissioner at the time, cautioned that big data had not simply increased the risk to privacy — it had changed the very nature of that risk.

The assumption behind big data analytics is that personal data has to be captured — regardless of privacy interests — in the hope that it might be useful at some future point, said Colin Bennett, Political Science Professor at the University of Victoria. This logic flies in the face of a premise of Canadian and international privacy protection law, that organizations should only be capturing personal data for precise, specified, and transparent purposes, Bennett added.

The expanded capacity for data mining, especially the ability to cross-reference various types of personal information, is perhaps the major privacy issue right now, said Vincent Gogolek, Executive Director of the British Columbia Freedom of Information and Privacy Association.

“Big data combined with the use of algorithms can result in the creation of what is called a digital persona. However, this digital persona is not an actual person, but a digital shadow which cannot replicate the complete human being,” Gogolek said. “How the digital persona is created and what information is used is a highly guarded, proprietary secret, and decisions are increasingly made by machines and systems analyzing this digital shadow. These decisions can involve such important issues as our creditworthiness, eligibility for government programs, or security status — and these decisions can be difficult if not impossible to reverse. The consequences can be catastrophic for the individuals involved.”

Gus Hosein, Executive Director of Privacy International in London, wonders whether this will prompt people to try to game the system, and in the process undermine social bonds.

“We will choose our friends online so that our credit profile is improved,” Hosein said. “We will only say publicly and read things that are acceptable to others. The repression of our emotions and our ambitions will disturb the human psyche even as we delude ourselves that somehow we are winning.”

Privacy does not need to be a barrier to innovative data analytics, the Public Safety presentation says. “We need to think strategically about what we want to accomplish with data and then design in appropriate privacy protections.”

1 “Industry Code: Genetic Testing Information for Insurance Underwriting,” Canadian Life and Health Insurance Association, accessed March 2016. clhia.ca/domino/html/clhia/CLHIA_LP4W_LND_Webstation.nsf/page/E79687482615DFA485257D5D00682400/$file/Industry_Code_Genetic_Testing.pdf

2 “Appearance before the Senate Standing Committee on Human Rights on Bill S-201, An Act to Prohibit and Prevent Genetic Discrimination, The Genetic Non-Discrimination Act,” Office of the Privacy Commissioner of Canada, accessed March 2016. priv.gc.ca/parl/2014/parl_20141002_e.asp

3 “Federal Security Officials Eye ‘Big Data Analytics’ in a Bid to Pinpoint Threats,” Jim Bronskill, National Newswatch, accessed March 2016. nationalnewswatch.com/2015/11/12/federal-security-officials-eye-big-data-analytics-in-a-bid-to-pinpoint-threats/#.Vt5K0EB_SSr