It was an honour to be invited to testify last week before the Standing Committee on Industry and Technology (INDU) at the House of Commons of Canada on Bill C-27, Canada's pending privacy and AI legislation.
My full opening statement may be found below:
Thank you for the invitation to appear before this committee for its important review of Bill C-27.
I am a privacy lawyer and consultant based in Toronto. Having worked in the privacy field for over 15 years while raising three sons, I have a passion for children’s privacy and will focus my remarks on this topic today.
My interest in privacy law was sparked as a law student at the University of Ottawa, working on research with Professor Michael Geist and the late Professor Ian Kerr at the time when PIPEDA was a new bill being debated, much as Bill C-27 is being debated today. When Professor Geist appeared here a few weeks ago, he reflected on his first appearance before a committee to discuss PIPEDA, noting that it is important to get it right rather than to get it fast. When Professor Kerr appeared in 2017 to discuss PIPEDA reform, he stated that at the time, “the dominant metaphor was George Orwell's 1984, ‘Big Brother is Watching You’”, noting that technological developments in the years since PIPEDA go well beyond watching.
Both Professors Geist and Kerr were right, especially in the context of children’s privacy. Given that children are inundated with emerging technologies well beyond Orwell’s 1984 - from AI tools to edtech to virtual reality to social media - it is more important than ever to get it right when it comes to children’s privacy.
When Bill C-11 was introduced in late 2020, it did not address children at all. As I argued in a Policy Options article in 2021, this was a missed opportunity given that the amount of online activity for children was at an all-time high during the pandemic.
I commend the legislators for addressing children’s privacy in Bill C-27 by stating that “information of minors is considered to be sensitive” and by including language that could provide minors with a more direct route to delete their personal information - otherwise known as the “right to be forgotten”. I also understand that Minister Champagne proposes further amendments to include stronger protections for minors.
However, I think there is more that the law can do to get it right for children’s privacy. I will focus on two points - first, creating clear definitions; and second, looking to leading jurisdictions for guidance.
First, the law should define the terms “minor” and “sensitive”. Without these definitions, businesses - which already have the upper hand in this law - are left to decide what is sensitive and appropriate for minors. The CPPA should follow the lead of other privacy laws: the California Consumer Privacy Act, the US COPPA, the EU’s GDPR, and Quebec’s Law 25 all establish a minimum age for consent, ranging from 13 to 16.
Further, the law should explicitly define the term “sensitive”. The current wording recognizes that minors’ data is sensitive, which means that other provisions in the statute have to interpret the treatment of sensitive information through a contextual analysis, whether it be for safeguarding, consent, or retention. The law should define “sensitive” and provide non-exhaustive examples of sensitive data, so that businesses, regulators and courts will have more guidance in applying the legislative framework.
Second, I recommend that you consider revising the law - whether by amendment or regulation - to align the CPPA with leading jurisdictions, namely the age-appropriate design code legislation in the UK and California. Both take a more prescriptive approach to regulating the personal information of children.
The California “Kids Code” requires businesses to prioritize the privacy of children by default and in the design of their products. For example, default settings on apps and platforms for users under 18 must be set to the highest privacy level. This is something that should be considered in the CPPA as well.
Further, the California Code establishes a level of fiduciary care for platforms such that if a conflict of interest arises between what is best for the platform and what is best for a user under 18, the child’s best interest must come first. This is consistent with the recommendation of former Commissioner Therrien and others in these hearings about including language around the “best interest of the child” in the legislation. The CPPA should contemplate requirements for how businesses use children’s data, considering the child’s best interest. For example, use of children’s data could be limited to those actions necessary to provide an age-appropriate service.
As I argued in my Policy Options article in January 2023, we need a collaborative approach that includes lawmakers and policymakers from all levels of government, coordination with global privacy laws, engagement with parents, and coordination with educators. For this approach to work, the law needs to strike the balance between privacy and innovation. We want laws that are flexible enough to last so that technology can evolve, new business ideas can succeed, and children can be innovators while growing up in a world that recognizes their special needs and rights.