Week of 2024-12-03

Public Safety Canada’s dealing with natural and communication infrastructure disasters

Ken Rubin | Hill Times

Public Safety Canada is actively addressing the challenges posed by natural disasters and communication infrastructure failures. The department has been investing in artificial intelligence (AI) solutions to enhance disaster response and infrastructure resilience. However, concerns have been raised about the transparency of these expenditures and the effectiveness of the AI projects implemented. The introduction of Bill C-26 aims to strengthen the security of telecommunications networks against such threats. This legislative effort underscores the government's commitment to safeguarding critical infrastructure and ensuring public safety in the face of evolving challenges.

Vancouver police ignore FOI requests for chief's communications

Andrew Weichel | Lisa Steacy | CTV News

The Vancouver Police Department (VPD) has not fulfilled two Freedom of Information (FOI) requests submitted in November 2023, which sought communications from Chief Adam Palmer and other senior officials regarding then-mayoral candidate Ken Sim, the 2022 municipal election, and the issue of stranger assaults. Under British Columbia's Freedom of Information and Protection of Privacy Act, public bodies are required to respond to FOI requests within 30 days, with possible extensions under certain conditions. Transparency advocates, such as Mike Larsen, president of the Freedom of Information and Privacy Association, have expressed concern over the VPD's non-compliance, emphasizing that FOI requests are legal mechanisms that public institutions are obligated to honor. The Office of the Information and Privacy Commissioner is currently reviewing the VPD's handling of these requests, but no findings have been released yet.

New AI safety institute limited by legislative vacuum: NDP innovation critic Brian Masse

Stuart Benson | Hill Times

The Canadian government has established the Canadian Artificial Intelligence Safety Institute (CAISI) to advance AI safety research and promote responsible AI development. However, NDP innovation critic Brian Masse has expressed concerns that, in the absence of comprehensive legislation like Bill C-27, CAISI's effectiveness may be limited. Bill C-27, which includes the Artificial Intelligence and Data Act (AIDA), aims to regulate AI systems but has faced delays and is currently awaiting government amendments. The legislative gap raises questions about the enforceability of AI safety measures and the overall governance framework for AI in Canada.

UK government failing to list use of AI on mandatory register

Robert Booth | The Guardian

The UK government mandated in February 2024 that all departments must register their use of artificial intelligence (AI) systems on a public platform to ensure transparency. However, as of November 2024, not a single Whitehall department has complied with this requirement. This lack of adherence has raised concerns among experts about the unmonitored deployment of AI technologies in critical areas such as welfare, immigration enforcement, and policing. The absence of transparency hampers public understanding and oversight of AI's role in government decision-making processes.

Australia’s debate on age verification for social media reaches Parliament

Masha Borak | Biometric Update

Australia’s Parliament is considering legislation requiring social media platforms to verify users’ ages, restricting access for those under 16. The law aims to protect children from online harms but has sparked concerns about privacy and the potential for intrusive data collection. Critics argue that implementing age verification may isolate vulnerable youth and create new challenges around data security. Proponents, however, see it as a necessary step to improve online safety for young users. The outcome of this debate could influence global approaches to age verification and digital child safety.

LifeLabs didn't protect millions of Canadians' privacy, report finds

Jeremy Hainsworth | Richmond News

In 2019, LifeLabs, a Canadian medical services company, experienced a cyberattack compromising the personal and health information of approximately 15 million individuals, primarily in Ontario and British Columbia. A joint investigation by the privacy commissioners of the two provinces found that LifeLabs collected more personal health information than was necessary and failed to implement adequate safeguards to protect sensitive data. Despite complying with subsequent orders to improve its security practices, LifeLabs sought to block the public release of the investigation report, citing litigation and solicitor-client privilege. The Ontario Court of Appeal dismissed those appeals, and the report was published in November 2024. The incident underscores the importance of robust data protection measures and transparency in the handling of personal health information.

Botched upgrade at National Defence led to ‘life-threatening’ email outage

David Reevely | The Logic

A botched upgrade at the Department of National Defence (DND) and the Canadian Armed Forces (CAF) caused a significant email outage on the Defence 365 (D365) platform, which is essential for communication and collaboration among personnel. The outage disrupted operations, highlighting the critical role of reliable digital infrastructure in national security. DND's IT teams worked to restore services and put measures in place to prevent a recurrence. The incident underscores the importance of sound IT change management and contingency planning within defence organizations.

The Citizen Lab’s submission to the Senate Standing Committee on National Security, Defence and Veterans Affairs

Kate Robertson | Munk School of Global Affairs & Public Policy

In November 2024, the Citizen Lab submitted a brief to the Senate Standing Committee on National Security, Defence and Veterans Affairs, critically analyzing Bill C-26, which proposes amendments to the Telecommunications Act to enhance cybersecurity. The brief identifies several concerns, including:

  • Broad Ministerial Powers: The bill grants extensive authority to the government to issue directives to telecommunications providers without sufficient limitations, potentially leading to overreach.

  • Excessive Secrecy: Provisions in the bill could result in secret laws and regulations, undermining transparency and public accountability.

  • Privacy and Charter Rights: The proposed amendments may infringe upon privacy rights and freedoms protected under the Canadian Charter of Rights and Freedoms, particularly regarding freedom of expression and protection against unreasonable search and seizure.

  • Encryption Concerns: The bill could compel network operators to weaken encryption standards, compromising the security of communications networks.

The Citizen Lab recommends specific amendments to address these issues, emphasizing the need for transparency, accountability, and the protection of fundamental rights in Canada's cybersecurity legislation.

Meta fights CRTC, refuses to publicly release info on news blocking measures

Anja Karadeglija | CTV News

Meta Platforms Inc. is resisting directives from the Canadian Radio-television and Telecommunications Commission (CRTC) to disclose information regarding its compliance with the Online News Act. Despite blocking news content on its platforms in Canada, Meta has neither made the relevant data public nor provided detailed justifications for keeping it confidential. Heritage Minister Pascale St-Onge's office criticized Meta's stance, stating it "sends a troubling message" and suggests the company believes it is "above oversight in the public interest." This situation highlights ongoing tensions between the Canadian government and major tech firms over regulatory compliance and transparency.

UK Parliament launches inquiry into algorithmic spread of misinformation

UK Parliament

The UK Parliament's Science, Innovation and Technology Committee has initiated an inquiry to examine the connections between social media algorithms, generative artificial intelligence (AI), and the dissemination of harmful and false content online. This action follows anti-immigration demonstrations and riots in mid-2024, which were partly fueled by misinformation spread on social media platforms. The inquiry aims to assess how social media companies and search engines utilize algorithms to rank content, the role of generative AI in creating and spreading misinformation, and the effectiveness of current and proposed regulations, including the Online Safety Act, in addressing these challenges. The committee is seeking written submissions by December 18, 2024, to inform its investigation.

Notes from the IAPP Canada: Ontario IPC shares enforcement philosophy with law students

Kris Klein | IAPP

In a recent lecture to law students, Ontario's Information and Privacy Commissioner, Patricia Kosseim, shared insights into her enforcement philosophy. She emphasized a balanced approach that combines negotiation, compromise, and understanding, utilizing both incentives and enforcement actions when necessary. Kosseim highlighted the importance of acknowledging organizations' positive privacy practices, not solely focusing on their shortcomings. She also discussed the necessity for responsible information sharing among law enforcement and regulatory bodies to address complex privacy issues effectively. Additionally, Kosseim introduced the IPC's "Info Matters" podcast as a tool for educating the public about their privacy rights and obligations.

Lacking definitions among Alberta privacy commissioner's worries over 2 government bills

Jack Farrell | CBC News

Alberta's Information and Privacy Commissioner, Diane McLeod, has raised significant concerns regarding two government bills aimed at amending access to information and privacy regulations. She highlights that the proposed legislation contains vague definitions and lacks sufficient safeguards, which could create legislative gaps. One particular issue is a provision allowing the disclosure of minors' personal information without their consent if deemed in their "best interest," without clearly defining who determines this or what criteria are used. Additionally, McLeod is apprehensive about changes that could grant the government increased control over information access, potentially leading to greater gatekeeping. In response, Technology and Innovation Minister Nate Glubish has stated that the government will review the commissioner's concerns and recommendations.

Alberta Proposes Modernized Public Sector Privacy and Information Access Legislation: Unpacking Bills 33 and 34

Julia Loney | Stephen Johnson | McMillan

Alberta has introduced Bills 33 and 34 to modernize public sector privacy and access laws. Bill 33 enhances data security by requiring privacy-by-design principles and oversight for automated decision-making, while Bill 34 updates access processes with revised timelines and expanded exemptions. However, Alberta’s Privacy Commissioner has criticized the bills for vague provisions and privacy risks, especially regarding minors’ data. The government is reviewing these concerns to ensure the legislation balances transparency with robust privacy protections.

Human rights groups question Sandvine’s commitment to change

Terry Pender | The Record

Human rights groups remain skeptical about Sandvine's commitment to human rights, despite the company's recent reforms following its addition to the U.S. Entity List in 2024. Sandvine exited business in 32 non-democratic countries and committed to leaving 24 more, citing a "democracy-only" business realignment. While these changes led to its removal from the Entity List, organizations like Access Now and the Committee to Protect Journalists have called for more transparency and stronger human rights policies. They urge Sandvine to adhere to the UN Guiding Principles on Business and Human Rights and address past misuse of its technology.

Safety-critical workers have “diminished expectation of privacy” in alcohol and drug testing

Emma Hamer | Norton Rose Fulbright

The Federal Court of Appeal's decision in Power Workers’ Union v. Canada (Attorney General), 2024 FCA 182, supports pre-placement and random alcohol and drug testing for employees in safety-critical positions, such as those in the nuclear industry. The court determined that these workers have a "diminished expectation of privacy" due to the high-risk nature of their roles, making such testing more justifiable. This ruling emphasizes that in environments where safety is paramount, the need to prevent incidents outweighs individual privacy concerns. Consequently, employers in safety-sensitive sectors may find it easier to implement mandatory testing protocols to ensure workplace safety.
