The UK's National Cyber Security Centre (NCSC), a part of GCHQ, quietly removed official advice recommending the use of Apple's device encryption for protecting sensitive information. While no official explanation was given, the change coincides with the UK government's ongoing push for legislation enabling access to encrypted communications, suggesting a conflict between promoting security best practices and pursuing surveillance capabilities. The removal raises concerns about the government's commitment to strong encryption and could have a chilling effect on individuals and organizations that rely on such advice for data protection.
The EU's AI Act, a landmark piece of legislation, is now in effect, banning AI systems deemed "unacceptable risk." This includes systems using subliminal techniques or exploiting vulnerabilities to manipulate people, social scoring systems used by governments, and real-time biometric identification systems in public spaces (with limited exceptions). The Act also sets strict rules for "high-risk" AI systems, such as those used in law enforcement, border control, and critical infrastructure, requiring rigorous testing, documentation, and human oversight. Enforcement varies by country but includes significant fines for violations. While some criticize the Act's broad scope and potential impact on innovation, proponents hail it as crucial for protecting fundamental rights and ensuring responsible AI development.
Hacker News commenters discuss the EU's AI Act, expressing skepticism about its enforceability and effectiveness. Several question how "unacceptable risk" will be defined and enforced, particularly given the rapid pace of AI development. Some predict the law will primarily impact smaller companies while larger tech giants find ways to comply on paper without meaningfully changing their practices. Others argue the law is overly broad, potentially stifling innovation and hindering European competitiveness in the AI field. A few express concern about the potential for regulatory capture and the chilling effect of vague definitions on open-source development. Some debate the merits of preemptive regulation versus a more reactive approach. Finally, a few commenters point out the irony of the EU enacting strict AI regulations while simultaneously pushing for "right to be forgotten" laws that could hinder AI development by limiting access to data.
Summary of Comments (160)
https://news.ycombinator.com/item?id=43271177
HN commenters discuss the UK government's removal of advice recommending Apple's encryption and speculate on the reasons. Some suggest it's connected to Apple's proposed client-side scanning (since abandoned), which they feared would weaken end-to-end encryption. Others point to the Online Safety Bill, which could mandate scanning of encrypted messages and thus make the previous recommendation untenable. A few posit the change stems from legal challenges, or that the advice was simply outdated now that Apple is no longer the sole provider of strong encryption. The overall sentiment is one of concern and distrust towards the government's motives, with many suspecting a push to weaken encryption for surveillance purposes. Some also criticize the lack of transparency surrounding the change.
The Hacker News post titled "NCSC, GCHQ, UK Gov't expunge advice to 'use Apple encryption'" sparked a discussion with several insightful comments. Many commenters focused on the implications of the UK government's seemingly changed stance on end-to-end encryption.
Several commenters speculated on the reasons behind the removal of the advice to use Apple's encryption. Some suggested it might be related to the UK's ongoing efforts to push through legislation that could potentially weaken end-to-end encryption, such as the Online Safety Bill: promoting specific encryption methods now could complicate later arguments in favor of breaking or bypassing that encryption. Others posited that the removal was less nefarious, perhaps simply a matter of avoiding the appearance of endorsing a specific commercial product, or a recognition that the landscape of secure messaging has evolved and other platforms now offer comparable security.
A recurring theme was the inherent tension between government surveillance desires and individual privacy rights. Commenters debated the merits and drawbacks of end-to-end encryption, acknowledging its crucial role in protecting sensitive communications while also recognizing the challenges it poses for law enforcement.
Some commenters highlighted the subtle language changes in the updated guidance, noting that while the specific mention of Apple encryption was removed, the general advice to use end-to-end encrypted services remained. This led to discussions about the nuances of security advice and the difficulty of providing clear, actionable recommendations to the public without inadvertently promoting specific products or overlooking potential vulnerabilities.
A few technical comments delved into the specifics of different encryption implementations and their relative strengths and weaknesses. One commenter mentioned the potential issues related to metadata, even with end-to-end encrypted messages, and another discussed the importance of verifying the authenticity of encryption software.
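The point about verifying the authenticity of encryption software can be illustrated with a minimal sketch: comparing a downloaded file's SHA-256 digest against a checksum published by the vendor. The function names here are illustrative, not from any specific tool, and a real verification workflow would additionally check a cryptographic signature (e.g. GPG) over the published checksum file, since a checksum alone does not prove who published it.

```python
import hashlib
import hmac

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks
    so large installers don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_published_checksum(path: str, expected_hex: str) -> bool:
    """Compare the file's digest to a vendor-published value.
    hmac.compare_digest avoids timing side channels in the comparison."""
    return hmac.compare_digest(sha256_of(path), expected_hex.lower())
```

This guards against a corrupted or tampered download relative to the published value; it does not, by itself, address the metadata concerns raised in the same thread, which are a property of the protocol rather than of the software artifact.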
Overall, the comments section reflected a nuanced understanding of the complex issues surrounding encryption, government surveillance, and online privacy. Commenters generally expressed concern over the implications of the UK government's actions while also engaging in productive discussions about the technical and societal aspects of encryption technology.