Pressure is mounting on the UK Parliament's Intelligence and Security Committee (ISC) to hold its hearing on Apple's data privacy practices in public. The ISC plans to examine claims made in a recent report that Apple's data extraction policies could compromise national security and aid authoritarian regimes. Privacy advocates and legal experts argue a public hearing is essential for transparency and accountability, especially given the significant implications for user privacy. The ISC typically operates in secrecy, but critics contend this case warrants an open session due to the broad public interest and potential impact of its findings.
The UK's National Cyber Security Centre (NCSC), along with GCHQ, quietly removed official advice recommending the use of Apple's device encryption for protecting sensitive information. While no official explanation was given, the change coincides with the UK government's ongoing push for legislation enabling access to encrypted communications, suggesting a conflict between promoting security best practices and pursuing surveillance capabilities. This removal raises concerns about the government's commitment to strong encryption and the potential chilling effect on individuals and organizations relying on such advice for data protection.
HN commenters discuss the UK government's removal of advice recommending Apple's encryption, speculating on the reasons. Some attribute it to Apple's (since abandoned) client-side scanning plans, which they feared would weaken end-to-end encryption. Others point to the Online Safety Bill, which could mandate scanning of encrypted messages, making the previous recommendations untenable. A few posit the change is related to legal challenges, or simply that the advice was outdated, since Apple is no longer the sole provider of strong encryption. The overall sentiment is one of concern and distrust toward the government's motives, with many suspecting a push to weaken encryption for surveillance purposes. Some also criticize the lack of transparency surrounding the change.
Apple has removed its iCloud Advanced Data Protection feature, which offers end-to-end encryption for almost all iCloud data, from its beta software in the UK. This follows reported concerns from the UK's National Cyber Security Centre (NCSC) that the enhanced security measures would hinder law enforcement's ability to access data for investigations. Apple maintains that the feature will be available to UK users eventually, but hasn't provided a clear timeline for its reintroduction. While the feature remains available in other countries, this move raises questions about the balance between privacy and government access to data.
HN commenters largely agree that Apple's decision to pull its child safety features, specifically the client-side scanning of photos, is a positive outcome. Some believe Apple was pressured by the UK government's proposed changes to the Investigatory Powers Act, which would compel companies to disable security features if deemed a national security risk. Others suggest Apple abandoned the plan due to widespread criticism and technical challenges. A few express disappointment, feeling the feature had potential if implemented carefully, and worry about the implications for future child safety initiatives. The prevalence of false positives and the potential for governments to abuse the system were cited as major concerns. Some skepticism towards the UK government's motivations is also evident.
The UK government is pushing for amendments to the Investigatory Powers Act that would compel tech companies like Apple to remove security features, including end-to-end encryption, if deemed necessary for national security investigations. This would effectively create a backdoor, allowing government access to user data without users' knowledge or consent. Apple argues that this undermines user privacy and security, making everyone more vulnerable to hackers and authoritarian regimes. The proposal faces strong opposition from privacy advocates and tech experts, who warn of its potential for abuse and its chilling effect on free speech.
HN commenters express skepticism about the UK government's claims regarding the necessity of this order for national security, with several pointing out the hypocrisy of demanding backdoors while simultaneously promoting end-to-end encryption for their own communications. Some suggest this move is a dangerous precedent that could embolden other authoritarian regimes. Technical feasibility is also questioned, with some arguing that creating such a backdoor is impossible without compromising security for everyone. Others discuss the potential legal challenges Apple might pursue and the broader implications for user privacy globally. A few commenters raise concerns about the chilling effect this could have on whistleblowers and journalists.
Summary of Comments (9)
https://news.ycombinator.com/item?id=43361381
HN commenters largely agree that Apple's argument for a closed-door hearing regarding data privacy doesn't hold water. Several highlight the irony of Apple's public stance on privacy conflicting with their desire for secrecy in this legal proceeding. Some express skepticism about the sincerity of Apple's privacy concerns, suggesting it's more about competitive advantage. A few commenters suggest the closed hearing might be justified due to legitimate technical details or competitive sensitivities, but this view is in the minority. Others point out the inherent conflict between national security and individual privacy, noting that this case touches upon that tension. A few express cynicism about government overreach in general.
The Hacker News post titled "Pressure grows to hold secret Apple data privacy hearing in public" (https://news.ycombinator.com/item?id=43361381) has generated several comments discussing the implications of the related BBC article about the legal dispute between Apple and the UK government over access to encrypted iCloud data. The discussion centers on transparency, national security, and the potential chilling effect on security research.
Several commenters express concern over the secrecy surrounding the hearing. They argue that issues involving fundamental rights, such as data privacy, should be conducted publicly to ensure accountability and allow for public scrutiny. One commenter highlights the irony of Apple, a company that champions user privacy, being involved in a closed-door hearing on a related matter. The sentiment expressed is that transparency is crucial for building trust and ensuring that decisions are made in the best interest of the public.
A recurring theme in the comments is the potential misuse of national security concerns to justify secrecy. Commenters suggest that the government may be overusing national security arguments to avoid public scrutiny, potentially hiding questionable practices or decisions. They point out that while genuine national security concerns warrant certain levels of secrecy, such concerns shouldn't serve as a blanket justification for avoiding transparency in matters of public interest.
The potential impact on security research is also a significant concern raised by commenters. They argue that closed-door hearings and potential restrictions arising from them could stifle legitimate security research. One commenter suggests that the government's actions might create a chilling effect on researchers who expose vulnerabilities, potentially leaving critical systems more vulnerable to exploitation. This could lead to a situation where vulnerabilities are discovered and exploited by malicious actors before they can be patched.
Some comments delve into the specifics of the case, and express concern over who would really benefit from a "backdoor" into Apple's systems. Commenters analyze the legal arguments and potential outcomes, speculating on the ramifications for the broader tech industry.
In summary, the comments on Hacker News express considerable concern over the lack of transparency in the Apple hearing, with particular emphasis on the potential negative impact on data privacy and security research, and on the perceived overuse of national security arguments to justify secrecy.