The FBI raided the home of Mateo D'Amato, a renowned computer scientist specializing in cryptography and anonymity technologies, and seized several electronic devices. D'Amato has since vanished and cannot be reached by colleagues or family. His university profile has been removed, and the institution refuses to comment, deepening the mystery around both his disappearance and the reason for the FBI's interest. D'Amato's research focused on areas with potential national security implications, but no details about the investigation have been released.
The post "Everyone knows all the apps on your phone" argues that the extensive data collection practices of mobile advertising networks effectively reveal which apps individuals use, even without explicit permission. Through deterministic and probabilistic methods linking device IDs, IP addresses, and other signals, these networks can create detailed profiles of app usage across devices. This information is then packaged and sold to advertisers, data brokers, and even governments, allowing them to infer sensitive information about users, from their political affiliations and health concerns to their financial status and personal relationships. The post emphasizes the illusion of privacy in the mobile ecosystem, suggesting that the current opt-out model is inadequate and calls for a more robust approach to data protection.
Hacker News users discussed the privacy implications of app usage data being readily available to mobile carriers and how this data can be used for targeted advertising and even more nefarious purposes. Some commenters highlighted the ease with which this data can be accessed, not just by corporations but also by individuals with basic technical skills. The discussion also touched upon the ineffectiveness of current privacy regulations and the lack of real control users have over their data. A few users pointed out the potential for this data to reveal sensitive information like health conditions or financial status based on app usage patterns. Several commenters expressed a sense of resignation and apathy, suggesting the fight for data privacy is already lost, while others advocated for stronger regulations and user control over data sharing.
EFF warns that age verification laws, ostensibly designed to restrict access to adult content, pose a serious threat to online privacy. While initially targeting pornography sites, these laws are expanding to encompass broader online activities, such as purchasing skincare products, potentially requiring users to upload government IDs to third-party verification services. This creates a massive database of sensitive personal information vulnerable to breaches, government surveillance, and misuse by private companies, effectively turning age verification into a backdoor for widespread online monitoring. The EFF argues that these laws are overbroad, ineffective at their stated goals, and disproportionately harm marginalized communities.
HN commenters express concerns about the slippery slope of age verification laws, starting with porn and potentially expanding to other online content and even everyday purchases. They argue that these laws normalize widespread surveillance and data collection, creating honeypots for hackers and potentially enabling government abuse. Several highlight the ineffectiveness of age gates, pointing to easy bypass methods and the likelihood of children accessing restricted content through other means. The chilling effect on free speech and the potential for discriminatory enforcement are also raised, with some commenters drawing parallels to authoritarian regimes. Some suggest focusing on better education and parental controls rather than restrictive legislation. The technical feasibility and privacy implications of various verification methods are debated, with skepticism towards relying on government IDs or private companies.
Rayhunter is a Rust-based tool designed to detect IMSI catchers (also known as Stingrays or cell-site simulators) using an Orbic RC400L mobile hotspot. It leverages the hotspot's diagnostic mode to collect cellular network data, specifically neighboring-cell information, and analyzes changes in this data to identify suspicious behavior indicative of an IMSI catcher. By monitoring for unexpected appearances, disappearances, or changes in cell tower signal strength, Rayhunter aims to alert users to the possible presence of these surveillance devices.
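Rayhunter itself is written in Rust and its analysis is more involved, but the core heuristic the summary describes — flagging abrupt turnover in the set of visible neighboring cells — can be sketched in a few lines of Python. The cell IDs and the churn threshold below are invented for illustration:

```python
# Toy neighbor-cell churn detector; not Rayhunter's actual algorithm.
ALERT_CHURN = 0.5  # fraction of neighbor cells allowed to change between scans

def churn(prev: set, curr: set) -> float:
    """Fraction of all cells seen across two scans that appeared or vanished."""
    if not prev and not curr:
        return 0.0
    return len(prev ^ curr) / len(prev | curr)

def analyze(scans: list) -> None:
    for i in range(1, len(scans)):
        c = churn(scans[i - 1], scans[i])
        if c > ALERT_CHURN:
            print(f"scan {i}: {c:.0%} of neighboring cells changed -- "
                  "possible cell-site simulator, investigate")

# Three consecutive scans; in the last, the known neighbors vanish and a
# single unknown cell appears -- the signature the heuristic looks for.
analyze([
    {"310-410-1021", "310-410-1022", "310-410-1035"},
    {"310-410-1021", "310-410-1022", "310-410-1035"},
    {"310-410-9999"},
])
```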
Hacker News users discussed Rayhunter's practicality and potential limitations. Some questioned the effectiveness of relying on signal strength changes for detection, citing the inherent variability of mobile networks. Others pointed out the limited scope of the tool, being tied to a specific hardware device. The discussion also touched upon the legality of using such a tool and the difficulty in distinguishing IMSI catchers from legitimate cell towers with similar behavior. Several commenters expressed interest in expanding the tool's compatibility with other hardware or exploring alternative detection methods based on signal timing or other characteristics. There was also skepticism about the prevalence of IMSI catchers and the actual risk they pose to average users.
Belgian artist Dries Depoorter created "The Flemish Scrollers," an art project using AI to detect and publicly shame Belgian politicians caught using their phones during parliamentary livestreams. The project automatically clips videos of these instances and posts them to a Twitter bot account, tagging the politicians involved. Depoorter aims to highlight politicians' potential inattentiveness during official proceedings.
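The summary doesn't detail Depoorter's implementation, but the general shape of such a pipeline — sampling livestream frames and running an off-the-shelf object detector whose classes include "cell phone" — might look like the sketch below. The model choice, confidence threshold, and input file are assumptions, and matching a detection to a specific politician would require a separate face-recognition step:

```python
# Sketch of phone detection in video; not Depoorter's actual code.
import cv2
from ultralytics import YOLO  # pretrained COCO model; "cell phone" is a COCO class

model = YOLO("yolov8n.pt")
cap = cv2.VideoCapture("parliament_livestream.mp4")  # hypothetical recording

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_idx += 1
    if frame_idx % 30:  # sample roughly one frame per second at 30 fps
        continue
    for result in model(frame, verbose=False):
        for box in result.boxes:
            if model.names[int(box.cls)] == "cell phone" and float(box.conf) > 0.6:
                print(f"frame {frame_idx}: phone detected "
                      f"(confidence {float(box.conf):.2f})")
cap.release()
```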
HN commenters largely criticized the project for being creepy and invasive, raising privacy concerns about publicly shaming politicians for normal behavior. Some questioned the legality and ethics of facial recognition used in this manner, particularly without consent. Several pointed out the potential for misuse and the chilling effect on free speech. A few commenters found the project amusing or a clever use of technology, but these were in the minority. The practicality and effectiveness of the project were also questioned, with some suggesting politicians could easily circumvent it. There was a brief discussion about the difference between privacy expectations in public vs. private settings, but the overall sentiment was strongly against the project.
The UK's National Cyber Security Centre (NCSC), the cybersecurity arm of GCHQ, quietly removed official advice recommending the use of Apple's device encryption for protecting sensitive information. While no official explanation was given, the change coincides with the UK government's ongoing push for legislation enabling access to encrypted communications, suggesting a conflict between promoting security best practices and pursuing surveillance capabilities. The removal raises concerns about the government's commitment to strong encryption and the potential chilling effect on individuals and organizations that relied on the advice for data protection.
HN commenters discuss the UK government's removal of advice recommending Apple's encryption, speculating on the reasons. Some tie it to Apple's (since abandoned) client-side scanning plans, fearing they would have weakened end-to-end encryption. Others point to the Online Safety Act, which could mandate scanning of encrypted messages, making the previous recommendation untenable. A few posit the change relates to legal challenges or is simply outdated advice, Apple no longer being the sole provider of strong encryption. The overall sentiment expresses concern and distrust towards the government's motives, with many suspecting a push towards weakening encryption for surveillance purposes. Some also criticize the lack of transparency surrounding the change.
Apple is challenging a UK government order demanding that it create a "backdoor" into its encrypted services. The company argues that complying would compromise the security of all its devices and set a dangerous precedent globally, potentially forcing it to build similar backdoors for other governments. Apple claims the Investigatory Powers Act, under which the order was issued, doesn't authorize such demands and violates human rights protections. It is seeking judicial review of the order, arguing that existing investigative tools are sufficient.
HN commenters are largely skeptical of Apple's claims, pointing out that Apple already complies with lawful intercept requests in other countries and questioning whether this case is truly about a "backdoor" or simply about the scope and process of existing surveillance capabilities. Some suspect Apple is using this lawsuit as a PR move to bolster its privacy image, especially given the lack of technical details provided. Others suggest Apple is trying to establish legal precedent to push back against increasing government surveillance overreach. A few commenters express concern over the UK's Investigatory Powers Act and its implications for privacy and security. Several highlight the inherent conflict between national security and individual privacy, with no easy answers in sight. There's also discussion about the technical feasibility and potential risks of implementing such a system, including the possibility of it being exploited by malicious actors.
The Register reports that Google collects and transmits Android user data, including hardware identifiers and location, to its servers even before a user opens any apps or completes device setup. This pre-setup collection involves several Google services and occurs during the initial boot process, transmitting the IMEI, hardware serial number, SIM serial number, and details of nearby Wi-Fi access points. While Google claims the data is crucial for essential services like fraud prevention and software updates, the article raises privacy concerns, particularly because users are neither informed of the collection nor given an opportunity to opt out, and questions whether Google's practices strike a reasonable balance with user privacy.
HN commenters discuss the implications of Google's data collection on Android even before app usage. Some highlight the irony of Google's privacy claims contrasted with their extensive tracking. Several express resignation, suggesting this behavior is expected from Google and other large tech companies. One commenter mentions a study showing Google collecting data even when location services are disabled, and another points to the difficulty of truly opting out of this tracking without significant technical knowledge. The discussion also touches upon the limitations of using alternative Android ROMs or de-Googled phones, acknowledging their usability compromises. There's a general sense of pessimism about the ability of users to control their data in the Android ecosystem.
The blog post "Removing Jeff Bezos from My Bed" details the author's humorous, yet slightly unsettling, experience with Amazon's Echo Show 15 and its personalized recommendations. The author found that the device, positioned in their bedroom, consistently suggested purchasing a large, framed portrait of Jeff Bezos. While acknowledging the technical mechanisms likely behind this odd recommendation (facial recognition misidentification and correlated browsing data), they highlight the potential for such personalized advertising to become intrusive and even creepy within the intimate space of a bedroom. The post emphasizes the need for more thoughtful consideration of the placement and application of AI-powered advertising, especially as smart devices become increasingly integrated into our homes.
Hacker News users generally found the linked blog post humorous and relatable. Several commenters shared similar experiences with unwanted targeted ads, highlighting the creepiness factor and questioning the effectiveness of such highly personalized marketing. Some discussed the technical aspects of how these ads are generated, speculating about data collection practices and the algorithms involved. A few expressed concerns about privacy and the potential for misuse of personal information. Others simply appreciated the author's witty writing style and the absurdity of the situation. The top comment humorously suggested an alternative headline: "Man Discovers Retargeting."
Apple has withdrawn its iCloud Advanced Data Protection feature, which offers end-to-end encryption for almost all iCloud data, from the UK. The move reportedly follows a secret government order under the Investigatory Powers Act demanding access to encrypted user data, a demand Apple has refused to satisfy by building a backdoor. Apple says it hopes to offer the feature to UK users again, but has given no timeline. While the feature remains available in other countries, the withdrawal raises questions about the balance between privacy and government access to data.
HN commenters largely agree that withdrawing the feature, rather than building a backdoor into it, was the less bad option. Some believe Apple was pressured via the UK's Investigatory Powers Act, which can compel companies to weaken security features and gag them from disclosing the order. Others suggest Apple pulled the feature to avoid setting a global precedent of complying with secret demands. A few express disappointment for UK users, who lose end-to-end encryption for their iCloud data, and worry that other governments will follow. The potential for abuse of such powers was cited as a major concern, and skepticism towards the UK government's motivations is also evident.
Google's Threat Analysis Group (TAG) observed multiple Russia-aligned threat actors, including APT29 (Cozy Bear) and Sandworm, actively targeting Signal users. Rather than attacking Signal's encryption directly, these campaigns abused the app's legitimate "linked devices" feature: victims were phished with malicious QR codes that, once scanned, silently linked their Signal account to an attacker-controlled device, delivering a copy of every subsequent message to the attacker. Signal's encryption remains unbroken, but the targeting underscores the lengths to which nation-state actors will go to compromise secure communications.
HN commenters express skepticism about the Google blog post, questioning its timing and motivations. Some suggest it's a PR move by Google, designed to distract from their own security issues or promote their own messaging platforms. Others point out the lack of technical details in the post, making it difficult to assess the credibility of the claims. A few commenters discuss the inherent difficulties of securing any messaging platform against determined state-sponsored actors and the importance of robust security practices regardless of the provider. The possibility of phishing campaigns, rather than Signal vulnerabilities, being the attack vector is also raised. Finally, some commenters highlight the broader context of the ongoing conflict and the increased targeting of communication platforms.
Bipartisan U.S. lawmakers are expressing concern over a proposed U.K. surveillance law that would compel tech companies like Apple to compromise the security of their encrypted messaging systems. They argue that creating a "back door" for U.K. law enforcement would weaken security globally, putting Americans' data at risk and setting a dangerous precedent for other countries to demand similar access. This, they claim, would ultimately undermine encryption, a crucial tool for protecting sensitive information from criminals and hostile governments, and empower authoritarian regimes.
HN commenters are skeptical of the "threat to Americans" angle, pointing out that the UK and US already share significant intelligence data, and that a UK backdoor would likely be accessible to the US as well. Some suggest the real issue is Apple resisting government access to data, and that the article frames this as a UK vs. US issue to garner more attention. Others question the technical feasibility and security implications of such a backdoor, arguing it would create a significant vulnerability exploitable by malicious actors. Several highlight the hypocrisy of US lawmakers complaining about a UK backdoor while simultaneously pushing for similar capabilities themselves. Finally, some commenters express broader concerns about the erosion of privacy and the increasing surveillance powers of governments.
The UK government is relying on the Investigatory Powers Act to compel tech companies like Apple to remove security features, including end-to-end encryption, when deemed necessary for national security investigations. This would effectively create a backdoor, allowing government access to user data without users' knowledge or consent. Apple argues that this undermines user privacy and security, making everyone more vulnerable to hackers and authoritarian regimes. The law faces strong opposition from privacy advocates and tech experts, who warn of its potential for abuse and its chilling effect on free speech.
HN commenters express skepticism about the UK government's claims regarding the necessity of this order for national security, with several pointing out the hypocrisy of demanding backdoors while simultaneously promoting end-to-end encryption for their own communications. Some suggest this move is a dangerous precedent that could embolden other authoritarian regimes. Technical feasibility is also questioned, with some arguing that creating such a backdoor is impossible without compromising security for everyone. Others discuss the potential legal challenges Apple might pursue and the broader implications for user privacy globally. A few commenters raise concerns about the chilling effect this could have on whistleblowers and journalists.
Tim investigated the precision of location data used for targeted advertising by requesting his own data from ad networks. He found that location information shared with these networks, often through apps on his phone, was remarkably precise, pinpointing his location to within a few meters. He successfully identified his own apartment and even specific rooms within it based on the location polygons provided by the ad networks. This highlighted the potential privacy implications of sharing location data with apps, demonstrating how easily and accurately individuals can be tracked even without explicit consent for precise location sharing. The experiment revealed a lack of transparency and control over how this granular location data is collected, used, and shared by advertising ecosystems.
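A check like Tim's is easy to reproduce. Given latitude/longitude pairs from an ad-network data export, a haversine distance against a known reference point shows how tightly they cluster; the coordinates below are invented stand-ins:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

home = (40.712800, -74.006000)  # invented reference point

# Invented points standing in for coordinates from an ad-data export.
reported = [(40.712812, -74.005983), (40.712791, -74.006011)]

for lat, lon in reported:
    print(f"reported point is {haversine_m(*home, lat, lon):.1f} m from home")
# A change in the fifth decimal place of a degree is roughly a meter, so
# coordinates this precise single out not just a building but a room.
```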
HN commenters generally agreed with the article's premise that location tracking through in-app advertising is pervasive and concerning. Some highlighted the irony of privacy policies that claim not to share precise location while effectively doing so through ad requests containing latitude/longitude. Several discussed technical details, including the surprising precision achievable even without GPS and the potential misuse of background location data. Others pointed to the broader ecosystem issue, emphasizing the difficulty in assigning blame to any single actor and the collective responsibility of ad networks, app developers, and device manufacturers. A few commenters suggested potential mitigations like VPNs or disabling location services entirely, while others expressed resignation to the current state of surveillance. The effectiveness of "Limit Ad Tracking" settings was also questioned.
A new report reveals California law enforcement misused state databases over 7,000 times in 2023, a significant increase from previous years. These violations, documented by the California Department of Justice, ranged from unauthorized access for personal reasons to sharing information improperly with third parties. The most frequent abuses involved accessing driver's license information and criminal histories, raising concerns about privacy and potential discrimination. While the report highlights increased reporting and accountability measures, the sheer volume of violations underscores the need for continued oversight and stricter enforcement to prevent future misuse of sensitive personal data.
Hacker News users discuss the implications of California law enforcement's misuse of state databases. Several express concern over the lack of meaningful consequences for officers, suggesting the fines are too small to deter future abuse. Some highlight the potential chilling effect on reporting crimes, particularly domestic violence, if victims fear their information will be improperly accessed. Others call for greater transparency and public access to the audit data, along with stricter penalties for offenders, including termination and criminal charges. The need for stronger oversight and systemic changes within law enforcement agencies is a recurring theme. A few commenters question the scope of permissible searches and the definition of "misuse," suggesting further clarification is needed.
Cory Doctorow's "It's Not a Crime If We Do It With an App" argues that enclosing formerly analog activities within proprietary apps often transforms acceptable behaviors into exploitable data points. Companies use the guise of convenience and added features to justify these apps, gathering vast amounts of user data that is then monetized or weaponized through surveillance. This creates a system where everyday actions, previously unregulated, become subject to corporate control and potential abuse, ultimately diminishing user autonomy and creating new vectors for discrimination and exploitation. The post uses the satirical example of a potato-tracking app to illustrate how seemingly innocuous data collection can lead to intrusive monitoring and manipulation.
HN commenters generally agree with Doctorow's premise that large corporations use "regulatory capture" to avoid legal consequences for harmful actions, citing examples like Facebook and Purdue Pharma. Some questioned the framing of the potato tracking scenario as overly simplistic, arguing that real-world supply chains are vastly more complex. A few commenters discussed the practicality of Doctorow's proposed solutions, debating the efficacy of co-ops and decentralized systems in combating corporate power. There was some skepticism about the feasibility of truly anonymized data collection and the potential for abuse even in decentralized systems. Several pointed out the inherent tension between the convenience offered by these technologies and the potential for exploitation.
This guide emphasizes minimizing digital traces for protesters through practical smartphone security advice. It recommends a secondary "burner" phone dedicated to protests, ideally a basic model without internet connectivity. If a primary smartphone must be used, strong passcodes or biometrics, full-disk encryption, and up-to-date software are crucial. Minimizing data collection means disabling location services, revoking microphone access from apps that don't need it, and swapping defaults for privacy-respecting alternatives: Signal for messaging and a privacy-focused browser. During protests, enabling airplane mode or using a Faraday bag is advised. The guide also covers digital threat modeling, stressing awareness of and preparedness for potential surveillance and data breaches.
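As a concrete way to act on the microphone and location advice, the sketch below uses adb (Android Debug Bridge) to list user-installed apps that currently hold those permissions. It assumes a device with USB debugging enabled, and the dumpsys parsing reflects the common output format, which can vary across Android versions:

```python
# Quick permission audit over adb; requires the adb tool on PATH and a
# connected device with USB debugging enabled. Parsing is best-effort.
import subprocess

WATCHED = ("android.permission.RECORD_AUDIO",
           "android.permission.ACCESS_FINE_LOCATION")

def adb_shell(*args: str) -> str:
    return subprocess.run(["adb", "shell", *args],
                          capture_output=True, text=True, check=True).stdout

# "pm list packages -3" lists user-installed (third-party) packages.
packages = [line.removeprefix("package:").strip()
            for line in adb_shell("pm", "list", "packages", "-3").splitlines()
            if line.strip()]

for pkg in packages:
    dump = adb_shell("dumpsys", "package", pkg)
    for perm in WATCHED:
        # Granted runtime permissions show up as "<permission>: granted=true".
        if f"{perm}: granted=true" in dump:
            print(f"{pkg} holds {perm}")
```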
Hacker News users discussed the practicality and necessity of the guide's recommendations for protesters. Some questioned the threat model, arguing that most protesters wouldn't be targeted by sophisticated adversaries. Others pointed out that basic digital hygiene practices are beneficial for everyone, regardless of protest involvement. Several commenters offered additional tips, like using a burner phone or focusing on physical security. The effectiveness of GrapheneOS was debated, with some praising its security while others questioned its usability for average users. A few comments highlighted the importance of compartmentalization and using separate devices for different activities.
A federal court ruled the NSA's warrantless searches of Americans' data under Section 702 of the Foreign Intelligence Surveillance Act unconstitutional. The court found that the "backdoor searches," querying a database of collected communications for information about Americans, violated the Fourth Amendment's protection against unreasonable searches. This landmark decision significantly limits the government's ability to search this data without a warrant, marking a major victory for digital privacy. The ruling specifically focuses on querying data already collected, not the collection itself, and the government may appeal.
HN commenters largely celebrate the ruling against warrantless searches of 702 data, viewing it as a significant victory for privacy. Several highlight the problematic nature of the "backdoor search" loophole and its potential for abuse. Some express skepticism about the government's likely appeals and the long road ahead to truly protect privacy. A few discuss the technical aspects of 702 collection and the challenges in balancing national security with individual rights. One commenter points out the irony of the US government criticizing other countries' surveillance practices while engaging in similar activities domestically. Others offer cautious optimism, hoping this ruling sets a precedent for future privacy protections.
A seemingly innocuous USB-C to Ethernet adapter, purchased from Amazon, was found to contain a sophisticated implant capable of malicious activity. This implant included a complete system with a processor, memory, and network connectivity, hidden within the adapter's casing. Upon plugging it in, the adapter established communication with a command-and-control server, potentially enabling remote access, data exfiltration, and other unauthorized actions on the connected computer. The author meticulously documented the hardware and software components of the implant, revealing its advanced capabilities and stealthy design, highlighting the potential security risks of seemingly ordinary devices.
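Stories like this argue for auditing what any newly attached network device is actually doing. The following is a rough sketch using the psutil library, not a substitute for proper network monitoring: the "known good" interface list is a placeholder, and some operating systems require elevated privileges to see which process owns a connection:

```python
# List network interfaces and established outbound connections so that an
# unexpected adapter "phoning home" stands out. Interface names are placeholders.
import psutil

KNOWN_INTERFACES = {"lo", "eth0", "wlan0"}  # adjust for your machine

for name in psutil.net_if_addrs():
    if name not in KNOWN_INTERFACES:
        print(f"unrecognized network interface: {name}")

for conn in psutil.net_connections(kind="inet"):
    if conn.raddr and conn.status == psutil.CONN_ESTABLISHED:
        print(f"pid {conn.pid}: {conn.laddr.ip}:{conn.laddr.port} -> "
              f"{conn.raddr.ip}:{conn.raddr.port}")
```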
Hacker News users discuss the practicality and implications of the "evil" RJ45 dongle detailed in the article. Some question the dongle's true malicious intent, suggesting it might be a poorly designed device for legitimate (though obscure) networking purposes like hotel internet access. Others express fascination with the hardware hacking and reverse-engineering process. Several commenters discuss the potential security risks of such devices, particularly in corporate environments, and the difficulty of detecting them. There's also debate on the ethics of creating and distributing such hardware, with some arguing that even proof-of-concept devices can be misused. A few users share similar experiences encountering unexpected or unexplained network behavior, highlighting the potential for hidden hardware compromises.
Hacker News users discussed the implications of the FBI raid and subsequent disappearance of the computer scientist, expressing concern over the lack of public information and potential chilling effects on academic research. Some speculated about the reasons behind the raid, ranging from national security concerns to more mundane possibilities like grant fraud or data mismanagement. Several commenters questioned the university's swift removal of the scientist's webpage, viewing it as an overreaction and potentially damaging to his reputation. Others pointed out the difficulty of drawing conclusions without knowing the specifics of the investigation, advocating for cautious observation until more information emerges. The overall sentiment leaned towards concern for the scientist's well-being and apprehension about the precedent this sets for academic freedom.
The Hacker News post titled "FBI raids home of prominent computer scientist who has gone incommunicado" (linking to an Ars Technica article about D'Amato's disappearance) has generated a significant number of comments discussing various aspects of the situation. Many commenters express concern over the lack of information and the chilling effect this kind of action could have on academic research and international collaboration.
Several commenters focus on the potential implications of the FBI raid and D'Amato's subsequent disappearance. Some speculate about possible reasons, ranging from intellectual property theft to espionage, while acknowledging the absence of publicly available evidence. Others caution against jumping to conclusions and emphasize the importance of due process and the presumption of innocence. The secrecy surrounding the case fuels speculation and anxiety.
A recurring theme in the comments is the potential damage to academic freedom and international collaboration. Commenters worry that incidents like this could deter foreign researchers from working in the US or collaborating with American institutions. Some express concerns that the incident could exacerbate existing tensions between the US and China.
Some commenters question the proportionality of the FBI's response, particularly given the lack of publicly disclosed information about the nature of the alleged wrongdoing. They highlight the potential for such raids to disrupt research, damage reputations, and cause significant personal distress even if the individual is ultimately exonerated.
A few commenters offer alternative perspectives, suggesting that the lack of public information might indicate the sensitivity or complexity of the investigation. They argue that it's premature to criticize the FBI's actions without a clearer understanding of the circumstances.
Many comments dissect the Ars Technica article itself, pointing out what they perceive as journalistic shortcomings, such as the reliance on anonymous sources and the lack of concrete details. Some commenters express frustration with the article's focus on speculation rather than verifiable facts.
Finally, several commenters offer practical advice and support, sharing information about legal resources and expressing solidarity with D'Amato and his family. There's a palpable sense of concern within the community for D'Amato's well-being and the broader implications of his disappearance. The comments reflect a desire for transparency and a cautious approach to judgment in the absence of confirmed information.