The Register reports that Google collects and transmits Android user data, including hardware identifiers and location, to its servers before a user opens any apps or even completes device setup. This pre-setup collection involves several Google services and occurs during the initial boot process, transmitting information such as the IMEI, hardware serial number, SIM serial number, and nearby Wi-Fi access point details. While Google maintains that this data is essential for services like fraud prevention and software updates, the article raises privacy concerns, particularly because users are neither informed of the collection nor given an opportunity to opt out, and questions the balance Google strikes between user privacy and its data collection practices.
Mozilla's Firefox Terms state that they collect information you input into the browser, including text entered in forms, search queries, and URLs visited. This data is used to provide and improve Firefox features like autofill, search suggestions, and syncing. Mozilla emphasizes that they handle this information responsibly, aiming to minimize data collection, de-identify data where possible, and provide users with controls to manage their privacy. They also clarify that while they collect this data, they do not collect the content of web pages you visit unless you explicitly choose features like Pocket or Firefox Screenshots, which are governed by separate privacy policies.
HN users express concern and skepticism over Mozilla's claim to own "information you input through Firefox," interpreting it as overly broad and potentially invasive. Some argue the wording is likely a clumsy attempt to cover necessary data collection for features like sync and breach alerts, not a declaration of ownership over user-created content. Others point out the impracticality of Mozilla storing and utilizing such vast amounts of data, suggesting it's a legal safeguard rather than a reflection of actual practice. A few commenters highlight the contrast with Firefox's privacy-focused image, questioning the need for such strong language. Several users recommend alternative browsers like LibreWolf and Ungoogled Chromium, perceiving them as more privacy-respecting alternatives.
South Korea's Personal Information Protection Commission has accused DeepSeek, the Chinese AI firm behind the DeepSeek chatbot, of illegally sharing user data with China-based ByteDance. The regulator alleges that the DeepSeek app sent South Korean users' personal information to ByteDance servers without proper user consent, violating South Korean privacy laws. DeepSeek now faces a potential fine and a corrective order.
Several Hacker News commenters express skepticism about the accusations against DeepSeek, pointing out the lack of concrete evidence presented and questioning the South Korean regulator's motives. Some speculate this could be politically motivated, related to broader US-China tensions and a desire to protect domestic companies like Kakao. Others discuss the difficulty of proving data sharing, particularly with the complexity of modern AI models and training data. A few commenters raise concerns about the potential implications for open-source AI models, wondering if they could be inadvertently trained on improperly obtained data. There's also discussion about the broader issue of data privacy and the challenges of regulating international data flows, particularly involving large tech companies.
The author claims to have found a vulnerability in YouTube's systems that allows retrieval of the email address associated with any YouTube channel, a discovery that earned them a $10,000 bug bounty. They describe a process involving crafting specific playlist URLs and exploiting how YouTube handles playlist sharing and unlisted videos to ultimately reveal the target channel's email address within a Google Account picker. While they provided Google with a proof-of-concept, they did not fully disclose the details publicly for ethical and security reasons. They emphasize the seriousness of this vulnerability, given the potential for targeted harassment and phishing attacks against prominent YouTubers.
HN commenters largely discussed the plausibility and specifics of the vulnerability described in the article. Some doubted the $10,000 bounty figure, suggesting it was inflated. Others questioned whether the vulnerability stemmed from a single bug or multiple chained exploits. A few commenters analyzed the technical details, focusing on the potential involvement of improperly configured OAuth flows or mismanaged access tokens within YouTube's systems. There was also skepticism about the ethical implications of disclosing the vulnerability details before Google had a chance to patch it, with some arguing responsible disclosure practices weren't followed. Finally, several comments highlighted the broader security risks associated with OAuth and similar authorization mechanisms.
The Asurion article outlines how to manage various Apple "intelligence" features, which personalize and improve user experience but also collect data. It explains how to disable Siri suggestions, location tracking for specific apps or entirely, personalized ads, sharing analytics with Apple, and features like Significant Locations and personalized recommendations in apps like Music and TV. The article emphasizes that disabling these features may impact the functionality of certain apps and services, and offers steps for both iPhone and Mac devices.
HN commenters largely express skepticism and distrust of Apple's "intelligence" features, viewing them as data collection tools rather than genuinely helpful features. Several comments highlight the difficulty in truly disabling these features, pointing out that Apple often re-enables them with software updates or buries the relevant settings deep within menus. Some users suggest that these "intelligent" features primarily serve to train Apple's machine learning models, with little tangible benefit to the end user. A few comments discuss specific examples of unwanted behavior, like personalized ads appearing based on captured data. Overall, the sentiment is one of caution and a preference for maintaining privacy over utilizing these features.
Summary of Comments (22): https://news.ycombinator.com/item?id=43253167
HN commenters discuss the implications of Google's data collection on Android even before app usage. Some highlight the irony of Google's privacy claims contrasted with their extensive tracking. Several express resignation, suggesting this behavior is expected from Google and other large tech companies. One commenter mentions a study showing Google collecting data even when location services are disabled, and another points to the difficulty of truly opting out of this tracking without significant technical knowledge. The discussion also touches upon the limitations of using alternative Android ROMs or de-Googled phones, acknowledging their usability compromises. There's a general sense of pessimism about the ability of users to control their data in the Android ecosystem.
The Hacker News post discussing The Register's article about Google's Android tracking practices has generated a substantial discussion with various viewpoints and insights.
Several commenters express concerns about the extent of data collection occurring before users even interact with apps. They discuss the implications of pre-installed apps and system-level services sending data to Google, highlighting the potential privacy risks, especially for users unaware of this background activity. Some debate the necessity of this data collection for functionality versus Google's potential exploitation for advertising or other purposes. The discussion also touches upon the difficulty for users to opt out of this tracking, given its integration within the Android operating system itself.
One recurring theme is the comparison of Android's data collection practices to those of Apple's iOS. Commenters debate which operating system provides better privacy, with some arguing that Apple's approach is more transparent and user-centric. Others point out that both companies collect significant user data, albeit through different mechanisms.
A few commenters delve into the technical aspects of the data collection, discussing the role of Firebase Cloud Messaging (FCM) and other system-level components. They explain how these components facilitate communication between devices and Google servers, enabling features like push notifications but also potentially contributing to the pre-app usage data collection.
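The trade-off these commenters describe can be illustrated with a toy model (hypothetical Python; this is not Google's actual FCM protocol or API): for a device to be reachable for push notifications at all, it has to register a stable identifier with the messaging server at boot, before the user opens any app.

```python
# Toy sketch (hypothetical, NOT the real FCM protocol): push messaging
# requires boot-time registration, so the server necessarily learns a
# stable device identifier before any app is ever launched.

class PushServer:
    """Stand-in for a cloud messaging backend."""
    def __init__(self):
        self.registrations = {}  # device_id -> mailbox of pending messages

    def register(self, device_id: str) -> str:
        # Registration reveals a stable device identifier to the server,
        # even if the user never interacts with the device afterwards.
        self.registrations.setdefault(device_id, [])
        return f"token-for-{device_id}"

    def push(self, device_id: str, message: str) -> None:
        self.registrations[device_id].append(message)


class Device:
    def __init__(self, device_id: str, server: PushServer):
        self.device_id = device_id
        self.server = server
        # Happens during first boot, before any user interaction.
        self.token = server.register(device_id)

    def poll(self) -> list:
        return self.server.registrations[self.device_id]


server = PushServer()
phone = Device("imei-123456", server)  # setup alone registers the device
server.push("imei-123456", "system update available")
print(phone.poll())  # the same channel that delivers updates saw the ID first
```

The design choice the commenters are pointing at falls out of the model: the registration step that makes push delivery possible is also the step that transmits an identifier before any app is used.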
The discussion also extends to the broader issue of data privacy in the tech industry. Commenters express frustration with the lack of control users have over their data and the pervasive nature of tracking across various platforms and services. Some advocate for stronger regulations and greater transparency from tech companies regarding data collection practices.
There are also more skeptical comments questioning the novelty or significance of the findings in The Register's article. Some suggest that this type of background data transmission is inherent in modern mobile operating systems and necessary for basic functionality. They argue that the article might be overstating the privacy implications or presenting information already known within the tech community.
Finally, some commenters offer practical advice for users concerned about privacy, such as using alternative ROMs like LineageOS or exploring privacy-focused mobile operating systems like GrapheneOS. They discuss the trade-offs between functionality and privacy, acknowledging that more privacy-centric options may require technical expertise or involve sacrificing certain features.