The Register reports that Google collects and transmits Android user data, including hardware identifiers and location, to its servers even before a user opens any apps or completes device setup. This pre-setup collection involves several Google services and occurs during the initial boot process, transmitting information such as the IMEI, hardware serial number, SIM serial number, and details of nearby Wi-Fi access points. While Google claims this data is crucial for essential services like fraud prevention and software updates, the article raises privacy concerns, particularly because users are neither informed of the collection nor given an opportunity to opt out, leaving open the question of where operational necessity ends and data harvesting begins.
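Findings like these typically come from intercepting a test handset's network traffic during first boot. Below is a rough sketch of that measurement style, written as a mitmproxy addon that logs which Google endpoints a proxied device contacts; the host list and setup are illustrative assumptions, not the researchers' actual tooling.

```python
# Illustrative mitmproxy addon: log traffic from a test Android device routed
# through the proxy during first boot. A hypothetical sketch; the actual
# study's interception setup and host list are not reproduced here.
from mitmproxy import http

# Hosts commonly associated with Google device services (assumed examples).
WATCHED_SUFFIXES = (".googleapis.com", ".google.com", ".gstatic.com")

class FirstBootLogger:
    def request(self, flow: http.HTTPFlow) -> None:
        host = flow.request.pretty_host
        if host.endswith(WATCHED_SUFFIXES):
            # Record endpoint and payload size; decrypted bodies are where
            # identifiers like serial numbers would show up, if TLS
            # interception succeeds on the test device.
            size = len(flow.request.content or b"")
            print(f"{host} {flow.request.path} ({size} bytes)")

addons = [FirstBootLogger()]
```

Run with `mitmdump -s firstboot_logger.py` and point the test device's Wi-Fi proxy at the interception host; decrypting payloads additionally requires installing the proxy's CA certificate on the device.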
Mozilla's Firefox Terms state that they collect information you input into the browser, including text entered in forms, search queries, and URLs visited. This data is used to provide and improve Firefox features like autofill, search suggestions, and syncing. Mozilla emphasizes that they handle this information responsibly, aiming to minimize data collection, de-identify data where possible, and provide users with controls to manage their privacy. They also clarify that while they collect this data, they do not collect the content of web pages you visit unless you explicitly choose features like Pocket or Firefox Screenshots, which are governed by separate privacy policies.
HN users express concern and skepticism over Mozilla's claim to own "information you input through Firefox," interpreting it as overly broad and potentially invasive. Some argue the wording is likely a clumsy attempt to cover necessary data collection for features like sync and breach alerts, not a declaration of ownership over user-created content. Others point out the impracticality of Mozilla storing and utilizing such vast amounts of data, suggesting it's a legal safeguard rather than a reflection of actual practice. A few commenters highlight the contrast with Firefox's privacy-focused image, questioning the need for such strong language. Several users recommend alternative browsers like LibreWolf and Ungoogled Chromium, perceiving them as more privacy-respecting alternatives.
The blog post "Removing Jeff Bezos from My Bed" details the author's humorous, yet slightly unsettling, experience with Amazon's Echo Show 15 and its personalized recommendations. The author found that the device, positioned in their bedroom, consistently suggested purchasing a large, framed portrait of Jeff Bezos. While acknowledging the technical mechanisms likely behind this odd recommendation (facial recognition misidentification and correlated browsing data), they highlight the potential for such personalized advertising to become intrusive and even creepy within the intimate space of a bedroom. The post emphasizes the need for more thoughtful consideration of the placement and application of AI-powered advertising, especially as smart devices become increasingly integrated into our homes.
Hacker News users generally found the linked blog post humorous and relatable. Several commenters shared similar experiences with unwanted targeted ads, highlighting the creepiness factor and questioning the effectiveness of such highly personalized marketing. Some discussed the technical aspects of how these ads are generated, speculating about data collection practices and the algorithms involved. A few expressed concerns about privacy and the potential for misuse of personal information. Others simply appreciated the author's witty writing style and the absurdity of the situation. The top comment humorously suggested an alternative headline: "Man Discovers Retargeting."
Meta's Project Aria research kit consists of smart glasses and a wristband designed to gather first-person data like video, audio, eye-tracking, and location, which will be used to develop future AR glasses. This data is anonymized and used to train AI models that understand the real world, enabling features like seamless environmental interaction and intuitive interfaces. The research kit is not a consumer product and is only distributed to qualified researchers participating in specific studies. The project emphasizes privacy and responsible data collection, employing blurring and redaction techniques to protect bystanders' identities in the collected data.
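Meta has not published its redaction pipeline, but the bystander protection described is, in essence, face anonymization. Here is a minimal sketch of that idea using OpenCV's bundled Haar-cascade detector and Gaussian blur, an assumed stand-in; Aria's production system is presumably ML-based and more robust.

```python
# Illustrative face anonymization, standing in for the kind of bystander
# redaction Project Aria describes; Meta's real pipeline is not public.
import cv2

def blur_faces(frame):
    """Detect faces in a BGR frame and blur each detected region in place."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # A heavy Gaussian blur makes the region unrecognizable while
        # leaving the rest of the scene usable for model training.
        frame[y:y+h, x:x+w] = cv2.GaussianBlur(frame[y:y+h, x:x+w], (51, 51), 0)
    return frame

if __name__ == "__main__":
    img = cv2.imread("frame.jpg")  # any captured frame
    cv2.imwrite("frame_redacted.jpg", blur_faces(img))
```

A production pipeline would likely use a learned detector and stronger redaction (pixelation or inpainting), but the data flow is the same: detect, then destroy identifying detail before storage.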
Several Hacker News commenters express skepticism about Meta's Project Aria research kit, questioning the value of collecting such extensive data and the potential privacy implications. Some doubt the project's usefulness for AR development, suggesting that realistic scenarios are more valuable than vast amounts of "boring" data. Others raise concerns about data security and the possibility of misuse, drawing parallels to previous controversies surrounding Meta's data practices. A few commenters are more optimistic, seeing potential for advancements in AR and expressing interest in the technical details of the data collection process. Several also discuss the challenges of processing and making sense of such a massive dataset, and the limitations of relying solely on first-person visual data for understanding human behavior.
A UK watchdog is investigating Apple's compliance with its own App Tracking Transparency (ATT) framework, questioning why Apple's first-party apps seem exempt from the same stringent data collection rules imposed on third-party developers. The Competition and Markets Authority (CMA) is particularly scrutinizing how Apple gathers and uses user data within its own apps, given that it doesn't require user permission via the ATT pop-up prompts like third-party apps must. The probe aims to determine if this apparent double standard gives Apple an unfair competitive advantage in the advertising and app markets, potentially breaching competition law.
HN commenters largely agree that Apple's behavior is hypocritical, applying stricter tracking rules to third-party apps while seemingly exempting its own. Some suggest this is classic regulatory capture, where Apple leverages its gatekeeper status to stifle competition. Others point out the difficulty of proving Apple's data collection is for personalized ads, as Apple claims it's for "personalized experiences." A few commenters argue Apple's first-party data usage is less problematic because the data isn't shared externally, while others counter that the distinction is irrelevant from a privacy perspective. The lack of transparency around Apple's data collection practices fuels suspicion. A common sentiment is that Apple's privacy stance is more about marketing than genuine user protection. Some users also highlight the inherent conflict of interest in Apple acting as both platform owner and app developer.
A recent study argues that CAPTCHAs are essentially a profitable tracking system disguised as a security measure. While ostensibly designed to differentiate bots from humans, CAPTCHAs allow companies like Google to collect vast amounts of user data for targeted advertising and other purposes. By the study's estimates, the system has cost users roughly 819 million hours of time globally, while the tracking data it yields is valued at nearly $1 trillion, accruing primarily to Google. The study contends that the actual security benefits of CAPTCHAs are minimal compared to the value extracted from the user data they collect. This raises concerns about the balance between online security and user privacy, suggesting CAPTCHAs function more as a data-harvesting tool than an effective bot deterrent.
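As a back-of-envelope check on the scale of the time estimate (assuming, purely for illustration, a flat $7.25/hour US federal minimum wage rather than whatever wage weighting the study itself applied):

```python
# Rough scale check on the study's aggregate time estimate.
hours_spent = 819e6        # estimated global hours spent solving CAPTCHAs
wage_usd_per_hour = 7.25   # assumed flat rate, for illustration only
print(f"Implied labor cost: ${hours_spent * wage_usd_per_hour / 1e9:.1f} billion")
# -> Implied labor cost: $5.9 billion
```

The implied wage cost is in the billions; the near-trillion-dollar figure is a separate estimate of what the harvested tracking data is worth, not of the labor itself.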
Hacker News users generally agree with the premise that CAPTCHAs are exploitative. Several point out the irony of Google using them for training AI while simultaneously claiming they prevent bots. Some highlight the accessibility issues CAPTCHAs create, particularly for disabled users. Others discuss alternatives, such as Cloudflare's Turnstile, and the privacy implications of different solutions. The increasing difficulty and frequency of CAPTCHAs are also criticized, with some speculating it's a deliberate tactic to push users towards paid "captcha-free" services. Several commenters express frustration with the current state of CAPTCHAs and the lack of viable alternatives.
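For reference, the Turnstile alternative commenters mention verifies users with a single server-side POST to Cloudflare's siteverify endpoint. A minimal sketch follows; the secret key and token are placeholders supplied by your Turnstile configuration and the client-side widget.

```python
# Minimal server-side verification of a Cloudflare Turnstile token.
import requests

SITEVERIFY_URL = "https://challenges.cloudflare.com/turnstile/v0/siteverify"

def verify_turnstile(token: str, secret: str) -> bool:
    """Return True if Cloudflare confirms the client-side token is valid."""
    resp = requests.post(
        SITEVERIFY_URL,
        data={"secret": secret, "response": token},  # token comes from the widget
        timeout=5,
    )
    return resp.json().get("success", False)
```

The privacy trade-off commenters weigh is that this routes verification traffic to Cloudflare rather than Google, shifting rather than eliminating the third party in the loop.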
Tim investigated the precision of location data used for targeted advertising by requesting his own data from ad networks. The location information shared with these networks, often through apps on his phone, turned out to be remarkably precise, pinpointing him to within a few meters: he identified his own apartment, and even specific rooms within it, from the location polygons the ad networks provided. The experiment demonstrates how easily and accurately individuals can be tracked even without explicit consent for precise location sharing, and how little transparency or control users have over how this granular data is collected, used, and shared within the advertising ecosystem.
HN commenters generally agreed with the article's premise that location tracking through in-app advertising is pervasive and concerning. Some highlighted the irony of privacy policies that claim not to share precise location while effectively doing so through ad requests containing latitude/longitude. Several discussed technical details, including the surprising precision achievable even without GPS and the potential misuse of background location data. Others pointed to the broader ecosystem issue, emphasizing the difficulty in assigning blame to any single actor and the collective responsibility of ad networks, app developers, and device manufacturers. A few commenters suggested potential mitigations like VPNs or disabling location services entirely, while others expressed resignation to the current state of surveillance. The effectiveness of "Limit Ad Tracking" settings was also questioned.
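The latitude/longitude values commenters mention ride along in the bid request itself; in OpenRTB, the protocol most ad exchanges speak, they live in the device.geo object. Below is an illustrative fragment plus the quick arithmetic behind the "few meters" claim (the coordinates are fabricated for illustration, not taken from Tim's data).

```python
# Illustrative OpenRTB-style bid request fragment; coordinates are made up.
import math

bid_request = {
    "device": {
        "geo": {
            "lat": 52.37403,   # five decimal places of latitude/longitude
            "lon": 4.88969,
            "type": 1,         # OpenRTB: 1 = GPS / location services
            "accuracy": 10,    # reported accuracy in meters
        }
    }
}

# One degree of latitude is ~111 km, so the 5th decimal place is ~1.1 m.
lat = bid_request["device"]["geo"]["lat"]
lat_precision_m = 111_000 * 10**-5
lon_precision_m = 111_000 * math.cos(math.radians(lat)) * 10**-5
print(f"Implied precision: ~{lat_precision_m:.1f} m N-S, ~{lon_precision_m:.1f} m E-W")
```

Five decimal places, a common float precision in bid streams, is already enough to distinguish one apartment from the next, which matches what Tim found.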
The Asurion article outlines how to manage various Apple "intelligence" features, which personalize and improve the user experience but also collect data. It explains how to disable Siri suggestions, turn off location tracking for specific apps or entirely, opt out of personalized ads, stop sharing analytics with Apple, and switch off features like Significant Locations and personalized recommendations in apps like Music and TV. The article notes that disabling these features may impact the functionality of certain apps and services, and offers steps for both iPhone and Mac.
HN commenters largely express skepticism and distrust of Apple's "intelligence" features, viewing them as data collection tools rather than genuinely helpful features. Several comments highlight the difficulty in truly disabling these features, pointing out that Apple often re-enables them with software updates or buries the relevant settings deep within menus. Some users suggest that these "intelligent" features primarily serve to train Apple's machine learning models, with little tangible benefit to the end user. A few comments discuss specific examples of unwanted behavior, like personalized ads appearing based on captured data. Overall, the sentiment is one of caution and a preference for maintaining privacy over utilizing these features.
Summary of Comments (22)
https://news.ycombinator.com/item?id=43253167
HN commenters discuss the implications of Google's data collection on Android even before app usage. Some highlight the irony of Google's privacy claims contrasted with their extensive tracking. Several express resignation, suggesting this behavior is expected from Google and other large tech companies. One commenter mentions a study showing Google collecting data even when location services are disabled, and another points to the difficulty of truly opting out of this tracking without significant technical knowledge. The discussion also touches upon the limitations of using alternative Android ROMs or de-Googled phones, acknowledging their usability compromises. There's a general sense of pessimism about the ability of users to control their data in the Android ecosystem.
The Hacker News post discussing The Register's article about Google's Android tracking practices generated substantial discussion across a range of viewpoints.
Several commenters express concern about the extent of data collection occurring before users even interact with apps. They discuss the implications of pre-installed apps and system-level services sending data to Google, highlighting the privacy risks, especially for users unaware of this background activity. Some debate whether this collection is genuinely necessary for functionality or whether Google exploits it for advertising and other purposes. The discussion also touches on how difficult it is for users to opt out of this tracking, given its integration within the Android operating system itself.
One recurring theme is the comparison of Android's data collection practices to those of Apple's iOS. Commenters debate which operating system provides better privacy, with some arguing that Apple's approach is more transparent and user-centric. Others point out that both companies collect significant user data, albeit through different mechanisms.
A few commenters delve into the technical aspects of the data collection, discussing the role of Firebase Cloud Messaging (FCM) and other system-level components. They explain how these components facilitate communication between devices and Google servers, enabling features like push notifications but also potentially contributing to the pre-app usage data collection.
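To make that point concrete: every FCM push is relayed through Google's servers to a registration token the device obtains by checking in with Google, which is one reason an Android phone contacts Google before any app is opened. A minimal server-side send using the firebase-admin SDK is sketched below; the service-account file and device token are placeholders.

```python
# Illustrative server-side FCM push via the firebase-admin SDK. Every such
# message is routed through Google's servers to a device registration token.
import firebase_admin
from firebase_admin import credentials, messaging

# Placeholder service-account file for the sending project.
firebase_admin.initialize_app(credentials.Certificate("service-account.json"))

message = messaging.Message(
    notification=messaging.Notification(title="Hello", body="Delivered via FCM"),
    token="DEVICE_REGISTRATION_TOKEN",  # issued when the device checks in with Google
)
response = messaging.send(message)  # travels app server -> Google -> device
print("Message ID:", response)
```

The registration channel exists whether or not any third-party app ever sends a push, which is precisely the commenters' point about system-level components.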
The discussion also extends to the broader issue of data privacy in the tech industry. Commenters express frustration with the lack of control users have over their data and the pervasive nature of tracking across various platforms and services. Some advocate for stronger regulations and greater transparency from tech companies regarding data collection practices.
There are also more skeptical comments questioning the novelty or significance of the findings in The Register's article. Some suggest that this type of background data transmission is inherent in modern mobile operating systems and necessary for basic functionality. They argue that the article might be overstating the privacy implications or presenting information already known within the tech community.
Finally, some commenters offer practical advice for users concerned about privacy, such as using alternative ROMs like LineageOS or exploring privacy-focused mobile operating systems like GrapheneOS. They discuss the trade-offs between functionality and privacy, acknowledging that more privacy-centric options may require technical expertise or involve sacrificing certain features.