The Asurion article outlines how to manage various Apple "intelligence" features, which personalize and improve the user experience but also collect data. It explains how to disable Siri suggestions, location tracking (for specific apps or entirely), personalized ads, the sharing of analytics with Apple, and features like Significant Locations and personalized recommendations in apps such as Music and TV. The article emphasizes that disabling these features may affect the functionality of certain apps and services, and offers steps for both iPhone and Mac.
The Asurion article, "How to Turn Off Apple Intelligence," provides a comprehensive guide for users of Apple devices who wish to limit the data Apple collects to improve its products and services. The article focuses on several key areas where data collection occurs and details the steps needed to disable or restrict it. It begins by explaining that "Apple Intelligence" is a broad term encompassing various data-gathering processes, not a single, monolithic feature that can be toggled on or off. Managing data sharing therefore requires adjusting several individual settings across different areas of the operating system.
The article carefully outlines how to manage "Personalized Recommendations," which leverage user data to suggest apps, music, and other content. It explains how to disable these recommendations within the App Store, Apple Music, Apple Books, Apple Podcasts, Apple TV, and for News notifications. The article provides specific instructions for each, including navigating to the relevant menus and toggling the appropriate switches. For instance, within the App Store, users can disable personalized recommendations by tapping on their profile icon, then selecting "Personalized Recommendations" and toggling the switch to the off position.
Furthermore, the article addresses "Location Services," the feature that lets Apple and third-party apps access location data. It emphasizes the importance of understanding the different levels of location access: "Never," "Ask Next Time Or When I Share," "While Using the App," and "Always." The article explains how to adjust these settings for individual apps, giving users granular control over which apps can access their location and under what circumstances. It also highlights the "System Services" section within Location Services, where users can manage location-based system features such as location-based alerts, Significant Locations, and sharing location with family members.
The article then delves into "Siri & Dictation," explaining how voice data is used to improve Siri's performance. It guides users through the process of disabling Siri and Dictation entirely, or alternatively, opting out of sharing audio recordings with Apple for review and improvement purposes. The steps involve navigating to the "Siri & Search" section within the device's settings and adjusting the relevant toggles.
"Usage & Diagnostics," another significant data collection area, is also covered in the article. This feature shares diagnostic and usage data with Apple to help identify and resolve issues. The article explains how to disable the automatic sharing of this data by navigating to the "Privacy & Security" settings, then to "Analytics & Improvements," and disabling "Share [Device] Analytics."
Finally, the article briefly touches upon "iCloud Analytics," which analyzes iCloud data to improve services like Siri and Photos. The article explains how to disable this feature for specific services, such as Photos, by navigating to the respective app's settings within iCloud.
In conclusion, the article serves as a detailed manual for users who want to take control of their data privacy on Apple devices. It meticulously outlines the various data collection points, provides step-by-step instructions for disabling or limiting data sharing, and emphasizes the importance of understanding the implications of each setting.
Summary of Comments (425)
https://news.ycombinator.com/item?id=43385268
Hacker News users reacted with cynicism and resignation to the news that Amazon silently removed the Alexa voice recording privacy option. Many expressed the belief that Amazon never truly honored the setting in the first place, speculating the data was still collected regardless of user preference. Some commenters suggested that this move further erodes trust in Amazon and reinforces the perception that "big tech" companies prioritize data collection over user privacy. Others recommended alternative smart home solutions that respect privacy or simply avoiding such devices altogether. A few wondered about the technical or legal reasons behind the change, with some speculating it might be related to training large language models.
The Hacker News post titled "The Alexa feature 'do not send voice recordings' you enabled no longer available" sparked a discussion in which many commenters expressed concern and frustration over Amazon's removal of the setting that let users keep their voice recordings out of human review.
Many commenters felt betrayed by Amazon, highlighting the importance of such privacy controls. They expressed disappointment that Amazon seemingly reversed its stance on user privacy without adequate notification or explanation. The feeling of being misled was a recurring theme.
Some users speculated about the reasons behind Amazon's decision. One popular theory was that the data collected through human review was crucial for training Alexa's AI models, and discontinuing it might negatively impact the assistant's performance. Others suggested it could be related to cost-cutting measures, while some more cynical comments hinted at potential ulterior motives related to data collection and monetization.
Several commenters discussed alternative smart speakers and voice assistants, emphasizing privacy-focused options like Mycroft. This indicated a potential shift in user preference towards platforms that prioritize data privacy and transparency.
There was also a discussion about whether such privacy controls were effective in the first place. Some users questioned whether disabling human review genuinely prevented Amazon from accessing and using their data. This skepticism reflected a broader distrust of large tech companies and their data practices.
A few commenters shared their personal experiences with Alexa, some recounting instances where they felt their privacy had been compromised. These anecdotes further fueled the conversation about the importance of robust privacy controls and transparency from tech companies.
The overall sentiment in the comments section was negative, with many users expressing disappointment and frustration towards Amazon's decision. The removal of the privacy feature sparked a broader discussion about the balance between technological advancement, user privacy, and the trustworthiness of large tech companies.