France's data protection watchdog, the CNIL, fined Apple €8 million and Meta (Facebook's parent company) €60 million for violating EU privacy law. The fines stem from how each company implemented targeted advertising on its respective platform. The CNIL found that users were not given a simple enough mechanism to opt out of personalized ads: while both companies offered some control, users had to navigate multiple settings. Specifically, Apple enabled personalized ads by default, requiring users to actively disable them, while Meta made ad personalization integral to its terms of service, requiring users to take active steps to receive non-personalized ads. The CNIL considered both approaches violations of EU regulations that require clear and straightforward consent for personalized advertising.
Simon Willison speculates that Meta's decision to open-source its Llama large language model might be a strategic move to comply with the upcoming EU AI Act. The Act places greater regulatory burdens on "foundation models"—powerful, general-purpose AI models like Llama—especially those deployed commercially. By open-sourcing Llama, Meta potentially sidesteps these stricter regulations, as the open nature arguably diminishes Meta's direct control and thus its responsibility under the Act. This move lets Meta benefit from community contributions and improvements while possibly avoiding the costs and limitations of being classified as a foundation model provider under the EU's framework.
Several commenters on Hacker News discussed the potential impact of the EU AI Act on Meta's decision to release Llama as "open source." Some speculated that the Act's restrictions on foundation models might incentivize companies to release models openly to avoid the stricter regulations applied to closed-source, commercially available models. Others debated the true openness of Llama, pointing to the community license's restrictions on commercial use at scale and arguing that this limitation means it is not truly open source. A few commenters questioned whether Meta genuinely intended to avoid the AI Act or whether other factors, such as community goodwill and attracting talent, were more influential. There was also discussion of whether Meta's move was preemptive, anticipating future tightening of "open source" definitions within the Act. Some also observed the irony of regulations potentially driving more open access to powerful AI models.
Y Combinator, the prominent Silicon Valley startup accelerator, has publicly urged the White House to back the European Union's Digital Markets Act (DMA). They argue the DMA offers a valuable model for regulating large online platforms, promoting competition, and fostering innovation. YC believes US support would strengthen the DMA's global impact and encourage similar pro-competition regulations internationally, ultimately benefiting both consumers and smaller tech companies. They emphasize the need for interoperability and open platforms to break down the current dominance of "gatekeeper" companies.
HN commenters are generally supportive of the DMA and YC's stance. Several express hope that it will rein in the power of large tech companies, particularly Google and Apple, and foster more competition and innovation. Some question YC's motivations, suggesting they stand to benefit from increased competition. Others discuss the potential downsides, like increased compliance costs and fragmentation of the digital market. A few note the irony of a US accelerator supporting EU regulation, highlighting the perceived lack of similar action in the US. Some commenters also draw parallels with net neutrality and debate its effectiveness and impact. A recurring theme is the desire for more platform interoperability and less vendor lock-in.
Ecosia and Qwant, two European search engines prioritizing privacy and sustainability, are collaborating to build a new, independent European search index called the European Open Web Search (EOWS). This joint effort aims to reduce reliance on non-European indexes, promote digital sovereignty, and offer a more ethical and transparent alternative. The project is open-source and seeks community involvement to enrich the index and ensure its inclusivity, providing European users with a robust and relevant search experience powered by European values.
Several Hacker News commenters express skepticism about Ecosia and Qwant's ability to compete with Google, citing Google's massive data advantage and network effects. Some doubt the feasibility of building a truly independent index and question whether the joint effort will be significantly different from using Bing. Others raise concerns about potential bias and censorship, given the European focus. A few commenters, however, offer cautious optimism, hoping the project can provide a viable privacy-respecting alternative and contribute to a more decentralized internet. Some also express interest in the technical challenges involved in building such an index.
Several key EU regulations are slated to impact startups in 2025. The Data Act will govern industrial data sharing, requiring companies to make data available to users and others upon request, potentially affecting data-driven business models. The revised Payment Services Directive (PSD3) aims to enhance payment security and foster open banking, imposing stricter requirements on fintechs. The Cyber Resilience Act mandates enhanced cybersecurity for connected devices, placing compliance burdens on hardware and software developers. Additionally, the EU's AI Act, though taking effect later, could still shape product development strategies throughout 2025 with its tiered, risk-based approach to AI regulation. These regulations necessitate careful preparation and adaptation for startups operating within or targeting the EU market.
Hacker News users discussing the upcoming EU regulations generally express concerns about their complexity and potential negative impact on startups. Several commenters predict these regulations will disproportionately burden smaller companies due to the increased compliance costs, potentially stifling innovation and favoring larger, established players. Some highlight specific regulations, like the Digital Services Act (DSA) and the Digital Markets Act (DMA), and discuss their potential consequences for platform interoperability and competition. The platform liability aspect of the DSA is also a point of contention, with some questioning its practicality and effectiveness. Others note the broad scope of these regulations, extending beyond just tech companies, and affecting sectors like manufacturing and AI. A few express skepticism about the EU's ability to effectively enforce these regulations.
The author argues that relying on US-based cloud providers is no longer safe for governments and societies, particularly in Europe. The CLOUD Act grants US authorities access to data stored by US companies regardless of location, undermining data sovereignty and exposing sensitive information to potential surveillance. This risk is compounded by increasing geopolitical tensions and the weaponization of data, making dependence on US cloud infrastructure a strategic vulnerability. The author advocates for shifting towards European-owned and operated cloud solutions that prioritize data protection and adhere to stricter regulatory frameworks like GDPR, ensuring digital sovereignty and reducing reliance on potentially adversarial nations.
Hacker News users largely agreed with the article's premise, expressing concerns about US government overreach and data access. Several commenters highlighted the lack of legal recourse for non-US entities against US government actions. Some suggested the EU's data protection regulations are insufficient against such power. The discussion also touched on the geopolitical implications, with commenters noting the US's history of using its technological dominance for political gain. A few commenters questioned the feasibility of entirely avoiding US cloud providers, acknowledging their advanced technology and market share. Others mentioned open-source alternatives and the importance of developing sovereign cloud infrastructure within the EU. A recurring theme was the need for greater digital sovereignty and reducing reliance on US-based services.
The EU's AI Act, a landmark piece of legislation, is now in effect, banning AI systems deemed "unacceptable risk." This includes systems using subliminal techniques or exploiting vulnerabilities to manipulate people, social scoring systems used by governments, and real-time biometric identification systems in public spaces (with limited exceptions). The Act also sets strict rules for "high-risk" AI systems, such as those used in law enforcement, border control, and critical infrastructure, requiring rigorous testing, documentation, and human oversight. Enforcement varies by country but includes significant fines for violations. While some criticize the Act's broad scope and potential impact on innovation, proponents hail it as crucial for protecting fundamental rights and ensuring responsible AI development.
Hacker News commenters discuss the EU's AI Act, expressing skepticism about its enforceability and effectiveness. Several question how "unacceptable risk" will be defined and enforced, particularly given the rapid pace of AI development. Some predict the law will primarily impact smaller companies while larger tech giants find ways to comply on paper without meaningfully changing their practices. Others argue the law is overly broad, potentially stifling innovation and hindering European competitiveness in the AI field. A few express concern about the potential for regulatory capture and the chilling effect of vague definitions on open-source development. Some debate the merits of preemptive regulation versus a more reactive approach. Finally, a few commenters point out the irony of the EU enacting strict AI regulations while simultaneously pushing for "right to be forgotten" laws that could hinder AI development by limiting access to data.
Summary of Comments (174)
https://news.ycombinator.com/item?id=43770337
Hacker News commenters generally agree that the fines levied against Apple and Meta (formerly Facebook) are insignificant relative to their revenue, suggesting the penalties are more symbolic than impactful. Some point out the absurdity of the situation, with Apple being fined for giving users more privacy controls, while Meta is fined for essentially ignoring them. The discussion also questions the effectiveness of GDPR and similar regulations, arguing that they haven't significantly changed data collection practices and mostly serve to generate revenue for governments. Several commenters expressed skepticism about the EU's motives, suggesting the fines are driven by a desire to bolster European tech companies rather than genuinely protecting user privacy. A few commenters note the contrast between the EU's approach and that of the US, where similar regulations are seemingly less enforced.
The Hacker News post "Apple and Meta fined millions for breaching EU law" generated a modest number of comments, primarily focusing on the perceived absurdity of the fines and the EU's regulatory approach.
Several commenters expressed skepticism about the effectiveness and rationale behind the fines. One user questioned the logic of fining companies for allegedly violating user privacy while simultaneously mandating features (like ATT, App Tracking Transparency) that purportedly aim to protect user privacy. They highlighted the seemingly contradictory position of being penalized for not adhering to a standard while also being forced to implement a mechanism that arguably invites that penalty.
Another commenter pointed out the relatively small amount of the fines compared to the companies' vast revenues, suggesting that such penalties are unlikely to deter future behavior. They argued that these fines essentially amount to a "cost of doing business" rather than a genuine deterrent.
The discussion also touched on the complexities of obtaining user consent and the practical challenges of adhering to regulations like GDPR. A commenter sarcastically remarked on the expectation that users should meaningfully engage with complex consent pop-ups, noting the impracticality of expecting users to carefully consider and understand the implications of every consent request.
One comment questioned the actual impact on user privacy, suggesting that the fines might be more about generating revenue for the EU than genuinely protecting users. The same commenter raised the possibility of regulatory capture, implying that regulators might be influenced by larger tech companies.
Finally, a comment highlighted the seeming disparity in the application of GDPR regulations, observing that smaller companies face stricter enforcement while larger companies often seem to escape significant consequences. They used the analogy of enforcing traffic laws strictly on bicycles while ignoring violations by large trucks.
In essence, the comments reflect a general sentiment of skepticism and cynicism towards the EU's approach to regulating tech giants, questioning the effectiveness and motivations behind the fines, and highlighting the practical difficulties and perceived inconsistencies in their application.