PunchCard Key Backup is an open-source tool that allows you to physically back up cryptographic keys, like PGP or SSH keys, onto durable, punch-out cards. It encodes the key as a grid of punched holes, readable by a webcam and decodable by the software. This provides a low-tech, offline backup method resistant to digital threats and EMP attacks, ideal for long-term storage or situations where digital backups are unavailable or unreliable. The cards are designed to be easily reproducible and verifiable, and the project includes templates for printing your own cards.
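As a rough illustration of the encoding idea (this is not the project's actual card layout; the grid size and hole markers below are made up for the example), a key can be packed into a grid of bits where each set bit marks a hole to punch:

```python
# Illustrative sketch only -- not PunchCard Key Backup's actual card format.
# It packs a secret's bits into a rows x columns grid (1 = punch a hole)
# and decodes the grid back into bytes, which is the core idea behind
# encoding a key as punched holes.

def key_to_grid(key: bytes, columns: int = 16) -> list[list[int]]:
    """Turn a key into a grid of 0/1 cells, one bit per cell, MSB first."""
    bits = [(byte >> (7 - i)) & 1 for byte in key for i in range(8)]
    while len(bits) % columns:          # pad the last row so every row is full
        bits.append(0)
    return [bits[i:i + columns] for i in range(0, len(bits), columns)]

def grid_to_key(grid: list[list[int]], key_len: int) -> bytes:
    """Reassemble bytes from the grid (inverse of key_to_grid)."""
    bits = [cell for row in grid for cell in row][: key_len * 8]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[b:b + 8]))
        for b in range(0, len(bits), 8)
    )

secret = bytes.fromhex("00112233445566778899aabbccddeeff")  # 128-bit example key
grid = key_to_grid(secret)
assert grid_to_key(grid, len(secret)) == secret
for row in grid:
    print("".join("O" if cell else "." for cell in row))     # O = hole, . = blank
```

A real card format would additionally need alignment marks and error correction so a webcam scan can tolerate misreads; those are omitted here.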
The author removed the old-school "intermediate" certificate from their HTTPS site configuration. While this certificate was previously included to support older clients, modern clients no longer need it and its inclusion adds complexity, potential points of failure, and very slightly increases page load times. The author argues that maintaining compatibility with extremely outdated systems isn't worth the added hassle and potential security risks, especially considering the negligible real-world user impact. They conclude that simplifying the certificate chain improves security and performance while only affecting a minuscule, practically nonexistent portion of users.
HN commenters largely agree with the author's decision to drop support for legacy SSL/TLS versions. Many share anecdotes of dealing with similar compatibility issues, particularly with older embedded devices and niche software. Some discuss the balance between security and accessibility, acknowledging that dropping older protocols can cause breakage but ultimately increases security for the majority of users. Several commenters offer technical insights, discussing specific vulnerabilities in older TLS versions and the benefits of modern cipher suites. One commenter questions the author's choice of TLS 1.3 as a minimum, suggesting 1.2 as a more compatible, yet still reasonably secure, option. Another thread discusses the challenges of maintaining legacy systems and the pressure to upgrade, even when resources are limited. A few users mention specific tools and techniques for testing and debugging TLS compatibility issues.
Tinfoil, a YC-backed startup, has launched a platform offering verifiable privacy for cloud AI. It enables users to run AI inferences on encrypted data without decrypting it, preserving data confidentiality. This is achieved through homomorphic encryption and zero-knowledge proofs, allowing users to verify the integrity of the computation without revealing the data or model. Tinfoil aims to provide a secure and trustworthy way to leverage the power of cloud AI while maintaining full control and privacy over sensitive data. The platform currently supports image classification and stable diffusion tasks, with plans to expand to other AI models.
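Tinfoil's actual stack isn't detailed above, but the core homomorphic idea, doing arithmetic on ciphertexts without ever decrypting them, can be sketched with a toy Paillier scheme (additively homomorphic). The tiny parameters and the scheme itself are purely illustrative and are not anything Tinfoil is claimed to use:

```python
# Toy Paillier cryptosystem (additively homomorphic) -- illustration only.
# NOT Tinfoil's scheme; parameters are tiny and insecure. It just shows that
# ciphertexts can be combined without ever decrypting them.
from math import gcd
import secrets

p, q = 61, 53                                   # toy primes; real keys use huge primes
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p - 1, q - 1)
g = n + 1

def L(x):                                       # the "L function" from Paillier's scheme
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)             # precomputed decryption factor

def encrypt(m: int) -> int:
    while True:
        r = secrets.randbelow(n - 1) + 1        # random r in [1, n-1], coprime to n
        if gcd(r, n) == 1:
            return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 17, 25
ca, cb = encrypt(a), encrypt(b)
c_sum = (ca * cb) % n2                          # homomorphic addition: E(a)*E(b) = E(a+b)
assert decrypt(c_sum) == a + b
print(decrypt(c_sum))                           # 42
```

Real privacy-preserving inference requires schemes that also support multiplication, plus a great deal of engineering on top, which is where the performance and feasibility concerns in the comments below come from.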
The Hacker News comments on Tinfoil's launch generally express skepticism and concern around the feasibility of their verifiable privacy claims. Several commenters question how Tinfoil can guarantee privacy given the inherent complexities of AI models and potential data leakage. There's discussion about the difficulty of auditing encrypted computation and whether the claimed "zero-knowledge" properties can truly be achieved in practice. Some users point out the lack of technical details and open-sourcing, hindering proper scrutiny. Others doubt the market demand for such a service, citing the costs and performance overhead associated with privacy-preserving techniques. Finally, there's a recurring theme of distrust towards YC companies making bold claims about privacy.
To secure President Obama's BlackBerry, the NSA developed a custom, highly secured device nicknamed the Sectera Edge. It featured strong encryption, limited functionality (no camera, for example), and a heavily modified operating system to prevent malware and hacking. Only a small number of pre-screened contacts could communicate with the President through this device, and all communications were routed through secure government servers. Essentially, it was a stripped-down BlackBerry designed solely for secure communication, sacrificing features for maximum protection.
Hacker News users discussed the logistical and security challenges of securing a President's mobile device. Several commenters highlighted the inherent conflict between security and usability, questioning the actual functionality of Obama's secured BlackBerry. Some expressed skepticism about the claimed level of security, suggesting that a truly secure device would be severely limited in its capabilities. Others pointed out the irony of securing a device primarily used for communication with people likely using less secure devices, making the overall communication chain vulnerable. The discussion also touched on the use of hardware security modules and the difficulty in verifying the implementation of such security measures. A few users commented on the age of the article and how technology has changed since its publication.
This blog post analyzes "TM Sgnl," an Android app marketed as a secure messaging platform used by some Trump officials, including Mike Waltz. The author reverse-engineered the app, revealing it relies on the open-source Signal protocol but lacks crucial security features like forward secrecy and disappearing messages. Furthermore, TM Sgnl uses its own centralized server, raising concerns about data privacy and potential vulnerabilities compared to the official Signal app. The analysis concludes that despite presenting itself as a secure alternative, TM Sgnl likely offers weaker security and potentially exposes user data to greater risk.
HN commenters discuss the implications of using an obscure, unofficial Signal fork, TM-SGNL, by Trump officials. Several express concerns about the security and trustworthiness of such a client, particularly given its lack of transparency and potential for vulnerabilities. Some question the choice, suggesting it stems from a misunderstanding of Signal's functionality, specifically the belief that official servers could access their data. Others point out the irony of using a supposedly more secure app while simultaneously broadcasting its usage, potentially defeating the purpose. The feasibility of sideloading this app onto government-issued devices is also debated. A few comments highlight the difficulty of truly secure communication, even with robust tools like Signal, if operational security practices are poor. The discussion also touches on the broader issues of government officials' use of encrypted messaging and the challenges of balancing transparency and privacy.
Digital cinema encryption uses a Key Delivery Message (KDM) unique to each movie and projector combination. This KDM, encrypted with the projector's public key, contains the decryption key for the movie, which is itself encrypted with a studio-specific key. Upon receiving the KDM, the projector decrypts it using its private key, obtains the decryption key, and then uses that key to decrypt and play the movie. This system ensures that only authorized projectors can play specific movies within a given timeframe, preventing piracy.
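The real KDM format defined by SMPTE carries much more (validity windows, signatures, and so on), but the underlying hybrid-encryption pattern can be sketched with the widely used Python `cryptography` package. This is an illustrative sketch under those assumptions, not actual DCP tooling:

```python
# Rough sketch of the hybrid-encryption pattern behind a KDM (illustrative only;
# real KDMs follow SMPTE's specification and carry validity windows, signatures, etc.).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

# The projector / media block's key pair; its public key is what a studio targets.
projector_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Studio side: encrypt the "movie" with a fresh content key, then wrap that
# content key for one specific projector -- the wrapped key plays the KDM's role.
content_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
movie_ciphertext = AESGCM(content_key).encrypt(nonce, b"movie essence goes here", None)

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = projector_key.public_key().encrypt(content_key, oaep)

# Projector side: unwrap the content key with its private key, then decrypt the movie.
recovered_key = projector_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, movie_ciphertext, None) == b"movie essence goes here"
```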
Hacker News users discussed the complexities and vulnerabilities of the Digital Cinema Package (DCP) system. Several commenters highlighted the reliance on security through obscurity and the ease with which keys can be extracted, even with hardware security measures protecting the KDM. Concerns were raised about the practicality of the system, especially given the manual KDM delivery process and the potential for piracy stemming from insider threats or weak links in the distribution chain. The lack of robust end-to-end encryption and the limited lifespan of KDMs were also pointed out as significant security flaws. Some commenters drew parallels to other DRM systems, noting their eventual failure and suggesting that a similar fate might await DCP. Others discussed the challenges of updating the system due to the substantial investment in existing hardware. The feasibility and implications of using blockchain for key distribution were also briefly explored.
The blog post "AES and ChaCha" compares two popular symmetric encryption algorithms, highlighting ChaCha's simplicity and speed advantages, particularly in software implementations and resource-constrained environments. While AES, the Advanced Encryption Standard, is widely adopted and hardware-accelerated, its complex structure makes it more challenging to implement securely in software. ChaCha, designed with software in mind, offers easier implementation, potentially leading to fewer vulnerabilities. The post concludes that while both algorithms are considered secure, ChaCha's streamlined design and performance benefits make it a compelling alternative to AES, especially in situations where hardware acceleration isn't available or software implementation is paramount.
HN commenters generally praised the article for its clear and concise explanation of ChaCha and AES, particularly appreciating the accessible language and lack of jargon. Some discussed the practical implications of choosing one cipher over the other, highlighting ChaCha's performance advantages on devices lacking AES hardware acceleration and its resistance to timing attacks. Others pointed out that while simplicity is desirable, security and correctness are paramount in cryptography, emphasizing the rigorous scrutiny both ciphers have undergone. A few commenters delved into more technical aspects, such as the internal workings of the algorithms and the role of different cipher modes. One commenter offered a cautionary note, reminding readers that even well-regarded ciphers can be vulnerable if implemented incorrectly.
Researchers have demonstrated a method for cracking the Akira ransomware's encryption using sixteen RTX 4090 GPUs. By exploiting a vulnerability in Akira's implementation of the ChaCha20 encryption algorithm, they were able to brute-force the 256-bit encryption key in approximately ten hours. This breakthrough signifies a potential weakness in the ransomware and offers a possible recovery route for victims, though the required hardware is expensive and not readily accessible to most. The attack relies on Akira's flawed use of a 16-byte (128-bit) nonce, effectively reducing the key space and making it susceptible to this brute-force approach.
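The write-up's exact attack isn't reproduced here, but attacks of this shape are essentially known-plaintext searches over a shrunken keyspace: enumerate the small set of possible key-derivation inputs and test each candidate against a file whose first bytes are predictable. The sketch below is deliberately simplified, with a hypothetical seed-based key derivation standing in for whatever implementation flaw is actually exploited:

```python
# Deliberately simplified illustration of a reduced-keyspace, known-plaintext search.
# The "flaw" here (key derived from a small integer seed) is hypothetical and far
# simpler than anything in the actual Akira analysis.
import hashlib
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

MAGIC = b"%PDF-1.7"                        # predictable first bytes of an encrypted file

def derive_key(seed: int) -> bytes:
    # Hypothetical weak derivation: 32-byte ChaCha20 key from a small integer seed.
    return hashlib.sha256(seed.to_bytes(8, "big")).digest()

def decrypt_prefix(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    cipher = Cipher(algorithms.ChaCha20(key, nonce), mode=None)
    return cipher.decryptor().update(ciphertext[: len(MAGIC)])

def brute_force(ciphertext, nonce, seed_range):
    for seed in seed_range:                # each candidate is cheap to test
        if decrypt_prefix(derive_key(seed), nonce, ciphertext) == MAGIC:
            return seed                    # seed found, key recovered
    return None

# Demo: "encrypt" with a secret seed, then recover it by search.
nonce = bytes(16)                          # this ChaCha20 API takes a 16-byte nonce
secret_seed = 1_337_042
encryptor = Cipher(algorithms.ChaCha20(derive_key(secret_seed), nonce), mode=None).encryptor()
ciphertext = encryptor.update(MAGIC + b" ...rest of file...")
print(brute_force(ciphertext, nonce, range(1_300_000, 1_400_000)))   # 1337042
```

On GPUs the same loop is simply run massively in parallel, which is where the RTX 4090s come in.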
Hacker News commenters discuss the practicality and implications of using RTX 4090 GPUs to crack Akira ransomware. Some express skepticism about the real-world applicability, pointing out that the specific vulnerability exploited in the article is likely already patched and that criminals will adapt. Others highlight the increasing importance of strong, long passwords given the demonstrated power of brute-force attacks with readily available hardware. The cost-benefit analysis of such attacks is debated, with some suggesting the expense of the hardware may be prohibitive for many victims, while others counter that high-value targets could justify the cost. A few commenters also note the ethical considerations of making such cracking tools publicly available. Finally, some discuss the broader implications for password security and the need for stronger encryption methods in the future.
The blog post details a successful effort to decrypt files encrypted by the Akira ransomware, specifically the Linux/ESXi variant from 2024. The author achieved this by leveraging the power of multiple GPUs to significantly accelerate the brute-force cracking of the encryption key. The post outlines the process, which involved analyzing the ransomware's encryption scheme, identifying a weakness in its key generation (a 15-character password), and then using Hashcat with a custom mask attack on the GPUs to recover the decryption key. This allowed for the successful decryption of the encrypted files, offering a potential solution for victims of this particular Akira variant without paying the ransom.
Several Hacker News commenters expressed skepticism about the practicality of the decryption method described in the linked article. Some doubted the claimed 30-minute decryption time with eight GPUs, suggesting it would likely take significantly longer, especially given the variance in GPU performance. Others questioned the cost-effectiveness of renting such GPU power, pointing out that it might exceed the ransom demand, particularly for individuals. The overall sentiment leaned towards prevention being a better strategy than relying on this computationally intensive decryption method. A few users also highlighted the importance of regular backups and offline storage as a primary defense against ransomware.
Apple is reportedly planning to add support for encrypted Rich Communication Services (RCS) messaging between iPhones and Android devices. This means messages, photos, and videos sent between the two platforms will be end-to-end encrypted, providing significantly more privacy and security than the current SMS/MMS system. While no official timeline has been given, the implementation appears to be dependent on Google updating its Messages app to support encryption for group chats. This move would finally bring a modern, secure messaging experience to cross-platform communication, replacing the outdated SMS standard.
Hacker News commenters generally expressed skepticism about Apple's purported move towards supporting encrypted RCS messaging. Several doubted Apple's sincerity, suggesting it's a PR move to deflect criticism about iMessage lock-in, rather than a genuine commitment to interoperability. Some pointed out that Apple benefits from the "green bubble" effect, which pressures users to stay within the Apple ecosystem. Others questioned the technical details of Apple's implementation, highlighting the complexities of key management and potential vulnerabilities. A few commenters welcomed the move, though with reservations, hoping it's a genuine step toward better cross-platform messaging. Overall, the sentiment leaned towards cautious pessimism, with many anticipating further "Apple-style" limitations and caveats in their RCS implementation.
NIST has chosen HQC (Hamming Quasi-Cyclic) as the fifth and final public-key encryption algorithm to standardize for post-quantum cryptography. HQC, based on code-based cryptography, offers small public key and ciphertext sizes, making it suitable for resource-constrained environments. This selection concludes NIST's multi-year effort to standardize quantum-resistant algorithms, adding HQC alongside the previously announced CRYSTALS-Kyber for general encryption and CRYSTALS-Dilithium, FALCON, and SPHINCS+ for digital signatures. These algorithms are designed to withstand attacks from both classical and quantum computers, ensuring long-term security in a future with widespread quantum computing capabilities.
HN commenters discuss NIST's selection of HQC, expressing surprise and skepticism. Several highlight HQC's vulnerability to side-channel attacks and question its suitability despite its speed advantages. Some suggest SPHINCS+ as a more robust, albeit slower, alternative. Others note the practical implications of the selection, including the need for hybrid approaches and the potential impact on existing systems. The relatively small key and ciphertext sizes of HQC are also mentioned as positive attributes. A few commenters delve into the technical details of HQC and its underlying mathematical principles. Overall, the sentiment leans towards cautious interest in HQC, acknowledging its strengths while emphasizing its vulnerabilities.
The UK's National Cyber Security Centre (NCSC), along with GCHQ, quietly removed official advice recommending the use of Apple's device encryption for protecting sensitive information. While no official explanation was given, the change coincides with the UK government's ongoing push for legislation enabling access to encrypted communications, suggesting a conflict between promoting security best practices and pursuing surveillance capabilities. This removal raises concerns about the government's commitment to strong encryption and the potential chilling effect on individuals and organizations relying on such advice for data protection.
HN commenters discuss the UK government's removal of advice recommending Apple's encryption, speculating on the reasons. Some suggest it's due to Apple's upcoming changes to client-side scanning (now abandoned), fearing it weakens end-to-end encryption. Others point to the Online Safety Bill, which could mandate scanning of encrypted messages, making previous recommendations untenable. A few posit the change is related to legal challenges or simply outdated advice, with Apple no longer being the sole provider of strong encryption. The overall sentiment expresses concern and distrust towards the government's motives, with many suspecting a push towards weakening encryption for surveillance purposes. Some also criticize the lack of transparency surrounding the change.
Apple is challenging a UK court order demanding they create a "backdoor" into an encrypted iPhone belonging to a suspected terrorist. They argue that complying would compromise the security of all their devices and set a dangerous precedent globally, potentially forcing them to create similar backdoors for other governments. Apple claims the Investigatory Powers Act, under which the order was issued, doesn't authorize such demands and violates their human rights. They're seeking judicial review of the order, arguing existing tools are sufficient for the investigation.
HN commenters are largely skeptical of Apple's claims, pointing out that Apple already complies with lawful intercept requests in other countries and questioning whether this case is truly about a "backdoor" or simply about the scope and process of existing surveillance capabilities. Some suspect Apple is using this lawsuit as a PR move to bolster its privacy image, especially given the lack of technical details provided. Others suggest Apple is trying to establish legal precedent to push back against increasing government surveillance overreach. A few commenters express concern over the UK's Investigatory Powers Act and its implications for privacy and security. Several highlight the inherent conflict between national security and individual privacy, with no easy answers in sight. There's also discussion about the technical feasibility and potential risks of implementing such a system, including the possibility of it being exploited by malicious actors.
Delta Chat is a free and open-source messaging app that leverages existing email infrastructure for communication. Instead of relying on centralized servers, messages are sent and received as encrypted emails, ensuring end-to-end encryption through automatic PGP key management. This means users can communicate securely using their existing email addresses and providers, without needing to create new accounts or convince contacts to join a specific platform. Delta Chat offers a familiar chat interface with features like group chats, file sharing, and voice messages, all while maintaining the decentralized and private nature of email communication. Essentially, it transforms email into a modern messaging experience without compromising user control or security.
Hacker News commenters generally expressed interest in Delta Chat's approach to secure messaging by leveraging existing email infrastructure. Some praised its simplicity and ease of use, particularly for non-technical users, highlighting the lack of needing to manage separate accounts or convince contacts to join a new platform. Several users discussed potential downsides, including metadata leakage inherent in the email protocol and the potential for spam. The reliance on Autocrypt for key exchange was also a point of discussion, with some expressing concerns about its discoverability and broader adoption. A few commenters mentioned alternative projects with similar aims, like Briar and Status. Overall, the sentiment leaned towards cautious optimism, acknowledging Delta Chat's unique advantages while recognizing the challenges of building a secure messaging system on top of email.
The post "Learn How to Break AES" details a hands-on educational tool for exploring vulnerabilities in simplified versions of the AES block cipher. It provides a series of interactive challenges where users can experiment with various attack techniques, like differential and linear cryptanalysis, against weakened AES implementations. By manipulating parameters like the number of rounds and key size, users can observe how these changes affect the cipher's security and practice applying cryptanalytic methods to recover the encryption key. The tool aims to demystify advanced cryptanalysis concepts by providing a visual and interactive learning experience, allowing users to understand the underlying principles of these attacks and the importance of a full-strength AES implementation.
HN commenters discuss the practicality and limitations of the "block breaker" attack described in the article. Some express skepticism, pointing out that the attack requires specific circumstances and doesn't represent a practical break of AES. Others highlight the importance of proper key derivation and randomness, reinforcing that the attack exploits weaknesses in implementation rather than the AES algorithm itself. Several comments delve into the technical details, discussing the difference between a chosen-plaintext attack and a known-plaintext attack, as well as the specific conditions under which the attack could be successful. The overall consensus seems to be that while interesting, the "block breaker" is not a significant threat to AES security when implemented correctly. Some appreciate the visualization and explanation provided by the article, finding it helpful for understanding block cipher vulnerabilities in general.
Apple has removed its iCloud Advanced Data Protection feature, which offers end-to-end encryption for almost all iCloud data, from its beta software in the UK. This follows reported concerns from the UK's National Cyber Security Centre (NCSC) that the enhanced security measures would hinder law enforcement's ability to access data for investigations. Apple maintains that the feature will be available to UK users eventually, but hasn't provided a clear timeline for its reintroduction. While the feature remains available in other countries, this move raises questions about the balance between privacy and government access to data.
HN commenters largely agree that Apple's decision to pull its child safety features, specifically the client-side scanning of photos, is a positive outcome. Some believe Apple was pressured by the UK government's proposed changes to the Investigatory Powers Act, which would compel companies to disable security features if deemed a national security risk. Others suggest Apple abandoned the plan due to widespread criticism and technical challenges. A few express disappointment, feeling the feature had potential if implemented carefully, and worry about the implications for future child safety initiatives. The prevalence of false positives and the potential for governments to abuse the system were cited as major concerns. Some skepticism towards the UK government's motivations is also evident.
Bipartisan U.S. lawmakers are expressing concern over a proposed U.K. surveillance law that would compel tech companies like Apple to compromise the security of their encrypted messaging systems. They argue that creating a "back door" for U.K. law enforcement would weaken security globally, putting Americans' data at risk and setting a dangerous precedent for other countries to demand similar access. This, they claim, would ultimately undermine encryption, a crucial tool for protecting sensitive information from criminals and hostile governments, and empower authoritarian regimes.
HN commenters are skeptical of the "threat to Americans" angle, pointing out that the UK and US already share significant intelligence data, and that a UK backdoor would likely be accessible to the US as well. Some suggest the real issue is Apple resisting government access to data, and that the article frames this as a UK vs. US issue to garner more attention. Others question the technical feasibility and security implications of such a backdoor, arguing it would create a significant vulnerability exploitable by malicious actors. Several highlight the hypocrisy of US lawmakers complaining about a UK backdoor while simultaneously pushing for similar capabilities themselves. Finally, some commenters express broader concerns about the erosion of privacy and the increasing surveillance powers of governments.
The UK government is pushing for a new law, the Investigatory Powers Act, that would compel tech companies like Apple to remove security features, including end-to-end encryption, if deemed necessary for national security investigations. This would effectively create a backdoor, allowing government access to user data without their knowledge or consent. Apple argues that this undermines user privacy and security, making everyone more vulnerable to hackers and authoritarian regimes. The law faces strong opposition from privacy advocates and tech experts who warn of its potential for abuse and chilling effects on free speech.
HN commenters express skepticism about the UK government's claims regarding the necessity of this order for national security, with several pointing out the hypocrisy of demanding backdoors while simultaneously promoting end-to-end encryption for their own communications. Some suggest this move is a dangerous precedent that could embolden other authoritarian regimes. Technical feasibility is also questioned, with some arguing that creating such a backdoor is impossible without compromising security for everyone. Others discuss the potential legal challenges Apple might pursue and the broader implications for user privacy globally. A few commenters raise concerns about the chilling effect this could have on whistleblowers and journalists.
The blog post details how the author lost access to a BitLocker-encrypted drive after a Secure Boot policy change, even with the correct password. The TPM chip, which seals the key needed to unlock the volume, treated the modified Secure Boot state as a potential security breach and refused to release it. This highlighted the risk of relying solely on the TPM to unlock BitLocker, especially when dual-booting or making system configuration changes. The author emphasizes the importance of backing up the recovery key somewhere independent of the machine, as recovery through Microsoft's account proved difficult and unhelpful in this specific scenario. Ultimately, the data remained inaccessible despite possessing the password and knowing the modifications made to the system.
HN commenters generally concur with the article's premise that relying solely on BitLocker without additional security measures like a TPM or Secure Boot can be risky. Several point out how easy it is to modify boot order or boot from external media to bypass BitLocker, effectively rendering it useless against a physically present attacker. Some commenters discuss alternative full-disk encryption solutions like Veracrypt, emphasizing its open-source nature and stronger security features. The discussion also touches upon the importance of pre-boot authentication, the limitations of relying solely on software-based security, and the practical considerations for different threat models. A few commenters share personal anecdotes of BitLocker failures or vulnerabilities they've encountered, further reinforcing the author's points. Overall, the prevailing sentiment suggests a healthy skepticism towards BitLocker's security when used without supporting hardware protections.
iOS 18 introduces homomorphic encryption for some Siri features, allowing encrypted requests to be processed without being decrypted first. This enhances privacy by preventing Apple from accessing the raw audio data. Specifically, it uses a homomorphic encryption scheme to transform audio into a numerical representation amenable to encrypted computation; Apple's servers operate on that encrypted representation and return an encrypted Siri response, which is decrypted on the device and delivered to the user. While promising improved privacy, the post raises concerns about potential performance impacts and the specific details of the implementation, which Apple hasn't fully disclosed.
Hacker News users discussed the practical implications and limitations of homomorphic encryption in iOS 18. Several commenters expressed skepticism about Apple's actual implementation and its effectiveness, questioning whether it's fully homomorphic encryption or a more limited form. Performance overhead and restricted use cases were also highlighted as potential drawbacks. Some pointed out that the touted benefits, like encrypted search and image classification, might be achievable with existing techniques, raising doubts about the necessity of homomorphic encryption for these tasks. A few users noted the potential security benefits, particularly regarding protecting user data from cloud providers, but the overall sentiment leaned towards cautious optimism pending further details and independent analysis. Some commenters linked to additional resources explaining the complexities and current state of homomorphic encryption research.
Summary of Comments (23)
https://news.ycombinator.com/item?id=44145202
HN users generally praised the project for its cleverness and simplicity, viewing it as a fun and robust offline backup method. Some discussed the practicality, pointing out limitations like the 255-bit key size being smaller than modern standards. Others suggested improvements such as using a different encoding scheme for greater density or incorporating error correction. Durability of the cards was also a topic, with users considering lamination or metal stamping for longevity. The overall sentiment was positive, appreciating the project as a novel approach to cold storage.
The Hacker News post titled "Show HN: PunchCard Key Backup" generated a moderate discussion with several interesting comments. Many commenters expressed appreciation for the novelty and physicality of the punchcard backup system, contrasting it with the more abstract and digital nature of typical key backup methods.
One commenter highlighted the advantage of this system being resistant to electromagnetic pulses (EMPs), a concern for some individuals preparing for disaster scenarios. They further elaborated on the potential longevity of punchcards, pointing out their durability and resistance to data degradation over time compared to electronic storage media. Another commenter echoed this sentiment, emphasizing the robustness and simplicity of the punchcard approach.
Several commenters discussed the practicality of the system. One questioned the number of keys that could be reasonably stored on a punchcard, while another suggested potential improvements like using a more robust material than card stock for the punchcards. The discussion also touched upon the potential for errors during the punching process and the possibility of developing tools to assist with accurate punching.
One user jokingly compared the method to storing secrets on bananas, poking fun at the idea of unconventional storage media while acknowledging the cleverness of the punchcard concept.
Some commenters explored the historical context of punchcards, drawing parallels to their use in early computing. One mentioned the potential for using existing punchcard readers to interface with the backup system, bridging the gap between this modern application and its historical roots.
The security aspect was also addressed. A commenter raised the concern that punchcards might not be as secure as other backup methods if not stored carefully, as they are visually decipherable. This led to a discussion about the importance of physical security in any backup strategy, regardless of the medium.
Overall, the comments reflected a mixture of amusement, appreciation for the ingenuity, and practical considerations regarding the punchcard key backup system. The discussion highlighted the trade-offs between simplicity, durability, security, and practicality inherent in this unconventional approach.