The IEEE Spectrum article explores do-it-yourself methods to combat cybersickness, the nausea and disorientation experienced in virtual reality. It highlights the mismatch between visual and vestibular (inner ear) cues as the root cause. Suggested remedies include matching in-game movements with real-world actions, adjusting the field of view, reducing latency, stabilizing the horizon, and taking breaks. The article also discusses software solutions like reducing peripheral vision and adding a fixed nose point, as well as physical aids like ginger and wristbands that stimulate the P6 acupressure point. While scientific backing for some methods is limited, the article offers a range of potential solutions for users to experiment with and find what works best for them.
John Carmack's talk at Upper Bound 2025 focused on the complexities of AGI development. He highlighted the immense challenge of bridging the gap between current AI capabilities and true general intelligence, emphasizing the need for new conceptual breakthroughs rather than just scaling existing models. Carmack expressed concern over the tendency to overestimate short-term progress while underestimating long-term challenges, advocating for a more realistic approach to AGI research. He also discussed potential risks associated with increasingly powerful AI systems.
HN users discuss John Carmack's 2012 talk on "Independent Game Development." Several commenters reminisce about Carmack's influence and clear communication style. Some highlight his emphasis on optimization and low-level programming as key to achieving performance, particularly in resource-constrained environments like mobile at the time. Others note his advocacy for smaller, focused teams and "lean methodologies," contrasting it with the bloat they perceive in modern game development. A few commenters mention specific technical insights they gleaned from Carmack's talks or express disappointment that similar direct, technical presentations are less common today. One user questions whether Carmack's approach is still relevant given advancements in hardware and tools, sparking a debate about the enduring value of optimization and the trade-offs between performance and developer time.
WorldGen is an open-source Python library for procedurally generating 3D scenes. It aims to be versatile, supporting various use cases like game development, VR/XR experiences, and synthetic data generation. Users define scenes declaratively using a YAML configuration file, specifying elements like objects, materials, lighting, and camera placement. WorldGen boasts a modular and extensible design, allowing for the integration of custom object generators and modifiers. It leverages Blender as its rendering backend, exporting scenes in common 3D formats.
Hacker News users generally praised WorldGen's potential and its open-source nature, viewing it as a valuable tool for game developers, especially beginners or those working on smaller projects. Some expressed excitement about the possibilities for procedural generation and the ability to create diverse and expansive 3D environments. Several commenters highlighted specific features they found impressive, such as the customizable parameters, real-time editing, and export compatibility with popular game engines like Unity and Unreal Engine. A few users questioned the performance with large and complex scenes, and some discussed potential improvements, like adding more biomes or improving the terrain generation algorithms. Overall, the reception was positive, with many eager to experiment with the tool.
This pull request introduces initial support for Apple's visionOS platform in the Godot Engine. It adds a new build target enabling developers to create and export Godot projects specifically for visionOS headsets. The implementation leverages the existing XR interface and builds upon the macOS platform support, allowing developers to reuse existing XR projects and code with minimal modifications. This preliminary support focuses on enabling core functionality and rendering on the device, paving the way for more comprehensive visionOS features in future updates.
Hacker News users generally expressed excitement about Godot's upcoming native visionOS support, viewing it as a significant step forward for the engine and potentially a game-changer for VR/AR development. Several commenters praised Godot's open-source nature and its commitment to cross-platform compatibility. Some discussed the potential for new types of games and experiences enabled by visionOS and the ease with which existing Godot projects could be ported. A few users raised questions about Apple's closed ecosystem and its potential impact on the openness of Godot's implementation. The implications of Apple's developer fees and App Store policies were also briefly touched upon.
Researchers at Nagoya University have found that a broadband sound known as "pink noise" can reduce motion sickness symptoms. In a driving simulator experiment, participants exposed to pink noise experienced significantly less severe symptoms compared to those who listened to no sound or white noise. The study suggests that pink noise may suppress the conflict between visual and vestibular sensory information, which is believed to be the primary cause of motion sickness. This discovery could lead to new non-invasive methods for alleviating motion sickness in various situations, such as in vehicles or virtual reality environments.
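For context on the stimulus: pink noise is broadband sound with roughly equal power per octave (a 1/f power spectrum). The study's exact signal is not described here, so as a generic illustration only, the classic Voss-McCartney approximation can be sketched in Python:

```python
import random

def pink_noise(n, rows=16):
    """Voss-McCartney pink (1/f) noise approximation: sum several
    white-noise rows, each row updated half as often as the previous."""
    values = [random.uniform(-1.0, 1.0) for _ in range(rows)]
    out = []
    for i in range(1, n + 1):
        # index of the lowest set bit of the counter picks which row
        # to refresh, so row k changes every 2**(k+1) samples
        k = (i & -i).bit_length() - 1
        if k < rows:
            values[k] = random.uniform(-1.0, 1.0)
        out.append(sum(values) / rows)
    return out
```

Averaging the rows keeps every sample in [-1, 1]; slower-changing rows contribute the low-frequency energy that distinguishes pink noise from white noise.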
Hacker News users discuss the study with some skepticism, questioning the small sample size (17 participants) and lack of a placebo control. Several commenters express interest in the potential mechanism, wondering if the sound masks disturbing inner ear signals or if it simply provides a distraction. The specific frequency (100 Hz) is noted, with speculation about its potential connection to bodily rhythms. Some users share personal anecdotes of using other sensory inputs like ginger or focusing on the horizon to combat motion sickness, while others mention existing solutions like scopolamine patches and wristbands that provide acupressure. A few commenters request more information about the nature of the sound, questioning if it's a pure tone or something more complex. Overall, the comments express a cautious optimism tempered by the need for more rigorous research.
Google Cloud's Immersive Stream for XR and other AI technologies are powering Sphere's upcoming "The Wizard of Oz" experience. This interactive exhibit lets visitors step into the world of Oz through a custom-built spherical stage with 100 million pixels of projected video, spatial audio, and interactive elements. AI played a crucial role in creating the experience, from generating realistic environments and populating them with detailed characters to enabling real-time interactions like affecting the weather within the virtual world. This combination of technology and storytelling aims to offer a uniquely immersive and personalized journey down the yellow brick road.
HN commenters were largely unimpressed with Google's "Wizard of Oz" tech demo. Several pointed out the irony of using an army of humans to create the illusion of advanced AI, calling it a glorified Mechanical Turk setup. Some questioned the long-term viability and scalability of this approach, especially given the high labor costs. Others criticized the lack of genuine innovation, suggesting that the underlying technology isn't significantly different from existing chatbot frameworks. A few expressed mild interest in the potential applications, but the overall sentiment was skepticism about the project's significance and Google's marketing spin.
This post introduces rotors as a practical alternative to quaternions and matrices for 3D rotations. It explains that rotors, like quaternions, represent rotations as a single action around an arbitrary axis, but offer a simpler, more intuitive geometric interpretation based on the concept of "geometric algebra." The author argues that rotors are easier to understand and implement, visually demonstrating their geometric meaning and providing clear code examples in Python. The post covers basic rotor operations like creating rotations from an axis and angle, composing rotations, and applying rotations to vectors, highlighting rotors' computational efficiency and stability.
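To illustrate the operations the post covers (building a rotor from an axis and angle, composing rotors, and applying them to vectors), here is a minimal self-contained Python sketch. It is not the post's own code: it uses a bitmask-based geometric product, and all names are invented for illustration.

```python
import math
from collections import defaultdict

# Basis blades as bitmasks: e1=0b001, e2=0b010, e3=0b100,
# e12=0b011, e13=0b101, e23=0b110, e123=0b111.

def reorder_sign(a, b):
    """Sign from reordering the product of blades a and b into
    canonical order (Euclidean metric: basis vectors square to +1)."""
    a >>= 1
    swaps = 0
    while a:
        swaps += bin(a & b).count("1")
        a >>= 1
    return -1.0 if swaps & 1 else 1.0

def gp(x, y):
    """Geometric product of two multivectors (dicts: blade bitmask -> coeff)."""
    out = defaultdict(float)
    for ba, ca in x.items():
        for bb, cb in y.items():
            out[ba ^ bb] += reorder_sign(ba, bb) * ca * cb
    return dict(out)

def reverse(x):
    """Reversion ~x: flips the sign of grade-2 and grade-3 parts."""
    return {b: (c if bin(b).count("1") % 4 in (0, 1) else -c)
            for b, c in x.items()}

def rotor(axis, angle):
    """Rotor R = cos(angle/2) - sin(angle/2) * B, where B is the unit
    bivector dual to the (unit) rotation axis."""
    nx, ny, nz = axis
    s = math.sin(angle / 2.0)
    # axis * e123 maps: e1 -> e23, e2 -> -e13, e3 -> e12
    return {0b000: math.cos(angle / 2.0),
            0b110: -s * nx, 0b101: s * ny, 0b011: -s * nz}

def rotate(v, R):
    """Apply a rotor via the sandwich product: v' = R v ~R."""
    vec = {0b001: v[0], 0b010: v[1], 0b100: v[2]}
    out = gp(gp(R, vec), reverse(R))
    return (out.get(0b001, 0.0), out.get(0b010, 0.0), out.get(0b100, 0.0))
```

Composing rotations is just the geometric product of rotors (`gp(R2, R1)` applies `R1` first), which is part of what makes chaining and interpolating rotations convenient in this representation.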
Hacker News users discussed the practicality and intuitiveness of using rotors for 3D rotations. Some found the rotor approach more elegant and easier to grasp than quaternions, especially appreciating the clear geometric interpretation and connection to bivectors. Others questioned the claimed advantages, arguing that quaternions remain the superior choice for performance and established library support. The potential benefits of rotors in areas like interpolation and avoiding gimbal lock were acknowledged, but some commenters felt the article didn't fully demonstrate these advantages convincingly. A few requested more comparative benchmarks or examples showcasing rotors' practical superiority in specific scenarios. The lack of widespread adoption and existing tooling for rotors was also raised as a barrier to entry.
Thomas Kole's project offers a 3D reconstruction of Tenochtitlan, the capital of the Aztec empire, circa 1519. Built using Blender, the model aims for historical accuracy based on archaeological data, historical accounts, and codices. The interactive website allows users to explore the city, featuring key landmarks like the Templo Mayor, palaces, canals, and causeways, offering a vivid visualization of this pre-Columbian metropolis. While still a work in progress, the project strives to present a detailed and immersive experience of what Tenochtitlan may have looked like before the Spanish conquest.
HN users largely praised the 3D reconstruction of Tenochtitlan, calling it "beautiful," "amazing," and "impressive" work. Several commenters pointed out the value of such visualizations for understanding history and engaging with the past in a more immersive way. Some discussed the technical aspects of the project, inquiring about the software used and the challenges of creating such a detailed model. Others expressed interest in similar reconstructions of other historical cities, like Constantinople or Rome. A few commenters also delved into the historical context, discussing the Aztec empire, its conquest by the Spanish, and the modern-day location of Tenochtitlan beneath Mexico City. One commenter questioned the accuracy of certain details in the reconstruction, prompting a discussion about the available historical evidence and the inherent limitations of such projects.
This blog post explores how video games can induce motion sickness and offers developers practical advice for mitigating it. The author explains how conflicting sensory information between visual motion and the vestibular system creates motion sickness, highlighting common culprits like field of view, camera acceleration, and head bob. The post advocates for robust accessibility options, suggesting features such as adjustable FOV, camera smoothing, disabling head bob, and providing comfort settings presets. By incorporating these considerations, developers can create more inclusive gaming experiences for players susceptible to motion sickness.
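To make those suggestions concrete, here is one way such comfort options might look in code. This is a hedged sketch, not the post's implementation: `ComfortSettings` and its default values are invented, and the smoothing step is standard frame-rate-independent exponential smoothing (angle wrap-around at 360 degrees is ignored for brevity).

```python
import math
from dataclasses import dataclass

@dataclass
class ComfortSettings:
    """Hypothetical per-player comfort preset."""
    fov_degrees: float = 90.0           # adjustable field of view
    head_bob_enabled: bool = False      # off by default for comfort
    camera_smoothing_tau: float = 0.08  # seconds; 0 disables smoothing

def smooth_camera_angle(current, target, tau, dt):
    """Move the camera angle toward the target with exponential
    smoothing that behaves the same at any frame rate."""
    if tau <= 0.0:
        return target  # snap immediately when smoothing is disabled
    alpha = 1.0 - math.exp(-dt / tau)
    return current + (target - current) * alpha
```

Because the blend factor depends on `dt`, the camera settles at the same speed whether the game runs at 30 or 144 fps, avoiding the jerky, frame-rate-dependent motion that can trigger sickness.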
HN commenters largely agree that motion sickness in games is a significant accessibility issue, with several sharing personal experiences of being unable to play certain games due to it. Some suggest that developers often prioritize visual fidelity over comfort, neglecting those susceptible to motion sickness. Several commenters offer specific technical suggestions for mitigating the problem, including adjustable FOV, head bob reduction, and implementing "comfort modes" with features like vignette filters. A few mention that the prevalence of first-person perspective in modern games exacerbates the issue and highlight the need for more third-person options or improved camera controls. There's also discussion around the physiological basis of motion sickness and the varying susceptibility among individuals. One commenter suggests that VR sickness and game motion sickness are distinct experiences with different triggers.
Karl Guttag analyzes the newly announced "Halliday" AR glasses, skeptical of their claimed capabilities. He argues that the demonstrated "AI features," like real-time language translation and object recognition, are likely pre-programmed demos rather than genuine artificial intelligence. Guttag points to the lack of specific technical details, the reliance on pre-recorded videos, and the improbable battery life as evidence. He concludes that the Halliday glasses, while potentially impressive AR technology, are almost certainly overselling their AI integration: more likely sophisticated augmented reality glasses than AI-powered ones.
HN commenters discuss the practicality and potential invasiveness of the Halliday glasses. Several express skepticism about the claimed battery life, especially given the purported onboard processing power. Others question the usefulness of constant AR overlays and raise privacy concerns related to facial recognition and data collection. Some suggest alternative approaches, like bone conduction audio and smaller, simpler displays for notifications. The closed-source nature of the project also draws criticism, with some arguing it limits community development and fosters distrust. Finally, the high price point is mentioned as a significant barrier to entry.
The Vatican's website offers a free, immersive digital experience of St. Peter's Basilica. Users can explore high-resolution 360° panoramic views of both the Basilica's interior and exterior, including spaces not typically accessible to the public. This virtual tour allows detailed examination of the art, architecture, and religious significance of the Basilica, providing a rich and engaging experience for anyone interested in experiencing this iconic landmark from anywhere in the world.
HN commenters generally found the Vatican's digital twin of St. Peter's Basilica underwhelming. Several criticized the low resolution and poor quality of the 3D model, especially given the readily available high-resolution scans and photographic data. Others noted the lack of interactivity and limited navigation, comparing it unfavorably to other virtual museum experiences. Some suggested the project seemed rushed and poorly executed, speculating about potential internal politics or technical limitations at play. A few commenters expressed interest in a higher-fidelity version, but the prevailing sentiment was disappointment with the current offering.
Summary of Comments (6)
https://news.ycombinator.com/item?id=44080840
HN commenters generally agree that cybersickness is a real and sometimes debilitating issue. Several suggest physical remedies like ginger or Dramamine, while others focus on software and hardware solutions. A common thread is matching the in-game FOV to the user's real-world peripheral vision, and minimizing latency. Some users have found success with specific VR games or headsets that prioritize these factors. A few commenters mention the potential for VR sickness to lessen with continued exposure, a sort of "VR legs" phenomenon, but there's disagreement on its effectiveness. Overall, the discussion highlights a variety of potential solutions, from simple home remedies to more technical approaches.
The Hacker News post "DIY Cybersickness Remedies," which links to an IEEE Spectrum article on the same topic, has generated a moderate discussion with several insightful comments.
Many commenters share their personal experiences and remedies for cybersickness. One compelling comment thread discusses the effectiveness of ginger, a common remedy for motion sickness, in alleviating cybersickness symptoms. Some users report its efficacy, while others find it less helpful, highlighting the subjective nature of the condition and its remedies.
Another commenter points out the crucial role of the user interface in inducing or mitigating cybersickness. They argue that poorly designed interfaces, especially those with rapid, unexpected movements or a disconnect between visual and physical motion cues, are major contributors to the problem. This suggests that focusing on better UI/UX design in VR and other immersive technologies could be a key preventative measure.
A few commenters discuss the phenomenon of "VR legs," the feeling of instability or disorientation after extended VR use, which is distinct from but related to cybersickness. This highlights the broader challenges of adapting to virtual environments and the need for further research into the physiological and psychological effects of immersive technologies.
Several users mention specific hardware and software solutions, including higher refresh rates, lower latency, and field-of-view adjustments, as factors that can influence cybersickness. This emphasizes the ongoing technological evolution in VR and the potential for future advancements to minimize these issues.
The discussion also touches upon the potential for adaptation, with some users reporting a decrease in cybersickness symptoms with continued VR use. However, others note persistent sensitivity, suggesting individual variation in susceptibility.
Finally, a few commenters mention the potential for biofeedback and other training methods to help users manage cybersickness, offering a more proactive approach to tackling the problem.
Overall, the comments section offers a valuable collection of anecdotal evidence, practical tips, and insightful observations about the causes, effects, and potential solutions for cybersickness, reflecting the ongoing challenges and evolving understanding of this issue in the context of emerging immersive technologies.