Entropy, in the context of information theory, quantifies uncertainty. A high-entropy system, like a fair coin flip, is unpredictable, as all outcomes are equally likely. A low-entropy system, like a heavily weighted coin that nearly always lands heads, is highly predictable. This uncertainty is measured in bits, representing the minimum average number of yes/no questions needed to determine the outcome. Entropy also relates to compressibility: high-entropy data is difficult to compress because it lacks predictable patterns, while low-entropy data, with its inherent redundancy, can be compressed significantly. Ultimately, entropy provides a fundamental way to measure information content and randomness within a system.
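A minimal sketch of the "bits" interpretation (the function name and the probabilities are my own illustration, not from the article), computing Shannon entropy for a fair versus a heavily weighted coin:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit, i.e. one yes/no question per flip.
print(shannon_entropy([0.5, 0.5]))    # 1.0

# A heavily weighted coin is highly predictable, so each flip carries little information.
print(shannon_entropy([0.99, 0.01]))  # ~0.081

# Entropy also bounds lossless compression: n fair flips need about n bits,
# while n weighted flips can be compressed to roughly 0.081 * n bits on average.
```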
Building a jet engine is incredibly difficult due to the extreme conditions and tight tolerances involved. The core operates at temperatures exceeding the melting point of its components, requiring advanced materials, intricate cooling systems, and precise manufacturing. Furthermore, the immense speeds and pressures within the engine necessitate incredibly balanced and durable rotating parts. Developing and integrating all these elements, while maintaining efficiency and reliability, presents a massive engineering challenge, requiring extensive testing and specialized knowledge.
Hacker News commenters generally agreed with the article's premise about the difficulty of jet engine manufacturing. Several highlighted the extreme tolerances required, comparing them to the width of a human hair. Some expanded on specific challenges like material science limitations at high temperatures and pressures, the complex interplay of fluid dynamics, thermodynamics, and mechanical engineering, and the rigorous testing and certification process. Others pointed out the geopolitical implications, with only a handful of countries possessing the capability, and discussed the potential for future innovations like 3D printing. A few commenters with relevant experience validated the author's points, adding further details on the intricacies of the manufacturing and maintenance processes. Some discussion also revolved around the contrast between the apparent simplicity of the Brayton cycle versus the actual engineering complexity required for its implementation in a jet engine.
Researchers at the University of Surrey have theoretically demonstrated that two opposing arrows of time can emerge within specific quantum systems. By examining the evolution of entanglement within these systems, they found that while one subsystem experiences time flowing forward as entropy increases, another subsystem can simultaneously experience time flowing backward, with entropy decreasing. This doesn't violate the second law of thermodynamics, as the overall combined system still sees entropy increase. This discovery offers new insights into the foundations of quantum mechanics and its relationship with thermodynamics, particularly in understanding the flow of time at the quantum level.
HN users express skepticism about the press release's interpretation of the research, questioning whether the "two arrows of time" are a genuine phenomenon or simply an artifact of the chosen model. Some suggest the description is sensationalized and oversimplifies complex quantum behavior. Several commenters call for access to the actual paper rather than relying on the university's press release, emphasizing the need to examine the methodology and mathematical framework to understand the true implications of the findings. A few commenters delve into the specifics of microscopic reversibility and entropy, highlighting the challenges in reconciling these concepts with the claims made in the article. There's a general consensus that the headline is attention-grabbing but potentially misleading without deeper analysis of the underlying research.
Classical physics is generally considered deterministic, meaning the future state of a system is entirely determined by its present state. However, certain situations appear non-deterministic due to our practical limitations. These include chaotic systems, where tiny uncertainties in initial conditions are amplified exponentially, making long-term prediction impossible despite the underlying deterministic dynamics. Other examples involve systems with a vast number of particles, like gases, where tracking individual particles is infeasible, leading to statistical descriptions and probabilistic predictions even though the individual particle interactions are deterministic. Finally, every measurement has finite precision, so the initial state can never be pinned down exactly, which again produces apparent non-determinism. Non-determinism in classical physics is therefore usually a result of incomplete knowledge or practical limitations rather than a fundamental property of the theory itself.
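As a rough sketch of this sensitivity to initial conditions, the snippet below iterates the logistic map in its chaotic regime; the map, the parameter r = 4.0, and the starting values are standard textbook choices used here for illustration, not details taken from the article:

```python
# Logistic map x -> r*x*(1-x) with r = 4.0 (chaotic regime): the update rule is
# fully deterministic, yet a tiny error in the initial condition grows exponentially.
r = 4.0
x_a, x_b = 0.2, 0.2 + 1e-9  # two starting points differing by one part in a billion

for step in range(1, 51):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: |difference| = {abs(x_a - x_b):.3e}")

# Within a few dozen iterations the two trajectories are completely decorrelated,
# so any measurement error, however small, ruins long-term prediction.
```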
Hacker News users discuss deterministic chaos and how seemingly simple classical systems can exhibit unpredictable behavior due to sensitivity to initial conditions. They mention examples like the double pendulum, dripping faucets, and billiard balls, highlighting how minute changes in starting conditions lead to vastly different outcomes, making long-term prediction impossible. Some argue that while these systems are technically deterministic, the practical limitations of measurement render them effectively non-deterministic. Others point to the three-body problem and the chaotic nature of weather systems as further illustrations. The role of computational limitations in predicting chaotic systems is also discussed, along with the idea that even if the underlying laws are deterministic, emergent complexity can make systems appear unpredictable. Finally, the philosophical implications of determinism are touched upon, with some suggesting that quantum mechanics introduces true randomness into the universe.
Summary of Comments (102)
https://news.ycombinator.com/item?id=43684560
Hacker News users generally praised the article for its clear explanation of entropy, particularly its focus on the "volume of surprise" and use of visual aids. Some commenters offered alternative analogies or further clarifications, such as relating entropy to the number of microstates corresponding to a macrostate, or explaining its connection to lossless compression. A few pointed out minor perceived issues, like the potential confusion between thermodynamic and information entropy, and questioned the accuracy of describing entropy as "disorder." One commenter suggested a more precise phrasing involving "indistinguishable microstates", while another highlighted the significance of Boltzmann's constant in relating information entropy to physical systems. Overall, the discussion demonstrates a positive reception of the article's attempt to demystify a complex concept.
The Hacker News post "What Is Entropy?" with the URL https://news.ycombinator.com/item?id=43684560 has generated a moderate number of comments discussing various aspects of entropy and the linked article. Several commenters offer alternative explanations or nuances to the concept of entropy.
One commenter argues that entropy is better understood as the "spreading out of energy," emphasizing that organized energy tends to become more dispersed and less useful over time. This commenter clarifies that entropy is not simply disorder but rather a shift towards equilibrium and maximum probability. They use the example of a hot object cooling down in a room, with the heat energy spreading throughout the room until equilibrium is reached.
Another commenter focuses on the statistical nature of entropy, highlighting that a system with higher entropy has more possible microstates corresponding to its macrostate. This means there are more ways for the system to be in that particular macrostate, making it statistically more likely. They use the example of a deck of cards, where a shuffled deck has much higher entropy than a sorted deck because there are vastly more possible arrangements corresponding to a shuffled state.
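A quick back-of-the-envelope calculation (my own illustration, not from the thread) makes the deck-of-cards point concrete: if every ordering of a 52-card deck counts as a distinct microstate, the "shuffled" macrostate covers 52! of them, while a fully sorted deck corresponds to exactly one.

```python
import math

# Number of microstates (orderings) compatible with a "shuffled" 52-card deck.
microstates_shuffled = math.factorial(52)  # ~8.07e67 arrangements
microstates_sorted = 1                     # one specific ordering

# Boltzmann-style entropy in bits: S = log2(W), where W is the microstate count.
print(math.log2(microstates_shuffled))  # ~225.6 bits of "missing information"
print(math.log2(microstates_sorted))    # 0 bits: the sorted state is fully specified
```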
Several commenters discuss the concept of "information entropy" and its relationship to thermodynamic entropy, pointing out similarities and subtle differences. One commenter emphasizes the context-dependent nature of entropy, mentioning how, for example, the entropy of a system can appear to decrease locally while the overall entropy of the universe continues to increase. They use the example of life on Earth, where complex, low-entropy structures are formed despite the increasing entropy of the universe as a whole.
Another thread of discussion revolves around the common misconception of entropy as "disorder," with commenters explaining that this is a simplification and can be misleading. They propose alternative analogies, such as "spread" or "options," to better convey the underlying principle.
A few commenters appreciate the article's clarity and its focus on the statistical interpretation of entropy. They find it a helpful introduction to the concept. However, some also critique the article for not delving into specific applications or more advanced aspects of entropy.
Overall, the comments provide a variety of perspectives and elaborations on the concept of entropy, highlighting its statistical nature, the importance of microstates and macrostates, and the connection between thermodynamic entropy and information entropy. They also address common misconceptions and offer alternative ways to think about this complex concept. While appreciative of the linked article, commenters also point out areas where it could be expanded or clarified.