The blog post "Vpternlog: When three is 100% more than two" explores the confusion surrounding ternary logic's perceived 50% increase in information capacity compared to binary. The author argues that while a ternary digit (trit) can hold three values versus a bit's two, a naive reading of that ratio misstates the actual gain in information. The post delves into the logarithmic nature of information capacity and uses the example of how many bits are needed to represent the same range of values as a given number of trits, demonstrating that the increase in capacity is about 58.5%, since one trit carries log base 2 of 3 (approximately 1.585) bits. The core point is that measuring increases in information capacity requires logarithmic comparison, not simple subtraction or division.
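The logarithmic comparison is easy to verify directly. The following is a minimal Python sketch (an illustration, not code from the post) computing the per-digit capacity gain of a trit over a bit:

```python
import math

# Information capacity of one digit in base b, measured in bits, is log2(b).
bits_per_bit = math.log2(2)    # 1.0
bits_per_trit = math.log2(3)   # ~1.585

# Percentage increase in information capacity per digit, trit vs. bit.
increase = (bits_per_trit - bits_per_bit) / bits_per_bit * 100
print(f"{increase:.1f}% more information per trit")  # ~58.5%
```

This is the same figure the Hacker News commenters cite: the 50% gap in raw value counts becomes roughly a 58.5% gap once capacity is measured logarithmically.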
The blog post "Vpternlog: When three is 100% more than two" delves into a nuanced exploration of percentage calculations and their potential for misinterpretation, particularly when applied to ternary logic in the context of computer science. The author posits that a common misconception arises when comparing binary (two-state) systems to ternary (three-state) systems. Specifically, the erroneous assumption is frequently made that ternary logic offers a 50% increase in capacity or efficiency over binary logic. This assumption stems from the straightforward observation that three is 50% larger than two.
However, the author argues that this simplification overlooks what a percentage-change figure actually measures: a percentage is only meaningful once the quantity being compared is fixed. Measured by raw state count, moving from two states to three adds one state, and one is 50% of two, so the increase is 50%; the "100% more" of the title holds only under a different choice of quantity, which is exactly the ambiguity the post sets out to expose.
Further elaborating on this concept, the author emphasizes that percentages are inherently multiplicative factors, representing changes relative to an initial value. An increase of 50% implies multiplying the original value by 1.5 (1 + 0.5), while an increase of 100% implies multiplying by 2 (1 + 1). In the case of transitioning from two states to three, the multiplication factor is 1.5, which corresponds to a 50% increase, not a 100% one. The author elucidates this point with a mathematical breakdown of the percentage-change formula: [(new value - old value) / old value] * 100%.
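The percentage-change formula quoted above can be sketched in a few lines of Python (an illustration of the formula, not code from the post):

```python
def percent_change(old, new):
    """Percentage change from old to new: [(new - old) / old] * 100."""
    return (new - old) / old * 100

# Two states -> three states: a factor of 1.5, i.e. a 50% increase.
print(percent_change(2, 3))  # 50.0
# A 100% increase means doubling: a factor of 2.0.
print(percent_change(2, 4))  # 100.0
```

The two calls make the post's distinction concrete: the factor 1.5 corresponds to "50% more", while "100% more" would require a factor of 2.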
Finally, the post underscores the importance of precision in language and calculations, particularly when dealing with technical concepts like percentage change. The seemingly small difference between a 50% increase and a 100% increase can have significant implications in the realm of computer science and engineering, where even fractional differences in efficiency can translate to substantial real-world gains. The author's ultimate message is a cautionary one, urging readers to carefully consider the underlying mathematics when making comparisons based on percentages.
Summary of Comments
https://news.ycombinator.com/item?id=42753953
Hacker News users discuss the nuances of ternary logic's efficiency compared to binary. Several commenters point out that the article's claim of ternary being "100% more" than binary is misleading. They argue that the relevant metric is information density, calculated using log base 2, which shows ternary as only about 58% more efficient. Discussion also revolves around practical implementation challenges of ternary systems, citing issues with noise margins and the relative ease and maturity of binary technology. Some users mention the historical use of ternary computers, like Setun, while others debate the theoretical advantages and whether these outweigh the practical difficulties. A few also explore alternative bases beyond ternary and binary.
The Hacker News post "Vpternlog: When three is 100% more than two" (linking to an article about ternary logic) generated a moderate amount of discussion, with several commenters exploring different facets of ternary computing.
One of the most compelling threads revolved around the practical applications of ternary logic. A commenter pointed out the historical use of ternary in the Setun computer, highlighting its potential advantages in terms of efficiency for certain operations. This sparked further discussion about the reasons why ternary computing hasn't become mainstream, with theories ranging from the difficulty in manufacturing reliable ternary hardware to the entrenched dominance of binary logic in the computing industry. The challenges in designing ternary logic circuits were also mentioned, emphasizing the complexity compared to their binary counterparts.
Another interesting discussion thread emerged around the interpretation of the article's title. Some users debated the mathematically correct way to express the relationship between two and three, while others focused on the nuances of the percentage increase calculation. This led to a clarification about the difference between saying "three is 100% more than two" versus "three is 50% larger than two," highlighting the importance of precise language when discussing mathematical concepts.
Furthermore, a commenter brought up the topic of balanced ternary, a system that uses -1, 0, and 1 as its three states. They explained how this system simplifies certain mathematical operations and offered an example of representing numbers in balanced ternary. This introduced a different perspective on the potential benefits of ternary logic beyond the simple 0, 1, and 2 system.
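Balanced ternary is straightforward to experiment with. The Python helper below (a hypothetical illustration, not code from the discussion) converts an integer into balanced-ternary digits; one property often cited in its favor, and consistent with the commenter's point about simplified operations, is that negating a number just flips the sign of every digit.

```python
def to_balanced_ternary(n):
    """Represent integer n as a list of balanced-ternary digits
    (-1, 0, 1), most significant digit first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3          # remainder in {0, 1, 2}
        if r == 2:         # a 2 is written as -1 with a carry upward
            r = -1
        n = (n - r) // 3
        digits.append(r)
    return digits[::-1]

# 5 = 9 - 3 - 1, i.e. digits 1, -1, -1
print(to_balanced_ternary(5))    # [1, -1, -1]
print(to_balanced_ternary(-5))   # [-1, 1, 1]  (every digit negated)
```

Note that the same loop handles negative numbers with no special casing, which is part of the elegance commenters attribute to the system.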
Some users also discussed the potential benefits of ternary logic in specific applications, such as representing fractional values and optimizing certain algorithms. While acknowledging the challenges in widespread adoption, they suggested that ternary could hold promise for niche applications where its unique properties could be leveraged.
Finally, there was a brief mention of other alternative number systems beyond binary and ternary, acknowledging the broader landscape of computational possibilities and the ongoing exploration of different approaches to information processing.