The original Macintosh's 512x342 resolution stemmed from a confluence of technical and cost constraints. Apple wanted screen output to map cleanly onto print output for WYSIWYG: at 72 DPI, one screen pixel corresponds to one typographic point (1/72 inch), so on-screen layouts translate directly to printed dimensions on devices such as the 300 DPI Canon LBP-CX engine later used in the LaserWriter. At 72 DPI on a 9-inch diagonal screen, and within the memory budget of a 128 KB machine, that worked out to 512x342 pixels. While other resolutions were considered, this combination balanced screen size, memory demands, and the processing power of the Motorola 68000. Ultimately, the goal was to create a usable and affordable computer, and the chosen resolution facilitated this while enabling a key feature: on-screen print preview.
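To make the memory and geometry trade-offs concrete, here is the back-of-the-envelope arithmetic (an illustration added here, not taken from the article):

```python
import math

WIDTH, HEIGHT, DPI = 512, 342, 72

# 1-bit-per-pixel framebuffer: how much of the original 128 KB of RAM it consumes.
framebuffer_bytes = WIDTH * HEIGHT // 8
print(framebuffer_bytes)                      # 21888 bytes, roughly 17% of 128 KB

# Physical size of the raster at 72 DPI, which had to fit a 9-inch-diagonal CRT.
w_in, h_in = WIDTH / DPI, HEIGHT / DPI
print(round(w_in, 2), round(h_in, 2))         # 7.11 x 4.75 inches
print(round(math.hypot(w_in, h_in), 2))       # ~8.55-inch diagonal
```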
Early Unix's file system imposed significant limitations on filenames. Initially, the Version 1 file system only supported 6-character filenames with a 2-character extension, totaling 8. Version 2 extended this to 14 characters, but still without any directory hierarchy support. The move to a hierarchical file system with Version 5 kept filenames at 14 characters total, with no separate extension field. This 14-character limit persisted for a surprisingly long time, even into the early days of Linux and BSD. The restriction was baked into the on-disk directory entry, a fixed-size slot pairing a 2-byte i-node number with a 14-byte name field (the i-node itself holds the file's metadata, not its name), reflecting a focus on simplicity and efficient use of limited storage capacity. Later versions of Unix and its derivatives gradually raised the limit to 255 characters and beyond.
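For context, a minimal sketch (added here, not from the post) of the classic fixed-size directory entry behind the 14-character cap: two bytes of i-node number followed by a 14-byte name field, 16 bytes per entry.

```python
import struct

# Classic 16-byte directory entry: 2-byte i-node number + 14-byte name field.
# The 14-character filename limit falls directly out of this fixed layout.
DIRENT = struct.Struct("<H14s")   # little-endian, matching the PDP-11's 16-bit words

entry = DIRENT.pack(27, b"readme.txt".ljust(14, b"\0"))
inum, raw_name = DIRENT.unpack(entry)
print(DIRENT.size, inum, raw_name.rstrip(b"\0").decode())   # 16 27 readme.txt
```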
HN commenters discuss the historical context of early Unix filename limitations, with some pointing out that PDP-11 directories were effectively single-level and thus short filenames were less problematic. Others mention the influence of punched cards and teletypes on early computing conventions, including filename length. Several users shared anecdotes about working with these older systems and the creative workarounds employed to manage the restrictions. The technical reasons behind the limitations, such as inode structure and memory constraints, are also explored. One commenter highlights the blog author's incorrect assertion about the original ls command, clarifying its actual behavior with early Unix versions. Finally, the discussion touches on the evolution of filename lengths in later Unix versions and other operating systems.
This blog post details how to install Windows NT 4.0 Server within a Proxmox virtual machine. The process involves creating a new VM in Proxmox, using an IDE hard disk and a legacy network card. Crucially, the installation requires a modified Windows NT 4.0 ISO image with updated drivers to support the virtualized environment. The author provides a download link to a pre-patched ISO, simplifying the process. After configuring the VM and attaching the ISO, the standard Windows NT 4.0 installation process is followed within the Proxmox console. The post also briefly covers installing the guest agent for enhanced integration with Proxmox.
Several commenters on Hacker News expressed nostalgia for Windows NT 4.0 Server, recalling its stability and simplicity compared to later Windows server versions. Some discussed specific use cases, like running legacy applications or exploring older technologies. Others shared personal anecdotes about their experiences with NT 4.0, highlighting its role in their early IT careers. A few commenters offered tips on the installation process, including workarounds for potential issues and suggestions for optimizing performance within a Proxmox environment. One user pointed out the potential security risks of running such an outdated operating system.
The rise of affordable hobby computers in the late 1970s and early 1980s fostered a unique culture driven by experimentation, collaboration, and a DIY ethos. Individuals, often lacking formal training, learned programming, built hardware, and shared their creations through clubs, magazines, and informal networks. This vibrant community fueled innovation, leading to the development of new software, hardware peripherals, and even entire operating systems. The spirit of open sharing and collaborative development significantly shaped the early personal computer industry and contributed to the rapid pace of technological advancement during this period.
Hacker News users discussed the nostalgic elements of the hobby computer era, highlighting the accessibility and affordability of early machines compared to modern tech. Several commenters emphasized the joy of learning to program and tinker with hardware, fostering a sense of ownership and deep understanding that's arguably lost today. The tight-knit community aspect and the sharing of knowledge through magazines and user groups were also fondly remembered. Some debated the impact of closed-source software and the increasing complexity of modern systems, contrasting it with the open, explorable nature of early personal computers. A few comments also explored the cyclical nature of technology, suggesting that the current maker movement and interest in retro computing could be seen as a resurgence of the hobbyist spirit.
The PC-98, a Japanese personal computer dominant throughout the 80s and 90s, fostered a unique and isolated software ecosystem. Its high resolution graphics, driven by the needs of Japanese text display, and proprietary architecture resulted in a wealth of distinctive games and applications rarely seen elsewhere. While expensive compared to IBM compatibles, its popularity in Japan stemmed from early adoption by businesses and a snowballing effect of software development tailored specifically to its hardware. This created a closed-loop system where the PC-98 thrived, insulated from the global PC market, eventually giving way to more standardized platforms in the late 90s. Its legacy, however, remains a fascinating example of a parallel computing world.
Hacker News users discuss the unique characteristics of Japan's PC-98, praising its high-quality sound and graphics for its time. Several commenters reminisce about using the platform, highlighting specific games and the distinct experience of Japanese computing culture during that era. Some lament the lack of PC-98 emulation options compared to other retro platforms, citing technical challenges in accurately replicating the system's intricacies. Others delve into the technical specifications, explaining the reasons behind the platform's isolation and the challenges it posed for international developers. The discussion also touches on the eventual decline of the PC-98, attributing it to the rising popularity of IBM PC compatibles and Windows 95. Several users shared links to relevant resources like emulators, ROM archives, and technical documentation for those interested in exploring the PC-98 further.
The blog post details the author's deep dive into debugging a mysterious "lake effect" graphical glitch that appeared when their IBM PC 5150 emulator ran the Area 5150 demo. Through meticulous tracing and analysis of the CGA video controller's logic and its interaction with the CPU, they discovered the issue stemmed from a subtle timing error in the emulator's handling of DMA requests during horizontal retrace. Specifically, the emulator wasn't correctly accounting for the CPU halting during these periods, leading to incorrect memory accesses and the characteristic shimmering "lake effect" on screen. The fix involved a small adjustment to ensure accurate cycle counting and proper synchronization between the CPU and the video controller. This corrected the timing and eliminated the visual artifact, demonstrating the complexity of accurate emulation and the importance of understanding the intricate interplay of hardware components.
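As a toy illustration of the failure mode described (illustrative numbers only, not the emulator's actual constants): if the CPU model is allowed to execute cycles that the real CPU would have spent halted for DMA, it drifts ahead of the video controller a little on every scanline.

```python
# Illustrative figures; the real per-scanline cycle budget and DMA stall length
# differ and are exactly what the author had to model correctly.
CYCLES_PER_SCANLINE = 304
DMA_STALL_CYCLES = 4

def cpu_drift(lines, model_stalls_cpu):
    real_cpu_cycles = lines * (CYCLES_PER_SCANLINE - DMA_STALL_CYCLES)
    modeled_cycles = lines * (CYCLES_PER_SCANLINE -
                              (DMA_STALL_CYCLES if model_stalls_cpu else 0))
    return modeled_cycles - real_cpu_cycles   # cycles the emulated CPU runs "ahead"

print(cpu_drift(262, model_stalls_cpu=False))  # 1048 cycles of drift per frame
print(cpu_drift(262, model_stalls_cpu=True))   # 0: CPU and video controller stay locked
```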
The Hacker News comments discuss the challenges and intricacies of debugging emulator issues, particularly in the context of the referenced blog post about an Area 5150 PC emulator and its "lake effect" graphical glitch. Several commenters praise the author's methodical approach and detective work in isolating the bug. Some discuss the complexities of emulating hardware accurately, highlighting the differences between cycle-accurate and less precise emulation methods. A few commenters share their own experiences debugging similar issues, emphasizing the often obscure and unexpected nature of such bugs. One compelling comment thread dives into the specifics of CGA palette registers and how their behavior contributed to the problem. Another interesting exchange explores the challenges of maintaining open-source projects and the importance of clear communication and documentation for collaborative debugging efforts.
The RISC OS GUI, developed by Acorn, prioritizes speed and efficiency through cooperative multitasking and a unique event handling model. Its distinctive drag-and-drop interface, built around an icon bar holding icons for applications, files, and even system settings, allows direct manipulation of objects and actions. Context-sensitive menus, invoked with the dedicated middle (Menu) mouse button over the desktop or individual objects, further streamline interaction. While unconventional compared to prevalent windowing systems, RISC OS emphasizes minimal overhead and direct user control, leading to a responsive and arguably intuitive experience.
Several commenters on Hacker News praised RISC OS's GUI for its speed, simplicity, and innovative features like the icon bar and context menus. Some noted its influence on other operating systems, particularly in the mobile space. Others discussed its unique cooperative multitasking model and its drawbacks compared to preemptive multitasking. A few users shared personal anecdotes about using RISC OS, highlighting its appeal to hobbyists and its dedicated community. Some lamented the lack of wider adoption and speculated about its potential had it been open-sourced earlier. The discussion also touched upon the challenges of porting it to modern hardware and the limitations of its single-user design.
A supposedly "lost" Japanese-language ROM for the Macintosh Plus has resurfaced. While believed to be rare and possibly unique, the author discovered the ROM was actually readily available within the Macintosh Plus's "System Tools" disk all along. The ROM simply enables KanjiTalk, the Japanese language support system for classic Macs, and its existence was documented, just not widely known or easily accessible online until now. Essentially, the mystery surrounding the ROM stemmed from obscurity and a misunderstanding rather than genuine rarity.
Commenters on Hacker News largely focused on the misleading nature of the article's title. Many pointed out that the ROM was never truly "lost," but simply undocumented and not widely distributed. Some shared personal anecdotes about using KanjiTalk in the past, highlighting its known existence. Others corrected inaccuracies in the article, like the claim about KanjiTalk's limited availability, noting that it was included on System disks. The general sentiment was one of mild amusement at the rediscovery of something not particularly hidden, coupled with appreciation for the author's enthusiasm and effort in exploring this piece of Macintosh history. A few users expressed interest in trying KanjiTalk and exploring its specific quirks and limitations.
In 2013, the author encountered the common "black screen" issue in Basilisk II, an emulator for classic 68k Macintosh computers, when running the emulator under older versions of Windows. After extensive troubleshooting involving various graphics settings and configurations within Basilisk II, they finally discovered the problem stemmed from Basilisk II's built-in graphics acceleration on Windows. Disabling acceleration by forcing Basilisk II into software rendering mode completely resolved the black screen issue, allowing the emulated Mac system to boot and display correctly. This fix also highlighted a difference between Basilisk II and SheepShaver, another classic Mac emulator, as SheepShaver didn't exhibit the same graphics acceleration issue on Windows.
Commenters on Hacker News largely praised the author's detective work in resolving the Basilisk II black screen bug, with several noting the satisfying nature of such deep dives into obscure technical issues. Some shared their own experiences with Basilisk II and similar emulators, reminiscing about older Mac software and hardware. A few commenters offered additional technical insights, suggesting potential contributing factors or alternative solutions related to graphics acceleration and virtual machine configurations. One commenter pointed out a potential error in the author's description of the MMU, while another questioned the use of "infamous" to describe the bug, suggesting it wasn't widely known. The overall sentiment, however, was one of appreciation for the author's effort and the nostalgic value of revisiting older technology.
This website hosts a browser-based emulator of the Xerox NoteTaker, a portable Smalltalk-78 system developed in 1978. It represents a significant step in the evolution of personal computing, showcasing early concepts of overlapping windows, a bitmapped display, and a mouse-driven interface. The emulation, while not perfectly replicating the original hardware's performance, provides a functional recreation of the NoteTaker's software environment, allowing users to explore its unique Smalltalk implementation and experience a piece of computing history. This allows for experimentation with the system's class browser, text editor, and graphics capabilities, offering insight into the pioneering work done at Xerox PARC.
Hacker News users discuss the Smalltalk-78 emulator with a mix of nostalgia and technical curiosity. Several commenters reminisce about their experiences with early Smalltalk, highlighting its revolutionary impact on GUI development and object-oriented programming. Some express interest in the NoteTaker's unique features, like its pioneering use of a windowing system and a mouse. The practicality of NoteTaker's hardware limitations, particularly its limited memory, is also discussed. A few commenters delve into specific technical details, like the differences between Smalltalk-72, -76, and -78, and the challenges of emulating historic hardware. Others express appreciation for the preservation effort and the opportunity to experience a piece of computing history.
Fascinated by SNOBOL's unique string-centric nature and pattern-matching capabilities, the author decided to learn the language. They found its table-driven implementation particularly intriguing, inspiring them to explore a similar structure for a different language. This led to the creation of a small, experimental Forth interpreter written in SNOBOL, showcasing how SNOBOL's pattern matching can effectively parse and execute Forth code. The project served as a practical exercise to solidify their understanding of SNOBOL while exploring the underlying mechanics of language implementation.
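For a sense of how small such a toy can be, here is a comparable stack-machine evaluator in Python (purely illustrative; the author's version was written in SNOBOL and leaned on its pattern matching to pick the input apart):

```python
# A minimal Forth-style evaluator: numbers push onto a stack, words operate on it.
def forth(source, stack=None):
    stack = [] if stack is None else stack
    words = {
        "+":   lambda s: s.append(s.pop() + s.pop()),
        "*":   lambda s: s.append(s.pop() * s.pop()),
        "dup": lambda s: s.append(s[-1]),
        ".":   lambda s: print(s.pop()),
    }
    for token in source.split():
        if token in words:
            words[token](stack)
        else:
            stack.append(int(token))   # anything unrecognized is treated as a number
    return stack

forth("2 3 + dup * .")   # prints 25: (2 + 3) squared
```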
Hacker News users discuss the original poster's experience learning SNOBOL and subsequently creating a toy Forth implementation. Several commenters express nostalgia for SNOBOL, praising its unique string manipulation capabilities and lamenting its relative obscurity today. Some discuss its influence on later languages like Icon and Perl. Others debate SNOBOL's performance characteristics and its suitability for various tasks. A few users share personal anecdotes about using SNOBOL in the past, including applications in bioinformatics and text processing. The discussion also touches on the differences between SNOBOL and Forth, with some commenters expressing interest in the poster's Forth implementation.
The Apple II MouseCard's interrupt requests (IRQs) are indeed synchronized with the vertical blanking interval (VBL). Through oscilloscope analysis and examining the MouseCard's firmware, the author confirmed that the card cleverly uses the VBL signal to time its counter, ensuring consistent IRQ generation every 1/60th of a second. This synchronization prevents screen tearing and jerky mouse movement, as updates are coordinated with the display refresh. Despite prior speculation and documentation suggesting otherwise, the investigation conclusively demonstrates the VBL-synced nature of the MouseCard's IRQ.
HN commenters discuss the intricacies of the Apple II MouseCard's interrupt handling, particularly its synchronization with the vertical blanking interval (VBL). Some express admiration for the clever engineering required to achieve stable mouse input within the constraints of the Apple II's hardware. One commenter recounts experiences with similar timing challenges on the Atari 8-bit and C64, emphasizing the difficulty of accurate timing without dedicated hardware support. Others delve into the specifics of the MouseCard's design, mentioning the use of a shift register and the challenges of debouncing button presses. The overall tone is one of appreciation for the ingenuity required to implement seemingly simple features on older hardware.
The blog post recounts the author's experience using Lilith, a workstation specifically designed for the Modula-2 programming language in the 1980s. Fascinated by Niklaus Wirth's work, the author acquired a Lilith and found it to be a powerful and elegant machine, deeply integrated with Modula-2. The post highlights the impressive speed of the system, the innovative windowing system, and the seamless integration of the Modula-2 development environment. Despite its advantages, the Lilith's specialized nature and limited software library ultimately led to its decline, making it a fascinating footnote in computing history.
HN commenters discuss Modula-2's strengths, primarily its clarity and strong typing, which fostered maintainable code. Some fondly recall using it for various projects, including operating systems and embedded systems, praising its performance and modularity. Others compare it to Oberon and discuss Wirth's design philosophy. Several lament its lack of widespread adoption, attributing it to factors like Wirth's resistance to extensions and the rise of C++. The lack of garbage collection and the complexity of its module system are also mentioned as potential downsides. Several commenters mention Wirth's preference for simpler systems and his perceived disdain for object-oriented programming. Finally, there's some discussion of alternative historical paths and the influence Modula-2 had on later languages.
Pascal for Small Machines explores the history and enduring appeal of Pascal, particularly its suitability for resource-constrained environments. The author highlights Niklaus Wirth's design philosophy of simplicity and efficiency, emphasizing how these principles made Pascal an ideal language for early microcomputers. The post discusses various Pascal implementations, from UCSD Pascal to modern variants, showcasing its continued relevance in embedded systems, retrocomputing, and educational settings. It also touches upon Pascal's influence on other languages and its role in shaping computer science education.
HN users generally praise the simplicity and elegance of Pascal, with several reminiscing about using Turbo Pascal. Some highlight its suitability for resource-constrained environments and embedded systems, comparing it favorably to C for such tasks. One commenter notes its use in the Apple Lisa and early Macs. Others discuss the benefits of strong typing and clear syntax for learning and maintainability. A few express interest in modern Pascal dialects like Free Pascal and Oxygene, while others debate the merits of static vs. dynamic typing. Some disagreement arises over whether Pascal's enforced structure is beneficial or restrictive for larger projects.
This GitHub repository contains the source code for QModem 4.51, a classic DOS-based terminal emulation and file transfer program. Released under the GNU General Public License, the code offers a glimpse into the development of early dial-up communication software. It includes functionality for various protocols like XModem, YModem, and ZModem, as well as terminal emulation features. This release appears to be a preservation of the original QModem software, allowing for study and potential modification by interested developers.
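As a flavor of what those transfer protocols look like on the wire, here is a sketch of the simplest of them, XMODEM, whose 132-byte packets carry a 3-byte header, 128 data bytes, and a one-byte checksum (an illustration added here, not code from the QModem sources):

```python
SOH = 0x01          # start-of-header byte that opens every XMODEM data packet

def xmodem_packet(block_no, data):
    # Header: SOH, block number, and the block number's one's complement,
    # followed by exactly 128 data bytes (short blocks padded with ^Z) and a
    # one-byte arithmetic checksum of the data.
    payload = data.ljust(128, b"\x1a")
    checksum = sum(payload) & 0xFF
    return bytes([SOH, block_no & 0xFF, (~block_no) & 0xFF]) + payload + bytes([checksum])

packet = xmodem_packet(1, b"hello from the BBS era")
print(len(packet))   # 132 bytes: 3-byte header + 128 data + 1 checksum
```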
Hacker News users discussing the release of QModem 4.51 source code express nostalgia for the software and dial-up BBS era. Several commenters reminisce about using QModem specifically, praising its features and reliability. Some discuss the challenges of transferring files over noisy phone lines and the ingenuity of the error correction techniques employed. A few users delve into the technical details of the code, noting the use of assembly language and expressing interest in exploring its inner workings. There's also discussion about the historical significance of QModem and its contribution to the early internet landscape.
Forty years ago, in 1982, the author joined Sun Microsystems, a startup at the time with only about 40 employees. Initially hired as a technical writer, the author quickly transitioned into a marketing role focused on the Sun-1 workstation, learning about the technology alongside the engineers. This involved creating marketing materials like brochures and presentations, attending trade shows, and generally spreading the word about Sun's innovative workstation. The author reflects fondly on this exciting period of growth and innovation at Sun, emphasizing the close-knit and collaborative atmosphere of a small company making a big impact in the burgeoning computer industry.
HN commenters discuss the author's apparent naiveté about Sun's business practices, particularly regarding customer lock-in through proprietary hardware and software. Some recall Sun's early open-source friendliness contrasting with their later embrace of closed systems. Several commenters share anecdotes about their own experiences with Sun hardware and software, both positive and negative, highlighting the high cost and complexity, but also the power and innovation of their workstations. The thread also touches on the cultural shift in the tech industry since the 80s, noting the different expectations and pace of work. Finally, some express nostalgia for the era and the excitement surrounding Sun Microsystems.
The IEEE Spectrum article details the rapid development of the Commodore 64, driven by a small, dedicated team of engineers at MOS Technology. Spurred by Jack Tramiel's aggressive pricing strategy, the team designed innovative custom chips for sound, graphics, and memory management, significantly reducing costs and outperforming competitors. Facing tight deadlines and a demanding boss, they overcame technical challenges and internal skepticism to create a groundbreaking, affordable computer that sold well over ten million units and democratized computing for a generation.
Hacker News commenters on the IEEE Spectrum article about the Commodore 64 largely reminisce about their experiences with the machine. Several shared personal anecdotes about learning to program on the C64, exploring its capabilities, and the impact it had on their careers. Some discussed the technical ingenuity involved in its creation, particularly its sound chip and surprisingly powerful graphics for the time. Others debated the role of "demos" in showcasing the C64's potential and driving innovation. A few commenters also pointed out the relative affordability of the C64, making it accessible to a wider audience and contributing to the home computer revolution. There's also a brief discussion about the limitations of the C64's keyboard and the challenges of working with its limited memory.
The blog post explores hidden "Easter eggs" within the 8-bit BASIC interpreters Bill Gates co-authored for the Altair 8800, Apple II, and other early personal computers. These undocumented features, accessible through specific inputs or memory locations, include messages like a copyright notice listing Gates and Paul Allen, a list of developers who worked on the interpreter, and even a small game called DONKEY.BAS, which challenged players to avoid hitting donkeys with a car. The author discovered these secrets through reverse engineering and experimentation, highlighting a playful and less-known side of Gates's early programming career. The Easter eggs serve as a fascinating glimpse into the history of personal computing and the personalities behind its creation.
Several commenters on Hacker News expressed appreciation for the historical context and nostalgia surrounding Gates's Easter eggs, with some sharing personal anecdotes of discovering them in their youth. A few pointed out that these Easter eggs, alongside the overall accessibility of early BASIC interpreters, played a role in inspiring a generation of programmers. Some discussed the technical limitations of the time and how Gates cleverly worked within them to create these hidden messages. The discussion also touched upon the shift in software development culture, contrasting the playful nature of these Easter eggs with the more corporate and security-conscious environment of today. A recurring sentiment was that such personal touches are now rare in modern software. Finally, several commenters corrected some technical details in the original article, such as the actual size of the Easter egg message displayed, and the mechanisms by which they were triggered.
The Evertop is a DIY open-source project aiming to create a modern, portable, and extremely low-power IBM XT compatible computer. It features a 4.2" e-ink display, mechanical keyboard, and boasts over 100 hours of battery life. Based on a Raspberry Pi Pico microcontroller emulating an NEC V20 processor, it can run original IBM XT software and DOS games. The project includes custom-designed PCBs and 3D-printed case files, making it reproducible for others. While still under development, the Evertop represents a unique blend of retro computing and modern hardware for a highly portable and energy-efficient experience.
Hacker News commenters generally expressed enthusiasm for the Evertop project, praising its unique combination of retro computing and modern e-ink technology. Several highlighted the impressive battery life and the appeal of a distraction-free writing device. Some questioned the practicality given the slow refresh rate of e-ink, particularly for gaming, while others suggested improvements like backlighting and a more ergonomic keyboard. A few commenters expressed interest in similar projects using different retro hardware, such as the Apple II or Commodore 64. There was also discussion about the challenges of sourcing parts and the potential market for such a niche device. Several users shared their personal experiences with similar e-ink projects and offered technical insights.
The blog post discusses the challenges and benefits of using older software for children's learning. While newer educational software often boasts flashy features, older programs can offer a simpler, more focused learning experience without the distractions of modern interfaces and internet connectivity. The author describes their process of restoring vintage educational software onto modern hardware, highlighting the technical hurdles involved in making older operating systems and software compatible. Ultimately, the post advocates for considering older software as a viable option for providing a safe, distraction-free digital learning environment for children.
Hacker News users discussed the benefits and challenges of using old software for children's learning. Some highlighted the appeal of simpler interfaces and the potential for focused learning without distractions like ads or internet access. Others emphasized the importance of curated experiences, acknowledging that while some older software can be valuable, much of it is simply obsolete. Several commenters mentioned the difficulty of getting old software to run on modern hardware and operating systems, with suggestions like DOSBox and virtual machines offered as solutions. The idea of a curated repository of suitable older software was also raised, but concerns about copyright and the ongoing maintenance effort were also noted. A few users pointed out the educational value in teaching children how to deal with older technology and its limitations, viewing it as a form of digital literacy.
Michael Steil's blog post explores the behavior of illegal or undocumented opcodes on the MOS 6502 processor. Rather than simply halting or raising an error, most of these opcodes execute as the combination of two legal instructions decoded at once. The 6502's instruction decoding, which activates internal control lines based on bit patterns within the opcode byte, produces these unintended combinations. Steil demonstrates how the combined instructions can be predicted and even exploited for creative programming tricks, offering a deep dive into the processor's architecture. He provides examples of how these illegal opcodes manipulate registers and flags in unexpected ways, opening a window into the inner workings of this classic CPU.
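A concrete example of the phenomenon (a toy stand-in added here, not code from Steil's post): the undocumented opcode $A7 sits right after LDA zero-page ($A5) and LDX zero-page ($A6) in the opcode map and behaves as both at once, which is why it is conventionally named LAX.

```python
# $A7 ("LAX zp") loads the same byte into both A and X, i.e. LDA zp and LDX zp
# happening together, with the usual N and Z flag updates.
def lax_zeropage(memory, addr, regs):
    value = memory[addr]
    regs["A"] = value              # the LDA half of the combination
    regs["X"] = value              # the LDX half of the combination
    regs["Z"] = int(value == 0)
    regs["N"] = value >> 7
    return regs

regs = {"A": 0, "X": 0, "Z": 0, "N": 0}
print(lax_zeropage({0x10: 0x80}, 0x10, regs))   # A and X both 0x80, N flag set
```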
HN commenters discuss the cleverness of undocumented opcodes on the 6502, with several sharing their experiences using them in demos and games. Some appreciated the author's clear explanations and visualizations of the normally chaotic behavior, while others reminisced about discovering and exploiting these opcodes in their youth on platforms like the C64 and Apple II. A few highlighted the community effort in meticulously documenting these behaviors, comparing it to similar explorations of the Z80 and other CPUs. Some commenters also pointed out the article's brief mention of the security implications of these undefined instructions in modern contexts.
The blog post details the integration of a limited TCP/IP stack, written in pre-C89 (K&R-style) C, into the PRO/VENIX operating system using Slirp-CK, a small-footprint networking library. This allows PRO/VENIX, a vintage Unix-like system, to connect to modern networks for tasks like downloading files. The implementation focuses on simplicity and compatibility with the system's older C compiler, intentionally avoiding more complex and modern networking features. While functional, the author acknowledges its limitations and describes it as "barely adequate," prioritizing the demonstration of networking capability over robust performance or complete standards compliance.
Hacker News users discuss the blog post about porting a TCP/IP stack (Slirp-CK) to the PRO/VENIX operating system. Several commenters express excitement and nostalgia for PRO/VENIX, sharing personal anecdotes about using it in the past. Some question the practical use cases, while others suggest potential applications like retro gaming or historical preservation. The technical details of the porting process are discussed, including the challenges of working with older hardware and software limitations. There's a general appreciation for the effort involved in preserving and expanding the capabilities of vintage systems. A few users mention interest in contributing to the project or exploring similar endeavors with other older operating systems.
The Atari 1200XL, intended as a high-end successor to the Atari 800, was a commercial failure due to a combination of poor design choices and unfortunate timing. Released in 1983, it boasted a sleek, compact design and some minor hardware improvements. However, its redesigned, non-standard keyboard layout, limited memory expansion options, software incompatibilities introduced by its revised OS ROM, and higher price point compared to the existing 800 alienated consumers. Its launch coincided with the video game crash of 1983, further hindering its chances of success. The 1200XL was quickly discontinued, paving the way for the more successful 600XL and 800XL models, which addressed many of its shortcomings.
Hacker News users discuss the Atari 1200XL's failure, citing its high price, lack of backwards compatibility with popular peripherals, limited improvements over the existing 800, and poor marketing as key factors. Some commenters argue that its redesigned keyboard, though unpopular at the time, was actually superior. Others note the internal politics and mismanagement within Atari during this period as contributing to the machine's demise. Several users share personal anecdotes of their experiences with the 1200XL, highlighting both its strengths and weaknesses. The overall sentiment is that the 1200XL was a missed opportunity for Atari, representing a pivotal moment in the company's decline.
The 6502 processor, known for its limitations, inspired clever programming tricks to optimize for speed and memory. These "dirty tricks" leverage quirks like undocumented opcodes, zero-page addressing, and the interactions between instructions and processor flags. Techniques include self-modifying code that dynamically rewrites instructions, using the carry flag for efficient branching, and exploiting specific instruction timings for precise delays. By understanding the 6502's nuances, programmers could achieve remarkable results despite the hardware constraints.
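One of the flag tricks mentioned above, roughly sketched in Python rather than assembly (illustration only): CMP leaves the carry flag set exactly when the accumulator is greater than or equal to the operand, so a single BCS or BCC afterwards serves as a ">=" or "<" test with no extra instructions.

```python
def cmp_carry(a, operand):
    # CMP computes A - operand and sets carry when no borrow occurs,
    # i.e. when A >= operand (both treated as unsigned 8-bit values).
    return int(((a - operand) & 0x100) == 0)

print(cmp_carry(0x40, 0x30))   # 1 -> BCS taken, acts as "branch if A >= operand"
print(cmp_carry(0x20, 0x30))   # 0 -> BCC taken, acts as "branch if A < operand"
```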
Hacker News users generally expressed appreciation for the article on 6502 programming tricks, finding it informative and nostalgic. Several commenters shared additional tricks or variations, including using the undocumented SAX instruction and manipulating the stack for efficient data storage. Some discussed the cleverness borne out of the 6502's limitations, while others reminisced about using these techniques in their youth. A few pointed out the techniques' applicability to other architectures or modern resource-constrained environments. There was some debate about the definition of "dirty" versus "clever" tricks, but the overall sentiment was positive towards the article's content and the ingenuity it showcased. The discussion also touched on the differences between assembly programming then and now, and the challenges of optimizing for limited resources.
In 2004, a blogger explored creating a striped RAID array using four USB floppy drives under OS X. Driven by curiosity and a desire for slightly faster floppy access, they used the then-available Disk Utility to create a RAID 0 set. While the resulting "RAID" technically worked and offered a minor performance boost over a single floppy, the setup was complex, prone to errors due to the floppies' unreliability, and ultimately impractical. The author concluded the experiment was more of a fun exploration of system capabilities than a genuinely useful storage solution.
Hacker News users reacted with a mix of nostalgia and amusement to the 2004 article about creating a striped RAID array from USB floppy drives. Several commenters reminisced about the era's slow transfer speeds and the impracticality of the setup, highlighting the significant advancements in storage technology since then. Some appreciated the ingenuity and "mad science" aspect of the project, while others questioned its real-world usefulness. A few pointed out the potential data integrity issues with floppy disks, making the RAID setup even less reliable. The dominant sentiment was one of lighthearted appreciation for a bygone era of computing.
Christopher Drum has ported Infocom's Z-machine interpreter, specifically the UNIX version 1.1 source, to a single executable using Cosmopolitan Libc. This allows classic Infocom text adventures, which were originally designed for various platforms, to run natively on modern operating systems (Windows, macOS, Linux, FreeBSD, OpenBSD, NetBSD) without emulation or VMs. The porting process involved minimal code changes, primarily focused on resolving system call discrepancies between the original Unix environment and Cosmopolitan's compatibility layer. This approach leverages Cosmopolitan's ability to build statically linked, universally compatible executables, effectively "resurrecting" these classic games for contemporary systems while preserving their original codebase.
Hacker News users generally praised the project for its clever use of Cosmopolitan Libc to create truly portable Z-machine binaries. Several commenters expressed nostalgia for Infocom games and appreciated the effort to preserve them. Some discussed the technical aspects, like the benefits of static linking and the challenges of porting old code. A few users offered suggestions, such as adding features like save/restore functionality and improving the command-line interface. One commenter pointed out the potential for running these games on embedded systems thanks to Cosmopolitan's small footprint. The overall sentiment was positive, with many excited about the possibility of playing classic text adventures on modern and diverse platforms.
Code page 437, the original character set for the IBM PC, includes a small house character (⌂) because it was intended for general business use, not just programming. Inspired by the pre-existing PETSCII character set, IBM included symbols useful for forms, diagrams, and even simple games. The house, specifically, was likely included to represent "home" in directory structures or for drawing simple diagrams, similar to how other box-drawing characters are utilized. This practicality over pure programming focus explains many of 437's seemingly unusual choices.
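For reference (a detail added here, not in the summary above): the house glyph occupies position 0x7F in code page 437, and Unicode carries it as U+2302 HOUSE. The snippet below shows that mapping alongside a few of the box-drawing characters.

```python
# Position 0x7F in CP437's "graphics" interpretation is the house glyph, which
# Unicode encodes as U+2302 HOUSE. (Python's cp437 codec maps bytes 0x00-0x7F
# straight to ASCII, so the glyph is spelled out directly here.)
house = "\u2302"
print(house, hex(ord(house)))                     # ⌂ 0x2302

# Box-drawing characters from the upper half of the code page decode normally.
print(bytes([0xDA, 0xC4, 0xB3]).decode("cp437"))  # ┌─│
```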
HN commenters discuss various aspects of Code Page 437. Some recall using it in early PC gaming and the limitations it imposed on game design. Others delve into the history of character sets and code pages, including the inclusion of box-drawing characters for creating UI elements in text-based environments. Several speculate about the specific inclusion of the "house" character (⌂), suggesting it might be a remnant of a planned but never implemented feature, potentially related to home banking or smart home technologies nascent at the time. A few commenters point out its resemblance to Japanese family crests (kamon) or stylized depictions of Shinto shrines. The impracticality of representing a real house address with a single character is also mentioned.
Kicksmash32 is a dual Kickstart ROM replacement for Amiga computers, offering a streamlined way to switch between different Kickstart versions (1.2, 1.3, 2.04, 3.1, 3.2.1). It uses a compact menu activated by holding both mouse buttons during startup, allowing users to select their desired Kickstart ROM without physical hardware modifications. The project is open-source and supports various Amiga models including A500, A600, A1200, and A4000. This simplifies the process of booting into different AmigaOS versions for compatibility with various software and games.
Commenters on Hacker News largely expressed excitement and nostalgia for the Amiga, praising the Kicksmash project for its ingenuity and potential. Several users shared their personal experiences with Amiga kickstart ROMs, highlighting the challenges of managing multiple versions for different software and configurations. The convenience of switching between ROMs using a selector was lauded as a major benefit. Some questioned the legality of distributing ROMs, even modified ones, and discussed the nuances of copyright law concerning abandonware. Others delved into technical details, speculating about the possibility of running Kickstart 3.1.4 from RAM and exploring the intricacies of Amiga hardware. A few users also inquired about compatibility with various Amiga models and expansions. The overall sentiment was one of positive interest and appreciation for the project's contribution to the Amiga community.
The Jupiter Ace, a British home computer from the early 1980s, stood out due to its use of Forth as its primary programming language instead of the more common BASIC. While Forth offered advantages in speed and efficiency, its steeper learning curve likely contributed to the Ace's commercial failure. Despite its innovative use of a then-obscure language and compact, minimalist design, the Jupiter Ace ultimately lost out in the competitive home computer market, becoming a curious footnote in computing history.
HN commenters discuss the Jupiter Ace's unique use of Forth, some appreciating its educational value and elegance while others find it esoteric and limiting. Several recall fond memories of using the machine, praising its speed and compact design. The limited software library and RAM are mentioned as drawbacks, alongside the challenges of garbage collection in Forth. The unconventional keyboard layout and the machine's overall fragility are also discussed. One commenter notes the irony of its Sinclair connection, being designed by former Sinclair employees yet failing where Sinclair succeeded. A few comments delve into the technicalities of Forth and its implementation on the Ace, while others lament its ultimate commercial failure despite its innovative aspects.
The blog post explores the recently released and surprisingly readable Macintosh QuickDraw and MacPaint 1.3 source code. The author dives into the inner workings of the software, highlighting the efficient use of assembly language and clever programming techniques employed to achieve impressive performance on limited hardware. Specific examples discussed include the rectangle drawing algorithm, region handling for complex shapes, and the "FatBits" zoomed editing mode, illustrating how these features were implemented with minimal resources. The post celebrates the code's clarity and elegance, demonstrating how the original Macintosh developers managed to create a powerful and user-friendly application within the constraints of early 1980s technology.
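As a rough illustration of the FatBits idea described above (a toy sketch in Python, in no way Atkinson's assembly): every pixel of the edited region is blown up into an N x N block so individual bits can be toggled by eye.

```python
def fatbits(bitmap, scale=8):
    # Expand each 1-bit pixel of a small bitmap into a scale x scale block,
    # the basic move behind a zoomed "fat bits" pixel editor.
    zoomed = []
    for row in bitmap:
        fat_row = [bit for bit in row for _ in range(scale)]
        zoomed.extend([list(fat_row) for _ in range(scale)])
    return zoomed

glyph = [[0, 1, 0],
         [1, 1, 1],
         [0, 1, 0]]
big = fatbits(glyph)
print(len(big), len(big[0]))   # 24 24
```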
Hacker News commenters on the MacPaint source code release generally expressed fascination with the code's simplicity, small size, and cleverness, especially given the hardware limitations of the time. Several pointed out interesting details like the use of hand-unrolled loops for performance and the efficient drawing algorithms. Some discussed the historical context, marveling at Bill Atkinson's programming skill and the impact of MacPaint on the graphical user interface. A few users shared personal anecdotes about using early Macintosh computers and the excitement surrounding MacPaint's innovative features. There was also some discussion of the licensing and copyright status of the code, and how it compared to modern software development practices.
Hacker News users discuss the unusual 512x342 resolution of the original Macintosh. Several commenters point to the simplicity of the math involved, noting that 512 is a power of 2, which simplified calculations for the Motorola 68000 processor. This also allowed efficient use of memory and facilitated bit shifting operations for screen manipulation. Some suggest the choice was influenced by the available memory and the desire to keep costs manageable, while others mention the impact on font rendering, particularly the difficulty of displaying proportional fonts smoothly. The discussion also touches upon the trade-offs between resolution and refresh rate, with the limitations of the then-current technology playing a significant role. A few commenters recall personal anecdotes related to working with early Macintosh hardware and software.
The Hacker News post discussing the original Macintosh screen resolution has a moderate number of comments, exploring various aspects beyond the original article's focus.
Several commenters delve into the technical constraints of the early 1980s. One explains the relationship between the 32-bit Motorola 68000 processor, the display's refresh rate, and the limited video memory available. They point out that generating a higher resolution image would have required significantly more memory, a costly component at the time. Another commenter builds on this, noting that even with the chosen resolution, the Macintosh needed every clock cycle available from the 68000 just to update the display, leaving little processing power for other tasks. This aligns with another comment referencing the complexities of accessing memory quickly, a challenge that influenced the choice of screen resolution.
The conversation extends to the historical context of bitmap displays and the limitations of CRT technology. One commenter explains that bitmap displays were a relatively new concept at the time, contrasting them with character-based terminals commonly used then. Another commenter discusses the constraints of the CRT technology itself, mentioning the difficulty in achieving a high-resolution display within the available budget and technological capabilities. The limitations of RAM are further discussed, mentioning that cost was a major concern and higher resolutions would have necessitated considerably more RAM, pushing the already high price of the Macintosh even higher.
Several comments delve into personal anecdotes and comparisons with other early computers. One user recalls using a Lisa and comparing its display to the Macintosh. Another mentions owning an original Macintosh and details its graphical capabilities, while another discusses the transition from character-based displays to the Macintosh's graphical interface, marking a significant shift in user experience.
Beyond the technical aspects, there's discussion about the design philosophy behind the original Macintosh. One commenter argues that the chosen resolution was a deliberate decision, allowing for a compact and affordable computer without sacrificing usability. This comment emphasizes the careful balance between performance, cost, and user experience that informed the Macintosh's design. Another commenter compares the Macintosh to other computers of the time, including IBM PCs, emphasizing the Mac's focus on graphical user interface, a feature considered revolutionary for its time. The minimalist aesthetic is also mentioned, with one comment appreciating the simplicity and elegance of the system despite its technical limitations.
Finally, the thread touches upon broader themes about the evolution of technology. One commenter reflects on how far technology has progressed since the early 1980s, highlighting the incredible advancements in display technology and processing power. Another echoes this sentiment, marveling at how limitations that seemed insurmountable then are now trivial to overcome.