The author argues that man pages themselves are a valuable and well-structured source of information, contrary to popular complaints. The problem, they contend, lies with the default man reader, which uses less, hindering navigation and readability. They suggest alternatives like mandoc with a pager like less -R, or specialized man page viewers, for a better experience. Ultimately, the author champions the efficient and comprehensive nature of man pages when presented effectively, highlighting their consistent organization and advocating for improved tooling to access them.
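A minimal sketch of the kind of tooling fix the author advocates: pointing man at a friendlier pager via the standard MANPAGER variable, or formatting a page with mandoc directly (the pager flags and file path are illustrative; page locations vary by system):

    # Have man hand pages to a pager that passes color escapes through
    export MANPAGER='less -R'

    # Or format a page yourself with mandoc and pick your own pager
    mandoc /usr/share/man/man1/ls.1 | less -R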
Lilly is a TUI text editor built with Rust that aims to offer a modern, performant, and customizable alternative to Vim and Neovim. It prioritizes extensibility through plugins written in Lua, offering a familiar experience for Neovim users. Featuring built-in Language Server Protocol (LSP) support, tree-sitter for syntax highlighting, and asynchronous execution for responsiveness, Lilly seeks to combine the speed and efficiency of a terminal interface with the advanced features of modern GUI editors. The project is actively under development and welcomes contributions.
Hacker News users discuss Lilly, a TUI editor and potential Vim/Neovim alternative, focusing on its Lua extensibility and clean-slate design. Some express excitement about a modern, scriptable TUI editor, praising its apparent performance and the potential of Lua for customization. Others question its long-term viability given the established competition, and some debate the merits of modal vs. non-modal editing. Several commenters highlight the difficulty of attracting users away from entrenched editors, while others suggest Lilly could find a niche among those seeking a simpler, more easily customized TUI experience. A few express interest in specific features like the integrated file explorer and fuzzy finder. Overall, the comments show cautious optimism tempered by an awareness of the challenges faced by new text editors.
My-yt is a personalized YouTube frontend built using yt-dlp. It offers a cleaner, ad-free viewing experience by fetching video information and streams directly via yt-dlp, bypassing the standard YouTube interface. The project aims to provide more control over the viewing experience, including features like customizable playlists and a focus on privacy. It's a self-hosted solution intended for personal use.
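For context, a sketch of the kind of yt-dlp calls a frontend like this builds on (the URL is a placeholder, and whether my-yt uses exactly these flags is an assumption):

    # Fetch a video's metadata as JSON without downloading anything
    yt-dlp --dump-single-json 'https://www.youtube.com/watch?v=VIDEO_ID'

    # Resolve direct stream URLs for playback in an external player
    yt-dlp --get-url 'https://www.youtube.com/watch?v=VIDEO_ID'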
Hacker News users generally praised the project for its clean interface and ad-free experience, viewing it as a superior alternative to the official YouTube frontend. Several commenters appreciated the developer's commitment to keeping the project lightweight and performant. Some discussion revolved around alternative frontends and approaches, including Invidious and Piped, with comparisons of features and ease of self-hosting. A few users expressed concerns about the project's long-term viability due to YouTube's potential API changes, while others suggested incorporating features like SponsorBlock. The overall sentiment was positive, with many expressing interest in trying out or contributing to the project.
The author, frustrated by the steep learning curve of Git, is developing a game called "Oh My Git!" to make learning the version control system more accessible and engaging. The game visually represents Git's inner workings, allowing players to experiment with commands and observe their effects on a simulated repository. The goal is to provide a safe, interactive environment for understanding core concepts like branching, merging, rebasing, and resolving conflicts, ultimately demystifying Git and reducing the frustration commonly associated with learning it. The game aims to be suitable for beginners while also offering challenges for more experienced users looking to refine their skills.
Hacker News users generally expressed enthusiasm for the Git game concept, viewing it as a valuable tool for learning a complex system. Several commenters shared their own struggles with Git and suggested specific game mechanics, such as branching and merging scenarios, rebasing challenges, and visualizing the commit graph. Some questioned the chosen game engine (Godot) and proposed alternatives like Unity or a web-based approach. There was also discussion about the potential target audience, with suggestions to focus on beginners while providing sufficient depth to engage experienced users as well. A few users highlighted existing Git learning resources, including "Oh My Git!" and the official Git documentation's interactive tutorial.
This blog post demonstrates how to efficiently integrate Large Language Models (LLMs) into bash scripts for automating text-based tasks. It leverages the curl command to send prompts to LLMs via API, specifically using OpenAI's API as an example. The author provides practical examples of formatting prompts with variables and processing the JSON responses to extract desired text output. This allows for dynamic prompt generation and seamless integration of LLM-generated content into existing shell workflows, opening possibilities for tasks like code generation, text summarization, and automated report creation directly within a familiar scripting environment.
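A minimal sketch of the pattern (this assumes an OPENAI_API_KEY environment variable and jq for JSON handling; the model name is illustrative):

    #!/usr/bin/env bash
    set -euo pipefail

    topic="$1"

    # Build the request body with jq rather than fragile string interpolation
    body=$(jq -n --arg t "$topic" '{
      model: "gpt-4o-mini",
      messages: [{role: "user", content: ("Summarize in one sentence: " + $t)}]
    }')

    # Send the prompt and extract only the generated text from the JSON response
    curl -s https://api.openai.com/v1/chat/completions \
      -H "Authorization: Bearer $OPENAI_API_KEY" \
      -H "Content-Type: application/json" \
      -d "$body" | jq -r '.choices[0].message.content'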
Hacker News users generally found the concept of using LLMs in bash scripts intriguing but impractical. Several commenters highlighted potential issues like rate limiting, cost, and the inherent unreliability of LLMs for tasks that demand precision. One compelling argument was that relying on an LLM for simple string manipulation or data extraction in bash is overkill when more robust and predictable tools like sed, awk, or jq already exist. The discussion also touched upon the security implications of sending potentially sensitive data to an external LLM API and the lack of reproducibility in scripts relying on probabilistic outputs. Some suggested alternative uses for LLMs within scripting, such as generating boilerplate code or documentation.
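To make that point concrete, the kind of extraction the commenters argue needs no LLM is a deterministic one-liner with existing tools:

    # Exact, reproducible, offline, and free
    echo '{"user": {"name": "Ada", "id": 7}}' | jq -r '.user.name'   # prints: Ada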
Calendar.txt outlines a simple, universal calendar format based on plain text. Each line represents a day, formatted as YYYY-MM-DD followed by optional event descriptions separated by tabs. This minimalist approach allows for easy creation, parsing, and manipulation by any text editor or scripting tool, promoting interoperability across diverse platforms and applications. The post emphasizes the benefits of this format's portability, version control friendliness, and longevity, contrasting it with proprietary calendar systems that often lock users into specific software or data formats. The suggested structure allows for complex recurring events and to-do lists with simple extensions, making it adaptable to various scheduling needs.
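An illustrative sketch of such a file, with tab-separated event descriptions (the entries are hypothetical examples, not taken from the spec):

    2025-03-14	Dentist 15:30	Submit expense report
    2025-03-15
    2025-03-16	Flight LHR-SFO 09:40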
Hacker News users discuss the minimalist approach of calendar.txt, appreciating its simplicity and portability. Some highlight its alignment with the Unix philosophy of doing one thing well. Others suggest improvements like adding support for recurring events or integration with other tools. A few users express skepticism, finding the plain text format too limiting for practical use, while others champion its accessibility and ease of parsing. The discussion also touches upon alternative calendar solutions and the benefits of plain text for archiving and data longevity. Several commenters share their personal workflows incorporating plain text files for task management and scheduling.
Paul Samuels advocates for using simple, project-specific shell scripts instead of complex build systems or task runners for small to medium-sized projects. He argues that shell scripts offer better transparency, debuggability, and control, while reducing cognitive overhead. They facilitate easier understanding of project dependencies and build processes, which ultimately contributes to better maintainability, especially for solo developers or small teams. By leveraging the shell's built-in features and readily available Unix tools, project scripts provide a lightweight yet powerful approach to managing common development tasks.
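A minimal sketch of such a project script (the file name and the commands it wraps are illustrative):

    #!/bin/sh
    # scripts/test: run the test suite the same way on every machine
    set -eu

    # Always operate from the repository root
    cd "$(dirname "$0")/.."

    echo "==> building"
    make build

    echo "==> running tests"
    ./bin/run-tests "$@"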
Hacker News users generally praised the simplicity and practicality of "Project Scripts." Several commenters appreciated the lightweight nature of the approach compared to more complex build systems or dedicated project management tools, highlighting the benefit of reduced cognitive overhead. Some suggested potential improvements like incorporating direnv or using a Makefile for more complex projects. A few users expressed skepticism, arguing that the proposed "Project Scripts" offered little beyond basic shell scripting and questioned the need for a dedicated term. Others found the idea valuable for its focus on explicitness and ease of sharing project setup within a team. The discussion also touched on related tools like Taskfile and justfile, comparing their features and complexity to the author's approach.
An interactive, annotated version of the classic "Unix Magic" poster has been created. This online resource allows users to explore the intricate diagram of Unix commands and their relationships. By clicking on individual commands, users can access descriptions, examples, and links to further resources, providing a dynamic and educational way to learn or rediscover the power of the Unix command line. The project aims to make the dense information of the original poster more accessible and engaging for both beginners and experienced Unix users.
Commenters on Hacker News largely praised the interactive Unix magic poster for its nostalgic value, clear presentation, and educational potential. Several users reminisced about their experiences with the original poster and expressed appreciation for the updated, searchable format. Some highlighted the project's usefulness as a learning tool for newcomers to Unix, while others suggested improvements like adding links to man pages or expanding the command explanations. A few pointed out minor inaccuracies or omissions but overall considered the project a valuable resource for the Unix community. The clean interface and ease of navigation were also frequently mentioned as positive aspects.
Julia Evans expresses frustration with several common terminal shortcomings. She highlights the difficulty of accurately selecting and copying text, especially across multiple lines or with special characters, often resorting to workarounds like opening the command in a text editor. Additionally, she points out the inconsistency of terminal escape codes leading to unpredictable behavior between different terminals and programs. Finally, she laments the lack of a standardized method to directly interact with and manipulate the output of a previously executed command, requiring awkward copying or screenshotting for further analysis. These limitations, she argues, interrupt her workflow and make the terminal less efficient than it could be.
HN users generally agreed with the author's frustrations regarding terminal emulators. Several commenters pointed to specific pain points like inconsistent copy/paste behavior, difficulties with selecting text, and the lack of proper mouse support across different terminals. Alacritty and Warp were frequently mentioned as modern alternatives attempting to address some of these issues, though some users expressed reservations about Warp's closed-source nature and Electron base. Others discussed the challenges inherent in terminal emulation given its historical baggage and the trade-offs between features, performance, and compatibility. The desire for a truly modern and consistent terminal experience was a recurring theme.
The blog post explores using #!/usr/bin/env uv as a shebang line so that standalone Python scripts run through Astral's uv runner rather than a manually managed interpreter and virtualenv. Because uv can resolve and install a script's declared dependencies on the fly, the approach makes small scripts self-contained and directly executable. The author demonstrates this by creating a simple "Hello, world!" script. The post concludes that while setting up uv might require some initial effort, the convenience of scripts that carry their own dependency information makes it a compelling alternative for running Python scripts.
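A sketch of the pattern (note that uv is normally invoked as uv run, so the shebang needs env's -S flag to split the extra word; exact behavior depends on the uv and env versions installed):

    $ cat hello.py
    #!/usr/bin/env -S uv run
    print("Hello, world!")
    $ chmod +x hello.py && ./hello.py
    Hello, world!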
Hacker News users discussed the practicality and security implications of using uv as a shebang line. Some questioned the benefit given the small size savings compared to a full path, while others highlighted potential portability issues and the risk of uv not being installed on target systems. A compelling argument against this practice centered on security, with commenters noting the danger of path manipulation if uv isn't found and the shell falls back to searching the current directory. One commenter suggested using env to locate uv reliably, proposing #!/usr/bin/env uv as a safer, though slightly larger, alternative. The overall sentiment leaned towards avoiding this shortcut due to the potential downsides outweighing the minimal space saved.
Milo Fultz's blog post details a method for finding the oldest lines of code in a Git repository. The approach leverages git blame combined with awk and sort to extract commit dates and line numbers. By sorting the output based on these dates, the script identifies and displays the oldest surviving lines, effectively pinpointing code that has remained unchanged since its initial introduction. This technique can be useful for understanding the evolution of a codebase, identifying potential legacy code, or simply satisfying curiosity about a project's history.
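A sketch in the spirit of that pipeline (the exact script in the post may differ, and the file name is a placeholder):

    # Pair every line of a file with its commit timestamp, then list the oldest first.
    # --line-porcelain repeats full metadata per line; content lines start with a tab.
    git blame --line-porcelain src/main.c \
      | awk '/^author-time / { t = $2 } /^\t/ { print t "\t" $0 }' \
      | sort -n \
      | head -n 10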
Hacker News users discussed various methods and tools for finding the oldest lines of code in a repository, expanding on the article's git blame approach. Several commenters suggested using git log -L for more precise tracking of specific lines or functions, highlighting its ability to handle code moves and rewrites. The practicality of such analysis was debated, with some arguing its usefulness for understanding legacy code and identifying potential refactoring targets, while others questioned its value beyond curiosity. Alternatives like git-quick-stats and commercial tools like CodeScene were also mentioned for broader code history analysis, including visualizing code churn and developer contributions over time. The potential pitfalls of relying solely on line age were also brought up, emphasizing the importance of considering code quality and functionality regardless of its age.
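For reference, git log -L accepts either a line range or a function name within a file (the names here are placeholders):

    git log -L 15,30:src/main.c        # history of lines 15-30 of the file
    git log -L :parse_args:src/main.c  # history of the parse_args function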
Git's autocorrect, specifically the help.autocorrect setting, can be frustratingly quick, correcting commands before users finish typing. This blog post explores the speed of this feature, demonstrating that even with deliberately slow, hunt-and-peck typing, Git often corrects commands before a human could realistically finish inputting them. The author argues that this aggressive correction behavior disrupts workflow and can lead to unintended actions, especially for complex or unfamiliar commands. They propose increasing the default autocorrection delay from 50ms to a more human-friendly value, suggesting 200ms as a reasonable starting point to allow users more time to complete their input. This would improve the user experience by striking a better balance between helpful correction and premature interruption.
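For reference, the setting's value is in tenths of a second, with 0 meaning suggest-only (the prompt value exists only in newer Git versions):

    git config --global help.autocorrect 20      # wait 2 seconds before running the guess
    git config --global help.autocorrect prompt  # ask before running (newer Git)
    git config --global help.autocorrect 0       # only suggest, never auto-run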
HN commenters largely discussed the annoyance of Git's aggressive autocorrect, particularly git push becoming git pull, leading to unintended overwrites of local changes. Some suggested the speed of the correction is disorienting, making it hard to interrupt, even for experienced users. Several proposed solutions were mentioned, including increasing the correction delay, disabling autocorrect for certain commands, or using aliases entirely. The behavior of git help was also brought up, with some arguing its prompt should be less aggressive as typos are common when searching documentation. A few questioned the blog post's F1 analogy, finding it weak, and others pointed out alternative shell configurations like zsh and fish, which offer improved autocorrection experiences. There was also a thread discussing the implementation of the autocorrection feature itself, suggesting improvements based on Levenshtein distance and context.
/etc/glob was an early Unix helper program (predating regular expressions in the shell) that performed filename pattern matching on the shell's behalf, simplifying command-line operations. When a command line contained globbing characters like * and ?, the shell invoked /etc/glob, which expanded the patterns against matching filenames and then ran the command with the expanded argument list. While conceptually powerful, /etc/glob suffered from limited wildcard support and was eventually superseded by more powerful and flexible mechanisms like globbing built directly into the shell and regular expressions. Its existence offers a glimpse into the evolution of filename pattern matching and Unix's pursuit of concise yet powerful user interfaces.
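A schematic illustration of that division of labor (not literal early-Unix source):

    $ rm *.o
    # Seeing the '*', the early shell did not expand it; it effectively ran
    #   /etc/glob rm *.o
    # and glob matched *.o against the directory, then executed: rm a.o b.o c.o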
HN commenters discuss the blog post's exploration of /etc/glob in early Unix. Several highlight the post's clarification of the mechanism's purpose: filename expansion was not yet built into the shell itself, which instead delegated the work to this external program. Some commenters share anecdotes about encountering this archaic feature, while others express fascination with this historical curiosity and the evolution of Unix. The overall sentiment is appreciation for the post's shedding light on a forgotten piece of Unix history and prompting reflection on how modern systems have evolved. Some debate the actual impact and usage prevalence of /etc/glob, with some suggesting it was likely rarely used even in early Unix.
Summary of Comments (63): https://news.ycombinator.com/item?id=43631672
HN commenters largely agree with the author's premise that man pages are a valuable resource, but the tools for accessing them are often clunky. Several commenters point to the difficulty of navigating long man pages, especially on mobile devices or when searching for specific flags or options. Suggestions for improvement include better search functionality within man pages, more concise summaries at the beginning, and alternative formatting like collapsible sections. tldr and cheat are frequently mentioned as useful alternatives for quick reference. Some disagree, arguing that man pages' inherent structure, while sometimes verbose, makes them comprehensive and adaptable to different output formats. Others suggest the problem lies with discoverability, and tools like apropos should be highlighted more. A few commenters even advocate for generating man pages automatically from source code docstrings.

The Hacker News post titled "Man pages are great, man readers are the problem" (linking to an article of the same name) generated a substantial discussion with diverse opinions. Several commenters agreed with the article's premise, citing the density and comprehensive nature of man pages as strengths, and pointing to the awkwardness of man's default pager, less, as a major usability hurdle. They suggest alternative pagers like bat or incorporating search functionalities within the pager as improvements. One commenter specifically praised the use of tldr pages for quicker access to common usage examples, while acknowledging man pages as the ultimate source of truth. Another commenter noted how valuable the full technical specifications and corner cases documented in man pages are, even if they are not needed for everyday usage. The verbosity and occasional outdatedness of man pages were mentioned as minor drawbacks, though not significant enough to detract from their overall value.

Some commenters argued against the article's premise. They expressed frustration with the structure of man pages, finding the information organization illogical and difficult to navigate, even with improved pagers. They criticized the lack of consistency across different man pages, making it challenging to predict where specific information might be located. These commenters often suggested alternative documentation formats like web pages or dedicated documentation sites, which they perceived as being more user-friendly. One commenter pointed out that the author's preferred approach using man -Tpdf and a PDF viewer was a workaround rather than a solution to the underlying usability issues with man.

A few commenters took a more nuanced approach, acknowledging the strengths of man pages while also recognizing their shortcomings. They proposed improvements such as better indexing and search capabilities, more consistent formatting, and perhaps even incorporating some of the strengths of alternative documentation styles into man pages themselves. One commenter highlighted the importance of context and how man pages, being primarily designed for command-line use, fit well within that specific context. They also pointed to the benefit of man pages being readily available offline, a crucial advantage in certain situations. There was also some discussion about the learning curve associated with using man pages effectively, with some users appreciating the challenge while others found it unnecessarily steep.
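For readers unfamiliar with the workaround mentioned above, a minimal sketch (whether -Tpdf is supported depends on the man implementation and its groff backend; the viewer command is illustrative):

    man -Tpdf ls > ls.pdf && xdg-open ls.pdf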
Finally, there were a few tangential comments, including one about the history of Unix documentation and the cultural significance of man pages. Another commenter questioned the value of man pages in the modern software development landscape, arguing that many modern tools and libraries often lack adequate man page documentation. Overall, the comment section reflects a wide range of opinions on the utility and usability of man pages, with a general agreement that improvements are needed but disagreement on the best approach to achieve them.