Ribbonfarm A Blog by Venkatesh Rao www.ribbonfarm.com Angkorwatification · Premium Mediocre · Domestic Cozy
Domestic Cozy An Article from Ribbonfarm by Venkatesh Rao www.ribbonfarm.com Millennials and Gen Z · A squeezable nugget of comfort · Premium Mediocre vs. Domestic Cozy · A Brief History of the Digital Garden
Fermi Estimates and Dyson Designs An Article by Venkatesh Rao www.ribbonfarm.com A Fermi estimate is a quick-and-dirty solution to an arbitrary scientific or engineering analysis problem. Fermi estimation uses widely known numbers, readily observable phenomenology, basic physics equations, and a bunch of approximation techniques to arrive at rough answers that tend to be correct within an order of magnitude or so. The term is named for Enrico Fermi, who was famously good at this sort of thing. …It struck me that there is a counterpart to this kind of thinking on the synthesis side, where you use similar techniques to arrive at a very rough design for a complex engineered artifact. I call such a design approach Dyson design, after the physicist Freeman Dyson, who was one of the best practitioners of it (not to be confused with inventor James Dyson, whose designs, ironically, are not Dyson designs). design · physics
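The excerpt describes the technique only in the abstract, so here is a minimal sketch (not from the article) of the textbook piano-tuner estimate usually attributed to Fermi. Every input below is an assumed round number; the point is only that multiplying plausible factors tends to land within an order of magnitude of the real answer.

```python
# A minimal sketch of a classic Fermi estimate: piano tuners in Chicago.
# All inputs are rough, assumed numbers, not measured facts.

population = 3_000_000        # assumed population of Chicago
people_per_household = 2.5    # assumed average household size
households_with_piano = 1/20  # assumed fraction of households owning a piano
tunings_per_year = 1          # assumed tunings per piano per year
tunings_per_day = 4           # assumed jobs a tuner completes daily
working_days = 250            # assumed working days per year

pianos = population / people_per_household * households_with_piano
demand = pianos * tunings_per_year            # tunings needed per year
capacity = tunings_per_day * working_days     # tunings one tuner supplies per year
print(f"Estimated piano tuners: {demand / capacity:.0f}")  # ~60
```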
One Tenth of a Second An Article by Venkatesh Rao studio.ribbonfarm.com The details are fascinating, but the central argument — that the birth of modernity can be traced to a meta-crisis spawned by the 0.1s problem — is worth understanding and appreciating whether or not you’re a time nerd like me. There is no convenient leitmotif, comparable to the 0.1s problem, for our contemporary version of the rhyming conditions, but something very similar to the “tenth of a second crisis” is going on today. I suspect our Great Weirding too involves some sort of limiting factor on human cognition that we haven’t yet properly wrapped our minds around. It isn’t reaction time, but something analogous. time · analogy · progress · cognition
Mediocratopia An Article by Venkatesh Rao www.ribbonfarm.com I once read a good definition of aptitude. Aptitude is how long it takes you to learn something. The idea is that everybody can learn anything, but if it takes you 200 years, you essentially have no aptitude for it. Useful aptitudes are in the <10 years range. Leveling up aptitude · You need to make the step forward · skill
Premium Mediocre An Article from Ribbonfarm by Venkatesh Rao www.ribbonfarm.com Cupcakes and froyo · Maya Millennial · What premium mediocre is not · society · culture
The small web is beautiful An Essay by Ben Hoyt benhoyt.com I believe that small websites are compelling aesthetically, but are also important to help us resist selling our souls to large tech companies. In this essay I present a vision for the “small web” as well as the small software and architectures that power it. Why aim small? · Features and complexity · Solving the problem of software bloat · Raw size isn't enough · Rediscovering the Small Web · www · microsites
Why aim small? Why aim small in this era of fast computers with plenty of RAM? A number of reasons, but the ones that are most important to me are:
- Fewer moving parts. It’s easier to create more robust systems and to fix things when they do go wrong.
- Small software is faster. Fewer bits to download and clog your computer’s memory.
- Reduced power consumption. This is important on a “save the planet” scale, but also on the very local scale of increasing the battery life of your phone and laptop.
- The light, frugal aesthetic. That’s personal, I know, but as you’ll see, I’m not alone.
performance · systems · conservation
Features and complexity Niklaus Wirth of Pascal fame wrote a famous paper in 1995 called A Plea for Lean Software. His take is that “a primary cause for the complexity is that software vendors uncritically adopt almost any feature that users want”, and “when a system’s power is measured by the number of its features, quantity becomes more important than quality”. A Plea for Lean Software · Speed is a feature · Requirements proliferation · features · complexity
Solving the problem of software bloat But instead of just complaining, how do we actually solve this problem? Concretely, I think we need to start doing the following:
- Care about size: this sounds obvious, but things only change when people think they’re important.
- Measure: both your executable’s size, and your program’s memory usage. You may want to measure over time, and make it a blocking issue if the measurements grow more than x% in a release. Or you could hold a memory-reduction sprint every so often. (A rough sketch of such a check follows this list.)
- Language: choose a language that has a chance.
- Remove: cut down your feature set. Aim for a small number of high-quality features. My car can’t fly or float, and that’s okay – it drives well.
- Say no to new features: unless they really fit your philosophy, or add more than they cost over the lifetime of your project.
- Dependencies: understand the size and complexity of each dependency you pull in. Use only built-in libraries if you can.
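As an illustration of the “measure, and make it a blocking issue” point, here is a minimal sketch of a size-regression check that could run in a build pipeline. The binary path, baseline file name, and 5% threshold are all assumptions for the example, not anything the essay prescribes.

```python
#!/usr/bin/env python3
"""Fail the build if the executable grows more than a chosen percentage.

A rough sketch of the "measure and make it a blocking issue" idea.
The paths and threshold below are hypothetical placeholders.
"""
import json
import os
import sys

BINARY = "build/myapp"           # hypothetical build artifact
BASELINE = "size-baseline.json"  # size recorded from the last accepted release
MAX_GROWTH = 0.05                # block the release if size grows more than 5%

def main() -> int:
    size = os.path.getsize(BINARY)
    if not os.path.exists(BASELINE):
        # First run: record the current size as the baseline.
        with open(BASELINE, "w") as f:
            json.dump({"size": size}, f)
        print(f"Recorded baseline: {size} bytes")
        return 0

    with open(BASELINE) as f:
        baseline = json.load(f)["size"]

    growth = (size - baseline) / baseline
    print(f"Binary size: {size} bytes ({growth:+.1%} vs baseline of {baseline})")
    if growth > MAX_GROWTH:
        print("Size regression exceeds threshold -- treating as a blocking issue.")
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

The same pattern works for memory measurements: record a baseline, compare each release against it, and fail loudly when growth crosses the agreed threshold.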
Raw size isn't enough A few months ago there was a sequence of posts to Hacker News about various “clubs” you could post your small website on: the 1MB Club, 512KB Club, 250KB Club, and even the 10KB Club. I think those are a fun indicator of renewed interest in minimalism, but I will say that raw size isn’t enough – a 2KB site with no real content isn’t much good, and a page with 512KB of very slow JavaScript is worse than a snappy site with 4MB of well-chosen images. ...[Instead, it's about] an “ethos of small”. It’s caring about the users of your site: that your pages download fast, are easy to read, have interesting content, and don’t load scads of JavaScript for Google or Facebook’s trackers. minimalism · content · size