A holograph of itself All [physical properties of matter] derive from the different patterns of the interaction of electrons and photons within the fields of the positively charged atomic nuclei, stabilized in a particular morphology by the interaction of the levels themselves. Matter is a holograph of itself in its own internal radiation. Cyril Stanley Smith, Matter versus Materials: A Historical View physics
Reality just seems to go on crunching I once met a fellow who thought that if you used General Relativity to compute a low-velocity problem, like an artillery shell, General Relativity would give you the wrong answer—not just a slow answer, but an experimentally wrong answer—because at low velocities, artillery shells are governed by Newtonian mechanics, not General Relativity. This is exactly how physics does not work. Reality just seems to go on crunching through General Relativity, even when it only makes a difference at the fourteenth decimal place, which a human would regard as a huge waste of computing power. Physics does it with brute force. No one has ever caught physics simplifying its calculations—or if someone did catch it, the Matrix Lords erased the memory afterward. Eliezer Yudkowsky, Rationality: From AI to Zombies physics
Corpuscles of nothing and atoms of something The structure of matter devolved ultimately into the intimate coexistence of something like corpuscles of nothing and atoms of something, segregating through the accidents of history to yield regions differing in density intimately interwoven on different scales. The experience of the world as well as human perception and analysis of any part of it is a matter of the angular scale of resolution and of the time necessary for making comparison between the different parts. Without such variations and without time to compare remembrances of them, nothing can be experienced. Cyril Stanley Smith, The Tiling Patterns of Sebastien Truchet and the Topology of Structural Hierarchy physics, perception
I know all about entropy Adell: I know as much as you do. Lupov: Then you know everything's got to run down someday. Isaac Asimov, The Last Question time, death, physics
The Iridium System Several Low-Earth-Orbit (LEO) networks were proposed, but only one got off the ground: the Iridium system. The original Iridium proposal called for a "constellation" of 77 satellites, which gave the plan its name: the element iridium has atomic number 77, meaning that an iridium atom has 77 orbiting electrons. Before the satellites were launched, the constellation was scaled back to 66 active satellites, but no one wanted to change the name to Dysprosium (element 66). Brian Hayes, Infrastructure: A Guide to the Industrial Landscape physics, communication, aerospace, cosmos
Fermi Estimates and Dyson Designs An Article by Venkatesh Rao www.ribbonfarm.com A Fermi estimate is a quick-and-dirty solution to an arbitrary scientific or engineering analysis problem. Fermi estimation uses widely known numbers, readily observable phenomenology, basic physics equations, and a bunch of approximation techniques to arrive at rough answers that tend to be correct within an order of magnitude or so. The term is named for Enrico Fermi, who was famously good at this sort of thing. …It struck me that there is a counterpart to this kind of thinking on the synthesis side, where you use similar techniques to arrive at a very rough design for a complex engineered artifact. I call such a design approach Dyson design, after the physicist Freeman Dyson, who was one of the best practitioners of it (not to be confused with inventor James Dyson, whose designs, ironically, are not Dyson designs). design, physics
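The technique Rao describes can be sketched in a few lines. Below is the classic piano-tuners-in-Chicago estimate — an illustrative example, not from the article, and every input is a rough assumption, which is exactly the point:

```python
# A toy Fermi estimate: how many piano tuners work in Chicago?
# All inputs are rough, widely-known-ish numbers, not measured data.
population = 3_000_000            # people in Chicago, to an order of magnitude
people_per_household = 2          # rough average household size
piano_ownership = 1 / 20          # assume 1 in 20 households owns a piano
tunings_per_piano_per_year = 1    # a piano gets tuned about once a year

pianos = population / people_per_household * piano_ownership
tunings_needed = pianos * tunings_per_piano_per_year

tunings_per_tuner_per_year = 1000  # ~4 tunings a day * ~250 working days
tuners = tunings_needed / tunings_per_tuner_per_year

print(round(tuners))  # → 75, i.e. "a few dozen to a hundred or so"
```

The individual assumptions are all wrong in detail, but their errors partly cancel, so the product tends to land within an order of magnitude of the truth — which is the only guarantee a Fermi estimate claims.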
The Fidelity Curve An Article by Ryan Singer m.signalvnoise.com How do we choose which level of fidelity is appropriate for a project? I think about it like this: The purpose of making sketches and mockups before coding is to gain confidence in what we plan to do. I’m trying to remove risk from the decision to build something by somehow “previewing” it in a cheaper form. There’s a trade-off here. The higher the fidelity of the mockup, the more confidence it gives me. But the longer it takes to create that mockup, the more time I’ve wasted on an intermediate step before building the real thing. I like to look at that trade-off economically. Each method reduces risk by letting me preview the outcome at lower fidelity, at the cost of time spent on it. The cost/benefit of each type of mockup is going to vary depending on the fidelity of the simulation and the work involved in building the real thing. prototypes, interfaces
Four levels of fidelity Suppose we have four levels of fidelity…
- Rough sketch (on paper or an iPad)
- Static mock-up (e.g. Photoshop or Sketch)
- Interactive mock-up (e.g. Framer, InVision)
- Working code prototype (HTML/CSS, iOS views)
Depending on the feature you’re working on, these levels of fidelity take different amounts of time to create. If you plot them in terms of time to build versus confidence gained, you could imagine something like a per-feature fidelity curve.
Time to build versus confidence gained Take a simple CRUD web UI, where you’re just navigating between screens. When the design is simple, it doesn’t take much more time to build the real version than it does to mock it up. If you were to build out an interactive mock first, you would end up spending twice as much time in total without gaining much from it. Contrast that with a complicated JavaScript interaction. Or a native iOS feature that requires programmer time to build out. If it takes substantially more time to build the real code version, then it may be smart to do an interactive mockup first.
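Singer's economic framing can be made concrete with a toy calculation. The function and all its numbers below are hypothetical assumptions, not from the article — it just asks whether mockup time plus build time beats the expected cost of building without a preview:

```python
# Toy cost/benefit check for whether a mockup is worth making.
# Assumption: skipping the mockup risks one round of rework,
# and a mockup of sufficient fidelity removes that risk entirely.
def worth_mocking(mock_hours: float, build_hours: float,
                  risk_of_rework: float) -> bool:
    # Expected cost without a mockup: build once, maybe rebuild on a miss.
    cost_no_mock = build_hours * (1 + risk_of_rework)
    # Cost with a mockup: pay for the preview up front, then build once.
    cost_with_mock = mock_hours + build_hours
    return cost_with_mock < cost_no_mock

# Simple CRUD UI: mocking costs nearly as much as building the real thing.
print(worth_mocking(mock_hours=6, build_hours=8, risk_of_rework=0.3))   # → False
# Complex JavaScript interaction: the real build is the expensive part.
print(worth_mocking(mock_hours=6, build_hours=60, risk_of_rework=0.3))  # → True
```

This reproduces the article's conclusion: the same mockup effort is a waste on a cheap-to-build feature and a bargain on an expensive one, so the decision is per-feature, not per-project.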