I’ve noticed a recent trend on the web — or at least, on the parts of it I’ve visited. Maybe you’ve noticed it too.
Here’s what happens: you’re on a website, and one of these little prompts pops up to let you know that there’s an app, and that the website you’re on...well, it’s not quite the app, is it?
Sometimes, the website wants me to install the app — no, it needs me to install the app. It’s like a paywall, but for apps. An appwall.
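If you’re curious what’s usually happening under the hood, here’s a rough sketch of how one of these interstitials tends to be built. Everything in it, from the names to the user-agent check, is invented for illustration, not taken from any particular site:

```ts
// A minimal sketch of a typical "appwall" interstitial. All names and copy
// here are hypothetical; this isn't any particular site's implementation.

const APP_STORE_URL = "https://example.com/get-the-app"; // hypothetical link

function isMobileBrowser(): boolean {
  // A crude user-agent sniff, which is often how these prompts decide to fire.
  return /Android|iPhone|iPad/i.test(navigator.userAgent);
}

function showAppwall(): void {
  const overlay = document.createElement("div");
  overlay.setAttribute(
    "style",
    "position:fixed;inset:0;background:#fff;z-index:9999;" +
      "display:flex;flex-direction:column;align-items:center;justify-content:center"
  );
  overlay.innerHTML = `
    <p>This works better in the app.</p>
    <a href="${APP_STORE_URL}">Get the app</a>
    <button id="dismiss-appwall">Continue in browser</button>
  `;
  document.body.append(overlay);
  // The "impassable" variants simply leave the dismiss button (and this handler) out.
  overlay
    .querySelector("#dismiss-appwall")
    ?.addEventListener("click", () => overlay.remove());
}

if (isMobileBrowser()) {
  showAppwall();
}
```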
In recent years, these prompts have gotten more prominent, and occasionally impassable. And I think that trend’s interesting. Why would a company promote a native app over their perfectly usable website?
It feels like a glimpse into that company’s design priorities. And it may be giving us some insight into the business value they place on the open web — a medium that’s meant to be accessible everywhere, on any screen, on any device.
And it really does feel like these glimpses are becoming more common.
The [Lake Erie] ecosystem underwent a series of related changes: an increase in the human population led to higher phosphorus levels in the water, which led, at last, to more algae in the lake. In effect, Lake Erie’s ecosystem was rewritten. Changed by human activities into…something else.
But Franklin cites the study because it’s doing something slightly novel: applying Selye’s principle of stress to ecological systems, suggesting that they, much like humans, are susceptible to external stressors. And I’ve been thinking about that a lot lately, especially this week. Because Franklin’s suggesting the work doesn’t begin with “fixing the system.” Rather, it begins by shifting the priority a little: toward removing whatever stress you can.
In the early days, design systems promised us more consistent interfaces, more collaborative teams, and improved shipping times. While they’ve certainly delivered on some of those fronts, they’ve introduced new challenges too. Let’s talk through what’s working well—and what could be working better—as we take a closer look at the systems between us and our work.
Sometimes there’s a Heuristic That Almost Always Works, like “this technology won’t change everything” or “there won’t be a hurricane tomorrow”.
And sometimes the rare exceptions are so important to spot that we charge experts with the task. But the heuristics are so hard to beat that the experts themselves might be tempted to secretly rely on them, while publicly pretending to use more subtle forms of expertise.
Maybe this is because the experts are stupid and lazy. Or maybe it’s social pressure: failure because you didn’t follow a well-known heuristic that even a rock can get right is more humiliating than failure because you didn’t predict a subtle phenomenon that nobody else predicted either. Or maybe it’s because false positives are more common (albeit less important) than false negatives, and so over any “reasonable” timescale the people who never give false positives look more accurate and get selected for.
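To make that selection effect concrete, here’s a tiny simulation sketch. All the numbers are invented for illustration: a rare event with a 0.1% base rate, a “rock” that always predicts no, and an expert with a genuinely informative but imperfect signal.

```ts
// Illustrative simulation: why "always predict no" can out-score a real expert
// on raw accuracy when the event is rare. All rates below are invented.

const DAYS = 1_000_000;
const BASE_RATE = 0.001;         // the event (say, a hurricane) is rare
const EXPERT_HIT_RATE = 0.9;     // the expert catches 90% of real events...
const EXPERT_FALSE_ALARM = 0.05; // ...but cries wolf 5% of the time

let rockCorrect = 0;
let expertCorrect = 0;

for (let day = 0; day < DAYS; day++) {
  const event = Math.random() < BASE_RATE;

  // The rock always says "no". It never gives a false positive.
  const rockSaysYes = false;

  // The expert has real skill, but pays for it in false alarms.
  const expertSaysYes = event
    ? Math.random() < EXPERT_HIT_RATE
    : Math.random() < EXPERT_FALSE_ALARM;

  if (rockSaysYes === event) rockCorrect++;
  if (expertSaysYes === event) expertCorrect++;
}

console.log(`rock accuracy:   ${(rockCorrect / DAYS).toFixed(4)}`);   // ~0.9990
console.log(`expert accuracy: ${(expertCorrect / DAYS).toFixed(4)}`); // ~0.9500
```

Over any stretch where the rare event never happens, the rock’s score is perfect; the expert’s occasional false alarms are what lose them the accuracy contest, even though they’re the only one who will ever catch the real thing.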