It’s been 19 years since Pixar released Monsters, Inc. with all that CGI hair. Where are my hairy icons? Ones that get all long and knotted as the notification count goes up.
Why can’t I feel my phone? I found that paper from 2010 (when I was complaining about keyboards) about using precision electrostatics to make artificial textures on touchscreens.
I should be able to run my thumb over my phone while it’s in my pocket and feel bumps for apps that want my attention. Touching an active element should feel rough. A scrollbar should *slip*. Imagine the accessibility gains. But honestly I don’t even care if it’s useful: 1.5 billion smartphone screens are manufactured every year. For that number, I expect bells. I expect whistles.
Like a programming language interpreter, GPT-3 translates the designer’s intent from a language they’re already familiar with (English) to one they need to learn (Figma’s information architecture, as manifested in its UI). This can be easier for a new/busy designer, much like Python is easier and faster to work with than assembly language.
But that’s not “designing” — at least not any more than compiling Python code is “programming.” In both cases, all the system does is translate human intent into a lower level of abstraction. Sure, the process saves time — but the key is getting the intent part right. I’ll be convinced the system is “designing” when it can produce a meaningful output to a directive like “change the product page’s layout to increase conversions.”
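The Python half of that analogy is concrete, for what it’s worth: CPython really does compile your high-level intent down to a lower abstraction (bytecode), and you can watch the translation happen with the standard library’s `dis` module. A minimal sketch:

```python
import dis

# High-level intent: "add two numbers."
def add(a, b):
    return a + b

# The compiler's lower-level translation of that intent.
dis.dis(add)

# The same instructions as data, e.g. ['LOAD_FAST', 'LOAD_FAST', ...]
# (exact opnames vary a little between CPython versions).
ops = [ins.opname for ins in dis.Bytecode(add)]
print(ops)
```

Nothing in that translation step decided *what* to add or *why* — which is the point: the intent stayed entirely on the human side.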