I used to have a wonderful toaster. It was just the best thing since sliced bread.
It was simple. It always worked. It was efficient and fast, and got the job done.
Every day, for many years, it made my life better.
Then one day, the toaster updated itself. I went to make my breakfast and found that buttons had moved and it looked a bit different. Before I could use it, I needed to accept some new privacy agreements and figure out how to adjust the settings.
But after a few weeks I grew to love the new interface.
It was my same toaster, just looked a bit fresher, and had a few extra bells and whistles I really didn't need.
At the core, it was still making my life better, every day.
Fast forward a year.
One morning, another update landed, and this one was major. I had to stand there, raw bread in hand, clicking things and feeling like an idiot for 5 minutes while my toaster updated, before I could use it.
When the update finished, my toaster now looked more like a microwave with lots of new buttons and settings and options. Everything was different, even the toast slots had moved!
Another 5 minutes to work it out. The bread in my hand was already going stale.
Finally, I found the "toast bread" option and proceeded with my morning routine. I wasn't happy, but... it still worked.
I could still toast my bread.
It baffled me why all these changes were being made, and how I was supposed to be benefiting from them, but hey... I was still getting my toast.
So I adapted, and continued my daily morning routine, with just a tiny bit more friction around getting my toast made. All these new features seemed to be slowing my toaster down and complicating the basic toast-my-bread process, but it was still usable.
And on the whole, it still made my life better, every day.
Until this morning, October 4th, 2021, when something new happened.
My toaster had apparently updated again, and now it was talking to me. I walked into my kitchen, and suddenly pop-ups started appearing. I didn't ask for that. I wasn't even near my toaster. I wasn't even interested in making breakfast yet.
But for some reason, my toaster suddenly demanded my attention.
I felt like a chihuahua had moved into my kitchen overnight.
It was at that moment, that I realized that the friction my toaster was generating in my life had exceeded the value it was providing.
It was no longer making my life better.
I reflected on this as I calmly sipped my coffee, walked into my garden and headed for the shed...
Where I keep my sledgehammer.
Eh. I'd been meaning to try keto anyway.
Our World is Full of Toasters
Obviously, this is not a tale about toasters.
I use a lot of tools in my work. Dozens or even hundreds of software tools, libraries, applications. Not to mention the computers, monitors, keyboards, tablets and mice that are all part of making that work.
I've become all too aware lately of a disturbing trend: increasing friction in the tech tools I'm using.
Tools are meant to make life easier, to make you more productive, and to reduce friction in your life. All of that breaks down when those tools introduce interference and distraction into your work environment.
Here are the most common types of interference I'm seeing...
Constant updates
Oh. My. God.
I'm afraid to reboot my PC simply for the number of updates I'll have to wade through.
These are often much too frequent, and they require interaction with each application separately.
The notifications that "there is an update available" are often launched as an in-your-face modal dialog on startup for each application. And these aren't for critical security issues or bug fixes; they're for new features that I don't have time to explore right now.
For non-critical updates, it seems far more logical to ask at application shutdown, as a "would you like to update silently after closing this app?"
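The triage being suggested here is simple enough to sketch. This is a minimal illustration, not any real updater's API; the `Update` record and `prompt_moment` function are hypothetical names:

```python
# Sketch: only critical updates earn an interruption at launch;
# everything else waits until the app is closing anyway.
from dataclasses import dataclass

@dataclass
class Update:
    version: str
    critical: bool  # security fix or major bug?

def prompt_moment(update: Update) -> str:
    """Return when the user should be asked about this update."""
    return "at_launch" if update.critical else "at_shutdown"

print(prompt_moment(Update("2.1.0", critical=True)))   # at_launch
print(prompt_moment(Update("2.2.0", critical=False)))  # at_shutdown
```

The point isn't the code, it's the default: the quiet path should be the common one, and the interruption the rare exception.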
Less friction please...
Overly complex security
Yes, I want my data safe & secure... but not from me.
I work from my home computer 99% of the time, which is locked safely inside my house.
Does Google really need me to re-authenticate myself every 48 hours, on every account that I use? Do I really need to confirm my phone number & email monthly, when it hasn't changed in the last 10 years?
Yes, it's still me, Google. And if someone were to break into my house and try to mess with my Google account, you'd assume it was me and let them right in.
So what are we accomplishing here?
Too many alerts and configurations, everywhere
When I am researching and browsing the web, do I really need to go through a cookie consent and configuration process on every website I visit?
Choice is great- I like having options. I like food too, but don't stuff it down my throat. Let me eat when I'm damn well hungry.
There must be a better way.
Unhelpful features & complexity
Just like humans, software companies don't always deal with success well. They usually take that success and hire more developers, who are bored and looking for things to do.
They add widgets and features and add-ons to the product that... don't really make it any better. Often, those features just fall under the category of "bloat." Applications become larger, slower, and more complex.
Ultimately, basic functions are impacted, and companies destroy the very thing that made their product popular to begin with.
Why? If you have the resources and want to increase your market share- great, go build some wonderful awesome apps.
But don't touch my @!%#$ toaster.
This problem is broad; it's practically a megatrend.
I'm seeing this problem almost everywhere, but here are some of the worst offenders, which will serve to illustrate my point.
Evernote
The toaster analogy was inspired by Evernote, when I was greeted this morning with a large desktop pop-up. For absolutely no reason.
For the past year I've watched Evernote change and bloat continually, warping it away from the clean, fast, simple tool that I once loved.
That love has faded.
It is no longer the nimble athlete that it once was. It has gained 100kg and just can't compete anymore on the sporting field. Especially in the pole-vaulting category, which it used to dominate.
For the past year, I've struggled with the periodic UI changes, including new bugs that would occasionally make it inaccessible on my Android phone. I call it the white screen of death.
Now I find the phone version useless. Even if it does let me open the app, the slowness ( on my fast, new, expensive Android phone ) makes it too painful. It's about 30 seconds from when I have the thought "I need to record this" to the point when I can actually write my note.
I could have tattooed it on my arm faster.
On my desktop, I sort of tolerate Evernote now, using it as little as possible since I have other apps that do the job more quickly or better.
It's important to share that I'm on the paid plan for Evernote. These aren't advertising pop-ups, or encouragements to upgrade. I see those as fair play. If you give someone a product for free, you have the right to nagware them a bit.
But I'm paying.
And I am tired of paying to be annoyed.
Apple
I ditched Apple iPhones a few years ago because of the typical battery issues, and the increasing sluggishness of the device over time.
It was far too difficult to get files on to and off of my phone. I found the entire iTunes setup horrific, and difficult to use. Far worse, it consumed gigabytes of space on my system drive, and it cost me hours trying to free space so my PC could function.
In the end, there was just too much friction, and it increased until I threw in the towel and switched to Android.
I immediately wished I'd made that switch sooner.
Suddenly things were working again, like they were supposed to. My phone was fast, and configurable the way I wanted. I could copy files to and from it with ease.
Brilliant, and frictionless.
But lately, the friction is increasing. My Android phone now wants an OS update every couple of months and when that happens, it greets me with a full-screen "update now" message every time I use it.
The problem is, I get that message every time I pick up my phone, and the reason I picked it up is that I need to use it now.
I can't wait for an update.
This, this moment right now, would be the worst possible time for an update.
So I dismiss the screen, and do what's needed. Once I'm done, I could actually afford the downtime of an update. But that screen is gone. I glance at the notifications area, and the settings menu... nothing obvious. Where is the option to do the OS update right now? It's buried somewhere... I'd need to do some Googling to find it.
A better way would be a less obtrusive message, and one which allows me to quickly approve and schedule a time for the update, e.g. tonight at midnight. That could work.
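The "schedule it for tonight" idea reduces to a tiny bit of arithmetic: compute the delay until the chosen quiet hour and defer the update by that amount. A minimal sketch (the function name is mine, not any platform API):

```python
# Sketch: approve an update now, but run it at a quiet hour,
# e.g. tonight at midnight.
from datetime import datetime, timedelta

def seconds_until_midnight(now: datetime) -> float:
    """Seconds from `now` until the next midnight."""
    next_midnight = (now + timedelta(days=1)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return (next_midnight - now).total_seconds()

# 21:30 on update day -> defer the update by 2.5 hours
delay = seconds_until_midnight(datetime(2021, 10, 4, 21, 30))
print(delay)  # 9000.0
```

One unobtrusive banner, one tap to approve, and the downtime lands when nobody cares about it.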
Microsoft
I'm a big fan of Microsoft.
For my money, Microsoft makes some of the best development tools and office-productivity tools in the world, and that includes the Windows operating system.
For me, it offers the right balance of ease-of-use, and customizability, for all of the different types of work I do.
But... Windows Update. Need I say more?
For me, Windows Update is like a big nose wart on the Mona Lisa. It's just too intrusive on every level, and I've invested a fair bit of time learning how to disable the Windows Update pop-up alerts and, most importantly, its "automatic reboot" feature, which has destroyed unsaved work.
Sure, a power outage could have done the same thing... but that's why I have a UPS.
You don't expect your OS to be a threat to your work, and when it violates that trust, it's hard to rebuild.
I hope Windows 11 does better.
A Problem of Epic Proportions
Yes, I'm ranting a bit, but what I'm sharing here is a big deal.
It's what happens when consumer-culture advocates encounter the 100x multiplier effect of the Internet.
- Pop-ups, everywhere
- Upgrades, daily. When was the last time you rebooted your computer and didn't have an update to do?
- Instant obsolescence.
History is Our Teacher
Here's why this matters so much.
The Tao of Cost-Benefit
Humans are a stingy, lazy lot - it's part of our evolved survival strategy. We don't like to burn time, money, and energy on things unnecessarily.
All of our decision making is based on a cost-benefit analysis.
When we invest time, energy, or money in something, we accept that cost because we expect some form of benefit. It might be an immediate reward like pleasure, or it might be a long-term reward like a future advantage we gain.
It might be a tangible reward, like when we buy a car, or it might be an intangible reward... perhaps social approval, or the warm, positive feeling of living by your core values.
For a long time, cost-benefit was a relatively simple calculation, with simple rules. It was largely intuitive- a simple assessment of the pain and pleasure of your current situation compared with the pain & pleasure of an imagined future situation.
When you ask yourself "Should I buy this car?" you are comparing how you feel about two things-
- Your current situation
- Your future imagined situation, plus that car, and minus $30k
You'll naturally choose the option which promises less pain.
This cost-benefit analysis applies to all situations where you're considering a change. Your car. Your home. Your job. Even your relationship.
The Natural Motivation to Upgrade
In the past, when you bought a tool like a good kitchen knife, that knife functioned predictably well long term.
Wear tends to be low, and sharpening it is easy... so unless it breaks or you lose it, you will probably never need another kitchen knife. Your kids might even inherit it.
As long as that knife kept working, it was very unlikely you'd get a new kitchen-cutting tool unless it presented a substantial cost-benefit improvement that your knife just couldn't.
Food processors achieved this. So did blenders, and blending sticks. You might even own more than one.
This kind of clear cost-benefit improvement is the natural motivator that drives technology adoption.
Notable examples of this in history include...
- The telegraph replacing postal mail
- The phone network replacing the telegraph
- The internet replacing the phone network
- The car replacing the horse
- Indoor plumbing replacing the outhouse and the chamber pot
- Central heating replacing the fireplace
- Electric light bulbs replacing candles
- Coin money replacing merchandise trades
- Paper money replacing coin money
- Checks replacing paper money
- Credit cards replacing checks
- Contactless credit cards replacing credit cards
- Cryptocurrency... who knows eh?
In every one of those examples there was an evolution. The new replaced the old because it offered a better cost-benefit profile. Usually, the obsolete technology faded from use until it simply disappeared.
Survival of the fittest.
The Un-Natural Motivation to Upgrade
In my lifetime, I've watched all of this go rather sideways.
In the pursuit of money, at some point the tactics of companies shifted...
- from building the best product possible- that everyone wanted, and no one else could compete with...
- to maximizing market expansion, through marketing, by buying the competition, and expanding overseas...
- to maximizing profit margins, by decreasing materials, labor, and overhead costs...
- to maximizing repeat purchase, by the existing customer base.
And this is where the un-natural motivation to upgrade comes in.
You don't need a new phone, but Apple will do everything in its power to convince you that you do, every 2 years, at $1000 a pop, or more.
How do they do that?
- Marketing, obviously. TV ads, Super Bowl commercials, product placements in your favorite TV shows, the whole bit.
"In 2015, Apple increased its ad spend by an enormous 50 percent for a record $1.8 billion. Then, in 2016—we don’t know. The company chose not to disclose its ad spend in its 2016 annual report." [ Business Insider ]
- Creating cool new features that you don't need, but that you feel less than without. Sure would be nice to have 1000x zoom on my camera, even if I never use it.
- Planned obsolescence- building the phone with components, batteries, screens, etc. that are designed to last only a few years.
- Halting support for your "old, obsolete phone" as soon as they're able.
- Actually slowing down your old phone progressively, until it becomes unusable.
That's every money-making tactic I can think of. I'm sure I've missed some.
Does it work? Yes it does.
Apple revenue for the twelve months ending June 30, 2021 was $347.155B, a 26.77% increase year-over-year. Apple’s annual revenue has quadrupled in the last ten years.
For those of you that love Apple, don't be dismayed. Apple does make great products. I've highlighted them here because they're a perfect example to illustrate my points - but they are just one example. This pattern, and this approach to business is everywhere.
I like to call it... milking the cow.
Here's the thing. Unless a new Apple division starts poking holes in condoms, we're not going to see a major population growth. You're not going to get a billion new consumers tomorrow.
So how can Apple sustain this growth?
By selling more to YOU. More stuff, at higher prices, more frequently.
There's a Better Way
To any software developers or development companies reading this: I hope you'll take this to heart, and shift your perspective.
Success can't be measured by the number of features you've packed into a product, or by how many updates you release in a month.
It's measured by how much you tangibly improve the lives of your users, and how much friction you take away.
5 Rules to Code By
Here's how I see it...
#1 - Every feature is bloat to someone who doesn't need it
That's such an important perspective, that it bears repeating.
Every feature is bloat to someone who doesn't need it.
Whenever you're adding a feature, ask yourself "does everyone really need this, or should it be an optional add-on module?"
Quite often, it should be a separate product entirely.
The Rule: Never shove new features on users who didn't request them. Give users the option to reject changes they don't need. Consider this at the individual level. And by all means, track who is using which features in your product. You'll learn a lot about your users.
#2 - Updates are good, sometimes
Sometimes a software update is warranted. Major bug fixes and security risks should absolutely go to the front of the queue, and in those hopefully super-rare cases ( is your QA team doing its job? ) it's justified to alert the user the moment the application is launched.
But think of it as blocking the on-ramp to the motorway, during rush hour. You'd better have a very good reason for making everyone late to work. Like, you're saving lives.
The Rule: Make smooth traffic flow your top priority. The update process itself is a major impediment to that. Be very sparing with updates, and strategically position them for lowest impact. Suggesting updates at application launch is the highest impact, so reserve that for critical situations only.
#3 - UI changes are expensive
And they are more so to your users, than to you.
Even brilliantly-crafted, well-designed UI changes will force your users into a new learning curve, while they adapt and change their habits.
How would you feel if the turn signal on your car kept moving to a new location on the dashboard? You'd probably not be that appreciative, and neither would your rear bumper.
Most companies are wising up to the reality of these impacts and to change resistance, and they offer a "preview" of a new UI long before it's forced on users.
This is a good compromise. If you think you can make things more efficient, great- but give your users time to adapt, and the ability to choose when they have the time to tackle that learning curve.
Right now, at the moment they launch your app, they're looking to complete a specific task, so suddenly changing the game on them will not be appreciated.
The Rule: Minimize UI changes. Give users the ability to preview them, when they're ready and have time. Consider offering a "keep the old UI" as a long-term option.
#4 - Security is the enemy of usability
We're getting better at this, gradually, so I have hope.
Technology is improving, and I imagine that it won't be long before my smartwatch will know it's being worn by me, and when I'm wearing it near my keyboard, it will make for a pretty reliable form of authentication.
Someday, logins might even go away, with retinal scanning or facial recognition, or fingerprint detection on my mouse and keyboard.
It will all get better. I hope.
But right now, it feels like the friction is increasing.
More frequent logins, with more complex password requirements. Two-factor authentication which means I can't do anything without my phone.
Let's collectively step back and rethink what we're trying to accomplish here.
The Rule: Be kind to your users. Don't demand overly complex security unless what you're protecting warrants it. Bank accounts, medical histories, private diaries, sure. Social media accounts, ok yes, makes sense. But not everyone is storing the nuclear launch codes to a user's personal life. Does a recipe database really warrant two-factor authentication?
#5 - Frictionlessness, safety & usability are everyone's responsibility
The designers of a road and its traffic signals share just as much responsibility for the safety and comfort of its users as the car manufacturers do. But road designers usually get less attention, because it's harder to sue a government.
On the web, we're seeing the same dynamic in the so-called "cookie consent" screens popping up everywhere. Someone decided that cookies were dangerous and that users must be presented with the option to reject them.
This has led to a trend of overly aggressive compliance notices that require extra clicks and add all kinds of UX friction to websites worldwide.
I get that there are no standards yet... but wouldn't this make more sense to handle in the browser itself?
Should I, as a user, be able to configure my privacy settings in my web browser...
- YES / NO - I'm OK with being tracked by advertising agencies
- YES / NO - I'm OK with being tracked by analytics agencies
- YES / NO - I'm OK with a website remembering my preferences
- YES / NO - I'm OK with a website remembering my past visits & history
And choose to adjust those settings for any website I wish?
Or possibly even distinguish and "classify" cookies as security cookies, advertising cookies, analytics cookies, user-preferences cookies, and then the browser itself could determine which ones get saved, and which don't.
Any of these approaches would allow for a far smoother user experience, cut development costs, and improve legal compliance, all in one move.
Seems like the way to go.
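The classify-in-the-browser idea can be sketched in a few lines. This is purely illustrative: real cookies don't carry a standard "category" attribute today, so the labelling scheme and all names below are assumptions:

```python
# Sketch: the user states their privacy preferences once, per category,
# and the browser applies them to every site's cookies.
user_prefs = {
    "security": True,      # session / login cookies
    "preferences": True,   # site settings
    "analytics": False,
    "advertising": False,
}

def should_store(cookie: dict) -> bool:
    """Store a cookie only if its declared category is allowed.
    Unlabelled cookies are treated as advertising, the strictest case."""
    return user_prefs.get(cookie.get("category", "advertising"), False)

cookies = [
    {"name": "session_id", "category": "security"},
    {"name": "_ga", "category": "analytics"},
    {"name": "ad_id", "category": "advertising"},
]
accepted = [c["name"] for c in cookies if should_store(c)]
print(accepted)  # ['session_id']
```

Set once in the browser, enforced everywhere: no per-site banner needed.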
The Rule: If you're anywhere in the chain of delivery of a product or service, understand how you contribute to the overall UX, particularly when a new problem arises such as cookie-compliance laws, or GDPR, or COPPA. Yes, a hamburger company is responsible for the health & safety of its customers, but so is the grocery store, and the refrigerated truck that transported them. Understand your part, the problems your end-users face, and how you can best contribute to the overall solutions.
In general, if a problem is being experienced market-wide, the best solution is usually found higher in the chain. Website owners probably shouldn't be trying to fix global cookie-compliance problems, when browser manufacturers can do it much more efficiently.
I hope that gave you some good things to ponder.
Good luck, and code well.
Measuring Friction
A big part of solving this involves increasing our awareness of the problem, and our visibility of it. Ultimately, this means a new metric in product evaluation that I describe as friction.
Friction is a measurement of cost-benefit, combining these factors:
- Initial cost
- Ongoing cost
- Benefit it provides now ( especially measurable benefits, like the amount of time I save, or the amount of money I gain )
- Future benefit it provides
Everything can be measured against a kind of quadrant grid, with cost to the left, benefit to the right, "now" as the baseline, and "future" along the Y axis.
You can even evaluate things like "should I get a gym membership?"
In assessing cost and benefit, it's important to assess more than money. Both cost and benefit involve money, time, energy, and probably even emotional components.
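As a toy illustration, the four factors above can be collapsed into a single score. The weights here are deliberately naive; a real evaluation would first convert time, energy, and emotional cost into comparable units:

```python
# Toy friction score: positive means the costs outweigh the benefits.
def friction_score(initial_cost: float, ongoing_cost: float,
                   benefit_now: float, benefit_future: float) -> float:
    """Net friction = total cost minus total benefit."""
    return (initial_cost + ongoing_cost) - (benefit_now + benefit_future)

# "Should I get a gym membership?" with made-up dollar-equivalent values:
# $500 joining fee + $600/yr, vs. benefits valued at $300 now, $1200 later.
print(friction_score(500, 600, 300, 1200))  # -400, i.e. net benefit
```

A tool whose score drifts positive over successive updates is, quite literally, a toaster meeting a sledgehammer.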
UI v. UX - What's the difference?
In case you're unfamiliar with the difference, UI and UX refer to distinct things:
UI is a part of UX.
UI is just the physical shape of an interface: the screens, controls, and visuals a user interacts with. UX is the whole experience of using the product, of which the UI is only one part.
What is "Friction"?
I quite like this phrasing...
"Friction, in the UX world, is generally something to be avoided. We are taught to make things easier, more effortless, and smoother for users. Friction is what happens when users are impeded in the tasks they are trying to perform."
What is Change Resistance?
Just adding this because I thought it's interesting. It's difficult to find a specific reference to the psychology of change resistance, but I encountered a related term which is structural inertia.
It refers more to the difficulty people have in accommodating changes in organizational structures, political environments, etc.
Some reading, for my fellow students of human nature...
- Structural Inertia and Organizational Change
- Structural Inertia and Organizational Change Revisited III: The Evolution of Organizational Inertia
- How to Deal With Resistance to Change
- 5 Tips for Managing Resistance to Change
Users are not the Enemy
I can't find the source of the quote "security is the enemy of usability", but it has long been one of my favorite design maxims.
Use the restrictive controls that security demands only when the net benefit to the user is worth it.
Here's another perspective I like...