Temporary simplification
New software projects or products are often introduced that take an existing, successful piece of software and offer similar functionality in a much simpler way.
Sometimes this is a genuine improvement: the new design captures the gist of the requirements in a much more general way. Take git, for example. If you've used certain older version control systems, git seems like a dream come true. That's because git's underlying design cracks the version control problem really neatly. Branching and merging are no longer a sweat. They just work.
But other times a new piece of software only appears simpler than an established alternative because it hasn't yet had time to meet all the requirements. I've noticed this mostly in applications. Newer versions of Apple Mail have had (useful) features deleted. Google appear to be systematically ruining their applications in the search for simplicity. This kind of temporary simplification, where important features are added back over time and the result ends up being no simpler than the original, seems like a bit of a con to me.
I'd be interested in other people's views on this phenomenon. Is temporary simplification a good way of appealing to a new group of users, or is it a failure to make a genuine design improvement?
5 comments:
+1. Not only does this happen with end-user apps, but also with frameworks, languages, etc. The examples are numerous.
This is the illness of our industry. We like to craft in ignorance of our predecessors and of prior research. This is why we'll never move past the stage of craft: "Ignore the past and keep reinventing the wheel".
I think a lot of the pathological side of this phenomenon is driven by changes in how easily products can be adopted, the de facto rules being something like:
1) Any tool or technology that cannot be incrementally adopted will not be adopted.
2) Any successful tool or technology which changes so that it can no longer be incrementally adopted is doomed to be reinvented.
It’s easy to see how this could lead to a cycle of reinvention, as successive products get adoption, then grow more and more sophisticated until they make some change that inadvertently kicks away the ladder that made them accessible to new users, whereupon the cycle begins again.
Of course different people can tolerate different barriers to entry, but once there’s a community that finds a sophisticated product inaccessible, I think it’s vulnerable to reinvention with lossy simplifications.
To me at least, all of this seems quite distinct from simplification through better abstractions, which is highly desirable and well exemplified by Git vs CVS/Subversion or iPhone vs Treo.
@Robert - yes, the incremental adoption angle seems pretty important.
I think that deleting a certain piece of existing functionality is the ultimate test of its usefulness. Time will show whether that was the right or wrong decision.
If it turns out to be wrong, then re-adding the (often redesigned) feature may still do a better job than preserving that feature in the first place.
@Igor - yes, provided the consequences of such a wrong decision are not too catastrophic.