Planned obsolescence has been an engineering approach to design for as long as there
has been mass production. Nothing lasts forever. But how long should something be
expected to last? At its best, the process of obsolescence design (or predicting failure
modes) is the combination of economic and technical considerations. Not only is it a
question of how long something should last, but a question of the cost for increasing
the operating life of a product. Such decisions on mechanical parts, like bearings in an
automobile, determine the useful life expectancy of the product.
For mechanical systems, as design margins and safety factors diminished with
the advent of computer-aided design, the expected useful life of products also
decreased. Washing machines and vacuum cleaners of earlier times often served a
household for 20 years or more. Today’s offerings work well, but have been “unitized”
(designed with nonserviceable components) and designed with slimmer margins.
Result: more rapid obsolescence. Businesses in the old economy (mechanical/industrial)
developed their product obsolescence around failure modes for the product (bearings,
gears, belts, etc.). Businesses in the new economy (electronic/information) develop
their product obsolescence cycle around the antiquation of their products due to
advancing capabilities (speed, features, compatibility, tax law changes, etc.).
Moore’s Law has led to a completely new form of planned obsolescence, something
I call predictable antiquation.
Predictable antiquation estimates when product
abandonment will occur due to technological advance, not due to product failure.
Electronic products quickly fall into disfavor, often before they fail to function.
Antiquation may result from computational speed (PCs), operating system incompatibility
(software products), image resolution (digital cameras), storage capacity
(MP3 players), network incompatibility (modems, cell phones), or system integration
usurpation (personal digital assistants). Frequently, a battery that will no
longer hold a charge is the last straw for a frustrated consumer.
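The arithmetic behind predictable antiquation is simple. As a minimal sketch,
assume capability doubles on a Moore’s Law cadence and that consumers abandon a
product once the state of the art outpaces it by some factor; both numbers below
are illustrative assumptions, not figures from the text:

    import math

    DOUBLING_PERIOD_YEARS = 2.0  # assumed Moore's Law doubling cadence
    ABANDONMENT_FACTOR = 4.0     # assumed: antiquated at a 4x capability gap

    def years_to_antiquation(doubling_period, factor):
        # Years until the state of the art outpaces today's product by `factor`.
        return doubling_period * math.log2(factor)

    print(years_to_antiquation(DOUBLING_PERIOD_YEARS, ABANDONMENT_FACTOR))
    # -> 4.0 (two doublings at a two-year cadence)

Under these assumptions, a product is effectively antiquated in about four
years, even if it never fails.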
Decreasing product life cycles and a continued battle to maintain market position
in the electronic marketplace have been the direct result of the rapid advances in
electronic technology. The emerging and overwhelming phenomenon of product
abandonment, rather than product failure, has given rise to business models that
succeed by significantly surpassing previous product performance capabilities,
inducing a shift in consumer buying patterns. Companies cannibalize themselves
(i.e., they create new and better offerings while they still hold a leadership position
in the marketplace with their previous offering) to feed the consumers’ hunger for
more, faster, better, cheaper. Intel Corp., developer and manufacturer of microprocessors,
has been the leader in pursuing a techonomic business model based on
predictable antiquation.
Gordon Moore was a cofounder of Intel and also the originator of Moore’s Law.
Predictable obsolescence was the business DNA upon which Intel was founded, and it was based upon Moore’s technical observations. Basically, if the company ever
stood still for two years with an existing product, it would be surpassed by a wave
of competitors continually improving their offerings. From the start, Intel organized
their design process, marketing launches, and financial planning around the concept
that they must be the first to innovate the next generation of product.
The technological advances revealed in processor performance gains were
remarkable and positioned Intel as the de facto leader in the microprocessor marketplace.
The financial commitment to research, development, marketing, and fabrication
was justifiable only with the technical understanding of this model in mind.
Before Intel, it would have been considered ludicrous to introduce a product that
would usurp market share from your own product when you already controlled most
of the market. Today, market leaders in consumer electronics have to execute their
product plans in this way simply to remain viable in the marketplace. New features,
more memory, more colorful displays, greater compatibility, etc. are the requirements
of an ever-more-informed consumer. The instant information and procurement paths
offered by the Internet do not let producers hide behind brand name or strong
distribution channels.
Another strategy Intel used for staying on top of predictable antiquation was
research into new applications that would demand greater product performance —
research into how innovators (small companies and academics, for example) were
using Intel products in demanding ways or with new peripheral devices to stretch
the limits of performance. The PC continued to improve as application demands
continued to expand. Color, sound, networking, display resolution, print resolution,
digital cameras, digital video, modems, broadband, graphics, animation — the list
of software and peripheral advancements related to the PC is lengthy. Most of these
advances occurred in laboratories or engineers’ garages a few years before the speed
of the PC made their performance possible or economically feasible. But each one
placed new demands on the performance of the PC, from speed to memory to
compatibility. The new advances also created new markets with expanding demands
for PCs. PCs moved from the workplace to the home, and from the desktop to the
laptop. Intel’s wisdom, in addition to predictable obsolescence, was the encouragement
and creation of applications that accelerated the demand for better performance.
The increasing performance of the microprocessor has not come without a price.
One major contributor to improvements in operating speed has been manufacturing
methods that pack more components into a given area of silicon. As designs shrank,
manufacturing tolerances became increasingly demanding, drawing on numerous
improvements in supporting technologies, including material purity, clean-room
techniques, and process control. New fabrication facilities and advanced equipment are
required for each new generation of microprocessor. Thanks to these advances,
microprocessor improvements will continue to track Moore’s Law expectations for the
foreseeable future.
Economically, is there an end in sight for the expenditure of capital needed to
build fabrication facilities for production of next-generation chips? Certainly Intel
remains committed to this approach. At what point would a techonomic analysis of
the market, its pricing structure, and potential reveal that the risk of taking the next
step exceeds anticipated rewards? Due to the short product life cycle (about two
years), the facility capital cost becomes a significant contributor to per-unit
product cost. Below is a techonomic metric for products that have a limited shelf
life and require a significant capital investment.
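One minimal sketch of such a metric, with every figure below (facility cost,
volume, variable cost) chosen purely for illustration, amortizes the fabrication
capital over the units shipped during the product’s life cycle:

    FAB_CAPEX = 3_000_000_000      # assumed cost of a new fabrication facility
    PRODUCT_LIFE_YEARS = 2         # short life cycle, per the text
    UNITS_PER_YEAR = 50_000_000    # assumed unit volume
    VARIABLE_COST_PER_UNIT = 40.0  # assumed materials, test, and packaging cost

    units_over_life = UNITS_PER_YEAR * PRODUCT_LIFE_YEARS
    capital_per_unit = FAB_CAPEX / units_over_life         # -> $30.00
    unit_cost = VARIABLE_COST_PER_UNIT + capital_per_unit  # -> $70.00

    print(capital_per_unit, unit_cost)

Halving either the life cycle or the volume doubles the capital share per unit,
which is why the risk-versus-reward question above hinges on whether the market
for the next generation can absorb the cost of the next fabrication facility.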