By Doug Hornig, Senior Editor
When looking at technology, it is easy to focus on just the most bleeding-edge fields - genetic medicine, nanotechnology, quantum computing. No question, it's amazing what we've accomplished. But these things are happening in labs far removed from our everyday lives.
There is also change going on right at our fingertips in an industry that touches billions of people every day, yet to which few of them ever really give much thought. Despite that, hundreds of scientists in many countries are continually at work, trying to reinvent this now centuries-old "technology" using semiconductors, nano-scale engineering, even biology. All to improve the seemingly simplest of devices - the light bulb.
Despite the dim (ahem) interest, lighting accounts for more than 15% of global electricity usage and is a $50-billion-per-year industry. Today we look at a fundamental shift in lighting technology, away from long-established market leaders - a shift now being accelerated by government policy around the world.
In the Beginning
Of all human accomplishments, the control of fire ranks near the top of the list in terms of importance. Along with the creation of crude stone tools, it was among the first technological breakthroughs - among the first times we manipulated nature to create tools that would dramatically improve the quality (and length) of our lives.
No one knows when fire was first harnessed by human beings. A conservative guess would be about a half-million years ago, although other estimates place it as far back as two million. In any event, it probably came about after some prehistoric savant had an "Aha!" moment and deduced that it was possible to preserve a few embers after a lightning strike... and then, that they could be carried from place to place, enabling a perpetual renewal of the flame. To be appointed keeper of the fire was likely a great honor.
Fire bestows upon its users many gifts. It warms living spaces, it can be used to cook food, and it is a weapon against predators. But it is also a source of light, serving as a surrogate sun and prolonging the possibility of human activity into the nighttime hours. Early humans were freed to gather around the hearth after dark, to work, to socialize, and - as language evolved - to tell each other stories.
Over time, humans have created light by burning just about anything flammable, from wood to whale oil to gas. But using actual fire as a light source has a lot of drawbacks. It's impractical: you need to constantly maintain your fuel supply. It's inefficient: the great majority of the energy is lost as heat. And it's dangerous, always a threat to spread if you're careless.
In order to take the next step on the long road to Las Vegas Boulevard, people needed something better. That something came in the form of a sealed glass bulb enclosing a filament heated until it glows - the now-ubiquitous incandescent light bulb.
The invention of this bulb is most often - and mistakenly - attributed to the Wizard of Menlo Park, Thomas Edison. In truth, its antecedents date back to 1802, when the great British scientist Sir Humphry Davy demonstrated incandescence for the first time by passing an electric current through a thin strip of platinum, chosen because of its high melting point. It didn't work very well, but the principle was established.
After Davy set the stage, it was off to the races. Over the ensuing decades, experimenters fiddled with many different combinations of metal wires, carbon rods, and evacuated or semi-evacuated enclosures.
It wasn't until much later - in 1879 - that the 32-year-old Edison started coming close to a commercially feasible version of the incandescent light, with a carbon-filament lamp. In 1880, he created a carbonized bamboo filament with a previously unimagined working life of over 1,200 hours.
Original Edison carbon-filament bulb
Carbon, however, was not destined to become the standard for filaments. Tungsten - which lasted longer and gave a brighter light - was. In 1906, the General Electric Company patented a method of making filaments from sintered tungsten and, in 1911, used ductile tungsten wire for incandescent light bulbs. Then, in 1913, GE's Irving Langmuir found that filling a lamp with inert gas instead of a vacuum resulted in twice the luminous efficacy and a reduction in bulb blackening.
Thus was born the incandescent lamp as we know it; it has survived relatively unchanged for the past century.
[Diagram: anatomy of an incandescent bulb - glass envelope; low-pressure inert gas fill (argon, neon, or nitrogen); tungsten filament; contact wires running into and out of the glass stem mount; support wires; cap (sleeve); insulation; electrical contact]
While technology progresses more rapidly in the Information Age than it did in the preceding centuries, the incubation time from invention to commercialization still shapes which technologies succeed. In a commodity industry such as lighting it matters especially, because economies of scale push the commercially tolerable price for mass adoption down to incredibly low levels. There are, however, ways to enter the market and expand over time - and there is no better example than the incandescent bulb's most effective competitor to date.
The modern incandescent gives off a light that's fairly close to sunlight. People are used to it. They like it. However, the problem with these bulbs has always been their inefficiency. Ninety percent or more of the energy used to get the filament to glow is dissipated as heat... not as bad as a fire, but still hardly a best in show.
This inefficiency and the associated high electrical cost per lumen (the standard measurement of light output, a photonic equivalent to the gram or the meter) left a big opening for competing technologies. And, as technologists tend to do with such an obvious problem, over the course of the past century they introduced numerous alternatives, including sodium vapor, neon, xenon, and carbon arc lamps, among others. Some of these specialized lights are still in use today in niche applications. But, for one reason or another - short lifespan, too many toxic chemicals, or simply too expensive - none was suitable for everyday lighting.
Nearly from the beginning, the only real challenger has been the fluorescent lamp.
Fluorescents are tubes containing low-pressure mercury vapor. When an electrical current is passed through them, the excited mercury atoms produce short-wave ultraviolet light that causes phosphors coating the tube's interior to fluoresce, producing visible light.
Again, understanding of the principle goes back to the mid-19th century; all the proper elements were available by the 1920s. But producing a workable model took even longer than with incandescents. The tubes may look simple, but they're not.
The main difficulty is that fluorescent lamps are "negative differential resistance" devices, which means that as more current flows through them, electrical resistance drops, allowing even more current to flow, and so on. So you can't connect one directly to a constant-voltage power supply and walk away; pretty soon, it would self-destruct under the buildup of current. To prevent this, you have to have something in between - a ballast - to continually regulate the current flow through the tube.
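A toy numerical model makes the runaway concrete. In this Python sketch, the resistance curve and every number are invented for illustration (not real lamp data): the lamp's effective resistance falls as current rises, so iterating the circuit equation with no ballast diverges, while adding a series ballast resistance settles to a stable operating current.

```python
def lamp_resistance(current, r0=240.0, k=4.0):
    """Toy negative-differential-resistance model:
    effective resistance drops as current rises."""
    return r0 / (1.0 + k * current)

def settle(voltage=120.0, ballast=0.0, steps=30, i0=0.5):
    """Iterate Ohm's law for the series circuit:
    I = V / (R_ballast + R_lamp(I))."""
    current = i0
    for _ in range(steps):
        current = voltage / (ballast + lamp_resistance(current))
    return current

print(settle(ballast=0.0))    # no ballast: current grows without bound
print(settle(ballast=240.0))  # series ballast: converges to a stable point
```

With the ballast in series, the iteration converges to about 0.35 A in this toy model; without it, the computed current keeps growing - the numerical analogue of the lamp destroying itself.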
There were plenty of other problems, too. It wasn't until 1934 that GE finally rolled out its prototype. As one industry historian wrote of the event: "A great deal of experimentation had to be done on lamp sizes and shapes, cathode construction, gas pressures of both argon and mercury vapor, colors of fluorescent powders, methods of attaching them to the inside of the tube, and other details of the lamp and its auxiliaries before the new device was ready for the public."
With the cost savings they afforded, fluorescents were quickly adopted by schools, businesses, municipal buildings, and such. By 1951, more light in the US was produced by them than by incandescents.
But because, for a long time, they could only be made in long tubes (either straight or with several U-shaped bends), they failed to catch on in home lighting. And even if they'd been initially adaptable, few would have chosen them. People hated the cold, harsh light they gave off, that they required some warm-up time, and that they had a tendency to flicker.
Still, there were enough researchers on the job to ensure that eventually most problems would be resolved and we would get a fluorescent bulb that could screw into a standard socket. That happened in 1980, when Philips introduced a screw-in lamp with an integral ballast - the first compact fluorescent lamp (CFL). Osram followed in 1985 with the first CFL to include an electronic ballast. These tube types remain the most popular in Europe, but in North America, helical lamps, first released in 1995, have become the favorites.
CFLs cost much more than standard incandescents - and their price has risen further lately because they require rare-earth metals, whose cost has climbed rapidly in recent years on demand from virtually every type of electronic component. Still, they have a number of advantages. The primary one is energy consumption: they convert far more electricity into light, as measured in lumens per watt.
In addition, CFLs have a rated life some 10-20 times longer than their rivals, at least when operated for several hours at a time. Their light is more diffuse, reducing glare. And you can touch them without getting burned.
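The efficiency gap translates directly into operating cost. Here is a back-of-the-envelope Python comparison; the figures (roughly 15 lumens per watt for an incandescent, 60 for a CFL, $0.12 per kWh, three hours of use a day) are ballpark assumptions of ours, not numbers from this article.

```python
def annual_cost(lumens, lumens_per_watt, hours_per_day=3.0, usd_per_kwh=0.12):
    """Yearly electricity cost to produce a given light output."""
    watts = lumens / lumens_per_watt
    kwh_per_year = watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * usd_per_kwh

# Same 800-lumen output (roughly a 60 W incandescent's brightness):
incandescent = annual_cost(800, 15)  # ~ $7.01 per year
cfl = annual_cost(800, 60)           # ~ $1.75 per year
print(incandescent, cfl)
```

Under these assumptions the CFL costs about a quarter as much to run, which is why payback periods are short despite the higher sticker price.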
Nevertheless, consumer response has been tepid, as there are some major disadvantages. Many still don't like the light quality. If there is frequent on/off switching, fluorescents age rapidly and their life is severely shortened. The shape is not suitable for many lampshades. They emit much more UV light than incandescent bulbs, which can affect sensitive individuals and harm paintings. Some people may also be sensitive to their flicker rate, and there's suspicion they can trigger migraines. They can't be used with a dimmer switch. And the mercury they contain can be a health hazard if they're broken.
It's this potential toxicity issue that has critics of the conversion to fluorescents most up in arms.
The US government has mostly downplayed the mercury risk. Australian authorities, however, are considerably more cautious. Their lengthy guidelines for cleaning up a broken fluorescent include the following admonitions:
Open nearby windows and doors to allow the room to ventilate for 15 minutes; do not use a vacuum cleaner or broom; scoop up broken material (e.g., using stiff paper or cardboard), if possible into a glass container that can be sealed with a metal lid; wear disposable rubber gloves rather than using bare hands; use a disposable brush to carefully sweep up the pieces; and transfer the broken CFL and clean-up materials to an outside rubbish bin (preferably sealed) as soon as possible - the most effective way of reducing potential contamination of the indoor environment.
That's not even exhaustive and it's already quite a procedure.
Despite these differing viewpoints on the safety of fluorescent bulbs, CFLs have to some extent become the battle flag of an escalating political and economic war being waged against the incandescent bulb.
The Lighting Wars
It's doubtful that any of the pioneers of electric lighting could have imagined their products becoming political footballs. But never underestimate the capacity of the government to involve itself in every possible aspect of life, including consumer choice.
Welcome to the lighting wars.
Governments around the world are moving to mandate the demise of the incandescent bulb. Brazil and Venezuela started phasing them out in 2005. The European Union, Switzerland, New Zealand, and Australia began their phase-outs in 2009, and others are on track to do the same, including Argentina, Russia, India, and Canada this year.
The US government got into the act in December 2007 with the passage of the Energy Independence and Security Act (EISA). Tucked inside it was a requirement that all general-purpose light bulbs producing 310-2,600 lumens be roughly 30% more energy efficient than then-current incandescent bulbs, phased in between 2012 and 2014, starting with standard 100-watt bulbs and working down to 40-watt bulbs.
By 2020, a second tier of restrictions would become effective, requiring all general-purpose bulbs to produce at least 45 lumens per watt (similar to current CFLs). Exemptions from the Act include reflector flood, three-way, candelabra, colored, appliance lamps, plant lights, stage lighting, and other specialty bulbs.
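The second-tier standard amounts to a simple lumens-per-watt threshold. A minimal Python check - the helper name and the sample bulb figures below are our own illustrative assumptions, not language from the Act:

```python
def meets_tier_two(lumens, watts, threshold_lm_per_w=45.0):
    """Does a bulb reach the 45 lumens-per-watt second-tier target?"""
    return lumens / watts >= threshold_lm_per_w

# Illustrative ballpark figures:
print(meets_tier_two(800, 60))  # classic 60 W incandescent (~13 lm/W): False
print(meets_tier_two(800, 13))  # 13 W CFL (~62 lm/W): True
```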
Mostly, they want to force you to change the light bulbs in your living room.
The near-term winner from all of this meddling thus far has been the CFL. However, there are quite a few dark horses in the race to fill the gap, and they're about to be thrust upon the market.
The Future of Light
The future is... well, bright.
What technology will ultimately come to dominate the lighting market is still up for grabs, but in the near term, one possibility is incandescents redux. Once EISA was put in place, manufacturers of the traditional bulb tried any number of ways to make it more efficient. The first bulb to emerge from this push - Philips Lighting's Halogena Energy Saver - uses a special chamber to reflect formerly wasted heat back to the filament to provide additional lighting power. They're expensive, selling for about $5 apiece. But they're cheaper than many fluorescents and are also 30% more efficient than standard bulbs, bringing them into EISA compliance.
Longer term, however, it's likely that both fluorescents and incandescents will go the way of the torch.
One potential replacement is electron-stimulated luminescence (ESL). It works through accelerated electrons hitting a phosphor surface, making the bulb glow in a process known as "cathodoluminescence." The process is similar to that employed by your old computer monitor's cathode-ray tube (CRT). ESLs are mercury-free and have the same light quality as incandescent lamps, but are about 70% more energy efficient, produce 50% less heat, and are rated to last up to five times longer than incandescents.
The bulbs were developed by a small company, Vu1, and are so new to the market that they're rather hard to find at the moment. If you can locate one, a 65-watt equivalent will set you back about $20 - hardly an economic substitute for the few dollars a standard light bulb typically costs.
But the real challenger on the horizon is the LED (light-emitting diode). These little dots of light have become familiar to many consumers through their widespread use in automotive taillights and, more recently, their appearance in flat-screen TVs.
LEDs came out of electronics technology and are semiconductor based. They stimulate electrons to release energy in the form of photons, a process called "electroluminescence."
They have been around since the 1960s, but adapting them to home lighting has taken some time. The first commercial products are now just beginning to appear in the marketplace. They're also mercury-free, more energy efficient than incandescents, and dimmable. They don't degrade with frequent on/off switching, and they're rated to last six times longer than CFLs. Their main drawback is cost - about $20 apiece for a replacement bulb. Individual LEDs are cheap, down to a few pennies each from hundreds of dollars in the 1970s, but stringing a set of them together - with the wiring required to make them work in concert and fit a standard lighting socket - still costs quite a few dollars. That price seems likely to come down as mass production sets in and the technology improves, which is why many people have tabbed LEDs as the bulb of the future. And we have to say, the new Switch LED bulb is nothing if not futuristic.
In truth, though, there's no telling what is going to catch on in the end - after all, an LED bulb may be rated for 20 years of life, but a lot of new innovation can happen over two decades. Something better could easily come along even before the first LEDs start to burn out.
Holy Light Saber
Yes, it's just possible that one of those innovations is going to be laser lighting.
The first step along that road, so to speak, has been taken by BMW, which recently revealed that the company plans to outfit its cars with laser headlights "within a few years."
If you're concerned about tooling down the highway and suddenly getting blasted by a laser that burns your retinas, BMW says, "Don't be." The bluish laser beam isn't emitted directly, but is first converted by means of a fluorescent phosphor material inside the headlight into a pure white light that is suitable for use in road traffic and poses no risk to humans or animals. The emitted light would also be very bright and white, making it more comfortable to the eye.
Because it is a "coherent" light source, BMW says that laser lighting can produce a near-parallel beam with an intensity a thousand times greater than LEDs. At the same time, it requires only half the energy of LED headlights: where LED lighting generates about 100 lumens per watt, the laser would generate around 170 lumens per watt.
The diodes used in a laser light are also small, about 10 microns in length - far shorter than the one-millimeter-long, square-shaped cells used in LED lighting. This means that BMW could radically reduce the size of its headlights. The company says it has no plans to do so, but this raises some intriguing possibilities.
One can envision the day when a home or office is illuminated by a few tiny lights that consume almost no energy at all.
Now there's a prospect that should please both consumers and meddlesome bureaucrats.
To profit from exciting 21st-century breakthroughs like revolutionary new forms of lighting, you must avoid these three technology-investing myths.
Bits & Bytes
Solar Startup Sets New Efficiency Record (Technology Review)
One of the main drawbacks of using solar panels to generate more than a small fraction of the world's electricity needs is their inefficiency, which makes their cost per kilowatt-hour significantly higher than that of coal, gas, or nuclear power plants. Conventional silicon solar panels typically convert less than 15% of incoming light into electricity. But a small startup out of Durham, North Carolina, has just achieved an independently verified efficiency rating of 33.9% with its new concentrated gallium arsenide solar panel, marking the first time any solar module has converted more than one-third of the sunlight falling on it into electricity. These new solar cells still can't come close to competing with conventional methods of electricity generation, but it's an impressive advancement nonetheless.
Eolas Technologies, an ostensibly Tyler, Texas-based company with 12 employees, is suing some of the largest Internet firms in the world (like Google and Amazon) for $600 million claiming - get this! - ownership of the "interactive web." Michael Doyle, a biologist and chairman of Eolas, claims that he and a couple of colleagues actually patented the "interactive web" back in 1993. He argues that a program he created to allow doctors to view embryos over the nascent World Wide Web was the first program that allowed users to interact with images inside of a web-browser window. Doyle and his lawyers now claim he's owed royalties for the use of technologies such as watching online video and having a "search suggestion" pop up in a search bar. It may sound ridiculous, but apparently the situation was serious enough to motivate Tim Berners-Lee, the inventor of the World Wide Web, to fly to Texas and testify in the case in an attempt to defeat the patent claims.
Micron's CEO Dies in Tragic Plane Crash (Daily Tech)
Finally, we would like to express our sincere condolences to the family and friends of Steven Appleton, longtime chairman and CEO of Micron Technology, who died in a plane crash late last week. Appleton was instrumental in Micron's diversification beyond DRAM and into non-volatile flash memory in an effort to drive future growth for the company and its global workforce of 20,000 employees. RIP, Steve.