02 February 2010 New York Times

Smart Dust? Not Quite, but We’re Getting There

Peter Hartwell of Hewlett-Packard Labs compares today's computers to brains that are blind to their surroundings. But development of low-cost sensors, he says, is 'closing that gap.'
Image Credit: Jim Wilson/The New York Times.

In computing, the vision always precedes the reality by a decade or more. The pattern has held true from the personal computer to the Internet, as it takes time, brainpower and investment to conquer the scientific and economic obstacles to nudging a game-changing technology toward the mainstream.

The same pattern, according to scientists in universities and corporate laboratories, is unfolding in the field of sensor-based computing. Years ago, enthusiasts predicted the coming of “smart dust” — tiny digital sensors, strewn around the globe, gathering all sorts of information and communicating with powerful computer networks to monitor, measure and understand the physical world in new ways. But this intriguing vision seemed plucked from the realm of science fiction.

Smart dust, to be sure, remains a ways off. But technology’s virtuous cycle of smaller, faster and cheaper has reached the point that experts say sensors may soon be powerful enough to be the equivalent of tiny computers. Some ambitious sensor research projects provide a glimpse of where things are headed.

Last year, Hewlett-Packard began a project it grandly calls “Central Nervous System for the Earth,” a 10-year initiative to embed up to a trillion pushpin-size sensors around the globe. H.P. researchers, combining electronics and nanotechnology expertise, announced in November that they had developed sensors with accelerometers that were up to 1,000 times more sensitive than the commercial motion detectors used in Nintendo Wii video game controllers and some smartphones.

The use of accelerometers in consumer products points to the changing economics of sensors, notes Peter Hartwell, a senior researcher at H.P. Labs. In the 1980s, accelerometers began to be used in automobiles, to detect crashes so that air bags would inflate. That was a specialized, costly application of motion sensing. But today’s low-cost sensors, Mr. Hartwell says, are opening the door to widespread use, linking the physical world to computing as never before.

In places like desktops and data centers, computing power marches ahead relentlessly. “But it is still as if the computer is a brain that is blind, deaf and dumb to its surroundings,” Mr. Hartwell says. “Closing that gap is what the sensor revolution is all about.”

Microchip-equipped sensors can be designed to monitor and measure not only motion, but also temperature, chemical contamination or biological changes. The applications for sensor-based computing, experts say, include buildings that manage their own energy use, bridges that sense motion and metal fatigue to tell engineers they need repairs, cars that track traffic patterns and report potholes, and fruit and vegetable shipments that tell grocers when they ripen and begin to spoil.

Power consumption has long been the Achilles’ heel of sensor-based computing. Smart dust, observed Joshua Smith, a principal engineer at Intel Labs in Seattle, proved impossible because the clever sensors needed batteries. Instead of dust, he said, the sensor nodes would be the size of grapefruits.

But the power barrier, Mr. Smith says, is rapidly eroding. Advances in sensor chips are delivering predictable, rapid progress in the amount of data processing that can be done per unit of energy. That, he said, expands the potential data workloads that sensors can handle and the distance over which they can communicate — without batteries.

At Intel, Mr. Smith is doing sensor research that builds on commercial RFID (radio-frequency identification) technology and adds an accelerometer and a programmable chip — in a package measured in millimeters. Its power, he explains, can come from either a radio-frequency reader, as in RFID, or the ambient radio power from television, FM radio and WiFi networks. (For the latter, Intel is developing “power-harvesting circuits,” he adds.)
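To make the idea concrete, here is a minimal, purely illustrative sketch (in Python, not device firmware) of how such a battery-free node might behave: it accumulates harvested ambient radio energy in a small storage capacitor and samples and reports its accelerometer only when enough charge is available. Every name, threshold and energy figure below is an assumption for illustration, not a description of Intel's hardware.

    import random

    # Hypothetical energy budget, in joules; real figures depend on the hardware.
    CAPACITOR_FULL_J = 0.001      # assumed storage capacity
    SAMPLE_COST_J = 0.0004        # assumed cost of one accelerometer read
    TRANSMIT_COST_J = 0.0005      # assumed cost of one backscatter reply

    def harvest_ambient_power():
        """Stand-in for harvesting from TV/FM/WiFi signals; joules gained per tick."""
        return random.uniform(0.0, 0.0002)

    def read_accelerometer():
        """Stand-in for a real accelerometer read: (x, y, z) in g."""
        return (random.gauss(0, 0.01), random.gauss(0, 0.01), random.gauss(1, 0.01))

    stored = 0.0
    for tick in range(100):
        stored = min(CAPACITOR_FULL_J, stored + harvest_ambient_power())
        if stored >= SAMPLE_COST_J + TRANSMIT_COST_J:
            stored -= SAMPLE_COST_J + TRANSMIT_COST_J
            print(f"tick {tick}: reporting {read_accelerometer()}")

The point of the sketch is the duty cycle: the node does nothing most of the time and spends energy only when the harvested supply can cover a full sample-and-report, which is what lets it run without a battery.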

“The ability to eliminate batteries for these sensors brings the vision of smart dust closer to reality,” Mr. Smith says.

In this model of computing, the sensors are servants. They exist to generate data. And the more sensors there are, the better the data quality should be. When mined and analyzed, better data should in turn help people make smarter decisions about things as diverse as energy policy and product marketing.

If sensor-based computing takes off, it will ignite fresh demand for a wide range of hardware and software to store, process and search the new oceans of data for nuggets of useful knowledge. So it could be a boon to business, a foundation for what analysts call “the Internet of Things.”

“It does feel almost like the beginning of the Internet,” says Katharine Frase, vice president for emerging technologies at I.B.M. Research. “You can see that sensor computing is going to be important and useful, but it’s not possible to see in advance just how it will transform things.”

The recent advances in stand-alone sensors may be impressive, but some researchers are pursuing a different path. “We already have massively distributed wireless sensors — they’re called cellphones,” explains Deborah Estrin, a computer scientist at the University of California, Los Angeles.

Ms. Estrin and her colleagues at the university’s Center for Embedded Networked Sensing have designed several projects that use cellphones and people in data-gathering and analysis. Cellphones, they say, are versatile data collectors and are becoming more powerful all the time — with cameras, GPS, accelerometers and Internet connectivity. Their work is at the forefront of an emerging field called participatory sensing.

One project involves collecting travel, time and location data that is fed into Web databases to calculate an individual’s personal environmental impact and exposure to pollutants (peir.cens.ucla.edu). Another project, in cooperation with the National Park Service, uses a smartphone application to identify, photograph and track the advance of invasive plants, like Harding grass and poison hemlock, which can crowd out local species and undermine biodiversity (whatsinvasive.com).
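As a rough illustration of the participatory-sensing pipeline behind a project like the personal-impact calculator, the sketch below matches a phone's location samples against a coarse pollution map and sums them into an exposure figure. The trace, grid values and units are invented for the example and do not come from the actual project.

    from datetime import datetime

    # (timestamp, latitude, longitude) samples, e.g. logged by a phone during a commute
    trace = [
        (datetime(2010, 2, 2, 8, 0), 34.07, -118.44),
        (datetime(2010, 2, 2, 8, 30), 34.05, -118.40),
        (datetime(2010, 2, 2, 9, 0), 34.05, -118.25),
    ]

    def cell(lat, lon):
        """Snap a coordinate to a roughly 0.1-degree grid cell, keyed by integers."""
        return (round(lat * 10), round(lon * 10))

    # Hypothetical pollutant concentrations (micrograms per cubic meter) per grid cell
    pollution_grid = {(341, -1184): 12.0, (340, -1184): 18.0, (340, -1182): 30.0}

    SAMPLE_INTERVAL_H = 0.5   # assume each sample stands for half an hour of exposure

    exposure = sum(
        pollution_grid.get(cell(lat, lon), 0.0) * SAMPLE_INTERVAL_H
        for _, lat, lon in trace
    )
    print(f"Rough exposure index for the trip: {exposure:.1f} microgram-hours per cubic meter")

In the real system the pollution map, trip segmentation and emissions model are far richer; the sketch only shows the shape of the data flow from phone samples to a personal estimate.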

Still another is a Twitter application for self-reported data on one’s daily life (your.flowingdata.com), which can be assembled into small graphs that show a person’s behavior over time. The most common use since the site went up last fall, says Nathan Yau, a graduate student who created the application, has been to track personal health — eating habits, weight, blood pressure, glucose and sleep times.
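The self-tracking idea lends itself to an equally small sketch: short self-reported messages of the kind a user might send to such a service are parsed into (date, metric, value) records and rolled up into per-metric series that could then be charted. The message format and values here are invented; the actual application's parsing rules may differ.

    from collections import defaultdict
    from datetime import date

    # Hypothetical self-reported entries: (date sent, message text)
    messages = [
        (date(2010, 1, 30), "weight 172.5"),
        (date(2010, 1, 30), "slept 6.5"),
        (date(2010, 1, 31), "weight 172.1"),
        (date(2010, 1, 31), "slept 7.0"),
    ]

    series = defaultdict(list)            # metric name -> list of (date, value)
    for day, text in messages:
        metric, value = text.split(maxsplit=1)
        series[metric].append((day, float(value)))

    for metric, points in series.items():
        values = [v for _, v in points]
        print(f"{metric}: latest {values[-1]}, average {sum(values) / len(values):.1f}")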

The cellphone is a constant companion — immediate and intimate, always there to inform, remind and prompt. “The killer app for this is personalized health and wellness,” Ms. Estrin says. “The potential to help people make behavior changes and lead healthier lives is tremendous.”

Source: New York Times /...


