Y2K 25th anniversary

VISIONS: Come out, come out, wherever you are –
Get set to say hi to the neighbors

By John Noble Wilford

If there is life here on Earth, why not elsewhere?

The question has long inspired wonder and some fear, doomed a few to the heretic’s stake and invigorated recent investigations of other planets in the solar system and of the stars beyond. If life is indeed the inevitable outcome of cosmic evolution, a “cosmic imperative,” in the words of the Nobel Prize-winning biochemist Christian de Duve, the first clear evidence of extraterrestrial life will most likely be discovered in the first century of this new millennium.

One reason for the optimism is as old as speculations by the ancient Greeks and Romans impressed by the vastness of the heavens. “Space contains such a huge supply of atoms that all eternity would not be enough time to count them and the force which drives the atoms into various places just as they have been driven together in this world,” wrote the Roman philosopher Lucretius in the first century B.C. “So we must realize that there are other worlds in other parts of the universe, with races of different men and different animals.”

The modern variation of this reasoning stems from astronomy’s recent estimates of upward of 100 billion galaxies, each with tens of billions of stars. So many opportunities for life to have emerged, as it did on one planet of one star, the Sun. And everything scientists learn shows that the physical forces nearby apply everywhere; there is no reason to think that the solar system or its Milky Way galaxy is unique.

Biological studies on Earth suggest that, if anything, scientists have been underestimating the potential for life and its resilience in seemingly adverse environments. Drilling deep into Earth’s crust, geochemists have encountered bacteria living there, without sunlight and probably on a diet of chemicals. Similarly, oceanographers have been surprised to find undersea vents disgorging mineral-rich hot water that supports a teeming community of exotic life, also existing without sunlight.

“The question may not be the probability of the origin of life,” Dr. Norman R. Pace, a University of California biologist, has said, “but rather the probability that life, having arisen, survives and comes to dominate a planet.”

Seekers of extraterrestrial life, at least in its humblest forms, were encouraged by discoveries in the 1990s. The Galileo spacecraft found strong evidence of a global ocean of water, where simple life just may exist, under the ice of Europa, a moon of Jupiter; Europa will be a target of life-seeking exploration in coming years. A report that a meteorite from Mars contained circumstantial evidence for microbial life created a stir; though the findings were suspect, they galvanized a renewed drive to explore Mars for fossil or extant life.

Of surpassing significance, the recent discovery of planets around other stars is fixing for scientists the places to look for extraterrestrial life. Knowing where to search, the National Aeronautics and Space Administration is developing instruments for spacecraft to be launched over the next two decades. These instruments will be capable of finding Earth-sized planets and detecting there the chemical signatures associated with life: oxygen, water and carbon dioxide. The absolute clincher, scientists say, would be to find oxygen and methane together.

Current efforts to detect radio signals from other intelligent beings, begun in the 1960s, are a long shot. But for the first time in human history the technology exists to search other planetary systems for worlds harboring life in some form. The results could be the most reverberating scientific discovery of the new century.

VISIONS –
Great hits headed for the attic

The challenge put to The Times’s critics was direct: Name a 20th-century work considered “timeless” today that you think will be all but forgotten in 100 years. Their selections:

  • ‘WHITE CHRISTMAS,’ PAST DREAMING OF

The oldies stations of 2099 won’t be playing much 20th-century pop. While all art is a product of its time, popular music doesn’t pretend otherwise. In a century, audio-only music may itself be an antique, superseded by tunes tied to video, virtual reality or direct neural stimuli.

Oblivion could envelop even the 20th century’s most popular songs. Consider Irving Berlin’s “White Christmas,” the best-selling single until Elton John’s “Candle in the Wind 1997.” When Bing Crosby recorded it in 1942, it was a taste of home-front coziness in the midst of World War II. The melody uncoils craftily, but what would late-21st-century listeners identify with? Not Christmas cards, long since replaced by e-messaging. And the snow for Bing’s white Christmases? Gone with global warming. While pop needs nostalgia for preservation, “White Christmas” may sound like an ecological indictment instead. JON PARELES

  • ‘ULYSSES,’ REVERED BUT UNREADABLE

James Joyce’s “Ulysses” came up first on a recent list of the 20th century’s greatest 100 novels, but in all likelihood few of the 10 writers and critics who drew up that list had read the book they were honoring. The fact is that “Ulysses” has been both revered and unread for decades, and, over the long run, its unreadability will outweigh the reverence it inspires.

This is a melancholy prediction. Joyce’s work is astonishing, full of amazing wordplay and biting lyricism. Leopold Bloom is a great character, and Dublin emerges from Joyce’s pages as a place of mythic magic. But “Ulysses” was the product of a modernist moment, and it probably will be seen more and more as a brilliant but impenetrable literary gimmick, one whose essential inaccessibility will reduce its already shrinking audience. RICHARD BERNSTEIN

  • ‘EINSTEIN’ ON THE SHOALS

“Einstein on the Beach,” with music by Philip Glass and stage pictures by Robert Wilson, is a fragile beauty that may not survive its time. The music and production are so co-dependent that Mr. Wilson’s successors (or even an older Robert Wilson) will be hard pressed to re-establish the original chemistries.

There is also the nature of the music, which is as much a rejection of the culture around it as something in itself. “Einstein” is not about symphonic string sections, expressive phrases or the artist as hero. It is proudly loud and defiantly mechanical; it replaces beginning, middle and end with music that suddenly starts and arbitrarily stops.

Rebellion needs an enemy, and there is a good chance that the influence of 19th-century Romanticism won’t linger another 100 years. Having nothing left to push against, “Einstein” may lose its reason for being. BERNARD HOLLAND

  • ‘BITCHES BREW,’ CHIC AND PERISHABLE

In 1969, there was no way that Miles Davis’s “Bitches Brew” would not have been seen as pivotal. Jazz and rock, and their respective audiences, were bending toward each other, and this double album had strange instrumentation and six extremely long tracks; its psychedelic cover illustrations were done by Mati Klarwein, the artist Santana had hired.

Undoubtedly, some of it remains beautiful. But it is bland compared with Davis’s destination in electric music a few years later. A great deal of “Bitches Brew” was chic and perishable: its long, dull stretches of funk, electric piano and spidery bass-clarinet now seem as innocuous as incidental music for a film. Reissued last year in a four-CD boxed set, with alternate takes, it has assumed masterpiece status again, though some of the best music from Davis’s early ’70s period has never been available on CD in this country.

  • GOING, GOING, ‘GONE WITH THE WIND’

Still regarded by die-hard nostalgists as the greatest Hollywood film ever made, “Gone With the Wind” has been looking awfully musty lately, and, as time goes by, its blushing, romantic vision of the Confederacy, the Civil War, the burning of Atlanta and the freeing of the slaves can only look increasingly quaint. The movie’s black characters, while sympathetically drawn, are crude racial stereotypes, and its portrayal of the Confederacy as a lost paradise of dashing cavaliers yakking about honor and cinched-waisted Southern belles swooning on their pedestals is almost embarrassingly sentimental.

The movie’s strengths remain its narrative sweep and Vivien Leigh and Clark Gable’s portrayals of two sexy, ruthless narcissists zestfully torturing each other. These deft, knowing performances are the most modern aspects of a movie that 100 years from now will be regarded as a period piece – a guilty pleasure to be consumed not for its historical or artistic value, but as an extra-sudsy bubble bath infused with the scent of Georgia pine. STEPHEN HOLDEN

  • PICASSO, A BIT TOO VIRTUOUS

When Picasso was commissioned to paint for the Spanish Republican Pavilion at the World’s Fair in Paris in 1937, he came up with the idea of a big, abstract picture about the recent bombing of a Basque town he had never visited.

“Guernica” became the 20th century’s most famous painting because Picasso was the century’s most famous artist and because, having been intended for Republican Spain, the picture spent several decades, with Picasso’s approval, in conspicuous and splendid exile at the Museum of Modern Art in New York. There it became an international symbol of antifascism until Generalissimo Francisco Franco died.

With the personality cult of Picasso having already begun to fade along with memories of the general, “Guernica” can increasingly be seen as a virtuous hodgepodge of Cubist design enlarged to the scale of a billboard and hitched to a program of political dissent that Picasso seemed never to have been very comfortable with. In the end, he was comfortable in his art only with himself as the subject. MICHAEL KIMMELMAN

  • ‘A CHORUS LINE,’ THE HUGFEST ENDS

“A Chorus Line” was the show everyone wanted to cuddle with. This big-hearted, collaborative musical, overseen by the great choreographer Michael Bennett, seemed to reflect the solipsism of the 1970s, as it transformed a dance audition into group therapy. Audiences loved to give the love so earnestly demanded by the needy, soul-exposing gypsies onstage, turning “A Chorus Line” into Broadway’s longest-running show until its eclipse by “Cats.”

Listen with cold ears to a cast recording today, however, and you may wonder why you fell so hard. The show felt like one big hugfest during its run. But it fed off the specific sentimentality of its time as much as “South Pacific” did, and without anything like Richard Rodgers’s lush score. All those songs about growing up lonely and rejected, set to Marvin Hamlisch’s jinglelike melodies, now bring to mind lines from a tougher musical of the period, “Chicago”: “None of us ever got enough love in our childhood. That’s show biz, kid.” BEN BRANTLEY

  • ‘THE NUTCRACKER,’ A TIRED GIMMICK

Visions of sugarplum fairies will continue to dance in our heads through Tchaikovsky’s glorious score for “The Nutcracker.” The music will endure, but as a ballet, “The Nutcracker” is approaching self-destruction.

Although the first staging dates from 1892, the “Nutcracker” craze began with George Balanchine’s 1954 production for the New York City Ballet. A shrewd mix of spectacle and romantic plot (a child’s dream journey into a candy kingdom) has been turned by dance troupes throughout the land into a box-office gimmick.

The truth is that “The Nutcracker,” slim on classical dancing, is not the ideal introduction to ballet. Hence the urge to tinker and the spate of revisionist versions: Drosselmeier, the little heroine’s godfather, is now likely to be a child molester. At some point, we will prefer an ice show from that other magic kingdom. ANNA KISSELGOFF

VISIONS: Biology –
It shouldn’t happen to a dog

By Robin Finn

For those who wonder where genetic engineering may take man in the future, it might be instructive to look at what man has already wrought on his supposed best friend. In the $2 billion canine industry, designer dogs rule.

Unfortunately, the process of engineering blue-ribbon perfection has left some breeds, like toy poodles, Lhasa apsos and Shih Tzus, as authentic as vanity license plates.

It’s the less popular, less manipulated breeds like Brittany spaniels, otter hounds and Chesapeake Bay retrievers that are the lucky ones: they have not suffered the physical malformations that arise when inbreeding and overrefinement backfire. In dogs, as in humans, when there is a doubling up on genes, it is not just the desirable qualities that are repeated.

Presumably, the perfect dog for a future in which the microchip is mightier than any mastiff won’t bark, won’t bite, won’t shed and will be color coordinated to its decor. Hypoallergenic? Naturally (well, not quite naturally). Resourceful as Rin Tin Tin and reliable as Lassie, it will possess the flair of Pongo the Disney Dalmatian and the wit of Dinky, the Taco Bell chihuahua.

But purveyors and purchasers of perfect dogs have to hope that their animals won’t inherit hip dysplasia (German shepherds), vision problems (poodles) or a decided lack of intelligence (collies) because too much brain matter has been crammed into too small a skull.

After Disney’s live-action version of “101 Dalmatians” created a craze several years ago, the rush to supply Dalmatians might have had something to do with the fact that many of them went deaf in at least one ear. Slaves to miniaturization, toy dogs, like the latest mass-media sensation, the chihuahua, tend toward disproportionate dental and skin troubles.

Even the shar-pei, the novelty dog whose skin hangs from its frame with the artful drape of a window treatment, has fallen victim to vanity breeding: the more wrinkles, the more expensive the dog. But more wrinkles mean more skin irritation, and wrinkles that sag into the eyes can portend eye surgery.

Veterinarians like Dr. Elliot Katz, the founder of the animal rights group In Defense of Animals, have called for a moratorium on the breeding and selling of purebreds and consider the American Kennel Club, which recognizes 141 breeds and registers more than a million dogs annually, an enemy.

“The A.K.C. is promoting dogs almost as an accessory,” Dr. Katz contends. “When it comes to objectification and exploitation, they could be put side by side with the people who promote baby beauty contests.”

These dogs are turned into “toys and caricatures,” he said, adding: “The process has been speeded up via genetic engineering, and the bottom line is that somebody’s making money.”

Not us, say the breeders, the American Kennel Club and officials of the Westminster Kennel Club Dog Show, dogdom’s own Miss Universe pageant. They say they are out to preserve perfect dogs, not invent them. They acknowledge a measure of breeder ignorance before a screening test was developed to detect rogue genes in dogs about to be mated. But with this test, which costs about $1,200, responsible breeders are producing healthier dogs. Not so with the puppy mills, the bane of every popular purebred. “With popularity comes bastardization,” said Thomas Bradley, a spokesman for the Westminster club.

Gene Zaphiris, the founder and editor of Dog News, added, “Let’s face it, every breed was man-made. Whether bred for pretty or bred for a reason, with inbreeding comes problems. But a moratorium on breeding won’t stop the problems. What’s really too bad is that you can’t breed human ethics.”

VISIONS: Power –
Missing in Russia: A history and a purpose

By Michael Wines

Anyone trying to divine the next step on Russia’s wandering course toward nationhood might start with this arresting fact: the last time Russians lived within borders as constricted as the current ones, Ivan the Terrible had been dead for only a few decades, St. Petersburg was a frozen swamp, and New York City was still the property of the Dutch.

In Russia, more than in most places, geography is history. That the collapse of the Soviet Union wiped out 350 years of imperial expansion underlines a central fact of modern Russian history: for all practical purposes, there isn’t any. Russians not only lost their shackles with the end of the cold war, they also lost their historical reason for being.

“Russia has a major problem finding its identity in the world,” said Vyacheslav A. Nikonov, a political scientist and expert on contemporary Russia. “In the people’s mind, Russia doesn’t belong to anything.”

It is not too great a stretch to say that Russia has not been this bereft – of friends, wealth, territory, purpose – in centuries. The loss of empire cost her 100 million people and shattered military and political alliances that had been the basis of both foreign and domestic policies. The country’s gross domestic product has declined by about half since 1991. Birth rates have plummeted and death rates have jumped. The population, now 145.7 million, is projected to fall to 120 million, and possibly to 80 million, by 2050.

Also, Russians must somehow overcome the physical toll wrought by 70 years of irrational economic decisions. Much of the infrastructure, from factories to roads to telephone lines to basic things like doors and windows, is obsolete or worn out.

“Rebuilding the country is going to be an extremely prolonged process – a generation, a generation and a half, is going to be about the right time frame,” said Thomas Graham, a former American diplomat in Moscow and now a scholar at the Carnegie Endowment for International Peace. “And there’s nothing inevitable about it.”

The hope of Western nations was that Russians would embrace the kind of liberal democratic and economic philosophies that were their lamplights in the dark days of communist rule and reorient the nation outward toward cooperation and peaceful competition.

But as Robert Legvold, a scholar at Columbia University, noted in an essay, “The Three Russias,” such hopes ignore Russian history. Russia has always been ruled by strongmen. It has often viewed the outside world not as something to be engaged but as either a threat to be shunned or an enemy to be defeated.

It sees Russian civilization as exceptional – a truth in many ways, but also a short hop away from the view that other civilizations are decadent and polluting. It obsesses over the ebb and surge of its western and southern borders, boundaries that historically have been unstable precisely because they were not really Russian, but imperial.

“Contemporary Russia can no more escape this history than it can immaculately will itself into democracy or any other de novo identity,” Mr. Legvold writes.

For Russians, the difficulty in shedding the past is evident in trends of the last five years: the return of the belief that the West is a military threat; the interpretation of American oil deals in former Soviet states as hostile acts; the rising conviction, expressed in opinion polls, that Moscow should return to its days of empire.

Many experts here speak with longing of the prospects for a “Russian Pinochet,” a Latin-style strongman who will restore order while imposing a permanent template of modern capitalism. More than a few analysts think the mix of xenophobia, nostalgia and simple exhaustion will instead lead Russia to the same end as did the abdication of Nicholas II in 1917 – dictatorship, and a quest to regain lost territory.

Mr. Legvold avoids a prediction, but offers yet another model – not a Russian Pinochet or a noncommunist Stalin, but a Russian Slobodan Milosevic, whipping people’s frustrations and rage into a dangerously nationalistic threat to peace.

There is another possible outcome, too: a Russia that moves toward its own form of democratic capitalism. Aleksandr A. Kabakov, author of two fictional accounts of the future of Russia, said he hoped that the next president would tug Russia along that course. Not until World War II did Europe abandon the imperial model of nationhood, he observed, and even then, great nations like France could not free their colonies without wrenching transitions. His hope is that Russia will learn from the mistakes of the West.

That said, the second of his futurist novels, “The Sentenced One,” depicts a world 50 years from now in which Russia is reduced to a Muscovite duchy, a Berlin-style wall runs through Paris, and both Belgium and Britain are split on ethnic and racial lines.

“It’s not a prediction,” Mr. Kabakov said. “It’s a warning. Because if our people will be such fools as they are being now, it will be so. Not just Russia, but the whole world.”

VISIONS: Technology –
Technology sprints, but users set their own pace

By Tim Race

Technology may leap, but anthropology creeps.

Of all the technology lessons from 20th-century America, that one may offer the best guidance for the coming decades.

The century did bring countless technical marvels. Sending sounds and pictures across continents through thin air and humans across the ocean in machines heavier than air. Tweaking the nuclei of atoms to light up, or blow up, cities. Going to the moon. In short, the century was one extended physics problem, with the variables of distance, velocity and mass all in play.

Yet, for all the breakthroughs and their cumulative effect on everyday life, the way people live hasn’t changed as radically as the prophets of World’s Fair promotionalism and Jetsonian democracy would have led us to expect. True, we live longer. Medical techniques enable the infertile to bear children. Scientists can manipulate genes to clone a sheep. But there are no atomic cars, no robotic butlers, no jet backpacks, no moon colonies, no universal cure for cancer. And despite the value of the personal computer, half the nation’s households don’t have one 20 years after its advent.

Technology may leap, but anthropology creeps. Maybe that shouldn’t be surprising. In our capitalist society, which rewards innovation and enterprise, technologies continue to be invented at a prolific rate. But because ours is a democratic capitalism, there usually must be a political or market consensus before we, the people, adopt a fundamentally new way of doing things.

After the Manhattan Project’s physicists helped hasten the end of World War II, there was a drive to create a civilian atomic-energy era in the United States. But public resistance created such a political and regulatory backlash that nuclear reactors now play only a marginal role on the nation’s power grid. A similar public debate now rages over genetically engineered crops, which critics fear will unleash mutants capable of untold environmental mischief. As researchers race to complete the Human Genome Project – a map of the 80,000 genes in every human cell – the bioethics lobby is staking out office suites in Washington.

But this is a nation so conflicted about even generally accepted scientific thought that a Gallup poll last June found that 68 percent of all those surveyed agreed that schools should teach creationism as an alternative to evolution.

“It’s part of the American myth that we are naturally a tinkering people and accepting of technological change,” said Alex Roland, a professor at Duke University who specializes in the history of technology. “But we’re not entirely devoid of Ludditism from time to time.”

And often, it’s not regulation or politics or religion but sheer market forces that reject what the technologists promote, as a company called Pointcast discovered a few years back with its ill-fated “push technology.” Initially, investors swarmed to the idea of pushing information onto computer screens throughout the day. If you left the computer idle for a few minutes, headlines or stock prices or weather maps would be pushed onto your screen. You could choose to pay attention – or to click the whole mess away with mounting annoyance.

Push technology was like an eager office temp trying too hard to make a good impression. Technology leapt, but anthropology couldn’t stand all the interruptions.

Know-how alone is never enough. Before a new technology catches on, it typically goes through at least three phases. First comes the basic invention, then a period of refinement. Finally, there must come innovations that give people a motive and means for adopting the technology. Guglielmo Marconi’s radio, invented in 1895, didn’t become a mass phenomenon until the 1920s, after refinements like electronic amplification and innovations like news and entertainment programming made it parlor-friendly.

“The last stage, of innovation and making it marketable, is important – and can take a long time,” Professor Roland noted.

The Internet has obediently followed this arc. Invented in the late ’60s to let arms engineers, scientists and Pentagon underwriters swap files and messages, the network gradually developed over a few decades. But roaming the Internet resembled the early days of automobile travel: there were no road maps, and it helped to have a mechanic ride along.

Then, in the early ’90s, came easy-to-use network software like America Online’s, which helped nonmechanics discover the utility and allure of e-mail. Next came the World Wide Web, a software overlay that made the broader Internet more navigable – even if the market and the masses are still winnowing its potential uses.

The Internet’s long march from cold-war research tool to nascent mass medium also illustrates the difficulty of predicting which technologies will be widely adopted. To cold-war seers, it was supposed to be outer space – not cyberspace – that would take the public on flights of fancy by 2000.

The charismatic rocket scientist Wernher von Braun went on the “Disneyland” television show in 1955 and proclaimed, “I believe a practical passenger rocket could be built and tested within 10 years.” He even helped design an 80-foot model rocket that, when Disneyland opened in California that year, towered over tourists in Tomorrowland.

Such hype helped generate public support for the $40 billion the federal government would spend to land a man on the moon in 1969. But hyperbole in the name of financing technology may have little relation to what people want or need.

To borrow the lingo of today’s technology entrepreneurs, manned spaceflight hasn’t proved “scalable.” It worked in small numbers at great cost. But it could not be scaled up to the high-volume, mass-market model that society typically demands of its technical innovations. At least not yet.

Technology may skyrocket, but anthropology still stands in line at Disneyland.

VISIONS: Identity –
Rx for brain makeovers

By Erica Goode

The Scarecrow in “The Wizard of Oz” wanted only a brain – he did not much care what kind.

But human beings, most of whom already have such organs in their possession, have rarely been satisfied with simple ownership. They yearn for improvement: for smarter brains, happier brains, calmer brains, brains that are less forgetful, less stressed and less vulnerable to age and disease.

To pursue these desires once meant taking up meditation, ingesting mind-altering substances without supervision or spending years on the analytic couch. But modern science has yielded other approaches to brain repair and improvement. Doctors now have ways to tweak the brain’s machinery, including drugs and other techniques that can relieve depression, calm hallucinations, increase concentration and soothe jangled nerves.

Yet these remedies, scientists say, are just a prelude to what is to come. In the next century, and maybe in the next three decades, they say, greatly increased understanding of the interplay of chemicals, genes and environmental influences will bring exponential leaps in the ability to manipulate the brain and to influence thinking, feeling, memory and behavior.

Brain-altering drugs of the future will have several advantages over those now on the market. While many medications used to treat wayward thoughts, emotions or behaviors were discovered through serendipity, scientists increasingly are able to manipulate drugs with intention, maximizing effectiveness or reducing side effects. While most drugs already in use have an effect on the brain that is far from specific – the tread of an elephant, not a mouse – a new generation of psychoactive drugs will home in on more precise targets. Advances in genetics will also let drugs be matched more successfully to the needs of individuals.

A result will be not only better treatments for illnesses like depression, Alzheimer’s disease and schizophrenia but also ways to make essentially normal brains work better, like enhancing memory, increasing sociability or expanding intelligence.

“Will we come up with a Viagra for the brain, so to speak?” asked Dr. Steven M. Paul, group vice president for the therapeutic area, discovery research and clinical investigation at Lilly Research Laboratories in Indianapolis. In the past, such a thing was science fiction. But in the context of finding treatments for Alzheimer’s and other forms of dementia, he said, “There is no question that we and other companies are working on things that can enhance cognition.”

The availability of new compounds will pose increasingly difficult questions about who should use them, and under what circumstances. In some cases, the benefits may clearly outweigh any risks: Drugs that could better treat mental illnesses like schizophrenia and manic-depression would probably find few critics.

But when it comes to altering and adjusting mental qualities that all people share to some extent, the situation grows murkier.

“It is my own view that if these medications are developed, they will be used,” said Dr. Paul Appelbaum, chairman of the psychiatry department at the University of Massachusetts Medical School in Worcester. “I think it’s almost inevitable that people, given the opportunity to improve their personalities, their cognitions, their interpersonal abilities, will take that opportunity.”

Perhaps the most difficult challenge will be presented by drugs that hold out the possibility of sidestepping the most painful parts of life. As one example, Dr. Peter Kramer, the author of “Listening to Prozac,” said he had been considering what might happen if there were a drug that could interrupt the cascade of hormones and other neurochemicals set off by major stress. Several pharmaceutical companies, he said, have such medications in their pipelines.

In theory, such stress reactions change the brain’s anatomy in ways that may make someone more vulnerable to depression or other problems later on. “But what if,” Dr. Kramer asked, “there is a 4-year-old child and the mother dies, or someone, at any age, loses a close companion or is raped, and you could give them a chemical that could prevent the more permanent encoding of that loss?”

Some might argue, he said, that offering such a drug would be the only ethical thing to do. But equally persuasive is the argument that coming to terms with loss and emotional injury is essential for growth.

It comes down to a question of value. How much is pain worth? How much does depression contribute? A new generation of drugs may force us to re-examine which aspects of our mental lives we consider essential and which we are willing to relinquish.

VISIONS: Identity –
A generation’s anthem: ‘Smells Like Teen Pressure’

By Dirk Johnson

Young, hip and smart, Casey Collier seems to glide from one peak to the next, knocking out A’s, winning student council elections and starring in the school play.

She is an 18-year-old with a dazzling future. So why does she fret?

“I feel overwhelmed – I feel like, ‘Oh my gosh, this stress,’” said Miss Collier, a senior at Shawnee Mission North High School in Kansas, outside Kansas City. “The other night, I wondered, ‘Is it possible to have a nervous breakdown at age 18?’”

Coming of age at a time when technology is magic, the economy is purring like a mountain lion and the nation’s cultural hegemony is unrivaled, American teenagers might be expected to look at the 21st century as the promised land: theirs for the taking.

But in conversations about the future with dozens of teenagers – in sleepy farm towns and hip-hop urban neighborhoods and leafy suburbs – one word threaded them together: pressure. Virtually every one of these teenagers spoke about the looming demand that they succeed and the trepidation they feel about whether they can measure up.

“People feel like young people today have so many opportunities, that if you don’t succeed, then what’s wrong with you,” said Ali Cruso, a 16-year-old high school sophomore in Westerly, R.I. “The bar has really been raised high.”

Groomed by a golden age of national triumph, American teenagers see their charge as obvious: Do not drop the ball.

It is too early to know how such a foundation will shape these young people, as they set out to shape the new century. With their anxieties about the future and economic security, the children of affluence seem to share something, oddly enough, with the children of the Depression, whose lives and legacy were pervaded by the sense that the bottom could drop out at any time.

Being a hardheaded realist can be useful. It curbs utopian excesses. But if young Americans take on the 21st century simply playing not to lose, will they undermine the expansion they are trying to protect? A willingness to risk, after all, lies at the core of American successes.

In his book “The Rise and Fall of the American Teenager,” Thomas Hine argues that the notion of a teenager was an invention of the New Deal, a way to extend childhood so as to keep young people out of jobs desperately needed by men and women.

But America today needs workers, and teenagers seem ready to get to work.

Miss Cruso said: “We are the richest young generation in history. We have cars, cell phones, stereos, pagers.” But privilege is only one end of the bargain.

“It seems like, if you mess up here, it’s going to affect your whole life,” said Matt Scheidler, an 18-year-old high school senior in Chicago. “Life shouldn’t be terrible if you make one mistake.”

Growing up in the culture of money, teenagers have been schooled to be shrewd. They invest in mutual funds. They plan for early retirement. They know about Bill Gates as their parents knew about Paul McCartney. The importance of possessions is so taken for granted by many that it goes unquestioned. One hit song of 1999 could have been a fashion advertisement. “I like girls who wear Abercrombie & Fitch,” the band LFO crooned about cooler-than-cool jeans that can’t be touched for less than $70.

Adam Coleman, 15, a sophomore in Littleton, Colo., does not hesitate about his career goals. “I want to be wealthy,” he said.

The 31 million people between 12 and 19 spent $153 billion this year, which has made them a bull’s-eye for the nation’s marketers. An entire television network, the WB, aims at teenagers with “Dawson’s Creek” and “Buffy the Vampire Slayer.”

If money reigns, “cute” and “brute” come next. The top Web site for boys features professional wrestling. The top Web site for girls is MTV’s, where the musical acts are increasingly known more for heartthrob looks than for their timbre.

While earlier generations found touchstones in books like “The Catcher in the Rye” and “Zen and the Art of Motorcycle Maintenance,” no contemporary text seems to have galvanized today’s teenagers. College professors bemoan that a big share of incoming freshmen cannot name the last full-length book they read.

But teenagers hear in such lamentations the tone of a sneer. On a range of issues, they say they feel wounded by characterizations of teenagers as shallow at best and menacing at worst.

Jessica Brown, a 16-year-old high school sophomore in rural Leland, Ill., said teenagers were tired of being seen as trouble.

“When a group of teenagers walk into a store, everybody gets real nervous,” Miss Brown said. “People are afraid of us, like we’re going to hurt them.”

In her view, the enormous publicity over recent school shootings has irrationally tarnished the image of teenagers. Teenagers also often feel invisible around adults, Miss Cruso added. “They won’t make eye contact, as if we’re all juvenile delinquents,” she said.

As some teenagers see it, the generation now running the world, notably the baby boomers, should scarcely throw stones.

Ryan Watson, a 14-year-old who lives on the West Side of Chicago, said today’s teenagers get along much better than their elders with people of different races. “People now know that if you can’t get along with someone because of their race, well, you’re not going to get very far in life,” he said.

What concerns Mr. Watson, and many other teenagers, is the seemingly unstoppable power of technology. Although some 88 percent of teenagers have found their way onto the Internet, and sending e-mail is almost as popular as talking on the telephone, many say the computer is a dangerous tool. Mr. Watson, himself something of a computer programmer, said young people await human cloning and artificial intelligence as surely as an earlier generation anticipated man’s walking on the moon.

But virtual reality blurs the line between truth and image in unsettling ways. Amy Smith, 15, a sophomore in Bloomington, Ill., said she had stopped using chat rooms. “Who’s real and who’s not?” she asked.

For the most part, teenagers are much more worried about earning the right college admission, getting a sheepskin and making their way up the ladder. “I just want to reach my goals as quickly as I can,” Miss Cruso said. “There’s a lot of pressure.”

VISIONS: Identity –
Seeking a home in the Brave New World

By Edward Rothstein

Each day, in the courtyard of the nation’s most secure federal prison in Florence, Colo., a strange convocation takes place. Three men stand in isolated mesh cages and talk for an hour. One prosecutor has called it “the oddest kaffeeklatsch in the history of Western civilization.” Its members are Timothy J. McVeigh, who bombed the federal office building in Oklahoma City; Theodore J. Kaczynski, the Unabomber; and Ramzi Ahmed Yousef, the mastermind of the World Trade Center bombing. And recently, a place was reserved for Luis Felipe, the leader of the violent Latin Kings gang in New York, who arranged three murders from his prison cell.

And what do the members of this kaffeeklatsch discuss? Movies, we are told – the one cultural experience shared by all Americans, no matter what their criminal proclivities. But there may be other common ground. This right-wing extremist, mathematical Luddite, Islamic fundamentalist and Latin gang leader not only compose the oddest kaffeeklatsch in Western civilization, they also share a disgust with Western civilization. Broadly speaking, each has rebelled against the 20th-century culture of “modernity.” And the rebellion reveals much about the contested status of “identity” at the beginning of the 21st century.

Consider their common enemy. Mr. McVeigh’s militia mind objected to the liberal institutions and democratic process born in the European Enlightenment; in their place he was determined to establish a new world in which his notion of identity would reign supreme. Mr. Kaczynski left a trail of mutilation and murder in his attempt to overturn the life of “modern man,” dreaming of an identity that would rise, he wrote, out of untamed Nature, “independent of human management and free of human interference and control.” Mr. Yousef could chat about how modern Western ideas of rights and liberty are a plot by the Zionists, whose modern nation-state trespassed on the religious claims of Islam. And Mr. Felipe might explain why his creation of a violent ethnic gang, ruthlessly demanding its own laws and loyalties, should displace the blandly secular, melting-pot kingdom of society.

In each case, the modern world is attacked for standing in the way of identity’s claims because identity is not just a declaration of belonging; it is a declaration of opposition. Identity is a dissent from the universal; it declares an exception.

The exception it declares is also from the demands of modernity. The 18th-century Enlightenment gave birth to the idea that there is something beyond identity, that one might be more than a citizen, or an aristocrat, or a deist or an Aryan. One might be, in the best of all possible worlds, a member of a species, endowed with inalienable rights, deserving guaranteed liberties. Society was to be defined not by inherited difference but by inherited humanity. The American Declaration of Independence is not a declaration of identity but a declaration of freedom from compelled identity. Modernity has often accommodated identity, but only by requiring that it submit to a higher law.

This opposition between identity and modernity has defined the main struggles of the 20th century. Identity’s evil triumph was in fascism. Nazism opposed the notion of a universal humanity united under the rule of reason; it celebrated a national identity that staked its claims in blood. Other violent declarations of identity have occurred when national sovereignty lost its power: recent tribal genocides in Africa and Eastern Europe were assertions that identity should define civil authority.

The forces of Enlightenment, though, have led to problems as well. Universal principles might seem noble enough, but in practice they often led to atrocity. How, after all, did communist tyrannies develop? By asserting that the rulers possessed a humane vision justified by science and reason; sacrifices would have to be made in service to that vision, but they would be made for the good of all. The number of deaths attributable to such utopian visions in the last century easily rivals those attributable to national genocides.

Each pole, then, has its dangers. And the struggle between them will cast its shadow into the 21st century. But how has this future struggle been envisioned? And what is most likely to unfold? Forecasters during this century have been far less worried about identity madness than Enlightenment madness, embodied in the triumphant technological state. That state has provided the familiar dystopia of 20th-century science fiction – from Aldous Huxley’s still-disturbing vision in “Brave New World” to films like “Blade Runner” or “The Matrix.”

These predictions of modernity’s evils have also come to embody the way we see our world. In novels, films and computer-hacker fantasies, we imagine ourselves to be powerless puppets and pawns, caught in the identity-crushing maw of corporate capitalism, government, law, media and social institutions. Typically, salvation is offered by an iconoclast, an outlaw who shatters the oppressors’ bonds, ushering in a new age. In practice, this revolutionary myth can occasionally lead to real reforms. But it also leads to exaggerated perceptions and ready-made villains familiar in American culture. The kaffeeklatsch members see the world this way. They claim to be its saviors.

But generally the myth does not, like the kaffeeklatschers, invoke premodern identity to overturn modernity; the state is not replaced with an ethnic, religious or romantic paradise. Instead, identity is itself seen as a creation of the corrupt world, “socially constructed,” as many academic studies now assert. It, too, is imprisoning, ruthlessly imposed. After the libertarian revolution, even identity’s constraints will be shed; one will finally be able to freely invent oneself.

This is an old fantasy, imagined by communist tyrants as well, but technology is reinvigorating it. Internet role-playing games allow the invention of identities. Test-tube conceptions have clouded notions of parents and children; genetic engineering will go even further. And bioengineering is creating human-mechanical hybrids, in which artificial limbs and organs are united with blood and flesh. The inventor Raymond Kurzweil has even argued that in the next century computers will develop consciousness, creating a new species of humanity, amplified in its powers. We are being constructed? Well, then, let us construct ourselves. Technology will liberate us from the oppressions of technology. The future will free us from both identity and modernity.

These grand hopes are also being encouraged by the weakening of the most significant political invention of modernity: the nation-state. Commerce and conversation are becoming borderless transactions. Cultural distinctions are dissolving. In Europe, countries have already ceded some economic and symbolic independence to form the euro, the first modern currency not associated with a particular country. The United States, in recent military actions, has declined the prerogatives of a world power and insisted on multinational consensus.

Some think that transnational corporations will become all-powerful, requiring further revolutionary spasms. But utopians, undeterred, believe that racial and national identities will fade, that distinctions between the sexes will disappear along with traditional family roles, that a global union based on libertarian principles will evolve, that technology will be so inexpensive that it will democratize the world.

Yet these prospects seem unlikely. Yes, social roles will change with prosperity and technological innovation, just as they have in the past. Yes, there will be a greater need for international authority. Yes, humans will have engineered parts and machines with human qualities. But identity’s demands will not disappear, nor will the challenge of mediating between those demands and those of a grander social compact.

There may be no escaping identity, because – so it seems – humanity cannot exist without creating such distinctions and allegiances. We seem to have an inalienable tendency to establish circles of identity, beginning with allegiances of family or tribe, extending outward to allegiances of profession or religion or material interest, encompassing the allegiance of nation, and, only finally, the allegiance of the human. These circles may change shape and range, but can they ever be eliminated? Every utopian project tries to undo these bonds, beginning with the family circle; and every attempt becomes inhuman in its demands.

There may also be no escaping modernity. It provides the premises and vocabulary – of rights, reason and representation – that permeate even efforts to demolish it. So as the new century begins, many of modernity’s opponents deal in paradox. They reject the Enlightenment in the name of the Enlightenment. They celebrate the particular in the name of universalism. Identity is treated as the apotheosis of reason. Some supporters of multiculturalism, for example, at once reject Enlightenment ideals – claiming that even reason is culturally relative and unjustly imposed – and enthrone them, invoking equal rights.

Far more unsettling, though, is the already familiar sight of terrorist groups, kaffeeklatsch types and rogue states taking arms against modernity in the name of identity, invoking notions of rights and equality while violently undoing their meanings, turning modernity against itself.

So, as the 21st century begins, the struggles of the 20th will mutate.

But their consequences may remain familiar. We will still be tempted to prefer extreme formulas to messy truth. Even now, dystopian visions are so prevalent, we tend to forget that imperfectly just societies really do exist, all around us; utopian visions are so tempting, we tend to forget their impossibility. Modernity is so frequently attacked, we tend to forget its necessity. And the passions of identity can be so frightening, we tend to forget their inevitability.

VISIONS: Technology –
Through the eyes of children, a more friendly future

By Jodi Wilgoren

Two women playing tennis under a star-filled sky, trading volleys for a chance to win a trip to the moon. Three malformed animals trying to destroy a tree that has taken away people’s lives and is threatening their master. Pokemon saving the human race from a trio of volcanoes erupting at the same time.

These were among the images that fifth graders at Amistad Academy in New Haven came up with when asked to draw a scene from a day in their lives 20 years hence. Sure, there were cars that could fly and computers that had swallowed whole rooms. But many of the pictures skipped the cartoonish characters of comic books and animated television, suggesting instead simpler, subtler changes to buildings and buses, grass and sky.

Tiquentes Gray drew his own auto-body shop, painted bright yellow, where it would cost $90 to get your car fixed. Amanda Bell-King staged a millennium fashion show. (Fubu survives.) The caption that Johnny Walker wrote to explain his picture of a bare-chested man surrounded by a half-blue, half-green background was spare: “It’s a man who is flying to his destiny. It’s my uncle.”

Amistad, a charter school in its first year, serves mostly poor black and Latino students, three-quarters of whom read below grade level. Its hallways are festooned with banners proclaiming the school’s values: Respect, Enthusiasm, Achievement, Citizenship, Hard Work (Reach). Here are examples of what the students were thinking as they drew the year 2019.

VISIONS: Identity –
‘We tend to do the right thing when we get scared’

In Octavia E. Butler’s latest novels, “Parable of the Sower” and “Parable of the Talents,” the near future is grim and dangerous. Americans have been devastated by an economic meltdown that makes water too expensive to bathe in. A religious right has captured the White House, unleashing a tide of intolerance, misogyny and ruthless persecution. Yet rising from the rubble of society is a movement of renewal called Earthseed. Its leader, an “empath,” is a complex and conflicted black woman.

Never expect Ms. Butler to anchor her tales in sunny-day visions of tomorrow. And never be surprised to discover extraordinary yet flawed black women creating much of the gravity that governs Ms. Butler’s earthy brand of science fiction.

The author of a dozen novels, Ms. Butler, 52, is part of a small cadre of black science-fiction writers.

She grew up in Pasadena, Calif., and recently moved to Seattle. In 1995, she received a fellowship from the John D. and Catherine T. MacArthur Foundation, a so-called “genius award,” for her unusual melding of science fiction with African-American spiritualism.

In an interview with Michel Marriott, a reporter for The New York Times, Ms. Butler considered the prospects for real social change in the future. Following are excerpts.

Q. Will racial and sexual attitudes improve in the 21st century?

A. Absolutely not. I don’t mean that it’s going to get worse. I just mean that we human beings are such naturally contentious creatures. The only way the two issues could disappear is under a regime so totalitarian that we are not permitted to talk about it…

In countries where there are no racial differences or no religious differences, people find other reasons to set aside one certain group of people and generally spit in their direction. … It delights people to find a reason to be able to kick other people.

One of the books that I read when I was doing “Kindred” [about a woman who slips in and out of her ancestors’ slave pasts] was a book called “Slavery Defended.” It was a wonderful addition to my research because you don’t read very much about the defenses of slavery these days. And there were a lot of them. One of them said blatantly that it was necessary that the poorest class of white people have someone that they could be better than…

We are a naturally hierarchical species. When I say these things in my novels, sure I make up the aliens and all of that, but I don’t make up the essential human character, the way we are…

It’s like war. I mean it’s remarkable how many nice boys go off to war and do horrendous things that they don’t really have to do.

I’m not talking about shooting at the enemy. I’m talking about making use of the local women, whether they wish to be made use of or not, or killing people just because they can. I mean power is … you can get drunk on it very easily, and it really does corrupt.

Q. Why do you place black women at the heart of so much of your work?

A. When I began writing science fiction, when I began reading, heck, I wasn’t in any of this stuff I read.

I certainly wasn’t in the science fiction. The only black people you found were occasional characters or characters who were so feeble-witted that they couldn’t manage anything, anyway. I wrote myself in, since I’m me and I’m here and I’m writing. I can write my own stories and I can write myself in.

Q. Could you imagine yourself living in any other era?

A. I was sitting next to a white man who was flying home, and we got talking. Once he understood what I did for a living, he said, which era did I think it would be most fun to live in, and he chose one. It wasn’t the antebellum era, thank goodness. He chose something that he thought would be fascinating.

And what did I think?

As a black and as a woman, I didn’t think that I would really want to live in any of the eras before this, because I would inevitably be worse off. I would have spent more time struggling just to prove I was human than doing my work.

Q. Are you pessimistic about the future?

A. I’m not pessimistic about much of anything. No, I’m hopeful. The only problem that we human beings really suffer from, and this is important to us as a species, is that we tend to do the right thing when we get scared.

Like, consider history as our emergency medical system. It works real well at that level, but unfortunately, we have to wait until the disaster looms. With a disaster like global warming, it’s too late to worry about when it’s looming except to figure out how to adapt to it…

We human beings make a lot of the same mistakes over and over again. It doesn’t seem to help. I’m alarmed at how easy it is, for instance, to railroad people into acting against their own best interest.

VISIONS: Power –
Economic thinking finds a free market

By Floyd Norris

Will the forces of globalism continue to push the world toward American-style capitalism?

As the 21st century begins, advocates of the free market have no doubt that they have won the economic argument. Socialism is dead. Moreover, as a means of creating wealth and material progress, American capitalism seems to be clearly superior to the Asian variety, with its greater level of government planning, or the European version, with its emphasis on social welfare and protection of workers from losing their jobs.

The proponents see the coming era as a period of global transformation: prosperity will rise around the world as technological change and global integration spread along with democracy. “This new boom has the potential to pull the whole world into it, allowing literally billions of people to move into middle-class lifestyles,” Peter Schwartz, Peter Leyden and Joel Hyatt write in their 1999 book, “The Long Boom: A Vision for the Coming Age of Prosperity.”

Certainly capitalism has extended to areas where it was previously unfamiliar. From Ghana to Mongolia, stock markets have sprung up, even as the world’s largest – the New York Stock Exchange – scrambles to hold on to its position as technology opens up new ways to trade. Private enterprise is the dominant economic force, and it is hard to see that changing soon.

Moreover, the technological revolution brought on by computers is tying the world together as never before. Peter F. Drucker, the business historian and management thinker, compares the effect of the Internet to that of the railroad. By lowering transportation costs, the railroad made national businesses possible. Now, with information available instantly, the Internet allows even small businesses to have global ambitions, and it places pressure on nations to relax regulations and cut taxes that make it more difficult for their businesses to compete.

That pressure is to follow the American path. By making it relatively easy to start companies – and to hire and fire workers – the American way has produced innovation. Microsoft, the company that investors think is worth more than any other, did not exist a quarter-century ago. Without such flexibility, the computer revolution might not have been so dominated by American companies.

But the forces of globalism and American capitalism face resistance. The Asian economic collapse in 1997 brought home the risk to countries of dealing with huge, uncontrolled capital flows. Free trade may bring overall benefits, but there are losers whose political influence may be substantial. The protests at the World Trade Organization meeting in Seattle last month united those who fear world government with those who want stricter standards on the environment and the treatment of workers.

The suspicion has grown, even among Americans, that globalism is really a means of helping big business. In a poll of Americans by the Program on International Policy Attitudes at the University of Maryland, 54 percent said that Washington trade officials gave “too much” consideration to the views of multinational corporations; 73 percent believed “too little” attention was paid to the views of “people like you.”

That viewpoint has little support among major political leaders, but it could yet resonate. Jude Wanniski of Polyconomics, a conservative research organization, complains that multinational corporations want “an international trade bureaucracy that serves their interest all the time.”

“This means,” he said, “having more international government under their control, beyond the reach of ordinary Americans who do not understand the imperatives of this new bang-bang world of cyberspace and megadeals.”

That such resentment has appeared now – with the American economy booming and the current period of growth soon to become the longest in its history – may indicate just how vigorous the debate could become, if and when the economy stumbles.

It may be worth recalling that a century ago there was also a trend toward globalization. Improved transport and instantaneous transmission of information – remember the telephone and telegraph? – were hailed as being certain to bring the world together. Trade barriers fell – until World War I. Charlene Barshefsky, the United States trade representative, says that much of what has been accomplished in trade talks over the last couple of decades has simply removed roadblocks that were erected between the two World Wars. Had people acted in an economically rational way, the gains in free trade made a century ago might not have been lost. But they did not.

The economy a millennium ago was far different from ours, but it may hold a lesson as a new millennium begins. In their book, “The Year 1000,” Robert Lacey and Danny Danziger write that honey was the principal source of sweetness. “People paid taxes with it, and it was a lucky day when a swarm of bees settled in your thatch,” they write. Not only was honey valuable, but beeswax provided the best candles, and thus the most reliable lighting system.

“Thus,” observed Marc Faber, a Hong Kong-based money manager, in a review of that book he sent to clients, “it is probable that honey and beeswax were as precious a commodity in medieval times as the Microsoft operating system is today.”

When sugar arrived in Europe, the value of honey collapsed. Mr. Faber thinks something similar will happen to the value of Microsoft one day.

That change is not on the immediate horizon. For now, the information economy is king, and its effects include pressures on other countries to come closer to American-style capitalism. The extent to which opposition to global capitalism slows or even reverses that trend may be the most important economic story of the next few decades.

VISIONS: Biology: Restaurants –
Pristine cuisine kills germs, and tastebuds

By Eric Asimov

In the anteroom of Island, the latest Pristine Cuisine restaurant in fashionable Forest Hills, the crowd was avoiding any contact with one another as they switched their shoes for nylon booties and dipped their hands in that pricey California biocide coyly evocative of eau de Lysol (just ask your grandmother).

Thus protected, they entered the pale, bare dining arena one by one. The food technicians – gowned, gloved and masked – completed the tedious choreography of sani-seating, draping the chairs and the diners. Though surgical green may not be everyone’s most flattering color, at least it protects the white that everyone in New York has been wearing for far too many seasons now.

Let me say this straight out: I am so tired of these restaurants that I am considering guerrilla action – a surprise handshake or even, heaven protect me, spearing a morsel off someone’s plate. I suppose the supergerms and the continuing rise in allergies requires common-sense precautions. But the Health Department’s new no-sharing policy certainly makes my job more difficult. Oh, for the good old days before fear was a fashion statement, when a bite-for-a-bite was a fair trade and not considered assault.

Yes, I know I sound like a decrepit geeze-bag. But while I’m at it, does anyone remember “fusion” cooking, when you could season the quenelles with lemongrass? Pretty soon no one will, if the new chefs continue down their isolationist road. I mean, would it be a sin to serve, say, a Vietnamese dipping sauce with the “deep-water kreplach,” or does that violate some sacred vow among nationalistic chefs? And why is it that chefs can serve the foods of different countries on one compartmentalized plate, but they’re never allowed to touch? Are we afraid of some culinary miscegenation? And isn’t anybody else sick of having everything wrapped in dumpling skins?

I will concede that the dumpling disguise works well with the catch of the day these days; with marine laws restricting so many of the more pleasant-looking varieties until they replenish themselves, the bottom feeders that are permissible are best eaten concealed.

But at Island, food camouflaging has been taken too far. Shrimp shumai, pirogis and lamb manti are all tasty enough, but each plate is a study in beige. And will somebody please explain to me the appeal of salad dumplings? Can anybody actually say they like dough-sheathed olives? It’s as if somebody dumped the pig and put everything else in blankets, to recall a dish from the days when finger food was not a danger signal.

And then, of course, there’s dessert. At least the tarts – of tropical fruit hydroponically grown on the premises since the import ban last year – are colorful. But the Great Chocolate Scare of 2015 is long past. It’s time to bring back mud pies and oozy chocolate cakes.

The one redeeming feature at Island is the impressive list of antiseptic beverages; the bartender and sommelier have certainly risen to the opportunity to offer a well-chosen and potent selection. If only they would allow the clinking of glasses again.

Island
203-45 Austin St., Forest Hills, Queens; (832718) 555-5555.

ATMOSPHERE: All the intimacy of an operating room.

SOUND LEVEL: Hushed.

COMMUNICATIONS POLICY: Personal microwave contact permitted.

RECOMMENDED DISHES: Pierogi, manti, shumai.

SERVICE: Reptilian.

WINE LIST: Excellent selection of Hungarians and Koreans.

CREDIT CHIP: Individual and group scanners.

ARTIFICIAL TRANSPORT: Fueling and charging at each table.

WHAT THE STARS MEAN:
(None) Poor to satisfactory
[ * ] Good
[ ** ] Very good
[ *** ] Excellent
[ **** ] Extraordinary

Ratings reflect the reviewer’s reaction to food, ambience and service, with price taken into consideration. Menu listings and prices are subject to change.

VISIONS: Technology: Quantum computers and cars smarter than you are –
Can molecules beat Moore’s Law?

By John Markoff

Isaac Chuang has built a machine that may one day leave the world’s fastest supercomputers behind, but just now the thing is busy helping its master perform a little magic. Mr. Chuang holds out a paper clip in his cupped hand, and the clip instantly suspends itself in thin air, trapped in a force field generated by the weird device five feet away – a quantum computer.

It’s an elaborate parlor trick. But this is an elaborate parlor: I.B.M.’s Almaden Research Center, where Mr. Chuang, a physicist, is one of hundreds of people searching out the next frontiers of computation. It’s a submicroscopic realm where the rules can be confounding and the results uncertain.

Mr. Chuang’s quantum computer, for example, is based on strange physics that apply when it is possible to control the behavior of individual molecules and atoms. In contrast to conventional computers, which store and retrieve information represented as ones and zeros, quantum machines can simultaneously compute upon a potentially vast number of pieces of information in the space of the same one or zero. It is a through-the-looking-glass world in which yes and no can be true at the same time, and where events can happen simultaneously in multiple places.
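That claim can be made concrete, at exponential cost, on an ordinary computer. The short Python sketch below is purely illustrative and has nothing to do with I.B.M.'s own software: it treats an n-qubit register as a vector of 2^n amplitudes and shows that one Hadamard operation per qubit leaves every n-bit value present in the same single state.

    # Toy classical simulation of an n-qubit register (illustrative sketch only,
    # not the software used at Almaden). An n-qubit state is a vector of 2^n
    # amplitudes; a Hadamard gate on each qubit spreads the register over every
    # possible bit pattern at once.
    import numpy as np

    def uniform_superposition(n_qubits: int) -> np.ndarray:
        """State after applying a Hadamard gate to each qubit of |00...0>."""
        h = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)  # single-qubit Hadamard
        state = np.array([1.0])                                  # empty register
        for _ in range(n_qubits):
            # add one qubit in |0>, then rotate it with the Hadamard
            state = np.kron(state, h @ np.array([1.0, 0.0]))
        return state                                             # 2^n equal amplitudes

    if __name__ == "__main__":
        psi = uniform_superposition(3)
        print(len(psi), "amplitudes:", np.round(psi, 3))
        print("measurement probabilities:", np.round(psi ** 2, 3))  # each outcome 1/8

The point of the sketch is the bookkeeping: the classical simulation needs 2^n numbers, while the quantum register encodes them in only n physical qubits.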

The potential is enormous. The reality is that Mr. Chuang’s device has not shown that it can solve useful scientific problems. And the need for such a breakthrough is being felt with urgency by the computer industry.

In a range of fields that place demands on the components of a modern computer – processing logic, memory and permanent storage – researchers are facing the challenge of how to continue creating faster and more powerful computers once the limits of Moore’s Law have been reached.

In a phenomenon first observed in the 1960s by Gordon Moore, the co-founder of the Intel Corporation, the semiconductor industry has been able to double the number of transistors on a single piece of silicon every 18 months, leading to a still accelerating increase in computer processing power. The industry thinks this will continue until 2014, if not sooner, when the basic processes underlying semiconductor electronics will simply stop working. Devices will be so small that they will be plagued by the very subatomic forces that Mr. Chuang and his colleagues here are hoping to harness.
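The doubling schedule is simple to tabulate. The figures below are a back-of-the-envelope sketch; the circa-2000 starting count is an assumption chosen only to make the arithmetic concrete, not a number from the article.

    # Rough arithmetic for doubling transistor counts every 18 months through 2014.
    # The year-2000 starting count (about 40 million transistors) is an assumed,
    # illustrative figure.
    def projected_transistors(start_count: float, start_year: int, end_year: int,
                              months_per_doubling: float = 18.0) -> float:
        doublings = (end_year - start_year) * 12 / months_per_doubling
        return start_count * 2 ** doublings

    if __name__ == "__main__":
        for year in (2005, 2010, 2014):
            count = projected_transistors(40e6, 2000, year)
            print(year, f"{count:,.0f} transistors")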

In his laboratory, Mr. Chuang holds a thin test tube filled with a bright yellow fluid. Suspended in this liquid crystal solution are about a billion billion (10 to the 18th power) molecules that make up the fundamental component of a quantum computer, known as a qubit. These are molecules chosen because in an intense magnetic field the spins of their atomic nuclei can be precisely oriented. Computing takes place when the molecules are “programmed” with a burst of radio waves, causing the orientation of those spins to change and a quantum calculation to occur.

With a small number of qubits, a quantum computer could perform a vast number of calculations in parallel. One possible use might be to factor large numbers rapidly – perhaps so rapidly as to undermine the encryption systems that are the foundation for electronic commerce on the Internet.
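The threat to encryption rests on a well-known piece of number theory that a quantum computer would merely accelerate: once the period of the sequence a, a^2, a^3, ... modulo N is known, N's factors usually fall out of a greatest-common-divisor calculation. The sketch below does the period search by brute force, which is exactly the exponentially slow step a quantum machine is expected to shortcut; it illustrates the mathematics, not any quantum hardware.

    # Why fast period-finding would break factoring-based encryption:
    # if r is the period of a^x mod n, r is even, and a^(r/2) is not -1 mod n,
    # then gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n) are factors of n.
    # The brute-force period search stands in for the quantum step. Sketch only.
    from math import gcd

    def find_period(a: int, n: int) -> int:
        """Smallest r > 0 with a^r = 1 (mod n); exponential-time stand-in."""
        x, r = a % n, 1
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def factor_via_period(n: int, a: int):
        d = gcd(a, n)
        if d != 1:
            return d, n // d              # lucky guess: a already shares a factor
        r = find_period(a, n)
        if r % 2 != 0:
            return None                   # odd period: try a different a
        half = pow(a, r // 2, n)
        if half == n - 1:
            return None                   # a^(r/2) = -1 mod n: try a different a
        return gcd(half - 1, n), gcd(half + 1, n)

    if __name__ == "__main__":
        print(factor_via_period(15, 7))   # -> (3, 5)
        print(factor_via_period(21, 2))   # -> (7, 3)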

“Fortunately,” Mr. Chuang says, “what quantum computing will take away with one hand it will give back with another.”

And indeed, in the same research laboratory, tucked away in the rolling hills south of Silicon Valley, another group of scientists has developed a working data-scrambling system based on the same quantum phenomena. The system might make it possible to exchange information secretly without fear that the communications could be decoded by even the most powerful computer.

Openness to radically new ideas is common among the scientists here. They also seem to be acutely aware that fundamental advances can neither be engineered nor planned for.

For example, in the early 1990s, an I.B.M. scientist, Stuart Parkin, stumbled upon the crucial material necessary to create a phenomenon known as giant magnetoresistance, or G.M.R., when someone in his lab incorrectly filled in a table on a computer screen. This discovery of a thin sandwich of materials led to a new ultrasensitive sensor for disk drives, permitting a vast increase in the amount of information that can be stored and retrieved by a computer. Now Mr. Parkin has set his sights on a new data storage application for magnetism based on the quantum forces Mr. Chuang is exploring. He is working on a new kind of memory chip – known as magnetic random access memory – which, if it can be made cheaply, might become the standard computer memory for future personal digital assistants and every other portable electronic device.

Mr. Parkin spends his days searching for the magic combination of superthin sandwiches of molecules that interact in precisely the right way. “It’s a huge space to search,” he said. “It involves serendipity, luck and hopefully some science.”

VISIONS: Technology: Quantum computers and cars smarter than you are –
A new meaning for automatic

By Edmund L. Andrews

Hans-Georg Metzler envisions the day when he has conversations with his car.

It will tell him about traffic jams and predict where new ones probably will develop. It will watch for jaywalkers and warn him if he seems not to have noticed a stop sign ahead. It will listen for unusual engine noises, run diagnostic tests and report any problems. And if he gets caught in slow-moving highway traffic, it will be able to take over the controls, keeping a safe distance behind the vehicle ahead.

“It will be a dialogue,” said Mr. Metzler, who heads research on “machine understanding” at DaimlerChrysler here. “The car will be taking information from its surroundings and putting it to use. That is the function of a brain.”

Mr. Metzler is at the crossroads of automobiles and computing. It is a clash of different traditions: one, a neo-Newtonian world grounded in heavy metal and seemingly immutable laws of motion, inertia and gravity; the other, a quicksilver world of molecular electronics and algorithms in which the speed limits constantly double.

Cars have been getting smarter for years. On-board navigation systems can map the best route to a destination and tell drivers if they make a wrong turn. Some top-of-the-line Mercedes-Benz models can warn drivers when they get too close to other cars.

But for people like Mr. Metzler, that is just a start. He is convinced that the future of cars is in “drive-by-wire” systems that would fundamentally change the relationship between driver and car.

Consider Daimler’s somewhat bizarre “side-stick” car, a sporty yellow Mercedes coupe that has no steering wheel and no pedals for either braking or accelerating. Instead, all controls are on joysticks to the right and left of the driver.

Press forward on either one or both of the joysticks and the car accelerates. Pull backward and the car brakes. Push to the right or left and the car turns accordingly.

Why do it? Daimler engineers say the joystick car is much simpler to control than traditional vehicles because everything can be done in one fluid move of the hand. Moreover, the on-board electronic systems simulate a feeling of the road conditions in the joystick, which moves more freely if the wheels are spinning on ice and more slowly if the car is starting up a steep incline.
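How one stick can stand in for wheel and pedals comes down to a mapping from stick deflection to steering, throttle and brake commands. The Python sketch below is a hypothetical illustration of such a mapping for a single stick; the axis conventions, gains and limits are invented and are not Daimler's control software.

    # Hypothetical side-stick mapping (not Daimler's algorithm): one joystick's
    # deflection, x for left/right and y for forward/back, each in -1..+1,
    # becomes steering, throttle and brake commands.
    from dataclasses import dataclass

    @dataclass
    class CarCommand:
        steering: float   # -1 (full left) .. +1 (full right)
        throttle: float   # 0 .. 1
        brake: float      # 0 .. 1

    def stick_to_command(x: float, y: float) -> CarCommand:
        clamp = lambda v: max(-1.0, min(1.0, v))
        x, y = clamp(x), clamp(y)
        return CarCommand(
            steering=x,              # push right, turn right
            throttle=max(0.0, y),    # push forward, accelerate
            brake=max(0.0, -y),      # pull back, brake
        )

    if __name__ == "__main__":
        print(stick_to_command(0.2, 0.8))     # gentle right turn while accelerating
        print(stick_to_command(-0.5, -1.0))   # left turn while braking hard

The simulated road feel described above would sit on top of such a mapping, stiffening or loosening the stick according to what the wheels report.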

“You have the feeling you have full control right in the grip of your hand,” said Lutz Eckstein, 31, an engineer who wrote some of the algorithms and has tested the car.

In tests, Mr. Metzler said, 17-year-old novice drivers seemed to handle the car well. But people accustomed to conventional vehicles had a much harder time.

Daimler executives do not know whether they will ever try to market such a car. The idea, they say, is mainly to show that advanced electronics make it possible to think about car design in entirely new ways.

Social and political acceptance will also affect the future of ever more intelligent vehicles. Klaus-Dieter Vohringer, DaimlerChrysler’s managing director in charge of research, is convinced that technology will make accident-free traffic possible within the next 10 or 20 years.

But it may take much longer before drivers or regulators accept it. Cars could see and hear the world around them. They would not only warn drivers about trouble but would be allowed to take partial control.

One glimpse of the future is Daimler’s experimental “urban transit assistant.” Video cameras produce three-dimensional images of the road. An on-board computer matches the stream of incoming images with a library of images of pedestrians, traffic signs and potential barriers. It then highlights, in blue or red, the things the driver should stop for or pay attention to.

Many Daimler engineers think it would be dangerous in practice to flash such images in front of drivers. But the system could warn drivers with alarms or spoken words.

The bigger issues are about control and trust in technology. Should cars of the future simply do more to warn drivers or should they sometimes take over the controls?

Daimler is already campaigning for governments to approve what it calls an electronic tow bar, a system that would regulate the speed and distance between trucks in a convoy. The technology would enable trucks to reduce wind resistance and save fuel by bunching together, Daimler engineers say.
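The gap regulation at the heart of such a system can be pictured as a simple feedback loop: the following truck speeds up when the gap is too large or the leader is faster, and slows down otherwise. The sketch below is a generic toy controller with invented gains and an assumed 10-meter target gap; it is not Daimler's electronic tow bar.

    # Toy gap-keeping loop for a truck convoy (illustrative assumptions only:
    # the gains, time step and 10-meter target gap are invented).
    def follow_step(gap_m: float, follower_speed: float, leader_speed: float,
                    target_gap_m: float = 10.0, k_gap: float = 0.2,
                    k_speed: float = 0.5, dt: float = 0.1):
        """Advance one control step; returns (new_gap_m, new_follower_speed)."""
        # Accelerate when the gap is too wide or the leader is faster; brake otherwise.
        accel = k_gap * (gap_m - target_gap_m) + k_speed * (leader_speed - follower_speed)
        new_speed = max(0.0, follower_speed + accel * dt)
        new_gap = gap_m + (leader_speed - new_speed) * dt
        return new_gap, new_speed

    if __name__ == "__main__":
        gap, v_follow, v_lead = 30.0, 20.0, 22.0      # meters and meters per second
        for _ in range(600):                           # one simulated minute
            gap, v_follow = follow_step(gap, v_follow, v_lead)
        print(f"gap settles near {gap:.1f} m at {v_follow:.1f} m/s")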

Looking further ahead, engineers here are convinced that cars will be able to anticipate collisions. Cars would also continuously monitor road conditions and surrounding traffic. They would beam a constant stream of reports over wireless networks to regional traffic authorities.

Drivers might give up some control. On highways, they could leave it to the cars to keep a safe distance from one another. If a collision loomed, cars might even communicate with one another to coordinate an avoidance strategy.

All this could detract from the thrill of being in full command of a high-performance car. But it may happen anyway.

“Driving is fun, and customer preferences must always be on the agenda,” Mr. Vohringer said. “But if there is a situation where the human being is not capable of reacting, then the technology should be able to react to dangers.”

VISIONS: Cities –
Will entrepreneurs still be heroes?

By Diana B. Henriques

When Fortune magazine was introduced in February 1930, its premise, according to one historian, was “that the great businessmen were the new supermen of civilization.” Almost 70 years later, the rest of America is coming around to Fortune’s way of thinking.

As the new millennium opens, entrepreneurs – from Warren E. Buffett of Berkshire Hathaway to Jeff Bezos of Amazon.com – are the Lindberghs and DiMaggios of a wired and bullish generation.

Will it last? After all, Fortune’s “new supermen” quickly became the whipping boys of the New Deal; even today, the evil Mr. Burns of “The Simpsons” is better known than Ayn Rand’s heroic capitalists.

“Ultimately, I think the image of businessmen depends on how well the business world does,” said Richard Sylla, a business-history professor at New York University. Even as television and the movies continue to denigrate executives and entrepreneurs, a more positive view may be emerging in serious literature.

“Today’s business heroes are selling something – technology – that they themselves created, much like the early Henry Ford,” said Emily S. Watts, author of “The Businessman in American Literature.” She sees a certain “romance” with that aspect of business reflected in some novels of Richard Powers and Douglas Coupland.

Still, American fame has always had a short shelf life.

“Already, the market for business entrepreneurs feels a little overvalued,” said the cultural historian Thomas Hine. “We may be near a top.”

VISIONS: Cities –
Going higher tech degree by degree

By Karen W. Arenson

Despite warnings that too many people are overeducated, new high school graduates and adults long out of school are expected to flock to college in ever-growing numbers in the years ahead, even as the notion of higher education expands and changes. There will always be a Harvard (and a Yale and a Princeton). But other traditional colleges are likely to be bypassed as students seek convenience and specially tailored services – and colleges respond. The watchwords will be high technology and cost efficiency, and courses will be delivered in many ways. For better or worse, the future is full of niches to be filled. Here are some institutions that higher education experts expect to be on the cutting edge.

UNIVERSITY OF PHOENIX – If there is a McDonald’s of higher education, it is Phoenix. The name produces shudders among many educators, who say it does not offer the face-to-face interaction that has long characterized elite higher education. A for-profit institution with classrooms in 26 states, and more on the way, Phoenix has turned college education for older students into a mass-market commodity. It has 86,800 degree students, most concentrating in fields like business and technology (including 10,382 who take courses over the Internet). Phoenix employs mostly part-time faculty members to teach a standardized curriculum developed in central headquarters and presented in leased classrooms.

MASSACHUSETTS INSTITUTE OF TECHNOLOGY, Cambridge, Mass. – For the last half-century, the federal government has pumped billions of dollars into research universities to finance their expansion. With that growth ending, universities are looking elsewhere for support. Few have been as successful as M.I.T., which has formed partnerships with corporations ranging from Microsoft to Merrill Lynch for money, research and teaching opportunities. One visible model of that collaboration is its industry-supported Media Laboratory, where professors and students work side by side on high-technology projects. The British government recently tapped M.I.T. to work with Cambridge University and others to bolster their entrepreneurial spirit.

ALVERNO COLLEGE, Milwaukee – As students, professors and politicians seek clearer definitions of what a basic college education should include and how to measure the results, Alverno, a small Roman Catholic women’s college, offers a breathtakingly simple approach to both. Its answer: students should learn eight sets of specific skills – including communications, analysis, problem solving and social interaction – and their progress should be monitored through regular assessment based on predetermined criteria. Their growth in public speaking, for example, might be judged by comparing videos from one year with those of previous years. Deceptively simple, but Alverno has won many awards and drawn a stream of corporate and educational visitors from as far away as Hong Kong and Abu Dhabi.

RENSSELAER POLYTECHNIC INSTITUTE, Troy, N.Y. – Convinced that students were not learning as much as they should, R.P.I. turned its classrooms and its curriculum inside out. On entering a physics or calculus classroom, students plug their laptop computers into ports at their desks. A teacher lectures for 5 or 10 minutes, then asks students to solve problems using what they have just learned. Students cannot tune out. And by watching how students deal with problems on their computers, teachers can tell whether students grasp a concept or need more explanation. Rensselaer professors say their students learn fewer facts but have better command of material.

TRUMAN STATE UNIVERSITY, Kirksville, Mo. – In an era in which prestige is closely linked to the test scores of entering students, some public colleges have transformed themselves from run-of-the-mill, comprehensive institutions into increasingly selective ones. Among those that have come the furthest is Truman State, which evolved from Northeast Missouri State Teachers College into a small liberal arts version of flagship institutions like the University of California at Berkeley and the University of Michigan. Others include the College of New Jersey, formerly Trenton State, and the State University of New York at Geneseo. Truman State offers small classes and an honors program and continually tests its 6,000 students with national examinations to determine where it needs to improve instruction.

VISIONS: Biology: Turning up the heat –
Two forecasts for Earth 2100

By William K. Stevens

What will the weather be like in 2100?

Don’t laugh. While forecasting the next century in any field is a dicey exercise, climate experts have more tools than most prognosticators.

Hence the widespread concern about global warming. Computer models enable climatologists to project the effects of the ever-increasing emissions of industrial gases, like carbon dioxide, that scientists contend are warming the atmosphere.

Of course, the picture that the models paint depends on the scientists’ assumptions. A 1999 study by Dr. Tom Wigley, a climatologist at the National Center for Atmospheric Research in Boulder, Colo., ran four possible futures through climate models to examine how global temperatures would respond. Then he combined the results of 15 models to explore the effects on temperature and precipitation in North America.

Presented below are projections for the year 2100 under two of Dr. Wigley’s cases: those assuming the highest and lowest levels of emissions.

A word of caution. Computers do not yet have the capacity to make more than relatively crude simulations of the climate’s behavior, and scientists have not yet firmly pinned down the climate’s sensitivity to warming by greenhouse gases. Dr. Wigley’s analysis, conducted for the Pew Center on Global Climate Change in Arlington, Va., assumed a moderate sensitivity of about 4.5 degrees Fahrenheit for a doubling of atmospheric carbon dioxide.
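The sensitivity figure enters the arithmetic through a standard rule of thumb: warming scales with the sensitivity times the base-2 logarithm of how much the carbon dioxide concentration has multiplied. The sketch below applies that textbook approximation with the 4.5-degree sensitivity quoted above; the concentration values are illustrative assumptions, not figures from Dr. Wigley’s study, and the relation gives eventual equilibrium warming rather than the 2100 snapshots that follow.

    # Back-of-the-envelope use of climate sensitivity: the common approximation
    # dT = S * log2(C / C0), with S = 4.5 degrees F per doubling of CO2.
    # The concentration scenarios are invented for illustration.
    from math import log2

    SENSITIVITY_F_PER_DOUBLING = 4.5
    PREINDUSTRIAL_CO2_PPM = 280.0

    def equilibrium_warming_f(co2_ppm: float) -> float:
        """Eventual warming, in degrees Fahrenheit, implied by the approximation."""
        return SENSITIVITY_F_PER_DOUBLING * log2(co2_ppm / PREINDUSTRIAL_CO2_PPM)

    if __name__ == "__main__":
        for label, ppm in (("doubled CO2", 560.0), ("tripled CO2", 840.0)):
            print(f"{label}: about {equilibrium_warming_f(ppm):.1f} degrees F")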

The Highest-Emissions Future

ASSUMPTIONS

  • High population growth
  • Relatively little emphasis on technological change
  • No specific steps by governments to control greenhouse gas emissions

GLOBAL CONDITIONS IN 2100

  • Average surface temperature increase of about 5 degrees Fahrenheit
  • Sea level would rise about 2 feet

The Lowest-Emissions Future

ASSUMPTIONS

  • Low population growth
  • Emphasis on global action to reduce environmental problems, but no policy steps on greenhouse gas emissions specifically
  • Rapid technological change
  • Use of cleaner-energy technologies

GLOBAL CONDITIONS IN 2100

  • Temperature increase of about 3.5 degrees Fahrenheit
  • Sea level would rise about 1.5 feet

VISIONS: Cities –
Manifestoes to give city a new edge; Cooper Union for technology

By Allen R. Myerson

For 140 years, the Cooper Union for the Advancement of Science and Art has provided worthy but, in many cases, needy students with training in art, architecture and engineering, all on full scholarships for tuition. Its founder, Peter Cooper, built the nation’s first steam railroad engine but bemoaned his own lack of formal education.

Now Fernando Espuelas, whose company, StarMedia Network, provides a global online network for speakers of Spanish and Portuguese, wants such a school for the Internet age: one that would train students in technology – especially Internet technology – and, like Cooper Union, provide full scholarships. He expects Silicon Alley companies to be eager supporters.

“It is compelling not only from a business standpoint, but also for the social implications,” Mr. Espuelas says.

VISIONS: Biology –
A genetic future both tantalizing and disturbing: a small leap to designer babies

By Sheryl Gay Stolberg

Unlike the sheep named Dolly, the mouse named Lucy has barely made a ripple in the news. Lucy, born on October 26, 1998, is the creation of scientists at Chromos Molecular Systems, a biotechnology company in Vancouver, British Columbia. She has black-brown fur and is not unlike mice bred in laboratories around the world, the company says, with one exception: Lucy has an artificial chromosome that she has passed to her offspring.

And so the little mouse’s implications may be vast. Her descendants, experts say, represent another critical step in biotechnology’s inexorable march toward the day when parents will be able to design their own babies: eliminating genes for undesirable traits, adding genes for desirable ones – and in the process altering the DNA of generations to come.

This will not be the work of some mad scientist squirreled away in a dungeon lab. It will flow from legitimate research intended to treat disease and cure infertility. Scientists are already pushing the frontiers of cloning; of the manipulation of embryonic stem cells, which grow indefinitely and can develop into any tissue in the body; of gene therapy, which uses genetic material to treat disease; and of in vitro fertilization.

In the next century, experts say, these technologies will merge, enabling doctors to tinker with the genes of eggs and sperm. Thus will “germline genetic engineering,” now only a theory, become reality.

“There will be enhancements to life span, alterations to personality, like intelligence,” says Dr. Gregory Stock, the director of the Program on Medicine, Technology and Society at the University of California at Los Angeles. “In the not-too-distant future, it will be looked at as kind of foolhardy to have a child by normal conception.”

Genetic research has always posed vexing social and ethical issues. But there may be no more complicated question than how the fruits of the genetic revolution should be used and, most important, who should have access to them. In the last century, eugenics was about the exercise of power and ideology. In the next, it may be about money.

Already, wealthy childless couples routinely spend tens of thousands of dollars trying to conceive babies through science. Already, couples who carry genes for Tay-Sachs disease or sickle cell anemia can have their embryos screened to prevent passing on those genes. Dr. Lee M. Silver, a molecular geneticist at Princeton University, says that someday, a doctor will tell parents: “I’ve got your embryos under a microscope. How about if I add a couple of genes to provide cancer resistance?” Or genes for stronger muscles? Or musical talent?

Dr. Ruth Hubbard, a retired professor of biology at Harvard University, is offended by the thought. She predicts that safety concerns will prevent germline engineering from coming to pass; the risk of creating deformed babies is too great, she says. And, she points out, biology is not destiny: “What if you have been engineered to be a dancer and you’re a total clod? How horrible for those children.”

Dr. Silver agrees that genes confer an advantage – nothing more. But if some children are lucky enough to be born with genes that protect them against cancer or make them musical geniuses, he reasons, what is wrong with using science to give other children the genes those youngsters come by naturally? Is it any different from paying for private school? “The answer to what’s wrong with this,” Dr. Silver says, “is that it is people with money who will be able to not only give their child a better environment, but also better genes.”

That is a concern others share. “Have you seen ‘Gattaca’?” asks Dr. W. French Anderson, the geneticist who in 1990 conducted the first gene therapy experiment. In that 1997 science-fiction film, society was divided into “valids,” whose genetic profile was corrected before birth, and “invalids,” who were conceived naturally. “That’s exactly what will happen in our society if there are no safeguards,” Dr. Anderson warns.

Yet Dr. Anderson is among those pushing the technology forward. He has proposed performing gene therapy on a fetus to try to cure a rare immune deficiency, although the National Institutes of Health has rejected the plan. As to engineering eggs and sperm, he says, enhancement should be forbidden. What would he permit? “Anything which brings a child from below-normal function back up to normal function.”

But the question of what is normal is tricky. Adrienne Asch, a bioethicist at Wellesley College, is blind, a condition she says does not prevent her from functioning normally. Even if germline gene therapy could safely prevent blindness in children, Dr. Asch says, she would be apprehensive. She views Dr. Anderson’s standard as problematic.

“If you say you are only going to use it for disabilities, but you are not going to use it for character traits, then is schizophrenia a disability but depression a character trait?” she asks. “When is dwarfism a disability and when is being short a social problem?”

The American Association for the Advancement of Science has spent the last two years wrestling with such thorny questions; it has convened two working groups that are examining ethical and scientific issues surrounding germline gene therapy. A report, intended as a guide for policy makers, should be released early this year.

That effort wins praise from Dr. Paul Billings, a geneticist and member of the board of the Council for Responsible Genetics, a nonprofit group that opposes germline engineering. Dr. Billings says he has safety concerns about the technology and warns that altering genes may prove an exercise in hubris. He wants limits placed on germline engineering before it becomes feasible, but doubts that will happen.

“We are market driven,” he says. “We emphasize the individual freedoms of scientists and others. The cow is out of the barn.”

VISIONS: Cities –
A stud in every tongue, and logo tattoos

By Jennifer Steinhauer

It took America almost the entire 20th century to catch up with the rest of the world in its collective interest in body adornment, but in the end the country’s beauty industry has embraced its extremes.

Only 50 years ago, dark lipstick was considered risque, to say nothing of men in nail polish and women covered in body glitter. Tattoos were for sailors, although by the 1960s, they were adopted by groups who considered themselves on the fringes of society, like motorcycle gangs. Ditto for piercing, beloved by punk rockers.

But class distinctions have all but fallen away in the turn-of-the-century proclivity among Americans to adorn their bodies. Suburban teenagers pierce their tongues, 40-year-old mothers decorate the corners of their eyes with plastic jewels, movie stars take town cars to Queens to get their hands covered with inky henna and mainstream cosmetic giants sell body paint. Indeed it may well be that the explosive Wall Street bull market – and the materialism that it implies – ignited this fin-de-siecle trend, much of which finds its roots in places as diverse as India and Papua New Guinea.

“There has been interest in the tribal world that in one context has been a critique of Western society,” said Jeffrey D. Ehrenreich, the chairman of the department of anthropology at the University of New Orleans. “It is a response to the perception that our culture does not meet very basic and fundamental spiritual needs in spite of the material wealth accomplished through our culture. People don’t feel attached. They don’t feel special.”

Experts on body adornment – there happens to be no shortage of them – seem to concur that the new century will bring about only more forms of the art. In some cases, the manifestations will be more extreme, like the inlay of metal into arms, which is already being done. Or perhaps in many ways, more conformist, as in having one’s corporate logo of choice, now perhaps just on the back of a jacket, tattooed on the leg.

“There probably will be a continuing escalation of body art options available,” said Daniel Wojcik, an associate professor of English and folklore studies at the University of Oregon. He imagines a combination of past styles, like ‘60s hippie chic, with, say, a tattoo inspired by TV or film. “The final result could be an eclectic or ironic mix-up, like Pokemon/biker style, with tribal piercings and bondage wear.”