New! My Talk at Google's 2018 I/O Conference on Building Healthier Relations with Technology

Just Hand-Wringing? Why the Excesses of Technology Need Watching

In 2018, I did an interview with one of the UK’s leading environmentalists, Rob Hopkins, about the fragmentation of attention in modern life. At the time, Hopkins was exploring an overlooked hurdle to solving global warming – our waning ability to think well and even to muster the creativity needed to imagine and shape a better future. In the next few weeks, I’ll be sharing excerpts from our wide-ranging interview, which you can read or listen to on his great website, “Imagination Taking Power.”

Hopkins: What is qualitatively different about the kind of impacts we’re seeing in terms of attention now?

Jackson: How do we know what is different, what has truly changed in our lives? What’s better, what’s worse? Are our concerns about technology and distraction just the kind of handwringing that we have always seen when things change? I have two answers to that question. First, the totality of what we’re dealing with is so much greater. Teens are on average exposed to nearly six hours of non-print media a day, and a significant minority experience nearly eight hours of media a day. So while media and technology were just a slice of life in the past, now they are a constant. They are the reality. We inhabit the virtual world in disproportionate measure to the physical, and that shift has taken just a generation to unfold.

Second, it’s important to note that we are getting a better handle scientifically on the impact of these changes, especially cognitively. We have strong correlational evidence linking time spent on smartphones or online to lowered well-being in children and declining empathy among young adults. Moreover, steep declines in children’s and adults’ capacity to imagine, persist in problem-solving, and reason coincide with recent decades of rapid technological penetration. This is important. We don’t have the full picture, but we are beginning to understand the effect of technology on our lives and on our minds.

Certainly we should remain aware of the human tendency to yearn for the familiar and to look nostalgically back at the past. But the bottom line is that we have to solve the problems of our day, and there are just too many signs that digital living in its current forms raises red flags. For instance, the ability of Americans from kindergarten to adulthood to elaborate on a problem – to put flesh on an idea – has dropped by 40 percent since the 1980s, and the steepest drop has occurred in the years since technology came to play such a dominant role in our lives.

There are warning bells, and just as with climate change, we can wait until all the t’s are crossed and i’s are dotted on the evidence, or we can act to solve the problems of our day, using the best possible assessments available to us at this time.

My Appearance on Italy’s Top Investigative News Show

PresaDiretta – Maggie Jackson Interview – from Iperconnessi – Oct. 15, 2018
Narrator: “Nine years ago, when we were full of enthusiasm about the arrival of smart phones, Maggie Jackson, working for the Boston Globe, the major newspaper in Boston, wrote a prophetic book, which has just been republished, about distraction and its impact on a society that is constantly connected.”
Jackson: “Distraction is not just about being pushed toward something irrelevant, but also about life exploding into a thousand pieces, and I think this is the sense of our distraction today. We skip from one thing to another, no longer able to understand what is important and what is not. We have created a society that rewards only what is easy and comfortable. But when we have to resolve a difficult issue or answer a complex question, that requires the use of a part of us that no longer functions well in this ‘online world.’ We see only the advantage of having all this information at our disposal. The idea that we have access to immediate information leaves the impression that information is easy. Everything is downloadable. As one scientist says, ‘Online you don’t have to face your ignorance.’ You never have to say ‘I don’t know.’ You don’t have to be humble any more. But humility is the starting point for learning. To open yourself up to the new is the only way to reach the best answer.”

Jackson: “When you live without paying attention to others, you basically regress to a more primitive form of thinking, the stereotyping and quick assumptions that make you intolerant and filled with prejudice. The quick takeaway is that you form simplistic categorizations because it’s much easier to hate than to understand. I believe that this culture of distraction leads you quite directly to fascism, to authoritarian cultures, and this, today, is the real danger.”

PresaDiretta Host: “Okay, so, if you find it too much to attribute even the crisis facing democracy to smartphones, I understand, but in reality the reflections of the writer Maggie Jackson should not be taken literally. What she is telling us is that when everything passes through a smartphone, when political thinking is exhausted and reduced to the 280 characters of Twitter, when the number of ‘likes’ begins to drive complex political choices, a drift to authoritarianism is closer than it seems today.”

The Costs of Instantaneity

Are we using our technologies wisely?

That’s one of the points that I discussed recently in an interview for the intriguing new blog Human-Autonomy Sciences, curated by two leading psychology researchers on human-machine interaction, Clemson University’s Richard Pak and Microsoft Senior Design Research Manager Arathi Sethumadhavan.

Here is an excerpt from our e-conversation:

Pak — What does the future of human relationships with technology look like: good, bad, or ugly?

Jackson — The essential question is: will our technologies help us flourish? The potential – the wondrous abundance, the speed of delivery, the possibility for augmenting the human or inspiring new art forms – is certainly there. But I would argue that at the moment we aren’t for the most part using these tools wisely, mostly because we aren’t doing enough to understand technology’s costs, benefits, and implications.

I’ve been thinking a lot about one of technology’s main characteristics: instantaneity. When information is instant, answers begin to seem so, too. After a brief dose of online searching, people become significantly less willing to struggle with complex problems; their “need for cognition” drops even as they begin to overestimate their ability to know. (The findings echo the well-documented “automation effect,” in which humans stop trying to get better at their jobs when working closely with machines, such as automated cockpits.) In other experiments, people on average ranked themselves far better at locating information than at thinking through a problem themselves.

Overall, the instantaneity that is so commonplace today may shift our ideas about what human cognition can be. I see signs that people have less faith in their own mental capacities, as well as less desire to do the hard work of deliberation. Their faith increasingly lies instead with technology. These trends will affect a broad range of future activities: whether people can manage a driverless car gone awry, or even think it’s their role to do so; whether they still recognize the value of “inefficient” cognitive states of mind such as daydreaming; whether they have the tenacity to push beyond the surface understanding of a problem on their own. Socially, similar risks are raised by instant access to relationships – whether to a friend on social media or to a companion robot that’s always beside a child or elder. Suddenly the awkwardness of depth need no longer trouble us as humans!

These are the kinds of questions that we urgently need to be asking across society in order to harness technology’s powers well. We need to ask better questions about the unintended consequences and the costs/benefits of instantaneity, or of gaining knowledge from essentially template-based formats. We need to be vigilant in understanding how humans may be changed when technology becomes their nursemaid, coach, teacher, companion.

Recently, an interview with the singer Taylor Goldsmith of the LA rock band Dawes caught my eye. The theme of the band’s latest album, Passwords, is hacking, surveillance and espionage. “I recognize what modern technology serves,” he told the New York Times. “I’m just saying, ‘let’s have more of a conversation about it.’”

Well, there is a growing global conversation about technology’s effects on humanity, as well there should be. But we need to do far more to truly understand and so better shape our relations with technology. That should mean far more robust schooling of children in information literacy, the market-driven nature of the Net, and in general critical thinking skills. That should mean training developers to become more accountable to users, perhaps by trying to visualize more completely the unintended consequences of their creations. It certainly must mean becoming more measured in our own personal attitudes; we all too often still gravitate to exclusively dystopian or utopian viewpoints on technology.

Will we have good, bad, or ugly future relations to technology? At best, we’ll have all of the above. But at the moment, I believe that we are allowing technology in its present forms to do far more to diminish human capabilities than to augment them. By better understanding technology, we can avert this frightening scenario.


A New Vision of Balance: Tech-Life, Not Work-Life

A new vision of human flourishing is urgently needed. I call it “tech-life balance.”

That’s one of the points that I discussed recently in an interview for the intriguing new blog Human-Autonomy Sciences, curated by two leading psychology researchers on human-machine interaction, Clemson University’s Richard Pak and Microsoft Senior Design Research Manager Arathi Sethumadhavan.

In coming weeks I’ll be sharing parts of my interview with Pak in this space. Please share, comment, and ponder!

RP – How can technology facilitate a healthy work-life balance?

MJ – Over the last 20 years, technology has changed human experience of time and space radically. Distance no longer matters much, nor duration, as devices allow us to fling our bodies and thoughts around the globe near-instantly. While on a business trip, a parent can Skype a bedtime story with a child at home. The boss can reach a worker who’s hiking on a remote mountaintop. Technology has broken down cultural and physical boundaries and walls – making home, work, and relationships portable. That’s old news now, and yet we’re still coming to grips with the deep impact of such changes.

For instance, it’s becoming more apparent that the anywhere-anytime culture isn’t simply a matter of carrying our work or home lives around with us and attending to them as we wish. It’s not that simple by far. First, today’s devices are designed to be insistent, intrusive systems of delivery, so any single object of our focus – an email, a text, a news alert – is in competition with others at every minute. We now inhabit spaces of overlapping, often-conflicting commitments and so have trouble choosing the nature and pace of our focus.

The overall result, I believe, is a life of continual negotiation of roles and attentional priorities. Constant checking behavior (polls suggest Americans check their phones on average up to 80 times a day) is a visible symptom of the need to rewrite work-life balance dozens of times a day. The “fear of missing out” that partly drives always-on connectivity also is a symptom of the necessity of continually renegotiating the fabric of life on- and off-line.

Because this trend toward boundary-less living is so tech-driven, I believe that the crucial question today is improving the balance between digital and non-digital worlds. After that, work-life balance will follow.

We need to save time for uninterrupted social presence, the kind that nurtures deeper relationships. We urgently need space in our lives where we are not mechanically poked, prodded and managed – i.e., when we are in touch with and able to manage our inner lives. (Even a silent phone in “off” mode undercuts both focus and cognitive ability, according to research by Adrian Ward at the University of Texas at Austin.)

One solution would be to think more deliberately about boundaries in all parts of our life, but especially in the digital sphere. Too often lines of division are seen as a confinement, a kind of archaic Industrial Age habit. But boundaries demarcate; think of a job description, a child’s bedtime, or the invention of the weekend, a ritual that boosts well-being even among the jobless. Boundaries are systems of prioritization, safety zones, and structures for depth – crucial tools in a digital age. A family that turns off its cell phones at dinner is creating opportunities for the kind of in-depth bonding that rarely is forged online.

Technology can help facilitate creative boundary-making – think of the new Apple and Google product designs that prompt offline time. But our devices cannot do the work of inventing and managing the boundaries that are crucial for human flourishing.

Invited HuffPost TED Weekend Blog: Beyond Gaming, There’s Life


We’d just begun a family vacation this summer, when my teenager woke up barely able to swallow, with a throat raw and sore. I took her to the nearest ER, where the wait was blessedly brief. A triage nurse whisked into the examining room with a laptop on wheels and began questioning my daughter. Name? Weight? Pain on a scale of 10? The nurse was efficient, yet something was missing. During a 10-minute checklist, she never once looked at the case – the bundle of humanity (and mystery) that is my daughter.

Was I expecting too much of this moment? Checklists in medicine can prevent infections. Taking 10,000 steps a day is now a global health movement. Shaking hands for six seconds boosts oxytocin, the “trust” hormone, Jane McGonigal recounts in her TED talk on how simple game-based tricks can better our lives. Anything daunting or monumental – health, medical diagnosis, resilience – demands entry points. The lists and formulas and tips that we adore point our muddled selves in the right direction, making small but powerful changes possible. Now portable and automated, they can help the fragile roots of good habits take hold.

But are these entry points to change too often seen as endpoints today, especially when they come to us so easily, with a click and a touch? Are we increasingly sated by the checklist and tipsheet? Consider that a majority of teachers now see a link between middle and high school students’ use of digital tools and careless, short-cut writing. Most online searches consist of one query, and we tend to open just one document per search. Since the mid-1980s, Americans show a 35-percent drop in their ability to elaborate on ideas, a key measure of creativity. While briefly using my daughter’s laptop, I was taken aback to see slightly off-target word suggestions flashing above my prose – the work of her school software. How often had an algorithm’s choice eclipsed a moment of potential student musing?

Yes, we evolved to survive a threatening world by plucking the low-hanging fruit – and by using tools to extend our grasp. Shortcuts and quick fixes appeal to what psychologists call our “cognitive miserliness.” Yet in a highly sci-tech society, our zeal for efficiency and brevity becomes akin to Plato’s wild horses of appetite and instinct battling the charioteer of deliberation. Nearly anything cloaked in a template or metric – six seconds, three steps, nine questions – seems unarguably sufficient. Insurers now reward doctors for treating complex conditions such as pneumonia with checklists that stipulate administering antibiotics within six hours of hospital arrival, writes the cardiologist Sandeep Jauhar. “But doctors often cannot diagnose pneumonia that quickly,” he notes. “Checklists lack flexibility.”

And some walking behavior researchers – yes, they exist – are concerned by our sometimes blind faith in the 10,000 steps regimen. “This is just a guideline,” says Catrine Tudor-Locke. Not only do differing populations have varying exercise needs, but the myriad step-counting devices on the market measure “a step” in a plethora of ways, she says. In multiple ways, confidence in a magic formula is unwarranted, reminding us, as Aristotle once wrote, that versatile minds do not try to measure a fluted column with a rigid straight-edge.

McGonigal is right in asserting that we can’t condemn games wholesale as a waste of time. Content matters. A ‘game’ that inspires an elderly recluse to walk farther each day is a good thing, and surely better for us all than one filled with gruesome violence. But shouldn’t we remember most of all that challenges don’t come with clear rules, levels of play and push-button heroics? The eminent British woodcarver David Pye once wrote of automation as a “workmanship of certainty.” Once in production, the widget as product is predictable. But craftsmanship is a “workmanship of risk,” in that the process of making is uncertain, like life itself.

On that crisp blue-sky late-summer day, my daughter and I left the emergency room in great time, toting a correct diagnosis and an incorrect prescription, not knowing that ahead lay a two-week saga of three more doctor’s visits before she truly could begin to mend. As we click through checklists, apps and games that promise so much, let’s remember that games have a place in our lives, but life is not a game.

In McCain’s (Multitasking) Wake: WashPost’s “Great Moments in Boredom”


Yesterday I got a call from an old AP colleague, now with the Washington Post. Quick! Dave Beard wanted commentary on distraction for a photo montage dubbed “Great Moments in Boredom.”

The Post’s resulting Photo Gallery, featured on page one today, offers a nice peek at world leaders and others caught peeking at their watches, yawning and even falling dead asleep in the public eye. The inspiration? Sen. McCain was photographed this week playing poker on his smart phone at a Senate Foreign Relations Committee meeting on the Syrian crisis. Oops.

One incident of squirming statesmanship that didn’t make it into the Post’s gallery occurred in 1943, during a press briefing following the Quebec Conference between FDR, Churchill and then-Canadian prime minister W.L. Mackenzie King.

An article from the time headlined Churchill’s fidgety behavior – crossing and recrossing his legs, loosening his collar, mopping his forehead and hurling one of his famous cigars away just half-smoked. As the press didn’t seem to emerge from the event with any scoop, perhaps Churchill was simply bored with the proceedings – and in a hurry to move on. (At the summit, the allies had agreed to begin discussing a plan to invade France.)

All that’s to say that distraction, as ever, is subjective. Sometimes it’s pure escapism, sometimes a natural response to our less-than-enchanting surroundings. McCain perhaps should have mustered more focus for a crucial hearing. (And our devices certainly make it easy for us to turn away.) And yet, given how long hearings tend to last, perhaps he just needed a bit of a break…


On the Day of MLK’s “Dream” – A Thought on Aliveness

The perfect scent of an August morning. The still reservoir, a sheet of glimmering glass.

Think of all the people pursuing their hopes, unrecognized, perhaps unpaid, for their inventions, creations, solutions – or just for surviving.

Sunlight strikes the small branch of a wayside bush, shaft meeting shaft. A flame-reflection bursts from a taxi in the distance, then vanishes.

A scowling artist pushes his easel and paints up a hill, mulling where to pause.

Towhead twins jog with their mother, one leaping to touch the leaves above.

It is not the loose-weave of these forgotten moments that carries us forth – or is it?


– on the anniversary of MLK’s ‘I Have a Dream’ Speech

Why We Can’t Let Google Push Technology ‘Out of the Way’

Note:  This post first appeared on Huffington Post‘s front page.


Heading home last week after a walk in Central Park, I saw a bearded man on his hands and knees, peering at a patch of plants poking up between the sidewalk’s cobblestones. Lost key or contact lens? I stopped to ask. No, he was just looking, he told me, running his fingers across a tuft of grass and over a mossy cushion of green. Two plants that normally flower in April and August were blooming simultaneously in May, he said. Are you a botanist, I asked. He stood up beside his bicycle. No, he said, just interested.

You never know what you might see if you look down, or around, or in any new direction. Take a minute and you might notice something unexpected – evidence of a tiny glitch in a seasonal rhythm, or a bonsai-size bit of wilderness breaking through a stony byway.

Yes, we often hear this exhortation to ‘wake up.’ Immersed in virtual worlds, we particularly miss much of the earthly wonder budding around us. In her intriguing new book On Looking, the animal cognition expert Alexandra Horowitz takes 11 city walks with experts such as a geologist or an artist in order to make the “familiar become unfamiliar.” Her book is timely. With our finite attentional capacities, we do miss so much in life.

But as Horowitz would attest, the topic of awareness can be a Pandora’s box. Why should alert awareness matter when we have so much ‘coming at us’ all day long? Isn’t the real problem honing our focus, rather than trying to drink more from the experiential firehose? The question of exercising our awareness is complex, and yet matters more than ever today, I believe. Here is why:

First, we are creatures of habit, prone to see, hear and think the familiar and expected – whether online or on the sidewalks we cross each day. This is in part why babies amuse us. They exude delighted interest in everything, since nearly everything is new to them. We can’t exist child-like, open to everything. But turning off our ingrained ‘eyes of habit’ once in a while makes us more inventive. People who live abroad or are bilingual are more likely to be creative, simply because inhabiting a new country or language inspires what psychologists call ‘cognitive flexibility.’ Knowing two ways to drink tea or multiple words for love expands our horizons of understanding. We need to look up from our humdrum commute or from our same old stomping grounds of the Web.

Even more importantly, truly seeing anew isn’t simply a matter of glancing around. It involves noticing, and then comprehending what’s familiar and what’s new on multiple levels. Two blooming weeds in the pavement may cause a moment’s admiration for their unexpected beauty. But understanding whether or not their cycles of flowering are askew and so further evidence of climate change demands effort, not simple snapshot observation. Exercising our awareness involves probing and testing our assumptions.

Today, the popular idea that our devices should fade into the background – exemplified by Google’s aim to make technology that ‘gets out of the way’ – is alarming. As I told the Huffington Post’s Bianca Bosker in a recent interview, if technology becomes invisible to us, we lose sight of how it shapes us, for good and for ill. We will stop noticing the ‘Google effect’ – the complacency we show while searching online – and sadly keep assuming that Facebook-style template identities can express our whole selves.

We can’t see our devices and their torrents of information anew each moment. Our tools invariably will fade into the background of our lives. (While reading a book, I see its content, not the print technology I hold in hand.) But we must sometimes step back and try to comprehend how new, powerful digital technologies influence us, as well as what they deliver to our minds. Waking up to the world is a two-fold responsibility: seeing, then understanding. If we don’t manage this with our digital and earthly habitats, we will be abdicating a role in the making of our future.


A Blog: Does It Matter Where and When We Are?

I was asked to write this blog as part of their recent series on our mobile society:


Tonight, as my husband stands in our bedroom, fingers whirling across his smartphone and eyes glued to its tiny screen, I have no idea “where” he is. Is he checking the score of his beloved home team, or dealing with a rant from an indefatigable boss overseas? Is he working or home-ing, or both?

This melding of work and home, of course, is an old story. In 1999, I wrote an article about three generations of a Baltimore family and their work-life balance. Shattering my romantic views on what it was like to live a few easy steps from work — literally over the store — the family’s elderly patriarch told me that his parents couldn’t wait to move to the suburbs and put some distance between family and work. Their hardware business had shadowed their evenings and weekends, stealing peace. Decades later, the patriarch’s restless, cell phone-toting, entrepreneurial son blamed the portability of work for his recent divorce. … read more

Digital Dharma: The Art of Preservation in a Copy-Paste World

Replication is at the heart of life. The genetic information encoded in our DNA allows new life to be passed along and to evolve. Children learn through imitation. And the copying of written information allows us to build on the past and make knowledge accessible. Gutenberg turned a wine press into a vehicle for individual enlightenment.

E. Gene Smith, a scholar of Tibetan literature and the subject of an intriguing new documentary, was a highly original hero – of the copy. At his death in 2010, he left behind a single volume of essays, but an enormous lifework: the preservation and reproduction of tens of thousands of rare, seminal Tibetan texts from a canon integral to the history of Buddhism. In an age when information seems quick, easy and even expendable, the film Digital Dharma should make us think carefully about technology’s relationship to replication in our post-analog lives.

As a lanky young man with a flair for languages, Smith was earning a PhD at the University of Washington in the early 1960s when he became a student of the Tibetan lama and refugee Dezhung Rinpoche. But Smith’s studies were stymied by a lack of texts. The country’s astonishing canon was imperiled first by the Chinese occupation and later by the Cultural Revolution, when monasteries, libraries and book collections were destroyed in huge numbers.

The destruction wasn’t merely symbolic. Most often, no other copy existed when a Tibetan text went up in flames. Even into the 20th century, Tibet had no printing presses, so texts consisted of hand-lettered manuscripts or books printed from carved wooden blocks. Moreover, the texts themselves are crucial to world history. Tibetan is one of four languages in which the Buddhist canon – or dharma – is preserved, and the country’s vibrant literature as a whole reflects its place as a crossroads of Asia. The losses were akin to the destruction of the library at Alexandria.

Enter Smith, a Mormon-turned-Buddhist who began a personal, decades-long quest to find, recover, and then print copies of thousands of Tibetan books and texts. Working for the Library of Congress from New Delhi beginning in the 1960s, he traveled tirelessly across India, Nepal and Bhutan to locate texts that refugees were hiding, then found ways to use U.S. aid to fund the printing of new copies. With velvet guile, he navigated Cold War and Sino-Tibetan politics, always placing the books – not human differences – at the center of his quest. “The idea is to deliver the tradition back to the owners of the traditions,” he told the Buddhist magazine Mandala in 2001.

Smith’s vast publishing efforts, however, didn’t stop at the printed page. The movie climaxes with his late-life efforts to digitize the vast collection he accumulated at the Cambridge, Mass.-based Tibetan Buddhist Resource Center, and his 2008 travels back to Nepal and India to give exiled Tibetan monks laptops and hard drives containing their own monasteries’ sacred, now-digitized collections. The monks were ecstatic. In the film, one beaming abbot wears a pouch around his neck containing a flash drive that Smith mischievously told him was an “amulet.” Is this the magic of technology? At the first joyous monk-meets-computer moment, the audience at the film’s July 25 New York premiere burst into applause.

The story seemingly ends there. But digitizing isn’t a magic solution. The laptops will need upgrades, the flash drives must be updated, and digital media are more fragile than we often imagine. We must offer as much curatorial care to a digital canon as we should to the vast and still-important treasures of our print age. As well, the speed and invisibility of the digital neatly hides the truth of entropy. Just as our genes mutate, our traditions evolve, and our stories change, so too a digitized canon will shift little by little in the making. (Just peek at Google Books, with its typos and missing pages, to see that copying is never perfect.) Just as our ability to see the world is a construction – our vision is interpreted by our minds – so too the handing down of any bits of culture for the future is a building, a choosing, an incremental shoring up.

Today, we too often believe that technology neatly solves a problem when in reality, technology merely shifts the nature of the challenges before us. I have no doubt that the inestimable Gene Smith deeply understood this. But do we, as consumers, producers and curators of the new canons of our age, understand what we are doing? Perhaps we should copy and paste a little less often, and think about knowledge a little bit more. Gene Smith’s vision will be missed.

The film, directed by Dafna Yachin, will be screened August 1, 8, 15, 22, 29 and September 5 at the Rubin Museum in New York; as well as August 17-23 at the DocuWeeks film festival in New York, and August 10-16 at DocuWeeks Los Angeles.