New! My Talk at Google's 2018 I/O Conference on Building Healthier Relations with Technology

What I’ve Learned: Three Tips for Reclaiming Focus that Might Surprise You

Here’s Part III of my recent interview with one of the UK’s leading environmentalists, Rob Hopkins, about the fragmentation of attention in modern life. In this excerpt, I talk about the three ways that I personally guard and nurture my capacity for focus. Some of my best practices might surprise you!

Hopkins: I wondered, having done all this research, and having been living with this stuff for several years longer than most people, and being really aware of the impacts of the technologies that you write about, what changes it’s led to in your own life in relationship to those technologies?

Jackson: That’s a very good question. Well, first of all, I zealously guard opportunities for quiet, full focus, and thinking. One reason I do so is that I know how easy it is to fall into the trap of “getting things done,” ticking off the boxes, jumping from task to task, while avoiding the hard problems, the messy, difficult aspirations of our lives. We define productivity in a very narrow way. Hyper-busyness is something that our culture reveres, and yet sages from Aquinas to the Buddha warn that this kind of lifestyle inspires us to sidestep the most difficult problems of life.

But believe me, I still struggle to find the right balance – how to interact with this new world of social media, hyper-connectivity, and avalanches of instant-access information while protecting time for deep human connection and for doing justice to the messy, complex problems that face us.

Just last fall one of my daughters, who’s in college a thousand miles from our home, was ill. I took time from work and stayed with her for some weeks, but when I returned home, I began checking in with her more often to make sure that she was getting the right care. Now she is strong and healthy, and yet I’m still trying to break the habit of checking my phone multiple times a day! The urge is so strong! Every time I take a small break, whether I’m in the library or working at home, I have the urge to pull the phone out of my pocket, as I did for so many months. I’m battling this, and yet it’s difficult to pull back and begin to recover time for uninterrupted thinking and focus.

Second, to cope with our “blooming buzzing world,” I also try to do different sorts of work in different physical locations. At home, I do research: searching for scientific papers or studies, interviewing people for my books and articles. It’s a busy kind of mindset. To think deeply, read carefully, and write, I go to a library where I intentionally do not connect to the Internet, or I retreat to our house in the country, where I am alone for days and can inhabit the space of whatever problem I am working on.

Third, the temptation to fracture our attention and stunt our thinking is also a social challenge. Paying attention fully to one another is a precious and fragile process, especially today.

For instance, I often talk with my husband about my book, and whether we do so by phone or in person, there are tensions related to how each of us interprets the act of paying attention. When I’m asking for his feedback on my writing or evolving ideas, he’ll often start puttering around the house, cooking or cleaning up. He insists that he’s still paying attention, and yet I think he’s not fully present at a crucial moment for me. My view is that these moments when we’re really talking about something that matters are rare and precious, so why do anything that might take away from the possibility of full connection? It’s a difficult call.

I once interviewed a UCLA anthropologist who is a MacArthur Fellow and expert on Americans’ hyper-busy family lives. I will never forget one of her comments to me: “Will we look back someday and say, we could have been having a conversation?”  That really stuck with me. So often we could have been having a conversation rather than sitting side by side, hardly present to one another, splitting our focus.

So here are my tips for staying focused: guard our opportunities to pay attention; use the environment, tailoring where we work and think to the kind of work that needs to be done; and don’t forget that attention is a fragile social process – be empathic and a good listener!

The Hidden Costs of Multitasking

Recently, I did an interview with one of the UK’s leading environmentalists, Rob Hopkins, about the fragmentation of attention in modern life. Hopkins is part of a growing wave of green thinkers who rightly worry about how technology is affecting our ability to solve big-picture problems such as climate change. Here’s a second excerpt from our wide-ranging interview, which you can read or listen to on his great website, “Imagination Taking Power.”

Hopkins: I didn’t have email until maybe 13 years ago, Twitter until maybe 7 years ago, Facebook just a few years ago. If we were to say this has been a 20-year experiment on a massive, massive scale, how would you summarize the interim findings of that experiment?

Jackson: Well, that’s a very big question! As I release a new, updated edition of my book Distracted, I’m thinking a lot about distraction and how it affects people in new ways.

We often define distraction as being pulled to something secondary, but a lesser-known definition involves being pulled in pieces, being fragmented. That certainly describes life on- and off-line today.

And research shows that avid multitaskers, whose attention is splintered and abbreviated, actually have less ability to discern what’s trivial and what’s relevant in their environments.

Even more importantly, when we are jumping from task to task or person to person, we may be undermining our ability to be cognitively flexible, which is a core skill of creativity and problem-solving. In other words, studies show that when people are multitasking, they can absorb new information, they can learn, but they encode and store knowledge in shallower ways, actually using different parts of their brain than if they were paying full attention. As a result, the new knowledge is less well assimilated with other stored knowledge and so is less available for transfer to novel situations.

For example, if you multitask your way through your math homework, you can solve the kind of math problem that you studied, but you likely can’t tackle a related but different kind of math problem. Or the surgeon who multitasked her way through med school might be able to fix a routine problem that arises in the operating room, but may be flummoxed when a new, rare complication arises.

I’ve had professors tell me that because kids are multitasking their way through an introductory college class, they’ll get to the second-level psychology or history class, and it’s as if they hadn’t even taken the first course. They have learned the material in shallow ways.

Additional research shows that the mere presence of a cell phone, even if it’s silent and turned off, unconsciously siphons our attention away from the moment at hand, so we’re less focused. We also become less able to think in flexible ways. The presence of the phone, beckoning to us even unconsciously, lowers fluid intelligence, the ability to interpret and solve unfamiliar problems. We simply don’t have the capacity to multitask and think nimbly!

So, we are beginning to discover that our habits of mind and our technologies may be making us less discerning and flexible cognitively – skills that are crucial to imagination and higher-order thinking.  That’s alarming and could be linked to the kind of tribalism and risk aversion that we see so often today.

The second assessment I would make about technology as a social experiment is that the instantaneity of information is being shown to undermine our willingness to think in complex ways. And that’s very damaging to our capacity to imagine.

Studies at both Yale and Harvard show that a brief online search for information, just a bit of googling, makes people less willing later to wrestle with a complex problem. Their “need for cognition,” a measure of one’s willingness to struggle with a problem and see it through, drops dramatically after searching online. As well, a bit of searching leads to a kind of hubris; we begin to think that we know more than we actually do. People begin to overestimate their ability to answer similar types of questions without the computer.

Why? Scientists believe that when information is so instant, we begin to think that answers are just there for the plucking, that “knowing” is easy. One researcher who’s been involved in this work says, “We never have to face our ignorance online.”

What are the implications of this? In our current culture, “knowing” is becoming something brief, perfunctory, neat, packaged, and easily accessible. Yet complex, murky problems demand, first, the willingness not to know, to understand that the time for ease in thinking has ended and the real work of reflective cognition must begin.

And second, difficult problems demand tenacity, a willingness to struggle and connect and reflect on the problem and its possible solutions and move beyond the first answer that springs to mind. This is when we must extricate ourselves from automaticity in thinking and call consciously upon the side of ourselves that can decouple from tried-and-true answers, gather more information, test possibilities, and build new understanding. Much of this cognition demands both flexibility and a willingness to grapple with the unknown.

Just Hand-Wringing? Why the Excesses of Technology Need Watching

In 2018, I did an interview with one of the UK’s leading environmentalists, Rob Hopkins, about the fragmentation of attention in modern life. At the time, Hopkins was exploring an overlooked hurdle to solving global warming – our waning ability to think well and even to muster the creativity needed to imagine and shape a better future. In the next few weeks, I’ll be sharing excerpts from our wide-ranging interview, which you can read or listen to on his great website, “Imagination Taking Power.”

Hopkins: What is qualitatively different about the kind of impacts we’re seeing in terms of attention now?

Jackson: How do we know what is different, what has truly changed in our lives? What’s better, what’s worse? Are our concerns about technology and distraction just the kind of hand-wringing that we have always seen when things change? I have two answers to that question. First, the totality of what we’re dealing with is so much greater. Teens are on average exposed to nearly six hours of non-print media a day, and a significant minority experience nearly eight hours of media a day. So while media and technology were just a slice of life in the past, now they are a constant. They are the reality. We inhabit the virtual world in disproportionate measure to the physical, and that shift has taken just a generation to unfold.

However, it’s important to note, secondly, that we are getting a better handle scientifically on the impact of these changes, especially cognitively. We have strong correlational evidence linking time spent on smartphones or online to lowered well-being in children and declining empathy among young adults. As well, steep declines in children’s and adults’ capacity to imagine, persist in problem-solving, and reason coincide with recent decades of rapid technological penetration. This is important. We don’t have the full picture, but we are beginning to understand the effect of technology on our lives and on our minds.

Certainly we should remain aware of the human tendency to yearn for the familiar and to look nostalgically back at the past. But the bottom line is that we have to solve the problems of our day, and there are just too many signs that digital living in its current forms raises red flags. For instance, the ability of Americans from kindergarten to adulthood to elaborate on a problem, to put flesh on an idea, has dropped by 40 percent since the 1980s, and the steepest drop has occurred in the years since technology came to play such a dominant role in our lives.

There are warning bells, and just as with climate change, we can wait until all the t’s are crossed and i’s are dotted on the evidence, or we can act to solve the problems of our day, using the best possible assessments available to us at this time.

My Appearance on Italy’s Top Investigative News Show

PresaDiretta – Maggie Jackson Interview – from Iperconnessi – Oct. 15, 2018
Narrator: “Nine years ago, when we were full of enthusiasm about the arrival of smart phones, Maggie Jackson, working for the Boston Globe, the major newspaper in Boston, wrote a prophetic book, which has just been republished, about distraction and its impact on a society that is constantly connected.”
Jackson: “Distraction is not just about being pushed toward something irrelevant, but also about life exploding into a thousand pieces, and I think this is the sense of our distraction today. We skip from one thing to another, no longer able to understand what is important and what is not. We have created a society that rewards only what is easy and comfortable. But when we have to resolve a difficult issue or answer a complex question, that requires the use of a part of us that is no longer functional in this ‘online world.’ We only see the advantage of having all this information at our disposal. The idea that we have access to immediate information leaves the impression that information is easy. Everything is downloadable. As one scientist says, ‘Online you don’t have to face your ignorance.’ You never have to say ‘I don’t know.’ You don’t have to be humble any more. But humility is the starting point for learning. Opening yourself up to the new is the only way to reach the best answer.”

Jackson: “When you live without paying attention to others, you basically regress to a more primitive form of thinking, the stereotyping and quick assumptions that make you intolerant and filled with prejudice. The quick takeaway is that you form simplistic categorizations because it’s much easier to hate than to understand. I believe that this culture of distraction leads you quite directly to fascism, to authoritarian cultures, and this, today, is the real danger.”

PresaDiretta Host: “Okay, so, if you find it too much to attribute even the crisis facing democracy to smartphones, I understand, but in reality the reflections of the writer Maggie Jackson should not be taken literally. What she is telling us is that when everything passes through a smartphone, when political thinking is exhausted and reduced to the 280 characters of Twitter, when the number of ‘likes’ begins to drive complex political choices, a drift to authoritarianism is closer than it seems today.”

The Costs of Instantaneity

Are we using our technologies wisely?

That’s one of the points that I discussed recently in an interview for the intriguing new blog Human-Autonomy Sciences, curated by two leading psychology researchers on human-machine interaction, Clemson University’s Richard Pak and Microsoft Senior Design Research Manager Arathi Sethumadhavan.

Here is an excerpt from our e-conversation:

Pak — What does the future of human relationships with technology look like: good, bad, or ugly?

Jackson — The essential question is: will our technologies help us flourish? The potential – the wondrous abundance, the speed of delivery, the possibility for augmenting the human or inspiring new art forms – is certainly there. But I would argue that at the moment we aren’t for the most part using these tools wisely, mostly because we aren’t doing enough to understand technology’s costs, benefits, and implications.

I’ve been thinking a lot about one of technology’s main characteristics: instantaneity. When information is instant, answers begin to seem so, too. After a brief dose of online searching, people become significantly less willing to struggle with complex problems; their “need for cognition” drops even as they begin to overestimate their ability to know. (The findings echo the well-documented “automation effect,” in which humans stop trying to get better at their jobs when working closely with machines, such as automated cockpits.) In other experiments, people on average ranked themselves far better at locating information than at thinking through a problem themselves.

Overall, the instantaneity that is so commonplace today may shift our ideas about what human cognition can be. I see signs that people have less faith in their own mental capacities, as well as less desire to do the hard work of deliberation. Their faith increasingly lies with technology instead. These trends will affect a broad range of future activities: whether people can manage a driverless car gone awry, or even think it’s their role to do so; whether they still recognize the value of “inefficient” cognitive states of mind such as daydreaming; whether they have the tenacity to push beyond the surface understanding of a problem on their own. Socially, similar risks are raised by instant access to relationships – whether to a friend on social media or to a companion robot that’s always beside a child or elder. Suddenly the awkwardness of depth need no longer trouble us as humans!

These are the kinds of questions that we urgently need to be asking across society in order to harness technology’s powers well. We need to ask better questions about the unintended consequences and the costs/benefits of instantaneity, or of gaining knowledge from essentially template-based formats. We need to be vigilant in understanding how humans may be changed when technology becomes their nursemaid, coach, teacher, companion.

Recently, an interview with the singer Taylor Goldsmith of the LA rock band Dawes caught my eye. The theme of the band’s latest album, Passwords, is hacking, surveillance and espionage. “I recognize what modern technology serves,” he told the New York Times. “I’m just saying, ‘let’s have more of a conversation about it.’”

Well, there is a growing global conversation about technology’s effects on humanity, as well there should be. But we need to do far more to truly understand and so better shape our relations with technology. That should mean far more robust schooling of children in information literacy, in the market-driven nature of the Net, and in critical thinking skills generally. That should mean training developers to become more accountable to users, perhaps by trying to visualize more completely the unintended consequences of their creations. It certainly must mean becoming more measured in our own personal attitudes; we all too often still gravitate to exclusively dystopian or utopian viewpoints on technology.

Will we have good, bad, or ugly future relations to technology? At best, we’ll have all of the above. But at the moment, I believe that we are allowing technology in its present forms to do far more to diminish human capabilities than to augment them. By better understanding technology, we can avert this frightening scenario.


A New Vision of Balance: Tech-Life, Not Work-Life

A new vision of human flourishing is urgently needed; I call it “tech-life balance.”

That’s one of the points that I discussed recently in an interview for the intriguing new blog Human-Autonomy Sciences, curated by two leading psychology researchers on human-machine interaction, Clemson University’s Richard Pak and Microsoft Senior Design Research Manager Arathi Sethumadhavan.

In coming weeks I’ll be sharing parts of my interview with Pak in this space. Please share, comment, and ponder!

RP – How can technology facilitate a healthy work-life balance?

MJ – Over the last 20 years, technology has changed human experience of time and space radically. Distance no longer matters much, nor duration, as devices allow us to fling our bodies and thoughts around the globe near-instantly. While on a business trip, a parent can skype a bedtime story with a child at home. The boss can reach a worker who’s hiking on a remote mountaintop. Technology has broken down cultural and physical boundaries and walls – making home, work, and relationships portable. That’s old news now, and yet we’re still coming to grips with the deep impact of such changes.

For instance, it’s becoming more apparent that the anywhere-anytime culture isn’t simply a matter of carrying our work or home lives around with us and attending to them as we wish. It’s not that simple by far. First, today’s devices are designed to be insistent, intrusive systems of delivery, so any single object of our focus – an email, a text, a news alert – is in competition with others at every minute. We now inhabit spaces of overlapping, often-conflicting commitments and so have trouble choosing the nature and pace of our focus.

The overall result, I believe, is a life of continual negotiation of roles and attentional priorities. Constant checking behavior (polls suggest Americans check their phones on average up to 80 times a day) is a visible symptom of the need to rewrite work-life balance dozens of times a day. The “fear of missing out” that partly drives always-on connectivity also is a symptom of the necessity of continually renegotiating the fabric of life on- and off-line.

Because this trend toward boundary-less living is so tech-driven, I believe that the crucial task today is improving the balance between the digital and non-digital worlds. After that, work-life balance will follow.

We need to save time for uninterrupted social presence, the kind that nurtures deeper relationships. We urgently need space in our lives where we are not mechanically poked, prodded, and managed – that is, space in which we are in touch with and able to manage our inner lives. (Even a silent phone in “off” mode undercuts both focus and cognitive ability, according to research by Adrian Ward at the University of Texas at Austin.)

One solution would be to think more deliberately about boundaries in all parts of our lives, but especially in the digital sphere. Too often lines of division are seen as confinement, a kind of archaic Industrial Age habit. But boundaries demarcate; think of a job description, a child’s bedtime, or the invention of the weekend, a ritual that boosts well-being even among the jobless. Boundaries are systems of prioritization, safety zones, structures for depth, and crucial tools for ordering life in a digital age. A family that turns off its cell phones at dinner is creating opportunities for the kind of in-depth bonding that rarely is forged online.

Technology can help facilitate creative boundary-making – think of the new Apple and Google product designs that prompt offline time. But our devices cannot do the work of inventing and managing the boundaries that are crucial for human flourishing.

Invited HuffPost TED Weekend Blog: Beyond Gaming, There’s Life


We’d just begun a family vacation this summer, when my teenager woke up barely able to swallow, with a throat raw and sore. I took her to the nearest ER, where the wait was blessedly brief. A triage nurse whisked into the examining room with a laptop on wheels and began questioning my daughter. Name? Weight? Pain on a scale of 10? The nurse was efficient, yet something was missing. During a 10-minute checklist, she never once looked at the case – the bundle of humanity (and mystery) that is my daughter.

Was I expecting too much of this moment? Checklists in medicine can prevent infections. Taking 10,000 steps a day is now a global health movement. Shaking hands for six seconds boosts oxytocin, the “trust” hormone, Jane McGonigal recounts in her TED talk on how simple game-based tricks can better our lives. Anything daunting or monumental – health, medical diagnosis, resilience – demands entry points. The lists and formulas and tips that we adore point our muddled selves in the right direction, making small but powerful changes possible. Now portable and automated, they can help the fragile roots of good habits take hold.

But are these entry points to change too often seen as endpoints today, especially when they come to us so easily, with a click and a touch? Are we increasingly sated by the checklist and tipsheet? Consider that a majority of teachers now see a link between middle and high school students’ use of digital tools and careless, short-cut writing. Most online searches consist of one query, and we tend to open just one document per search. Since the mid-1980s, Americans show a 35-percent drop in their ability to elaborate on ideas, a key measure of creativity. While briefly using my daughter’s laptop, I was taken aback to see slightly off-target word suggestions flashing above my prose – the work of her school software. How often had an algorithm’s choice eclipsed a moment of potential student musing?

Yes, we evolved to survive a threatening world by plucking the low-hanging fruit – and by using tools to extend our grasp. Shortcuts and quick fixes appeal to what psychologists call our “cognitive miserliness.” Yet in a highly sci-tech society, our zeal for efficiency and brevity becomes akin to Plato’s wild horses of appetite and instinct battling the charioteer of deliberation. Nearly anything cloaked in a template or metric – six seconds, three steps, nine questions – seems unarguably sufficient. Insurers now reward doctors for treating complex conditions such as pneumonia with checklists that stipulate administering antibiotics within six hours of hospital arrival, writes the cardiologist Sandeep Jauhar. “But doctors often cannot diagnose pneumonia that quickly,” he notes. “Checklists lack flexibility.”

And some walking behavior researchers – yes, they exist – are concerned by our sometimes blind faith in the 10,000 steps regimen. “This is just a guideline,” says Catrine Tudor-Locke. Not only do differing populations have varying exercise needs, but the myriad step-counting devices on the market measure “a step” in a plethora of ways, she says. In multiple ways, confidence in a magic formula is unwarranted, reminding us, as Aristotle once wrote, that versatile minds do not try to measure a fluted column with a rigid straight-edge.

McGonigal is right in asserting that we can’t condemn games wholesale as a waste of time. Content matters. A ‘game’ that inspires an elderly recluse to walk farther each day is a good thing, and surely better for us all than one filled with gruesome violence. But shouldn’t we remember most of all that challenges don’t come with clear rules, levels of play and push-button heroics? The eminent British woodcarver David Pye once wrote of automation as a “workmanship of certainty.” Once in production, the widget as product is predictable. But craftsmanship is a “workmanship of risk,” in that the process of making is uncertain, like life itself.

On that crisp blue-sky late-summer day, my daughter and I left the emergency room in great time, toting a correct diagnosis and an incorrect prescription, not knowing that ahead lay a two-week saga of three more doctor’s visits before she truly could begin to mend. As we click through checklists, apps and games that promise so much, let’s remember that games have a place in our lives, but life is not a game.

In McCain’s (Multitasking) Wake: WashPost’s “Great Moments in Boredom”


Yesterday I got a call from an old AP colleague, now with the Washington Post. Quick! Dave Beard wanted commentary on distraction for a photo montage dubbed “Great Moments in Boredom.”

The Post’s resulting Photo Gallery, featured on page one today, offers a nice peek at world leaders and others caught peeking at their watches, yawning and even falling dead asleep in the public eye. The inspiration? Sen. McCain was photographed this week playing poker on his smart phone at a Senate Foreign Relations Committee meeting on the Syrian crisis. Oops.

One incident of squirming statesmanship that didn’t make it into the Post’s gallery occurred in 1943, during a press briefing following the Quebec Conference between FDR, Churchill, and then-Canadian Prime Minister W.L. Mackenzie King.

An article from the time headlined Churchill’s fidgety behavior – crossing and recrossing his legs, loosening his collar, mopping his forehead, and hurling one of his famous cigars away just half-smoked. As the press didn’t seem to emerge from the event with any scoop, perhaps Churchill was simply bored with the proceedings – and in a hurry to move on. (At the summit, the Allies had agreed to begin discussing a plan to invade France.)

All that’s to say that distraction, as ever, is subjective. Sometimes it’s pure escapism, sometimes a natural response to our less-than-enchanting surroundings. McCain perhaps should have mustered more focus for a crucial hearing. (And our devices certainly make it easy for us to turn away.) And yet, given how long hearings tend to last, perhaps he just needed a bit of a break…


On the Day of MLK’s “Dream” – A Thought on Aliveness

The perfect scent of an August morning. The still reservoir, a sheet of glimmering glass.

Think of all the people pursuing their hopes, unrecognized, perhaps unpaid, for their inventions, creations, solutions – or just for surviving.

Sunlight strikes the small branch of a wayside bush, shaft meeting shaft. A flame-reflection bursts from a taxi in the distance, then vanishes.

A scowling artist pushes his easel and paints up a hill, mulling where to pause.

Towhead twins jog with their mother, one leaping to touch the leaves above.

It is not the loose-weave of these forgotten moments that carries us forth – or is it?


– on the anniversary of MLK’s ‘I Have a Dream’ Speech

Why We Can’t Let Google Push Technology ‘Out of the Way’

Note: This post first appeared on Huffington Post‘s front page.


Heading home last week after a walk in Central Park, I saw a bearded man on his hands and knees, peering at a patch of plants poking up between the sidewalk’s cobblestones. Lost key or contact lens? I stopped to ask. No, he was just looking, he told me, running his fingers across a tuft of grass and over a mossy cushion of green. Two plants that normally flower in April and August were blooming simultaneously in May, he said. Are you a botanist, I asked. He stood up beside his bicycle. No, he said, just interested.

You never know what you might see if you look down, or around, or in any new direction. Take a minute and you might notice something unexpected – evidence of a tiny glitch in a seasonal rhythm, or a bonsai-size bit of wilderness breaking through a stony byway.

Yes, we often hear this exhortation to ‘wake up.’ Immersed in virtual worlds, we particularly miss much of the earthly wonder budding around us. In her intriguing new book On Looking, the animal cognition expert Alexandra Horowitz takes 11 city walks with experts such as a geologist or an artist in order to make the “familiar become unfamiliar.” Her book is timely. With our finite attentional capacities, we do miss so much in life.

But as Horowitz would attest, the topic of awareness can be a Pandora’s box. Why should alert awareness matter when we have so much ‘coming at us’ all day long? Isn’t the real problem honing our focus, rather than trying to drink more from the experiential firehose? The question of exercising our awareness is complex, and yet matters more than ever today, I believe. Here is why:

First, we are creatures of habit, prone to see, hear and think the familiar and expected – whether online or on the sidewalks we cross each day. This is in part why babies amuse us. They exude delighted interest in everything, since nearly everything is new to them. We can’t exist child-like, open to everything. But turning off our ingrained ‘eyes of habit’ once in a while makes us more inventive. People who live abroad or are bilingual are more likely to be creative, simply because inhabiting a new country or language inspires what psychologists call ‘cognitive flexibility.’ Knowing two ways to drink tea or multiple words for love expands our horizons of understanding. We need to look up from our humdrum commute or from our same old stomping grounds of the Web.

Even more importantly, truly seeing anew isn’t simply a matter of glancing around. It involves noticing, and then comprehending what’s familiar and what’s new on multiple levels. Two blooming weeds in the pavement may cause a moment’s admiration for their unexpected beauty. But understanding whether or not their cycles of flowering are askew and so further evidence of climate change demands effort, not simple snapshot observation. Exercising our awareness involves probing and testing our assumptions.

Today, the popular idea that our devices should fade into the background – exemplified by Google’s aim to make technology that ‘gets out of the way’ – is alarming. As I told the Huffington Post’s Bianca Bosker in a recent interview, if technology becomes invisible to us, we lose sight of how it shapes us, for good and for ill. We will stop noticing the ‘Google effect’ – the complacency we show while searching online – and sadly keep assuming that Facebook-style template identities can express our whole selves.

We can’t see our devices and their torrents of information anew each moment. Our tools invariably will fade into the background of our lives. (While reading a book, I see its content, not the print technology I hold in hand.) But we must sometimes step back and try to comprehend how new, powerful digital technologies influence us, as well as what they deliver to our minds. Waking up to the world is a two-fold responsibility: seeing, then understanding. If we don’t manage this with our digital and earthly habitats, we will be abdicating a role in the making of our future.

*