Laying down a social marker

Another thought-provoking piece by Gareth Price about how the pressure to share via social media may be influencing the quality and quantity of our ideas.

DisCoverage

BrainJuicer’s Tom Ewing wrote a blog post today about how the way we listen to music could change.

He envisioned that people will soon have “attention regimes, in the way they follow dietary regimes and exercise regimes, and will have them in public: a proclamation of one’s listening regime will become a kind of social marker”; adding:

“Demonstrating you can pay attention in a world of instant clicks will be a mark of presumed character (and bragging rights) in the same way demonstrating you keep fit in a world of chairs and screens is among white-collar workers now.”

Ephemerality is built into the internet.

If you don’t update your website Google will punish you by pushing you down its search rankings.

Fail to tweet for any extended period and people will unfollow you.

Don’t update your status and friends will accuse you of being a ‘Facebook…


The Happiest iPhone on the Block: Why Managing Your Digital Life is Like Good Parenting

When I started blogging a little over a year ago, I was a true social media skeptic. I drew more inspiration from thinkers like Sherry Turkle than Anil Dash. But my experiences with social media have turned this on its head. I’m still a skeptic in the sense that, as a scientist, I believe we need to know a lot more about how social media affect our lives for better and for worse. But I don’t feel the kind of concern I used to feel. Perhaps I’ve been tempted by the siren song of technology, lulled by a false sense of security engendered by the all-consuming digital embrace… but I don’t think so.

I actually feel more in control and less overwhelmed by social media and other digital forms of communication than ever before. I feel they are tools, which I can selectively choose among and harness. I believe that a sense of well-being and balance in social media use is possible if we use some simple practices. The best metaphor I can think of for these practices is that they are the types of things that an effective and sensitive parent does. Here are the top five “parenting strategies” I’ve used to manage my social media burden:


  1. Establish rules and set limits. Children thrive when there are consistent limits and structure. In the same way, our technology use needs rules and limits. If I don’t set limits on when and how I use social media, I’m more likely to get sucked into the black hole of keeping up with every tweet/text/email/post/newsfeed. I’m more easily distracted by social media, less present with others, and more likely to waste time and be less efficient because of it. Like all good parents, I try to create structure that is firm but fair. Harsh discipline might work in the short term, but the child usually rebels. So, I try not to be unreasonable or unrealistic about the rules (e.g., “I can only check email once a day, and for no more than 10 minutes” doesn’t work). I’ve tried to find a set of guidelines that work with my life and make me happy.
  2. Monitor communication technology use. It’s 10 o’clock. Do you know how much social media you’ve used today? This is really about being mindful about how we’re using our technology. I prioritize my time – I only have so much time and attention in a day, and so I try to spend my mental and social capital wisely. I keep track and schedule times that I will use these tools, and know the times that they need to be put to bed.
  3. Reinforce good behavior. It’s not only about how much time we spend on social media or communication technology; it’s about how we use it and what it brings to our lives. I try to select digital communities that bring something positive to my life and that cultivate a positive peer network.
  4. Selectively ignore. In parenting, the idea here is that if a child is showing a troublesome behavior, as long as it’s not destructive, it can be “extinguished” by just ignoring it. If there is no reaction, and no reward, there ceases to be a reason for the child to act that way. And then the child stops being a nuisance. In a similar vein, when I start to feel that my communication technology use is becoming burdensome and bossy, when I feel the pressure to respond to every message or push notification is too much, I start ignoring it. Most of us like the feeling of being connected, and hope that the dings and rings on our devices will bring something good into our lives or that stressful things can be averted and dealt with quickly. So, we start to check obsessively and end up spending dinner time with our family on a device, or walking into traffic with our eyes glued to our iPhone. When I begin to move in this direction, I reverse course and start to consciously and selectively ignore my devices in order to break the cycle.
  5. Adapt technology use to fit my life. One key to being a good parent, I believe, is structuring your life so that it can accommodate children in support of their well-being and happiness. Some (in my opinion) not-so-great parents do the opposite: they expect not to change their lives at all and assume that children should just fit in. In contrast to my list of strategies thus far, when it comes to mobile technology and social media I try to follow the inspiration of the questionable parent: I fit technology into my life so that I remain able to do what I want and need to do without being sidetracked. If my life is becoming more stressful and less organized because of social media burden, then I’m probably doing the opposite.

So remember, when that naughty stream of Facebook status updates is just too much to handle, you’re a week behind on your Twitter feed, the pesky email inbox just won’t empty out, and those 10 texts – that are going to go unanswered for another few days – won’t stop bugging you, ask yourself: what would mom do?

The Medium is the Message: On Mindfulness and Digital Mirrors

I recently had the pleasure of doing a talk-back with Congressman Tim Ryan on the role of mindfulness – focusing your awareness on the present moment – in education, as part of the Rubin Museum’s Brainwave Festival in NYC. The film, called “Changing Minds at Concord High School,” followed an entire school as they took part in a mindfulness training program. This school is unique in that it is a transfer school, a last stop for many kids with a history of school failure and discipline problems. The twist here is that the students both filmed the experience and conducted a study – of their classmates! – comparing the effects of mindfulness training with those of a placebo. We also included a science curriculum on the neuroscience of mindfulness – how it can change our brains for the better. I was the lead scientist on this project, so the kids were my “research assistants.” The project was spearheaded and directed by the amazing Susan Finley and filmed by the equally inspiring Peter Barton (with the help of the students). Our outstanding scientific advisors were David Vago and Robert Roeser. There is a lot that was amazing about this project, these kids, and this film. I want to focus on just one aspect, which hinges on the phrase “The medium is the message.”


The medium is the message. This phrase was coined by Marshall McLuhan who put forward the idea that the “form of a medium embeds itself in the message.” That is, the medium in which we experience something influences how we perceive the take-home message. Using movies as an example, he argued that the way in which this medium presents time has transformed our view of time from something that is linear and sequential into something that reflects patterns of connection across people and places. I am obviously no film theorist, but I apply this notion to the idea that different media provide us with an array of tools that can help us create a narrative of ourselves and the world that is unique to that medium.

Film and self-identity. In the case of our film “Changing Minds at Concord High School,” I believe that one way that the medium was the message for our students was that film is able to portray individual identities as being truly flexible and changeable. I think that the teens at Concord High, many of whom have experienced tremendous challenges, stress, and obstacles in life, didn’t believe as a group that change for them was really possible. But what our program strove to do, using converging media – film, scientific readings, mind/body experiences of mindfulness – was to convince these young adults that they really could change their brains, change counterproductive habits of thinking, and find the tools to focus more and let negative feelings go. As we move on to Phase 2 of the project by refining and developing our program, we are asking the fundamental question: How can we best use these tools to teach teens to view themselves and the world differently, creating a narrative in which personal change is possible?

Our digital mirrors. I think these issues are especially important to consider now, in this era of social media and reality television in which we crave to see ourselves reflected back to ourselves. We can criticize this, and analyze this, but the fact of it borders on the irrefutable. We know that it’s easier than ever before to document our lives via pictures and videos on our mobile devices, and share them with our digital networks. And we love to do so. Social media, through which we share our images of ourselves and our lives, are an immeasurably huge and complex array of mirrors into which we can gaze at ourselves. There may be costs and benefits to this, but it simply is. The power of this, however, is that we now have a new set of tools to curate our beliefs about who we are – hopefully for the better. And perhaps we believe this evidence of who we are more strongly because it is concrete, it is documented, it receives “likes” and is seen by others and thus is real. I’m liked therefore I am.

This digital infrastructure also provides a profound opportunity for those trying to support growth and positive change in youth. If we help youth document the possibility of change – like we did in “Changing Minds at Concord High School” – they may start to believe it applies to their own lives. This is particularly important for those of us who aren’t used to feeling that the world is full of possibilities. In this way, social networking may be a medium that gives the message that change is possible and that our limitations are as fluid as the flow of information.

Rebel Without a Status Update

I am fascinated by the psychology of Facebook status updates. There are many reasons to make a status update. One reason, of course, is obvious – let others know what you’re up to or share something that’s cool. For example, if I did frequent status updates, I might decide to post “Buying a fantastic ½ pound of Australian feta at Bedford Cheese Shop on Irving Place – should I up it to a pound?!” (and seriously, it is incredible). This may be an interesting snapshot of a day in my life, but these types of status updates are exactly the ones that tend to annoy me for some reason. Even the most benign version of this feels like TMI.

Why? Status updates are for many an instinctive way to reach out. A recent study even showed that increasing the number of status updates you post each week makes you feel more connected to others and less lonely. Seems like a good thing! Moreover, it’s consistent with what seems to be our new cultural comfort zone – being virtually seen and heard by a loosely connected group of people we know (or sort of know) as our “social network.” This virtual network is the social status quo for many of us, and certainly for many children growing up today.

I believe one consequence of this is that no one wants to be James Dean anymore. Putting it another way, maintaining privacy and being a strong silent type, like Dean, are no longer alluring ideas to us. And when I thought of this, I realized why I don’t feel fully comfortable with the status update culture – I am a proponent of the James Dean School of Sharing Personal Information in Public, whose motto is: the less, the better. I like understatement, privacy, the choice to share with a few and retain privacy with most.


It’s no coincidence that as a culture, we don’t fetishize James Dean any more. Many of today’s icons (some of them “anti-icons” because we love to feel superior) are people who humiliate themselves, who will tweet that they’re on the toilet and what they’re doing there, who end up in compromised positions, and happen to have pictures and videos of those positions, which then promptly go viral (funny how that happens). James Dean would have disapproved.

James Dean himself would have been very bad at social media… or perhaps very, very good. Very bad, because he would have had little to say, and would have hated the constant spotlight and social media culture of ubiquitous commentary and chit chat. On the other hand, he might have been very good at it because he would have been the Zen master of the status update, expounding with haiku-like pithiness. An imaginary James Dean status update:

James Dean…

Old factory town

Full moon, snow shines on asphalt

#Porsche alive with speed

But seriously, while he probably wouldn’t have written haiku, perhaps he somehow would have figured out how to use sharing to create a sense of privacy, because a sense of mystery would have remained intact.

Yes, the status update is a beautiful thing. We have an efficient and fun tool which allows us to reach out to others, curate our self-image, and think out loud to a community. But I wonder if we’re starting to lose the simple pleasures of privacy, of knowing less and wondering more.

Islands in the Stream: A Meditation on How Time Passes on Facebook

Shortly after the terrible tragedy in Newtown, I received email notifications that my (designated) close friends on Facebook had made status updates. Scrolling through my news feed, I saw my friends express the range of emotions that we all felt – horror, sadness, distress, anger, and confusion. Later that day, I popped onto Facebook again and was jarred and a little upset to read that friends who seemed to have just expressed horror and heartbreak were now posting about everyday, silly, and flippant things.

Now, why should I be jarred or upset? Hours had gone by. After three, or six, or ten hours, why wouldn’t we be in a different emotional state, and why wouldn’t it be ok to post about it? I started to think that it was not my friends’ posts that were at issue here. Rather, it was the nature of how I perceive the passage of time and sequence of events on Facebook. A couple of aspects of this came to mind:

Facebook time is asynchronous with real time. Time is easily condensed on Facebook. Events and updates that might be spread out over the course of a day or several days can be read at a glance, and therefore seem to be happening almost simultaneously. So, our perception of time on Facebook is a combination of how frequently our friends post and how frequently we check in. For example, say I check in twice in two days – at 9am on day 1 and at 9pm on day 2. I know a good bit of time has passed (and the amount of time that has passed is clearly indicated next to friends’ updates), but I still read each status update in the context of the previous ones – especially if I pop onto a friend’s Timeline instead of my news feed.

With this type of infrequent checking, friends’ updates about their varying and changing emotions (which might be reasonably spread out over the course of a day or multiple days) appear to be an emotional roller coaster. If someone has several posts in a row about the same thing, even if they are spaced days apart, the person comes across as preoccupied with the topic. Somehow, I form a view of this individual that brings these little snippets together into one big amorphous NOW. If I were checking more frequently, however, perhaps I wouldn’t lump updates together in this way. I’d “feel” the passage of time and – more accurately – see that the ebb and flow of status updates are like islands in the stream of our lives rather than a direct sequence of events.
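
To make the condensation concrete, here is a toy sketch in Python (the names, posts, and times are all invented, and this is of course not how Facebook actually computes anything): every update since your previous check-in arrives as one batch, so a day and a half of real time reads as a single “now.”

```python
from datetime import datetime

# Invented posts: (author, text, timestamp).
posts = [
    ("Ann", "heartbroken by the news", datetime(2012, 12, 14, 11, 0)),
    ("Ben", "no words today", datetime(2012, 12, 14, 12, 15)),
    ("Ann", "making cookies with the kids!", datetime(2012, 12, 14, 20, 30)),
]

# Two check-ins: 9am on day 1 and 9pm on day 2.
checkins = [datetime(2012, 12, 14, 9, 0), datetime(2012, 12, 15, 21, 0)]

# Everything posted between two check-ins is read as one batch --
# hours of real time collapse into a single perceived "now".
for previous, current in zip(checkins, checkins[1:]):
    batch = sorted((p for p in posts if previous < p[2] <= current),
                   key=lambda p: p[2])
    hours = (current - previous).total_seconds() / 3600
    print(f"One glance covers {hours:.0f} hours and {len(batch)} updates:")
    for author, text, when in batch:
        print(f"  {when:%a %H:%M}  {author}: {text}")
```

Read at a glance like this, updates posted half a day apart land in front of you as one emotional moment – exactly the lumping described above.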


Related to this first point, it occurred to me that status updates are not meant to be interpreted in the context of preceding status updates. Our brains are pattern recognition machines. So, if Facebook status updates follow one after the other, our brains may perceive a direct sequence of events. But each status update is a snapshot of a moment, a thought, or a feeling. Intuitively, they are supposed to stand alone, not readily interpreted in the context of a previous update, even if they occur close together in actual time.

Think how different this is from our face-to-face interactions, in which the sequence of events matters. For example, imagine that you’re at work, and your co-worker tells you she is on pins and needles waiting to hear back about a medical test. When you see her a few hours later, she is joking and laughing. You assume she either (a) got some good news from the doctor, or (b) is trying to distract herself from the worry. You don’t think she’s just having a good time, out of context of what you learned about her earlier in the day. But this contextualization is not the way it works on Facebook. Linkages between updates are tenuous, connections malleable. We can lay out our stream of consciousness in a way that requires no consistency among updates. Maybe the temporal and logical requirements of the off-line world are suspended on social networking sites like Facebook. Maybe our brains need to catch up.

This is Your Brain on Technology?

There is a lot of polarized dialogue about the role of communication technologies in our lives – particularly mobile devices and social media: Technology is either ruining us or making our lives better than ever before. For the worried crowd, there is the notion that these technologies are doing something to our brain; something not so good – like making us stupid, numbing us, weakening social skills. It recalls the famous anti-drug campaign: This is your brain on drugs. In the original commercial, the slogan is accompanied by a shot of an egg sizzling on a skillet.

So, this is your brain on technology? Is technology frying our brain? Is this a good metaphor?

One fundamental problem with this metaphor is that these technologies are not doing anything to us; our brain is not “on” technology. Rather, these technologies are tools. When we use tools, we change the world and ourselves. So, in this sense, of course our brain is changed by technology. But our brain is also changed when we read a book or bake a pie. We should not accord something like a mobile device a privileged place beyond other tools.  Rather, we should try to remember that the effects of technology are a two-way street: we choose to use tools in a certain way, which in turn influences us.

We would also do well to remember that the brain is an amazing, seemingly alchemical combination of genetic predispositions, experiences, random events, and personal choices. That is, our brains are an almost incomprehensibly complex nature-nurture stew.  This brain of ours is also incredibly resilient and able to recover from massive physical insults. So, using a tool like a mobile device isn’t going to “fry” our brain. Repeated use of any tool will shape our brain, surely, but fry it? No.

So, “this is your brain on technology” doesn’t work for me.

The metaphor I like better is to compare our brains “on technology” to a muscle. This is a multi-faceted metaphor. On one hand, like a muscle, if you don’t use your brain to think and reason and remember, there is the chance that you’ll become less mentally agile and sharp. That is, if you start using technology at the expense of using these complex and well-honed skills, then those skills will wither and weaken. It’s “use it or lose it.”

On the other hand, we use tools all the time to extend our abilities and strength – whether it’s the equipment in a gym that allows us to repeatedly use muscles in order to strengthen them; or whether it’s a tool that takes our muscle power and amplifies it (think of a lever). Similarly, by helping us do things better, technology may serve to strengthen rather than weaken us.

It is an open question whether one or both of these views are true – and for what people and under what conditions. But I believe that we need to leave behind notions of technology “doing” things to our brains, and instead think about the complex ways in which our brains work with technology – whether that technology is a book or a mobile device.

 

Through a Glass, Darkly; But Then Face to Face: Sensitive Souls and Social Media

There is an idea out there that’s prevalent but which has little or no scientific support:  that people who use more social media are less sensitive, less empathic, and less emotionally attuned. My students Lee Dunn, Amy Medina and I wanted to put that assumption to the test (and reported these findings at the Society for Psychophysiological Research Annual Conference). We found the opposite: that people who prefer to use technology like social media to communicate with others are actually more emotionally sensitive and more empathic. These folks aren’t emotionally stunted or disconnected. If anything, they are more attuned to their emotions and to the emotions of others, and also might be more challenged by these emotions. They are “sensitive souls.”

This makes sense when you start to think about how hard face-to-face interactions can be. When we use social media, we may feel in control and safe compared to face-to-face encounters. Technology affords a comfortable distance. It’s simply easier to tell someone you’re angry via email or IM, without having to deal with their reactions in person. So, if you’re an emotionally sensitive person, you might be drawn to social media. This is a judgment-free statement. Our findings don’t weigh in on whether this helps or hinders a person’s social and emotional skills. That is the critical next step in our research. Here is what we know so far:

How we put it to the test. While previous studies asked people to report on very basic aspects of their social media use – like how many hours a week they use social media sites – we did something new. We asked people how they prefer to communicate with others (and what they actually did over the past 6 months) when they need to express emotions like anger or excitement, ask for or give social support during emotionally tough times, and exchange information. For each question, answers could vary from 100% using technology (not including the phone) to 100% using face-to-face interactions. Many people showed a strong face-to-face preference, but just as many showed a strong tech preference.
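
As a rough illustration of the kind of scale involved (the scoring function, item values, and cutoff below are my invention for this post, not the actual study instrument), imagine averaging a respondent’s 0–100 items into a single preference index:

```python
# Hypothetical scoring sketch: each survey item asks what fraction of the
# time a person uses technology rather than face-to-face contact for a
# given situation, from 0 (all face-to-face) to 100 (all tech).

def tech_preference(item_scores):
    """Mean of 0-100 items; above 50 leans tech, below 50 leans face-to-face."""
    return sum(item_scores) / len(item_scores)

# Invented respondent: expressing anger, seeking support, sharing news, ...
respondent = [80, 65, 90, 70]
score = tech_preference(respondent)
print(f"Preference index: {score:.0f} "
      f"({'tech' if score > 50 else 'face-to-face'} leaning)")
```

A score like this says nothing about whether the preference is good or bad – it just lets you place people along the tech/face-to-face continuum.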

Then, we asked people to tell us about their emotional lives – emotional highs and lows, empathy for others, personality, and satisfaction with the social support they receive from others. Finally, we recorded EEG (aka “brainwaves”) while they viewed emotional pictures. While EEG doesn’t give us the power to directly access people’s consciousness (Oh, Dennis Quaid, you really had us believing that you could EEG your way into our brains in the 1984 movie Dreamscape), EEG can measure the degree to which our brains are sensitive to different types of emotional information – pleasant, disgusting, erotic, dangerous, and cute, cuddly things. We showed participants everything from sex to kittens, and graves to gore.

The power of EEG, portrayed by the movie Dreamscape (1984). Dennis Quaid is probably NOT looking at pictures of kittens.

Findings. Data analyses are incomplete and are not yet published, so I’ll only discuss the broad strokes of our findings. As I stated at the top, those who prefer to communicate via social media and technology versus face-to-face interactions are sensitive souls: they report feeling more negative emotions (like anxiousness and sadness), are less extroverted, and are less satisfied with the social support they receive from others. On the other hand, they also report feeling more empathic towards others (for example, “I get a strong urge to help when I see someone who is upset” or “it upsets me to see someone being treated disrespectfully”).

Complementing this, EEG findings show that those with a social media/tech preference have stronger brain responses to pictures portraying mortality – graves, sick people, dying loved ones. That is, the brains of folks who prefer social media are more sensitive to pictures that are reminders of death and loss.
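
For readers curious what “stronger brain responses” amounts to computationally, here is a minimal sketch with simulated numbers (not our data, and a deliberately crude summary statistic – real analyses would target a specific ERP component and use proper statistics): average each participant’s trials into an ERP waveform, summarize its amplitude, and compare group means.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_erp_amplitude(epochs):
    """epochs: trials x timepoints array. Average over trials to get the
    ERP waveform, then crudely summarize it as the mean over timepoints."""
    erp = epochs.mean(axis=0)
    return erp.mean()

# Simulated participants viewing mortality-related pictures:
# 20 trials x 200 timepoints each, 15 people per group.
tech_group = [rng.normal(1.2, 1.0, (20, 200)) for _ in range(15)]
face_group = [rng.normal(0.8, 1.0, (20, 200)) for _ in range(15)]

tech_amps = [mean_erp_amplitude(p) for p in tech_group]
face_amps = [mean_erp_amplitude(p) for p in face_group]
print(f"tech-preference group mean: {np.mean(tech_amps):.2f} uV")
print(f"face-to-face group mean:    {np.mean(face_amps):.2f} uV")
```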

This is not about social media causing anything! The popular press often describes research about social media in inaccurate ways – saying that social media caused people to be a certain way (e.g., the idea of Facebook depression). This sounds sexy but is just wrong most of the time. Unless you’ve done experiments that show social media directly change something about people, or you’ve tracked how social media predicts changes in people over time, you cannot even begin to discuss causality.

So what can we discuss? What does this all mean? What it means is that our findings are not about causality; they are descriptive. These results help us to describe the social-emotional profile of people who prefer and use tech-mediated versus face-to-face social interactions – their personalities, goals, strengths, and vulnerabilities. Ultimately, this can help us understand the growing role of social media in our everyday routines, and why, for some, these tools can feel like lifeboats in the stormy seas of our lives. What remains unclear is whether these lifeboats are going to bring us to shore or whether we will be lost at sea (ok, this metaphor is getting a little much).

Where are we going with this? Importantly, we have no idea what the long-term costs or benefits of social media are for our sensitive souls. That is where I am really going with this research. I believe we need to track how a tech preference influences us from the cradle to the rocking chair: in our digital natives who are using these tools before they are out of diapers; in adults, who almost can’t remember a time when these tools didn’t exist; and in older adults, who may be discovering the immense world that opens up before them when they use technology to communicate with others.

So Long Ago I Can’t Remember: Memory, Technology, and Creativity

I recently read an interesting blog post at Scientific American by the writer Maria Konnikova. In it, she writes about how memorization may help us be more creative. This is a counterintuitive idea in some ways because memorizing information or learning something by rote seems the antithesis of creativity. In explanation, she quotes the writer Joshua Foer, winner of the USA Memory Championship, from his new book: “I think the notion is, more generally, that there is a relationship between having a furnished mind (which is obviously not the same thing as memorizing loads of trivia), and being able to generate new ideas. Creativity is, at some level, about linking disparate facts and ideas and drawing connections between notions that previously didn’t go together. For that to happen, a human mind has to have raw material to work with.”

This makes perfect sense. How can we create something new, put things together that have never before been put together, if we don’t really know things “by heart”? This makes me think of the great classical musicians. Great musicians know the music so well, so deeply, that they can both play it perfectly in terms of the intention of the composer AND add that ineffable creative flair. It’s only when they’ve totally mastered and memorized the music that they can put their own stamp on it and it becomes something special. Otherwise, it’s robotic.

These issues are incredibly relevant to how human memory is adapting to new information technologies. Research has recently shown that when we think we can look up information on the internet, we make less effort and are less likely to remember it. This idea is referred to as “transactive memory” – relying on other people or things to store information for us. I think of it as the External Second Brain phenomenon – using the internet and devices as our second brain so that we don’t have to hold all the things we need to know in our own brain. As a result, how much do we actually memorize anymore? I used to know phone numbers by heart – now, because they are all in my phone’s address book, I remember maybe five numbers and that’s it. How about little questions I’m wondering about, like: When was the first Alien movie released (okay, I saw Prometheus last week)? The process of getting the information is – 1. Look it up; 2. Say, “ah, right, interesting”; 3. Then with a 75% probability in my case forget it within a week. Information is like the things we buy at a dollar store – easily and cheaply obtained, and quickly disposed of.

A colleague in academia once told me about an exercise his department made their graduate students go through in which they presented their thesis projects – the information they should know the best, be masters of really – using an old-school flip chart with paper and Sharpies. Without the help of their PowerPoint slides and notes, they could barely describe their projects. They had not internalized or memorized the material because they didn’t need to. It was already in the slides. If they didn’t know something about their topic, they could just look it up with lightning speed. Only superficial memorization required.

In addition, the process of relating to and transcribing information has changed. Today, if students need to learn something, they can just cut and paste information from the internet or from documents on their computers. They often don’t need to type it in their own words, or even type it at all. They miss a key opportunity to review and understand what they are learning. We know that things are remembered better when they are effortfully entered into memory – through repetition, and using multiple modalities like writing it out and reading it. If you quickly and superficially read something, like we do all the time when we are on the internet or zooming from email to website to app, then you cannot put that information into memory as efficiently. For most of us, it just won’t stick.

On the other hand, shouldn’t the vast amounts of information we have at our fingertips aid us in our creative endeavors? Haven’t our world and the vision we have of what is possible expanded? Couldn’t this make us more creative? Perhaps, by delegating some information to our external second brains, we are simply freed up to focus our minds on what is important, or on what we want to create (credit to my student Lee Dunn for that point).

Also, I think many of us, me included, know that we NEED help negotiating the information glut that is our lives. We CAN’T keep everything we need to know memorized in our brains, so we need second brains like devices and the internet to help us. I don’t think we can or should always relate deeply to and memorize all the information we have to sift through. It is a critical skill to know what to focus on, what to skim, and what to let go of. This is perhaps the key ability of the digital age.

I also appreciate all the possibilities I have now that I would NEVER have had before were it not for the incredible breadth and speed of access to information. As a scientist, this has transformed my professional life for the good and the bad – along with opportunities comes the frequently discussed pressure to always work. But give up my devices? I’d rather give you my left arm (75% joking).

As a child developmentalist and psychologist, I feel that we have to acknowledge that these shifts in how we learn, remember, and create might start affecting us and our children – for good and bad – sooner than we think. This isn’t just the current generation saying “bah, these newfangled devices will ruin us (while shaking wrinkly fist)!!!” I think these changes are evolutionarily new, all-pervasive, and truly different. We as a society have to contend with these changes, our brains have to contend with these changes, and our children are growing up in a time in which memory as we think of it may be a thing of the past.

The Gamification of Learning

A recent Pew Report polled internet experts and users about the “gamification” of our daily lives, particularly in our networked communications. They write:

The word “gamification” has emerged in recent years as a way to describe interactive online design that plays on people’s competitive instincts and often incorporates the use of rewards to drive action – these include virtual rewards such as points, payments, badges, discounts, and “free” gifts; and status indicators such as friend counts, retweets, leader boards, achievement data, progress bars, and the ability to “level up.”

According to the survey, most believe that the effects of this gamification will be mostly positive, aiding education, health, business, and training. But some fear the potential for “insidious, invisible behavioral manipulation.”

Don’t pooh-pooh the behavioral manipulation point. Do you really want to have your online behavior shaped like one of Skinner’s rats by some faceless conglomerate? But that’s actually not what got me going. What got me wondering about where this is all heading is that it seems undeniable that gamification will shape how we learn – in particular, how kids learn.

Elements that make up this gamification – rewards, competition, status, friend counts – are particularly powerful incentives. Neuroscience has repeatedly documented that these incentives rapidly and intensely “hijack” the reward centers of our brain. So it begins to feel as if we’re addicted to getting that next retweet, higher friend counts, higher scores on Fruit Ninja, and so on. Even the sound that our device makes when a message pops up gives us a rush, makes us tingle with anticipation. We eagerly wait for our next “hit” and are motivated to make that happen.
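
If you want to see how little machinery this takes, here is a bare-bones sketch (invented thresholds and badge names, not any real platform’s rules) of the points-and-badges loop that keeps us chasing the next “hit”:

```python
# Award "level up" badges at round-number milestones -- a toy version of
# the incentive structures described in the Pew report.
BADGES = {10: "Newcomer", 100: "Regular", 1000: "Power User"}

def reward(points, action_value):
    """Add points for an action and announce any freshly earned badge."""
    before, after = points, points + action_value
    for threshold, badge in BADGES.items():
        if before < threshold <= after:
            print(f"Level up! You earned the '{badge}' badge.")
    return after

points = 0
for action_value in [5, 5, 40, 60]:   # likes, retweets, streaks, ...
    points = reward(points, action_value)
print(f"Total: {points} points")
```

The point is not the code but how cheap and legible the manipulation becomes once it is laid out: a handful of thresholds is enough to keep the dings and badges coming.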

This gamification could have a powerful impact on how we go about learning. Psychological researchers distinguish between a fixed and a growth mindset – that is, people’s beliefs – about intelligence and learning. When people have a fixed mindset, intelligence is viewed as a hard-wired, permanent trait. If intelligence is a fixed trait, then we shouldn’t have to work very hard to do well, and rewards should come easily. In contrast, in a growth mindset, intelligence is viewed as something that can grow and develop through hard work. In this way, a growth mindset promotes learning because mastering a new skill or learning something new is enjoyable for its own sake and is part of the process of intellectual growth. Intelligence is not fixed because it is shaped by hard work and effort. For a nice summary of these distinctions, see a recent post on a wonderful blog called Raising Smarter Kids.

This is where gamification comes in. If children are inundated with incentives and rewards for even the simplest activity or learning goal, motivation for learning becomes increasingly focused on the potential for reward, rather than the process and joy of learning. In addition, when you’re doing things mainly for the reward, the motivation for hard work will peter out after a while. You just move on to the next, perhaps easier way of getting rewards rather than digging in and trying to master something. It also becomes more difficult to appreciate the value of setbacks – not getting a reward – as an opportunity to improve. In these subtle ways, gamification may undermine a child’s ability to develop a growth mindset. Instead, we might have a generation of children who are implicitly taught that everything we do should be immediately rewarded, and that getting external things, rather than the joy of learning, is why we do what we do.

Promoting a growth mindset is not only important for helping our children learn, but for helping them face frustrations and obstacles. Dona Matthews and Joanne Foster, in Raising Smarter Kids, highlight several rules to promote a growth mindset:

1. Learn at all times. This means think deeply and pay attention. When we use technology and social media, we can sometimes err on the side of doing things very quickly and superficially. So, this rule is important to emphasize with children today more than ever. We also have to remind our children (and ourselves) that it’s ok to make mistakes, even if we don’t get rewarded for our efforts.

2. Work hard. This is a skill that of course can be promoted by the presence of incentives – kids will work for hours at a game if they can beat their highest score. But what happens after they get the reward? Are they committed to continue learning? Will they continue struggling and practicing? Sustained hard work is an opportunity for personal growth that external motivation, like that from rewards, may not be able to sustain. Here, the enjoyment of learning and gaining mastery may be the most powerful motivator when it comes to helping children become dedicated learners for the long haul.

3. Confront deficiencies and setbacks. This is about persisting in the face of failure. The increasing role of gamification could both help and hinder this. Gamification will help in the sense that with so many rewards and game dynamics, opportunities for failure are around every corner and children will need to learn to persist. At the same time, what guarantees that a child will persist to obtain these rewards? Rewards are not equally motivating for all individuals. Will those not interested in rewards and games just be left feeling bored, and take part in fewer opportunities for learning?

I’m not saying that we should avoid all rewards – that would be too extreme, and impossible to boot. But we must maintain our awareness of how, with increasing gamification, the simplest act of using technology, logging onto our favorite website, or using social media might be subtly changing our motivation to learn.

Pattern Recognition: How Technology Might Make Us Smarter

There is a lot of talk about how technology might be making us stupid. The examples are legion, and the possibilities endless: we can’t spell anymore; we can’t remember anything anymore because we have a big, giant, virtual brain called the internet; we have flea-like attention spans; etc., etc., etc.

To over-generalize like this is certainly giving technology a bum rap. And of course, many argue the opposite – that using different technologies improves key abilities like working memory and eye-hand coordination. I think that there is always the risk of losing skills (aka becoming more stupid) if we use shortcuts all the time and look at things superficially rather than using our brains to understand something at a deeper level. But there are many opportunities to gain new abilities via technology.

One ability that I think might be enhanced by the use of internet-based platforms, like social media, web browsers, and online shopping, is pattern recognition. From the point of view of psychology, pattern recognition refers to perceiving that a set of separate items make up a greater whole – such as faces, objects, words, melodies, etc. This process often happens automatically and spontaneously, and seems to be an innate ability of most animals. Certainly, the tendency to see patterns is fundamentally human – even patterns that don’t exist, such as the Man in the Moon.

How would using the internet help strengthen our pattern recognition abilities? To use the internet, we have to become skilled at skimming through large quantities of information rapidly, instantly judging whether we’ve found the information, website, or person that we’re looking for. Also, we have to rapidly shift from site to site. To process all that information slowly and serially would keep us busy all day. We have to put it together, see the patterns, and glean the information that we need. Children are frighteningly good at this. They have no difficulty sorting through complex arrays of information and graphics. It feels like they read the patterns of computer interfaces like native speakers. It’s not for nothing that we call children growing up today digital natives.
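
As a loose computational analogy (my toy example, nothing more), pattern recognition in this sense is like sliding a known template across a stream of items and flagging wherever it matches:

```python
# Naive sliding-window template matching: "recognize" a known pattern
# inside a longer stream, the way a skimming reader spots the familiar
# shape of what they are looking for.

def find_pattern(stream, pattern):
    """Return every index at which pattern occurs in stream."""
    n = len(pattern)
    return [i for i in range(len(stream) - n + 1)
            if stream[i:i + n] == pattern]

headlines = ["ad", "cat video", "news", "ad", "cat video", "news", "recipe"]
motif = ["ad", "cat video", "news"]
print(find_pattern(headlines, motif))   # -> [0, 3]
```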

One of my favorite books of the last decade, Pattern Recognition, by the great technovisionary William Gibson, plays with the idea of what pattern recognition means to us today. Set in the present (rather than some future dystopia, which is more usual for him), the novel follows its protagonist, Cayce (pronounced “case,” not “cas-ee”), who has an extreme psychological sensitivity to corporate logos and what amounts to an allergic reaction to successful advertising. So, companies hire her to judge the effectiveness of their proposed corporate logos and advertising strategies. Her ability is to effortlessly identify the je ne sais quoi – that special pattern – that makes a logo powerful and effective. I think that Gibson is thinking about our era as one in which highly skilled pattern recognition defines what we do and who we are becoming.

So, the question arises: Does that mean I want to sit my 3-year-old in front of a device for hours a day to help him build these abilities? No. But perhaps focusing on the skills he can build will help me think through how to structure his use of things like the iPad more effectively – such as what apps to choose for him, how to dovetail what he’s learning on the device with what he’s doing in the world (e.g., building blocks all the time, learning about letters and numbers), and how to help him see the patterns in what he’s doing.

Of course it is way too simplistic to demonize any technology by saying it will make us stupid. It’s all about the costs and benefits of how we use the technology. That’s why the research community needs to step up to the plate and try to understand how all these aspects of our children’s technological lives are changing them (or not) – what technology offers us, and what we in turn bring to the table in that equation. We know shockingly little. As parents, we can either cut our children off from technology altogether, or try to use our best judgment and make our children’s interactions with technology useful and powerful. As adults, we can do the same – clearly, we need to think carefully about how we want to integrate these devices into our lives.

Now, sit down and look through your Twitter feed or Facebook news feed, and see all the information you have to sort through. Tons of it! Reams – just in a given day… And feel how your pattern recognition abilities are growing!