Kindness is a 21st Century Skill

These are rapidly changing times, in part due to the frenetic pace of technological innovation. The ways we communicate, connect, love, hate, and elect presidents have been forever altered. Given this, educators, parents, and corporations are focusing on cultivating 21st century skills – skills like problem solving, synthesizing information, interpretation, collaboration, and kindness. These skills prepare us for the increasingly complex life and work environments of the 21st century, and they reflect the changing nature of work, communication, and how we use technology to facilitate our lives.

I believe that of these, kindness is the most critical 21st century skill, whether your goal is a civil society or successful business. Kindness is at the hub of our pro-social selves and is the glue of civilization. It allows us to understand the world through another’s eyes and act meaningfully in that world.

What is kindness? Kindness means interacting with others in friendly, generous, and thoughtful ways. It means performing acts to benefit others without expectation of reward or benefit for oneself.

For that reason, forcing acts of kindness sabotages the motivation to be kind, and a display of good manners does not automatically mean that a person is kind. Good manners can exist in the absence of generosity and thoughtfulness, and can be motivated by the hope of reward and praise.

Kindness is distinct from other, related aspects of our pro-social selves. For example, sympathy refers to the concern for and understanding of someone else’s distress, feeling pity toward the misfortune of another, especially those perceived as suffering unfairly. In contrast, empathy is the capacity to experience what another person is experiencing, including thoughts, emotions, and sensations, all from the other person’s frame of reference. It leads to an attuned response from the observer. And compassion, perhaps the pinnacle of our pro-social self, is empathic and sympathetic awareness of another’s suffering coupled with the drive to alleviate it. Think Mother Teresa, although compassion does not need to be that elevated, complete, or grand.

So, kindness is at the hub of all these aspects of our pro-social selves. Kindness does not emerge out of a vacuum, nor is it innate. Kindness is instead the result of core, crucial skills and capacities that lay the foundation for kind behavior and for kindness as a moral compass. These capacities are the sine qua non of our pro-social selves: perspective taking, emotion regulation, moral reasoning, and modeling. Each of these skills allows kindness to emerge, and without them kindness is impossible.

Here, I want to focus just on perspective taking. Perspective taking is the ability to put oneself in another person’s shoes, to understand that someone might think and feel differently than you do. Perspective taking allows us to feel sympathy and empathy.

In psychology, perspective taking is part and parcel of Theory of Mind, which describes how we have a latent “theory” or belief about how the world works. This theory assumes that other people have minds, and that these minds think, feel, and believe things that are distinct from what we think, believe, and feel. In disorders such as Autism Spectrum Disorder, where social understanding is disrupted, Theory of Mind and perspective taking may not develop fully or in the ways we see in typical development. In very young children, Theory of Mind and perspective taking are evident when a toddler plays a trick on someone or surprises someone. To surprise someone, the child must grasp that the other person does not know something the child knows – that the other person has a mind of their own.

In our current political climate in the U.S., as well as in nations all over the world, kindness and civility appear to be crumbling. Xenophobia and “us versus them” thinking are ascendant. One of the most effective ways to combat this, I believe, is to practice perspective taking: make a habit of trying to understand what another person is experiencing and why they might experience the world in the way that they do. Practicing perspective taking will nourish kindness in us all.

Blast from The Past: The Day My Three-Year-Old Discovered Multitasking

Today we’re revisiting a post from a few years ago “The Day My Three-Year-Old Discovered Multitasking”:

I recently overheard a conversation between my three-year-old son, Kavi, and my husband. Kavi was about to go to bed and had only a couple minutes left to play. Dada asked him to choose how he wanted to spend his remaining time. Kavi said, “I have a great idea, dada! I can play iPad AND play Legos at the same time!!!”

Hoo boy, I thought. My son is becoming a multitasker at age three. Already dissatisfied with the pleasure of any single activity, he is trying to divide his attention between two things (one of which is a mobile device) thinking it will be more fun and he won’t have to miss out. Is this an expression of the dreaded FOMO, fear of missing out, rearing its head so early?

And thus followed a mental checklist of my potential parenting failures. Two stand out:

  1. I multitask too much in front of him. I am definitely a multitasker, but one who makes strong efforts to put away my devices when I am with my family. I don’t always succeed, so have I become a bad role model?
  2. I don’t encourage him to enjoy the process of doing and learning. As I’ve blogged about before, one way of thinking about styles of learning is to make the following distinction: we can focus on and enjoy the process of learning, or we can learn with the goal of obtaining rewards (praise, grades, etc.). If Kavi is so interested in multitasking, perhaps this is because he doesn’t fully enjoy the process of doing a single activity.

Then I thought, on a more hopeful note, that maybe I’ve done something very right, teaching him 21st century skills and facilitating his mental acuity:

  1. Multitasking in moderation is useful! Certainly, at this moment in time, people could be at a disadvantage if they are not able to take advantage of opportunities to multitask – in moderation – to gather information, learn, or accomplish goals. So, the fact that it occurred to him to multitask two things he likes to do could simply indicate that his cognitive development is moving along nicely.
  2.  Maybe he is learning to augment his creativity via technology. Perhaps his thought was – well, I’m hitting a wall with new things to build with Legos so maybe I can use the iPad to come up with more ideas. But who knows what he was thinking. So I asked him.

The conversation went something like this:

Me: Hey sweetie, do you remember when you told daddy that you wanted to play iPad and Legos at the same time?

Kavi: mumbles something.

Me: What’s that?

Kavi: Yes, I think so.

Me: Why did you want to do iPad and Legos at the same time?

Kavi:  Because it’s the same kind of fun.

Me: The same kind of fun?

Kavi:  Yes. First you do iPad, then you do Legos. iPad, Legos, iPad, Legos….

Me:  But you also play Legos alone, just Legos.

Kavi: But that would be boring!

Me: Really? I see you do that all the time.

Kavi: Yes…..

At this point, I decided to drop it. So, what does this little bit of anecdotal evidence mean? I have no idea. But I think the bottom line is that I know my son and I’m not too worried. He is already quite good at focusing for long periods of time (he can build with Legos for hours if you let him). Perhaps, though, there is something I can do better. I could focus more on promoting his JOMO  – the joy of missing out. It’s the feeling that what you’re doing right now, at this moment, is exactly the perfect thing to do.


Is Your Child Using Devices Too Much? Apply the Delight Principle

Many of us parents worry about the potential negative effects of technology – particularly mobile technology – on our children. But we have precious little science out there that can help us figure out the costs and benefits, risks and returns. Heck, we’ve had television sets in our homes for over 80 years and we still don’t know a lot about their effects on kids.


But putting our kids in front of technology is sometimes hard to resist. Your kid is having a tantrum on the grocery line? Bring up a movie on the iPad. Children whining at the restaurant? Hand them your iPhone and see their little smiling faces and glazed-over eyes light up from the warm glow of the screen.

However, these solutions are often tinged with parental guilt and a nagging feeling that maybe we shouldn’t be doing this quite so much. To figure out how much is too much, I apply what I call the delight principle – and it’s perhaps not what it sounds like. It’s not experiencing the (yes) exquisite delight of  that whining/crying/fussing/annoying behavior stopping as quickly as if you pressed the mute button. Rather, it’s the idea that if we’re putting devices in our children’s hands so much that we’re losing opportunities to delight in them and enjoy their wonderful little selves, then we might want to reevaluate.

In a nutshell, devices can be used in a “disconnecting” way that, over time, can reduce a child’s experience of that  loving twinkle in your eye, that unconditional positive regard that is the cornerstone of a happy childhood.

This notion – show your child that you delight in them – is obvious in many ways.  But I think that in the cacophony of all the “expert” parenting advice out there – from free range parenting to attachment parenting – this simple instinct that every parent has is easy to lose track of. When children are NOT being delightful (often!), devices are not necessarily a parent’s best friend. Here are a few ways that delight can be blocked when devices are used to disconnect during frustrating situations:

1. Remember to twinkle: Children need to see themselves literally reflected in our eyes in the form of that loving twinkle. It’s not that we need to praise them (and indeed there is good research coming out now about the downside of praise) but rather we need to take joy in their accomplishments, mirror their journey of self-discovery, and be our children’s promoters (as distinct from praisers). Putting devices in front of our kids “too much” has the effect of directly, physically blocking that twinkle. We need to trust our guts as parents on how much twinkle we want to block and make a mindful choice.

2. Share your child’s world: Take time to see the world from your child’s perspective. Every parent knows that it’s a magical place. Explore the world together, discuss ideas, point out things that are interesting or puzzling or wonderful. Listen to what they have to say about it, and if they don’t have much to say, just be with their experience of it and share your experience. Using a device to share in your child’s world seems like one of the best possible uses of a device. So, when we bring out a device, we can choose to use it to connect with our children or to tune them out.

3. Help your child find their own inner delightful child: Just in case you were starting to think I am a proponent of “just twinkle and let the hard stuff go” – not the case. By #3 here, I mean I think we shouldn’t be afraid to talk to our child about being civilized and polite – yes, delightful – human beings. I think that children who are explicitly taught and socialized to be polite, compassionate, and empathic will on average be delightful children and will grow up to be delightful adults. And the converse is also true. I think too much device time reduces opportunities to guide our children towards being delightful. Moreover,  we have to believe that a child is delightful for this to even work. With too much device time I think it’s harder to know how delightful our children truly can be.

There are definitely times when I choose to use a device to press that mute button and just take a break. But when this starts to become a family habit (are they on the device every time you go out to dinner, precluding opportunities to actually talk with one another? Are they spending so much time watching TV that you don’t know how their day at school was?), it might make sense to do a delight check and make sure the technology choices we’re making for our children sit right with us.

Mental Health on the Go

My forthcoming research paper reporting on a mobile app that gamifies an emerging treatment for anxiety and stress  – a paper that hopefully will be officially out in the next month or so – is starting to be discussed in the media, including the Huffington Post. Thank you Wray Herbert for such great coverage of the study.

The Medium is the Message: On Mindfulness and Digital Mirrors

I recently had the pleasure of doing a talk-back with Congressman Tim Ryan on the role of mindfulness – focusing your awareness on the present moment – in education, as part of the Rubin Museum’s Brainwave Festival in NYC. The film in question, “Changing Minds at Concord High School,” followed an entire school as they took part in a mindfulness training program. This school is unique in that it is a transfer school, a last stop for many kids with a history of school failure and discipline problems. The twist here is that the students both filmed the experience and conducted a study – of their classmates! – comparing the effects of mindfulness training with those of a placebo. We also included a science curriculum on the neuroscience of mindfulness – how it can change our brains for the better. I was the lead scientist on this project, so the kids were my “research assistants.” The project was spearheaded and directed by the amazing Susan Finley and filmed by the equally inspiring Peter Barton (with the help of the students). Our outstanding scientific advisors were David Vago and Robert Roeser. There is a lot that was amazing about this project, these kids, and this film. I want to focus on just one aspect, which hinges on the phrase “The medium is the message.”


The medium is the message. This phrase was coined by Marshall McLuhan who put forward the idea that the “form of a medium embeds itself in the message.” That is, the medium in which we experience something influences how we perceive the take-home message. Using movies as an example, he argued that the way in which this medium presents time has transformed our view of time from something that is linear and sequential into something that reflects patterns of connection across people and places. I am obviously no film theorist, but I apply this notion to the idea that different media provide us with an array of tools that can help us create a narrative of ourselves and the world that is unique to that medium.

Film and self-identity. In the case of our film “Changing Minds at Concord High School,” I believe that one way that the medium was the message for our students was that film is able to portray individual identities as being truly flexible and changeable. I think that the teens at Concord High, many of whom have experienced tremendous challenges, stress, and obstacles in life, didn’t believe as a group that change for them was really possible. But what our program strove to do, using converging media – film, scientific readings, mind/body experiences of mindfulness – was to convince these young adults that they really could change their brains, change counterproductive habits of thinking, and find the tools to focus more and let negative feelings go. As we move on to Phase 2 of the project by refining and developing our program, we are asking the fundamental question: How can we best use these tools to teach teens to view themselves and the world differently, creating a narrative in which personal change is possible?

Our digital mirrors. I think these issues are especially important to consider now, in this era of social media and reality television in which we crave to see ourselves reflected back to ourselves. We can criticize this, and analyze this, but the fact of it borders on the irrefutable. We know that it’s easier than ever before to document our lives via pictures and videos on our mobile devices, and share them with our digital networks. And we love to do so. Social media, through which we share our images of ourselves and our lives, are an immeasurably huge and complex array of mirrors into which we can gaze at ourselves. There may be costs and benefits to this, but it simply is. The power of this, however, is that we now have a new set of tools to curate our beliefs about who we are – hopefully for the better. And perhaps we believe this evidence of who we are more strongly because it is concrete, it is documented, it receives “likes” and is seen by others and thus is real. I’m liked therefore I am.

This digital infrastructure also provides a profound opportunity for those trying to support growth and positive change in youth. If we help youth document the possibility of change – like we did in “Changing Minds at Concord High School”- they may start to believe it applies to their own lives. This is particularly important for those of us who aren’t used to feeling that the world is full of possibilities. In this way, social networking may be a medium that gives the message that change is possible and that our limitations are as fluid as the flow of information.

Rebel Without a Status Update

I am fascinated by the psychology of Facebook status updates. There are many reasons to make a status update. One reason, of course, is obvious – let others know what you’re up to or share something that’s cool. For example, if I did frequent status updates, I might decide to post “Buying a fantastic ½ pound of Australian feta at Bedford Cheese Shop on Irving Place – should I up it to a pound?!” (and seriously, it is incredible). This may be an interesting snapshot of a day in my life, but these types of status updates are exactly the ones that tend to annoy me for some reason. Even the most benign version of this feels like TMI.

Why? Status updates are for many an instinctive way to reach out.  A recent study even showed that increasing the number of status updates you do every week makes you feel more connected to others and less lonely. Seems like a good thing! Moreover, it’s consistent with what seems to be our new cultural comfort zone – being virtually seen and heard by a loosely connected group of people we know (or sort of know) as our “social network.” This virtual network is the social status quo for many of us, and certainly for many children growing up today.

I believe one consequence of this is that no one wants to be James Dean anymore. To put it another way, maintaining privacy and being a strong silent type, like Dean, are no longer alluring ideas to us. And when I thought of this, I realized why I don’t feel fully comfortable with the status update culture – I am a proponent of the James Dean School of Sharing Personal Information in Public, whose motto is: the less, the better. I like understatement, privacy, and the choice to share with a few and retain privacy with most.


It’s no coincidence that as a culture, we don’t fetishize James Dean any more. Many of today’s icons (some of them “anti-icons” because we love to feel superior) are people who humiliate themselves, who will tweet that they’re on the toilet and what they’re doing there, who end up in compromised positions, and happen to have pictures and videos of those positions, which then promptly go viral (funny how that happens). James Dean would have disapproved.

James Dean himself would have been very bad at social media… or perhaps very, very good. Very bad, because he would have had little to say, and would have hated the constant spotlight and social media culture of ubiquitous commentary and chit chat. On the other hand, he might have been very good at it because he would have been the Zen master of the status update, expounding with haiku-like pithiness. An imaginary James Dean status update:

James Dean…….

Old factory town

Full moon, snow shines on asphalt

#Porsche alive with speed

But seriously, while he probably wouldn’t have written haiku, perhaps he somehow would have figured out how to use sharing to create a sense of privacy, because a sense of mystery would have remained intact.

Yes, the status update is a beautiful thing. We have an efficient and fun tool which allows us to reach out to others, curate our self-image, and think out loud to a community. But I wonder if we’re starting to lose the simple pleasures of privacy, of knowing less and wondering more.

Are Video Games the Learning Tools They’re Cracked Up To Be?

I was just included in an interesting “Up For Discussion” feature on Zócalo Public Square about whether video games in education are all they’re cracked up to be.

http://www.zocalopublicsquare.org/2012/12/03/class-i-commend-you-for-your-work-on-resident-evil/ideas/up-for-discussion/

Check out the whole array of viewpoints. Here was mine:

I believe that while many people overrate the benefits of video games in education, just as many underrate them. Video games are tools like any other. Their pros and cons depend on how, why, when, and for whom the video games are used. The use of video games in education should be tailored, not off-the-rack. However, until we have more direct scientific evidence on this topic, we can only do thought experiments. For my thought experiment, I focus on how video games might influence the broader contexts of learning: relationships and motivation.

Relationships. Do video games influence the teacher-student relationship? A recent study hints at the possibility. This study compared mothers playing with their toddlers with traditional toys versus electronic versions of the same toys. Mothers playing with the electronic toys were less responsive, less likely to be educational, and less encouraging. Might the same apply to teachers and students? Could video games, because they “do the teaching,” have a negative impact on a teacher’s ability and motivation to engage with students? Could video games disempower teachers?

Motivation. We use incentives all the time to motivate learning (e.g., grades), but video games may be unique in the degree to which incentives, whether points or rewards, are integral to the learning process. If the motivation for learning becomes too closely tied to these external incentives, the pleasure of learning for learning’s sake may be squelched and children may miss opportunities to appreciate that setbacks—not getting a reward—are opportunities to improve. We must think through the subtle ways in which video games can shape children’s motivation for learning and design video games to encourage the learning style we believe will be most productive.

Whether one believes that video games will lead to shorter attention spans and boredom in the classroom or that they are powerful tools for igniting a child’s passion for learning, video games will soon become a central part of the educational landscape. So, let’s figure out how to do it right.


Blocks and books better than electronic games for your toddler?

Thanks for this post, Dona Matthews! 

Blocks and books better than electronic games for your toddler?

I think one important take-home message is that we need to think through how electronic toys could be designed to better foster communication  and creativity.

With Great Power Comes Great Responsibility: Are Social Media Anti-Social?

This past Wednesday, I had the pleasure of being a panel member for a debate at the UN on social media. It launched the debate series “Point/Counter-point” organized by the United Nations Academic Impact team.  You can see the debate here.

We debated on the theme “social media are anti-social.” I was assigned to the team arguing in support of this point. I was unhappy with being asked to take this side – because I don’t agree with it! – but I was willing to do so with the understanding that I would argue that the very question of whether social media are anti-social is a faulty one. That is, like most technology, social media are neither good nor bad in and of themselves because the impact of social media depends on how they are used. Moreover, from a scientific standpoint, we know almost nothing about whether social media are actually making us more “anti-social” – less socially connected and less socially skilled.

After clearly stating this, however, my strategy was to highlight ways in which social media COULD be antisocial – emphasizing that the research to test these possibilities remains to be done. Perhaps that was one reason why we (my teammate BJ Mendelson and I) lost so spectacularly. At the same time, it was clear that the audience (whose votes determined the winning side) had already made up their minds before the debate even began. This was unsurprising because social media, as this era’s technological bugaboo, are absurdly polarizing. They are either the scapegoat for all that is wrong or the best hope for a utopian future. And of course, the truth is always somewhere in between.

Coincidentally, this very debate had just been played out in relation to an inflammatory Newsweek article last week called “Is the Web Driving Us Mad?” A flurry of responses emerged, including an online Time Healthland article calling into serious question the Newsweek article’s review of evidence that the internet “makes” people crazy. Essentially, the Newsweek article is accused of being sensationalistic rather than doing what responsible journalism is supposed to do: (a) impartially seeking out and weighing the evidence that exists with a careful eye to the quality and direct implications of the science being cited, and (b) avoiding quoting scientific findings out of context.

I believe, however, that there is so much polarized debate because the research we need to weigh in on these issues has not yet been conducted. And that was my main point in the debate. We know almost nothing about the cause and effect relationship between social media or the internet and mental health: Are these technologies making us crazy, depressed, anxious, and so on, or are people who are already troubled in offline life troubled no matter what the context? How do we measure anti-social, or crazy, or any other outcome that reflects the well-being of an individual? The plethora of unanswered questions makes for polarizing journalism.

One interesting possibility that the Newsweek article brought up and which I considered in the debate was the idea that social media may influence us in ways that are more powerful than other types of technology because they tap into something that is fundamentally rewarding to humans (and most mammals!): the need to be socially connected with others.

I made the point in the debate that, “Science is finding that social media are so rewarding, so motivating, that they essentially ‘hijack’ our brain’s reward centers – the same brain areas that underlie drug addiction – so that you see what all of us can attest to: people have difficulty disengaging from social media. They feel the need to constantly check their device for the next text, tweet, status update, or email. They feel obsessed. The documented existence of Facebook addiction attests to this. How many of us walk down the street, or eat dinner in a restaurant, with our devices clutched in our hand or lying on the table right next to us like a security blanket? I know I do more often than I’d like.”

Indeed, we don’t walk down the street reading a book or watching TV. These technologies can be consuming, but the nature of social media – portable, brief, deeply social – creates a completely different set of temptations and rewards. Textbook theories of behavioral learning and reinforcement tell us that the way rewards are integrated into social media is a recipe for keeping us roped in. For example, if your goal is to, say, make a rat in a cage press a bar as frequently as possible, you should do the following: every once in a while, in a completely unpredictable way, give a reward when the bar is pressed. In contrast, if you give a reward every time the rat presses the bar, it will become sated and press less. And if you reward on a predictable schedule, the rat will press the bar just enough to get the reward and no more – because it knows how many times it needs to press before the reward comes.

Now think about how we use our devices. We check our devices frequently (analogous to pressing the bar) because we’re never sure when an important message, really good piece of news or fascinating factoid will pop up (analogous to the unpredictable reward). So, we find ourselves with device in hand “pressing the bar” over and over again, all day long. The whole economy of social media (i.e., the way the creators of these platforms make their money) is hugely dependent on this very fact.
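For readers who like to see the mechanics spelled out, here is a minimal sketch of the three reward schedules described above – a toy simulation in Python, purely for illustration. The schedule names, the 30 presses, and the 1-in-5 reward ratio are my own assumptions, not numbers from any study or platform.

```python
import random

def rewarded_presses(schedule, presses=30, ratio=5, seed=0):
    """Return which bar presses (or phone checks) pay off under a given schedule.

    schedule: 'continuous' -> every press is rewarded
              'fixed'      -> every `ratio`-th press is rewarded (predictable)
              'variable'   -> each press pays off with probability 1/ratio (unpredictable)
    The press count and ratio are illustrative assumptions only.
    """
    rng = random.Random(seed)
    hits = []
    for press in range(1, presses + 1):
        if schedule == "continuous":
            hit = True
        elif schedule == "fixed":
            hit = (press % ratio == 0)
        else:  # 'variable'
            hit = rng.random() < 1.0 / ratio
        if hit:
            hits.append(press)
    return hits

for s in ("continuous", "fixed", "variable"):
    print(s, rewarded_presses(s))

# On the fixed schedule the payoffs are predictable (presses 5, 10, 15, ...),
# so there is no reason to press in between. On the variable schedule the
# payoffs land at unpredictable presses, so the only way not to miss one is
# to keep pressing -- or, in our case, to keep checking the phone.
```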

Now I have to stop and give a MAJOR caveat: This idea may be compelling and sound like it could be right, but, from my reading of the literature, there is very little direct evidence that this is the case. All we know is that, neurologically, aspects of social media and internet use are rewarding, calming, and pleasurable. That is a far cry from “hijacking our brain,” a phrase I used in the debate for the sake of argument and hyperbole. At the same time, a growing number of people think this is a viable hypothesis, and one that we must put to the test.

By the end of the debate, I think we were all in agreement that when forced to pick a side, we could argue it. But really, we all felt the same thing: Whether social media are anti-social simply depends. It depends on who is using them, how they are using them, and why they are using them. And we just don’t have the scientific knowledge yet to understand these whos, hows, and whys.

I concluded my opening statement in the debate by saying, “Until we as a society spend the time, energy and resources to scientifically test how we are changed [by social media], we should proceed with caution and with the possibility in mind that social media could make us more anti-social.”

But BJ Mendelson may have summed it up best when he made a good old-fashioned fan boy reference: with great power comes great responsibility. We need to take the responsibility to look at, question, and try to understand the role of social media in our lives and in society.

Cyborgs, Second Brains, and Techno-Lobotomies: Metablog #2

Last week, I had the pleasure of being Freshly Pressed on WordPress.com – that is, I was a featured blog on their home page. As a result, I got more traffic and more interesting comments from people in one day than I have since I began blogging. Thanks, WordPress editors!

I’ve been really excited and inspired by the exchanges I’ve had with others, including the ideas and themes we all started circling around. Most of the dialogue was about a post I made on technology, memory, and creativity. There, I was interested in the idea that the more we remember, the more creative we may be, simply because we have a greater amount of “material” to work with. If this is the case, what does it mean that many of us are using extremely efficient and fast technologies to “outsource” our memory for all sorts of things – from trivia, schedules, and dates to important facts and things we want to learn? What does this mean in terms of our potential for creativity and learning, if anything? What are the pros and cons?

I was fascinated by the themes – maybe memes? – that emerged in my dialogue with other bloggers (or blog readers). I want to think through two of them here. I am TOTALLY sure that I wouldn’t have developed and thought through these issues to the same degree – that is, my creativity would have been reduced – without these digital exchanges. Thanks, All.

Picture taken from a blog post by Carolyn Keen on Donna Haraway’s Cyborg Manifesto

Are We Becoming Cyborgs? The consensus is that – according to most definitions – we already are. A cyborg (short for cybernetic organism) is a being that enhances its own abilities via technology. In fiction, cyborgs are portrayed as a synthesis of organic and synthetic parts. But this organic-synthetic integration is not necessary to meet the criteria for a cyborg. Anyone who uses technology to do something we humans already do, but in an enhanced way, is a cyborg. If you can’t function as well once your devices are gone (say, if you leave your smartphone at home), then you’re probably a cyborg.

A lot of people are interested in this concept. On an interesting blog called Cyborgology they write: “Today, the reality is that both the digital and the material constantly augment one another to create a social landscape ripe for new ideas. As Cyborgologists, we consider both the promise and the perils of living in constant contact with technology.”

Yet, on the whole, comments on my post last week were not made in this spirit of excitement and promise – rather, there was concern and worry that by augmenting our abilities via technology, we will become dependent because we will “use it or lose it.” That is, if we stop using our brain to do certain things, these abilities will atrophy (along with our brain?). I think that’s the unspoken (and spoken) hypothesis and feeling.

Indeed, when talking about the possibility of being a cyborg, the word scary was used by several people. I myself, almost automatically, have visions of Borg and Daleks (look them up, non-sci-fi geeks ;-)) and devices for data streaming implanted in our brains. Those of us partial to future dystopias might be picturing eXistenZ – a Cronenberg film about a world in which we’re all “jacked in” to virtual worlds via our brain stems. The Wikipedia entry describes it best: “organic virtual reality game consoles known as ‘game pods’ have replaced electronic ones. The pods are attached to ‘bio-ports’, outlets inserted at players’ spines, through umbilical cords.” Ok, yuck.

There was an article  last month on memory and the notion of the cyborg (thank you for bringing it to my attention, Wendy Edsall-Kerwin). Here, not only is the notion of technological augmentation brought up, but the notion of the crowdsourced self is discussed. This is, I believe, integral to the notion of being a cyborg in this particular moment in history. There is a great quote in the article, from Joseph Tranquillo, discussing the ways in which social networking sites allow others to contribute to the construction of a sense of self: “This is the crowd-sourced self. As the viewer changes, so does the collective construction of ‘you.’”

This suggests that, not only are we augmenting ourselves all the time via technology, but we are defining ourselves in powerful ways through the massively social nature of online life. This must have costs and benefits that we are beginning to grasp only dimly.

I don’t have a problem with being a cyborg. I love it in many ways (as long as I don’t get a bio-port inserted into my spine). But I also think that whether a technological augmentation is analog or digital, we need to PAY ATTENTION and not just ease into our new social landscape like a warm bath, like technophiles in love with the next cool thing. We need to think about what being a cyborg means, for good and for bad. We need to make sure we are using technology as a tool, and not being a tool of the next gadget and ad campaign.

The Second Brain. This got a lot of play in our dialogue on the blog. This is the obvious one we think about when we think about memory and technology – we’re using technological devices as a second brain in which to store memories to which we don’t want to devote our mental resources.

But this is far from a straightforward idea. For example, how do we sift through what is helpful for us to remember and what is helpful for us to outsource to storage devices? Is it just the trivia that should be outsourced? Should important things be outsourced if I don’t need to know them often? Say for example, I’m teaching a class and I find it hard to remember names. To actually remember these names, I have to make a real effort and use mnemonic devices. I’m probably worse at this now than I was 10 years ago because of the increased frequency with which I DON’T remember things in my brain now. So, given the effort it will take, and the fact that names can just be kept in a database, should I even BOTHER to remember my students’ names? Is it impersonal not to do so? Although these are relatively simple questions, they raise, for me, ethical issues about what being a professor means, what relating to students means, and how connected I am to them via my memory for something as simple as a name. Even this prosaic example illustrates how memory is far from morally neutral.

Another question raised was whether these changes could affect our evolution. Thomaslongmire.wordpress.com asked:  “If technology is changing the way that we think and store information, what do you think the potential could be? How could our minds and our memories work after the next evolutionary jump?”

I’ll take an idea from andylogan.wordpress.com as a starting point – he alluded to future training in which we learn how to allocate our memory, prioritize maybe. So, perhaps in the future, we’ll just become extremely efficient and focused “rememberers.” Perhaps we will also start to use our memory mainly for those types of things that can’t be easily encoded in digital format – things like emotionally-evocative memories. Facts are easy to outsource to digital devices, but the full, rich human experience is very difficult to encode in anything other than our marvelous human brains. So if we focus on these types of memories, maybe they will become incredibly sophisticated and important to us – even more so than now. Perhaps we’ll make a practice of remembering those special human moments with extreme detail and mindfulness, and we’ll become very, very good at it. Or, on the other hand, perhaps we would hire “Johnny Mnemonics” to do the remembering for us.

But a fundamental question here is whether there is something unique about this particular technological revolution. How is it different, say, than the advent of human writing over 5,000 years ago? The time-scale we’re talking about is not even a blink in evolutionary time. Have we even seen the evolutionary implications of the shift from an oral to a written memory culture?  I believe there is something unique about the nature of how we interact with technology – it is multi-modal, attention grabbing, and biologically rewarding (yes, it is!) in a way that writing just isn’t. But we have to push ourselves to articulate these differences, and seek to understand them, and not succumb to a doom and gloom forecast. A recent series of posts on the dailydoug  does a beautiful job of this.

Certainly, many can’t see a downside and instead emphasize the fantastic opportunities that a second brain affords us; or at least make the point, as robertsweetman.wordpress.com does, that “The internet/electronic memory ship is already sailing.”

So, where does that leave us? Ty Maxey wonders if all this is just leading to a technolobotomy – provocative term! –  but I wonder if instead we have an amazing opportunity to take these technological advances as a testing ground for us to figure out as a society what we value about those capacities that, for many of us, are what we think make us uniquely human.