Keep Your Friends Close…: Technology and the Politics of Fear

Speaking at the Personal Democracy Forum (PDF) 2016 was one of those paradigm-shifting conference experiences for me. Before PDF, I tended to hear technophilic, almost Pollyannaish narratives about how technology can make our lives, and our civic lives, better. I was clearly behind the times, because I now see the narrative shifting and morphing into a much more challenging, questioning viewpoint that might best be described by the saying “keep your friends close, but keep your enemies closer.”

In almost every talk I heard, technology and the digital economy were described as a double-edged sword: a way to ignite change, but one with high potential costs, and full of booby traps. Those who create technology? A mixed bag at best. Anil Dash didn’t mince words when he called the technocrats and Silicon Valley billionaires liars and the new robber barons. Kentaro Toyama compared the digital economy to The Matrix, in which our personal data is the lifeblood of those same Silicon Valley billionaire evil robot overlords.

I have to admit that I take grim pleasure in the aptness of these metaphors, and have uttered identical words myself. However, it is also clear that these ideas are polarizing and, like extremism in politics, privilege emotions above logic to drive more fractious and divisive discourse. Luna Malbroux’s hilarious talk about “EquiTable,” a faux app she developed to create dialogue about social justice and equity, is a nice example of how to break away from bitter recriminations and instead use humor as a powerful weapon for change.

But if technology is a very sharp double-edged sword, how do we wield it without cutting ourselves? How do we, as Yvette Alberdingk Thijm described in her talk about using technology as civic witnesses, harness technology for good without allowing others to use it against us?

Keep your friends close…

PDF yielded many ideas and solutions. I mention only a few below (including mine). I was particularly interested in those ideas and solutions demanding that technology serve humanistic goals and that the well-being of people be part and parcel of how we design and build technology. To do this, we have to open our eyes and take a cold, hard look at how our romance with technology has caused us to take our hands off the wheel (no pun with driverless cars intended).

My talk (text can be found here) centered on technology and mental health. I argued that the psychological and emotional nature of the tech we build is not peripheral or ancillary – it is fundamental to shaping how we use tech for healing. Right now, technology and digital culture are precisely and relentlessly designed to hijack our attention and our emotional brains for the economic benefit of their creators – this is the basis of the attention economy. To gather, mine, and sell our personal data, technology needs to be addictive, keeping us looking, clicking, buying, swiping, checking, clutching our devices, eyeballs on the screen, hoping to hear the next best thing, to feel connected, soothed, and understood. This is counter to health promotion, and creates imbalance instead of balance, weakness instead of strength. The notion that technology is designed to hijack our brains was beautifully and compellingly described in a blog post by Tristan Harris just a few days after PDF.

I ended my talk with a call to action: we must reclaim the technology culture to serve and amplify humanity and well-being, rather than serve the attention economy. We must anchor this new culture in key values, including the value that our attention is sacred and valuable, not just the coin of the realm. We must own and be responsible for how we spend our precious attention.

Sherry Turkle observed how our excitement over the rapid pace of technological advances makes us forget some fundamental, common-sense things we know about life. For example, after research suggesting that self-reported declines in empathy among millennials could be caused by growing use of social media and digital communication, one researcher’s solution was to build an “empathy app.” Why would we ever think that technology could make us more empathic – that the thing that might have caused declines in empathy could also be the solution? Dr. Turkle described how many aspects of digital technology actually allow us to hide from the challenges of feeling and expressing emotions in our relationships, to “sidestep physical presence” and seek “frictionless relationships.” The solution? We need to reclaim common sense and realize that, as Dr. Turkle quipped, we are the empathy app.

danah boyd called our attention to the immense ethical disconnect in how the digital infrastructure of our civic lives – code – is constructed. This is an industry in “perpetual beta,” and thus there are few if any standards, audits, or inspections of code. There is also little consideration of the resources required to maintain the immense glut of data generated every day, and little awareness of how bias and inaccuracy are built into data analytics. These questions are of the utmost importance because an increasing number of decisions in our personal and civic lives are being made by algorithms and digital profiling. She exhorted us to be careful about how and what we code.

…but keep your enemies closer

As in everything, knowledge is power. I felt that we at PDF – speakers, participants, and audience alike – implicitly but universally agreed to keep our eyes open, to look our crush, technology, in the face and see that she may not be on our side anymore, but to hope that it’s not too late. Technology is empowering, BUT… We all agreed to spend more time on the “buts,” as well as on when, how, and under what conditions we can reclaim technology for humanity. In his PDF talk, Kentaro Toyama evoked the great Isaac Asimov and the First Law of Robotics from Asimov’s “I, Robot” (“A robot may not injure a human being or, through inaction, allow a human being to come to harm”). In Asimov’s universe, the powers of technology are, at their fundamental core, designed and harnessed for the benefit of people. I believe that we must and can insist that our technology conform to this higher standard, and that with this as a guiding light, we can wield the double-edged sword of technology for more good than ill.

The Future of Medicine is in Your Smartphone

Picture by Helen Weinstein

A great essay from the Wall Street Journal on the promise and challenges of the smartphone revolution in healthcare – from mobile physical exams, to merging day-to-day health data from wearables with medical records. A key – and underdeveloped – innovation here will be to integrate health tracking with mobile therapies. This transformation of healthcare – both physical and mental – is going to happen, and it is up to us, as patients and professionals, to make sure that it is done right, with the privacy and well-being of the individual as top priorities.

I’ve been interested in some emerging companies, like Mana Health, that are on the cutting edge of this revolution because they are solving the problem of how to effectively merge clinical data with health data collected in the daily lives of patients, directly empowering patients to have a clear voice in their healthcare and to collaborate more closely with their doctors.

The Body-Data Craze

I’ve been working on a post about the Quantified Self movement. To set the stage for that, I thought I’d post this thought-provoking article from Newsweek.

Today, I’ve been on the phone four times, for an average of 24 minutes a call. My last phone call was 22 minutes and 23 seconds long, according to the digital time display on my landline. It took me exactly 45 minutes and 10 seconds on the train to reach Brooklyn the other night: I counted the seconds off on my smartphone. My average mile when I ran a 5K yesterday was 8 minutes and 45 seconds, according to my pedometer. (Nothing to boast about, I know.) As I was on deadline for this piece, I walked only 4,000 steps, not the advised 10,000. I know I am exactly 45 percent through my friend’s excellent nonfiction book thanks to Kindle (in the past you could only have estimated that you’d read more than half). I am able to hold my plank at the gym for 54 seconds rather than the minute I always thought I could, which I know thanks to my phone’s stopwatch. My optimal sleep time is seven hours and 20 minutes, and I wake up twice a night: I discovered that from a wristband that measures sleep duration and intensity. I now know for certain what before I only assumed: I always sleep lightly unless I take an Ambien.