The American Academy of Clinical Neuropsychology (AACN) has its own Facebook page, which is ironic since I wrote a short comment on social media and the brain for the AACN page, coming soon to:
A number of websites have recently posted articles on the impending doom that awaits us due to society’s use of social media and the Internet (see several listed below). When two popular social issues collide (mental health and social media), there’s a journalistic feeding frenzy. And frankly, at the rate new studies on brain functioning come out, there’s a constant stream of comment-worthy neuroscience news to follow.
The recent Huffington Post story listed five ways our use of the Internet and social media is “changing the brain”:
- Gamers show behaviors similar to gambling/drug addicts;
- Facebook makes you depressed;
- Teens who spend a lot of time on the internet are at greater risk for self-harm;
- Information processing is negatively impacted by Internet use…
- Or is cognitive processing improved with Internet stimulation?
Reasonable cautions for interpretation of such research are as follows:
Video games are no more finely tuned to create addiction than other addictive products, such as gambling, alcohol, and tobacco; proneness to addictive behavior is not a new phenomenon. Studies have found no ill effects from supervised, limited use of video games in situations where there is not already a great deal of psychological disturbance, and some have even suggested cognitive benefits. The same applies to Internet use more broadly: excessive use has been associated with problems, while structured, targeted use may improve skills, at least temporarily, and within the narrow range of the specific task being practiced.
As for Facebook, people with heavier use of social media, or the Internet in general, may report greater unhappiness and loneliness. But spending more time online may reflect life circumstances and stressors that result in one’s withdrawal, not the other way around. Furthermore, there are plenty of studies that do not support the theory that social media use increases loneliness.
In research, an oft-repeated mantra is “correlation does not imply causation.” In “The Signal and the Noise,” Nate Silver describes seasonal increases in ice cream sales and forest fires as an example: the two are correlated, but buying ice cream certainly does not trigger fires; both simply rise with summer heat.
As for the Facebook study detailed in Time magazine, it found that heavy Facebook users show greater activity in an area of the brain that processes social reward. That is a correlation: it does not mean Facebook use changes brains or induces addiction. More simply: which came first, the chicken or the egg? In any case, the wickedly smart “Neuroskeptic” blog on discovermagazine.com notes that everything changes brain activity (reading, thinking, walking, smelling, seeing, etc.). The real question is whether something changes brain activity or structure longer-term, or permanently.
The interesting aspect of the Facebook “addiction” study is the point that intermittent reinforcement strengthens behaviors. An intermittent schedule of reward increases the likelihood of a behavior and makes it harder to extinguish. Think of children who escalate tantrums, having learned that if they flail wildly enough, the parent will eventually be mortified or exhausted enough to relent and allow the desired reward.
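One back-of-the-envelope way to see why intermittent schedules resist extinction: if rewards arrive only one time in five, long unrewarded stretches are perfectly normal, so many more failures must pile up before “the rewards have stopped” is distinguishable from ordinary bad luck. The sketch below (my own simplification, not anything from the study) asks how long a run of unrewarded trials would have to be before it became statistically surprising under the old schedule.

```python
import math

def failures_to_notice_extinction(p_reward: float, alpha: float = 0.05) -> int:
    """Smallest run of consecutive unrewarded trials that would occur with
    probability < alpha if the schedule were still paying out at p_reward.
    A crude stand-in for when the behavior 'should' extinguish."""
    if p_reward >= 1.0:
        return 1  # under continuous reinforcement, any failure is informative
    # Probability of n straight failures is (1 - p)^n; solve (1 - p)^n < alpha.
    return math.ceil(math.log(alpha) / math.log(1.0 - p_reward))

print(failures_to_notice_extinction(1.0))   # continuous reinforcement -> 1
print(failures_to_notice_extinction(0.2))   # 1-in-5 schedule -> 14
print(failures_to_notice_extinction(0.05))  # rare 'likes' -> 59
```

The rarer and more unpredictable the payoff, the longer the behavior persists after the payoffs quietly stop, which is exactly the tantrum (and slot-machine) dynamic.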
So people who get Facebook ‘likes’ may well respond with more Facebook activity to gain more ‘likes,’ or social reward, especially if they are “reward-dependent” types to begin with. Intermittent reinforcement is not only about addictive or pathological behavior: it’s also an important aspect of risk-taking and decision-making in our everyday lives. After all, not everyone can be like Mr. Spock (“All I know is logic”).
For the record, I think a very valid question is whether, or how, ubiquitous use of Internet and smartphone technology will eventually affect our hardwiring as human beings. For instance, people now routinely shoot video of events: your child skateboarding, a famous person walking by, heavy rain, you name it. So not only is there a division of attentional resources (making sure your camera is working right versus focusing on the action at hand), but there is potentially an emotional disconnect: are you less invested in being ‘in the moment’ if you know events are being recorded for instant and constant review? And if so, how does this impact the complex system we’ve evolved for encoding and retrieval of new information? There is no way to answer that for now.
Another example: does anyone besides me love the movie “Strange Days” with Ralph Fiennes? A neat part of the futuristic movie involves the black-market sale of recorded events in which the viewer can immerse himself for a real-time, visceral experience (e.g., running on the beach, or naughtier things). One point was that memories are supposed to fade; human beings can’t evolve if they are trapped in a loop of reliving the past. Well, now that we can keep a video record of the minutiae of our lives using smartphones, is there a knock-on effect? And if so, is it an individual, temporary effect? Or, over generations, will it become the norm, so that humans allocate less neuronal energy to encoding and retrieval, and more to simultaneous processing?