
Communications of the ACM

Viewpoint

Excessive Use of Technology: Can Tech Providers be the Culprits?


[Image: young man and woman looking at smartphones. Credit: Syda Productions]

The influx of hedonic online services (including video streaming, social media, and video games) has created fierce competition for people's attention, in what is termed the "attention economy," in which every minute of attention and engagement tech companies can "squeeze" out of users counts. To compete in this environment, tech companies, intentionally or unintentionally, have adopted practices that capitalize on features of human decision making and brain physiology to cultivate automatic, uninterrupted use.4

There is a body of evidence—growing yet debated—suggesting that when some technologies are used excessively, the use can interfere with normal functioning, such as sleep, physical activity, and school performance.12 What's more, populations such as children and adolescents may be especially susceptible to excessive use,2 although age-related prevalence issues have not always been made clear. We say the evidence is debated because some studies suggest that excessive use may be related to prior mental illness rather than to the technology itself.6 Consequently, some scholarly groups have criticized the concept of "technology addiction."1 We therefore use the term "excessive use," which reflects use patterns that infringe on the normal functioning of users.5

The role of tech companies (mostly hedonic online service providers and app developers) in excessive use is an issue that merits further discussion and research. The issue is timely, given the tendency to blame tech providers for many ills in our society (for example, violence and radicalization on social media, or the role of artificial intelligence (AI) in job displacement and reduced human agency). In the case of excessive use, it is often assumed that responsibility lies solely with users: they should have controlled their use. This is akin to blaming a speeding driver; if caught, most people will agree the fault is purely the driver's, and not the car manufacturer's for building a car that affords speeding. This simplistic, one-sided view, however, has been losing ground in recent years. For example, the use of loot boxes in video games has been equated with gambling, which has prompted debate about the need to regulate such mechanisms.3 Similarly, a recent U.S. Senate bill proposes that social media providers take some responsibility for excessive use and remove psychological mechanisms that reduce people's self-control over their use.9

In this Viewpoint, we seek to take first steps toward discussing the responsibility of tech providers for excessive use. Initiating this discussion is important because it can serve as a basis for more informed use practices and interventions.


What Makes Technology Use Excessive?

Excessive use of technologies is not measured by use frequency or time, because what is excessive for one person or in one situation may be normal, unharmful, or even beneficial for another person or in another situation. For example, spending five hours a day on social media may benefit an unemployed job seeker, but may become excessive when this person starts working. As such, the excessiveness of technology use is typically captured by a range of persistent negative symptoms involving interference with other life responsibilities. Given that there are no agreed-upon criteria, reported prevalence rates of excessive use range from 1% to over 17%.11 The high numbers may result from false positives (that is, identifying individuals as excessive users when they are experiencing mundane symptoms).
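To see how false positives alone can stretch estimates toward the upper end of this range, consider a back-of-the-envelope calculation. All numbers below are our own illustrative assumptions, not estimates from the literature:

```python
# Illustrative arithmetic (assumed numbers): even a rare condition,
# screened with imperfect specificity, yields a high measured prevalence.
true_prevalence = 0.02  # assumed true rate of genuinely impairing use
sensitivity = 0.90      # assumed P(screens positive | truly excessive)
specificity = 0.85      # assumed P(screens negative | not excessive)

measured = (true_prevalence * sensitivity
            + (1 - true_prevalence) * (1 - specificity))
print(f"measured prevalence: {measured:.1%}")  # ~16.5%
```

Under these assumed numbers, the measured rate approaches the 17% upper bound, and most "positives" are people flagged for mundane symptoms rather than genuine impairment.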


Motivation for Excess

If excessive use of technology is characterized by persistently hurting other life domains, why would rational people engage in such behavior? In part, this may relate to humans' limited ability to control very tempting behaviors,8 particularly in times of strain. This explanation is based on dual-system theory, according to which some people have a hyperactive reward-processing system that creates strong motivations to engage in tempting behaviors, and in some cases also have hypoactive self-control faculties that fail to engage the "brakes." The use of many personal, hedonic technologies routinely activates the reward faculties in the brain, which makes these technologies susceptible to excessive consumption.


While this is also true of many other routine fun activities, such as eating and shopping, hedonic technologies are unique in that they can be consumed nearly anytime and anywhere, with relative assumed privacy. This has been afforded by the advent of smartphones and ubiquitous high-speed data access (at least in the U.S.). That is, while rewarding behaviors such as eating may be equally or more rewarding than technology use, they typically cannot be performed as routinely. In addition, many hedonic technologies afford socialization with large groups beyond the physical reach of users. This can be a highly rewarding facet, and one that typically cannot be afforded to the same extent by other rewarding activities. Whether these differences are meaningful or trivial remains a question for future research.

Both nature and nurture affect difficulties in moderating fun activities. Regarding the nurture component, many scholars argue it is driven by the way modern technologies are designed. Tech companies fight for their survival by trying to accumulate use time and engagement, which often translate into increased in-app purchases or advertising revenues.4 Some worry that they deliberately use mechanisms that promote repeated, automatic, tempting behavior through variable reward schedules7 and by making behaviors easy and automatic.9 Rewarding behaviors produce behavior-reward associations in people's brains, which lead to behavior seeking and reenactment, especially when rewards are obtained on a variable schedule.4 Tech companies have mastered the delivery of variable rewards. For example, the schedule of "likes" on social media posts is variable, and the contents of loot boxes in video games are likewise variable.3
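To make the distinction concrete, the following minimal sketch (with illustrative parameters of our choosing) contrasts a fixed-ratio schedule, where a reward arrives after every fifth action, with the variable-ratio schedule described above, where each action is rewarded with some fixed probability and the gap between rewards is therefore unpredictable:

```python
import random

def fixed_ratio(n_actions, every=5):
    """Fixed-ratio schedule: a reward after every `every`-th action."""
    return [(i + 1) % every == 0 for i in range(n_actions)]

def variable_ratio(n_actions, p=0.2, seed=42):
    """Variable-ratio schedule: each action is rewarded with probability p,
    so rewards arrive after an unpredictable number of actions -- the
    pattern behind loot-box wins and the arrival of "likes"."""
    rng = random.Random(seed)
    return [rng.random() < p for _ in range(n_actions)]

# Both schedules deliver roughly one reward per five actions on average,
# but only the fixed schedule is predictable to the user.
print("fixed:   ", "".join("R" if r else "." for r in fixed_ratio(30)))
print("variable:", "".join("R" if r else "." for r in variable_ratio(30)))
```

The behavioral literature cited above suggests it is the unpredictable pattern that most strongly sustains checking and reenactment.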

That said, much of this narrative is speculative. Almost certainly, tech companies attempt to develop ways to keep users engaged, although the degree to which such mechanisms are harmful remains hotly contested. The proliferation of modern technology has not been linked to a visible epidemic or upswing of "addicted" individuals in the way that irresponsible prescribing of opioids led to an opioid epidemic in the U.S. This does not absolve technology companies of a role in protecting their struggling customers or preventing vulnerable customers from becoming excessive users. However, we argue that narratives that are overly hostile to tech companies, that portray them as a primary source of overuse problems, or that ascribe sinister intentions to them are likely less than helpful. In part, this may be because technology overuse may sometimes be symptomatic of other issues.6


Are Tech Companies' Practices Ethical?

It is not uncommon to hear activists claim that scientists are hired by technology companies to make technology purposefully addictive. Engaging AI to choose and present content (for example, on a social media feed) that will overly engage users has also been blamed for causing excessive use. However, evidence for such claims is still lacking. Such concerns also appear to confuse addiction (a pathological state) with engagement (a state of continued, enjoyed use with no significant impairment). That said, engagement mechanisms may still overshoot, pushing some users from engagement into excessive use. One useful test of ethics in this context is whether tech companies act like drug dealers, in that they manipulate people into using their products, their products are harmful, and they themselves do not use their products.5 While there is a trend in Silicon Valley for some tech executives to send their kids to tech-free schools,12 it does not seem that tech executives avoid using their own products. The evidence regarding the harmfulness of technology is also not conclusive, and does not apply to all users. Hence, on its face, it seems that tech companies pass at least some aspects of this ethicality test; yet the personal choices of their personnel do signal some worry about the potentially harmful nature of technology, at least for young children.
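For concreteness, the kind of mechanism being debated can be sketched as a simple bandit-style feed ranker that learns which content keeps a given user interacting. We stress this is a hypothetical illustration of the idea; the category names, rates, and algorithm are our assumptions, not a description of any provider's actual system:

```python
import random

def pick(categories, clicks, shows, rng, epsilon=0.1):
    """Epsilon-greedy choice: mostly show the category with the best
    observed engagement rate, occasionally explore another one."""
    if rng.random() < epsilon:
        return rng.choice(categories)
    return max(categories, key=lambda c: clicks[c] / max(shows[c], 1))

categories = ["videos", "memes", "news"]       # hypothetical content types
clicks = dict.fromkeys(categories, 0)
shows = dict.fromkeys(categories, 0)
true_rate = {"videos": 0.5, "memes": 0.3, "news": 0.1}  # simulated user taste

rng = random.Random(1)
for _ in range(1000):
    c = pick(categories, clicks, shows, rng)
    shows[c] += 1
    clicks[c] += rng.random() < true_rate[c]   # did the user engage?

print(shows)  # the feed drifts toward whatever this user engages with most
```

Nothing in such a loop distinguishes healthy engagement from excessive use; that distinction, and who is responsible for drawing it, is exactly what is contested.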

There also appears to be little consensus regarding the ethical ramifications of scientists' involvement with technology companies and/or the use of AI to increase engagement. Certainly, were scientists to knowingly engage in actions they believed might be harmful to consumers, common ethics principles would be violated. However, there does not appear to be current evidence to support such claims. On its face, this involvement does not seem to differ much from engaging food scientists to develop tastier foods. One can ask, in this case, whether scientists who added sugar to food while ignoring the implications (such as obesity and tooth decay) acted ethically. This is of course not an easily resolved issue, but it should be discussed so that we avoid moral panic while ensuring that users who need our help and protection receive it.


Recommendations

One thing that is clear is that there is a need for further research to clarify concepts related to excessive use of technology. First, it would be welcome to establish whether excessive use behaviors constitute a unique diagnosis or are better conceptualized as risk markers, symptoms, or red flags of established mental health disorders. Second, current conceptualizations of excessive use tend to rely on symptom profiles adapted from substance abuse. Critiques of this method suggest it may be too easy to meet "addiction" criteria as applied to technology use (for example, most people will feel some discomfort/withdrawal when prevented from using their smartphones, but this "withdrawal" is not comparable to the physical withdrawal felt by people who quit substances). Research on symptom sensitivity and specificity is therefore needed. Third, it would be important to consider whether excessive technology use is distinct from overuse of non-tech behaviors such as shopping. If not, it may be of greater utility to consider an overarching behavioral overuse disorder category that could be applied to any behavior, rather than many microdiagnoses focused on specific behaviors.


Without greater research clarity, it is unclear what ethical advice to give to scientists working with technology companies. We note that knowingly developing technology (for example, algorithms, AI) that would reasonably be expected to lead to excessive use among vulnerable individuals would certainly be unethical. However, we feel that blanket prohibitions against scientists working with technology companies, including on non-pathological engagement, are not yet warranted. What is needed, as a first step, is much greater transparency and scrutiny of funding arrangements and potential conflicts of interest for computer and social scientists working with tech providers. Take, for example, the Cambridge Analytica scandal, which grew out of an unscrutinized collaboration between academics and industry. Hopefully, with further research, we will have greater clarity on these ethical issues and better insights into best practices for academia-industry collaboration. In the meantime, technology companies can help by making their considerable anonymized user data openly available to scholars, without restrictions tied to the favorability of scholarly findings for those companies. They should also meet our concerns with open ears and minds. Academics, for now, can simply employ an ethical mindset when getting involved in projects that may support excessive use.


References

1. American Psychological Association Society for Media Psychology and Technology, and Psychological Society of Ireland Special Interest Group in Media, the Arts and Cyberpsychology. An Official Division 46 Statement on the WHO Proposal to Include Gaming Related Disorders in ICD-11. The Society for Media Psychology and Technology, Division 46 of the American Psychological Association, 2018.

2. Cerniglia, L. et al. Internet addiction in adolescence: Neurobiological, psychosocial and clinical issues. Neuroscience & Biobehavioral Reviews 76 (2017), 174–184.

3. Drummond, A. and Sauer, J.D. Video game loot boxes are psychologically akin to gambling. Nature Human Behaviour 2, 8 (2018), 530.

4. Eyal, N. and Hoover, R. Hooked: How to Build Habit Forming Products. Portfolio Hardcover, New York, NY, 2014.

5. He, Q., Turel, O. and Bechara, A. Association of excessive social media use with abnormal white matter integrity of the corpus callosum. Psychiatry Research: Neuroimaging 278 (2018), 42–47.

6. Jeong, E.J., Ferguson, C.J., and Lee, S.J. Pathological gaming in young adolescents: A longitudinal study focused on academic stress and self-control in South Korea. Journal of Youth and Adolescence (2019).

7. Karlsen, F. Entrapment and near miss: A comparative analysis of psycho-structural elements in gambling games and massively multiplayer online role-playing games. International Journal of Mental Health and Addiction 9, 2 (2011), 193–207.

8. Osatuyi, B. and Turel, O. Tug of war between social self-regulation and habit: Explaining the experience of momentary social media addiction symptoms. Computers in Human Behavior 85 (2018), 95–105.

9. Social Media Addiction Reduction Technology (SMART) Act, LYN19429, 2019, 1–14.

10. Tarafdar, M. et al. The dark side of information technology. MIT Sloan Management Review 56, 2 (2015), 600–623.

11. Turel, O. Potential 'dark sides' of leisure technology use in youth. Commun. ACM 62, 3 (Mar. 2019), 24–27.

12. Weller, C. Silicon Valley parents are raising their kids tech-free—and it should be a red flag. Business Insider, 2018.


Authors

Ofir Turel ([email protected]) is a Professor of Information Systems at California State University, Fullerton, CA, USA.

Christopher Ferguson ([email protected]) is a Professor of Psychology at Stetson University in DeLand, FL, USA.


Copyright held by authors.
Request permission to (re)publish from the owner/author

The Digital Library is published by the Association for Computing Machinery. Copyright © 2021 ACM, Inc.


 
