Review

I investigate whether the attention span of individual humans has been falling over the last two decades (prompted by curiosity about whether the introduction of the internet may be harmful to cognitive performance). I find little direct work on the topic, despite its wide appeal. Reviewing related research indicates that individual attention spans might indeed have been declining.

Have Attention Spans Been Declining?

In what might be just age-old ephebiphobia, claims have been raised that individual attention spans have been declining—not just among adolescents, but among the general population. If so, this would be quite worrying: much of the economy in industrialized societies consists of knowledge work, and knowledge work depends on attention to the task at hand; switching between tasks too often might prevent progress on complicated and difficult problems.

I became interested in the topic after seeing several claims that e.g. Generation Z allegedly has lower attention spans and that clickbait has disastrous impacts on civilizational sanity, observing myself and how I struggled to get any work done when connected to the internet, and hearing reports from others, online and in person, of the same problem. I was finally convinced to actually investigate™ the topic after making a comment on LessWrong asking the question and receiving a surprisingly large number of upvotes.

The exact question being asked is:

"Have the attention spans of individuals on neutral tasks (that is, tasks that are not specifically intended to be stimulating) declined from 2000 to the present?"

(One might also formulate it as "Is there an equivalent of the “Reversed Flynn Effect” for attention span?") I am not particularly wedded to the specific timeframe, though the worries mentioned above assert that the change has become most stark during the last decade or so, attributing it to widespread social media/smartphone/internet usage. Data from before 2000 or from just the aughts would be less interesting. The near-global COVID-19 lockdowns could provide an especially enlightening natural experiment: did social media usage increase (my guess: yes), and if so, did attention spans decrease at the same time or with a lag (my guess: also yes)? But I don't think anyone both has the data on that and wants to share it.

Ideally, I want to have experiments from ~2000 up to 2019: close enough to the present to see whether there is a downward trend (a bit more than a decade after the introduction of the iPhone in 2007), but before the COVID-19 pandemic, which might be a huge confounder or might just have accelerated existing trends (which we can probably check in another 2 years).

I am mostly interested in the attention span of individual humans and not groups: Lorenz-Spreen et al. 2019 investigate the development of a construct they call "collective attention" (and indeed find a decline), but that seems less economically relevant than individual attention span. I am also far less interested in self-perception of attention span, give me data from a proper power- or speed-test, cowards!

So the question I am asking is not any of the following:

  • "Does more social media/internet usage cause decreased attention spans?"
  • "Does more social media/internet usage correlate with decreased attention spans?"
  • "Does more social media/internet usage correlate with people reporting having shorter attention spans?"
  • "Did collective attention spans decrease?"
  • "Are people on average spending less time on webpages than they used to?"

How Is Attention Span Defined?

Attention is generally divided into three distinct categories: sustained attention, which is the consistent focus on a specific task or piece of information over time (Wikipedia states that the span for sustained attention has a leprechaun figure of 10 minutes floating around, elaborated on in Wilson & Korn 2007); selective attention, which is the ability to resist distractions while focusing on important information while performing a task (the thing trained during mindfulness meditation); and alternating or divided attention, also known as the ability to multitask.

When asking the question "have attention spans been declining", we'd ideally want the same test measuring all three of those aspects of attention (and not just asking people about their perception via surveys), performed annually on large random samples of humans over decades, ideally with additional information such as age, sex, intelligence (or alternatively educational attainment), occupation etc. I'm personally most interested in the development of sustained attention, and less so in the development of selective attention. But I have not been able to find such research, and in fact there is apparently no agreed-upon test for measuring attention span in the first place:

She studies attention in drivers and witnesses to crime and says the idea of an "average attention span" is pretty meaningless. "It's very much task-dependent. How much attention we apply to a task will vary depending on what the task demand is."

— Simon Maybin quoting Dr. Gemma Briggs, “Busting the attention span myth”, 2017

So, similar to comas, attention span doesn't exist…sure, [super-proton things come in varieties](https://unremediatedgender.space/2019/Dec/on-the-argumentative-form-super-proton-things-tend-to-come-in-varieties/index.html "On the Argumentative Form 'Super-Proton Things Tend to Come In Varieties'"), but which varieties?? And how??? Goddamn, psychologists, do your job and don't just worship complexity.

Perhaps I should soften my tone, as this perspective appears elsewhere:

[…] Gould suggests the metaphor of a dense bush whose branches are periodically pruned by nature. This allows for parallel evolutionary sequences, some of which are adaptive and others not — at any moment in time only the tips of adaptive branches are in evidence, the pruned ones cannot be seen. Thus rather than being direct descendants of primitive hominids, for example, humankind would have evolved along a separate but parallel line from other primates.
Might the ontogeny of selective attention recapitulate this theme? That is, rather than selective attention comprising a single construct with a fixed ontogenic plan, might it be better conceptualized as a multidimensional construct with separate, parallel developmental trajectories for different components. To carry the analogy still further, might the specific developmental progression for a particular component of selective attention be determined by the adaptive fit of that component with the individual's ‘environmental press’? Although such a conjecture rekindles the tenet of ontogeny recapitulates phylogeny long since abandoned in physiological development (e.g., Dixon and Lerner, 1985), we suggest that it may nonetheless provide an overarching framework within which to cast life-span research and theory on the development of selective attention.

— Plude et al., “The development of selective attention: A life-span overview” p. 31, 1994

How Do We Measure Attention Span?

One of my hopes was that there is a canonical and well-established (and therefore, ah, tested) test for attention span (or just attention) à la the IQ test for g: If so, I would be able to laboriously go through the literature on attention, extract the individual measurements (and maybe even acquire some datasets) and perform a meta-analysis.

Continuous Performance Tests

For measuring sustained and selective attention, I found the family of continuous performance tests, including the Integrated Visual and Auditory CPT (IVA-2), the Test of Variables of Attention (T.O.V.A.), Conners' CPT-III, the gradCPT and the QbTest, some of which are described here. These tests usually contain two parts: a part with low stimulation and rare changes of stimuli, which tests for lack of attention, and a part with high stimulation and numerous changes of stimuli, which tests for impulsivity/self-control.

Those tests usually report four different scores:

  1. Correct detection: This indicates the number of times the client responded to the target stimulus. Higher rates of correct detections indicate better attentional capacity.
  2. Reaction times: This measures the amount of time between the presentation of the stimulus and the client's response.
  3. Omission errors: This indicates the number of times the target was presented, but the client did not respond/click the mouse. High omission rates indicate that the subject is either not paying attention (distractibility) to stimuli or has a sluggish response.
  4. Commission errors: This score indicates the number of times the client responded but no target was presented. A fast reaction time combined with a high commission error rate points to difficulties with impulsivity. A slow reaction time with high commission and omission errors indicates inattention in general.
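
To make the four scores above concrete, here is a minimal sketch of how they could be computed from per-trial data. The `Trial` record and `score_cpt` function are my own illustrative constructions, not the scoring procedure of IVA-2, T.O.V.A., or any other commercial CPT:

```python
from dataclasses import dataclass
from statistics import mean
from typing import List, Optional

@dataclass
class Trial:
    is_target: bool                 # was the target stimulus shown on this trial?
    responded: bool                 # did the subject respond (e.g. press the button)?
    reaction_time: Optional[float]  # seconds from stimulus onset to response, if any

def score_cpt(trials: List[Trial]) -> dict:
    targets = [t for t in trials if t.is_target]
    non_targets = [t for t in trials if not t.is_target]
    correct = [t for t in targets if t.responded]          # 1. correct detections
    omissions = [t for t in targets if not t.responded]    # 3. missed targets
    commissions = [t for t in non_targets if t.responded]  # 4. responses without a target
    rts = [t.reaction_time for t in correct if t.reaction_time is not None]
    return {
        "correct_detections": len(correct),
        "mean_reaction_time": mean(rts) if rts else float("nan"),  # 2. reaction time
        "omission_errors": len(omissions),
        "commission_errors": len(commissions),
    }

# Toy session: three target trials and two non-target trials.
session = [
    Trial(True, True, 0.42), Trial(False, False, None), Trial(True, False, None),
    Trial(False, True, 0.25), Trial(True, True, 0.51),
]
print(score_cpt(session))
# {'correct_detections': 2, 'mean_reaction_time': 0.465, 'omission_errors': 1, 'commission_errors': 1}
```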

I'm currently unsure about two crucial points:

  • How much does any CPT measure the concept we naively call attention span? The papers I've read don't refer to attention span per se, but to a general capability of sustained and selective attention.
  • Are there any time-series analyses or longitudinal studies using a CPT, or alternatively meta-analyses using data collected from existing studies? I have not been able to find any.

Other Heterogeneous Metrics

I also attempted to find a survey or review paper on attention span, but was unsuccessful in my quest, so I fell back to collecting metrics for attention span from different papers:

  • Gausby 2015
    • Three online tests (probably devised by the authors (?), since no source is given) (n≈2000 Canadians). Very little information about the exact nature of the tests.
      • Sustained attention span: "Counting the number of times respondents correctly identified an X occurring after an A."
      • Selective attention span: "Counting the number of times respondents correctly identified a change in the orientation of the rectangles"
      • Alternating attention span: "Calculating the difference in the time lapsed to perform a series of consecutive number or letter classification, compared to a mixture of number and letter classifications."
    • Neurological research: The same games/tests as above with the participants being measured with an EEG ("Results were reported as ACE (Attention, Connectivity, Encoding) scores, as well as the number of attention bursts") (n=112 Canadians)
  • Carstens et al. 2018 (n=209 American respondents to a survey)
    • Questionnaire developed by the authors based on Conners 2004 (reliability: α=0.786)
  • Wilson & Korn 2007 report several different measures of attention span during lectures: the amount of notes taken over time, observation of the students by an author of one study or two independent observers in another study, retention of material after the lecture, self-report in 5-minute intervals during the lecture, and heart rate. They also note that "Researchers use behaviors such as fidgeting, doodling, yawning, and looking around as indicators of inattentiveness (e.g., Frost, 1965; Johnstone & Percival, 1976)."
  • Plude et al. 1994 review how selective attention develops during a human life. For measuring attention, they mainly focus on studies using reaction time as a metric—the speed at which an action occurs as a result of a changing stimulus: eye movement patterns of infants, simple tests such as pressing a button on a changing (often visual) stimulus, the influence of irrelevant visual stimuli at the periphery on a task performed at the centre of the visual field, judging similarity of stimuli at various distances in the visual field, responding to a target stimulus surrounded by interfering distractor stimuli, and determining whether a visual target item is present or absent. They also mention skin conductance (measuring arousal).
    • They also mention studies investigating the time required for attentional switching in acoustic contexts: "Pearson and Lane (1991a) studied the time course of the attention-shifting process between lists and also found large age-related improvements between 8 and 11 years. Whereas 8-year-olds required more than 3.5 s to completely switch from monitoring one list to another, 11-year-olds and adults appeared to complete the switch in less than 2.5 seconds."
  • Muhammad 2020
    • Time spent on websites on average.
      • This is not an adequate metric, I believe: It would also decline if people became better at prioritising which websites are more worthy of their attention.
  • Lorenz-Spreen et al. 2019
    • Time that specific pieces of information (hashtags/n-grams/Reddit submissions &c) were popular

As it stands, I think there's a decent chance that one or several tests from the CPT family can be used as tests for attention span without much of a problem.

I don't think a separate dedicated test for attention span exists: The set of listed measures I found (apart from the CPT) appears too heterogeneous, idiosyncratic, insufficiently quantitative, and focused on slightly different things to be robustly useful for a meta-analysis.

What Are the Existing Investigations?

A lack of long-term studies means we can't tell whether attention spans have actually declined.

—Bobby Duffy & Marion Thain, “Do we have your attention” p. 5, 2022

  • Gausby 2015
    • Questions answered:
      • Sustained attention:
        • Do younger people perform worse on the sustained attention span test?, Yes (31% high sustained attention for group aged 18-34, 34% for group aged 35-54, and 35% for group aged 55+) (the methodology is wholly unclear here, though: how do we determine the group that has "high sustained attention span"? Did they perform any statistical tests? If yes, which?).
        • Do people who report more technology usage (web browsing/multi-screen usage while online/social media usage/tech adoption) perform worse on the sustained attention span test?, Yes. Light:medium:heavy usage for web browsing has 39%:33%:27% users with high sustained attention span, 36%:33%:27% for light:medium:heavy multi-screen usage, 36%:29%:23% for light:medium:heavy social media usage and 35%:31%:25% for light:medium:heavy tech adoption (though these numbers are basically not elaborated on).
      • Selective attention:
        • Do younger people perform worse on the selective attention span test? No (34% high selective attention for group aged 18-34, 30% for group aged 35-54, and 35% group aged 55+).
        • Do people with high selective attention use fewer devices at the same time? Yes (details p. 31).
      • Alternating attention:
        • Do younger people perform worse on the alternating attention span test? No (36% high alternating attention for group aged 18-34, 28% for group aged 35-54, and 36% group aged 55+).
        • Do people who report more technology usage (tech adoption/web browsing/multi-screen usage while online) perform worse on the alternating attention span test? No, they seem to perform better: Light:medium:heavy tech adoption corresponds to 31%:39%:40% having high alternating attention spans, light:medium:heavy web browsing to 29%:34%:37% and multi-screening while online to 27%:32%:37%.
        • Do people who use social media more have higher Attention/Connection/Encoding scores on EEG measurements?, Not quite: "Moderate users of social media are better at multi-tasking than lower users. But, when crossing into the top quartile of social media usage, scores plummet."
    • This is a marketing statement wearing the skinsuit of a previously great paper; it would be awesome if they released their exact methodology (tests performed, data collected, exact calculations & code written). I can smell that they actually put effort into the research: creating an actual test instead of just asking respondents about their attention spans, doing EEG measurements of over 100 people, for 3 different types of attention…come on! Just put out there what you did!
  • Carstens et al. 2018 (n=209 American respondents to a survey)
    • Questions answered:
      • Is self-reported attention span related to the number of social media accounts?, No, not statistically significant (F(2, 206)=0.1223, p>0.05) (via a one-way ANOVA)
      • Is self-reported attention span related to whether a respondent mainly uses a mobile phone or a computer?, No, not statistically significant (P(2,713)=0.923, p>0.05) (via a one-way ANOVA)
    • I do not trust this paper: it calls (what I think is) Generation Z "Generation D" without a source for the term, is clearly written in Word, and has confusing grammar (I think the authors are all Americans, so no excuse here):

Users that are older such as late adolescents and emerging adults average approximately 30-minutes daily for just Facebook that does not calculate the time spent on all social media networks

—Carstens et al., “Social Media Impact on Attention Span” p. 2, 2018

Bakardjieva and Gaden (2012) examined the field of social interaction in general to the everyday chatter of unstructured and spontaneous interactions among individuals to highly structured and regulated interaction consisting of the military or the stock exchange.

—Carstens et al., “Social Media Impact on Attention Span” p. 3, 2018

  • Muhammad 2020
    • Question answered: How much time do people spend on a website, on average?, "if you look at the trend for mobile browsing between the years 2017 and 2019 you would see that there is a drop of about 11 seconds in the average time spent on a website." and "The data suggests that the average amount of time spent on websites before navigating away for all devices has gone down by 49 seconds which is a pretty huge reduction all things considered."
    • The data is from the right timeframe (up to but not including 2020), but the linked SimilarWeb report is behind a paywall, so I can't confirm the numbers. Furthermore, the time spent on websites is a weak proxy: Perhaps people simply have become better at prioritising information sources.
  • Lorenz-Spreen et al. 2019
    • Questions answered:
      • How long does any particular hashtag stay in the group of the top 50 most used hashtags? Specifically, how has that number developed from 2013 to 2016?, "in 2013 a hashtag stayed within the top 50 for 17.5 hours on average, a number which gradually decreases to 11.9 hours in 2016", and "The average maximum popularity on one day stays relatively constant, while the average gradients in positive and negative direction become steeper over the years."
      • Do things become more popular faster over time? That is, when e.g. a movie is gaining popularity, did it take longer to become popular in 1985 than it did in 2018?, Broadly yes (the trend holds for popularity of hashtags in tweets (2013-2016)/n-grams in books (1900-2004)/number of theaters that movies were screened in (1985-2018)/topics for search queries on Google (2010-2017)/Reddit comments on posts (2010-2015)/citations of publications (1990-2015)/daily traffic for Wikipedia articles (2012-2017)). Again, the length of the time at the peak mostly didn't change (except in the case of Wikipedia articles, where the time at the peak shrank).
    • While it investigates a question different from the one I have, this paper seems good and trustworthy to me, while supporting a suspicion I've had (observing that the lifecycle of e.g. memes has apparently sped up significantly). I'd be interested in seeing whether the same process holds for internet communities I'm part of (for example votes on LessWrong and the EA Forum, or forecasts on Metaculus).

[Figure: chart indicating how the speed at which hashtags become popular changed over the years. Four curves (yellow, green, blue and purple) each peak in the middle and fall off at the sides; the yellow line is highest around the peak, green lower, blue lower still, and purple lowest.]
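
The dwell-time metric from Lorenz-Spreen et al. can be reconstructed roughly as follows; this is my simplified reading of the paper's description (it e.g. ignores interrupted stays in the top 50), not their actual pipeline, and `hourly_counts` is a hypothetical input format:

```python
from collections import defaultdict

def mean_topk_dwell_time(hourly_counts, k=50):
    """Average number of hours a hashtag that ever reached the top k stayed there.

    `hourly_counts` maps hour -> {hashtag: usage count in that hour}.
    """
    hours_in_topk = defaultdict(int)
    for counts in hourly_counts.values():
        for tag in sorted(counts, key=counts.get, reverse=True)[:k]:
            hours_in_topk[tag] += 1
    return sum(hours_in_topk.values()) / len(hours_in_topk)

# Toy example with three hashtags over two hours and k=2:
toy = {0: {"#a": 10, "#b": 5, "#c": 1}, 1: {"#a": 9, "#c": 7, "#b": 2}}
print(mean_topk_dwell_time(toy, k=2))  # #a: 2h, #b: 1h, #c: 1h -> 4/3 ≈ 1.33
```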

Mark 2023 is a recent book about attention spans, which I was excited to read in the hope of finding the important studies I'd missed. Unfortunately, it is quite thin on the development of attention span over time. It states that

My own research, as well as those of others, has shown that over the last fifteen years, our attention spans have declined in duration when we use our devices. Our attention spans while on our computers and smartphones have become short—crazily short—as we now spend about forty-seven seconds on any screen on average.

—Gloria Mark, “Attention Span” p. 13/14, 2023

which is not quite strong enough a measurement for me.

In 2004, in our earliest study, we found that people averaged about one hundred fifty seconds (two and a half minutes) on a computer screen before switching their attention to another screen; in 2012, the average went down to seventy-five seconds before switching. In later years, from 2016 to 2021, the average amount of time on any screen before switching was found to be relatively consistent between forty-four and fifty seconds. Others replicated our results, also with computer logging. André Meyer and colleagues at Microsoft Research found the average attention span of twenty software developers over eleven workdays to be fifty seconds.⁹ For her dissertation, my student Fatema Akbar found the average attention span of fifty office workers in various jobs over a period of three to four weeks to be a mere forty-four seconds.¹⁰ In other words, in the last several years, every day and all day in the workplace, people switch their attention on computer screens about every forty-seven seconds on average. In fact, in 2016 we found the median (i.e., midpoint) for length of attention duration to be forty seconds.¹¹ This means that half the observations of attention length on any screen were shorter than forty seconds.

—Gloria Mark, “Attention Span” p. 74/75, 2023

She doesn't mention the hypothesis that this could be the symptom of a higher ability to prioritize tasks, although she is adamant that multi-tasking is bad.

Furthermore, this behavior displays only a decrease in the propensity to attend, but not necessarily one of capacity: Perhaps people could concentrate more if they wanted to/were incentivized to, but they don't, because there is no strong intent to or reward for doing so. Admittedly, this is less of an argument in the workplace where these studies were conducted, but perhaps people just don't care as much about their jobs (or so I've heard).

when email was cut off, people’s attention spans were significantly longer while working on their computers—in other words, they switched their attention less frequently.

—Gloria Mark, “Attention Span” p. 97, 2023

She gives some useful statistics about time spent on screens:

Nielsen reports that Americans spend on average five hours and thirty minutes daily of screen time on their computers, tablets and phones. […] But what is really astonishing is that when we add in the time watching other media like TV and films to this, then we see that our attention is fixated on some form of screen, in some type of mediated environment, nearly ten hours a day.

—Gloria Mark, “Attention Span” p. 180, 2023

She connects attention span to shot-length in movies:

The type of motion within shots has been changing. According to film scholar James Cutting and his colleagues at Cornell, shots containing the onset of motion (like a standing person who then runs) have increased because filmmakers believe that it will better attract viewers’ attention. […] The average film shot length in 1930 was twelve seconds, but then began to shorten, reaching an average of less than four seconds after the year 2010, as measured by James Cutting and colleagues.¹² Interestingly, the shot length for film sequels also decreased. For example, the shot length of the first Iron Man film averaged about 3.7 seconds; for Iron Man 2, 3.0 seconds; and for Iron Man 3, about 2.4 seconds.¹³

—Gloria Mark, “Attention Span” p. 180/181, 2023

Like in TV and film, shot lengths in television commercials also shortened over time. The average shot length of commercials in 1978 was 3.8 seconds, dropping down to an average of 2.3 seconds in 1991. […] It’s not just the shot lengths, though, that are short—the overall length of advertisements on TV has also decreased. The majority of ads started out as sixty seconds in length in the 1950s,²⁶ but that length comprised only 5 percent of ads shown in 2017. In the 1980s, advertisers started experimenting with showing fifteen-second ads instead of thirty-second ads. They discovered that fifteen seconds was even more persuasive than thirty seconds, especially when the ads used elements expressing cuteness and humor.²⁷ In 2014, 61 percent of ads were thirty seconds in length, but three years later, that percentage decreased to 49 percent.²⁸

—Gloria Mark, “Attention Span” p. 189, 2023

Do People Believe Attention Spans Have Declined?

Half of the public feel their attention span is shorter than it used to be, compared with around a quarter (23%) who believe they are just attentive [sic] as they've always been.

Again, the feeling is not just reported by the young — it's also the dominant feeling among the middle aged too, with 56% of 35- to 54-year-olds thinking their attention spans have worsened.

—Bobby Duffy & Marion Thain, “Do we have your attention” p. 6, 2022

Even more widespread is the belief that young people's attention spans in particular are worse than they were in the past—two-thirds of people think this is the case (66%).

Perhaps unsurprisingly, this belief is most common among the oldest age group surveyed, of those aged 55 or over — however, young people themselves also feel this way, with a majority of 18-34-year-olds holding this view.

—Bobby Duffy & Marion Thain, “Do we have your attention” p. 7, 2022

Note that selective attention mostly improves with age, so the older age groups might be comparing themselves now to the younger age groups now (as opposed to thinking back to their own attention spans at that age).

The absence of long-term research means it remains unknown whether technology has caused a deterioration in the country's ability to concentrate — but comparisons with survey data from previous decades indicate that, on some measures the public feel more pressured than they did in the past.

—Bobby Duffy & Marion Thain, “Do we have your attention” p. 18, 2022

In response to the questions (n=2093 UK adults aged 18+ in 2021):

  • "To what extent do you agree or disagree with the following statement? The pace of life is too much for me these days" (1983: 30% agree, 2021: 41% agree)
  • "To what extent do you agree or disagree with the following statement? I wish I could slow down the pace of my life" (1997: 47% agree, 1999: 51% agree, 2008: 45% agree, 2021: 54% agree)

What About Rates of ADHD?

Data from the CDC shows a clear increase in the percentage of children with a parent-reported ADHD diagnosis:

There has been a similar increase in the diagnosis of ADHD among adults, "from 0.43 to 0.96 percent" between 2007 and 2016.

However, this does not necessarily mean that the rate of ADHD has increased: awareness of ADHD may e.g. have increased, leading to more diagnoses.

What Could A Study Look Like?

Compared to other feats that psychology is accomplishing, finding out whether individual attention spans are declining appears to be of medium difficulty, so I'll outline three different ways this could be done:

  1. Develop a good instrument for measuring attention span (optionally just use a continuous performance test). Once one has a suitable instrument, one can, every year (or every second year) for a couple of years, pick a random sample from the population (not the same set of people, though, since attention span increases with age), e.g. via the internet if the test can be done online. One could then apply linear trend estimation, or a fancier statistical technique I don't know, to find out whether attention spans have declined between the measurements (see the sketch after this list).
    1. This could be done relatively cheaply: Let's say we collect 50 datapoints a year, from Mechanical Turk workers at $10/hr. A conservative estimate is that the test takes ~30 minutes to complete, so for three years the cost of the data would be $50 \cdot 3 \cdot 10\,\tfrac{\$}{\text{h}} \cdot 0.5\,\text{h} = \$750$. There are open-source implementations of the test available (Conners' CPT 3 costs $1.5k), so the additional cost is for the researcher setting up the test and recruiting the participants, which could take ~30 hours, and another ~30 hours for analysing the data. So the total cost of the experiment would be, at an hourly wage of $15 for the researcher (come on, we can let a grad student do it), $\$750 + 60\,\text{h} \cdot 15\,\tfrac{\$}{\text{h}} = \$1650$. Fudging upwards by taking the planning fallacy into account gives $2k for the experiment.
  2. Find someone who has been collecting data on attention span, ask them for it nicely, and analyse that data.
  3. Use the control groups from studies testing the effect of interventions on attention as data and then perform a meta-analysis. A lot of studies use some variant of the CPT; I started collecting such studies here.
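
As a deliberately tiny illustration of the trend-estimation step in option 1, here is a sketch; the years, the scores and the use of scipy are placeholders rather than a real analysis plan:

```python
import numpy as np
from scipy import stats

years = np.array([2024, 2025, 2026])        # hypothetical collection years
mean_scores = np.array([71.2, 70.1, 68.9])  # hypothetical yearly mean test scores

fit = stats.linregress(years, mean_scores)
print(f"slope: {fit.slope:.2f} points/year, p = {fit.pvalue:.3f}")
# A reliably negative slope would (weakly) suggest decline; with only three yearly
# means one would rather model the individual-level data, e.g. score ~ year + age + sex.
```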

Conclusion

Given the amount of interest the question about shrinking attention spans has received, I was surprised to not find a knockdown study of the type I was looking for, and instead found many different investigations that were either not quite answering the question I was asking or too shoddy (or murky) to be trusted. It seems likely to me that individual attention spans have declined (I'd give it ~70%), but I wouldn't be surprised if the decline was relatively small, noisy & dependent on specific tests.

So—why hasn't anyone investigated this question to satisfaction yet? After all, it doesn't seem to me to be extremely difficult to do (compared to other things science has accomplished), there is pretty clearly a lot of media attention on the question (so much so that a likely incorrect number proliferates far & wide), it appears economically and strategically relevant to me (sustained attention, especially, is probably an important factor in knowledge work, I'd guess), and it slots more or less into cognitive psychology.

I'm not sure why this hasn't happened yet (and consider it evidence for a partial violation of Cowen's 2nd law—although, to be fair, the law doesn't specify there needs to be a good literature on everything…). The reasons I can think of are: one would need to first develop a good test for determining attention span, which is some work in itself (or use the CPT); one would need to be relatively patient (since the test would need to be re-run at least twice with a >1 year pause, for which the best grant structure might not exist); there are many partial investigations into the topic, making it appear as if the question were already solved; and perhaps there just aren't enough cognitive psychologists around to investigate all the interesting questions that come up.

So I want to end with a call to action: If you have the capacity to study this problem, there is room for improvement in the existing literature! Attention spans could be important, it's probably not hard to measure them, and many people claim that they're declining but are way too confident about it given the state of the evidence. False numbers are widely circulated, which suggests that correct numbers, once produced, might be cited even more widely. And it's probably not even (that) hard!

Consider your incentives :-).

Appendix A: Claims That Attention Spans Have Been Declining

Most of these are either unsourced or cite Gausby 2015 fallaciously (Bradbury 2016 conjectures that the oft-cited 8-second figure is actually the average number of seconds spent on a web page).

Today, individuals are constantly on an information overload from both the quantity of information available and the speed of which information gets into the hands of individuals through advertising and multimedia. Attention deficits tend to be increasing as it is challenging to attract individuals and hold their attention long enough for people to read or watch messages such as work memos, advertisements, etc.

—Carstens et al., “Social Media Impact on Attention Span” p. 2, 2018

Big data plays an important role in the development of microlearning. In the age of big data, human’s attention span is decreasing. As per Hebert (1971), “what information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it” (p. 41). An example of short attention span in the age of big data can be found in the music industry, as per (Gauvin, 2017), the average time that passed before the audience would hear the vocals on any radio song was 23 s, today the average intro is just 5 s long. Wertz (2017) also suggested that 40% of users are likely to abandon a website if it does not load within three seconds or less. Furthermore, a survey (Gausby, 2015) conducted by Microsoft indicated that the average attention span of a human dropped from 12 to eight seconds, which means shorter than a goldfish. Given the average human attention span is decreasing, microlearning becomes more and more important because it emphasises short learning duration.

—Leong et al., “A review of the trend of microlearning” p. 2, 2020

Unfortunately, all too many of us are having “squirrel” days, according to Dr. Gloria Mark, a professor of informatics at the University of California, Irvine, who studies how digital media impacts our lives. In her new book, “Attention Span: A Groundbreaking Way to Restore Balance, Happiness and Productivity,” Mark explained how decades of research has tracked the decline of the ability to focus.

“In 2004, we measured the average attention on a screen to be 2½ minutes,” Mark said. “Some years later, we found attention spans to be about 75 seconds. Now we find people can only pay attention to one screen for an average of 47 seconds.”

Not only do people concentrate for less than a minute on any one screen, Mark said, but when attention is diverted from an active work project, it also takes about 25 minutes to refocus on that task.

—Sandee LaMotte, “Your attention span is shrinking, studies say. Here's how to stay focused”, 2023

Tech-savvy users often say that the way the modern internet works has made it so that people’s attention spans are getting shorter every single day but the truth behind this story is rather tough to ascertain. However, recent data from SimilarWeb indicates that people definitely are suffering from shorter attention spans, and what’s more is that these attention spans are shortening at a pretty rapid pace when you take into account the numerous factors that are coming into play, all of which serve some kind of purpose in this trend.

If you look at the data for how long users spend on websites before navigating away, for the most part the trend has been that these times are remaining more or less stable on web based browsing, but if you look at the trend for mobile browsing between the years 2017 and 2019 you would see that there is a drop of about 11 seconds in the average time spent on a website. When you take into account the fact that mobile browsing is starting to become a lot more popular and in many ways has become the preferred form of browsing for people on the internet, the change is a lot more drastic.

—Zia Muhammad, “Research Indicates That Attention Spans Are Shortening”, 2020

However, as much as technology can be used as an effective learning tool inside and outside the classroom, there’s no denying that one of the biggest challenges faced by educators today is the distraction posed by social media. Students are distracted by their phones during class, and even without that distraction, the time they spend on social media outside the classroom has an impact on their attention spans.

—EU Business School, “The Truth about Decreasing Attention Spans in University Students”, 2022

(No link given.)

In 2015, a study commissioned by Microsoft and discussed in Time magazine found that the average attention span was in fact only 8 s. If indeed this is the case, then even participating in a 15-min lecture would be positively heroic. To place this in perspective, it was reported in the same Time article, that goldfish, of the piscine rather than snack variety, have an attention span of 9 s, one whole second greater than humans! It is perhaps rather premature to opt for an 8-s lecture format, as there are many caveats to the Time article, not the least of which is that no one knows how to actually measure a goldfish’s attention span. What has been measured is goldfish memory, which, according to researchers in the School of Psychology at the University of Plymouth, is actually quite good (7). Similarly the 8-s attention span for humans actually reflects the average time a person will spend on a web page before looking somewhere else.

—Neil A. Bradbury, “Attention span during lectures: 8 seconds, 10 minutes, or more?”, 2016

Appendix B: Studies for a Meta-Analysis

I'll list the closest thing those studies have to a control group, sorted by year.

Studies using the CPT

Furthermore, “Is the Continuous Performance Task a Valuable Research Tool for use with Children with Attention-Deficit-Hyperactivity Disorder” (Linda S. Siegel/Penny V. Corkum, 1993) p. 8-9 contains references to several studies from before 1993 using the CPT on children with ADHD.
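
If the control-group means and their standard errors from such studies were tabulated, a crude first pass at the trend question (option 3 above) could look like the following sketch: a precision-weighted regression of control-group scores on publication year. All numbers are invented, and a serious meta-analysis would need to do considerably more:

```python
import numpy as np
import statsmodels.api as sm

year = np.array([1998.0, 2003, 2008, 2013, 2018])       # publication years
mean_score = np.array([0.74, 0.73, 0.71, 0.70, 0.68])   # control-group mean (e.g. CPT hit rate)
se = np.array([0.02, 0.03, 0.02, 0.025, 0.02])          # standard error of each mean

X = sm.add_constant(year)                                # intercept + year
fit = sm.WLS(mean_score, X, weights=1.0 / se**2).fit()   # precision-weighted least squares
print(fit.params, fit.pvalues)                           # a negative year coefficient suggests decline
# A real analysis would also have to harmonize the different CPT variants and model
# between-study heterogeneity (random effects) instead of assuming it away.
```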

Appendix C: How I Believed One Might Measure Attention Span Before I Found Out About The CPT

Before I found out about the Continuous Performance Test, I speculated about how to measure attention span:

(Note that I'm not a psychometrician, but I like speculating about things, so the ideas below might contain subtle and glaring mistakes. Noting them down here anyway because I might want to implement them at some point.)

It seems relatively easy to measure attention span with a power- or speed-test, via one of three methods:

  1. Present a stimulus, change the stimulus, and let the test subject report the change; this results in two numbers: the time between the stimulus initially being presented and the time it was changed (let's call this value t_change), and the time between the change of the stimulus and the reporting of the change (call this value t_report). Performing this test with different values of t_change should result in different values of t_report. There is a t_change for which t_report rises above a threshold value; that t_change can be called the attention span (a sketch of this analysis follows the list).
    1. This method has some disadvantages:
      1. It needs a change in stimulus that requires selective attention to notice, but changing e.g. visual stimuli involves motion, which directs attention. (Idea: have a colored stimulus continuously change color, plus a reference color; once the stimulus matches the reference color, the subject is supposed to report. This avoids sudden changes in visual stimuli.)
      2. The method would require many samples to find the t_change for which t_report exceeds the threshold value.
      3. Performing the test multiple times in a row might induce mental fatigue, decreasing attention span.
  2. Let the test subject engage in a mentally draining exercise like the Stroop test with some performance measure. I would expect the performance to decline over time, and one could define a threshold value at which the subject "is no longer paying attention".
  3. Let the subject observe a neutral stimulus while measuring some indicator of attention (such as arousal via skin conductivity, or the default mode network being inactive); when the measured value falls under/over a threshold, the subject has "lost attention".
    1. This method has the major disadvantage that it requires special equipment to perform.
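
Here is a sketch of the analysis behind method 1: given many (t_change, t_report) pairs, estimate the smallest t_change at which the mean report latency crosses a threshold. The data, the 1-second threshold and the linear interpolation are all made up for illustration:

```python
import numpy as np

def attention_span(t_change, t_report, threshold=1.0):
    """Interpolated t_change at which the mean t_report first exceeds `threshold`."""
    t_change, t_report = np.asarray(t_change, float), np.asarray(t_report, float)
    xs = np.unique(t_change)
    means = np.array([t_report[t_change == x].mean() for x in xs])  # mean latency per delay
    above = np.nonzero(means > threshold)[0]
    if len(above) == 0:
        return None              # attention never "lapsed" within the tested delays
    i = above[0]
    if i == 0:
        return float(xs[0])
    # linear interpolation between the last below-threshold and first above-threshold delay
    x0, x1, y0, y1 = xs[i - 1], xs[i], means[i - 1], means[i]
    return float(x0 + (threshold - y0) * (x1 - x0) / (y1 - y0))

# Toy data: report latency creeps upward as the change happens later and later.
delays  = [10, 10, 30, 30, 60, 60, 120, 120]        # t_change in seconds
reports = [0.4, 0.5, 0.6, 0.7, 0.9, 1.0, 1.6, 1.8]  # t_report in seconds
print(attention_span(delays, reports))              # ≈ 64 s, between the 60 s and 120 s delays
```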

Such an instrument would of course need to have different forms of reliability and validity, and I think it would probably work best as a power test or a speed test.

I'm not sure how such a test would relate to standard IQ tests: would it simply measure a subpart of g, be completely independent of it, or be just partially related to it?

Comments (21)

I think that movies are getting longer.

I didn't check this systematically, but it feels like old movies are about 80 minutes long, and the recent ones are approaching 120 minutes.

This could just be a function of film directors getting better at making long films with pacing that keeps lower attention spans engaged.

Yeah, there's all these damn confounders :-|

This again seems like something that different people could interpret differently.

I usually watch movies on my computer, and when I think they are too slow, I increase the speed. (If I no longer understand the speech at the high speed, I add subtitles.) Recently, I usually watch a movie on 2x speed, and slow down if there is some action scene or a nontrivial dialogue. Speed up if too boring.

Seems to me that most (but not all) movies follow the same pacing, where during the first half of the movie almost nothing happens; we just get the protagonist introduced. Then things start happening, then they get more dramatic, then there is the climax, then a short cool-down and the movie ends.

Now, is this "pacing that keeps lower attention spans engaged"? For me, the first half of such a movie is the one that I watch at 2x or 3x speed, and if I didn't have an option to do that or to skip that part, I probably would not have watched many of the movies (so I would never learn that the second half was good).

I suspect the desired psychological outcome is more like: "people remember the end of the movie most, so let's push everything interesting as close to the end as possible". And possibly a bet that if they are in a cinema (where the producer probably gets most of the money), people won't leave during the first half.

With old movies, the pacing seems more like the slow and fast sections alternate throughout the film.

It looks like movies have been getting longer:

Although there seems to have been a slight dip/stagnation in the 2000s and 2010s, the text concludes with

In conclusion, our intuition was wrong. There is no trend in the movies runtime. The differences are too small to be noticed. We can say that for the last 60 years movies on average have the same length. No matter what criteria we take into account, the result is the same.

jmh:

Interesting. The graph seems to fit Viliam's intuition rather well. It is a noisy dataset (not to mention that "movie" is probably a poorly defined item, so we might have both apples and oranges here), so I'm not quite sure one can make the claim of increasing or constant very easily.

jmh:

You might find some of the discussion here of interest on the subject. This listing of John Wayne movies, with run times listed, might also be of interest.

I have been under the impression that they had been getting shorter - say from the 70s to the 90s/00s - but then started lengthening again a bit after that until now. Nothing very scientific about my recollection of that observation though.

I always assumed this was a shift due to VCRs, DVDs, and then DVRs and streaming. Today media can assume everyone can watch anything as often as they want. Older media had to assume no one would ever watch it more than once, or be able to look up anything they missed. Old movies and TV shows are simple in the stories they tell, simpler in dialog, and shorter in length, so you can mostly catch everything the first time through.

Movies in general require very little attention compared to just listening or even reading. Since they are audiovisual, they leave very little to the imagination. So I expect other influences have a larger effect.

Thank you for this! Strong-upvoted.

I only skimmed, so maybe you discuss this and I missed it, my apologies if so -- what about a metric that logs screen time over the course of a day, and tracks frequency of major task shift? E.g. we could classify websites as "work-related" or "entertainment" or "other" and then track how many shifts from work-related to entertainment happen throughout the day. If we had been tracking this for 20 years then maybe we'd have some good data on pretty much exactly the problem I'm most concerned about when I say attention spans seem to be declining...

Mark 2023 looks at this, I've summarized the relevant stuff at the end of this section, and the time spent per task has indeed been declining. I don't know about research looking at task classification, but that would be interesting to do.

My current take is that this provides medium evidence—but I think this could also be evidence of higher ability at prioritization.

Higher ability at prioritization? What do you mean?

Better at getting specific information out of a present piece of information (e.g. becoming skilled at skimming), better at putting tasks aside when they need some time to be processed in the background.

I would buy that hypothesis if the data was less time between switches within work-related tasks, but if the pattern is more frequent visits to entertainment sites during work hours, that sure doesn't sound like a good thing to me. Yeah, maybe it's wisely letting stuff process in the background while I browse reddit or less wrong, but very likely it just is what it appears to be: degraded attention span and/or increased internet addiction.

Too long; didn’t read

I found this comment both unhelpful and anti-scholarship, and have therefore strong-downvoted.

I suspect it was supposed to be a joke about attention spans

Sorry @RomanHauksson! Reversed the downvote. I didn't realize this was a joke 🤦

Np! I actually did read it and thought it was high-quality and useful. Thanks for investigating this question :)

Regarding your study idea. Sounds good! Would be interesting to see, and as you rightly point out wouldn't be too complicated/expensive to run. It's generally a challenge to run multi-year studies of this sort due to the short-term nature of many grants/positions. But certainly not impossible. 

An issue that you might have is being able to be sure that any variation that you see is due to changes in the general population vs changes in the sample population. This is an especially valid issue with MTurk because the workers are doing boring exercises for money for extended periods of time, and many of them will be second screening media etc.  A better sample might be university students, who researchers fortunately do have easy access to. 

Strong upvoted. I've recently been interested in whether you could consider transformative slow takeoff to have started ~5-10 years ago via the deep learning revolution. I don't know whether it would have much of an effect on the number of highly-skilled AI capabilities researchers, but the effect would be so broad that it would nonetheless be significant for global forecasting in tons of other ways. Unless, of course, meditation becomes extremely popular or social media platforms change their algorithm policies, immediately reversing the trend. The impression that I get from current social media news feed/scrolling systems is that they have many degrees of freedom to measure changes over time, and perhaps even adjust users' attention spans in aggregate.

With systems like TikTok-based platforms, the extreme gratification offered means it would take substantial evidence to convince me that attention spans haven't been shortening, given neuroplasticity. It's theoretically plausible, but not where I'd bet, even before reading this.