Empirical Research

The Effects of Auditory Numerosity and Magnitude on Visual Numerosity Representation: An ERP Study

Jinbo Zhang, Zehua Wu, Jiashuang Wu, Yi Mou, Zhenzhu Yue*

Abstract

Numerical representation is not restricted to sensory modalities, but it remains unclear how numerosity processing in different modalities interacts within the brain. Moreover, the effect of continuous magnitudes presented in one modality on the representation of numerosity in another modality has not been well studied. Using event-related potential (ERP) and source localization analyses, the present study examined whether auditory numerosity and continuous magnitude interact in visual numerosity representation. A visual dot array (visual standard stimulus) was preceded by a sound in which numerosity (Multiple-tone vs. One-tone) and magnitude (Loud-tone vs. Soft-tone) were manipulated. Then, another visual dot array (visual comparison stimulus) was presented, and participants were required to compare the numerosities of the visual dot arrays. Behavioural results revealed smaller just-noticeable differences (JNDs) when visual stimuli were preceded by multiple tones than when they were preceded by one tone. The subsequent ERP analysis of the visual standard stimuli revealed that the peak amplitude of the N1 was more negative in the Loud-tone condition than in the Soft-tone condition, which could be related to better preparatory attention. Moreover, a significant interaction between auditory numerosity and magnitude was found within the P2p time window for the standard stimuli. Further source localization analysis localized the N1 and P2p effects to the right middle frontal gyrus (MFG) and left inferior parietal lobule (IPL). The present study suggests that numerosity information presented in one sensory modality can spontaneously affect numerical representation in another modality.

Keywords: cross-modal, numerosity, magnitude, auditory, visual

Journal of Numerical Cognition, 2020, Vol. 6(2), https://doi.org/10.5964/jnc.v6i2.234

Received: 2019-07-02. Accepted: 2020-05-22. Published (VoR): 2020-09-09.

*Corresponding author at: Department of Psychology, Sun Yat-sen University, Guangzhou, China, 510006. E-mail: yuezhzh@mail.sysu.edu.cn

This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Numerosity processing is a basic human ability: people can extract numerosity information quickly and understand the relationships among numerosities. Previous studies have shown that numerosity cognition is not restricted by sensory modalities (Barth, Kanwisher, & Spelke, 2003; Barth, La Mont, Lipton, & Spelke, 2005; Cantlon & Brannon, 2006; Feigenson, Dehaene, & Spelke, 2004). For example, in the study of Barth et al. (2003), participants were required to compare the numerosities of stimulus pairs that consisted of two sequences of flashes (visual), two sequences of tones (auditory), or a flash sequence and a tone sequence (cross-modal). The results did not reveal a significant difference in accuracy between the cross-modal and intramodal conditions, indicating that numerosity representation was independent of the stimulus modality. Supramodal numerical representation and processing have been observed in human adults, infants, and some nonhuman animals (Barth et al., 2003, 2005; Cantlon & Brannon, 2006; Feigenson et al., 2004; Whalen, Gallistel, & Gelman, 1999). However, other studies have challenged the proposal that numerical representation is supramodal by testing whether numerical comparison performance differs for visual, auditory, and cross-modal stimuli. In the study of Tokita, Ashitani, and Ishiguchi (2013), two stimulus sequences were presented sequentially, and the stimuli in the sequences were either from the same modality (the visual or the auditory condition) or from different modalities (the cross-modal condition). Participants compared the numerosities of the stimuli, and their performance was measured with Weber fractions (an index of the acuity of numerical representation) and points of subjective equality (PSEs). Performance was better (lower Weber fractions) for auditory stimuli than for visual stimuli, and performance in the cross-modal condition fell between those of the visual and auditory conditions. These findings indicated that the representation and processing of numerosity were not independent of modality.

Previous studies have shown that numerosity information can be processed automatically. For example, Naparstek and Henik (2010) used a comparative judgement task with visual stimuli. Several digits (targets) and letters were presented in a circular display, and participants were asked to report whether the value of the digits was smaller or larger than 5. The congruence between the numerosity and the numerical value of the digits was manipulated: the numerosity and the numerical value were the same in congruent trials (e.g., 333), whereas they differed in incongruent trials (e.g., 3333). Responses were faster in congruent trials than in incongruent trials even though the numerosity of the target was task-irrelevant, suggesting that numerosity information was automatically processed while participants responded to the numerical value of the target. In addition, recent studies showed that the processing of numerosity information not only is independent of the semantic numerical information of the stimuli but also has a natural advantage over other stimulus properties, such as the area and density of the stimuli, in perceptual judgement (Anobile et al., 2019; Cicchini, Anobile, & Burr, 2016, 2019).

However, it remains unclear whether numerosity cognition in the auditory modality can affect numerical processing in the visual modality. Some researchers have investigated cross-modal numerical cognition. For example, Alards-Tomalin et al. (2015) presented Arabic digits to participants in a sound intensity categorization task. The values of the Arabic digits were either small (1, 2, 3) or large (7, 8, 9). The results showed that loudness categorization was biased by the visually presented digits: a sound was more likely to be categorized as loud when it co-occurred with large digits than with small digits. In addition, another study showed that sounds of different intensities can influence the values of numbers spontaneously generated by participants (Heinemann, Pfister, & Janczyk, 2013). Participants were asked to say a digit (1-9) aloud as quickly as possible when a sound of varying intensity (e.g., volume: quiet vs. loud; duration: short vs. long) was presented, and they tended to report larger numbers when a high-intensity sound was presented than when a low-intensity sound was presented. These findings suggested that the numerical representation in one sensory modality can be affected by stimuli presented in another modality. However, these studies only investigated cross-modal numerical processing based on symbolic numbers (e.g., numbers presented as Arabic digits), and research on cross-modal numerical processing of non-symbolic numbers is still rare. Here, by adopting a numerosity comparison task, we aimed to investigate whether visual non-symbolic numerosity perception can be affected by auditory numerosity.

Previous studies have pointed out that numerosity and magnitude information are closely related (Gallistel & Gelman, 2000). For instance, the loudness of auditory tones or the area of visual dot arrays can be regarded as the magnitude information of stimuli (Gallistel & Gelman, 2000; Leibovich, Katzin, Harel, & Henik, 2017; Rugani, Castiello, Priftis, Spoto, & Sartori, 2017). However, the potential interaction between the magnitude information and the numerosity information of stimuli across sensory modalities has not been clarified, and few studies have investigated the effect of non-numerical magnitudes, such as the loudness of a sound, on numerical representation across modalities. Therefore, the second aim of the present study was to investigate how continuous magnitudes, such as loudness, presented in the auditory modality affect numerosity representation in the visual modality.

In summary, the present study aimed to explore whether the numerosity and magnitude information of an auditory stimulus can influence the representation of visual numerosity. By adopting brain event-related potentials (ERPs), we investigated the brain-electric correlates of cross-modal numerosity representation. A numerosity comparison paradigm was used in the present study. An auditory prime tone sequence was presented, followed by a visual dot array (i.e., the standard stimulus). Then, another visual dot array was presented (i.e., the comparison stimulus). Participants were asked to compare the numerosities of the two visual dot arrays. Critically, the numerosity and magnitude information of the auditory prime stimulus was manipulated by changing the number (Numerosity) and the loudness of the tones (Magnitude), respectively. Tones were presented either at 60 dB SPL (Soft tone) or 80 dB SPL (Loud tone). The tone sequence with a single tone was defined as the One-tone condition, and the sequence with five tones was defined as the Multiple-tone condition. Thus, four kinds of prime sound sequences were involved in the present study, i.e., One-soft, Multiple-soft, One-loud, and Multiple-loud sequences. We used the sound sequence without any tones (No-tone) as the control condition. We hypothesized that the numerosity or the magnitude of auditory stimuli would influence the numerosity processing of the visual standard stimuli.

Previous studies have examined brain responses while participants process numerosities. A positive ERP component over the posterior parietal region, peaking at approximately 150-250 ms (the P2p), has been found to be related to numerical processing (Dehaene, 1996; Dehaene & Brannon, 2011; Fornaciai, Brannon, Woldorff, & Park, 2017; Hyde & Spelke, 2009; Libertus, Woldorff, & Brannon, 2007). The P2p amplitude may reflect the perception of numerosity (Park, DeWind, Woldorff, & Brannon, 2015): large numerosities elicit larger P2p amplitudes than small numerosities. In numerical comparison, the P2p amplitude can be modulated by the numerical distance between two numbers, with larger amplitudes associated with closer distances (Dehaene, 1996; Dehaene et al., 1998; Hyde & Spelke, 2009; Libertus et al., 2007). In the present study, if auditory numerosity influences visual numerosity representation, a change in the P2p amplitude is expected; in particular, the P2p amplitude for visual standard stimuli should be larger in the Multiple-tone condition than in the One-tone condition. In addition, given that numerosity and magnitude are closely related and can interact with each other (Cordes, Gelman, Gallistel, & Whalen, 2001; Heinemann et al., 2013; Leibovich et al., 2017; Rugani et al., 2017), we hypothesized that the P2p may also be modulated by the magnitude of the tones (Loud or Soft).

In addition to the P2p, a negative ERP component over the posterior parietal region, peaking at approximately 170 ms (the N1), has also been observed in numerical processing (Dehaene, 1996; Hyde & Spelke, 2009; Libertus et al., 2007). The N1 may reflect better preparatory attention (van den Berg, Appelbaum, Clark, Lorist, & Woldorff, 2016). In the present study, we hypothesized that visual standard stimuli primed by loud tones would elicit a larger N1 amplitude than those primed by soft tones. Finally, previous studies have investigated the neural basis of numerosity cognition with neuropsychological methods (Dehaene, Piazza, Pinel, & Cohen, 2003), positron emission tomography (PET; Fias, Lammertyn, Reynvoet, Dupont, & Orban, 2003), and functional magnetic resonance imaging (fMRI; Dehaene et al., 2003; Pinel, Piazza, Le Bihan, & Dehaene, 2004). Consistent evidence from these studies suggests that the bilateral parietal regions, especially the intraparietal sulcus (IPS), are essential for representing and processing numerical information (Ansari, 2008; Brannon, 2006; Nieder, 2005; Sokolowski, Fias, Bosah Ononye, & Ansari, 2017). In addition, the intraparietal region may also be important for multisensory processing, especially the exchange of auditory and visual information (Regenbogen et al., 2018). Using source localization analysis, we further investigated whether the parietal regions were associated with cross-modal numerosity and magnitude cognition.

Materials and Methods

Participants

Twenty-three college students from Sun Yat-sen University participated in the experiment as paid volunteers receiving 60 Chinese Yuan. Data from three participants were excluded because their behavioural numerical comparison performance in the condition with the largest numerical distance (comparing 20 versus 15 or 27) was at chance level (50%). Data from one additional participant were excluded due to high artefact rates (valid trials < 70%) in the ERP data. Finally, nineteen participants (female/male = 12/7; age range = 18-30 years; mean age = 22; SD = 2.91) were included in the final analyses. They were all right-handed, with normal hearing and normal or corrected-to-normal visual acuity. None of them had a history of mental or neurological diseases. All participants signed informed consent before the experiment, following the guidelines in the Declaration of Helsinki (World Medical Association, 2013). This study was approved by the Ethics Committee of the Department of Psychology at Sun Yat-sen University.

Apparatus and Stimuli

Visual stimuli (used as the standard and comparison stimuli) were arrays of white dots on a black background (non-symbolic number stimuli) presented on a 23-inch LCD monitor (HP ProDisplay 231; resolution: 1920 × 1080; refresh rate: 60 Hz) using E-Prime 2.0 (Psychology Software Tools, 2012). The dot arrays were generated with the method proposed by Gebuis and Reynvoet (2011), which minimizes the influence of non-numerical properties of the arrays that are confounded with numerosity, such as the size of individual dots and the total surface area of the array. The distribution of the dots was limited to a circular region (radius = 5° of visual angle) at the centre of the screen. The numerosity of the standard stimulus was fixed at 20 dots. The numerosities of the comparison stimuli were spaced approximately equally in natural-log space, as in previous studies (Dehaene, Izard, Spelke, & Pica, 2008; Nieder, 2018), and were set to 15, 16, 18, 22, 24, and 27 dots (a range of numerosities similar to that of previous studies, e.g., Fornaciai & Park, 2018; Liu et al., 2013), which varied within ±0.3 log units of the standard numerosity (ln 20 ≈ 3.0). For example, the distance between 15 and 20 is approximately the same as that between 27 and 20 on the log scale.
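The log spacing of the comparison numerosities can be checked with a short sketch (illustrative only, not part of the original stimulus-generation code):

```python
import numpy as np

standard = 20
comparisons = np.array([15, 16, 18, 22, 24, 27])

# Signed distance of each comparison numerosity from the standard
# on the natural-log scale (ln 20 is approximately 3.0).
log_dist = np.log(comparisons) - np.log(standard)

# All comparison values fall within about +/- 0.3 log units of the
# standard, and the extremes (15 and 27) are nearly, though not
# exactly, symmetric around it on the log scale.
print(np.round(log_dist, 3))
```

This makes explicit why 15 and 27 (ratios 20/15 ≈ 1.33 and 27/20 = 1.35) are treated as approximately equidistant from the standard on the log scale, even though they differ by 5 and 7 dots on the linear scale.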

Auditory stimuli (used as the prime stimulus; pure sine-wave tones, 440 Hz) were presented through in-ear-monitor headphones (TORRAS H1). The magnitude (loudness) and numerosity (number of chunks, du or du-du-du-du-du) of the stimuli were manipulated. The magnitude information of the auditory stimuli was the loudness of the tones: tones presented at 60 dB SPL were defined as Soft tones, and tones presented at 80 dB SPL were defined as Loud tones. The numerosity of the auditory stimuli was generated by inserting 20 ms gaps into a continuous pure tone with a duration of 300 ms. Two such auditory stimuli were generated: one with no gaps, called One-tone (du; numerosity = 1), and the other with four gaps distributed equally within the continuous tone, called Multiple-tone (du-du-du-du-du; numerosity = 5). A 5 ms rise and fall time was applied to each gap-separated segment of the multiple tones.
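The gap-insertion construction can be sketched as follows. This is a minimal illustration, not the original stimulus code; in particular, it assumes the overall duration stays at 300 ms, so the five segments of the Multiple-tone are (300 - 4 × 20) / 5 = 44 ms each, which the text does not state explicitly.

```python
import numpy as np

def make_prime(n_tones, fs=44100, total_ms=300, gap_ms=20,
               ramp_ms=5, freq=440.0):
    """Sketch of the gap-inserted auditory primes (assumption: the
    total duration, including gaps, remains 300 ms)."""
    n_gaps = n_tones - 1
    seg_ms = (total_ms - n_gaps * gap_ms) / n_tones
    seg_n = int(round(seg_ms * fs / 1000))
    gap_n = int(round(gap_ms * fs / 1000))
    ramp_n = int(round(ramp_ms * fs / 1000))

    t = np.arange(seg_n) / fs
    seg = np.sin(2 * np.pi * freq * t)
    # 5 ms linear rise/fall applied to each gap-separated segment
    env = np.ones(seg_n)
    env[:ramp_n] = np.linspace(0.0, 1.0, ramp_n)
    env[-ramp_n:] = np.linspace(1.0, 0.0, ramp_n)
    seg = seg * env

    parts = []
    for i in range(n_tones):
        parts.append(seg)
        if i < n_gaps:
            parts.append(np.zeros(gap_n))  # silent 20 ms gap
    return np.concatenate(parts)

one_tone = make_prime(1)    # "du"
multi_tone = make_prime(5)  # "du-du-du-du-du"
# Loudness (60 vs. 80 dB SPL) was set at presentation; note that a
# 20 dB step corresponds to a tenfold amplitude ratio.
```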

Procedure and Experimental Design

Participants sat 60 cm away from the monitor in a dimly lit room and wore in-ear monitor headphones. They were instructed to maintain central fixation on the monitor throughout the experiment. At the beginning of each trial, a white fixation cross was presented for a random duration (from 900 to 1100 ms in steps of 25 ms). Then, an auditory prime was presented for 300 ms. There were four types of auditory primes: One-soft, Multiple-soft, One-loud, and Multiple-loud. We also set up a control condition (No-tone) in which the auditory prime was absent. After the prime (or no prime in the control condition), a blank screen (with a white fixation cross) was presented for a random duration (from 100 to 300 ms in steps of 25 ms). Then, a standard stimulus was presented for 300 ms. Afterwards, a blank screen was presented for a random duration (from 900 to 1100 ms in steps of 25 ms). Then, a comparison stimulus was presented for 300 ms (see Figure 1). Participants were instructed to compare the numerosities of the visual standard and comparison stimuli with a joystick (BTP-2163X). If the number of visual dots in the comparison array was larger than that in the standard array, participants were required to press the left button; otherwise, they were asked to press the right button. The mapping between the two response keys was counterbalanced across participants (i.e., the right button for the larger numerosity, the left button for the smaller numerosity). There were 600 trials in total, and each prime condition (No-tone, One-soft, Multiple-soft, One-loud, and Multiple-loud) consisted of 120 trials. The five types of trials were presented in random order. Participants completed eight practice trials before the formal experiment, and a correct rate of 75% was required to pass the practice. In the present study, all participants passed the practice within five minutes.

Figure 1

Experimental procedures.

Note. The sequence of stimuli in the experiment. The auditory prime stimulus was manipulated in terms of loudness (magnitude) and number (numerosity). There was also a control condition in which no tone was presented. Dot arrays were presented sequentially to construct the comparison task. In the present study, the numerosity of the standard stimulus was fixed to 20. The numerosity of the comparison stimuli was sampled from the sets of 15, 16, 18, 22, 24, and 27 with equal numbers of trials. Participants were instructed to judge the numerosity of the standard and the comparison stimulus with the left or right button of a joystick. The mapping between the two response keys was counterbalanced across participants. [900:25:1100] means that the stimulus was presented with a random duration from 900 to 1100 ms in steps of 25 ms.

Behavioural Data Analysis

Participants compared the standard and comparison dot arrays and chose the array with more dots. Trials with reaction times shorter than 50 ms were excluded. The resulting psychometric data were fitted with a Gaussian cumulative density function (CDF) for each participant using psignifit version 4.0 (Schütt, Harmeling, Macke, & Wichmann, 2016), a MATLAB toolbox for Bayesian inference on psychometric functions. The psychometric function is an inferential model used in detection and discrimination tasks; it describes the relationship between a given feature of a physical stimulus and a participant's forced-choice responses. The point of subjective equality (PSE) was estimated as the comparison numerosity at which "more" responses reached 50%, and the just-noticeable difference (JND) was defined as half the difference between the comparison numerosities at which "more" responses reached 25% and 75%. The PSE and JND reflect different aspects of the representation of stimulus features: the mean of the psychometric function, the PSE, at which "more" and "less" responses are balanced, represents the accuracy of participants' perception of the stimulus, whereas the spread of the psychometric function, indexed by the JND, represents the precision of participants' perception.
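Under a Gaussian CDF model, the PSE and JND reduce to simple functions of the fitted parameters. A minimal sketch (the actual fitting used psignifit 4; the parameter values below are hypothetical, for illustration only):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical fitted parameters on the natural-log numerosity scale:
# mu (mean) and sigma (spread) of the Gaussian CDF.
mu, sigma = np.log(20), 0.16

pse = mu  # comparison value at which "more" responses reach 50%

# JND: half the distance between the 25% and 75% points of the CDF.
x25 = norm.ppf(0.25, loc=mu, scale=sigma)
x75 = norm.ppf(0.75, loc=mu, scale=sigma)
jnd = (x75 - x25) / 2

# For a Gaussian CDF this equals about 0.6745 * sigma, so the JND is
# proportional to the spread of the psychometric function.
```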

For the statistical analysis of the behavioural data, we first examined whether the PSEs were different from the standard stimulus (20 dots) with a one-sample t-test in each prime condition separately. After that, we further compared the PSEs and JNDs between the auditory prime conditions (One-soft, Multiple-soft, One-loud, and Multiple-loud) and the No-tone condition separately via paired samples t-tests. Then, a 2 (auditory numerosity: One-tone vs. Multiple-tone) by 2 (auditory magnitude: Soft-tone vs. Loud-tone) repeated measures analysis of variance (ANOVA) was implemented for the auditory prime conditions. In the pairwise analysis, p-values were Bonferroni adjusted. In addition, Greenhouse-Geisser correction was applied in all statistical analyses when the sphericity was violated. Statistical differences were considered significant at p < .05. All statistical analyses were performed with JASP (JASP Team, 2018).
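The planned comparisons against the No-tone baseline can be sketched as Bonferroni-adjusted paired-samples t-tests (synthetic data for illustration; the reported analyses were run in JASP, and the means and SDs below are placeholders, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 19  # participants in the final sample

# Hypothetical per-participant JNDs: No-tone baseline vs. four primes
no_tone = rng.normal(0.174, 0.035, n)
primes = {c: rng.normal(0.16, 0.037, n)
          for c in ["One-soft", "Multiple-soft", "One-loud", "Multiple-loud"]}

k = len(primes)  # number of planned comparisons
for cond, jnd in primes.items():
    t, p = stats.ttest_rel(jnd, no_tone)          # paired t-test
    p_adj = min(p * k, 1.0)                       # Bonferroni adjustment
    print(f"{cond}: t({n - 1}) = {t:.2f}, adjusted p = {p_adj:.3f}")
```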

Electrophysiological Recording and Preprocessing

Scalp voltages were recorded (sampling rate: 500 Hz; online bandpass filter: 0.05-100 Hz) from 62 Ag-AgCl scalp electrodes mounted in an elastic cap (Easy Cap; FMS, Herrsching-Breitbrunn, Germany) according to the standard international 10-20 system with a NeuroScan SynAmps2 amplifier (Scan 4.5, Neurosoft Labs, Inc., Virginia, USA). External electrodes recorded the vertical and horizontal electrooculograms. All scalp electrodes were referenced online to an electrode attached to the right earlobe and re-referenced offline to the linked earlobes. Electrode impedances were maintained below 5 kΩ during recording.

The EEGLAB toolbox (Delorme & Makeig, 2004) was used to preprocess the EEG data. The data were high-pass filtered offline above 0.5 Hz with a one-pass, noncausal, zero-phase windowed sinc FIR filter. A Kaiser window was used in the present study. The maximum passband ripple of this window was set to 0.0015, and the transition bandwidth was set to 2 Hz. Ocular artefacts were corrected throughout the continuously collected data with a procedure based on independent component analysis (Jung et al., 2000). Specifically, the Infomax algorithm was used for the independent component analysis, and the SASICA plugin of EEGLAB (see https://github.com/dnacombo/SASICA) was further used to choose EOG-related components automatically. The results of the EOG correction were visually confirmed via the comparison of waves before and after correction.

ERP Analysis of Visual Standard Stimuli

For the ERP analysis, the data were further low-pass filtered below 40 Hz with the same filter parameters as in the preprocessing stage. To determine the effect of the auditory prime stimuli on the visual standard stimuli, ERPs were time-locked to the visual standard stimulus in epochs from 200 ms before to 800 ms after stimulus onset. The period from -200 ms to stimulus onset served as the prestimulus baseline. Trials with voltages exceeding ±70 μV during the baseline or poststimulus period were excluded. ERPs were constructed by averaging the standard stimulus-locked epochs separately for the five prime conditions (No-tone, One-soft, Multiple-soft, One-loud, and Multiple-loud). For the visual standard stimuli, the peak amplitudes of the N1 (130 to 190 ms) and P2p (220 to 275 ms) were analysed at the P3 and P4 electrodes, as in a previous study (Libertus et al., 2007). For the statistical analysis of ERPs elicited by the standard stimuli, repeated measures ANOVAs were implemented to determine the interaction of auditory numerosity and auditory magnitude.
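The epoching, baseline correction, artefact rejection, and peak measurement steps can be sketched for a single channel (a simplified illustration; the study used EEGLAB on 62-channel recordings):

```python
import numpy as np

fs = 500  # sampling rate (Hz)
pre, post = int(0.2 * fs), int(0.8 * fs)  # -200 ms .. +800 ms

def epoch_and_measure(data, events):
    """Epoch one channel around event samples, baseline-correct,
    reject +/- 70 uV trials, and measure N1/P2p peak amplitudes."""
    epochs = []
    for ev in events:
        ep = data[ev - pre: ev + post].copy()
        ep -= ep[:pre].mean()          # prestimulus baseline correction
        if np.abs(ep).max() <= 70:     # artefact rejection threshold
            epochs.append(ep)
    erp = np.mean(epochs, axis=0)
    t = (np.arange(-pre, post) / fs) * 1000   # ms relative to onset
    n1 = erp[(t >= 130) & (t <= 190)].min()   # negative peak, 130-190 ms
    p2p = erp[(t >= 220) & (t <= 275)].max()  # positive peak, 220-275 ms
    return erp, n1, p2p
```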

For the visual standard ERPs, correlation and source localization analyses were conducted. The differences in the JNDs and in the peak amplitudes of the ERPs between the Multiple-tone and One-tone conditions were calculated, and Pearson's correlation coefficient was computed to examine the relationship between the ERPs and behavioural performance. In addition, to localize the cortical regions sensitive to the interaction between auditory numerosity and magnitude observed at the scalp level, we adopted standardized low-resolution brain electromagnetic tomography (sLORETA; Pascual-Marqui, 2002). Although solutions provided by EEG-based source localization algorithms should be interpreted with caution due to their potential for error, sLORETA solutions have shown significant correspondence with results provided by haemodynamic procedures (Hinojosa et al., 2015). In the present study, the sLORETA software package (Pascual-Marqui, 2002) was used to perform source localization based on the topographic map of scalp voltages. Specifically, the three-dimensional current density was estimated for each condition (No-tone, One-soft, Multiple-soft, One-loud, and Multiple-loud) and each participant. Subsequently, the voxel-based, whole-brain sLORETA images (6239 voxels) were compared between conditions of interest using the statistical nonparametric mapping (SnPM) tool. As explained by Nichols and Holmes (2002), this nonparametric methodology inherently avoids problems derived from multiple comparisons and does not require any assumption of normality. Voxels that showed significant differences between conditions (t-statistic on log-transformed data, one-tailed corrected p < .05) were localized to specific anatomical regions and Brodmann areas (BAs).

ERP Analysis of Visual Comparison Stimuli

To investigate whether the peak amplitude of the P2p was influenced by the ratio between the comparison and standard stimuli, we analysed the peak amplitude of the P2p (220 to 275 ms) at three ratios in epochs with a duration of 700 ms, using the 200 ms before stimulus onset as the baseline. A 3 (ratio: 20 vs. 15/27; 20 vs. 16/24; 20 vs. 18/22) × 2 (hemisphere: electrode P3 vs. P4) repeated measures ANOVA was implemented on the peak amplitude of the P2p. To explore the difference between the auditory prime conditions and the No-tone condition, a one-way ANOVA (auditory prime condition: No-tone, One-soft, Multiple-soft, One-loud, and Multiple-loud) was used. The Greenhouse-Geisser correction was applied when sphericity was violated. For significant main effects or interactions, pairwise comparisons or simple effects analyses were conducted, with Bonferroni correction for multiple comparisons. All differences were considered significant at p < .05, and all statistical analyses were performed with JASP (JASP Team, 2018).

Results

Psychophysical Results

The fitted psychometric curve can be found in Figure 2a. The x-axis shows the numerosity of the comparison stimulus that has been transformed into the natural logarithm scale. The y-axis shows the percentage of trials in which the number of dots in the comparison stimuli was judged to be more than the number of dots in the standard stimulus (i.e., more responses).

PSEs

The PSE results are illustrated in Figure 2b. One-sample t-tests did not reveal any significant drift of the PSE away from the standard stimulus (ps > .05) in the auditory prime conditions. For the No-tone condition, however, the PSE showed a marginally significant drift away from the standard stimulus (PSE lower than the standard numerosity, p = .051). Further planned paired t-tests between the auditory prime conditions (One-soft, Multiple-soft, One-loud, and Multiple-loud) and the control condition (No-tone) did not show any significant differences (ps > .05). A 2 × 2 repeated measures ANOVA (auditory numerosity: One, Multiple; auditory magnitude: Soft, Loud) did not reveal any significant main or interaction effects (ps > .05).

Figure 2

Psychophysical results.

Note. a) Average psychometric curves estimated from the pooled data of all participants in the experiment. A Gaussian cumulative density function was applied. b) Average PSEs and JNDs under different conditions of the prime stimulus. The PF curve indicates the psychophysical fitted curve. Error bars represent 1 standard error.

*p < .05. **p < .01. N.S. = nonsignificant.

JNDs

The JND results are illustrated in Figure 2b. Paired comparisons between the auditory prime conditions and the No-tone condition were conducted with paired samples t-tests. The planned comparison between the No-tone condition and the other conditions (One-soft, Multiple-soft, One-loud, and Multiple-loud combined) revealed that participants showed smaller JNDs when the standard stimuli were primed by tones (M = 0.160, SD = 0.037) than when they were not (M = 0.174, SD = 0.035), t(18) = 2.63, p = .017. More specifically, when primed with Multiple-soft (M = 0.155, SD = 0.038) and Multiple-loud (M = 0.154, SD = 0.041) tones, participants showed significantly smaller JNDs relative to the No-tone condition (M = 0.174, SD = 0.035), ps < .05. No other comparisons reached significance. A 2 × 2 repeated measures ANOVA (auditory magnitude by auditory numerosity) revealed a significant main effect of auditory numerosity, F(1,18) = 5.97, p = .025, ηp² = .25, indicating that the JND in the Multiple-tone condition (M = 0.154, SD = 0.039) was smaller than that in the One-tone condition (M = 0.165, SD = 0.039). No other main or interaction effects reached significance (ps > .05). The JND results indicated that auditory numerosity (One vs. Multiple), rather than auditory magnitude (Soft vs. Loud), affected participants' sensitivity to visual numerosity. In summary, the behavioural results suggested that auditory numerosity influenced participants' representation of the subsequent visual numerosity.

ERP Results

ERPs Elicited by Visual Standard Stimuli

N1 waveform for visual standard stimuli (Electrodes: P3, P4; 130-190 ms)

A two-way ANOVA of auditory prime condition (No-tone, One-soft, Multiple-soft, One-loud, and Multiple-loud) and hemisphere (left vs. right) was applied. The results are shown in Figure 3 and Figure 4a.

Figure 3

The grand average waveform of the standard stimulus-locked ERPs.

Note. Top panel: ERP waveforms for electrodes P3 and P4. Bottom panel: Brain electrical activity mapping within the time windows of the N1 (130-190 ms) and P2p (220-275 ms) waves.

The main effect of auditory prime condition, F(4,72) = 4.25, p = .004, ηp² = .19, and the main effect of hemisphere, F(1,18) = 12.97, p = .002, ηp² = .42, were significant. Further analysis revealed that the peak amplitude elicited by the standard stimulus in the Multiple-soft condition (M = -3.31, SD = 3.22) was significantly less negative than that in the No-tone condition (M = -4.39, SD = 3.03), t(18) = 3.81, p = .005. The peak amplitude over the right hemisphere (P4; M = -4.80, SD = 3.09) was more negative than that over the left hemisphere (P3; M = -3.21, SD = 3.13), t(18) = 3.60, p = .002. The interaction effect was nonsignificant, F(4,72) = 0.23, p = .92, ηp² = .013.

Figure 4

Results of the standard stimulus-locked ERPs and behaviour-brain correlation.

Note. a) Peak amplitude of the standard stimulus-locked N1 under different experimental conditions and correlation analysis between the behavioural JND and the peak amplitude of N1. Error bars represent 1 standard error. b) Peak amplitude of the standard stimulus-locked P2p under different experimental conditions and correlation analysis between the behavioural JND and the peak amplitude of P2p. Error bars indicate 1 standard error.

†p < .10. N.S. = nonsignificant. All results are displayed for electrodes P3 and P4.

A 2 × 2 × 2 three-way repeated measures ANOVA (auditory magnitude by auditory numerosity by hemisphere) was conducted. The results are shown in the upper panel of Figure 4a. The main effect of auditory magnitude was significant, F(1,18) = 6.07, p = .024, ηp² = .25, indicating that the visual standard stimulus elicited a more negative peak amplitude under the Loud-tone condition (M = -4.14, SD = 2.90) than under the Soft-tone condition (M = -3.68, SD = 3.12). The main effect of auditory numerosity was significant, F(1,18) = 5.18, p = .035, ηp² = .22, indicating that the visual standard stimulus elicited a more negative peak amplitude under the One-tone condition (M = -4.13, SD = 2.93) than under the Multiple-tone condition (M = -3.68, SD = 3.10). The main effect of hemisphere was significant, F(1,18) = 13.77, p = .002, ηp² = .43, indicating that the visual standard stimulus elicited a more negative peak amplitude in the right hemisphere (M = -4.72, SD = 3.15) than in the left hemisphere (M = -3.10, SD = 3.12). The interaction effects were nonsignificant, Fs < 2.30, ps > .14.
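For a within-subject factor with only two levels, the main-effect F(1, n-1) in a repeated measures ANOVA equals the square of a paired t statistic on the level means, which gives a compact way to check such effects. A sketch with simulated data (the condition offsets and noise levels are invented for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical N1 peak amplitudes (μV) for 19 participants in a 2 (magnitude)
# x 2 (numerosity) within-subject design: columns = soft-one, soft-multiple,
# loud-one, loud-multiple. Values are simulated, not the study's data.
rng = np.random.default_rng(3)
subj = rng.normal(-4.0, 3.0, size=(19, 1))            # per-participant baseline
data = subj + np.array([[0.2, 0.7, -0.2, 0.3]]) + rng.normal(0, 0.5, (19, 4))

# Main effect of magnitude: average over the numerosity levels, then run a
# paired t-test; F(1, n-1) for this two-level factor equals t(n-1) squared.
soft = data[:, :2].mean(axis=1)
loud = data[:, 2:].mean(axis=1)
t, p = stats.ttest_rel(soft, loud)
print(f"magnitude: F(1,{len(data)-1}) = {t**2:.2f}, p = {p:.4f}")
```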

P2p waveform for visual standard stimuli (Electrodes: P3, P4; 220-275 ms)

First, to compare the difference between conditions with and without tones, a two-way ANOVA of auditory prime condition (No-tone, One-soft, Multiple-soft, One-loud, and Multiple-loud) and hemisphere (left vs. right) was applied. The results are shown in Figure 3 and Figure 4b. The main effect of auditory prime condition was significant, F(4,72) = 5.69, p < .001, ηp² = .24, indicating that the visual standard stimulus elicited a more positive peak amplitude under the Multiple-soft condition (M = 10.41, SD = 4.37) than under the No-tone condition (M = 9.49, SD = 4.62), t(18) = 3.15, p = .022. Other main and interaction effects were nonsignificant, Fs < 1.2, ps > .32.

A 2 × 2 × 2 three-way repeated measures ANOVA (auditory magnitude by auditory numerosity by hemisphere) revealed a significant main effect of auditory magnitude, F(1,18) = 19.74, p < .001, ηp² = .52, indicating that the visual standard stimulus elicited a smaller P2p wave under the Loud-tone condition (M = 9.28, SD = 4.56) than under the Soft-tone condition (M = 10.16, SD = 4.49). The main effects of auditory numerosity and hemisphere were nonsignificant, Fs < 1, ps > .41. Moreover, the interaction between auditory magnitude and numerosity was significant, F(1,18) = 4.64, p = .045, ηp² = .21. The auditory numerosity effect was marginally significant under the Soft-tone condition, F(1,18) = 4.17, p = .056, but not under the Loud-tone condition, F(1,18) < 1, p = .399: under the Soft-tone condition, the standard stimulus elicited a more positive peak amplitude under the Multiple-tone condition (M = 10.41, SD = 4.37) than under the One-tone condition (M = 9.90, SD = 4.67). Further analyses of this interaction showed that the auditory magnitude effect was marginally significant under the One-tone condition, F(1,18) = 4.08, p = .059, and significant under the Multiple-tone condition, F(1,18) = 19.23, p < .001; the effect size was larger for the Multiple-tone condition than for the One-tone condition. Under the One-tone condition, the standard stimulus elicited a more positive peak amplitude after a soft tone (M = 9.90, SD = 4.67) than after a loud tone (M = 9.41, SD = 4.66), Δ = 0.49; under the Multiple-tone condition, the standard stimulus elicited a more positive peak amplitude after a soft tone (M = 10.41, SD = 4.37) than after a loud tone (M = 9.15, SD = 4.55), Δ = 1.26. The results are shown in the upper panel of Figure 4b.

Correlation Analyses for Visual Standard Stimuli

For the significant effects found in the behavioural and ERP analyses, further correlation analyses were conducted. Pearson’s correlation coefficients were computed to assess the relationship between the differences in behavioural performance (JNDs) and the differences in ERP peak amplitudes between the Multiple-tone and One-tone conditions; this difference reflects the effect of auditory numerosity. Because a hemisphere effect was found in the analysis of the N1 peak amplitude, the analyses were conducted separately at electrodes P3 and P4. Influential or outlying data points were identified based on Cook's distance and centred leverage plots; one participant was an outlier for the difference in N1 peak amplitude (-1.37 μV) and another participant was an outlier for that of P2p (4.67 μV), so they were excluded from the correlation analyses. Scatterplots summarize the results (Figure 4a and Figure 4b, bottom panels). There were significant correlations between behavioural performance and the N1/P2p peak amplitudes, ps < .05. The difference in N1 peak amplitudes was negatively correlated with the difference in JNDs at electrode P3, r = -.49, p = .040, and the difference in P2p peak amplitudes was negatively correlated with the difference in JNDs at electrode P3, r = -.78, p < .001.
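The screening-then-correlation procedure can be sketched as follows. The difference scores below are simulated, Cook's distance is computed by hand for a simple linear regression, and the 4/n cut-off is a common rule of thumb rather than the authors' stated criterion:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-participant difference scores (Multiple-tone minus One-tone):
# ERP peak-amplitude differences at P3 (μV) and JND differences (simulated).
rng = np.random.default_rng(1)
amp_diff = rng.normal(0.5, 0.6, size=19)
jnd_diff = -0.02 * amp_diff + rng.normal(0, 0.005, size=19)

def cooks_distance(x, y):
    """Cook's distance for each point in a simple linear regression y ~ x."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    p = X.shape[1]
    mse = resid @ resid / (len(y) - p)
    h = np.sum(X * (X @ np.linalg.inv(X.T @ X)), axis=1)  # leverages (hat diagonal)
    return resid**2 / (p * mse) * h / (1 - h) ** 2

d = cooks_distance(amp_diff, jnd_diff)
keep = d < 4 / len(d)              # common rule-of-thumb cut-off for influence
r, pval = pearsonr(amp_diff[keep], jnd_diff[keep])
print(f"r = {r:.2f}, p = {pval:.3f}, excluded = {int((~keep).sum())}")
```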

Source Localization Results for Visual Standard Stimuli

The final analytic step localized the cortical regions responsible for the interaction observed in the P2p. To this end, we analysed the waveforms within the P2p time window (220-275 ms) for each participant and experimental condition (One-soft, Multiple-soft, One-loud, and Multiple-loud) with sLORETA. The voxel-based whole-brain sLORETA images (6239 voxels) were then compared with the contrast of (Multiple-One)soft – (Multiple-One)loud using the SnPM approach; the difference between the Multiple-tone and One-tone conditions reflects the effect of auditory numerosity. Activity within the N1 time window was also investigated with the same contrast to detect potential responses. As illustrated in Table 1 and Figure 5, stronger activity under the Soft-tone condition than under the Loud-tone condition was localized to the left inferior parietal lobule (IPL; peak MNI coordinates: X = -35, Y = -40, Z = 40; BA 40) in the P2p window. Within the N1 window, lower activity under the Soft-tone condition than under the Loud-tone condition was localized to the right middle frontal gyrus (MFG; peak MNI coordinates: X = 30, Y = 5/20, Z = 50/55; BA 6/8).

Table 1

Brain Regions in Which the sLORETA Source Current Density Represented the Interaction Effect of Auditory Numerosity and Magnitude With the Contrast of (Multiple-One)soft – (Multiple-One)loud

Brain Region     Time Window   Direction      MNI Coordinates (mm)      t
                                              x      y      z
Right MFG/BA6    N1            Soft < Loud    30     5      55       -5.86
Right MFG/BA6    N1            Soft < Loud    30     5      50       -5.03
Right MFG/BA8    N1            Soft < Loud    30     20     50       -4.90
Left IPL/BA40    P2p           Soft > Loud    -35    -40    40        4.74
Left IPL/BA40    P2p           Soft > Loud    -40    -45    45        4.72

Note. All results in the list are significant (p < .05, one-tailed, SnPM).

Figure 5

Results of sLORETA source localization.

Note. Top panel: Slice view of the source localization results within the time windows of the N1 (130-190 ms) and P2p (220-275 ms) waves for the standard stimulus-locked ERPs. Bottom panel: 3D visualization of the results of the source localization. Blue indicates that the difference between the Multiple-tone and One-tone conditions is larger for the Loud-tone condition than that for the Soft-tone condition. Red indicates that the difference between the Multiple-tone and One-tone conditions is larger for the Soft-tone condition than that for the Loud-tone condition. Colour bars represent the t value of the SnPM test.

P2p Waveform Elicited by Visual Comparison Stimuli (Electrodes: P3, P4; 220-275 ms)

A 3 (ratio: 20 vs. 15/27; 20 vs. 16/24; 20 vs. 18/22) × 2 (hemisphere: left vs. right) repeated measures ANOVA revealed a significant main effect of hemisphere, F(1,18) = 4.59, p = .046, ηp² = .20: the left hemisphere electrode (P3; M = 5.70, SD = 3.65) showed a higher peak amplitude than the right hemisphere electrode (P4; M = 4.96, SD = 3.20), t(18) = 2.14, p = .046. The main effect of ratio was nonsignificant, F(2,36) = 1.87, p = .169, ηp² = .09. However, the interaction between ratio and hemisphere was significant, F(2,36) = 4.28, p = .021, ηp² = .19; the effect of ratio was significant only at electrode P4, F(2,36) = 4.14, p = .024. We therefore analysed the linear trend of ratio at electrode P4. A polynomial contrast analysis indicated a linear relationship between ratio and the peak amplitude of P2p, p = .011; see Figure 6a (the zoomed-in part) and Figure 6c.
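A linear-trend polynomial contrast over three equally spaced conditions can be computed by weighting each participant's condition means with (-1, 0, 1) and testing the resulting contrast scores against zero. A sketch with simulated amplitudes (the offsets and noise are invented for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical P2p peak amplitudes (μV) at electrode P4 for 19 participants under
# the three ratio conditions, ordered from most distant to closest comparison.
rng = np.random.default_rng(7)
base = rng.normal(5.0, 3.0, size=19)                     # per-participant baseline
amps = (np.column_stack([base + 0.0, base + 0.5, base + 1.0])
        + rng.normal(0, 0.4, size=(19, 3)))              # amplitude rises with closeness

# Linear-trend contrast weights for three equally spaced levels
weights = np.array([-1, 0, 1])
scores = amps @ weights          # one contrast score per participant

# One-sample t-test of the contrast scores against zero
t, p = stats.ttest_1samp(scores, 0.0)
print(f"linear trend: t({len(scores)-1}) = {t:.2f}, p = {p:.4f}")
```

Because the weights sum to zero, the per-participant baseline cancels out, which is what makes the contrast sensitive to the trend alone.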

Figure 6

Results of comparison stimuli-elicited ERPs.

Note. a) ERP waveforms for electrodes P3 and P4. b) The corresponding topographic map for the 20 vs. 15/27 condition within the time window of the P2p (220-275 ms). c) Statistical results for the peak amplitude of P2p. Error bars indicate 1 standard error.

Discussion

The present study aimed to investigate whether numerosity and magnitude information presented in one sensory modality (audition) could influence non-symbolic numerosity representation in another modality (vision). Moreover, by adopting the ERP technique, the neural correlates of this cross-modal influence on numerosity representation were also investigated. Behavioural results demonstrated that auditory numerosity information (One-tone vs. Multiple-tone) affected the JNDs of responses to visual dot arrays, but auditory magnitude information (Soft-tone vs. Loud-tone) did not. However, the visual ERPs revealed that both auditory numerosity and magnitude information could influence the processing of non-symbolic numerosity information in the visual modality. The visual standard stimulus elicited a more negative N1 peak amplitude under the Loud-tone condition than under the Soft-tone condition, and under the One-tone condition than under the Multiple-tone condition. Moreover, there was an interaction between auditory numerosity and auditory magnitude for visual numerosity perception within the time window of the P2p. Finally, using source localization analysis, we localized the brain activity involved in this interaction to the right middle frontal gyrus (MFG) and the left inferior parietal lobule (IPL) within the time windows of the N1 and P2p, respectively.

Our behavioural results showed that JNDs under the Multiple-tone condition were smaller than those under the One-tone condition, indicating that participants represented the non-symbolic numerosity information in the visual modality more reliably when the auditory prime consisted of multiple tones rather than one tone. This finding is in line with the priming distance effect, according to which participants process a number more efficiently when the prime–target numerical distance is smaller (Bahrami et al., 2010; Dehaene et al., 1998). In other words, participants may have formed a more reliable representation of the numerosity of the standard stimuli under the Multiple-tone condition than under the One-tone condition. In addition, our behavioural results did not show a significant difference in PSE among the priming conditions, indicating that participants perceived the numerosity of the standard stimuli accurately regardless of priming condition.

In the present study, the interaction between auditory numerosity and magnitude was observed for the cross-modal standard stimuli on the P2p amplitude. This result is in line with previous unimodal studies, which suggest that the representation of numerosity is sensitive to continuous magnitudes (Leibovich & Henik, 2013; Leibovich et al., 2017). Under the Soft-tone condition, visual standard stimuli presented after multiple tones elicited larger P2p amplitudes than those presented after one tone; this numerosity effect was not observed under the Loud-tone condition. Previous studies showed that the P2p amplitude is closely related to the processing of numerosity (Dehaene, 1996; Hyde & Spelke, 2009; Libertus et al., 2007), but the precise function of the P2p in numerical cognition is still controversial. Some researchers have suggested that the P2p amplitude reflects the perception of numerosity (Park et al., 2015): a large number elicits a higher P2p amplitude than a small number. Others have suggested that the P2p amplitude might reflect the comparative distance effect (Hyde & Spelke, 2012; Libertus et al., 2007): when two numerosities are compared, the closer they are, the higher the P2p amplitude. In the present study, for the ERPs elicited by the visual standard stimulus, a larger P2p amplitude was observed under the Multiple-tone condition than under the One-tone condition. This result could be attributed to a larger numerosity being perceived in the Multiple-tone condition than in the One-tone condition. This processing was spontaneous, as no numerosity comparison between the auditory prime and the visual standard stimuli was required. The absence of such an effect under the Loud-tone condition suggests that the perception of the numerosity of visual stimuli might be jointly affected by the numerosity and magnitude of the auditory stimuli. Previous studies have proposed that the P2p amplitude may reflect the integration of numerosity and magnitude based on evidence from the visual modality (Gebuis & Reynvoet, 2012, 2013); this integration was also observed in the present cross-modal context.

The N1 has been reported for both symbolic and non-symbolic stimuli in previous studies of numerical cognition (Gebuis & Reynvoet, 2013; Hyde & Spelke, 2012; Soltész & Szűcs, 2014). The N1 wave is believed to reflect a sensory gain control mechanism, because attending to a visual area facilitates further perceptual processing of stimuli presented in that area (Luck, Woodman, & Vogel, 2000). We observed that a larger N1 amplitude was elicited by the visual standard stimulus under the One-tone condition than under the Multiple-tone condition, indicating that participants allocated more attentional resources to the dot array under the One-tone condition. By contrast, processing a large numerosity can consume more attentional resources than processing a small one (Pomè, Anobile, Cicchini, Scabia, & Burr, 2019). Moreover, a previous study suggested that the loudness of auditory stimuli can modulate participants' preparatory attention: loud auditory stimuli increase alerting levels, speed up processing, and decrease the perceptual threshold for subsequent visual stimuli (Petersen, Petersen, Bundesen, Vangkilde, & Habekost, 2017). The enhanced negative peak amplitude of the N1 under the Loud-tone condition in the present study could therefore be attributed to better preparatory attention elicited by the loud tone (van den Berg et al., 2016). This explanation was further supported by the source localization results for the N1 wave elicited by the visual standard stimulus: the right MFG is related to the allocation of attentional resources (Small et al., 2003), and more activity was observed in the right MFG under the Loud-tone condition than under the Soft-tone condition.

The correlation results revealed a covariant relationship between the differences in JNDs and the differences in the peak amplitudes of the ERPs to visual standard stimuli between the Multiple-tone and One-tone conditions, supporting the consistency and reliability of our results. Behaviourally, participants showed smaller JNDs for the standard stimuli under the Multiple-tone condition than under the One-tone condition. For the N1, the peak amplitude was independently related to the numerosity and the magnitude of the auditory prime stimuli. Because the numerosity effect on JNDs and on the N1 was not influenced by the magnitude of the auditory prime, we collapsed the JND differences and the N1 peak amplitudes across the Soft and Loud conditions. A negative correlation between the JND difference and the N1 peak-amplitude difference was observed; that is, participants who showed a less negative N1 peak amplitude under the Multiple-tone condition than under the One-tone condition also showed a smaller JND under the Multiple-tone condition. For the P2p, an interaction between auditory numerosity and magnitude was found, and the difference in the P2p peak amplitude between the Multiple-tone and One-tone conditions was significant only under the Soft-tone condition. Therefore, we applied the correlation analysis for the P2p only under the Soft-tone condition, where a more positive P2p peak amplitude was found under the Multiple-tone condition than under the One-tone condition. A negative correlation between the JND difference and the P2p peak-amplitude difference was observed; that is, participants who showed a more positive P2p peak amplitude under the Multiple-tone condition also showed a smaller JND under the Multiple-tone condition. The correlation analyses of the N1 and P2p consistently showed that individuals with pronounced condition differences in the group-level ERP analyses (a less negative N1 or a more positive P2p) also demonstrated pronounced condition differences in the group-level behavioural analyses (a smaller JND).

In the analysis of the P2p amplitude for the visual comparison stimulus, we observed that the P2p amplitude varied as a function of ratio (see Figure 6): the smaller the ratio, the higher the P2p amplitude. The ratio represents the distance between the two numbers; a ratio close to one means that the numbers are close. Therefore, the increase in P2p amplitude with decreasing ratio may be due to the change in the distance between the two numbers, i.e., the distance effect (Moyer & Landauer, 1967): when two numerosities are compared, the closer the numbers, the higher the P2p amplitude.

A recent meta-analysis used the activation likelihood estimation (ALE) method to analyse nearly one hundred neuroimaging studies of numerical and non-numerical magnitude processing (Sokolowski et al., 2017). Its results indicated that the processing of symbolic, non-symbolic, and non-numerical magnitudes was related to overlapping activation in the frontal and parietal lobes. This coexistence of overlapping and distinct brain activations across symbolic, non-symbolic, and non-numerical magnitudes suggests that number cognition may rely on both a generalized brain magnitude system and specialized brain regions for representing numerical magnitudes. Despite the relatively poor spatial resolution of source localization techniques, our source localization results suggest that different brain areas, such as the left inferior parietal lobule and the right middle frontal gyrus, were involved in different stages of visual numerosity processing when the perception of visual numerosity was cross-modally influenced by the preceding auditory numerosity.

In summary, the present study demonstrated that visual non-symbolic numerosity perception can be affected by auditory numerosity information. Most importantly, our results revealed an interaction between auditory numerosity and magnitude within the P2p time window for cross-modal standard stimuli (visual P2p), which was further localized to the left inferior parietal lobule (IPL). The neural mechanisms underlying such cross-modal influences on numerosity processing remain to be investigated.

Funding

The work was funded by a grant from Natural Science Foundation of China (31470978) to Dr. Zhenzhu Yue.

Competing Interests

The authors have declared that no competing interests exist.

Acknowledgments

The authors have no support to report.

Ethics Approval

This study was approved by the Ethics Committee of the Department of Psychology at Sun Yat-sen University.

Data Availability

The current study is part of a larger research project that has not yet been completed, so the data cannot be made publicly available at this time. Please contact the authors with any questions.

References

  • Alards-Tomalin, D., Walker, A. C., Shaw, J. D., & Leboe-McGowan, L. C. (2015). Is 9 louder than 1? Audiovisual cross-modal interactions between number magnitude and judged sound loudness. Acta Psychologica, 160, 95-103. https://doi.org/10.1016/j.actpsy.2015.07.004

  • Anobile, G., Guerrini, G., Burr, D. C., Monti, M., Del Lucchese, B., & Cicchini, G. M. (2019). Spontaneous perception of numerosity in pre-school children. Proceedings of the Royal Society B: Biological Sciences, 286, Article 20191245. https://doi.org/10.1098/rspb.2019.1245

  • Ansari, D. (2008). Effects of development and enculturation on number representation in the brain. Nature Reviews Neuroscience, 9, 278-291. https://doi.org/10.1038/nrn2334

  • Bahrami, B., Vetter, P., Spolaore, E., Pagano, S., Butterworth, B., & Rees, G. (2010). Unconscious numerical priming despite interocular suppression. Psychological Science, 21, 224-233. https://doi.org/10.1177/0956797609360664

  • Barth, H., Kanwisher, N., & Spelke, E. (2003). The construction of large number representations in adults. Cognition, 86, 201-221. https://doi.org/10.1016/S0010-0277(02)00178-6

  • Barth, H., La Mont, K., Lipton, J., & Spelke, E. S. (2005). Abstract number and arithmetic in preschool children. Proceedings of the National Academy of Sciences of the United States of America, 102, 14116-14121. https://doi.org/10.1073/pnas.0505512102

  • Brannon, E. M. (2006). The representation of numerical magnitude. Current Opinion in Neurobiology, 16, 222-229. https://doi.org/10.1016/j.conb.2006.03.002

  • Cantlon, J. F., & Brannon, E. M. (2006). Shared system for ordering small and large numbers in monkeys and humans. Psychological Science, 17, 401-406. https://doi.org/10.1111/j.1467-9280.2006.01719.x

  • Cicchini, G. M., Anobile, G., & Burr, D. C. (2016). Spontaneous perception of numerosity in humans. Nature Communications, 7, Article 12536. https://doi.org/10.1038/ncomms12536

  • Cicchini, G. M., Anobile, G., & Burr, D. C. (2019). Spontaneous representation of numerosity in typical and dyscalculic development. Cortex, 114, 151-163. https://doi.org/10.1016/j.cortex.2018.11.019

  • Cordes, S., Gelman, R., Gallistel, C. R., & Whalen, J. (2001). Variability signatures distinguish verbal from nonverbal counting for both large and small numbers. Psychonomic Bulletin & Review, 8, 698-707. https://doi.org/10.3758/BF03196206

  • Dehaene, S. (1996). The organization of brain activations in number comparison: Event-related potentials and the additive-factors method. Journal of Cognitive Neuroscience, 8, 47-68. https://doi.org/10.1162/jocn.1996.8.1.47

  • Dehaene, S., & Brannon, E. (2011). Space, time and number in the brain: Searching for the foundations of mathematical thought. London, United Kingdom: Academic Press.

  • Dehaene, S., Izard, V., Spelke, E., & Pica, P. (2008). Log or linear? Distinct intuitions of the number scale in Western and Amazonian indigene cultures. Science, 320, 1217-1220. https://doi.org/10.1126/science.1156540

  • Dehaene, S., Naccache, L., Le Clec’H, G., Koechlin, E., Mueller, M., Dehaene-Lambertz, G., . . . Le Bihan, D. (1998). Imaging unconscious semantic priming. Nature, 395, 597-600. https://doi.org/10.1038/26967

  • Dehaene, S., Piazza, M., Pinel, P., & Cohen, L. (2003). Three parietal circuits for number processing. Cognitive Neuropsychology, 20, 487-506. https://doi.org/10.1080/02643290244000239

  • Delorme, A., & Makeig, S. (2004). EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of Neuroscience Methods, 134, 9-21. https://doi.org/10.1016/j.jneumeth.2003.10.009

  • Feigenson, L., Dehaene, S., & Spelke, E. (2004). Core systems of number. Trends in Cognitive Sciences, 8, 307-314. https://doi.org/10.1016/j.tics.2004.05.002

  • Fias, W., Lammertyn, J., Reynvoet, B., Dupont, P., & Orban, G. A. (2003). Parietal representation of symbolic and nonsymbolic magnitude. Journal of Cognitive Neuroscience, 15, 47-56. https://doi.org/10.1162/089892903321107819

  • Fornaciai, M., Brannon, E. M., Woldorff, M. G., & Park, J. (2017). Numerosity processing in early visual cortex. NeuroImage, 157, 429-438. https://doi.org/10.1016/j.neuroimage.2017.05.069

  • Fornaciai, M., & Park, J. (2018). Early numerosity encoding in visual cortex is not sufficient for the representation of numerical magnitude. Journal of Cognitive Neuroscience, 30, 1788-1802. https://doi.org/10.1162/jocn_a_01320

  • Gallistel, C. R., & Gelman, R. (2000). Non-verbal numerical cognition: From reals to integers. Trends in Cognitive Sciences, 4, 59-65. https://doi.org/10.1016/S1364-6613(99)01424-2

  • Gebuis, T., & Reynvoet, B. (2011). Generating nonsymbolic number stimuli. Behavior Research Methods, 43, 981-986. https://doi.org/10.3758/s13428-011-0097-5

  • Gebuis, T., & Reynvoet, B. (2012). The interplay between nonsymbolic number and its continuous visual properties. Journal of Experimental Psychology: General, 141, 642-648. https://doi.org/10.1037/a0026218

  • Gebuis, T., & Reynvoet, B. (2013). The neural mechanisms underlying passive and active processing of numerosity. NeuroImage, 70, 301-307. https://doi.org/10.1016/j.neuroimage.2012.12.048

  • Heinemann, A., Pfister, R., & Janczyk, M. (2013). Manipulating number generation: Loud + long = large? Consciousness and Cognition, 22, 1332-1339. https://doi.org/10.1016/j.concog.2013.08.014

  • Hinojosa, J. A., Mercado, F., Albert, J., Barjola, P., Peláez, I., Villalba-García, C., & Carretié, L. (2015). Neural correlates of an early attentional capture by positive distractor words. Frontiers in Psychology, 6, Article 24. https://doi.org/10.3389/fpsyg.2015.00024

  • Hyde, D. C., & Spelke, E. S. (2009). All numbers are not equal: An electrophysiological investigation of small and large number representations. Journal of Cognitive Neuroscience, 21, 1039-1053. https://doi.org/10.1162/jocn.2009.21090

  • Hyde, D. C., & Spelke, E. S. (2012). Spatiotemporal dynamics of processing nonsymbolic number: An event-related potential source localization study. Human Brain Mapping, 33, 2189-2203. https://doi.org/10.1002/hbm.21352

  • JASP Team. (2018). JASP (Version 0.9.2) [Computer software].

  • Jung, T.-P., Makeig, S., Westerfield, M., Townsend, J., Courchesne, E., & Sejnowski, T. J. (2000). Removal of eye activity artifacts from visual event-related potentials in normal and clinical subjects. Clinical Neurophysiology, 111, 1745-1758. https://doi.org/10.1016/S1388-2457(00)00386-2

  • Leibovich, T., & Henik, A. (2013). Magnitude processing in non-symbolic stimuli. Frontiers in Psychology, 4, Article 375. https://doi.org/10.3389/fpsyg.2013.00375

  • Leibovich, T., Katzin, N., Harel, M., & Henik, A. (2017). From “sense of number” to “sense of magnitude”: The role of continuous magnitudes in numerical cognition. Behavioral and Brain Sciences, 40, Article e164. https://doi.org/10.1017/S0140525X16000960

  • Libertus, M. E., Woldorff, M. G., & Brannon, E. M. (2007). Electrophysiological evidence for notation independence in numerical processing. Behavioral and Brain Functions, 3, Article 1. https://doi.org/10.1186/1744-9081-3-1

  • Liu, W., Zhang, Z.-J., & Zhao, Y.-J. (2013). Numerosity adaptation effect on the basis of perceived numerosity. Acta Psychologica Sinica, 44, 1297-1308. https://doi.org/10.3724/SP.J.1041.2012.01297

  • Luck, S. J., Woodman, G. F., & Vogel, E. K. (2000). Event-related potential studies of attention. Trends in Cognitive Sciences, 4, 432-440. https://doi.org/10.1016/S1364-6613(00)01545-X

  • Moyer, R. S., & Landauer, T. K. (1967). Time required for judgements of numerical inequality. Nature, 215, 1519-1520. https://doi.org/10.1038/2151519a0

  • Naparstek, S., & Henik, A. (2010). Count me in! On the automaticity of numerosity processing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 36, 1053-1059. https://doi.org/10.1037/a0019766

  • Nichols, T. E., & Holmes, A. P. (2002). Nonparametric permutation tests for functional neuroimaging: A primer with examples. Human Brain Mapping, 15, 1-25. https://doi.org/10.1002/hbm.1058

  • Nieder, A. (2005). Counting on neurons: The neurobiology of numerical competence. Nature Reviews Neuroscience, 6, 177-190. https://doi.org/10.1038/nrn1626

  • Nieder, A. (2018). Evolution of cognitive and neural solutions enabling numerosity judgements: Lessons from primates and corvids. Philosophical Transactions of the Royal Society of London: Series B. Biological Sciences, 373, Article 20160514. https://doi.org/10.1098/rstb.2016.0514

  • Park, J., DeWind, N. K., Woldorff, M. G., & Brannon, E. M. (2015). Rapid and direct encoding of numerosity in the visual stream. Cerebral Cortex, 26, 748-763. https://doi.org/10.1093/cercor/bhv017

  • Pascual-Marqui, R. D. (2002). Standardized low-resolution brain electromagnetic tomography (sLORETA): Technical details. Methods and Findings in Experimental and Clinical Pharmacology, 24(Suppl D), 5-12.

  • Petersen, A., Petersen, A. H., Bundesen, C., Vangkilde, S., & Habekost, T. (2017). The effect of phasic auditory alerting on visual perception. Cognition, 165, 73-81. https://doi.org/10.1016/j.cognition.2017.04.004

  • Pinel, P., Piazza, M., Le Bihan, D., & Dehaene, S. (2004). Distributed and overlapping cerebral representations of number, size, and luminance during comparative judgments. Neuron, 41, 983-993. https://doi.org/10.1016/S0896-6273(04)00107-2

  • Pomè, A., Anobile, G., Cicchini, G. M., Scabia, A., & Burr, D. C. (2019). Higher attentional costs for numerosity estimation at high densities. Attention, Perception & Psychophysics, 81, 2604-2611. https://doi.org/10.3758/s13414-019-01831-3

  • Psychology Software Tools, Inc. (2012). E-Prime v2.0 [Computer software]. Pittsburgh, PA, USA: Psychology Software Tools, Inc.

  • Regenbogen, C., Seubert, J., Johansson, E., Finkelmeyer, A., Andersson, P., & Lundström, J. N. (2018). The intraparietal sulcus governs multisensory integration of audiovisual information based on task difficulty. Human Brain Mapping, 39, 1313-1326. https://doi.org/10.1002/hbm.23918

  • Rugani, R., Castiello, U., Priftis, K., Spoto, A., & Sartori, L. (2017). What is a number? The interplay between number and continuous magnitudes. Behavioral and Brain Sciences, 40, Article e187. https://doi.org/10.1017/S0140525X16002259

  • Schütt, H. H., Harmeling, S., Macke, J. H., & Wichmann, F. A. (2016). Painfree and accurate Bayesian estimation of psychometric functions for (potentially) overdispersed data. Vision Research, 122, 105-123. https://doi.org/10.1016/j.visres.2016.02.002

  • Small, D. M., Gitelman, D. R., Gregory, M. D., Nobre, A. C., Parrish, T. B., & Mesulam, M.-M. (2003). The posterior cingulate and medial prefrontal cortex mediate the anticipatory allocation of spatial attention. NeuroImage, 18, 633-641. https://doi.org/10.1016/S1053-8119(02)00012-5

  • Sokolowski, H. M., Fias, W., Bosah Ononye, C., & Ansari, D. (2017). Are numbers grounded in a general magnitude processing system? A functional neuroimaging meta-analysis. Neuropsychologia, 105, 50-69. https://doi.org/10.1016/j.neuropsychologia.2017.01.019

  • Soltész, F., & Szűcs, D. (2014). Neural adaptation to non-symbolic number and visual shape: An electrophysiological study. Biological Psychology, 103, 203-211. https://doi.org/10.1016/j.biopsycho.2014.09.006

  • Tokita, M., Ashitani, Y., & Ishiguchi, A. (2013). Is approximate numerical judgment truly modality-independent? Visual, auditory, and cross-modal comparisons. Attention, Perception & Psychophysics, 75, 1852-1861. https://doi.org/10.3758/s13414-013-0526-x

  • van den Berg, B., Appelbaum, L. G., Clark, K., Lorist, M. M., & Woldorff, M. G. (2016). Visual search performance is predicted by both prestimulus and poststimulus electrical brain activity. Scientific Reports, 6, Article 37718. https://doi.org/10.1038/srep37718

  • Whalen, J., Gallistel, C. R., & Gelman, R. (1999). Nonverbal counting in humans: The psychophysics of number representation. Psychological Science, 10, 130-137. https://doi.org/10.1111/1467-9280.00120

  • World Medical Association. (2013). World Medical Association Declaration of Helsinki: Ethical principles for medical research involving human subjects. Journal of the American Medical Association, 310, 2191-2194. https://doi.org/10.1001/jama.2013.281053



Copyright (c) 2020 Zhang et al.