Module 0: Foundations of Sensory Biology

Before we traverse the spectral atlases of vision, hearing, touch, electroreception, magnetoreception, chemoreception and thermoreception, we first lay down the universal machinery that every sense must obey: information theory, signal detection theory, receptor biophysics, neural coding, energetic constraints, and evolutionary origins. These scaffolds are what let us compare the eye of a mantis shrimp to the lateral line of a trout or the pit organ of a rattlesnake on a single common axis: bits per joule per second.

1. Information Theory of Sensing

Claude Shannon’s A Mathematical Theory of Communication (1948) gave us a universal currency for sensing: the bit. Any sensory neuron, regardless of whether it transduces photons, pressure waves, chemical gradients or magnetic fields, is at heart a noisy analog channel carrying information from the external world to the nervous system.

Shannon Channel Capacity

For an additive white Gaussian noise (AWGN) channel with bandwidth \(B\) Hz and signal-to-noise ratio \(S/N\) (linear power ratio), the maximum information rate achievable with arbitrarily small error is:

\[ C \;=\; B \, \log_{2}\!\left(1 + \frac{S}{N}\right) \quad \text{[bits/s]} \]

Shannon (1948) capacity theorem for a band-limited AWGN channel

For sensory neurons, \(B\) is the bandwidth of the transduction cascade (often set by the slowest biochemical step) and \(S/N\) is the ratio of stimulus-driven variance to intrinsic noise variance. A blowfly photoreceptor in bright light achieves \(B \approx 100\) Hz and \(S/N \approx 100\), giving a capacity \(C \approx 700\) bits/s (de Ruyter van Steveninck & Laughlin, 1996). A typical vertebrate rod operating near threshold delivers only \(\sim 3\) bits/s.
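These numbers can be checked directly from the capacity formula (a minimal sketch in Python, the language of the simulations below):

```python
import math

def channel_capacity(bandwidth_hz, snr):
    """Shannon capacity C = B * log2(1 + S/N) of an AWGN channel, in bits/s."""
    return bandwidth_hz * math.log2(1 + snr)

# Blowfly photoreceptor in bright light: B ~ 100 Hz, S/N ~ 100
c_fly = channel_capacity(100, 100)   # ~666 bits/s, of the order of 700 bits/s
```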

Mutual Information & Spike Trains

For non-Gaussian stimuli and spiking outputs we use the mutual information \(I(S;R)\) between stimulus \(S\) and response \(R\):

\[ I(S;R) \;=\; H(R) - H(R \mid S) \;=\; \sum_{s,r} p(s,r) \log_2 \frac{p(s,r)}{p(s)p(r)} \]

Brenner et al. (2000) and Strong et al. (1998) introduced the direct method that estimates \(I\) from binned spike-word distributions, sidestepping any Gaussian assumption and giving a model-free lower bound on what the animal actually extracts.
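The core of the idea is a plug-in estimate of \(I(S;R)\) from a joint histogram of stimulus/response words. The sketch below is the naive estimator only; the published direct method additionally corrects for finite-sample bias:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Naive plug-in estimate of I(S;R) in bits from (stimulus, response-word)
    samples; no Gaussian assumption is made."""
    n = len(pairs)
    p_sr = Counter(pairs)
    p_s = Counter(s for s, _ in pairs)
    p_r = Counter(r for _, r in pairs)
    return sum((c / n) * math.log2((c / n) / ((p_s[s] / n) * (p_r[r] / n)))
               for (s, r), c in p_sr.items())

# A perfectly informative binary channel carries exactly 1 bit per symbol
samples = [(0, 0), (1, 1)] * 500
# mutual_information(samples) -> 1.0
```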

Energy Cost Per Bit

Niven & Laughlin (2008) compiled energy budgets across taxa and found a striking scaling: invertebrate photoreceptors cost as little as \(10^{-14}\) J/bit, whereas vertebrate cortical neurons approach \(10^{-12}\) J/bit. The human brain as a whole burns \(\sim 20\) W for an estimated \(10^{10}\) bits/s throughput (Attwell & Laughlin, 2001). Thus the fundamental economic question for any sense is not “how much signal?” but bits per joule.
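The whole-brain figure implies a cost per bit that is easy to recompute; note that it sits well above the per-channel costs, since most brain energy goes to computation and maintenance rather than transmission:

```python
# Whole-brain cost per bit from the figures quoted in the text
brain_power_w = 20.0        # Attwell & Laughlin (2001)
throughput_bps = 1e10       # estimated whole-brain information throughput
cost_per_bit = brain_power_w / throughput_bps   # 2e-9 J/bit
```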

The Sensory Channel as a Noisy Communication Link

[Figure: The sensory channel as a noisy communication link. World stimulus S(t) → receptor/transducer g(S) → encoder spikes r(t) → decoder Ŝ(t), with noise n₁(t) and n₂(t) injected at the transduction and encoding stages. Capacity C = B log₂(1 + S/N) bits/s is bounded by transduction bandwidth and intrinsic photon/channel noise; metabolic cost ≈ 10⁻¹⁴–10⁻¹² J/bit (Niven & Laughlin, 2008).]

2. Signal Detection Theory

A monkey pressing a lever because it saw a dim flash, a mosquito swerving toward a CO₂ plume, a bat classifying an echo as “moth” or “raindrop”—all of these are binary detection problems embedded in noise. Green & Swets (Signal Detection Theory and Psychophysics, 1966) gave us the mathematical apparatus.

The d′ Discrimination Index

Suppose the decision variable \(x\) is Gaussian under two hypotheses “signal+noise” \(\mathcal{N}(\mu_S, \sigma^2)\) and “noise alone” \(\mathcal{N}(\mu_N, \sigma^2)\). The sensitivity index is:

\[ d' \;=\; \frac{\mu_S - \mu_N}{\sigma} \;=\; \Phi^{-1}(\text{Hit rate}) - \Phi^{-1}(\text{False-alarm rate}) \]

The receiver operating characteristic (ROC) plots hit rate against false-alarm rate as the decision criterion slides. The area under the ROC equals the probability of correctly ordering a random signal trial above a random noise trial, and for equal-variance Gaussians it relates to \(d'\) through \(\text{AUC} = \Phi(d'/\sqrt{2})\).
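Both identities are two-liners given a standard-normal CDF; Python's `statistics.NormalDist` supplies \(\Phi\) and \(\Phi^{-1}\):

```python
import math
from statistics import NormalDist

Phi = NormalDist().cdf           # standard-normal CDF
Phi_inv = NormalDist().inv_cdf   # probit (inverse CDF)

def d_prime(hit_rate, fa_rate):
    """Sensitivity index under the equal-variance Gaussian model."""
    return Phi_inv(hit_rate) - Phi_inv(fa_rate)

def roc_auc(dp):
    """Area under the ROC for equal-variance Gaussians: AUC = Phi(d'/sqrt(2))."""
    return Phi(dp / math.sqrt(2))

d_prime(0.84, 0.16)   # ~1.99: hit/FA rates symmetric about the midpoint
roc_auc(0.0)          # 0.5: a blind observer performs at chance
```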

Absolute Threshold & the Hecht-Shlaer-Pirenne Experiment

Hecht, Shlaer & Pirenne (1942) showed that a dark-adapted human observer can report a flash delivering as few as 5–14 photons to the cornea, implying that a single photon absorbed by a single rhodopsin is enough to trigger a detectable rod response. The Poisson statistics of photon arrivals (variance = mean) set the irreducible noise floor, and the threshold corresponds to a criterion \(d' \approx 1\).
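The frequency-of-seeing curve follows directly from Poisson statistics: the observer reports “seen” when at least \(k\) photons are absorbed. A sketch (the original analysis also fits the quantum efficiency of absorption, which is omitted here):

```python
import math

def p_seeing(mean_absorbed, k):
    """P(at least k photon absorptions) for Poisson arrivals — the
    Hecht-Shlaer-Pirenne frequency-of-seeing model."""
    p_below_k = sum(math.exp(-mean_absorbed) * mean_absorbed ** i / math.factorial(i)
                    for i in range(k))
    return 1.0 - p_below_k

# With ~6 photons absorbed on average and a criterion of k = 6,
# the flash is seen on roughly 55% of trials
p_seeing(6.0, 6)
```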

Weber-Fechner Law

Ernst Weber (1834) observed that the just-noticeable difference \(\Delta I\) is proportional to the background intensity:

\[ \frac{\Delta I}{I} \;=\; k_W \;=\; \text{const (modality-specific)} \]

Weber fractions: brightness \(\sim 1.6\%\), loudness \(\sim 10\%\), lifted weight \(\sim 2\%\), salty taste \(\sim 8\%\)

Integrating \(dS = k\,dI/I\) gives the Fechner logarithmic law \(S = k\ln(I/I_0)\). Logarithmic compression is optimal for stimuli whose natural distribution is log-uniform over several decades, which is exactly the situation for sound pressure, light intensity and chemical concentrations.

Stevens’ Power Law

S. S. Stevens (1957, 1961) showed that magnitude estimation—where subjects assign numerical labels to sensation—follows a power law rather than a logarithmic one:

\[ S \;=\; k \, I^{\, n} \]

Exponents: brightness (5° target) \(n \approx 0.33\), loudness (3 kHz) \(n \approx 0.67\), weight \(n \approx 1.45\), electric shock \(n \approx 3.5\)

Exponents less than 1 produce compression (vision, hearing, taste) while exponents greater than 1 produce expansion (electric shock, warmth on the skin). On a log-log plot Stevens’ law is a straight line, whereas Fechner’s logarithm is a curve—a direct empirical discriminator.

ROC Curve & Gaussian Decision Distributions

[Figure: Left — Gaussian decision-variable distributions for “noise alone” and “signal + noise”, separated by d′ = (μ_S − μ_N)/σ, with the criterion c between them. Right — ROC curves (hit rate vs. false-alarm rate) for d′ = 0, 1, 2, 3; AUC increases with sensitivity.]

3. Receptor Architecture

All biological sensors convert an external stimulus into a change in a membrane ion current. The molecular hardware falls into three great families.

GPCR-Based Transduction

G-protein-coupled receptors (GPCRs) form seven-transmembrane α-helical bundles. Photon absorption isomerises retinal in rhodopsin, activating transducin, which activates a phosphodiesterase that hydrolyses cGMP, which closes a CNG channel. A single photon triggers an amplification cascade of \(\sim 10^{5}\) messengers, yielding a reliable single-photon rod response (Baylor, Lamb & Yau, 1979). Vertebrate odorant receptors (~400 functional in humans, ~1100 in mice; Buck & Axel, 1991) and taste receptors T1R and T2R use the same GPCR framework with different G-protein partners (gustducin for bitter).

Ionotropic Transduction

Here the receptor is the channel. The ENaC/degenerin salty-taste receptor is directly gated by Na⁺, and insect olfactory receptors (ORs/ORco) form heterotetrameric ligand-gated cation channels (Sato et al., 2008) — a completely independent evolutionary invention from the vertebrate GPCR olfactory system. The speed advantage (no second messenger cascade) underlies why insect olfaction has kilohertz bandwidth.

Mechanotransduction

Force-gated channels open within microseconds of membrane deformation. Three families dominate:

  • Piezo1/2 — trimeric propeller-shaped channels responsible for touch, proprioception and blood-flow sensing (Coste et al., 2010; Nobel Prize 2021).
  • TMC1/TMC2 — the auditory mechanoelectrical transduction (MET) channel in hair-cell stereocilia (Pan et al., 2013; Kawashima et al., 2011); gated by tip-link tension.
  • TRP channels — polymodal thermo/chemo/mechano sensors: TRPV1 (hot, capsaicin), TRPM8 (cold, menthol), TRPA1 (noxious chemicals, pit-organ IR in pit vipers).
| Receptor | Mechanism | Stimulus | Latency |
|---|---|---|---|
| Rhodopsin | GPCR cascade | photon (380–700 nm) | ~20 ms |
| OR/OR51E2 | GPCR + cAMP | odorants | ~100 ms |
| ORco (insect) | ionotropic | odorants | ~5 ms |
| T1R/T2R | GPCR + PLC-IP₃ | sweet, bitter, umami | ~200 ms |
| ENaC | ionotropic | Na⁺ (salt) | ~1 ms |
| Piezo2 | mechano | touch, proprioception | <1 ms |
| TMC1/2 | mechano (tip-link) | hair-bundle deflection | ~10 μs |
| TRPV1 | thermo/chemo | >42 °C, capsaicin | ~100 ms |
| TRPA1 (snake pit) | thermo | IR radiant heat | ~10 ms |

Dynamic Range & Adaptation

Receptors span an enormous input range by two tricks. First, logarithmic amplification—more molecules or more conductance per unit stimulus—maps multiplicative inputs to additive outputs. Second, adaptation: the instantaneous sensitivity is continuously adjusted to subtract the current mean, so the receptor always operates near its steepest gain. In photoreceptors this is calcium feedback on cGMP synthesis (Pugh & Lamb, 1990). In hair cells it is the myosin slipping-clamp that resets the tension on the tip link. Laughlin (1981) showed that the histogram of natural light intensities matches the gain curve of the blowfly’s Large Monopolar Cells—a direct demonstration of efficient coding.
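Laughlin's matching result has a compact computational form: the information-maximising contrast-response curve is the cumulative distribution of natural inputs (histogram equalisation). A sketch with a toy intensity sample:

```python
import bisect

def optimal_transfer(natural_intensities):
    """Efficient-coding transfer function (Laughlin, 1981): map each input to
    its rank in the natural-intensity distribution, so that every output
    level is used equally often (histogram equalisation)."""
    ranked = sorted(natural_intensities)
    n = len(ranked)
    def g(x):
        return bisect.bisect_right(ranked, x) / n   # empirical CDF at x
    return g

g = optimal_transfer(range(100))   # uniform toy 'natural' intensities
g(49)   # 0.5 — the median input lands mid-range in the output
```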

4. Sensory Neuron Coding

Given that a receptor fires spikes, how does the animal “read” that activity? Two canonical schemes dominate, and most real senses combine them.

Labeled-Line Coding

One receptor = one quality. Mammalian bitter-taste T2Rs signal “poison” regardless of the specific ligand, because all T2R-expressing cells project to the same downstream circuit (Mueller et al., 2005). Similarly, nociceptive C-fibres expressing TRPV1 encode “pain” as a modality, not a specific noxious chemical. Labeled lines simplify decoding but burn metabolic energy on many redundant pathways.

Population / Combinatorial Coding

The most economical way to encode a high-dimensional stimulus is to have a small number of broadly-tuned receptors whose relative activity forms a vector. Human color vision uses just three cone types (S, M, L), and the perceived colour is the three-component ratio. Olfaction uses combinatorial coding on an industrial scale: each of the \(\sim 400\) human ORs responds to many odorants, each odorant activates many ORs, and the identity of an odorant is the pattern across all 400 (Malnic et al., 1999).

\[ \vec{r} \;=\; \big(r_1, r_2, \dots, r_N\big) \quad\Longrightarrow\quad \text{stimulus identity} = \arg\max_s \; p(s \mid \vec{r}) \]
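With a flat prior and isotropic Gaussian receptor noise, the \(\arg\max\) decoder reduces to nearest-template matching. A sketch with hypothetical receptor tuning vectors and odorant names (illustrative only, not measured values):

```python
# Hypothetical 4-receptor response templates for three odorants
templates = {
    "limonene": (0.9, 0.2, 0.1, 0.4),
    "octanol":  (0.1, 0.8, 0.7, 0.0),
    "geraniol": (0.3, 0.3, 0.9, 0.6),
}

def decode(r):
    """MAP decoding with a flat prior and isotropic Gaussian noise:
    argmax_s p(s | r) reduces to choosing the nearest template."""
    def neg_log_lik(s):
        return sum((ri - ti) ** 2 for ri, ti in zip(r, templates[s]))
    return min(templates, key=neg_log_lik)

decode((0.8, 0.25, 0.15, 0.35))   # -> "limonene"
```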

Rate vs. Temporal Codes

In the simplest picture the spike count in a time window is the message. But many sensory systems carry information in the precise timing of spikes: in the auditory nerve, interspike intervals lock to stimulus phase up to \(\sim 5\) kHz (volley principle). Owls use microsecond-scale interaural timing to localise prey (Konishi, 1993). Electric fish discriminate waveform distortions at sub-millisecond precision (Carr & Maler, 1986). Theunissen & Miller (1995) introduced a formal test: if shuffling spike times (preserving the rate) significantly decreases \(I(S;R)\), the code is temporal.
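The shuffle test can be sketched on a toy code in which one spike per trial arrives early for one stimulus and late for the other, so the count carries nothing and the timing carries everything (a naive plug-in information estimate is defined inline to keep the sketch self-contained):

```python
import math
import random
from collections import Counter

def plugin_mi(pairs):
    """Naive plug-in estimate of I(S;R) in bits from (stimulus, word) samples."""
    n = len(pairs)
    p_sr = Counter(pairs)
    p_s = Counter(s for s, _ in pairs)
    p_r = Counter(r for _, r in pairs)
    return sum((c / n) * math.log2((c / n) / ((p_s[s] / n) * (p_r[r] / n)))
               for (s, r), c in p_sr.items())

random.seed(0)
# Timing code: one spike per 2-bin trial — early for s = 0, late for s = 1
data = [(0, (1, 0)) for _ in range(500)] + [(1, (0, 1)) for _ in range(500)]
# Shuffling spike times within each trial preserves the rate (one spike)
shuffled = [(s, tuple(random.sample(r, len(r)))) for s, r in data]

plugin_mi(data)       # 1.0 bit — the timing carries the stimulus identity
plugin_mi(shuffled)   # ~0 bits — the rate alone carries nothing: a temporal code
```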

Sparse & Efficient Codes

Olshausen & Field (1996) showed that natural-image statistics force a dictionary of edge-like “Gabor” features when neural activity is constrained to be sparse. Sparse codes save energy (few spikes per stimulus) and are easier for downstream circuits to read out. In the mushroom-body Kenyon cells of insects, fewer than 5% of neurons fire to any single odour—a hallmark of sparse coding (Perez-Orive et al., 2002).

5. Energy Cost of Sensing

The brain is a voracious consumer. Attwell & Laughlin (2001) computed a bottom-up budget for grey matter and found that \(\sim 50\%\) of cortical ATP is spent restoring ionic gradients after spikes and synaptic currents—with postsynaptic glutamate receptors alone accounting for a third.

ATP Per Photon

Howard (1987) and Laughlin & de Ruyter van Steveninck (1996) measured phototransduction cost in the fly. Each absorbed photon triggers a transient depolarising current that must be recovered by the Na⁺/K⁺-ATPase; roughly \(\sim 10^6\) ATP molecules are hydrolysed per photon at bright light. For a photoreceptor carrying \(\sim 400\) bits/s this works out to \(\sim 3 \times 10^{-14}\) J per bit—the most efficient known sensory channel.

Why Vertebrates Pay More

Vertebrate photoreceptors are inverted (light must pass through neurons before hitting the outer segment), myelinated axons cost more per millimeter than unmyelinated invertebrate fibres, and active dendritic integration adds cost. The penalty: cortical spikes carry \(\sim 10^{-12}\) J/bit, two orders of magnitude worse than fly photoreceptors. But the gain in representational power and plasticity is what makes mammalian cognition possible.

Balasubramanian’s Framework

Balasubramanian et al. (2001) framed sensing as a rate-distortion-energy trade-off: minimise metabolic cost \(E\) subject to a bit rate \(R\) and tolerable distortion \(D\):

\[ \min_{p(r\mid s)} \; E[p] \quad \text{s.t.} \quad I(S;R) \ge R, \quad \mathbb{E}[d(S,\hat{S})] \le D \]

This generalises Shannon’s rate-distortion theory by adding a biophysical cost, and has quantitatively predicted optimal firing thresholds, transfer functions and adaptation rules in fly LMCs, salamander ganglion cells and mouse olfactory receptors.

6. Predictive Coding & Free-Energy Principle

Rao & Ballard (1999) proposed that cortical microcircuits implement a hierarchy of predictive codes: each level sends downward a prediction of the level below, and receives upward the prediction error. The result is dramatic bandwidth compression: only the unexplained residual needs to travel.

\[ \text{error}_{\ell} \;=\; r_{\ell} - W_{\ell,\ell+1} \, r_{\ell+1} \]

Residual signal \(r_{\ell} - \hat{r}_{\ell}\) ascends; priors \(\hat{r}_{\ell}\) descend
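A two-level toy version of the residual computation (illustrative weights and activities, not fitted to data):

```python
# Toy sketch of Rao & Ballard's residual computation: the higher level's
# representation r2, projected through generative weights W, predicts the
# lower level r1; only the unexplained residual ascends the hierarchy.
W = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]           # generative weights from level 2 to level 1
r2 = [0.5, 0.2]            # higher-level cause
r1 = [0.5, 0.3, 0.7]       # actual lower-level activity

prediction = [sum(w * c for w, c in zip(row, r2)) for row in W]
error = [a - p for a, p in zip(r1, prediction)]   # ascending residual
# error ~ [0.0, 0.1, 0.0] — only one unit's activity was unpredicted,
# so only that residual needs to travel upward
```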

Friston (2010) subsumed predictive coding under the free-energy principle: any biological system that persists must minimise an upper bound on the surprise of its sensory inputs. Action and perception jointly solve this bound:

\[ F \;=\; \mathbb{E}_{q}[\log q(z) - \log p(s,z)] \ge -\log p(s) \]

Variational free energy \(F\) bounds \(-\log p(s)\), the surprise of sensory data

Evidence for predictive coding has accumulated at every level: mismatch-negativity in EEG, oddball-selective neurons in auditory cortex (Ulanovsky et al., 2003), surround suppression in V1, and even single-neuron responses to Bayesian violations in primate IT cortex (Meyer & Olson, 2011). For comparative sensing, predictive coding explains why the visual world does not overwhelm the optic nerve: most of it is already predicted by the prior.

7. Evolutionary Origins of Sensing

Darwin (Origin of Species, 1859) famously called the vertebrate eye an “organ of extreme perfection” and confessed that its evolution by gradual variation “seems, I freely confess, absurd in the highest possible degree.” Nilsson & Pelger (1994) numerically demonstrated that a flat photosensitive patch can evolve into a camera eye in fewer than 400,000 generations given realistic selection coefficients—an eyeblink on geological timescales.

The Simplest Sensors: C. elegans ASH

The nematode Caenorhabditis elegans has only 302 neurons. Its ASH amphid neuron alone integrates nociceptive chemicals, osmolarity and touch to drive avoidance (Hilliard et al., 2005). This single polymodal neuron illustrates the ancestral state: one cell = one “sense” of “bad thing out there.” Diversification of sensory modalities then recapitulates the pattern of gene duplication followed by functional subspecialisation (Ohno, 1970).

Deep Homology of Opsins

All metazoan opsins descend from a single ancestral GPCR that gained retinal-binding around 700 Mya (Feuda et al., 2012). The modern bilaterian diversity of ~9 opsin subfamilies (ciliary, rhabdomeric, Go, neuropsin, peropsin, …) partitions wavelengths, photoreceptor cell types and even non-visual functions (circadian melanopsin). Hearing, by contrast, has evolved independently at least seven times in animals (fish lateral line, insect tympana, cephalopod statocysts, mammalian cochlea, …), each time redeploying mechanosensory hair cells and TMC-family channels.

Convergent Nervous-System Origins

Comb jellies (Ctenophora) likely evolved a nervous system independently from cnidarians + bilaterians (Moroz et al., 2014), meaning neurons themselves were invented at least twice. Sensing, therefore, is not a single evolutionary invention but a repeated attractor: any organism that can move and metabolise will be selected for channels that couple environmental perturbations to membrane currents.

Deep-Time Origins of Animal Senses

[Figure: Deep-time timeline (800 Mya → today) of sensory origins — ancestral opsin → camera and compound-eye vision; hair-cell MET (TMC) → mechanosensory hair cells; GPCR chemosensors; TRPV/TRPM duplications → thermosensors; ampullary electroreceptors (shark ampullae, mormyrid knollenorgans). Sensory modalities as deep-time attractors.]

8. A Cross-Modal Currency: Bits, Watts, Seconds

The power of the Shannon framework is that it puts photons, pressure waves, electric fields and chemical gradients on the same axis. Once we quantify a sense by (bit rate, metabolic cost, latency), we can compare modalities normally considered incommensurable.

| Sensory system | Bit rate | Latency | Cost/bit |
|---|---|---|---|
| Fly photoreceptor (bright) | ~700 b/s | ~20 ms | ~10⁻¹⁴ J |
| Human rod (dim) | ~3 b/s | ~200 ms | ~10⁻¹³ J |
| Cochlear IHC (1 kHz) | ~1000 b/s | ~0.5 ms | ~10⁻¹² J |
| Mormyrid electrosensor | ~500 b/s | ~5 ms | ~10⁻¹³ J |
| Olfactory GPCR neuron | ~20 b/s | ~300 ms | ~10⁻¹² J |
| Pit-viper TRPA1 fibre | ~50 b/s | ~100 ms | ~10⁻¹² J |
| Human cortical neuron | ~3 b/s | ~10 ms | ~10⁻¹² J |

Three Universal Trade-Offs

  • Speed vs. accuracy: faster decisions use shorter integration windows, accepting higher noise. Vickers et al. (2004) modelled this as optimal drift-diffusion.
  • Resolution vs. sensitivity: summing over many receptors boosts SNR but blurs detail. Rod pools at scotopic threshold trade spatial acuity for photon catch.
  • Bandwidth vs. energy: wider bandwidth channels cost more ATP, because the ion pumps must compensate for higher frequencies of channel opening.

The Senomic Hypothesis

Kondoh (2019) and others have argued that total sensory throughput per species is a conserved quantity—animals budget bits among modalities subject to metabolic and developmental constraints. A dragonfish trades colour for red bioluminescence; an elephant trades high-frequency hearing for seismic infrasound; a pit viper trades part of its visual opsin repertoire for a thermal camera. Comparative genomics reveals these trade-offs as loss-of-function mutations in the opsin, olfactory and TRP gene families, correlated with ecology.

9. Ideal-Observer Analysis

An ideal observer is a theoretical agent that uses Bayes’ theorem to compute the posterior probability of a stimulus given the noisy receptor outputs, then decides optimally. Geisler (2011) formalised the ideal-observer framework:

\[ \hat{s} \;=\; \arg\max_s \; p(\mathbf{r} \mid s) \, p(s) \]

Bayes-optimal estimate given prior \(p(s)\) and receptor likelihood \(p(\mathbf{r}\mid s)\)

Comparing real behavioural performance to the ideal-observer bound tells us how efficiently an animal uses its available information. Honeybees achieve \(>90\%\) of ideal performance in colour discrimination; human contrast sensitivity approaches the ideal within a factor of 2–3 at low frequencies (Geisler, 1989). Where performance falls short, the gap points to downstream noise, late-stage decision rules, or missing priors.
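A one-sample version of the ideal observer for the detection problems of Section 2 (Gaussian likelihoods; the means, variance and prior are illustrative):

```python
from statistics import NormalDist

def ideal_observer(r, mu_noise=0.0, mu_signal=1.0, sigma=1.0, prior_signal=0.5):
    """Bayes-optimal 'signal' vs 'noise' decision from one noisy sample r."""
    lik_s = NormalDist(mu_signal, sigma).pdf(r)
    lik_n = NormalDist(mu_noise, sigma).pdf(r)
    post_s = lik_s * prior_signal / (lik_s * prior_signal + lik_n * (1 - prior_signal))
    return ("signal" if post_s > 0.5 else "noise"), post_s

# With a flat prior and equal variances the criterion sits at the midpoint
ideal_observer(0.8)   # -> ('signal', ...)
ideal_observer(0.2)   # -> ('noise', ...)
```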

Simulation 1: Shannon Channel Capacity Across Species

We compute \(C = B\log_2(1+S/N)\) for representative sensory channels: blowfly photoreceptor, fly H1 motion neuron, cat auditory nerve, mormyrid electrosensor, human rod and cochlear IHC. The information rate is plotted against SNR and then summarised per channel, annotated with estimated nanowatt power draw (Niven & Laughlin 2008 cost per bit).
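A compact stand-in for this computation (the bandwidths and SNRs below are illustrative order-of-magnitude assumptions chosen to reproduce the bit rates quoted in Section 8, not measured values):

```python
import math

def capacity(bandwidth_hz, snr):
    """C = B * log2(1 + S/N), in bits/s."""
    return bandwidth_hz * math.log2(1 + snr)

# (bandwidth Hz, linear S/N) — assumed values for illustration
channels = {
    "blowfly photoreceptor (bright)": (100, 100),
    "human rod (dim)":                (5, 0.5),
    "cochlear IHC (1 kHz)":           (1000, 1),
    "mormyrid electrosensor":         (250, 3),
}

for name, (b, snr) in channels.items():
    print(f"{name:32s} {capacity(b, snr):7.0f} bits/s")
```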


Simulation 2: Weber-Fechner vs. Stevens Law

Synthetic psychophysical data for brightness (\(n \approx 0.33\)), loudness (\(n \approx 0.67\)) and weight (\(n \approx 1.45\)) are fitted to Stevens’ power law and compared against the Weber-Fechner logarithm. A third panel shows just-noticeable differences predicted by Weber’s rule \(\Delta I/I = k_W\) across four modalities.
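The core of the fit is linear regression on log-log axes, where Stevens' law is a straight line and Fechner's logarithm is not. A minimal sketch on noise-free synthetic brightness data:

```python
import math

def fit_power_law(intensities, sensations):
    """Least-squares fit of S = k * I**n on log-log axes; returns (n, k)."""
    xs = [math.log(i) for i in intensities]
    ys = [math.log(s) for s in sensations]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    n = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))   # slope = exponent n
    k = math.exp(my - n * mx)                # intercept = log k
    return n, k

# Noise-free brightness data generated with the textbook exponent n = 0.33
I = [1, 10, 100, 1000]
S = [2 * i ** 0.33 for i in I]
fit_power_law(I, S)   # recovers (0.33, 2.0)
```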


Key References

• Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423.

• Green, D. M. & Swets, J. A. (1966). Signal Detection Theory and Psychophysics. Wiley.

• Hecht, S., Shlaer, S. & Pirenne, M. H. (1942). Energy, quanta, and vision. J. Gen. Physiol., 25, 819–840.

• Stevens, S. S. (1957). On the psychophysical law. Psychological Review, 64, 153–181.

• Laughlin, S. B. (1981). A simple coding procedure enhances a neuron’s information capacity. Zeitschrift für Naturforschung C, 36, 910–912.

• de Ruyter van Steveninck, R. R. & Laughlin, S. B. (1996). The rate of information transfer at graded-potential synapses. Nature, 379, 642–645.

• Attwell, D. & Laughlin, S. B. (2001). An energy budget for signalling in the grey matter of the brain. J. Cereb. Blood Flow & Metab., 21, 1133–1145.

• Niven, J. E. & Laughlin, S. B. (2008). Energy limitation as a selective pressure on the evolution of sensory systems. J. Exp. Biol., 211, 1792–1804.

• Rao, R. P. & Ballard, D. H. (1999). Predictive coding in the visual cortex. Nature Neurosci., 2, 79–87.

• Friston, K. (2010). The free-energy principle: a unified brain theory? Nature Rev. Neurosci., 11, 127–138.

• Olshausen, B. A. & Field, D. J. (1996). Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature, 381, 607–609.

• Malnic, B., Hirono, J., Sato, T. & Buck, L. B. (1999). Combinatorial receptor codes for odors. Cell, 96, 713–723.

• Nilsson, D.-E. & Pelger, S. (1994). A pessimistic estimate of the time required for an eye to evolve. Proc. R. Soc. B, 256, 53–58.

• Feuda, R. et al. (2012). Metazoan opsin evolution reveals a simple route to animal vision. PNAS, 109, 18868–18872.

• Howard, J. (1987). Assessment of the optical performance of a three-dimensional neural image of an insect photoreceptor. J. Exp. Biol., 133, 469–483.

• Baylor, D. A., Lamb, T. D. & Yau, K.-W. (1979). Responses of retinal rods to single photons. J. Physiol., 288, 613–634.

• Coste, B. et al. (2010). Piezo1 and Piezo2 are essential components of distinct mechanically activated cation channels. Science, 330, 55–60.

• Pan, B. et al. (2013). TMC1 and TMC2 are components of the mechanotransduction channel in hair cells of the mammalian inner ear. Neuron, 79, 504–515.

• Moroz, L. L. et al. (2014). The ctenophore genome and the evolutionary origins of neural systems. Nature, 510, 109–114.

• Balasubramanian, V., Kimber, D. & Berry II, M. J. (2001). Metabolically efficient information processing. Neural Computation, 13, 799–815.

• Brenner, N., Strong, S. P., Koberle, R., Bialek, W. & de Ruyter van Steveninck, R. (2000). Synergy in a neural code. Neural Computation, 12, 1531–1552.

• Buck, L. & Axel, R. (1991). A novel multigene family may encode odorant receptors. Cell, 65, 175–187.

• Sato, K. et al. (2008). Insect olfactory receptors are heteromeric ligand-gated ion channels. Nature, 452, 1002–1006.

• Mueller, K. L. et al. (2005). The receptors and coding logic for bitter taste. Nature, 434, 225–229.

• Kawashima, Y. et al. (2011). Mechanotransduction in mouse inner ear hair cells requires transmembrane channel-like genes. J. Clin. Invest., 121, 4796–4809.

• Perez-Orive, J. et al. (2002). Oscillations and sparsening of odor representations in the mushroom body. Science, 297, 359–365.

• Strong, S. P. et al. (1998). Entropy and information in neural spike trains. Phys. Rev. Lett., 80, 197–200.

• Theunissen, F. & Miller, J. P. (1995). Temporal encoding in nervous systems: a rigorous definition. J. Comput. Neurosci., 2, 149–162.

• Ulanovsky, N., Las, L. & Nelken, I. (2003). Processing of low-probability sounds by cortical neurons. Nature Neurosci., 6, 391–398.

• Hilliard, M. A. et al. (2005). In vivo imaging of C. elegans ASH neurons: cellular response and adaptation to chemical repellents. EMBO J., 24, 63–72.

• Geisler, W. S. (2011). Contributions of ideal observer theory to vision research. Vision Research, 51, 771–781.

• Pugh, E. N. & Lamb, T. D. (1990). Cyclic GMP and calcium: the internal messengers of excitation and adaptation in vertebrate photoreceptors. Vision Research, 30, 1923–1948.

• Rieke, F., Warland, D., de Ruyter van Steveninck, R. & Bialek, W. (1997). Spikes: Exploring the Neural Code. MIT Press.