Exciting new gadget: a hyperspectral camera that you can build for ~£350. It measures radiance or reflectance across ~320-880 nm at 9 nm FWHM, with customisable spatial resolution, enormous dynamic range, and a simple GUI for desktop or Android.
We're hosting a visual ecology symposium on 2nd July. Free entry!
Sign up here: https://tinyurl.com/3b4bd647
High-sensitivity low-cost spectrometer that runs from a smartphone. Build your own from 3D printed parts and off-the-shelf components. My paper is out in JEB today.
Summary: The OSpRad spectroradiometer uses off-the-shelf components and 3D-printed parts, and can be controlled via smartphone. It operates from approximately 320 to 880 nm, and at light levels down to approximately 0.001 cd m⁻².
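For context on the photometric unit quoted above: a spectral radiance measurement can be converted to luminance (cd m⁻²) by weighting it with the photopic luminosity function V(λ) and scaling by the standard 683 lm W⁻¹ luminous efficacy constant. A minimal sketch, using a Gaussian approximation to V(λ); the wavelength grid and flat spectrum here are purely hypothetical:

```python
import numpy as np

def luminance_cd_m2(wl_nm, spectral_radiance):
    """Luminance (cd m^-2) from spectral radiance (W m^-2 sr^-1 nm^-1).
    Uses a Gaussian approximation to the CIE photopic V(lambda)."""
    v = np.exp(-0.5 * ((wl_nm - 555.0) / 42.0) ** 2)  # approx V(lambda)
    return 683.0 * np.trapz(spectral_radiance * v, wl_nm)

wl = np.arange(380.0, 781.0, 1.0)   # visible range, 1 nm steps
flat = np.full_like(wl, 1e-6)       # hypothetical flat spectrum
print(f"{luminance_cd_m2(wl, flat):.3f} cd m^-2")  # ~0.07 cd m^-2
```

A real implementation would use tabulated CIE V(λ) values rather than this Gaussian approximation.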
Can society turn down damaging levels of #lightpollution? - A Financial Times Rethink Video - with @andreas_jechow @jolyon and many more
Author summary: An object’s brightness and colour are not just due to its own surface properties; they also depend on the colours and patterns of its surroundings. We set out to develop a computational model that could predict colour appearance based on the principle of efficient coding. This takes into account the fact that neural bandwidth is limited (e.g. the fastest rate a neurone can fire might only be ten times its lowest rate), and that none of this valuable bandwidth should be wasted when coding information across the different spatial scales of a typical natural scene. We then combined these principles with contrast sensitivity functions (because contrast detection thresholds vary with spatial scale), and used either psychophysical or neurophysiological data to estimate the bandwidth for humans/primates. When we tested the model against a bank of visual phenomena (illusions), it predicted the direction of almost all of them. Our model is surprisingly simple and generalisable, has no free parameters, and can be explained by a low-level, feed-forward neural architecture. This suggests that many complex visual phenomena, which have often been attributed to high-level processes, could instead arise as artefacts of limited bandwidth and efficient coding, offering valuable avenues for future research.
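The efficient-coding idea above can be illustrated with a toy sketch (my own illustration, not the paper's implementation): decompose an image into spatial-frequency bands, weight each band by a contrast sensitivity function, and equalise each band's dynamic range so that no channel's limited bandwidth is wasted. The function names and the simple band-pass CSF shape below are assumptions for illustration only:

```python
import numpy as np

def csf(f, peak=3.0):
    """Illustrative band-pass contrast sensitivity function (not the paper's)."""
    f = np.maximum(f, 1e-6)
    return f * np.exp(-f / peak)

def efficient_code(image, n_bands=4):
    """Toy efficient-coding sketch: band-pass decompose, weight by a CSF,
    then equalise each band's range to mimic limited neural bandwidth."""
    h, w = image.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    freq = np.hypot(fy, fx) * max(h, w)        # radial frequency, cycles/image
    spectrum = np.fft.fft2(image - image.mean())
    edges = np.geomspace(1.0, freq.max(), n_bands + 1)
    coded = np.zeros_like(image, dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = np.real(np.fft.ifft2(spectrum * ((freq >= lo) & (freq < hi))))
        band = band * csf(np.sqrt(lo * hi) / 8.0)  # CSF weight at band centre
        peak_resp = np.abs(band).max()
        if peak_resp > 0:
            band = band / peak_resp            # equalise each band's range
        coded += band
    return coded
```

Because each band is normalised independently, the response to a local patch depends on the contrast structure of the whole scene, which is the kind of context dependence the model exploits.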
New paper from Juho Jolkkonen, Kevin Gaston, and Jolyon Troscianko shows clear evidence of an impact of night sky brightness on the flight response of a bird: https://www.nature.com/articles/s42003-023-04486-x
The difference between plots b and d is that the data in plot d come from a creek valley, where the main source of light was sky brightness rather than light sources to the side. #LightPollution
Eurasian curlew are less willing to take off in low light, an effect largely governed by light pollution. Artificial light at night appears to cause birds to trade off risky low-light flight against predation risk and foraging opportunities.
This AI-generated image sets the scene for our study.