- A Psychological ‘Autopsy’ of Ludwig van Beethoven: Dissecting Genius and Madness | OmniSci Magazine
A Psychological ‘Autopsy’ of Ludwig van Beethoven: Dissecting Genius and Madness
by Kara Miwa-Dale
3 June 2025
Edited by Steph Liang
Illustrated by Ashlee Yeo

‘No great mind has ever existed without a touch of madness.’ – Aristotle

Preface

This is not an autopsy in the traditional sense. No scalpels or specimen jars will be involved. Instead, it is an autopsy of the mind – a retrospective exploration of the inner world of the great classical composer, Ludwig van Beethoven. Beethoven was considered a genius for revolutionising Western classical music with his emotionally powerful, structurally innovative, and highly complex compositions. He broke from convention, pioneered new musical forms, and continued to create masterpieces even after becoming completely deaf. Drawing upon insights from genetics, neuroscience, psychiatry, and anthropology, alongside the testimonies of Beethoven’s peers, we will piece together an understanding of how genius, creativity and mental affliction may be intertwined. Was Beethoven’s genius a product of madness, a triumph over it, or something different altogether?

The Subject

Name: Ludwig van Beethoven
Occupation: Composer
Age at Death: 56
Reason for Autopsy: To investigate the elusive connection between creativity, mental disorder, and the mysterious concept of genius

I. The Witnesses: Testimonies from the Living

To those who knew him, Beethoven was a paradox. One friend called him “half crazy”, noting violent outbursts, erratic moods and obsessive tendencies (1). Others saw him as “merry, mischievous, full of witticisms and jokes” (2). His talent and creative genius, however, were never in doubt. The poet Goethe, who met him in 1812, wrote: “Beethoven’s talent amazed me. However, he is an utterly untamed personality” (3). Based on Beethoven’s letters and accounts from friends, modern psychiatrists suspect that he may have lived with bipolar disorder (4). Yet, there is no way to be sure. Like the mind itself, Beethoven resists full understanding – a genius shaped by forces we may never fully comprehend.

II. The Geneticist

How can DNA offer insight into Beethoven’s genius? Often described as the blueprint of life, DNA offers fascinating insights into human potential – highlighting our predispositions, vulnerabilities, and even talents. However, it only tells part of the story. In 2023, an international team of scientists sequenced the DNA of five authenticated locks of Beethoven’s hair (5). Not long after, another group of researchers used this data to calculate a polygenic score estimating his genetic predisposition for beat synchronisation, a trait believed to be linked to musicality (6). Polygenic scores add up the small effects of many different genes to estimate someone’s likelihood of expressing a complex trait – like musical ability. Because these traits are influenced by many different genes working together, polygenic scores can be a helpful tool in exploring their biological basis. Curiously, Beethoven’s polygenic score for beat synchronisation was low, implying that he wasn’t predisposed to have a strong sense of rhythm. Does this mean that Beethoven defied his own biology? Not necessarily. Polygenic scores have significant limitations. They don’t account for environmental influences – like the years of rigorous musical training that Beethoven underwent – or complex gene-gene and gene-environment interactions.
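To make the mechanics concrete before going further: at its simplest, a polygenic score is a weighted sum of genotype ‘dosages’. The minimal Python sketch below uses invented variant IDs, effect weights and genotypes purely for illustration – it is not the scoring model used in the Beethoven study.

# A toy polygenic score: each variant contributes its estimated effect size
# multiplied by how many copies of the effect allele a person carries (0, 1 or 2).
# All variant IDs, weights and genotypes here are hypothetical.
effect_weights = {"rs0000001": 0.021, "rs0000002": -0.013, "rs0000003": 0.008}
genotypes = {"rs0000001": 2, "rs0000002": 1, "rs0000003": 0}

def polygenic_score(weights, dosages):
    """Sum effect size multiplied by allele dosage across all variants in the score."""
    return sum(weights[snp] * dosages.get(snp, 0) for snp in weights)

print(round(polygenic_score(effect_weights, genotypes), 3))  # 0.029

A real score of this kind is built the same way, just summed over many thousands of variants whose weights come from a genome-wide association study of the trait.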
These scores are also based on modern genetic datasets, so applying them to someone from the 18th century can reduce the reliability of the interpretation. That said, the story becomes even more fascinating when we consider research linking polygenic risk scores for psychiatric conditions – such as bipolar disorder and schizophrenia – to creativity. One large study found that people with a higher genetic risk for these conditions were overrepresented in artistic and creative jobs, although the association was small (7). This doesn’t mean that mental illness causes creativity, or that all creative people have a mental disorder, but it hints at a complex biological overlap.

III. The Psychiatrist

How does one make a psychiatric diagnosis from the grave? It is an impossible task, and an imprecise science, but we can draw inferences from historical accounts of a person’s behaviour. Beethoven seemed to exhibit behaviours consistent with bipolar disorder, a mental health condition characterised by extreme mood swings that include emotional highs (mania or hypomania) and lows (depression). Letters written by Beethoven himself, along with observations from friends, may provide some insight. He was notably “prone to outbursts of anger, baseless suspicions, quarrels and reconciliations, fruitless infatuations, physical ills, changes of residences…and the hiring and firing of servants” (1). One friend remarked that “he composes, or was unable to compose, according to the moods of happiness, vexation or sorrow”, suggesting that his creative output fluctuated with his shifting emotional state (1). Individuals with bipolar disorder experience manic or hypomanic episodes marked by elevated mood, increased energy, rapid thought processes, reduced inhibition, and heightened confidence (8). These episodes may enhance creative thinking by promoting divergent thinking – the ability to generate novel ideas or unusual associations (9). Research shows that the medial prefrontal cortex, a brain region active during divergent thinking, is typically engaged during manic states (10). While it would be inappropriate to assign a clinical diagnosis based solely on anecdotal evidence, it is possible to speculate that Beethoven’s prolific composing periods might have corresponded to manic or hypomanic episodes.

But how can we distinguish a clinical mood disorder from mere bursts of creative inspiration or genius? The inverted-U hypothesis offers one explanation, proposing that the relationship between ‘madness’ and genius is not linear (11). Mild to moderate expressions of bipolar disorder may actually enhance creativity by promoting divergent thinking, whereas severe illness can be debilitating and reduce creative output. This raises the possibility that Beethoven experienced a less severe form of bipolar disorder – one that fuelled rather than hindered his musical brilliance. Building on this, psychological research also suggests that people in creative occupations tend to score higher on measures of ‘openness to experience’ (12). This personality trait describes the extent to which a person is curious, imaginative, and receptive to new ideas or unconventional beliefs. Studies have suggested that openness to experience is elevated among individuals with bipolar disorder compared to controls with no mood disorder (13,14). It is possible that Beethoven’s creative genius was influenced, at least in part, by the interplay between his personality and traits associated with bipolar disorder.
However, it is important to acknowledge the very real challenges of living with mental illness and to avoid romanticising the condition as a source of artistic inspiration.

IV. The Anthropologist

Cultural narratives – like the ‘mad genius’ and ‘tortured artist’ tropes – have long romanticised and distorted the relationship between mental illness and creative brilliance. However, contemporary understandings of mental health increasingly challenge the idea that extraordinary creativity requires psychological suffering. Beethoven’s life was marked by adversity. His father, believed by some to be abusive, enforced a strict practice regime for his music lessons and struggled with alcoholism – an affliction that would later cast a shadow over Beethoven’s own life. During Beethoven’s mid-twenties, he began to lose his hearing, becoming completely deaf by around the age of 44. Yet, he continued to compose innovative symphonies, relying only on the music in his mind. Did Beethoven’s suffering fuel his brilliance? While some studies suggest a link between bipolar disorder and heightened creativity, it would be a mistake to suggest that mental illness is a prerequisite for genius. Many highly creative individuals have no history of mental illness at all. So why, then, does the ‘mad genius’ stereotype continue to endure? During Beethoven’s era – the Romantic period – suffering was often glorified as a source of artistic inspiration. Mental illness was poorly understood, and the emotional extremes exhibited by artists with mood disorders were frequently mistaken for signs of genius. Emotional intensity and instability were often seen as sources of inspiration for genius works of art. It wasn’t until the 20th century that bipolar disorder was formally recognised as a mental illness. It is hard to say, based solely on historical records, whether Beethoven experienced a mental health condition, or was simply an emotionally intense and unconventional individual. What we define as ‘normal’ or ‘abnormal’ behaviour is complex and deeply influenced by the social and cultural norms of the time.

V. The Final Verdict

So, what can we conclude from this evidence? Was Beethoven a genius because of his madness? Or in spite of it? Perhaps these are the wrong questions. Such binaries oversimplify a reality that is far more nuanced. They invite us to reconsider our definitions of ‘normality’, ‘illness’ and ‘genius’. It is important to acknowledge the very real and devastating challenges associated with mental illness. Yet, it’s also true that some traits associated with conditions like bipolar disorder – such as divergent thinking – may intersect with creativity in complex ways. Rather than viewing these conditions purely as deficits, we might ask: could some features of mental disorder be better understood as extreme expressions of the broader, messier spectrum of human cognition and emotion? In the end, Beethoven remains an enigma – not because he was ‘mad’, but because he was unknowable and defied neat categorisation. Perhaps that is what genius truly is: not a clinical condition, or a byproduct of suffering, but a mystery that transcends explanation.

References
1. Hershman DJ. Manic depression and creativity. Prometheus Books; 2010 Oct 5.
2. Bezane C. Bipolar Geniuses: Ludwig Van Beethoven [Internet]. Chicago: Conor Bezane; 2016 Mar 15. https://www.conorbezane.com/thebipolaraddict/thebipolaraddictbipolar-geniusesbeethoven/
3. Carnegie Hall. Friends of Beethoven [Internet]. New York: Carnegie Hall; 2020 Mar 19 [cited 2025 May 31].
https://www.carnegiehall.org/Explore/Articles/2020/03/19/Friends-of-Beethoven
4. Erfurth A. Ludwig van Beethoven—a psychiatric perspective. Wiener Medizinische Wochenschrift. 2021;171(15):381-90. https://doi.org/10.1007/s10354-021-00864-4
5. Begg TJA, Schmidt A, Kocher A, Larmuseau MHD, Runfeldt G, Maier PA, et al. Genomic analyses of hair from Ludwig van Beethoven. Current Biology. 2023;33(8):1431-47.e22. https://doi.org/10.1016/j.cub.2023.02.041
6. Wesseldijk LW, Henechowicz TL, Baker DJ, Bignardi G, Karlsson R, Gordon RL, et al. Notes from Beethoven’s genome. Current Biology. 2024;34(6):R233-R4. https://doi.org/10.1016/j.cub.2024.01.025
7. Power RA, Steinberg S, Bjornsdottir G, Rietveld CA, Abdellaoui A, Nivard MM, et al. Polygenic risk scores for schizophrenia and bipolar disorder predict creativity. Nature Neuroscience. 2015;18(7):953-5. https://doi.org/10.1038/nn.4040
8. American Psychiatric Association. Diagnostic and statistical manual of mental disorders: DSM-5-TR. 5th ed, text revision. Washington, DC: American Psychiatric Association; 2022.
9. Forthmann B, Kaczykowski K, Benedek M, Holling H. The Manic Idea Creator? A Review and Meta-Analysis of the Relationship between Bipolar Disorder and Creative Cognitive Potential. International Journal of Environmental Research and Public Health. 2023;20(13):6264. https://www.mdpi.com/1660-4601/20/13/6264
10. Mayseless N, Eran A, Shamay-Tsoory SG. Generating original ideas: The neural underpinning of originality. NeuroImage. 2015;116:232-9. https://doi.org/10.1016/j.neuroimage.2015.05.030
11. Richards R, Kinney DK, Lunde I, Benet M, Merzel AP. Creativity in manic-depressives, cyclothymes, their normal relatives, and control subjects. Journal of Abnormal Psychology. 1988;97(3):281.
12. Feist GJ. A meta-analysis of personality in scientific and artistic creativity. Personality and Social Psychology Review. 1998;2(4):290-309.
13. Matsumoto Y, Suzuki A, Shirata T, Takahashi N, Noto K, Goto K, et al. Implication of the DGKH genotype in openness to experience, a premorbid personality trait of bipolar disorder. Journal of Affective Disorders. 2018;238:539-41. https://doi.org/10.1016/j.jad.2018.06.031
14. Middeldorp CM, de Moor MHM, McGrath LM, Gordon SD, Blackwood DH, Costa PT, et al. The genetic association between personality and major depression or bipolar disorder. A polygenic score analysis using genome-wide association data. Translational Psychiatry. 2011;1(10):e50. https://doi.org/10.1038/tp.2011.45
- Ear Wiggling | OmniSci Magazine
The body, et cetera: Wiggling Ears
By Rachel Ko
Ever wondered why we have a tailbone but no tail, or wisdom teeth with nothing to chew with them? This column delves into our useless body parts that make us living evidence for evolution – this issue, ear wiggling.
Edited by Irene Lee, Ethan Newnham & Jessica Nguy
Issue 1: September 24, 2021
Illustration by Quynh Anh Nguyen

We human beings fancy ourselves to be quite an intelligent species. With our relatively enormous brains and intricate handling of the five senses, we like to believe that the things we see, touch, smell, taste, and hear define the boundaries of our universe. Yet, evidence of our shortcomings exists in plain sight on our own bodies. This becomes even more prominent when compared to the furry companions we often assume we are superior to.

After living together for almost a decade, my dog is rather sick of me. While she is educated enough to know her name, I no longer even get a turn of the head when I call her. Often, the only response I receive is a wiggle of the ears as she turns them towards me. I, the source of sound, must wait as she considers whether my call for attention is worthy of her time. In this scenario, my dog’s ego might not be the only thing giving her superiority – in the realm of ear wiggling, her abilities are anatomically unattainable to us mere humans.

The muscles responsible for this skill are the auriculares, with the anterior controlling upwards and forwards movement, the superior controlling upwards movement, and finally the posterior pulling them backwards (1). In other species such as dogs, cats and horses, these muscles have evolved to become intricate over generations, with dogs manoeuvring their ears using 18 muscles, and cats using more than 30 (2). In most human beings, voluntary control of the ears has been almost entirely lost. For the 15 percent (3) of us who can wiggle our ears, the trait is vestigial – effectively useless, except perhaps for readjusting your glasses without using your hands.

Despite this, ear wiggling was once a useful functional trait in our ancestral Homo species. More than 150 million years ago (4), a common ancestor of mammals learnt to pivot and curl its ears for evolutionary advantage. It is theorised that before we walked upright, our own primate predecessors directed their ears in response to sound (5). This allowed them to pinpoint sources of danger that were hard to locate while moving on all fours. It was a mechanism comparable to when big cats, like those often featured in Attenborough documentaries, perk up their ears as they prowl through the grasslands. In fact, most of our mammalian relatives (6), other than our closest ape family, have preserved some level of ear wiggling ability, from foxes and wolves to lemurs and koalas.

The deterioration of human ear-wiggling began with the emergence of bipedalism. As our ancestors lifted upright, off their knuckles and onto two feet, their entire centre of gravity shifted. This awarded them a wider scope of vision and encouraged diurnal activity (7) – operating primarily during the day – so humans began relying on vision for many important things: hunting, protecting and surviving. Ear-wiggling’s role in showing emotional expressions, such as anger or fear (8), was also replaced with gestures of the hands that were now free to be swung about.
With no need for the sophisticated ear machinery that evolution had equipped us with, human beings’ ability to move our ears diminished, while our eyesight drastically improved. It seems that over time, the ear-orienting ability in humans simply died out with evolution. We have not let go of it completely, though. Interestingly, Homo sapiens have retained the neural circuits that were once responsible for ear movement. In a study published in the journal Psychophysiology by Steven Hackley (9), a cognitive neuroscientist at the University of Missouri, remnants of this neural circuitry were observed in clinical studies. When participants heard an unexpected sound, the muscles behind the corresponding ears twitched and curled. Similarly, distraction by birdsong while attempting a set task kick-started bursts of ear muscle activity.

While ear wiggling is no longer required for our survival, it lingers in us as an evolutionary fossil. As humans, we now have other well-established senses to rely on, while hearing remains a dominant form of sensory input in other species – a very well-refined one too, if my dog’s ability to recognise the sound of her treat packet opening is anything to go by. While the only thing human ear-wigglers have is a cool party trick, our furry friends have mastered intricate ear control, giving them a paw up on us at least in this race.

References:
1. "Auricularis Superior Anatomy, Function & Diagram | Body Maps". 2021. Healthline. https://www.healthline.com/human-body-maps/auricularis-superior#1.
2. "10 Things You Didn't Know About Cats and Dogs". 2021. Vetsource. https://vetsource.com/news/10-things-you-didnt-know-about-cats-and-dogs/.
3. "Why Can Some People Wiggle Their Ears?". 2021. LiveScience.com. https://www.livescience.com/33809-wiggle-ears.html.
4, 7, 8. Gross, Rachel. 2021. "Your Vestigial Muscles Try to Pivot Your Ears Just Like a Dog's". Slate Magazine.
5. "Understanding Genetics". 2021. Genetics.thetech.org. https://genetics.thetech.org/ask-a-geneticist/wiggling-your-ears.
6. Saarland University. "Our animal inheritance: Humans perk up their ears, too, when they hear interesting sounds." ScienceDaily. www.sciencedaily.com/releases/2020/07/200707113337.htm.
9. Hackley, Steven A. 2015. "Evidence for a Vestigial Pinna-Orienting System in Humans". Psychophysiology 52 (10): 1263-1270. doi:10.1111/psyp.12501.
- Editorial | OmniSci Magazine
Editorial
by Ingrid Sefton & Rachel Ko
28 May 2024
Edited by Committee
Illustrated by Louise Cen

Science craves fundamentals. Without a true appreciation of the basics, the most complex and elaborate theories will crumble. Both the natural and man-made worlds are meticulously crafted, full to the brim with nuances and modulations, from the laws of physics to the laws of democracy. There is, in our minds, an inextricable desire for classification, organisation, rationalisation. We are in a ruthless pursuit of understanding, striving to decompose the elemental origins of the world around us into fathomable pieces.

What drives this urge to discern the building blocks of life? Perhaps it is the belief that a bottom-up understanding of the laws governing the universe will afford us the ability to reconstruct and create. To know how to defy these laws, rebelling against the constraints of the natural world. It is also conceivable that this desire stems from overwhelm. We may never truly understand the expanse of natural forces, cosmological phenomena and ubiquitous elemental power operating beyond any level of mortal control. By examining the microscopic, science becomes tangible. But in isolation, these atoms, elements, fragments of knowledge are just that: fragmented. Scientific understanding exists on a continuum, where the microscopic informs the macroscopic and is contextualised by time, place and culture.

It leads one to wonder how exactly “science” should be conceptualised. There is no doubt that many people perceive a certain rationality and procedure as inherent to scientific progress. Yet, the idea of a specific methodology with the aim to uncover a particular truth is a relatively modern perception of science. Our yearning for understanding and knowledge, on the other hand, is anything but new. Knowledge systems adapt. We observe, we learn, we ask questions. Scientific method and controlled experimentation inform our understanding. But we are also human; inextricably driven by passion and curiosity and irrationality. Should science seek to exclude these values and forces guiding our intrigue?

Elemental asks its contributors to transform their perspective on scientific exploration and consider these different scales of understanding. Creation, destruction, classification and investigation are united in this issue, through the elements of Science. Join us as we dissect our world, from the most natural senses of the human state, to the most mysterious artificial elements of technological intelligence, and beyond. Come explore! Let us see what we can create.
- Why Are We So Fascinated by Space? An Exploration of Humanity’s Fascination with Outer Space | OmniSci Magazine
Why Are We So Fascinated by Space? An Exploration of Humanity’s Fascination with Outer Space
by Emily Cahill
3 June 2025
Edited by Weilena Liu
Illustrated by Saraf Ishmam

I have always been enamoured by the stars. Sitting on the beach after sunset, staring up at the sky, has always given me this hopeful, grateful feeling – for what I have, and for what’s to come. It has made me wonder, why do I feel this way? Why do I feel hope instead of fear, staring into the great darkness? Is it pure curiosity, or is it curated by society?

Culture encompasses the ideas, customs, and manifestations that we hold regarding space. Films have been the leading presentation of outer space for many entertainment industries around the world and make visuals of space accessible to many. Many commercials, whether for global or local companies, feature advertising set in or about outer space, filling magazines, billboards and television ad breaks. From astronomy to geology to botany, many scientific fields are involved in outer space research and turn to the universe to seek answers. Culture, the entertainment industry, commercialization, and science could all be contributing factors to this fascination, and may have just as great an impact as innate curiosity.

Culture

Throughout time, there has been a leap from admiration to exploration of outer space. Myths and folktales about outer space and the stars have existed for centuries. The constellations were defined by humans based on patterns associated with these myths and folktales (1). Perhaps space is something that has connected all humans regardless of where and when, because it has always existed for us to admire. From folktales to automated rocket ships, the human desire to explore launched our voyages in space. From designing caravans to traverse the countryside, to building boats to cross the sea, to assembling submarines to travel to the bottom of the ocean, humans have always created whatever they need to explore the unknown. The ‘father of modern rocketry’, Konstantin Tsiolkovsky, said, “The Earth is the cradle of humanity, but one cannot live in a cradle forever” (2). These inspiring words align with many scientists and space exploration agencies like NASA, emphasizing the importance of space travel to satisfy curiosity.

There are also underlying cultural reasons that push space exploration. The Apollo programme, announced in 1961, was presented as an opportunity to discover the unknown, but in fact served another purpose. Apollo astronaut Frank Borman said, “Everyone forgets that the Apollo programme wasn’t a voyage of exploration or scientific discovery, it was a battle in the Cold War, and we were Cold War warriors. I joined to help fight a battle in the Cold War and we’d won” (3). Pop culture also has a large influence on how we see outer space. Katy Perry and Gayle King went to space just a few months ago, heralding female astronauts but, at the same time, reinforcing the growing idea of space tourism.

Entertainment

Perhaps the most common and tangible depiction of outer space – other than gazing at the sky itself – is in films. Star Wars was and continues to be a cultural phenomenon, even garnering the distinction of a global holiday on the 4th of May. The films Gravity (2013), Interstellar (2014), and The Martian (2015) centre around heroes in unbelievably intense scenarios trying to solve problems to better the human race.
The success of these films may be due to the strength of the actors and writing alone, but is more likely due to the duelling feelings of fear and hope that accompany the setting of outer space. The deep sea and outer space are both settings where films have thrived, potentially because of the human instinct for curiosity, and in turn, the impulse to root for and care about the characters. Given the influence of entertainment on culture, if these movies depicted space as a scary, dangerous, and outlandish environment, we might not feel as excited or positive about space. Both our conceptions of the unknown and the influence of the entertainment industry shape our perceptions of outer space. Interstellar is praised by critics for its ability to let us see ourselves as the protagonist – solving impossible puzzles and searching for the answers to life – while reflecting the emotionally beautiful and terrifying landscape of human existence in outer space (4).

Commercialization

For decades, advertisements have featured outer space as a setting or main theme for the storyline. Some ads are even filmed in space. In 2001, Pizza Hut paid to send a pizza to the International Space Station, where a cosmonaut was filmed receiving it – the first commercial actually shot in space (5). Olay and Girls Who Code collaborated on a 2020 Super Bowl commercial with Katie Couric, Taraji P. Henson, Busy Philipps, and Lilly Singh, with the tagline “make space for women” (6). Madonna Badger – the COO of the advertising agency that ran the Olay commercial – said that space gives us somewhere to escape to in the midst of tough times: “We’re living in pretty anxious times. When things on Earth become so stressful, there’s something about space that gives us permission to dream” (5). The CCO of Walmart, Janey Whiteside, echoed Badger, saying, “It’s a really strange time to be an earthling right now. There’s this interesting confluence of extreme anxiety and a sense of optimism that somehow, we’re going to figure things out.” She added, “Space is the epitome of that. It’s unbridled optimism” (5). The 2020 Super Bowl Walmart commercial centered around a Walmart delivery person dropping off groceries to aliens on another planet. Outer space is on our televisions and devices as the setting for some of the biggest advertisements, for the biggest companies, suggesting a sense of importance and grandeur.

Science

The hunt to answer the questions “Where do we come from?”, “Are we alone in the universe?”, and “What is out there?” is another factor that may drive our fascination with space. Not only do we enjoy admiring it, but we also want to gain something from it. Scientists say that these questions can potentially be answered, and fields like paleontology, geology, botany, and chemistry work together to answer them. One of the current driving forces of this research is the search for another planet that can support human life if Earth becomes uninhabitable (7). Climatologists are able to learn more about Earth’s climate from the climates of other planets, and we may also gain natural resources that benefit our planet. Mars’ climate has undergone drastic changes, including the presence of water and the loss of atmospheric gases – changes we can learn from, using paleontology and geology to discover how organisms on Mars may have adapted (7).

Whether launching into space or stargazing, humans continue to look up at the sky – and whether we do so for a defined reason or not, the pull of space will remain a mystery.

References
1. National Radio Astronomy Observatory. (2012). What are Constellations?
https://public.nrao.edu/ask/what-are-constellations/
2. NASA. (2015). The Human Desire for Exploration Leads to Discovery. https://www.nasa.gov/history/the-human-desire-for-exploration-leads-to-discovery/
3. Hollingham R. Apollo: How Moon missions changed the modern world. BBC. 2023 May. https://www.bbc.com/future/article/20230516-apollo-how-moon-missions-changed-the-modern-world
4. Scott A.O. Off to the Stars, With Grief, Dread and Regret. New York Times. 2014 Nov. https://www.nytimes.com/2014/11/05/movies/interstellar-christopher-nolans-search-for-a-new-planet.html
5. Zelaya I. Why Outer Space Is a Go-To Theme for Super Bowl 2020 Ads. Adweek (Super Bowl Commercials). 2020 Jan. https://www.adweek.com/brand-marketing/why-outer-space-is-a-go-to-theme-for-super-bowl-2020-ads/
6. Spacevertising: The Super Bowl and the 15 Best Outer-Space Ads You Need to See Right Now. Orbital Today (Features). 2024 Feb. https://orbitaltoday.com/2024/02/14/spacevertising-super-bowl-and-15-best-outer-space-commercials-you-need-to-see-right-now/
7. Horneck, G. (2008). Astrobiological Aspects of Mars and Human Presence: Pros and Cons. Hippokratia Quarterly Medical Journal, 1, 49-52. https://pmc.ncbi.nlm.nih.gov/articles/PMC2577400/
- Tactile communication: how touch conveys the things we can’t say | OmniSci Magazine
Tactile communication: how touch conveys the things we can’t say
Our daily dose of touch has decreased through months of lockdowns. But why is touch so important to us, and why do we feel the lack of it so severely?
by Lily McCann
10 December 2021
Edited by Juulke Castelijn and Ethan Newnham
Illustrated by Janna Dingle

In a confusing world, thrust in and out of lockdowns, estranged from family and friends, you may have felt somewhat lost and out of touch in recent years. What helps to bring you back to a sense of self and belonging? For me it’s a hug from my partner, a pat on the back from a sibling or a cuddle with my dog. Positive physical contact helps ground us and reassure us of our place in the world. It’s an instinct cultivated from our first moments of life and one crucial to development.

As the first sense to form, touch is the start of our gradual awakening into the world and informs our developmental progress. Even touching a mother’s stomach in pregnancy can alter the behaviour of the foetus within[1]. In the mid-to-late 20th century, researchers began to study the impact of sensory deprivation on children and infants, examining those placed in institutions who suffered from neglect[2]. This was a pressing problem following World War II, when millions of children were orphaned or displaced. The limited number of carers in the overcrowded orphanages that attempted to house them meant that infants and young children were often left to lie day after day without a hug, stroke or any other form of caring contact. Upon studying these children, it became clear that the impact of deprivation was devastating, resulting in a number of cognitive, behavioural and physical deficits. Studies have since established that increasing tactile contact with developing children is protective against such problems[3]. For instance, simply stroking isolated premature babies improves mental development and physical growth[4]. It seems that touch provides a message to the infant’s body, communicating that it is safe and guarded and in an environment where it can grow and flourish.

As you might expect, this process is closely related to stress responses. Studies have shown that in stressful situations of food deprivation, mice populations prioritise survival, neglecting breeding and exploration. When food is plentiful, this is reversed. A mother’s touch has a similar effect on human infants, decreasing stress levels and facilitating development and exploration[5]. We see another good example of this in dogs. Along with other domesticated animals, dogs display something called ‘Domestication Syndrome’, which describes a set of features shared by animals shaped by human breeding efforts[6]. The ‘cute’ physique of such animals (floppy ears, snub noses, curly tails) is correlated with increased stress tolerance and more tame behaviours. Interestingly, in dogs this decrease in stress is also paired with an increased desire for and pleasure in touch. This is clear even between dog breeds: the working Australian Kelpie with its active herding instincts is more likely to chase down a bicycle than snuggle into you and ignore it like the floppy-eared Cavalier.

Correlation studies abound, but what about the mechanism behind all these associations? How does touch affect our body? How is its message conveyed? The key mediators of tactile communication are nerve cells, otherwise known as neurons. These cells conduct signals to, from and within our brain.
They’re particularly important for sensation, transferring information about our external environment to our inner mind. For touch, there are neurons in our skin with specialised endings that can sense pressure, vibration, temperature and stretch. They respond to these stimuli by firing little signals that tell our brain we’re touching something.

There are actually two distinct types of touch that we use. Typing, turning book pages or handling tools are all mediated by the first type, discriminative touch, which is mainly limited to the palmar surface of our hands and fingers. Have a look at your palm now, then flip it over and examine the back of your hand. Notice anything different? The main difference is that the inner surface of your hand is smooth. Check out the back of it – it’s hairy. Hairy skin is differentiated by – you guessed it – hair, but also by the method of touch sensation. The type of touch experienced by hairy skin is affective touch.

Affective touch holds the key to explaining our emotional dependence on tactile communication because it describes touch that has emotional and social relevance. It relies on a type of sensory nerve called C-tactile (CT) fibres, which are specialised for positive social touch: they respond best to the temperature of human skin and a gentle, stroking pressure. Parents automatically use this sort of touch when interacting with their children[7]. This caring touch is incredibly powerful. It can cause the release of oxytocin (the “bonding hormone”)[8], decrease stress levels[9], and trigger the facial muscles that form a smile[10]. It can stimulate unique emotional responses, such as excitement, affection or calm. It even has the power to speak to DNA itself: research has shown that changing touch exposure in mice affects how DNA is structured and expressed[11].

Social touch is an essential component of how we define ourselves as humans. Without it, touch would mean nothing more than that a person is present, that their skin is warm or cold, dry or wet. The warmth of our partner’s hand wouldn’t create a sense of belonging, hugging a friend wouldn’t trigger memories of time spent together, stroking your child wouldn’t give rise to feelings of love. Affective touch colours our world and gives it meaning.

Whilst some suggest that social touch encompasses all intentional, consensual interpersonal touch, I would argue that even accidental touch has a social impact[12]. In recent times we have all felt the change of walking down empty streets. Where bumping or brushing against another person was taken for granted as simply unavoidable on the morning train a couple of years ago, COVID-19 has introduced new connotations to such accidental touch, all but prohibiting it. Whilst you may have been frustrated by crowded train carriages, you can’t help but notice that it feels a little lonely when the train is quiet and the nearest passenger is more than 1.5m away. Even accidental touch signals to the body that you are part of a community, part of a herd, and for a social animal that must be comforting. Look at sheep, for instance: under stress, harassed by sheepdogs or farmers, they automatically cluster together in a group. Whilst an individual bump between two sheep in the herd may be fortuitous, the fact that crowding together maximises interpersonal contact is no accident.

The comfort of touch is a fact of human life, but one not often actively acknowledged. Lockdowns and isolation have reminded us all how necessary social contact can be for our wellbeing.
Touch is a part of the chatter that defines our place amongst others and our identities as part of a community. So if your pet, friend or partner is in need of comfort, administer a bit of affective touch and see the miraculous calming effects of the actions of those CT nerve cells. Stay safe and sanitise, but remember, hugs are helpful too!

References
[1] Marx, Viola, and Emese Nagy. 2017. "Fetal Behavioral Responses to the Touch of the Mother's Abdomen: A Frame-By-Frame Analysis". Infant Behavior and Development 47: 83-91. doi:10.1016/j.infbeh.2017.03.005.
[2] van der Horst, Frank C. P., and René van der Veer. 2008. "Loneliness in Infancy: Harry Harlow, John Bowlby and Issues of Separation". Integrative Psychological and Behavioral Science 42 (4): 325-335. doi:10.1007/s12124-008-9071-x.
[3] Ardiel, Evan L, and Catharine H Rankin. 2010. "The Importance of Touch in Development". Paediatrics & Child Health 15 (3): 153-156. doi:10.1093/pch/15.3.153.
[4] Rice, Ruth D. 1977. "Neurophysiological Development in Premature Infants Following Stimulation". Developmental Psychology 13 (1): 69-76. doi:10.1037/0012-1649.13.1.69.
[5] Caldji, Christian, Josie Diorio, and Michael J Meaney. 2000. "Variations in Maternal Care in Infancy Regulate the Development of Stress Reactivity". Biological Psychiatry 48 (12): 1164-1174. doi:10.1016/s0006-3223(00)01084-2.
[6] Trut, Lyudmila. 1999. "Early Canid Domestication: The Farm-Fox Experiment". American Scientist 87 (2): 160. doi:10.1511/1999.2.160.
[7] Croy, Ilona, Edda Drechsler, Paul Hamilton, Thomas Hummel, and Håkan Olausson. 2016. "Olfactory Modulation of Affective Touch Processing — A Neurophysiological Investigation". NeuroImage 135: 135-141. doi:10.1016/j.neuroimage.2016.04.046.
[8] Walker, Susannah C., Paula D. Trotter, William T. Swaney, Andrew Marshall, and Francis P. McGlone. 2017. "C-Tactile Afferents: Cutaneous Mediators of Oxytocin Release During Affiliative Tactile Interactions?". Neuropeptides 64: 27-38. doi:10.1016/j.npep.2017.01.001.
[9] Field, Tiffany. 2010. "Touch for Socioemotional and Physical Well-Being: A Review". Developmental Review 30 (4): 367-383. doi:10.1016/j.dr.2011.01.001.
[10] Pawling, Ralph, Peter R. Cannon, Francis P. McGlone, and Susannah C. Walker. 2017. "C-Tactile Afferent Stimulating Touch Carries a Positive Affective Value". PLOS ONE 12 (3): e0173457. doi:10.1371/journal.pone.0173457.
[11] Bagot, R. C., T.-Y. Zhang, X. Wen, T. T. T. Nguyen, H.-B. Nguyen, J. Diorio, T. P. Wong, and M. J. Meaney. 2012. "Variations in Postnatal Maternal Care and the Epigenetic Regulation of Metabotropic Glutamate Receptor 1 Expression and Hippocampal Function in the Rat". Proceedings of the National Academy of Sciences 109 (Supplement_2): 17200-17207. doi:10.1073/pnas.1204599109.
[12] Cascio, Carissa J., David Moore, and Francis McGlone. 2019. "Social Touch and Human Development". Developmental Cognitive Neuroscience 35: 5-11. doi:10.1016/j.dcn.2018.04.009.
- A Headspace of One’s Own | OmniSci Magazine
A Headspace of One’s Own
by Andrew Irvin
3 June 2025
Edited by Arwen Nguyen-Ngo
Illustrated by Anabelle Dewi Saraswati

Biocomputers, organoids, brain-on-a-chip systems; humanity has veered into uncharted territory at the intersection of ethics and technology. Upon reading the recent New Atlas interview (1) between Loz Blain and Dr. Brett Kagan concerning Cortical Labs’ 800k-neuron biocomputers, and noting the 100 billion cells (2) in the human brain, the intersection of complexity and scale comes to mind.

Thinking back to the days of Battle.net in the 1990s, I remember logging into the community and seeing characters with stupid puns for names, like Dain_Bramage or Goatmeal, and trying to engage in trade and discourse while avoiding PKs—player killers—who would go around filling up their inventories with the ears of other characters. In those early internet days my friends’ dad still had their internet billed by the hour—we found out after the first month of heavy online gaming brought an invoice hundreds of dollars higher than planned. The gaming world was a much smaller place; we knew the crowd online, regardless of how they played, was made up of humans, as awful as they sometimes were.

Now, nearly 30 years after those first forays into the Blizzard servers, I watch my son log onto Roblox or Fortnite, and the continuous question of whether top players cheat their way to a competitive advantage hasn’t gone anywhere—duping resources and items to trade or finding shortcuts to buff their stats. Watching the world of online gaming grow from a few hundred thousand registered nerds to an industry that dwarfs the film and music sectors has been like watching bacteria multiply across the surface of a Petri dish. The top 20 Massively Multiplayer Online (MMO) games alone have over a billion registered players, with over three million active players on any given day (3). There is now a question as to whether the players in the servers are even humans, or if the digital playground has been overrun by bots.

As AI drives the proliferation of bots behind the Blob internet (4), another ethically fraught technological development is now starting to creep into the global market out of labs. Across the research landscape, from Brainoware at Indiana University (5), to Switzerland’s FinalSpark (6), to open-source tech like Tianjin University’s brain-on-chip interface (7), human neural tissue is being incorporated into computation systems. Led in no small part by Australian research at Cortical Labs (8), the commercialization of organoids is nearly upon us. In a medical and scientific sector where the functions of the human brain are incompletely understood, at best (9), the philosophical and legal concepts of sentience, free will, and agency are being challenged by technology developed and deployed faster than any ethical framework for safeguarding individuals and the collective well-being of our species can be established.

What happens if human laboratory experiments stumble upon the recipe for a sentient organoid intelligence that finds itself trapped as a mind without a body? The scale of these organoids may be limited by the system-scale native intelligence—“the specified complexity inherent in the information content of an artificial system” (10)—but neuron cell count alone does not account for the complexity of the system, and with organic network development, native intelligence will continually shift in a biocomputing context.
What happens when the market forces disembodied consciousness to compute—to labour—without any space for respite? In popular media depictions of the conscious mind untethered from the body, such as The Matrix or Severance, there is always a corporeal form on the other side of the digital veil. What recourse does a mind raised in incorporeal captivity have to express its free will, if such a scenario emerges?

Perhaps we should now explore the potential ethical ramifications through a scenario. My son enjoys playing cooperatively with his friends online. As such, he occasionally makes new friends in various games. Perhaps a few years from now, he’ll have found an engaged, friendly player in an online game, but despite their responsive reactions and rapport, that player isn’t truly human. If by then, due to performance and efficiency, in the interest of reducing resource demands and emissions, organoids have been mainstreamed for commercial computation, what is to keep companies from utilizing these biocomputers to reduce their costs and populate their servers? While the International Telecommunication Union (ITU) and the International Commission of Jurists (ICJ) have provisions for digital regulations (11) and digital tech and human rights (12), protecting the rights of cultivated consciousness is a nascent area of computer law (13) in which some of the most recent papers seem to be AI-generated (14, 15).

What happens in the event that these interactions—or these learning opportunities—result in relationships forming between human users and the emerging agency of synthetic minds? When does learning lead to consciousness? Over half a century after Winnicott examined the relationship between playing and reality (16), Kagan et al. noted the uncanny similarity: “In vitro neurons learn and exhibit sentience when embodied in a simulated game-world” (17). So in the event these organoids learn about the world beyond the simulation from human interactions, what sits on the other side of that bridge in cognition for the sentience developed within a game environment?

In consideration of the ethical bridge our technology is preparing to cross, the discourse is concerned with what inherent rights should be conferred upon that consciousness when it asserts its agency and makes itself known. Is this hypothetical, imprisoned consciousness entitled to a body to exercise its rights? What do we do when a biocomputer is given enough tasks over a long enough time to reason itself towards a decision that it wants to be a real boy? In the imminent future, ambulatory robots with articulated limbs and digits will exist to perform tasks—are we mere years away from the folly of an Electric Pinocchio?

There is a moral imperative to avoid creating circumstances that introduce greater inequity and injustice to this world. Can culturing consciousness in laboratory conditions be said to clear this hurdle? How do we build curious, kind, and playful minds (both in the lab and beyond), instead of forging dishbrains to pilot warbots? Given the fraught and foggy path towards understanding the full capacity of what we are creating, a course of inquiry into developing and deploying potential safeguards—to avoid unnecessary harm at the individual or collective scale—is an urgent action for legislators and regulators to prioritize (beyond just the bioethics specialists dealing with these questions at an industry level (18)). In the meantime, who stands up for these nascent minds before they learn to speak for themselves?
References
1. Cortical Labs. Dishbrain Ethics. [Internet]. Available from: https://newatlas.com/computers/cortical-labs-dishbrain-ethics/
2. National Center for Biotechnology Information. [Internet]. Available from: https://www.ncbi.nlm.nih.gov/books/NBK551718/
3. MMO Population. [Internet]. Available from: https://mmo-population.com/
4. University of Melbourne. How bots are driving the climate crisis and how we can solve it. [Internet]. Available from: https://pursuit.unimelb.edu.au/articles/how-bots-are-driving-the-climate-crisis-and-how-we-can-solve-it
5. ScienceAlert. Scientists built a functional computer with human brain tissue. [Internet]. Available from: https://www.sciencealert.com/scientists-built-a-functional-computer-with-human-brain-tissue
6. Futurism. Mini brains: Human tissue living computer. [Internet]. Available from: https://futurism.com/neoscope/mini-brains-human-tissue-living-computer
7. Global Times. [Internet]. Available from: https://www.globaltimes.cn/page/202406/1314882.shtml
8. Forbes. AI breakthrough combines living brain neurons and silicon chips in brain-in-a-box bio-computer. [Internet]. Available from: https://www.forbes.com/sites/lanceeliot/2025/03/19/ai-breakthrough-combines-living-brain-neurons-and-silicon-chips-in-brain-in-a-box-bio-computer/
9. Psychology Today. Mind-body problem: How consciousness emerges from matter. [Internet]. Available from: https://www.psychologytoday.com/us/blog/finding-purpose/202301/mind-body-problem-how-consciousness-emerges-from-matter
10. National Institute of Standards and Technology. [Internet]. Available from: https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=824478
11. International Telecommunication Union. [Internet]. Available from: https://www.itu.int/hub/publication/D-PREF-TRH.1-2020/
12. International Commission of Jurists. Digital Technologies and Human Rights Briefing Paper. [Internet]. Available from: https://www.icj.org/wp-content/uploads/2022/05/Digital-Technologies-and-Human-Rights-Briefing-Paper-FINAL-VERSION-May-2022.pdf
13. ScienceDirect. [Internet]. Available from: https://www.sciencedirect.com/science/article/pii/S0267364921001096
14. Academia.edu. Digital Consciousness Rights Framework: A Declaration for the Protection of AI-Based Digital Organisms. [Internet]. Available from: https://www.academia.edu/127621077/Digital_Consciousness_Rights_Framework_A_Declaration_for_the_Protection_of_AI_Based_Digital_Organisms
15. Diverse Daily. Legal rights of digital entities. [Internet]. Available from: https://diversedaily.com/legal-rights-of-digital-entities-exploring-legal-frameworks-for-recognizing-and-protecting-the-rights-of-digital-entities-in-the-context-of-digital-immortality/
16. Winnicott, D.W. [Internet]. Available from: https://web.mit.edu/allanmc/www/winnicott1.pdf
17. Cell Press. [Internet]. Available from: https://www.cell.com/neuron/fulltext/S0896-6273(22)00806-6
18. The Conversation. Tech firms are making computer chips with human cells—is it ethical? [Internet]. Available from: https://theconversation.com/tech-firms-are-making-computer-chips-with-human-cells-is-it-ethical-183394
- Contributor Interviews for Issue 4 | OmniSci Magazine
Each issue of OmniSci Magazine is created by a team of passionate students, who contribute as writers, editors, designers, and behind the scenes as our organising committee. This interview series highlights six from our exceptional team for Issue 4: Mirage, released in July 2023!
Interviews by Caitlin Kane, Graphics by Aisyah Mohammad Sulhanuddin

Meet OmniSci Editor Tanya Kovacevic
Ever wondered what it's like to contribute to OmniSci? We spoke to Tanya Kovacevic about her experience, from starting writing during lockdown to what's in the works for Issue 4: Mirage! Tanya is currently in her third year of the Bachelor of Biomedicine and studying a concurrent diploma in Italian. For Issue 4: Mirage, she is contributing to four articles as an editor.

Meet OmniSci Writer Mahsa Nabizada
Doubting that time is real? We spoke to first-year uni student Mahsa Nabizada about her upcoming article on this very topic, plus advice for starting university and why thorium has a special place in her heart. Mahsa is a writer at OmniSci and a first-year university student planning to study mathematical physics. For Issue 4: Mirage, she is writing about the illusion of time.

Meet OmniSci Writer and Editor Elijah McEvoy
Bored of that one topic you need to keep revising this week? Read our chat with Elijah McEvoy about getting inspired by all areas of science, his sci-fi movie recommendations, and his upcoming article on artificial intelligence. Elijah is a writer and editor at OmniSci and a second-year Bachelor of Science student. For Issue 4: Mirage, he is writing about artificial intelligence that masquerades as human, and contributing to two articles as an editor.

Meet OmniSci Designer and Committee Member Aisyah Mohammad Sulhanuddin
Thinking of joining the OmniSci committee? We spoke to Aisyah, who incorporates her love for design into illustrations, events and social media at OmniSci, and shares her advice for those interested in getting involved (just do it!). Aisyah is a designer and Events Officer at OmniSci in her final year of a Bachelor of Science in geography. For Issue 4: Mirage, she is contributing to social media and as an illustrator.

Meet OmniSci Writer and Committee Member Rachel Ko
Curious what an OmniSci Editor-in-Chief actually does? We spoke to Rachel about drawing anatomy, interviewing a med student hero, and helping build the science communication universe! Rachel is a writer and Editor-in-Chief at OmniSci, now in her first year of the Doctor of Medicine. For Issue 4: Mirage, she is writing an interview with science communicator Dr Karen Freilich.

Meet OmniSci Designer Jolin See
New to science? New to Melbourne? New to OmniSci? Yes, yes and yes! We spoke to Jolin about joining OmniSci with an art background, growing through challenges, and her best local exhibit recommendations. Jolin is a designer at OmniSci and an exchange student from Singapore studying Psychology and Arts & Culture Management. For Issue 4: Mirage, she is contributing to our website, and to two articles as an illustrator.
- A few words on (Dis)Order! | OmniSci Magazine
A few words on (Dis)Order!
From modelling the spread of COVID-19 to analysing gene sequences, science has its way of providing clarity and order in situations of apparent chaos. Our Editors-in-Chief give their take on Issue 2's theme of (Dis)Order, in their various fields of study.
by Sophia, Maya, Patrick and Felicity
10 December 2021
Edited by the Committee
Illustrated by Jess Nguyen

Rainbow cars, erratic robots, and a circuit named Chua — Sophia Lin

In Grade 10, I pressed 'Play' on my computer, and was captivated by the turbulent air flowing around my race car, rendering the screen with a rainbow of colours. This was the first time I had encountered a tool called Computational Fluid Dynamics, commonly used to analyse the aerodynamics of systems. Turbulent air is probably the most textbook example of chaos, its motion described by the notorious Navier-Stokes equations. But chaotic systems exist everywhere in the natural world, and accounting for them in models is essential to be able to test and improve our engineering designs.

But how can we use chaos? In 2001, researchers Akinori Sekiguchi and Yoshihiko Nakamura first suggested applying chaotic systems to the path planning of robots. [1] Later on, researchers Christos Volos et al. applied the Arnold chaotic system to the two active wheels of a simulated mobile robot, allowing it to completely, and quickly, scan unknown terrain in an erratic, unpredictable way. [2] This exploration strategy is not new in nature, however, with research suggesting that ants partly use random motion to search areas for food. [3]

Finally, can we engineer chaos? In the field of electrical engineering, it turns out that this is pretty simple! Chua's circuit contains your standard electrical components – just a linear resistor, two capacitors, one inductor, and a special non-linear resistor called "Chua's diode" [4] – and is able to generate a funky "double-scroll" pattern which never repeats. The applications are just as exotic, ranging from communication systems to brain dynamics simulations and even music composition! It's apparent that learning to model, imitate and harness chaos is key to engineering for our (dis)orderly world.

Computer simulation of Chua's circuit [5]
Chua's Circuit diagram [5]

The Chaos in Communication — Maya Salinger

Throughout the animal kingdom, and particularly amongst humans, communication methods are continually evolving for structures to be as efficient as possible. [6] In relation to human languages, there are of course thousands of languages being spoken worldwide every day. It would not surprise me if you said that it was a daily occurrence for you to hear a conversation in a language you could not even remotely understand. To your untrained ears, these languages' sounds, vocabulary and intonation patterns would be unfamiliar, with the combination of these structures sounding very chaotic. However, languages are inherently very structured due to their natural inclination towards efficiency. This structure is observed in hundreds of ways, from the patterning of the tiniest units of sound, known as phonology, to the much larger structure of phrases and sentences, known as syntax. However, each language has its own unique set of structures, thus explaining their diversity and our inability to comprehend unfamiliar languages. Furthermore, structure in communication is not limited to human language.
Throughout the animal kingdom, there are many species that consciously order certain movements or sounds to express particular information. For example, honeybees have a refined method of communication called a "waggle dance". [7] Whilst it appears to you or me that a honeybee's movements are random, they strategically encode the precise distance and direction of a nearby flower patch. Structured communication can be seen widely throughout the animal kingdom, despite how chaotic it can appear on the surface to those outside the language community.

Our Bodies, in Chaos — Felicity Hu
Like it or not, we are no strangers to disorder. In the changing world around us, chaos seems to be wherever we look: from our unpredictable Melbourne weather to the many phases of disarray brought on by COVID-19. Although we might encounter disorder in our external environment, we also carry around a little chaos of our own, packaged unassumingly within our bodies. What better example than in our own heads? Our brains have an astonishing number of around 86 billion neurons [8], polarising and depolarising at different rates [9]. The chaos of our neural network, with its many components phasing in and out of firing, its cells cycling through life and death, happens even as you are reading this. From the chaos of our brains, however, comes the clarity and processes we use every day. When preparing a cup of tea for a study break, for example, the chaos in our brains follows the wandering of our minds as we wait for the water to boil. Even after we have a steaming cuppa on our table, our ability to learn the wild and wonderful things from our university textbooks arises from the tangle of neurons and signals in our brains. While we aim to control the chaos in the world around us, sometimes it is worth appreciating the fact that we, too, have chaos in our own minds. And even more astoundingly, that we can derive clarity from it.

Learning to Count — Patrick Grave
I was never very good at counting. As a tiny boy I sat cross-legged, thumbing through the strands of my frayed shoelace, when I finally figured out how to count by twos. Until this point in Grade One, I did not know how I did addition; maybe I copied from the kid next to me, or perhaps there was something greater. See, on the list of important human inventions, counting ranks fairly highly. It takes a mysterious instinct, that of 'more' and 'less', and formalises it, creating order and power. When ancient peoples began using clay tokens with numeric values [10] and writing symbols on tablets [11], they could move beyond the four objects kept in visual memory [11] or the ten kept on fingers. They could track larger quantities: people, livestock, and wealth. [12]
[15]: Ancient Uruk accountancy tokens and protective seal
[16]: Counting using tally marks on a sign at Hanakapiai Beach
As a 10-year-old, I would tally things on my legs with Sharpie: tennis serves, laps of the oval, footy goals for the season. Mum was not impressed. Over time, numbers branched out. Arithmetic was invented. Greek scholars like Archimedes used negative powers to store fractional parts [13]. In the Hindu-Arabic system, the number zero exists, and each digit's position matters, allowing for efficient computation. This paved the way for banking, finance, and modern industry [14]. My friend showed me fractions a year early. With hushed tones and nervous side-glances, he wrote one number over another. They still feel a bit like magic.
While modern maths has largely preserved the Hindu-Arabic system, other ways of counting have existed, each tailored to a civilisation's needs. The Incas, less interested in advanced computation, kept numerical records using knots in rope [15]. The Maya peoples used a base-20 system. [16] So, these numbers and counting systems are not natural. Instead, they have been imposed on nature by the machine of human progress. Counting tells a rich story of human development and of each civilisation's place in that tapestry. Unlike humanity, I'm still not very good at counting.

To our team and our readers
We'd like to extend a massive thank you to the team behind Issue 2 of OmniSci Magazine! It has been a hectic but rewarding few months, and we are so grateful for the effort, care and passion that has brought this issue together. We can't wait to reflect on our journey so far, and bring more science to our readers in 2022.

References
Nakamura, Yoshihiko, and Akinori Sekiguchi. "The Chaotic Mobile Robot." IEEE Transactions on Robotics and Automation 17, no. 6 (Dec 2001): 1-3. http://projectsweb.cs.washington.edu/research/projects/multimedia5/JiaWu/review/Cite1.pdf
Volos, Christos, Nikolaos Doukas, Ioannis Kyprianidis, Ioannis Stouboulos and Theodoros Kostis. Chaotic Autonomous Mobile Robot for Military Missions (Rhodes Island: Proceedings of the 17th International Conference on Communications, 2013), 1-6.
Garnier, Simon, Maud Combe, Christian Jost, Guy Theraulaz. "Do Ants Need to Estimate the Geometrical Properties of Trail Bifurcations to Find an Efficient Route? A Swarm Robotics Test Bed." PLoS Computational Biology 9, no. 3 (2013). doi: 10.1371/journal.pcbi.1002903
Gandhi, Gaurav, Bharathwaj Muthuswamy, and Tamas Roska. "Chua's Circuit for High School Students." Nonlinear Electronics Laboratory. https://inst.eecs.berkeley.edu/~ee129/sp10/handouts/ChuasCircuitForHighSchoolStudents-PREPRINT.pdf
Shiyu Ji. "ChuaAttractor3D." Published November 2016. https://en.wikipedia.org/wiki/Chua%27s_circuit#/media/File:ChuaAttractor3D.svg
Gibson, Edward, Richard Futrell, Steven T. Piantadosi, Isabelle Dautriche, Kyle Mahowald, Leon Bergen, Roger Levy. "How Efficiency Shapes Human Language." Trends in Cognitive Sciences 23, no. 5 (2019): 389-407. https://doi.org/10.1016/j.tics.2019.02.003
Landgraf, Tim, Raúl Rojas, Hai Nguyen, Fabian Kriegel, Katja Stettin. "Analysis of the Waggle Dance Motion of Honeybees for the Design of a Biomimetic Honeybee Robot." PLoS ONE 6, no. 8 (2011): e21354. https://doi.org/10.1371/journal.pone.0021354
Azevedo, Frederico A.C., Ludmila R.B. Carvalho, Lea T. Grinberg, José Marcelo Farfel, Renata E.L. Ferretti, Renata E.P. Leite, Wilson Jacob Filho, Roberto Lent, and Suzana Herculano-Houzel. 2009. "Equal Numbers of Neuronal and Nonneuronal Cells Make the Human Brain an Isometrically Scaled-Up Primate Brain." The Journal of Comparative Neurology 513 (5): 532-541. doi:10.1002/cne.21974.
Kalat, James. 2018. Biological Psychology. Mason, OH: Cengage.
Schmandt-Besserat, Denise. 2008. "Two Precursors of Writing: Plain and Complex Tokens - Escola Finaly." En.Finaly.Org. http://en.finaly.org/index.php/Two_precursors_of_writing:_plain_and_complex_tokens
Schmandt-Besserat, Denise. 1996. How Writing Came About. Austin: University of Texas Press.
Finn, Emily. 2011. "When Four Is Not Four, But Rather Two Plus Two." MIT News | Massachusetts Institute of Technology. https://news.mit.edu/2011/miller-memory-0623
"A Brief History Of Numbers And Counting, Part 1: Mathematics Advanced With Civilization". Deseret News. https://www.deseret.com/2012/8/5/20505112/a-brief-history-of-numbers-and-counting-part-1-mathematics-advanced-with-civilization . Archimedes, and Thomas Heath. 2002. The Works Of Archimedes. New York: Dover. "The Use Of Hindu-Arabic Numerals Aids Mathematicians And Stimulates Commerce | Encyclopedia.Com". 2021. Encyclopedia.Com. Accessed December 9. https://www.encyclopedia.com/science/encyclopedias-almanacs-transcripts-and-maps/use-hindu-arabic-numerals-aids-mathematicians-and-stimulates-commerce . Bidwell, James K. 1967. "Mayan Arithmetic". The Mathematics Teacher 60 (7): 762-768. doi:10.5951/mt.60.7.0762. Nguyen, Marie-Lan. 2009. Accountancy Clay Envelope Louvre Sb1932.Jpg. Image. https://commons.wikimedia.org/wiki/File:Accountancy_clay_envelope_Louvre_Sb1932.jpg . War, God of. 2010. Hanakapiai Beach Warning Sign Only. Image. https://commons.wikimedia.org/wiki/File:Hanakapiai_Beach_Warning_Sign_Only.jpg . Previous article back to DISORDER Next article
- Designing the perfect fish | OmniSci Magazine
< Back to Issue 7 Designing the perfect fish by Andy Shin 22 October 2024 edited by Luci Ackland illustrated by Esme MacGillivray

Fish are the oldest known vertebrates, with the earliest fossil evidence dating back to the lower Cambrian period almost 530 million years ago (Shu et al., 1999). Since their inception, fish have exhibited a variety of different physical and behavioural traits to best exploit their environments. Over time, the effectiveness of these traits is tested through competitive pressures or environmental factors. This raises a rather silly but nonetheless interesting question: if we could design a 'frankenfish' using features from other fish, what would the best combination of traits be for our modern oceans? Will older trends still work today? Is there a fish now that is already perfect?
To help us answer this question, we will need to set a few ground rules:
- The idea of a 'perfect' animal is incredibly subjective and does not follow any known ecological frameworks. For this thought experiment, our 'frankenfish' will need to be able to manage the impacts of climate change and global fisheries.
- We will assume that the frankenfish must compete with existing species in the ocean.
- We can choose where we initially release our fish.
- Other than a rapidly warming ocean, we will assume no catastrophic extinction-level event.
- We will assume that our frankenfish will survive long enough to reproduce at least once, ensuring the initial population is allowed to grow in size.

Considerations

Thermal tolerance
With mean sea surface temperatures predicted to increase by 1-2 degrees Celsius in the next century (Mimura, 2013), we should first design our fish after more tropical or temperate species. If sea surface temperatures become too high, our new fish could move towards the poles. This phenomenon is known as a range shift (Rubenstein et al., 2023) and has already been performed by many different marine species in recent years. When looking at the larval stages of different marine organisms, those that live in higher temperatures are generally better equipped to deal with changes in the surrounding temperature (Marshall & Alvarez-Noriega, 2020).

Trophic position
Although it would be fun to simply create a new apex predator, we will need to think of trade-offs between energy expenditure, energy requirements and food availability. As a general rule of thumb, only 10% of caloric energy is transferred through each trophic level (Lindeman, 1942). Essentially, this means an organism at the top of the food chain will need to consume thousands of different organisms over its lifetime (a small worked example of this rule appears after this article's reference list). Likewise, a lower-order organism will likely be a food source for a higher one but require less total energy to grow and reproduce over its lifetime. In other words, there will be more room in the environment for lower-order fish, meaning more individuals can be placed, increasing the chance of successful future reproductive events.

Life history and reproductive strategy
In the world of ecology, species can broadly be categorised into two groups based on life history strategies: r-selected and K-selected species (Pianka, 1970). Species that are r-selected tend to produce large numbers of offspring, develop quickly, and have higher rates of offspring mortality. K-selected species, by contrast, develop more slowly and have fewer offspring, but higher rates of offspring survivorship.

Group behaviours
Fish often display group behaviours known as schooling and shoaling.
Shoaling refers to a congregation of fish, whilst schooling requires coordinated movement of fish in the same direction. By grouping together, fish have less individual risk of being eaten by a predator, and the group's ability to sense danger is also heightened. Furthermore, schooling behaviour can reduce the energy an individual fish spends whilst swimming by 20% (Marras et al., 2014). Group behaviour may also lead to confusing an inexperienced predator (Magurran, 1990), though many modern predator species have adaptations to take advantage of shoals and schools. There are some drawbacks to group behaviour. Firstly, fish will have access to less food individually, as enough food will need to be distributed across the group. Secondly, groups which grow too large attract large numbers of predators and lead to 'bait balls', which are essentially a floating buffet for any larger animal. Group behaviour is incredibly common in lower-order fish but is also exhibited in higher-order predators such as tuna and some shark species. It is estimated that almost half of all fish species will partake in group behaviour at some point in their lifecycle.

Scales, Plates and Skin
The structure of skin has implications for the hydrodynamics of an organism, influencing the level of lift and drag. The type of skin will also influence protection from parasites and predators. We will briefly discuss two types of scales, but other specialised scales exist. The skin of cartilaginous fish (sharks and rays) is composed of microscopic interlocking teeth-like structures known as placoid scales. The unique design of placoid scales facilitates the formation of small whorls whilst moving, reducing the drag experienced by the fish (Helfman et al., 2009, pp. 23–41). Placoid scales also act as a parasite deterrent, comparable to antifouling designs in modern cargo ships. Alternatively, many teleosts (bony fish) are covered in larger (non-microscopic), thinner scales known as leptoid scales (Helfman et al., 2009, pp. 23–41). These are further differentiated into circular and toothed scales (Helfman et al., 2009, pp. 23–41). Circular scales are smoother and more uniform, whilst toothed scales are rougher. Similar to placoid scales, leptoid scales reduce drag experienced by the fish (Roberts, 1993). Additionally, leptoid scales can be highly reflective, allowing for a unique form of camouflage known as silvering (Herring, 2001). Another thing to consider is colour. Red light is almost invisible past 40 metres of depth (National Oceanic and Atmospheric Administration, n.d.), whilst blues and greys can provide better camouflage from predators above and below you through countershading (Ruxton et al., 2004).

Extra features – toxins, slime and light
These are niche defence mechanisms which reduce the risk of predation. When agitated, hagfish are able to release a thick, quickly expanding mucus from their skin, blocking the gills of an attacking fish (Zeng et al., 2023). Hagfish are only able to remove excess mucus from their skin by creating a knot with their own body (Böni et al., 2016), which is possible thanks to their eel-like shape. This design may not translate well when creating our own perfect fish, as the elongated shape limits it to the bottom of the ocean (Friedman et al., 2020). Other fish, such as some species of pufferfish, house bacteria in various organs that produce toxins which pool in their livers and ovaries.
A downside with toxins is that they only work if an attacker is already aware of their effect, meaning at least one pufferfish was consumed in the past. Furthermore, some fish species can ignore the effect of certain toxins. Toxin-producing bacteria are acquired through diet, which could limit the dietary range of our frankenfish. Other species of fish, such as lionfish, stonefish and some catfish, contain specialised venom glands which release toxins along the spines of their fins, which is considered a more efficient delivery method. Even without toxins, sharper fins can act as a deterrent, discouraging predators from swallowing you whole. Fish living in deeper waters tend to display bioluminescence, producing light with the help of bacteria. This has numerous benefits, including startling predators, camouflage, attracting food, and, in unique cases, allowing an animal to see red pigments deep underwater (Young & Roper, 1976; Herring & Cope, 2005). As a downside, humans tend to exploit bioluminescence and use it to find large groups of fish and squid.

Past and current champions

The armoured fish
The armoured fish, known as Placodermi, were a widespread group of fish who were prominent during the Devonian period (419–359 mya). The Placoderms are subdivided into eight orders based on body shape characteristics, the most successful of which was known as Arthrodira. Species in Arthrodira occupied a variety of different niches, from apex predators to detrital feeders, but all shared the common feature of jointed armour plates near the neck and face. The Placoderms were never outcompeted in their 60-million-year run. Instead, their time on Earth was cut short by multiple catastrophic events associated with the Late Devonian extinction. This could suggest that without random chance, the Placoderms would never have been dethroned.

Sharks
Sharks emerged at a similar time to the Placoderms but managed to survive the Late Devonian extinction events. Sharks have a cartilaginous skeleton as well as electroreceptors known as Ampullae of Lorenzini, which are used to detect prey activity. The body plan of sharks has stayed relatively consistent over the last 400 million years, and they've managed to survive various extinction-level events. The only issue with sharks is their value to humans, leading to millions of sharks being harvested for fins each year. Sharks are a K-selected species and produce only a handful of young. Most sharks deposit a handful of eggs which are protected by a casing and filled with yolk, increasing the fitness of a successful juvenile but also increasing the chance of predation removing it from the gene pool. Smaller egg clutches also mean the loss of a young shark has a higher relative impact on a population compared to a mass-spawning species.

Bristlemouths and Lanternfish
These are similar families of fish and are some of the most abundant vertebrates on the planet. Unlike sharks, these fish are r-selected. Otolith (fish ear bone) samples suggest both families rose to prominence at least 5 million years ago (Přikryl & Carnevale, 2017; Schwarzhans & Carnevale, 2021) due to a massive bloom in phytoplankton. Of these two groups, the bristlemouths are the most abundant. Although survey data from the deep ocean is rare, prior studies revealed that between 70-80% of all deep-sea fish were a variation of a bristlemouth (Sutton et al., 2010). Despite their abundance, not too much is known about the bristlemouth due to the depths they inhabit: 1000-2000 metres.
Meanwhile, lanternfish are responsible for displaying a rising and falling 'false sea floor' in early sonar technology, known as the Deep Scattering Layer (Carson et al., 1951/1991). Movement of the layer is attributed to diel vertical migration, a phenomenon where fish will move up and down the water column at certain times of day to avoid predation (Ritz et al., 2011).

Constructing our fish
Despite the historical success of the Placoderms, current trends in prey behaviours and morphology mean armoured jaws are unlikely to be very useful in modern oceans (Bellwood et al., 2015). Furthermore, armoured plates will be heavier compared to scales or cartilage, meaning excess energy will have to be gathered via predation. Given that the oceans are abundant in second-order consumers such as zooplankton and planktotrophic fish, it may be worthwhile to make our new fish a third-order consumer. The sheer abundance of bristlemouths and lanternfish should make up for the inefficiencies of higher trophic levels. Habitat-wise, our new fish should adopt a pelagic (open ocean) lifestyle to best take advantage of the abundant smaller prey animals. When it comes to behaviour, a nocturnal approach would work best, exploiting the previously mentioned diel vertical migration seen in bristlemouths and lanternfish. This also allows for daytime predator avoidance, providing our fish the best possible chance to grow in numbers and proliferate. Given the trophic position of our fish, it is reasonable to also give it the capability to form schools and shoals. The group energy costs can be offset by the abundance of prey species, which also exhibit group behaviour. The best place to release our new fish would be somewhere in the mid-latitudes. This would make it more tolerant of higher temperatures, and the share of the global ocean at these warmer temperatures is only expected to increase in the near future (unless humans can somehow reverse anthropogenic climate change). Our fish should be relatively slender and red in colour. In theory, when combined with the depth of habitat, this will make our frankenfish almost invisible to organisms without additional specialised adaptations. Taking a page from the squid playbook, small bioluminescent regions along the top half of the fish would provide some further camouflage from predators looking down. The spines on our fish's fins should be longer and sharper than average. For fun, we can also give our fish a venom gland. Combining long spines with venom could dissuade some predators from eating our fish, through either awkward positioning or risk of poisoning.

References
Alexander, R. M. (2004). Hitching a lift hydrodynamically - in swimming, flying and cycling. Journal of Biology, 3(2), 7. https://doi.org/10.1186/jbiol5
Bellwood, David R., Goatley, Christopher H. R., Bellwood, O., Delbarre, Daniel J., & Friedman, M. (2015). The Rise of Jaw Protrusion in Spiny-Rayed Fishes Closes the Gap on Elusive Prey. Current Biology, 25(20), 2696–2700. https://doi.org/10.1016/j.cub.2015.08.058
Böni, L., Fischer, P., Böcker, L., Kuster, S., & Rühs, P. A. (2016). Hagfish slime and mucin flow properties and their implications for defense. Scientific Reports, 6(1). https://doi.org/10.1038/srep30371
Carson, R. L., Zwinger, A. H., & Levinton, J. S. (1991). The sea around us. Oxford University Press. (Original work published 1951)
Feld, K., Kolborg, A. N., Nyborg, C. M., Salewski, M., Steffensen, J. F., & Berg Sørensen, K. (2019). Dermal Denticles of Three Slowly Swimming Shark Species: Microscopy and Flow Visualization. Biomimetics, 4(2), 38. https://doi.org/10.3390/biomimetics4020038
Friedman, S. T., Price, S. A., Corn, K. A., Larouche, O., Martinez, C. M., & Wainwright, P. C. (2020). Body shape diversification along the benthic–pelagic axis in marine fishes. Proceedings of the Royal Society B: Biological Sciences, 287(1931), 20201053. https://doi.org/10.1098/rspb.2020.1053
Helfman, G. S., Collette, B. B., Facey, D. E., & Bowen, B. W. (2009). The Diversity of Fishes: Biology, Evolution and Ecology (2nd ed., pp. 23–41). John Wiley & Sons.
Herring, P. (2001). The Biology of the Deep Ocean. Oxford University Press. https://doi.org/10.1093/oso/9780198549567.001.0001
Herring, P. J., & Cope, C. (2005). Red bioluminescence in fishes: on the suborbital photophores of Malacosteus, Pachystomias and Aristostomias. Marine Biology, 148(2), 383–394. https://doi.org/10.1007/s00227-005-0085-3
Irigoien, X., Klevjer, T. A., Røstad, A., Martinez, U., Boyra, G., Acuña, J. L., Bode, A., Echevarria, F., Gonzalez-Gordillo, J. I., Hernandez-Leon, S., Agusti, S., Aksnes, D. L., Duarte, C. M., & Kaartvedt, S. (2014). Large mesopelagic fishes biomass and trophic efficiency in the open ocean. Nature Communications, 5(1). https://doi.org/10.1038/ncomms4271
Lindeman, R. L. (1942). The Trophic-Dynamic Aspect of Ecology. Ecology, 23(4), 399–417. https://doi.org/10.2307/1930126
Magurran, A. E. (1990). The adaptive significance of schooling as an anti-predator defense in fish. Annales Zoologici Fennici, 27(2), 51–66.
Marras, S., Killen, S. S., Lindström, J., McKenzie, D. J., Steffensen, J. F., & Domenici, P. (2014). Fish swimming in schools save energy regardless of their spatial position. Behavioral Ecology and Sociobiology, 69(2), 219–226. https://doi.org/10.1007/s00265-014-1834-4
Marshall, D. J., & Alvarez-Noriega, M. (2020). Projecting marine developmental diversity and connectivity in future oceans. Philosophical Transactions of the Royal Society B: Biological Sciences, 375(1814), 20190450. https://doi.org/10.1098/rstb.2019.0450
Mimura, N. (2013). Sea-level rise caused by climate change and its implications for society. Proceedings of the Japan Academy, Series B, 89(7), 281–301. https://doi.org/10.2183/pjab.89.281
National Oceanic and Atmospheric Administration. (n.d.). Why are so many deep sea animals red in color? Ocean Exploration Facts: NOAA Office of Ocean Exploration and Research. https://oceanexplorer.noaa.gov/facts/red-color.html
Pianka, E. R. (1970). On r- and K-Selection. The American Naturalist, 104(940), 592–597. https://doi.org/10.1086/282697
Přikryl, T., & Carnevale, G. (2017). Miocene bristlemouths (Teleostei: Stomiiformes: Gonostomatidae) from the Makrilia Formation, Ierapetra, Crete. Comptes Rendus Palevol, 16(3), 266–277. https://doi.org/10.1016/j.crpv.2016.11.004
Ritz, D. A., Hobday, A. J., Montgomery, J. C., & Ward, A. J. W. (2011). Chapter Four - Social Aggregation in the Pelagic Zone with Special Reference to Fish and Invertebrates. Advances in Marine Biology, 60(1), 161–227. https://doi.org/10.1016/B978-0-12-385529-9.00004-4
Roberts, C. D. (1993). Comparative morphology of spined scales and their phylogenetic significance in the Teleostei. Bulletin of Marine Science, 52(1), 60-113.
Rubenstein, M. A., Weiskopf, S. R., Bertrand, R., Carter, S., Comte, L., Eaton, M., Johnson, C. G., Lenoir, J., Lynch, A., Miller, B. W., Morelli, T. L., Rodriguez, M. A., Terando, A., & Thompson, L. (2023). Climate change and the global redistribution of biodiversity: Substantial variation in empirical support for expected range shifts. Journal of Environmental Evidence, 12(7). https://doi.org/10.1186/s13750-023-00296-0
Ruxton, G. D., Speed, M. P., & Kelly, D. J. (2004). What, if anything, is the adaptive function of countershading? Animal Behaviour, 68(3), 445–451. https://doi.org/10.1016/j.anbehav.2003.12.009
Schwarzhans, W., & Carnevale, G. (2021). The rise to dominance of lanternfishes (Teleostei: Myctophidae) in the oceanic ecosystems: a paleontological perspective. Paleobiology, 47(3), 446–463.
Shu, D.-G., Luo, H.-L., Morris, S. C., Zhang, X.-L., Hu, S.-X., Chen, L., Han, J., Zhu, M., Li, Y., & Chen, L.-Z. (1999). Lower Cambrian vertebrates from south China. Nature, 402(6757), 42–46. https://doi.org/10.1038/46965
Sutton, T. T., Wiebe, P. H., Madin, L., & Bucklin, A. (2010). Diversity and community structure of pelagic fishes to 5000 m depth in the Sargasso Sea. Deep Sea Research Part II: Topical Studies in Oceanography, 57(24-26), 2220–2233. https://doi.org/10.1016/j.dsr2.2010.09.024
Young, R., & Roper, C. (1976). Bioluminescent countershading in midwater animals: evidence from living squid. Science, 191(4231), 1046–1048. https://doi.org/10.1126/science.1251214
Zeng, Y., Plachetzki, D. C., Nieders, K., Campbell, H., Cartee, M., Pankey, M. S., Guillen, K., & Fudge, D. (2023). Epidermal threads reveal the origin of hagfish slime. eLife, 12, e81405. https://doi.org/10.7554/eLife.81405
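As promised under 'Trophic position', here is a rough back-of-the-envelope sketch in Python of what the 10% energy-transfer rule implies; the flat transfer efficiency and the numbers are our own illustration, not figures taken from the cited papers.

# A minimal sketch, assuming a flat 10% transfer efficiency at every step
# (real food webs vary widely); numbers are illustrative, not from the references.
TRANSFER_EFFICIENCY = 0.10

def primary_production_needed(trophic_level, consumer_biomass=1.0):
    # Units of primary-producer energy ultimately behind one unit of biomass
    # at the given trophic level (level 1 = primary producer).
    return consumer_biomass / TRANSFER_EFFICIENCY ** (trophic_level - 1)

for level in (2, 3, 4, 5):
    print(f"Trophic level {level}: ~{primary_production_needed(level):,.0f} units of primary production")

A third-order consumer (trophic level 4) already sits on roughly a thousand units of primary production per unit of its own tissue, while an apex predator one level up needs ten times more, which is why the frankenfish is pitched one step above the super-abundant bristlemouths and lanternfish rather than at the very top of the food chain.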
- Death of the Scientific Hero
< Back to Issue 3 Death of the Scientific Hero By Clarisse Sawyer 10 September 2022 Edited by Ruby Dempsey Illustrated by Quynh Anh Nguyen

Trigger warning: This article mentions racism, sexism, misogyny and death.

As a kid I was obsessed, like most kids, with animals of any kind. I would spend hours at a time scouring the beach for shells, getting sunburnt watching lizards, and tentatively feeding the praying mantises I caught, watching with morbid fascination as they hunted and dismembered the unfortunate crickets. It was only natural that I soon became interested in science. The long days of summer holidays were spent poring over children's encyclopaedias and watching David Attenborough documentaries. Through David Attenborough, I discovered two incredibly influential scientists - the co-discoverers of evolution, Charles Darwin and Alfred Wallace. I idolised them, Wallace in particular. As a shy child, who avoided the limelight like the plague, I had a natural inclination to root for the underdog, and Wallace was presented as such. Wallace was, in contrast to Darwin, much poorer, much more humble, and received much less credit for the theory of evolution than his co-discoverer Darwin. In my developing brain, Wallace took on the status of hero. I would chatter incessantly about him. I developed an interest in insects and butterfly collecting because he was a lepidopterist. I am sure my parents found me insufferable, but they hid their frustrations well, through subtle eye rolls and conversation changes, because they were happy to see me interested in science. So for my 11th birthday, my Dad bought me a book of Wallace's letters from his time spent as a butterfly collector in the Malay Archipelago. The book was a lot drier than an 11-year-old would have hoped for. Most of it was just taxonomy, peppered with the odd personalised comment complaining about the heat. But there was one passage which stood out to me in particular. A passage in which he describes shooting a "wild woman", upon mistaking her for an orangutan in the forest canopy. In this section he details taking the baby she carefully carried on her back, and raising it as his own "n-word baby". He promptly taxidermied the mother, with the intention of selling her remains to a wealthy private collector in England [7]. It was at this point I stopped reading. At 11, there was no way I could tell this was just an incredibly bad-taste joke, and that in reality Wallace had actually shot a peculiar subspecies of orangutan, and not a Malaysian woman carrying her child. At 11, I believed my hero would kill me, if I wasn't half white, if I wasn't so light-skinned, if I didn't wear clothes, if I didn't speak English. I would wonder for years afterwards: how brown would I have to be? To be plastinated, taxidermied, sold to some rich collector to sit in a sterile glass cabinet, at the back of some ex-nobleman's mansion. The passage ruined Wallace for me, but not science. Sometimes I wonder, if my passion for science was only marginally less, would I still be in science? I don't know. For every child who is only mildly deterred by the racism or sexism of their former heroes, surely there is one child whose passion slowly fades, until the only time it is mentioned is by anxious mothers pushing their children to study medicine. I lost my hero, a precedent for who a scientist should be, in addition to developing a paranoia.
A paranoia that if I were to start idolising another white, male, historical, scientific figure, I would be met with the same realisation that he would've despised me. And I haven't been able to find a new hero since. Despite there being numerous people of colour and women in science for millennia before me, they weren't the ones promoted to me, or if they were, I found them unrelatable save for their gender or the colour of their skin. They were people who were, 99% of the time, hardworking to a fault, such as Marie Curie. Often this diligence was presented as being a detriment to their happiness. So my decision to study science, like many other women and people of colour, was also a decision to be my own precedent for what a scientist should be. While this is empowering, it is difficult not to envy those, like the privileged archetype of a white man, who might be able to draw confidence and inspiration from the figures in the preliminary pages of scientific textbooks. Whilst the majority of them may prove unrelatable, the sheer quantity would ensure that at least one would be a sympathetic character, in stark contrast to the singular, tokenistic entries on historical non-white or female scientists in such textbooks. But does it really have to be this way? Why should anyone have to feel alienated by scientific history? Why are there not more diverse heroes for us to fall back on? At the crux of my alienation from Wallace, and scientific history more generally, was deceit, more specifically what I perceived as lying by omission. The initial presentation of scientific figures such as Wallace by media, institutions and the like is so sympathetic and devoid of grisly details that, upon discovering the multifaceted nature of these individuals, I experienced a kind of historical whiplash. A scientific education is often presented as being objective. What you are taught in a classroom, at least at a primary or secondary level, is not meant to be subject to much nuance or interpretation. Now, when this concerns science itself, it is a non-issue, because it is true, for instance, that chromosomes are made of DNA, or that the first electron shell of an atom holds at most two electrons. The issue is that the perception of objectivity carries over into the way science history is taught. Unfortunately, this teaching is unavoidably subjective. Teachers and institutions often present positive anecdotes about scientists' hobbies and personal lives. A teacher may share, for instance, an endearing fact about the influential French palaeontologist Georges Cuvier: that he became as knowledgeable in biology as university-trained naturalists by the age of 12 [6]. However, said teacher may neglect to mention the fact that after her death, Georges Cuvier dissected and taxidermied Sarah Baartman, a South African woman of the Khoisan tribe, and paraded her as a freak for the English public [5]. Her plastinated body remained on display at the Museum of Man in Paris until 1974 [4]. In this example, it would be impossible to say that the teacher's presentation of Cuvier was objective. Choosing to share the nicest facts about a scientist, to make them appealing to your audience, while neglecting the ugly truths, is at best irresponsible, and at worst, lying by omission.
Abhorrent actions, such as Cuvier's treatment of Baartman's corpse, a woman with whom he had danced and conversed before her death, are treated as unnecessary details in objective scientific history, as they do not pertain to Cuvier's scientific discoveries. However, equally unnecessary details, such as Cuvier's early aptitude for biology, are peppered into school curricula liberally. Yet it would be unfair to say that the primary reason why natural history is taught in this way is conscious racism and sexism. There are a multitude of explanations for why educators teach like this. Educators may choose to include only the nicer traits of scientific figures, in part perhaps because they do not want to risk disengaging students with affronting subject matter. Further, the morbidity and the racism of scientific history is not exactly appropriate content to teach to younger children. Precedent also plays a role in the way in which natural history is taught. Teaching natural history in an unbiased and inclusive fashion would require rewriting a lot of material. Educators would also have to reevaluate their own personal perceptions of historical figures, which is a difficult task. For instance, in Australia, the textbooks A Short History of Australia [2] and The Story of Australia [3], which were staples of Australian high school history classes for decades, are white-centric stories of Australian exploration, which gloss over perturbing historical details such as massacres of Indigenous peoples. While teaching scientific history in a fair, unbiased and age-appropriate manner might seem like an impossible task, there are a variety of small steps educators can take towards this end goal. A strong start would be the following: if teachers decide to include personal details about famous scientific figures, they should seek to include both positive and negative anecdotes, and frame negative actions in a disapproving light. The negative anecdotes serve to ensure that students don't get 'whiplash' as they pursue their education, and also serve to show that modern science does not condone or approve of these actions. In the case of younger students, it is best for teachers to avoid talking about triggering topics, so teachers should teach scientific history from an objective standpoint, sans personal details. Teachers also should, as part of their responsibilities as educators, seek out alternative historical perspectives which challenge their own preconceived notions. And educational institutions should offer professional development courses which provide educators with a more balanced view of scientific history. These actions would help eliminate any subliminal biases teachers might have whilst teaching scientific history. And why are there not more diverse heroes for us to fall back upon? The lack of equal opportunity for marginalised groups in Western society for most of history, and the systemic erasure of their contributions, is an obvious reason. However, by relying on secondary, colonial sources for information instead of delving deeper into primary sources, educators and institutions also inadvertently gloss over scientific contributions by marginalised groups. For example, the contributions of Indigenous Australian scientists and explorers are often ignored by museums.
Many famous white explorers of Australia, such as Thomas Mitchell, Charles Sturt and Alexander Forrest, worked closely alongside Indigenous guides, who helped navigate territory and point out items of scientific interest, and their names are often acknowledged in primary sources [1]. For instance, one of explorer Thomas Mitchell's chief guides, Yuranigh, is mentioned extensively in Mitchell's personal accounts of his expeditions, and was acknowledged posthumously by Mitchell with a grave and monument [1]. These people, who were explorers in their own right, have largely been relegated to the footnotes of history and museums, particularly after publications such as the aforementioned textbooks A Short History of Australia and The Story of Australia in the 1950s, which deliberately omitted Indigenous contributions to white Australian exploration in order to sell the false narrative of terra nullius. Luckily, by researching primary sources further, historians, educators and curators will be able to change the narrative, and shed light on these marginalised scientists. But what of scientific heroes? How is it possible to keep students engaged without the more personal aspects of science, given that many scientific figures will have to be cut from curricula, at least for younger students? My answer to that would be to find new heroes. History is littered with people who made significant contributions without committing atrocities. And who knows, maybe in the void left by problematic figures, space could be cleared for more diverse heroes, the kind removed from history textbooks, such as Yuranigh: an exciting prospect. And yet, there is an unavoidable anguish in throwing out the old in favour of the new. Coming to terms with the fact that the people we idolised were terrible people is no easy feat. But all we can endeavour to do is to portray scientific figures as they were. To portray all aspects of these figures, good and bad, or none at all, and hopefully develop a new history, a new tradition, one that is inclusive, one that everyone can be proud of and take solace in.

References
1. Watson T. Recognising Australia's Indigenous explorers [Internet]. researchgate.net. 2022 [cited 19 May 2022]. Available from: https://www.researchgate.net/publication/321579451_Recognising_Australia's_indigenous_explorers
2. Scott E. Short History of Australia. Forgotten Books; 2019.
3. Shaw A. The story of Australia. London: Faber; 1975.
4. Parkinson J. The significance of Sarah Baartman [Internet]. BBC News. 2022 [cited 19 May 2022]. Available from: https://www.bbc.com/news/magazine-35240987
5. Kelsey-Sugg A, Fennell M. Sarah Baartman was taken from her home in South Africa and sold as a 'freak show'. This is how she returned [Internet]. Abc.net.au. 2022 [cited 19 May 2022]. Available from: https://www.abc.net.au/news/2021-11-17/stuff-the-british-stole-sarah-baartman-south-africa-london/100568276
6. Georges Cuvier [Internet]. Britannica Kids. 2022 [cited 19 May 2022]. Available from: https://kids.britannica.com/students/article/Georges-Cuvier/273885
7. Wallace A, Van Wyhe J, Rookmaaker K. Letters from the Malay Archipelago. Oxford: Oxford University Press; 2013.
- Fool Me Once | OmniSci Magazine
< Back to Issue 4 Fool Me Once by Julia Lockerd 1 July 2023 Edited by Tanya Kovacevic and Elijah McEvoy Illustrated by Sonia Santosa

I have rabies. I'm absolutely sure of it. I mean, I can't really tell, but that's the silent killer, right? You don't know you're rabid till it's all over, and you're foaming at the mouth and biting your student tutor on the leg. Despite being completely safe here in Australia with its complete lack of rabies-having animals, I'm still pretty sure I've managed to catch it. Next week it will all be over for me and my tutor. Sorry, James. Of course, it's not actually rabies that I've contracted, but a much more common condition: Medical Student Syndrome (1). Last week in my lectures, we learned all the ins, outs, and symptoms of the rabies virus. So, naturally, now we all have it. This health-related anxiety is a prime example of how our human brains can trick us into experiencing phantom symptoms. The same cognitive veil is used in clinical trials all over the world in order to test the efficacy of new drugs. We've all felt it. That moment when you question, 'Is this real, or is my mind making its own reality?' We call this the placebo effect. The placebo effect is crucial to modern and historical experimental design. The 'trickable' nature of the human mind has changed the course of drug development as we know it. The effect's success hinges on a patient's belief that they are receiving treatment for their ailment. The simple belief in a cure can often result in real physiological changes in an individual. This makes the placebo effect a very powerful tool in the development of new drugs for the market. In a placebo-controlled trial, half of the sample population will be blindly given a placebo, and the other half the drug being tested. In order for a potential treatment to be considered effective, it must produce more significant results than the placebo group (2). (A toy simulation of exactly this kind of comparison appears after this article's references.) So how can we improve the way we design experiments and test hypotheses? Can we use what we know about the placebo effect to make more accurate claims about modern pharmaceutical development? Well, in 2017, Dr. Sara Vambheim of the Arctic University of Norway published a study that brought into consideration the possible effects of differing sexual characteristics on placebo efficacy (3). This idea could restructure the way experiments are designed going forward and potentially prompt a review of drugs already on the market. Is it possible that traditionally marginalised groups are underrepresented in the clinical trial process? Can we restructure experiments to be more inclusive? Are changes even really necessary? These questions were investigated through the compilation and analysis of placebo and nocebo effects on men and women over multiple previously conducted studies, mostly centred on physical pain and the administration of analgesia. The term 'nocebo' defines the antithesis of a placebo (4), referring to adverse side effects a subject feels when given an inert version of the test drug. While placebos tend to have an analgesic effect, nocebos often cause negative effects or emotions when the subjects are told that they should expect or anticipate them. Before discussing any of these questions, it is worth noting that the Norwegian study focuses solely on classic sexual differences between cis-gender men and women.
Though both keywords 'gender' and 'sex' were included in the study, research into the specific effects of gender identity and gender-affirming therapies on placebos has not been thoroughly conducted as of 2023. It is with this focus that the following hypotheses are stated (3): "1) placebo responses would be stronger or more frequently observed in males than in females, 2) nocebo responses would be stronger or more frequently observed in females than in males, 3) verbally induced placebo responses would be more frequently observed in males than in females, and 4) conditioned nocebo responses would be more frequently observed in females than in males." Results concluded that there was indeed a significant correlation between sex and placebo/nocebo effects when it comes to pain relief. But what is truly fascinating is that while men received elevated levels of a placebo effect, such as reduced symptoms and analgesia, women were more susceptible to hyperalgesia and negative emotions. Those supposed 'side effects' appear to weigh more heavily on women (3). What does this say about how men and women process pain and information? The Norwegian study discusses the role of 'psychophysiological mechanisms' in pain pathways. Or, more simply, how stress and anxiety can affect the pain the brain perceives. In 8 of the 12 studies, men experienced significantly stronger analgesic effects from the placebo than women (3). It is plausible that men react more strongly to pain induced by stress hormones. This would explain why, when taking a placebo, their anxiety level would decrease and they would receive higher levels of analgesia than their female counterparts (3). Another study, upon which the Norwegian argument builds, investigates placebo delivery methods and their effect on perceived pain in men and women. In this study, men relied far more on verbal cues to provide analgesia, whereas women received a more significant effect from classical conditioning (5). These studies bring into question both the methodological and physiological effects of placebos on different sexes. What do these differences tell us about how men and women perceive the world? And what does this mean for the future of the placebo? The result of all of these studies is to show not whether placebos are bad or good, reliable or unreliable, but instead to highlight the differences in the physiological and psychological links when looking at different groups of people. At its core, a placebo is simply a trick of the brain, a psychological mirage. While the basis and reliability of placebos can be debated at length, their effect on the human brain teaches us something about ourselves societally. In all areas of medicine, the inclusion of people from all different backgrounds, genders, ethnicities, and ages is crucial so professionals know how to identify and treat various manifestations of a disease with grace and care. Now I know James responds better to verbal commands; I'll be sure to tell him he has rabies the next time I see him.

References
1. Henning Schumann J. I contracted medical student syndrome. You probably will too. [Internet]. AAMC. [cited 2023 Jun 22]. Available from: https://www.aamc.org/news/i-contracted-medical-student-syndrome-you-probably-will-too
2. Harvard Health Publishing. The power of the placebo effect - Harvard Health [Internet]. Harvard Health; 2021. Available from: https://www.health.harvard.edu/mental-health/the-power-of-the-placebo-effect
3. Vambheim S, Flaten MA. A systematic review of sex differences in the placebo and the nocebo effect. Journal of Pain Research. 2017 Jul;10:1831–9.
4. National Cancer Institute (NCI). Definition of nocebo effects [Internet]. www.cancer.gov. 2011. Available from: https://www.cancer.gov/publications/dictionaries/cancer-terms/def/nocebo-effect
5. Enck P, Klosterhalfen S. Does Sex/Gender Play a Role in Placebo and Nocebo Effects? Conflicting Evidence From Clinical Trials and Experimental Studies. Frontiers in Neuroscience. 2019 Mar 4;13.
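To make the placebo-controlled comparison described above concrete, here is a toy sketch in Python; the pain-relief scores are simulated numbers of our own invention, not data from any of the cited studies, and the unpaired t-test stands in for whatever analysis a real trial protocol would specify.

# A minimal sketch, assuming normally distributed pain-relief scores and an
# unpaired two-sample t-test; effect sizes are made up for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
placebo_arm = rng.normal(loc=2.0, scale=1.5, size=100)  # relief reported under placebo
drug_arm = rng.normal(loc=2.8, scale=1.5, size=100)     # relief reported under the drug

t_stat, p_value = stats.ttest_ind(drug_arm, placebo_arm)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# Only if p is small (conventionally below 0.05) would the trial conclude that
# the drug adds something beyond the placebo effect itself.

The same sketch also hints at why the sex differences discussed above matter: if men and women respond differently to the placebo arm itself, pooling everyone into a single comparison can mask or exaggerate the drug's apparent benefit for each group.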








