
  • Friend or Foe?: The Mechanisms Behind Facial Recognition | OmniSci Magazine

    Friend or Foe?: The Mechanisms Behind Facial Recognition
    by Mishen De Silva | 3 June 2025
    Edited by Luci Ackland | Illustrated by Aisyah Mohammad Sulhanuddin

    Among the many mysteries that encompass the world around us lies a complex interaction right under our nose, or perhaps… right above it. In the labyrinth of human consciousness, we rely on the seemingly arbitrary judgements made from the combination of two eyes, a nose and a mouth to discern who might be a friend or foe. Facial recognition gives a snapshot into the intricate dance between our perception and cognition, allowing us to cultivate a more detailed understanding of those around us and their thoughts, feelings and emotions. In those fleeting moments when you recognise your parents in a sea of unfamiliar faces, spot your friends ensconced among the rows of the lecture theatre, or simply bump into an old friend in a crowd of strangers, your brain identifies faces in a fraction of a second, a remarkable feat of human cognitive capacity. But what enables us to distinguish one face from another? How do the faces of those we know stand out from the countless other noses, eyes and mouths we see? To understand what makes these interactions so meaningful, we need to take a closer look at the mechanisms behind facial recognition and decoding within the brain.

    The Brain’s Blueprint

    To be human is to seek meaning, even when none may exist. The mind has transformed what is two eyes above a nose, and a nose above a mouth, into its own pattern for classifying the identities and expressions we see around us. Many studies suggest that facial processing is holistic: the featural patterns of the eyes, nose and mouth are perceived together as a single upright configuration, rather than feature by feature (1,2). This mechanism of holistic facial processing explains the interesting phenomenon of pareidolia, where the brain projects the characteristics of human faces onto everyday objects.
    It’s the reason a bowling ball can appear surprised at a glance (3), or why some have sworn to see a face on Mars (4)!

    Figure 1. Bowling balls with surprised facial expressions! (3)

    In pursuit of meaning for the patterns around us, the brain has developed specialised regions for processing the features of a face to help us recognise individual identities. Facial processing operates through a hierarchical mechanism where distinct aspects of the face are interpreted by different regions of the brain. The unchanging elements of the face, such as gender, age, ethnicity and features related to someone’s identity, are analysed by the inferior occipital gyrus and fusiform face area (FFA), while the changing aspects, such as eye gaze, lip movements and facial expressions, are analysed by the superior temporal sulcus and orbitofrontal cortex (5,6). Of these face-selective regions, the FFA is particularly important for facial recognition, as it helps us recognise who a person is (5). Through the activation of our FFA, simple patterns shift from meaningless shapes into familiar visages representing our friends, family, or even our own reflection. Studies have uncovered the importance of the FFA for facial recognition by examining what happens when this brain region malfunctions (7,8). A unique example of this is prosopagnosia, which results from damage to the FFA in the right hemisphere of the brain (9). Prosopagnosia affects as many as 1 in 50 people, impairing their ability to recognise faces (9). Imagine if every face you observed looked the same or unfamiliar… even your own reflection! It is through the brain and its specialised regions for facial recognition that we can appreciate the essence of human connection as a result of our neural hardware. These mechanisms responsible for transforming patterns into faces are the reason we can tell our neighbour from a stranger, a friend from a classmate, or a parent from a teacher.
    Often overlooked amidst the fleeting and impermanent nature of our social interactions, this complex system guides us along the fragile line of human relationships, between familiarity and estrangement, a friend or foe. It highlights how deeply rooted our connection and sense of identity are in the faces we see.

    The Brain’s Threat Detection

    With each neuron, synapse and pathway, our brains are machines wired for connection, not just in how we think, but also in how we perceive and interact with our surroundings. From the brief exchange of smiles with a stranger to the furtive glare from someone across the room, one of the hallmarks of our emotional understanding is the ability to decode the thoughts and intentions of others, even from the most subtle of expressions. In the vast and intricate web of neural connectivity, it can be difficult to isolate a single brain region or connection to explain complex cognitive functions. Brain imaging studies have found a strong bidirectional link between the FFA and amygdala, making this a likely candidate for explaining our remarkable decoding ability (10,11). As the FFA picks up on who a person is or what facial expression is being made, it is the amygdala which then evaluates the emotional salience, or importance, of this face. The amygdala then signals back to the FFA to either increase or decrease facial processing activity accordingly (10,12). Consider how the visibility of teeth in a bared expression can signal anger, the whiteness of someone’s eyes can hint at fear or surprise, and the shape of a person’s eyebrows can indicate the intensity of their emotion, all of which guide the brain to prioritise and interpret socially and emotionally relevant cues, almost like a survival filter! (13,14,15).
    From an evolutionary perspective, the FFA-amygdala feedback loop serves as an important tool for rapidly and accurately interpreting the intentions of others, a cornerstone in the architecture of our physical and social survival (16). The ability to recognise whether someone is a friend or foe has been a survival mechanism and evolutionary advantage for millennia. The role of our facial processing network, from the amygdala and FFA to the other brain regions discussed, offers a microcosm of our nature as social beings and of the evolutionary pressures that have enhanced our ability to sense, respond to, and connect with those around us (17). In this way, maybe the most profound mysteries lie not in distant galaxies or ancient ruins, but are hidden in plain sight, within the faces we walk past every day. Our brain’s ability to read them is not merely a mechanism for decoding emotion, but a mirror into the nature of what it means to be human, where connection, trust, and survival have long been written in the expressions of those around us.

    References
    1. Farah M, Wilson K, Drain M, Tanaka J. What is “special” about face perception? Psychological Review [Internet]. 1998 Aug [cited 2025 May 14];105(3):482–98. Available from: https://pmc.ncbi.nlm.nih.gov/articles/PMC5561817/
    2. Richler J, Gauthier I. A meta-analysis and review of holistic face processing. Psychological Bulletin [Internet]. 2014 Sep [cited 2025 May 14];140(5):1281–302. Available from: https://pubmed.ncbi.nlm.nih.gov/24956123/
    3. What do you think these bowling balls saw to leave them so surprised & shocked? Reddit [Internet]. 2022 [cited 2025 May 31]. Available from: https://www.reddit.com/r/Pareidolia/comments/zc12jo/what_do_you_think_these_bowling_balls_saw_to/
    4. Gilbert L. Why the brain is programmed to see faces in everyday objects. UNSW Sites [Internet]. 2020 Aug [cited 2025 May 14].
    Available from: https://www.unsw.edu.au/newsroom/news/2020/08/why-brain-programmed-see-faces-everyday-objects
    5. Kanwisher N, Yovel G. The fusiform face area: a cortical region specialized for the perception of faces. Philosophical Transactions of the Royal Society: Biological Sciences [Internet]. 2006 Dec 29 [cited 2025 May 14];361(1476):2109–28. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1857737/
    6. Zhen Z, Fang H, Liu J. The hierarchical brain network for face recognition. Ptito M, editor. PLoS ONE [Internet]. 2013 Mar [cited 2025 May 14];8(3):e59886. Available from: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0059886
    7. Hadjikhani N, de Gelder B. Neural basis of prosopagnosia: an fMRI study. Human Brain Mapping [Internet]. 2002 [cited 2025 May 14];16(3):176–82. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/hbm.10043
    8. Sorger B, Goebel R, Schiltz C, Rossion B. Understanding the functional neuroanatomy of acquired prosopagnosia. NeuroImage [Internet]. 2007 Apr [cited 2025 May 14];35(2):836–52. Available from: https://www.sciencedirect.com/science/article/pii/S1053811906009906
    9. Prosopagnosia | Psychology Today Australia [Internet]. www.psychologytoday.com [cited 2025 May 14]. Available from: https://www.psychologytoday.com/au/basics/prosopagnosia
    10. Herrington J, Taylor J, Grupe D, Curby K, Schultz R. Bidirectional communication between amygdala and fusiform gyrus during facial recognition. NeuroImage [Internet]. 2011 Jun [cited 2025 May 14];56(4):2348–55. Available from: https://pubmed.ncbi.nlm.nih.gov/21497657/
    11. Said C, Dotsch R, Todorov A. The amygdala and FFA track both social and non-social face dimensions. Neuropsychologia [Internet]. 2010 Oct [cited 2025 May 14];48(12):3596–605. Available from: https://pubmed.ncbi.nlm.nih.gov/20727365/
    12. Šimić G, Tkalčić M, Vukić V, Mulc D, Španić E, Šagud M, et al. Understanding emotions: origins and roles of the amygdala. Biomolecules [Internet].
    2021 May [cited 2025 May 14];11(6):823. Available from: https://pmc.ncbi.nlm.nih.gov/articles/PMC8228195/
    13. Jacobs R, Renken R, Aleman A, Cornelissen F. The amygdala, top-down effects, and selective attention to features. Neuroscience & Biobehavioral Reviews [Internet]. 2012 Oct [cited 2025 May 14];36(9):2069–84. Available from: https://pubmed.ncbi.nlm.nih.gov/22728112/
    14. Horstmann G, Lipp O, Becker S. Of toothy grins and angry snarls – Open mouth displays contribute to efficiency gains in search for emotional faces. Journal of Vision [Internet]. 2012 May [cited 2025 May 14];12(5):7. Available from: https://jov.arvojournals.org/article.aspx?articleid=2192034
    15. Hasegawa H, Unuma H. Facial features in perceived intensity of schematic facial expressions. Perceptual and Motor Skills [Internet]. 2010 Feb [cited 2025 May 14];110(1):129–49. Available from: https://pubmed.ncbi.nlm.nih.gov/20391879/
    16. Schmidt K, Cohn J. Human facial expressions as adaptations: evolutionary questions in facial expression research. American Journal of Physical Anthropology [Internet]. 2001 [cited 2025 May 14];116(S33):3–24. Available from: https://pubmed.ncbi.nlm.nih.gov/11786989/
    17. Carter E, Pelphrey K. Friend or foe? Brain systems involved in the perception of dynamic signals of menacing and friendly social approaches. Social Neuroscience [Internet]. 2008 Jun [cited 2025 May 14];3(2):151–63. Available from: https://pubmed.ncbi.nlm.nih.gov/18633856/

  • The Mirage of Camouflage | OmniSci Magazine

    The Mirage of Camouflage
    by Krisha Ajay Darji | 1 July 2023
    Edited by Megane Boucherat and Tanya Kovacevic | Illustrated by Aisyah Mohammad Sulhanuddin

    Imagine driving on a highway as the road shimmers under the scorching midday sun. As you drive on, you might see a wet patch gleaming on the road ahead. Does it make you wonder how a mirage plays with your vision? While there is physics involved in this phenomenon, evolution through natural selection has granted some organisms the ability to play with visual perception in subtle but enchanting ways! What comes to mind when you hear the word camouflage? Some might visualize a chameleon blending into almost any background possible. Others might envision a soldier wearing camouflage pants and shirts in earthy tones for defence. Colourful frogs, butterflies, snakes and so on might cross your mind as you think deeper about this phenomenon. Nature is filled with some of the most fascinating examples of camouflage.

    Camouflage as a Prehistoric Phenomenon

    A group of scientists studied the coloration patterns found on Sinosauropteryx, a tiny, feathered, carnivorous dinosaur that lived in what is now China during the Early Cretaceous period. By tracing the distribution of the dark pigmented feathers over the body, they discovered evidence of coloration patterns corresponding to modern animal camouflage. This included stripes running around its eyes and across the tail, and countershading with a dark back and pale underside. By comparing and contrasting the mask and striped tail with the colours of contemporary animals, we can learn more about the evolution of camouflage as a means of natural selection [1]. Why certain animals have stripes only on their tails rather than the whole body is not well understood, but the stripes are suspected to function as a type of disruptive camouflage.
    Disruptive camouflage visually separates the outline of one portion of the body from the rest, making it less noticeable. It could also serve as a type of deception, attracting predators' attention to the tail and away from the more vital parts: the body and head. Birds, which descend from theropod dinosaurs, are the most evident modern illustration of this [1]. Early tyrannosauroids, the ancestors of the ferocious T-rex, coexisted with Sinosauropteryx and may have even hunted the little dinosaur. Sinosauropteryx itself hunted tiny lizards, as demonstrated by direct evidence: a whole lizard preserved in the stomach of one of the specimens found. Hence, it is clear that camouflage patterns were already developing at that time, since vision was critically important to these dinosaurs while they were hunting and being hunted. This example demonstrates camouflage as a prehistoric phenomenon and its evolution in the animal kingdom.

    Camouflage in Modern Day Animals

    Animals use camouflage primarily for defence. Blending in with their background prevents them from being seen easily by predators. Warning coloration, mimicry, countershading, background matching and disruptive coloration are mechanisms through which animals employ camouflage.

    Sneaky Snakes!

    The harmless scarlet king snake has stripes that resemble those of the deadly coral snake, but it is not venomous. The only significant distinction between the two is the arrangement of the colours in their patterns. While the pattern for coral snakes is red-yellow-black, for scarlet king snakes it is red-black-yellow [2]. The difference is simple for anyone to remember thanks to a rhyme!

    Red on yellow kills a fellow,
    Red on black won’t hurt Jack!

    This is a classic example of mimicry: a form of camouflage in which one organism imitates the appearance of another to avoid predators.

    The Walking Leaf!
    The leaf insect, or walking leaf, belongs to the family Phylliidae and lives up to its name. The walking leaf's body has patterns on its outer edges that look like the bite marks caterpillars leave behind in leaves. To more accurately resemble a leaf swaying in the breeze, the insect even sways while walking! This is an example of background matching, one of the most prevalent forms of camouflage: a mechanism through which an organism hides by resembling its surroundings in hue, shape, or movement [2].

    Mottled Moth!

    It is challenging for predators to determine the form and direction of the tiger moth, as its wings are mottled with intricate patterns of black, white, and orange. This is an example of disruptive camouflage: when an animal has a patterned coloration, such as spots or stripes, it can be difficult to detect the animal's contour [2].

    Lurking Leopards!

    Black rosettes on a light tan backdrop are the hallmarks of the leopard's well-known coat pattern. Their coats also include subtle countershading to help them blend into their environment and evade detection by prey. A leopard's underside, mostly its belly and the lower parts of its legs, is significantly lighter than the rest of its coat. This produces a shading effect that helps conceal the leopard's body form and contour, making it more challenging to see in low light or when viewed from below. This is a typical example of countershading, a type of camouflage wherein the animal's body is darker in colour, but its underside is lighter. It works by manipulating the interactions between light and shadow, making the animal difficult to detect [2]. But what gives these animals their colours?
    Animals produce their camouflage through two primary mechanisms: pigments, known as biochromes, and physical structures, such as prisms. While some species have natural, microscopic pigments known as biochromes, others possess physical structures like prisms for camouflage. Biochromes can reflect some wavelengths of light while absorbing others; species with biochromes can thus appear to alter their colour. Prisms can reflect and scatter light to give rise to a colour that is different from the animal's skin [2]. Camouflage is not restricted to the sense of vision. There are several other ways evolution has taught the living world to adapt and protect itself in the wild, including a whole exciting world of behavioural and olfactory camouflage employed by diverse species across the animal kingdom. Ultimately, the compelling association of camouflage with the phenomenon of mirage shows how nature continually evolves and expands to secure the continued existence of its inhabitants. From the glistening heat of mirages on arid vistas to the delicate patterns on the wings of a butterfly, this fascinating juxtaposition of mirage and camouflage delivers a peek into the incredible mechanisms that animals deploy to traverse their natural habitats and survive amidst the obstacles they encounter.

    References
    1. Smithwick F. We discovered this dinosaur had stripes – and that tells us a lot about how it lived [Internet]. 2017 [cited 2023 May 12]. Available from: https://theconversation.com/we-discovered-this-dinosaur-had-stripes-and-that-tells-us-a-lot-about-how-it-lived-86170
    2. National Geographic. Camouflage [Internet]. [cited 2023 May 12]. Available from: https://education.nationalgeographic.org/resource/camouflage/

  • Eyeballs, a Knife, and No Fear of God | OmniSci Magazine

    Eyeballs, a Knife, and No Fear of God
    by Jess Walton | 28 October 2025
    Illustrated by Anabelle Dewi Saraswati | Edited by Chavindi Sinhara Mudalige

    Humans have wanted to understand our bodies the entire time we’ve had them, which is to say, the entire time. Hellenistic Alexandria, around 300 BC, at a peak of intellectual prosperity: Herophilos cuts into a corpse. From this, he’s going to make the novel argument that the brain contains knowledge, and in doing so, he’s going to criticize Aristotle’s writing, which describes the brain as something akin to an air conditioner. Aristotle thought the brain was essentially a cooling chamber to prevent the heart from overheating, and that cognition happened in the heart. Much, much earlier, around 1000 BC in India, Sushruta, in his foundational surgical text, overestimated the bone count in humans by nearly 100. Many ancient societies had impressively detailed understandings of anatomy, considering they had no microscopes, no cameras, no X-rays; usually nothing more than their knives and eyeballs. It’s important to note as well that this article is a brief overview of a complex subject, with a major focus on Classical, meaning Ancient Greek and Roman, examples, and is in no way a complete story of early anatomical developments across the globe. Asia, Africa, the Americas and the Arab world each had their own rich and complex traditions, beyond the few examples cherry-picked here. Most societies had a few impressive hits and a few impressive misses; in a way, their approach to science isn’t all that different from ours today. What can we learn from them, and what can we learn about ourselves? In Ancient Athens, Aristotle believed the heart to be both the intellectual and emotional center of humans; the “seat of the soul” (1). Some remnants of this remain in our modern association between heart and emotion, though we now know it isn’t backed by science.
    His reasoning was the convergence of blood vessels at the heart and its evident importance; from this, he also, perhaps reasonably, thought it to be the source of blood (2). Despite being deservedly considered a major anatomist, Aristotle likely made his observations from examining and dissecting the bodies of animals, particularly lower mammals like dogs or livestock, instead of actual humans (3). He unknowingly relied on homologous structures, long before evolution was conceptualized or Charles Darwin was born, to essentially infer the anatomy of humans from other animals. Given this, his conclusions on the brain become a little more understandable. The brain is a strange-looking organ, critically important to life, though not obviously connected to the pulse or rich with blood; how were they to understand the structure of nerves and white matter? That it assists the heart in some way becomes a logical conclusion. So why not serve a cooling function? Blood is hot, so the heart must get hot. Overheating is usually bad; see fire. And the brain’s size makes it ideal for such a thing. The thing about anatomy and science, Aristotle’s assertion being one primordial example of many around the ancient world, is that it changes. Herophilos and Erasistratus were two more Greek anatomists who succeeded and often contested Aristotle. Unlike him, they dissected humans, having no qualms about a man’s dead—or, according to some sources, still alive—body (4). In doing so, they offered several accurate, or at least more accurate, insights into human bodies. Herophilos argued that the brain wasn’t a cooling chamber but contained knowledge (5). While he was at it, he argued that the heart has four chambers, unlike Aristotle, who claimed it only has three (5). Many of Herophilos and Erasistratus’ insights required Aristotle’s, or some other prior Mediterranean scholar’s, claims to give them something to criticise.
    Praxagoras was one such anatomist, from about 400 BC, roughly 100 years earlier. He correctly associated the pulse with natural movement within the body, but also asserted that arteries carry air (6). There is, possibly because of this claim, debate as to whether he had any practical anatomical experience or ever observed a dissection. If he did, it’s quite impressive to miss the blood in arteries. He did, however, note that veins carry blood (2). Thus, he was later included in Herophilos’ critique. Before we criticise how long it took them to realise seemingly obvious facts, we must remember that bloodletting persisted as an acceptable treatment into the 19th century. Modern and recent understandings are far from flawless. A couple of hundred years later, Galen, a Roman from the late 2nd century AD, would voice similar critiques (2). Galen would later become famous for his theory of the four humors: blood, yellow bile, black bile, and phlegm, each with associated personalities and elements (7). While these are all real liquids found somewhere in the human body, they do not really work as the four-way counterbalance he describes. Galen made some incredible leaps forward in Roman anatomy, including developing more elaborate tools for dissection and surgery, which would be instrumental in allowing future developments in the field. However, he also learned much of his anatomy from treating severe gladiator injuries—which is awesome—or, like Aristotle, from dissections and studies of lower mammals (7). This led to some interesting conclusions; his description and diagrams of a human uterus match a dog’s uterus exactly, for example (7). He did well with the tools he had, but guesswork has its limits. Centuries before Aristotle, and long before Galen, the ancient Indian physician Sushruta, a continent away, was revolutionizing surgery, and where there was nothing yet to revolutionise, inventing surgeries and surgical techniques outright.
    He also valued an understanding of human anatomy, which likely contributed to his surgical skill, and dedicated a portion of his seminal Sanskrit work, Sushruta Samhita, to anatomy, calling it the Sharira Sthana. In it, he describes the head in detail, correctly identifying it as the major center of essentially all function, particularly via the cranial nerves (8). He also includes the first detailed guide to human dissection, alongside the anatomy of the embryo at various developmental stages; the body is described as arising from seven skins, each with its own associated ailments, and while the skins themselves are anatomically dubious, many of the ailments correlate impressively with known diseases (8). There’s also, incredibly, a detailed description of a cataract surgery procedure, where exceptionally specific incision locations in the cornea are interspersed with instructions to sedate the patient with wine mixed with cannabis, which makes sense in a world far predating modern anesthesia, and then to spray the eye with breast milk (9). This part seems outlandish and harder to explain, but anyone who has studied immunology can tell you that breast milk contains antibodies and antibacterial proteins. Sushruta likely made some link between breast milk and reduced post-op infections, even if there were not yet microscopes to see bacteria with. Even if they couldn’t see why on the molecular scale, ancient anatomists were able to understand what worked and what didn’t, and justify it to the best of their knowledge. When Sushruta describes the bones of the human body, he does so in great detail, and also counts more than 300 of them. Humans typically have 206 bones, give or take a rib: Sushruta mildly overestimated. This is thought to be because he largely based his skeletal insights on child cadavers, in which many bones have not yet fused together (9).
    Hindu religious law calls for the cremation of anyone over two years of age, in their natural and thus undissected state; though there are accounts of Sushruta performing dissections, presumably on adults, the bodies he likely had the most exposure to were infants. Sushruta was working within the confines of the society and world that he lived in, as was Herophilos. Medical insights which seem obvious to us today, like that the brain is for thinking and the heart is for pumping blood, and that blood goes through the arteries and is most definitely a liquid, rely upon prior knowledge reached with tools that, for these anatomists, hadn’t even been invented yet. These firsts—surgeons, anatomists, scientists—would probably have to be physically pried away from microscopes and X-rays, if ever introduced to them. They often didn’t even have a human body to dissect, yet drew human anatomical conclusions regardless. And it’s easy to marvel at their mistakes, but it’s even easier to marvel at how much they got right; Herophilos correctly uncovered nerves and linked them to sensation and response, which is impressive in itself. Could you find a nerve in some meat, with just your naked eye? He also linked the heart and the pulse. The Huangdi Neijing, for example, is a Chinese medical text said, though disputed, to be from 2600 BC, which describes the relationships between organs in military terms: the heart as a king, the liver as a commandant, and the gallbladder as an attorney-general responsible for coordination (10). However, both like and before Herophilos, it also correctly identifies the cyclic nature of blood flow and links it to the heart (10). The Edwin Smith Papyrus, dating from 1700 BC in Ancient Egypt, is the oldest known surgical text, describing 48 different injuries with treatments, all shockingly accurate (11). Sushruta may have miscounted the bones, but he described their shapes accurately and suggested legitimate therapies for particular bone breakages and dislocations.
    Nowadays, little has changed: as recently as the 1950s, lobotomies were an accepted mainstream treatment; even long after we developed microscopes, we were recommending treatments, like scrambling our brains, that only 70 years later seem ridiculously stupid. We’re far from done charting our own bodies, either. In 2018, an entirely new type of tissue all throughout the body was described: the interstitium, which is critical in cell and organ communication across the body (12). It’s been there the whole time, but no one had noticed before. Humans are humans; it is only natural to want to understand ourselves, and as a part of that, our bodies. We now study our ancestors as they studied themselves: the same mix of awe, confusion and confidence. Their methods and conclusions may have been fallible, but their curiosity was not, and as long as we remain, it never will be. These examples were only a fraction of those whose work has been preserved, who themselves were only a fraction of the ancient people across the globe who investigated human anatomy. A millennium from now, our descendants will laugh at our misconceptions, when they have mapped every neuron in the human brain with instruments we could not conceive of. But without us, they wouldn’t know what they know, and without our original anatomists, we wouldn’t know what we know. Our modern, granular understanding of our own structure is built on the bodies that were looked into before ours. So, we should perhaps extend some empathy to our predecessors. They had only eyeballs, a knife, and the same curiosity as our own. Different tools, same bodies.

    References
    1. Aird WC. Discovery of the cardiovascular system: from Galen to William Harvey. J Thromb Haemost. 2011;9(Suppl 1):118–29.
    2. Johnston IH, Papavramidou N. Galen on the Pulses: Medico-historical Analysis, Textual Tradition, Translation [Internet]. De Gruyter; 2023 [cited 2025 Oct 10]. Available from: https://www.degruyterbrill.com/document/doi/10.1515/9783110612677/html
    3. Crivellato E, Ribatti D.
    A portrait of Aristotle as an anatomist. Clin Anat. 2007;20(5):447–85.
    4. Papa V, Varotto E, Vaccarezza M, Ballestriero R, Tafuri D, Galassi FM. The teaching of anatomy throughout the centuries: from Herophilus to plastination and beyond. Med Hist. 2019;3(2):69–77.
    5. Bay NSY, Bay BH. Greek anatomist Herophilus: the father of anatomy. Anat Cell Biol. 2010;43(4):280–3.
    6. Wright J. Review of: Praxagoras of Cos on Arteries, Pulse and Pneuma. Studies in Ancient Medicine, 48. Bryn Mawr Class Rev [Internet]. [cited 2025 Oct 10]. Available from: https://bmcr.brynmawr.edu/2017/2017.07.34/
    7. Ajita R. Galen and his contribution to anatomy: a review. J Evid Based Med Healthc. 2015;4(26):4509–16.
    8. Bhattacharya S. Sushruta—the very first anatomist of the world. Indian J Surg. 2022;84(5):901–4.
    9. Loukas M, Lanteri A, Ferrauiola J, Tubbs RS, Maharaja G, Shoja MM, et al. Anatomy in ancient India: a focus on the Sushruta Samhita. J Anat. 2010;217(6):646–50.
    10. O’Boyle C. TVN Persaud, Early history of human anatomy: from antiquity to the beginning of the modern era. Med Hist. 1987;31(4):478–9.
    11. van Middendorp JJ, Sanchez GM, Burridge AL. The Edwin Smith papyrus: a clinical reappraisal of the oldest known document on spinal injuries. Eur Spine J. 2010 Nov;19(11):1815–23.
    12. Benias PC, Wells RG, Sackey-Aboagye B, Klavan H, Reidy J, Buonocore D, et al. Structure and distribution of an unrecognized interstitium in human tissues. Sci Rep. 2018;8(1):4947.

  • ABOUT US | OmniSci Magazine

About Us

OmniSci Magazine is a science magazine at the University of Melbourne, run entirely by students, for students. Our team consists of talented feature writers, columnists, editors, graphic designers, and social media and web development officers, all passionate about communicating science!

Editors-in-Chief
Ingrid Sefton – President
Aisyah M. Sulhanuddin – President

Current Committee
Lauren Zhang – Secretary
Andrew Shin – General Committee
Ethan Bisogni – Treasurer
Luci Ackland – General Committee
Kara Miwa-Dale – Events and Socials
Hendrick Lin – General Committee
Elijah McEvoy – Events and Socials

Past Editors-in-Chief
Rachel Ko – 2022-2024
Sophia Lin – 2021-2022
Patrick Grave – 2021-2023
Maya Salinger – 2021-2022
Caitlin Kane – 2022-2023
Felicity Hu – 2021-2022
Yvette Marris – 2022-2023

  • Ancient Asian Alchemy: Big Booms | OmniSci Magazine

< Back to Issue 9 Ancient Asian Alchemy: Big Booms by Isaac Tian 28 October 2025 Illustrated by Aisyah Mohammad Sulhanuddin Edited by Luci Ackland One question has plagued the human condition since the beginning of time: how can we escape death? Well, we certainly know who didn’t find the answer – the alchemists of ancient China. It’s 210 BC, and you are an alchemist standing before Emperor Qin Shi Huang in his court. You hand him an elixir supposed to grant him immortality and eternal reign. Only, the serum contains what we now call “mercury”, and if anything, you have granted him mortality, as he drops dead before you (1). Where does one begin on this journey to immortality? How do we combine chemicals to find the perfect serum? Keep in mind, we have not even come close to establishing the periodic table at this point (no, that will not happen for roughly another two millennia) (2). Saltpetre – or potassium nitrate – had been used extensively to treat common illnesses and to maintain good health. There’s our starting point (3). The search for this magic elixir persists for the next eleven centuries. We never give up… do we? The ingenuity of the alchemists spoke to them: it told them to mix a few other ingredients into the saltpetre. With the trio of saltpetre, sulfur and charcoal, gunpowder was thus born into this world (4). The alchemists must have been in for a surprise when their “potion of immortality” sparked and exploded before them. So how does gunpowder explode? Why don’t other flammable items like match tips and dry wood explode when we set them alight? It comes down to a few key things. First is our perception of explosions. Chemicals don’t simply “explode” – explosiveness is not an inherent quality of a reaction – however, they can combust. Combustion is the reaction of a fuel with oxygen, releasing energy. Wood and matches combust, but they do so far more slowly than gunpowder.
Gunpowder combusts rapidly – so a large amount of energy is released within a short period of time. Secondly, it’s about the availability of oxygen. Items that combust slowly typically have to wait for oxygen to trickle in from the surrounding air, since oxygen is a critical component of combustion. This does not apply to gunpowder. The oxygen for its combustion is right there in the nitrate component of the potassium nitrate – the saltpetre. So unlike burning wood or matches, the combustion does not need to wait for oxygen to arrive from the surrounding environment – it’s already in there with the rest of the powder (5)! To go further on that point: the closer the atoms are, the faster the combustion reaction can progress, because the chemical compounds don’t need to wait long for the heat to reach them. Since gunpowder is… well… a powder, it is rather compact, and all the molecules of potassium nitrate, sulfur, and carbon sit tightly next to one another. It is this physical arrangement that permits the fast transfer of heat between molecules, ensuring that a lot of energy can be released at once. Ultimately, when all these physical and chemical phenomena occur in perfect unison, the high temperatures rapidly increase the kinetic energy of the surrounding air molecules, causing them to shoot outwards at great speeds to form a “barrier” of sorts. When this barrier, also known as a shockwave, hits your eardrums, the gunpowder delivers what it does best: BOOM! Now, let’s combust some gunpowder, build up some gaseous pressure, and launch ourselves into the modern day. It’s been about twelve centuries – what have we been doing with all the gunpowder? As it turns out, we humans are very inventive, but also violent (Wow – who knew?). We quickly realised that the pressure of the resulting gases could be harnessed to move very heavy objects at great speed (6). Said heavy objects could then be guided in the direction of, say, a human being or a structure.
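Returning to the chemistry for a moment: textbooks often summarise black-powder combustion with a simplified overall equation (real gunpowder produces a messier mix of potassium salts and gases, so treat this as an idealisation rather than the exact reaction):

```latex
2\,\mathrm{KNO_3} + \mathrm{S} + 3\,\mathrm{C} \;\longrightarrow\; \mathrm{K_2S} + \mathrm{N_2} + 3\,\mathrm{CO_2}
```

Every oxygen atom on the right-hand side comes from the nitrate on the left – none needs to be drawn from the surrounding air, which is exactly why the reaction is not throttled by oxygen trickling in from outside.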
Weaponry derived from gunpowder has existed for a very long time, albeit rather inefficiently at first. The introduction of gunpowder to warfare came in the early 10th century, when soldiers coated arrows in gunpowder and ignited them, creating fire arrows. Of course, whilst this might have been effective at putting a hole in a human, it was significantly less so when it came to putting holes in walls and structures. Only some 300 years later did we invent cannons and guns. However, those guns were slow – really, really slow – to the point that bows and arrows were actually preferred during warfare of that era. It would be another 600 years before we realised that there were more effective ways of reloading a gun, ushering in a new wave of military technology that would set the stage for the First and Second World Wars (7). By that point, the most terrifying of weapons had begun to stray away from the use of gunpowder. Missiles and rockets began employing other chemicals as propellants, owing to the advantages they held over gunpowder (7). It would also be remiss of this article to omit the exploitation of atomic power – pervading the world with such destruction that gunpowder seemed like a child’s toy (8). The tragic irony of a supposed path to immortality leading instead to mortality by war and conflict will forever be embedded in our history. Even with the right intentions, the invention by the great minds of alchemy sparked a chain reaction of widespread destruction and warfare. It only makes you wonder – what are we making now that will lead us further astray in the future? References 1. Glancey J. The army that conquered the world. BBC. Accessed August 24, 2025. https://www.bbc.com/culture/article/20170411-the-army-that-conquered-the-world 2. Guharay DM. A brief history of the periodic table. ASBMB Today. Accessed August 28, 2025. https://www.asbmb.org/asbmb-today/science/020721/a-brief-history-of-the-periodic-table 3. Butler A, Moffett J.
Saltpetre in Early and Medieval Chinese Medicine. Asian Medicine. 2009;5(1):173–185. doi: 10.1163/157342109X568982 4. Paradowski RJ. Invention of Gunpowder and Guns. EBSCO Research Starters. 2022. Accessed August 24, 2025. https://www.ebsco.com/research-starters/history/invention-gunpowder-and-guns 5. Stanford University. Detonation and Combustion. Stanford University. Accessed September 4, 2025. https://cs.stanford.edu/people/eroberts/courses/ww2/projects/firebombing/detonation-and-combustion.htm 6. Britannica. Ammunition | Bullets, Shells & Cartridges. Britannica. 2025. Accessed September 25, 2025. https://www.britannica.com/technology/ammunition 7. Beyer G. How Did Gunpowder Change Warfare? TheCollector. 2025. Accessed October 4, 2025. https://www.thecollector.com/how-did-gunpowder-change-warfare/ 8. ICAN. History of Nuclear Weapons. ICAN. Accessed October 4, 2025. https://www.icanw.org/nuclear_weapons_history

  • The Evolution of Science Communication | OmniSci Magazine

< Back to Issue 2 The Evolution of Science Communication In the current age of social media, users hold far more autonomy over the posts and information they share online. However, this was not always the case, with the media once far more regulated and restricted to only certain individuals. With users now having far more power over content posted online, how does this impact the information which others receive about the COVID-19 pandemic? by Monica Blasioli 10 December 2021 Edited by Khoa-Anh Tran & Yen Sim Illustrated by Rachel Ko Trigger warning: This article mentions illness, and death or dying. Since the beginning of the pandemic in March 2020, science communication has started to evolve in ways never before seen across the globe. There is a seemingly endless stream of infographics, Facebook posts, and YouTube and TikTok videos… including some with dancing doctors. Information not only about the COVID-19 virus, but countless diseases and scientific concepts, is available in more casual, accessible language at only the touch of a button. Any questions you might have about science or your body can be answered through a quick Google search. In this sense, science communication is now far more rapid, as well as more accessible than in research papers (which can often seem like they are written in a foreign language). However, the downside of having vast amounts of information available is that it can create challenges in determining the validity of what is being presented. In previous years, science communication was generally limited to traditional forms of media, such as a newspaper or a magazine, or even a television interview. These pieces were typically produced by professionals in the field, such as a research scientist or a medical doctor.
When looking at the 1918 influenza outbreak, many citizens at that time would have received their information from printed newspapers and posters on bulletin boards, as seen below. Image 1, [1] Somewhat similar to today’s age, there were signs displaying the importance of mask-wearing, and newspapers explaining the closures of schools and shops, the distribution of vaccines, and reports of death rates. These messages were, and still are, created and approved by larger institutions, governments and medical professionals, particularly doctors. As seen below, doctors urged people not to become complacent, despite a recent drop in influenza cases. This is rather similar to current newspaper or television news reports – only in reference to COVID-19, instead of influenza. Image 2, [2] There were, of course, still groups which were uncertain about the scientific evidence being provided by journalists, doctors and government officials at this time. In November of 1918, it was declared that “the epidemic of [influenza] disease is practically over,” with mask laws being relaxed. However, only a few days later, the previous mask laws were reintroduced following a spike in influenza cases. As unpacked in Dr Dolan’s research [3], the “Anti-Mask League” formed and protested in response to this backtrack, claiming that masks were unsanitary, unnecessary, and stifling their freedom. As this was during the early 20th century, the league advertised their protests in local newspapers, with reports that hundreds of San Francisco residents were fined for not abiding by mask rules, often due to their affiliation with the Anti-Mask League. The San Francisco Anti-Mask League is one of the most infamous groups of its time, with smaller-scale groups also questioning the science being communicated.
This type of conflicting information surrounding masks, and the opinion that they restrict personal freedoms, have incited similar responses throughout history. However, resistance by anti-mask groups has never existed on such an influential and global scale as it has during the current COVID-19 pandemic. With the rise of the age of “new media,” including platforms such as Instagram and Facebook, individuals now have far more autonomy over their role in the media, meaning that they wield a lot more power over the information others are receiving. Almost anybody can interpret scientific material online and upload it in a video of them dancing to some music on TikTok, spreading information to potentially hundreds of thousands of viewers across the globe. In many ways this newfound autonomy and power can be quite beneficial. Australian doctor Imogen Hines uses her platform on TikTok, alongside her medical education and current scientific research, to break down medical treatments and mistruths, particularly surrounding the COVID-19 pandemic. These videos use simple language and straightforward analogies, “humanising” the often intimidating figures in the medical field, and allowing the general public to be well-informed about scientific concepts. For example, Dr Imogen breaks down the research surrounding long-term side effects of vaccines using a milkshake analogy! https://www.tiktok.com/@imi_imogen1/video/7027448207823211777?is_copy_url=1&is_from_webapp=v1&lang=en On the other hand, this phenomenon can have pretty serious ramifications, with many individuals feeling rightfully confused about what the truth really is, when there appear to be so many versions of it posted across the internet. Following a rather controversial study on Ivermectin as a treatment for COVID-19, the internet was soon buzzing with excitement about the prospect of a drug that many believed could replace the need for a vaccine.
Despite numerous gaps in the original study, and countless further studies refuting Ivermectin’s ability to treat COVID-19, many social media users continue to spread this myth online. Both governments and hospitals alike have been accused of hiding a seemingly “good” cure from their citizens. In Texas, a group of doctors won a legal case which allowed Texas Huguley Hospital to refuse to administer Ivermectin to a COVID-19-infected Deputy Sheriff. This sparked outrage on Facebook, with users and the Sheriff’s wife demanding greater freedom over their medical treatments, instead of just relying on the judgement of doctors and hospital staff. In this instance, the misinformation surrounding Ivermectin is not only influencing individuals to seek out futile treatments, but also spreading mistrust of the scientific and medical communities, who have worked incredibly hard to protect the world, particularly over the past two years. Although Ivermectin has been used in clinical settings for a number of years to treat parasitic (not viral) infections in humans, it can be extremely dangerous for individuals to have complete power over their medical treatments. The dosage and timing of treatment are crucial in ensuring success. Just like with everyday medications such as paracetamol, taking Ivermectin in high doses is risky. A COVID-19-infected woman from Sydney who read about Ivermectin on social media took a very high dose of the drug after purchasing it from an online seller, which resulted in severe diarrhoea and vomiting.
In order to combat some of this misinformation, a number of social media platforms are “fact-checking” posts or providing warnings on posts with keywords such as ‘COVID-19’ or ‘vaccination.’ On Instagram, each post with these keywords contains a banner at the bottom inviting users to visit their “COVID-19 Information Centre,” which provides information, supported by the WHO and UNICEF, explaining that vaccines are held to high standards, well researched, and generally cause only mild side effects. In addition, on Facebook, posts identified as spreading mistruths direct users to websites explaining the truth before the original posts can be accessed. However, these warnings and fact-checks can only go so far. Posts blindly supporting the use of Ivermectin, falsely reporting side effects of vaccines, and arguing that masks cannot block virus particles still circulate on the internet. Often those most vulnerable in the community are at risk of being led astray by misinformation. In principle, evidence-based, concise, easy-to-understand science communication is essential to break down the barrier between research and the general public, ensuring that citizens are well-informed and more comfortable about the world around them. In the situation of a public health crisis such as the COVID-19 pandemic, this communication is crucial in ensuring that all citizens can remain well-informed, safe and healthy. Misinformation and dodgy studies can not only lead people astray, but also cost them their health and wellbeing. References: 1. Kathleen McGarvey, “Historian John Barry compares COVID-19 to the 1918 flu pandemic,” University of Rochester, October 6, 2020. https://www.rochester.edu/newscenter/historian-john-barry-compares-covid-19-to-1918-flu-pandemic-454732/ 2. Kathleen McGarvey, “Historian John Barry compares COVID-19 to the 1918 flu pandemic,” University of Rochester, October 6, 2020.
https://www.rochester.edu/newscenter/historian-john-barry-compares-covid-19-to-1918-flu-pandemic-454732/ 3. Brian Dolan, Unmasking History: Who Was Behind the Anti-Mask League Protests During the 1918 Influenza Epidemic in San Francisco? Perspectives in Medical Humanities (San Francisco: UC Medical Humanities Consortium, 2020)

  • Hope, Humanity and the Starry Night Sky

< Back to Issue 3 Hope, Humanity and the Starry Night Sky By Andrew Lim 10 September 2022 Edited by Manfred Cain and Yvette Marris Illustrated by Ravon Chew Image 1: The Arecibo Observatory looms large over the forests of Puerto Rico The eerie signal reverberates out over the Caribbean skies, amplified by the telescope below. It oscillates between two odd resonating tones for little more than a couple of minutes, then shuts off. Eminent scholars, government administrators and elected representatives watch in wonderment, their eyes glued open. The forest birds and critters chirp and sing. It is November 16, 1974 – from a little spot in Arecibo, Puerto Rico, Earth is about to pop its head out the door to say ‘hello’. Those sing-song tunes, beamed out into space on modulated radio waves, are a binary message designed for some alien civilisation – a snapshot of humanity in 1679 bits. It sounds like the beginning of a bad sci-fi flick: the kind that ends with little green men coming down in UFOs for a cheap-CGI first contact. But it isn’t, and it doesn’t. Instead, the legacy of those telescope-amplified sounds – that ‘Arecibo Message’ – has a place in history as a symbol of human cooperation, here on Earth rather than in the stars. The message’s unifying vision informed the famous ‘pale blue dot’ monologue of its co-creator Carl Sagan, and led to the launch of a multi-year international programme designing its successor message 45 years on, presenting extra-terrestrial communication as a mirror of our earth-bound relations. A unified message symbolising a unified humanity. The previous feature in this series (Discovery, Blue Skies…and Partisan Bickering?) ended with a declaration of nuance: that science in politics matters solely because it transcends partisan bounds with clear analysis. Yet, looking at stories like Arecibo’s, so imbued with human optimism, maybe this cold, logical formulation isn’t enough.
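A technical aside on the message’s construction: its 1679-bit length was no accident. 1679 is semiprime – its only non-trivial factorisation is 23 × 73 – so any recipient who factors the bit count is nudged toward laying the bits out in a 23-by-73 grid, the arrangement in which the message’s pictograms become visible. A few illustrative lines of Python (the function name is ours, not from the message’s documentation) make the point:

```python
# The Arecibo message is 1679 bits long. Because 1679 is semiprime,
# it can be tiled into a rectangle in exactly one non-trivial way
# (23 x 73) -- the grid that reveals the pictograms.

def nontrivial_factorisations(n: int) -> list[tuple[int, int]]:
    """Return all pairs (a, b) with 1 < a <= b and a * b == n."""
    return [(a, n // a) for a in range(2, int(n**0.5) + 1) if n % a == 0]

if __name__ == "__main__":
    print(nontrivial_factorisations(1679))  # -> [(23, 73)]
```

Any civilisation that tries the only other possible layout, 73 columns by 23 rows, gets noise; the intended grid produces the pictures.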
Perhaps for all its focus on appropriations bills, initiative funding and flawed infrastructure, that perspective lends insufficient weight to science’s ability to inspire, to cut through the fog of day-to-day policy battles with a beacon of what could yet be. But is this talk of hope just ideological posturing – a triumphant humanism gone mad? Or could there be some merit to its romantic vision of humanity speaking with one voice to the stars? Might it possibly be that science really is the key to bridging our divisions? COOPERATION AMIDST CHAOS Well, why not begin in the times of Arecibo? After all, the interstellar message came at a key moment in the Cold War. Just a few months before, US President Richard Nixon had made his way to Moscow to meet with General Secretary Leonid Brezhnev, leader of the USSR. The signing of a new arms treaty, a decade-long economic agreement and a friendly state dinner at the Kremlin all seemed to indicate a world inching away from the edge of nuclear apocalypse. Such pacifist optimism is found readily in the message’s surrounding documents, with its research proposal speaking glowingly of future messages designed and informed by “international scientific consultations…[similar to] the first Soviet-American conference on communication with extraterrestrial [sic] intelligence.” Indeed, it seems the spirit of the age. Soon after the Arecibo message’s transmission, the Apollo-Soyuz Test Project would see an American Apollo spacecraft docking with a Soviet Soyuz module. Mission commanders Thomas Stafford and Alexei Leonov conducted experiments, exchanged gifts, and even engaged in the world’s first international space handshake – a symbol of shared peace and prosperity for both superpowers. 
Image 2: Thomas Stafford and Alexei Leonov shake hands on the Apollo-Soyuz mission Apollo-Soyuz marked an effective end to the US-USSR ‘Space Race’ (discussed in Part I of this series), and would lead to successor programmes, including a series of missions where American space shuttles would send astronauts to the Russian space station Mir, and eventually the building of the 21st-century International Space Station (ISS). Science seemed capable of forging cooperation amidst the greatest of disagreements, transcending our human borders and divides. Frank Drake, the designer of the Arecibo Message, was filled with optimism, hoping that his message might herald the beginning of a new age, marked by united scientific discovery and unparalleled human growth. He triumphantly declared to the Cornell Chronicle on the day of its transmission that “the sense that something in the universe is much more clever than we are has preceded almost every important advance in applied technology.” SCIENTIFIC SPHERES OF INTEREST Yet this rose-tinted vision of science as the great mediator perhaps has a few more cracks in it than its advocates like to admit. Even at the height of Nixon’s Cold War détente, science was not pure intellectual collaboration. Henry Kissinger, Nixon’s National Security Advisor and later Secretary of State, pioneered ‘triangular diplomacy’, the art of playing adversaries off against one another with alternating threats and incentives. In later years, he would declare that “it was always better for [the US] to be closer to either Moscow or Peking than either was to the other”. And as he opened channels of communication with China, it was science that would pave the way for a stronger relationship.
In the Shanghai Communique negotiated on Nixon’s 1972 trip to China, both sides “discussed specific areas in such fields as science [and] technology…in which people-to-people contacts and exchanges would be mutually beneficial [and] undert[ook] to facilitate the further development of [them].” Scientific collaboration (often manipulated by spy agencies from the CIA to the KGB) was the carrot beside the military stick – a central part of building alliances in a world of realpolitik. To Kissinger and his colleagues, the world was to be divided into spheres of influence, even in times of peace – and science was best used as a way of strengthening and shoring up your own prosperity. Image 3: US President Richard Nixon shakes hands with CCP Chairman Mao Zedong in China in 1972 It is a realist view of science diplomacy that continues to this day, with US Secretary of State Hillary Clinton noting in 2014 that “educational exchanges, cultural tours and scientific collaboration…may garner few headlines, but… [can] influence the next generation of U.S. and [foreign] leaders in a way no other initiative can match”. To both Clinton and Kissinger, science is an instrument of foreign policy, whether deployed overtly in winning over current governments or more subtly in shaping the views of future ones. For them, amidst competing interests and simmering tensions, we ignore science’s soft power at our own peril. Just look at China’s distribution of Sinovac COVID-19 vaccines in the pandemic. Image 4: Chinese Foreign Minister Wang Yi meets with his Cambodian counterpart Prak Sokhonn in September 2021, pledging additional aid and vaccine doses. In October 2020, January 2021 and September 2021, Chinese Foreign Minister Wang Yi went on tours of Southeast Asia, promising vaccine aid while pushing closer connections between China and the rest of Asia.
Last year, it was estimated that China had promised a total of over 255 million vaccine doses – a key step in building stronger economic and military ties in an increasingly tense region. Indeed, in mid-2021, just as concerns about Chinese vaccine efficacy grew, US President Joe Biden announced “half [a] billion doses with no strings attached…[no] pressure for favours, or potential concessions” from the sidelines of a G7 Summit. Secretary of Defence Lloyd Austin travelled across Southeast Asia. In the Philippines he renewed a military deal just as a new shipment of vaccines was announced – a clear indicator of the linkage between medical and military diplomacy, something reinforced when Vice President Kamala Harris landed in Singapore later that year to declare the US “an arsenal of safe and effective vaccines for our entire world.” Australia is key to vaccine diplomacy too. On his visit here earlier this year, US Secretary of State Antony Blinken made a point of visiting the University of Melbourne’s Biomedical Precinct to talk about COVID-19, declaring on Australian television that our nation was central to “looking at the problems that afflict our people as well as the opportunities…dealing with COVID…[in] new coalitions [and] new partnerships.” Image 5: United States Secretary of State Lloyd J Austin III meets with Philippines President Rodrigo Duterte in July 2021 for negotiations on renewing the Visiting Forces Agreement These views are backed up locally too.
Sitting down for an exclusive interview with OmniSci Magazine last year, Dr Amanda Caples, Lead Scientist of Victoria, was keen to characterise her work in terms of these developments, reminding us that Victoria had been key to “improving the understanding of the immunology and epidemiology of the virus, developing vaccines and treatments and leading research into the social impact of the pandemic”, and emphasising Australia’s national interest, declaring that “global policymakers understand that a high performing science and research system benefits the broader economy…science and research contribute to jobs and prosperity for all rather than just the few.” Science, it seems, whether in vaccines, trade or exchanges, just like fifty years ago, is again to be a key tool for grand strategy and national interests. Image 6: Dr Amanda Caples, Lead Scientist of Victoria ARGUMENTS AND ARMS But perhaps even this might be too optimistic an outlook – for that simmering balance of power occasionally boils over. We need only to look at what happened when the détente of Nixon and Brezhnev was dashed to pieces with the Soviet invasion of Afghanistan in 1979. The policy was roundly condemned as sheer naïveté in the face of wily adversaries, with President Ronald Reagan later describing détente in a radio address as “what a farmer has with his turkey – until Thanksgiving Day”. Science was the first target for diplomatic attacks. After the invasion, Senator Robert Dole (R-KS) launched legislation barring the National Science Foundation from funding trips to the USSR. And the push seemed bipartisan, with Representative George Brown Jr. (D-CA-36) proposing a House Joint Resolution enacting an immediate “halt [to] official travel related to scientific and technical cooperation with the Soviet Union”. 
Image 7: Russia’s cosmonauts board the ISS on 18th March 2022, shortly before Russia ends its participation in the program Now, as we face war on the European continent, even the ISS – the descendant of Apollo-Soyuz’s seemingly-apolitical scientific endeavours – seems to be falling apart spectacularly. On April 2 this year, Roscosmos, the Russian space agency, announced that it would be ending its participation in the ISS program, demanding a “full and unconditional removal of…sanctions” imposed over the Russian invasion of Ukraine. Earlier in the year, Roscosmos’ Director General Dmitry Rogozin openly suggested on Twitter that leaving the ISS without Russian involvement would lead to “an uncontrolled deorbit and fall [of the station] into the United States or Europe”, alluding to “the option of dropping a 500-ton structure [on] India and China.” Rogozin’s threats became even more pronounced as the war continued, with Roscosmos producing a video depicting Russia’s two cosmonauts on the station not bringing NASA astronaut Mark Vande Hei back to Earth with them (American astronauts primarily go to and return from space via Russian Soyuz capsules). Shared by Russian state news, its chilling final scenes show the Russian segment of the ISS detaching too, with Vande Hei presumably left to die in space aboard the station. Such attacks need not remain rhetorical, either. Scientific advancements have long been tied to weaponry and defence systems, with mathematicians and physicists from John Littlewood to Richard Feynman involved in making bombs and ballistics in times of war. Even Arecibo, that bastion of a united humanity, began life as a Department of Defence initiative detecting Soviet ballistic missiles.
Today, the AUKUS defence partnership – one of the most significant Indo-Pacific defence developments in recent memory – centres on sharing nuclear submarine science and technology, promising scientific cooperation regarding “cyber capabilities, artificial intelligence, quantum technologies, and additional undersea capabilities”. Even if induced by factors beyond our control, such weapons-based science is a far cry from the pacifist ideals of the Arecibo message. Thus, perhaps this messy reality is more central to our science than we like to admit. From the ISS to Australia’s waters, science still is intertwined with conflict and frequently co-opted by geopolitical actors in times of renewed aggression. Science at its worst is mere weaponry. But at its best, it speaks to something greater. HOPE IN THE DARKNESS In June 1977, the world was far from diplomatically stagnant. From the rumblings of Middle Eastern peace (what became the Camp David Accords) to new hopes of nuclear arms reduction, US President Jimmy Carter had quite the array of diplomatic dilemmas to consider. But amidst all that cold politics, he penned a letter to be sent on board the spacecraft Voyager, now the furthest human-made object from Earth, declaring “We are attempting to survive our time so we may live into yours…This record represents our hope and our determination, and our good will in a vast and awesome universe.” And if this magazine has purported to speak to the ‘alien’ – far removed from our human lives – then perhaps we have discovered quite the opposite: that looking out up there is so much about looking in down here. Science presents a way we can look out at the alien and see ourselves – “survive our time…into yours”, finding a path ahead reflected in the inky blackness above. We are often constrained by time and circumstance, forced in the face of nefarious actors to compromise our idealism and use science as a mere weapon or tool.
Discovery for discovery’s sake is frequently the first casualty when battle lines are drawn and aggression begun, and too often the political pessimism of the scientist can seem overpowering. But if the stories of broken détentes, diplomatic realpolitik and weaponised technology have made it all feel inevitable, then perhaps it is worth considering the story we began with, looking up into the night sky and remembering that somewhere amidst the stars is a tiny warble in the electromagnetic spectrum. Long after the funds and papers that forged it have faded away, after the people who wrote it have perished, it will continue. In its odd combination of ones and zeroes, it will represent humanity: our contradictions and our fears, our constant foibles and infighting, but also our occasional glimpses of a future beyond them. A signal…a reminder that when the times, the people and the ideas line up just right, science can be the torchbearer for something greater. Image 8: President Jimmy Carter’s message, sent aboard Voyager, the furthest man-made probe from Earth Something so rare that amidst all the ills of the world, it often seems non-existent, and so powerful that over two millennia ago, Aeschylus himself deemed it the very thing given to humanity by Prometheus to save us from destruction – the ideal that transformed us from mortals fixated on ourselves and our deaths to a civilisation capable of great things. “τυφλὰς…ἐλπίδας”, he called it: blind hope. A handshake in a capsule. A life-saving jab on board a ship. A binary message in a bottle, out among the stars. Fleeting images – not of what we are, but of what we can be: visions of blind hope, that sheer belief that we can grow past our worst violent impulses and reach out into the great beyond. Maybe it’s foolish. Maybe it’s naïve. But, on a brisk fall evening, looking out at a sky full of stars, each one more twinkling than the last, it’s easy to stop and imagine…maybe it’s the only thing that matters.
Andrew Lim is an Editor and Feature Writer with OmniSci Magazine and led the team behind the Australian Finalist Submission to the New Arecibo Message Challenge. Image Credits (in order): National Atmospheric and Ionosphere Centre; National Aeronautics and Space Administration; National Archives Nixon White House Photo Office Collection; Kith Serey/Pool via Reuters; Malacanang Presidential Photo via Reuters; The Office of the Lead Scientist of Victoria; AP; National Aeronautics and Space Administration

  • Believing in aliens... A science?

< Back to Issue 3 Believing in aliens... A science? By Juulke Castelijn 10 September 2022 Edited by Tanya Kovacevic and Ashleigh Hallinan Illustrated by Quynh Anh Nguyen The question of the existence of ‘intelligent life forms’ on a planet other than ours has always been one of belief. And I did not believe. It was probably the image of a green blob with multiple arms and eyes squelching across the ground and emitting noises unidentifiable as any form of language which turned me off the whole idea. But a book I read one day completely changed my mind; it wasn’t about space at all, but about evolution. ‘Science in the Soul’ is a collection of works written by the inimitable Richard Dawkins, a man who has argued on behalf of evolutionary theory for decades. Within its pages, you will find essays, articles and speeches from throughout his career, all with the aim of inspiring deep rational thought in the field of science. A single essay gives enough food for thought to last the mind many days, but the ease and magnificence of Dawkins’ prose encourages the devouring of many pages in a single sitting. The reader becomes engulfed in scientific argument, quickly and completely. Dawkins shows the fundamental importance of the proper understanding of evolution as not just critical to biology, but society at large. Take, for instance, ‘Speaking up for science: An open letter to Prince Charles,’ in which he argues against the modelling of agricultural practices on natural processes as a way of combating climate change. Even if agriculture could be in itself a natural practice (it can’t), nature, Dawkins argues, is a terrible model for longevity. Instead, nature is ‘a short-term Darwinian profiteer’. Here he refers to the mechanism of natural selection, where offspring have an increased likelihood of carrying the traits which favoured their parents’ survival. Natural selection is a retrospective process. 
At a population level, it highlights those genetic traits that increased chances of survival in the past. There is no guarantee those traits will benefit the current generation at all, let alone future generations. Instead, Dawkins argues, science is the method by which new solutions to climate change are found. Whilst we cannot see the future, a rational application of a wealth of knowledge gives us a far more sensible approach than crude nature. Well, perhaps not crude per se. If anyone is an advocate for the beauty and complexity of natural life, it is surely Dawkins. But a true representation of nature, he argues, rests on the appreciation of evolution as a blind process, with no aim or ambition, and certainly no pre-planned design. With this stance, Dawkins directly opposes Creationism as an explanation of how the world emerged, a battle from which he does not shy away. Evolution is often painted as a theory in which things develop by chance, randomly. When you consider the complexity of a thing such as the eye, no wonder people prefer to believe in an intelligent designer, like a god, instead. But evolution is not dependent on chance at all, a fact Dawkins argues many times throughout his collection. There is nothing random about the body parts that make up modern humans, or any other living thing - they have been passed down from generation to generation because they enhanced our ancestors’ survival. The underlying logic is unrivalled, including by religion. But that doesn’t mean Dawkins is not a man of belief. Dawkins believes in the existence of intelligent extraterrestrial life, and for one reason above all: given the billions upon billions of planets in our universe, the chance of our own evolution would have to be exceedingly small if there were no other life out there. In other words, we believe there is life out there because we do not believe our own evolution to be so rare as to only occur once. 
Admittedly, it is not a new argument but it had not clicked for me before. Perhaps it was Dawkins’ poetic phrasing. At this stage it is a belief, underlined by a big ‘if’. How could we ever know if there are intelligent life forms on a planet other than Earth? Dawkins provides an answer here too. You probably won’t be surprised that the answer is science, specifically a knowledge of evolution. We do not have to discover life itself, only a sign of something that marks intelligence - a machine or language, say. Evolution remains our only plausible theory of how such a thing could be created, because it can explain the formation of an intelligent being capable of designing such things. We become the supporting evidence of life somewhere else in the universe. That’s satisfying enough for me.

  • Interstellar Overdrive: Secrets of our Distant Universe | OmniSci Magazine

< Back to Issue 7 Interstellar Overdrive: Secrets of our Distant Universe by Sarah Ibrahimi 22 October 2024 edited by Hendrick Lin illustrated by Amanda Agustinus “Somewhere, something incredible is waiting to be known” - Carl Sagan Humanity's innate curiosity and desire to uncover the unknown has been the spark for mankind's explorations since the beginning of time. From Columbus' expedition across the Atlantic to discover the New World, to Armstrong's first steps on the Moon's surface, we have experienced technological advancement at a lightning pace over the course of human history. Perhaps the most enthralling of these advances has been the scientific quest to unveil the true nature of our universe - the stars, the planets and the beings that exist within it and far beyond. And now, a novel and revolutionary tool has been developed to deepen our understanding of the cosmos. The James Webb Space Telescope (JWST), developed by NASA, is the largest of its kind ever to be placed in space. Launched on Christmas Day in 2021 on board the Ariane 5 rocket, it travelled 1.5 million kilometres equipped with various high-resolution and high-sensitivity instruments, allowing scientists to capture detailed infrared astronomical images of our old and distant universe (NASA, 2022a). In less than a year, the deepest infrared image known to mankind was produced. Named Webb's First Deep Field, it was unveiled by U.S. President Joe Biden on July 11th, 2022 at the White House, encapsulating never-before-seen perspectives of our universe. With this revelation, a new gateway has been opened into answering the countless questions of the early universe pondered by astrophysicists and the public alike. Confronting viewers with an array of contrasting colours and eccentric shapes, Webb’s First Deep Field can be hard to interpret ( figure 1 ). Figure 1. Webb’s First Deep Field: SMACS 0723 Note. 
From Webb’s First Deep Field: SMACS 0723 [photo] by James Webb Space Telescope. NASA, 2022b. https://webbtelescope.org/contents/media/images/2022/035/01G7DCWB7137MYJ05CSH1Q5Z1Z?page=1&keyword=smac Copyright 2022, NASA. But with a careful eye and some clever detective work, we can begin to decipher the secrets contained within. For example, the bright lights depicting what appear to be stars are in fact entire galaxies, each a gateway to billions of stars. In addition, Webb’s Near-Infrared Camera (NIRCam) is able to capture distant galaxies with the sharpest focus to date, unravelling important features from their faint complexities. Appreciation for this image increases exponentially once we begin to comprehend the magnitude of its importance - it depicts the galaxy cluster, SMACS 0723, exactly as it looked 4.6 billion years ago! In other words, this image is a glimpse back to a time well before humans or any life forms existed. Amongst the myriad of initial images produced by JWST, one particular point of interest would be the Southern Ring Nebula, illustrating the dying star at the heart of NGC 3132 ( figure 2 ). This can be seen through the expulsion of its gases and outer layers, producing striking imagery through Webb’s NIRCam. Viewers may also notice the bright lights representing individual galaxies in the nebula's background - again, not to be mistaken for stars. JWST’s ability to capture such a pivotal point in the trajectory of a star's life is crucial in assisting scientists to calculate the volumes of gas and dust present, as well as their unique molecular compositions. Figure 2. Southern Ring Nebula captured by JWST Note. From Southern Ring Nebula [photo] by James Webb Space Telescope. NASA, 2022c. https://webbtelescope.org/contents/media/images/2022/033/01G70BGTSYBHS69T7K3N3ASSEB Copyright 2022, NASA. The efforts to produce such groundbreaking images and insights into the universe did not happen overnight. 
The Hubble Space Telescope, launched in 1990, was an important predecessor to the JWST. Whether it was confirming the existence of black holes, or the Nobel Prize winning discovery demonstrating the accelerating rate of expansion of the universe, the Hubble Space Telescope laid the foundations for the JWST to flourish. The marvels revealed by the JWST would also not be possible without the efforts of countless scientists to improve upon the technological potential of the Hubble Telescope. As a result of these developments, JWST contains a larger primary mirror, possesses deeper infrared vision, and is optimised for wavelengths longer than the ultraviolet and visible light Hubble observes, all with the aim of increasing the telescope’s ability to capture profound images of our universe. Nonetheless, a number of questions relevant to matters such as dark energy, exoplanets, and infrared astrophysics remain unanswered. As a next step forward, the Nancy Grace Roman Space Telescope is set to launch in 2027 with the capacity to produce a panoramic view two hundred times greater than the infrared view generated by Hubble and JWST. The questions that continue to itch our minds remain limitless. As Einstein once lamented, "the more I learn, the more I realise how much I don't know”. There is still so much that remains to be discovered. However, the JWST illustrates that through collaborative scientific efforts, humankind can begin to unravel the many mysteries that govern our universe, one galaxy at a time. References NASA. (2022a, July 12). NASA’s Webb Delivers Deepest Infrared Image of Universe Yet. https://www.nasa.gov/image-article/nasas-webb-delivers-deepest-infrared-image-of-universe-yet/ NASA. (2022b, July 11). Webb’s First Deep Field. Webb Space Telescope. https://webbtelescope.org/contents/media/images/2022/035/01G7DCWB7137MYJ05CSH1Q5Z1Z?page=1&keyword=smac NASA. (2022c, July 11). Southern Ring Nebula. Webb Space Telescope. 
https://webbtelescope.org/contents/media/images/2022/033/01G70BGTSYBHS69T7K3N3ASSEB

  • The Rise of The Planet of AI | OmniSci Magazine

The Rise of The Planet of AI By Ashley Mamuko When discussing AI, our minds instinctively turn to fears of sentience and robotic uprising. However, is our focus misplaced on the “inevitable” humanoid future when AI has become ubiquitous and undetectable in our lives? Edited by Hamish Payne & Katherine Tweedie Issue 1: September 24, 2021 Illustration by Aisyah Mohammad Sulhanuddin On August 19th 2021, Tesla announced a bold project on its AI Day. The company plans to introduce humanoid robots for consumer use. These machines are expected to perform basic, mundane household tasks and streamline easily into our everyday lives. With this new release, the future of AI seems to be closing in. No longer do we stand idle, awaiting an inevitable humanoid-shaped future: the prototypes are expected to launch by 2022. It seems inevitable that our future will include AI. We have already familiarised ourselves with this emerging technology in the media we continue to enjoy. WALL-E, Blade Runner, The Terminator, and Ex Machina are only a few examples of the endless list of AI-related movies, spanning decades and detailing both our apprehension and our acceptance. Most of these movies portray the machines as sentient yet intrinsically evil, as they pursue human destruction. But to further understand the growing field of AI, it’s important to first briefly introduce its history before noting the growing concerns played up in the Hollywood blockbusters. The first fundamental interpretations of Artificial Intelligence span a vast period of time. Its first acknowledgement may be attributed to the Catalan poet and theologian Ramon Llull. His 1308 work Ars generalis ultima (The Ultimate General Art) advanced a paper-based mechanical process that creates new knowledge from a combination of concepts. Llull aimed to create a method of deducing logical religious and philosophical truths numerically. 
In 1642, French mathematician Blaise Pascal invented the first mechanical calculating machine; the first iteration of the modern calculator (1). The Pascaline, as it is now known, only had the ability to add or subtract values using a dial and spoke system (2). Though these two early ideas do not match our modern perceptions of what AI is, they lay the foundation of pushing logical processes beyond mere mechanical means. These two instances in history foreshadow the use of mechanical devices in performing human cognitive functions. Not until the 1940s and early 1950s did we finally obtain the means for more complex data processing systems. With the introduction of computers, the novelty of algorithms created a more streamlined function of storing, computing, and producing. In 1943, Warren McCulloch and Walter Pitts founded the idea of artificial neural networks in their paper “A Logical Calculus of Ideas Immanent in Nervous Activity” (3). This presented the notion of computers behaving similarly to a human mind and introduced the subsection of “deep learning”. In 1950, Alan Turing proposed the Imitation Game (later known as the Turing Test), which asked participants to identify whether the dialogue they were engaging in was with another person or a machine (4). Despite the breakthroughs made in this expertise, the term Artificial Intelligence wasn’t coined until 1955, by John McCarthy. Later on, McCarthy along with many other budding experts would hold the famous 1956 Dartmouth College Workshop (5). This meetup of a few scientists would later be pinpointed in history as the birth of the AI field. As the field continued to grow, more public concerns were raised alongside the boom of science fiction literature and movies cropping up. 
The notorious 1968 movie 2001: A Space Odyssey so shaped the public perception of the field that by the 1970s, an AI Winter had set in. Very little notable progress was made in the field due to the lack of funding borne of fear (6). Finally, after some time had passed and more advancements had been made in algorithm technology, came the notable Deep Blue chess match against Garry Kasparov. In May 1997, the Deep Blue machine beat world champion chess superstar Garry Kasparov, quietly ushering in fears of a “decline in human society” at the hands of the machine. Fast forward to now, AI has traversed through leaps and bounds to achieve a much more sophisticated level of algorithms and machine learning techniques. To further understand the uses of AI, I interviewed Dr Liz Sonenberg, a professor in the School of Computing and Information Systems at The University of Melbourne and a Pro Vice-Chancellor (Research Infrastructure and Systems) in Chancellery Research and Enterprise. She is an expert in the field and has conducted a wealth of research. "Machine learning is simply a sophisticated algorithm to detect patterns in data sets that has a basis in statistics." With this algorithm, we have been able to implement it in a variety of our daily tech encounters. AI sits behind the driving force of Google Maps and navigation, as well as voice control. It can easily be found anywhere. “Just because these examples do not exhibit super intelligence, does not mean they are not useful,” Dr Sonenberg explains. Dr Sonenberg suggests that the real problem with AI lies within its fairness. These “pattern generating algorithms” at times “learn from training sets not representative of the whole population, which can end up with biased answers.” With a flawed training set, a flawed system is in place. This can be harmful to certain demographics and cause a sway on consumer habits. 
With AI-aided advice, the explanation behind outcomes and decisions is not supported either. Algorithms are only able to mechanically produce an output, not explain it. With more high-stakes decisions entrusted to the reliability of AI, the issue of flawed algorithms becomes more pronounced. In my interview with Dr Sonenberg, at no moment was the fear of super-intelligence, robot uprisings, and the like brought up... Armed with this new-found knowledge of AI’s current concerns, I conducted another interview with Dr Tim Miller, a Professor of Computer Science in the School of Computing and Information Systems at The University of Melbourne, and Dr Jeannie Paterson, a Professor teaching subjects in law and emerging technologies in the School of Law at The University of Melbourne. They are both also Co-Directors at The Centre for Artificial Intelligence and Digital Ethics (CAIDE). As we began the interview, Dr Miller explained again that AI “is not magic” and implements the use of “math and statistics”. Dr Paterson was clear to bring up that anti-discrimination laws have been in place, but as technology evolves and embeds itself further into the public domain, it must be scrutinised. The deployment of AI can easily cause harm to people because systems are not public, making the sources of harm difficult to identify and causally attribute. With the prospect of biased algorithms, a difficult tension emerges. Dr Miller elaborated on the use of AI in medical imaging used in private hospitals. As private hospitals tend to attract a certain echelon of society, the training set is not wholly representative of the greater population. “A dilemma occurs with racist algorithms… if it is not used [outcomes] could be worse.” When the idea of a potential super-intelligent robot emerging in the future was brought into conversation, the two didn’t seem to be very impressed. “Don’t attribute superhuman qualities [to it],” says Dr Paterson. 
Dr Miller states that the trajectory of AI’s future is difficult to map. Past predictions of how AI would progress in its abilities have come true, but much later than expected… easily decades later. The idea of super-intelligence also poses the question of how to define intelligence. “Intelligence is multidimensional, it has its limits,” says Dr Miller. In this mystical future world of AI, a distinction is placed not just on “what will machines be able to do but what will [we] not have them do,” states Dr Miller. “This regards anything that requires social interaction, creativity and leadership”; so the future is aided by AI, not dictated by it. However, in the nearer future, some very real concerns are posed. Job security, influence on consumer habits, transparency, legal approaches, and accountability are only a few. With more and more jobs being replaced by machines, every industry is at stake. “Anything repetitive can be automated,” says Dr Miller. But this is not inherently negative, as more jobs will be created to further aid the use of AI. And not all functions of a job can be replaced by AI. Dr Paterson explains with the example of radiology that AI is able to diagnose and interpret scans, but a radiologist does more than just diagnose and interpret on a daily basis. “The AI is used to aid in the already existing profession, not simply overtake it.” Greater transparency is needed in showing how AI uses our data. “It shouldn’t be used to collect data unlimitedly,” says Dr Paterson, “is it doing what’s being promised, is it discriminating [against] people, is it embedding inequality?” With this in mind, Dr Paterson suggests that more legal authorities should be educated on how to approach topics regarding AI. “There needs [to be] better explanation… [We] need to educate judges and lawyers.” With the notorious Facebook-Cambridge Analytica scandal of 2018, the big question of accountability was raised. 
The scandal involved the unwarranted use of data from 87 million Facebook users by Cambridge Analytica, which served to support the Trump campaign. This scandal brought to light how the data we provide can be exploited nonconsensually and used to influence our behaviours, as this particular example seemed to sway the American presidential election. Simply put, our information can be easily exploited and sent off to data analytics firms to further influence our choices. This creates the defence that apps “merely provide a [service], but people use [these services] in that way,” as said by Dr Miller. In effect, the blame becomes falsely shifted onto the users for the spread of misinformation. The onus, however, should lie with social networking sites to disclose to their users more about their data usage and history, as well as to provide adequate protection of their data. To be frank, the future of robotic humanoid AI integrating seamlessly into human livelihoods will not occur within our lifetimes, or potentially even our grandchildren’s. The forecast seems at best unpredictable, and at worst unattainable, due to the complexity of what constitutes full “sentience”. However, this does not indicate that AI lies dormant within our lives. The fundamental technology based in computing, statistics, and information systems lays the groundwork for most transactions we conduct online, whether monetary, social or otherwise. AI and its promises should not be shunted aside due to the misleading media surrounding its popularised definition and “robot uprisings”, but rather taught more broadly to all audiences. So perhaps Elon Musk’s fantastical ideas of robotic integration will not occur by 2022, but the presence of AI in modern technologies should not go unnoticed. References: 1. "A Very Short History of Artificial Intelligence (AI)." 2016. Forbes. https://www.forbes.com/sites/gilpress/2016/12/30/a-very-short-history-of-artificial-intelligence-ai/?sh=38106456fba2. 
2. “Blaise Pascal Invents a Calculator: The Pascaline.” n.d. Jeremy Norman's Historyofinformation.com. https://www.historyofinformation.com/detail.php?id=382. 3, 4, 6. “History of Artificial Intelligence.” n.d. Council of Europe. https://www.coe.int/en/web/artificial-intelligence/history-of-ai. 5. Smith, Chris, Brian McGuire, Ting Huang, and Gary Yang. 2006. “The History of Artificial Intelligence,” a file for a class called History of Computing offered at the University of Washington. https://courses.cs.washington.edu/courses/csep590/06au/projects/history-ai.pdf.

  • Hiccups | OmniSci Magazine

< Back to Issue 2 Hiccups Evolution might be a theory, but if it’s evidence you’re after, there’s no need to look further than your own body. The human form is full of fascinating parts and functions that hold hidden histories - from the column that brought you a deep-dive into ear wiggling in Issue 1, here’s an exploration of why we hiccup! by Rachel Ko 10 December 2021 Edited by Katherine Tweedie and Ashleigh Hallinan Illustrated by Gemma Van der Hurk Hiccups bring a special brand of chaos to a day. It’s one that lingers, rendering us helpless and in suspense; a subtle, internal chaos of quiet frustration that forces us to drop what we’re doing to monitor each breath – in and out, in and out – until the moment they abruptly decide to stop. It’s an experience we’ve all had – one that can hit anyone at any time – and for most of us, hiccups are a concentrated episode of inconvenience; best ignored, and overcome. Yet, despite our haste to get rid of them when they interrupt our day, hiccups seem to have mystified humans for generations. Historically, the phenomenon has been the source of many superstitions, both good and bad. A range of cultures associate them with the concept of remembrance: in Russia, hiccups mean someone is missing you (1), while an Indian myth suggests that someone is remembering you negatively for the evils you have committed (2). Likewise, in Ancient Greece, hiccups were a sign that you were being complained about (3), while in Hungary, they mean you are currently the subject of gossip. On a darker note, a Japanese superstition prophesies death to one who hiccups 100 times. (4) Clearly, the need to justify everything, even things as trivial as hiccups, has always been an inherent human characteristic, transcending culture and time. As such, science has more recently made its attempt at objectively identifying a reason behind the strange phenomenon of hiccups. 
After all, if you take a step back and think about it, hiccups are indeed quite strange. Anatomically, hiccups (known scientifically as singultus) are involuntary spasms of the diaphragm (5): the dome-like sheet of muscle separating the chest and abdominal cavities. (6) The inspiratory muscles, including the intercostal and neck muscles, also spasm, while the expiratory muscles are inhibited. (7) These sudden contractions cause a rapid intake of air (“hic”), followed by the immediate closure of the glottis or vocal cords (“up”). (8) As many of us have probably experienced, a range of stimuli can cause these involuntary contractions. The physical stimuli include anything that stretches and bloats the stomach, (9) such as overeating, rapid food consumption and gulping, especially of carbonated drinks. (10) Emotionally, intense feelings and our responses to them, such as laughing, sobbing, anxiety and excitement, can also be triggers. (11) This list is not at all exhaustive; in fact, the range of stimuli is so large that hiccups might be considered the common thread between a drunk man, a Parkinson’s disease patient and anyone who watches The Notebook. The one thing that alcohol, (12) some neurological drugs (13) and intense sobbing (14) do have in common is that they exogenously stimulate the hiccup reflex arc. (15) This arc involves the vagal and phrenic nerves that stretch from the brainstem to the abdomen which cause the diaphragm to contract involuntarily. (16) According to Professor Georg Petroianu from the Herbert Wertheim College of Medicine, (17) many familiar home remedies for hiccupping – being scared, swallowing ice, drinking water upside down – interrupt this reflex arc, actually giving these solutions a somewhat scientific rationale. While modern research has successfully mapped out the process of hiccups, their purpose is still unclear. As of now, the hiccup reflex arc and the resulting diaphragmatic spasms seem to be effectively useless. 
Of the existing theories for the function of hiccups, the most prominent seems to be that they are a remnant of our evolutionary development, (18) essentially ‘vestigial’; in this case, a feature that once served our amphibian ancestors millions of years ago, but now retains little of its original function. (19) In particular, hiccups are believed to be a relic of the ancient transition of organisms from water to land. (20) When early fish lived in stagnant waters with little oxygen, they developed lungs to take advantage of the air overhead, in addition to using gills while underwater. (21) In this system, inhalation would allow water to move over the gills, during which a rapid closure of the glottis – which we see now in hiccupping – would prevent water from entering the lungs. It is theorised that when descendants of these fish moved onto land, gills were lost, but the neural circuit for this glottis closing mechanism was retained. (22) This neural circuit is indeed observable in human beings today, in the form of the hiccup central pattern generator (CPG). (23) CPGs exist for other oscillating actions like breathing and walking, (24) but a particular cross-species CPG stands out as a link to human hiccupping: the neural CPG that is also used by tadpoles for gill ventilation. Tadpoles “breathe” in a recurring, rhythmic pattern that shares a fundamental characteristic feature with hiccups: both involve inspiration with closing of the glottis. (25) This phenomenon strengthens the idea that the hiccup CPG may be left over from a previous stage in evolution and has been retained in both humans and frogs. However, the CPG in frogs is still used for ventilation, while in humans, the evolution of lungs to replace gills has rendered it useless. (26) Based on this information, it seems hiccupping lost its function with time and the development of the human lungs, remaining as nothing more than an evolutionary remnant. 
However, we cannot discredit hiccupping as having become entirely useless as soon as gills were lost. Interestingly, hiccupping has only been observed in mammals – not in birds, lizards or other air-breathing animals. (27) This suggests that there must have been some evolutionary advantage to hiccupping at some point, at least in mammals. A popular theory for this function stems from the uniquely mammalian trait of nursing. (28) Considering the fact that human babies hiccup in the womb even before birth, this theory considers hiccupping to be almost a glorified burp, intended to remove air from the stomach. This becomes particularly advantageous when closing the glottis prevents milk from entering the lungs, aiding the act of nursing. (29) Today, we reduce hiccups to the disorder and disarray they bring to our day. But, next time you are hit with a bout of hiccups, take a second to find some calm amidst the chaos and appreciate yet another fascinating evolutionary fossil, before you hurry to dismiss them. After that, feel free to eat those lemons or gargle that salty water to your diaphragm’s content. References Sonya Vatomsky, "7 Cures For Hiccups From World Folklore," Mentalfloss.Com, 2017, https://www.mentalfloss.com/article/500937/7-cures-hiccups-world-folklore. Derek Lue, "Indian Superstition: Hiccups | Dartmouth Folklore Archive," Journeys.Dartmouth.Edu, 2018, https://journeys.dartmouth.edu/folklorearchive/2018/11/14/indian-superstition-hiccups/. Vatomsky, "7 Cures For Hiccups From World Folklore". James Mundy, "10 Most Interesting Superstitions In Japanese Culture | Insidejapan Tours," Insidejapan Blog, 2013, https://www.insidejapantours.com/blog/2013/07/08/10-most-interesting-superstitions-in-japanese-culture/. Paul Rousseau, "Hiccups," Southern Medical Journal, no. 88, 2 (1995): 175-181, doi:10.1097/00007611-199502000-00002. 
Bruno Bordoni and Emiliano Zanier, "Anatomic Connections Of The Diaphragm Influence Of Respiration On The Body System," Journal Of Multidisciplinary Healthcare, no. 6 (2013): 281, doi:10.2147/jmdh.s45443. Christian Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," Bioessays no. 25, 2 (2003): 182-188, doi:10.1002/bies.10224. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188. John Cameron, “Why Do We Hiccup?,” filmed for TedEd, 2016, TED Video, https://ed.ted.com/lessons/why-do-we-hiccup-john-cameron#watch. Monika Steger, Markus Schneemann, and Mark Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," Alimentary Pharmacology & Therapeutics 42, no. 9 (2015): 1037-1050, doi:10.1111/apt.13374. Lien-Fu Lin, and Pi-Teh Huang, "An Uncommon Cause Of Hiccups: Sarcoidosis Presenting Solely As Hiccups," Journal Of The Chinese Medical Association 73, no. 12 (2010): 647-650, doi:10.1016/s1726-4901(10)70141-6. Steger, Schneemann and Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," 1037-1050. Unax Lertxundi et al., "Hiccups In Parkinson’s Disease: An Analysis Of Cases Reported In The European Pharmacovigilance Database And A Review Of The Literature," European Journal Of Clinical Pharmacology 73, no. 9 (2017): 1159-1164, doi:10.1007/s00228-017-2275-6. Lin and Huang, "An Uncommon Cause Of Hiccups: Sarcoidosis Presenting Solely As Hiccups," 647-650. Peter J. Kahrilas and Guoxiang Shi, "Why Do We Hiccup?" Gut 41, no. 5 (1997): 712-713, doi:10.1136/gut.41.5.712. Steger, Schneemann and Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," 1037-1050. Georg A. Petroianu, "Treatment Of Hiccup By Vagal Maneuvers," Journal Of The History Of The Neurosciences 24, no. 2 (2014): 123-136, doi:10.1080/0964704x.2014.897133. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188. 
Cameron, “Why Do We Hiccup?”
Michael Mosley, "Anatomical Clues To Human Evolution From Fish," BBC News, published 2011, https://www.bbc.com/news/health-13278255.
Michael Hedrick and Stephen Katz, "Control Of Breathing In Primitive Fishes," Phylogeny, Anatomy And Physiology Of Ancient Fishes (2015): 179-200, doi:10.1201/b18798-9.
Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188.
Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188.
Pierre A. Guertin, "Central Pattern Generator For Locomotion: Anatomical, Physiological, And Pathophysiological Considerations," Frontiers In Neurology 3 (2013), doi:10.3389/fneur.2012.00183.
Hedrick and Katz, "Control Of Breathing In Primitive Fishes," 179-200.
Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188.
Daniel Howes, "Hiccups: A New Explanation For The Mysterious Reflex," Bioessays 34, no. 6 (2012): 451-453, doi:10.1002/bies.201100194.
Howes, "Hiccups: A New Explanation For The Mysterious Reflex," 451-453.
Howes, "Hiccups: A New Explanation For The Mysterious Reflex," 451-453.

  • Wicked Invaders of the Wild | OmniSci Magazine

< Back to Issue 5 Wicked Invaders of the Wild Serenie Tsai 24 October 2023 Edited by Krisha Darji Illustrated by Jennifer Nguyen

Since the beginning of time, species have flowed in and out of regions, establishing the foundations of ecosystems. When introduced species spread through new environments and reproduce so prolifically that they interfere with native species, they become invasive. Factors contributing to their menacing status include preying excessively on native species, a lack of natural predators, and outcompeting natives for resources (Sakai et al., 2001). Invasive species shouldn’t be confused with feral species, which are domestic animals that have reverted to their wild state, or with pests, which are organisms harmful to human activity (Contreras-Abarca et al., 2022; Hill, 1987). Furthermore, not all introduced species are invasive; crops such as wheat, tomato and rice have been successfully integrated into native agriculture. Many species were introduced accidentally and turned invasive; however, some were intentionally introduced to manage other species, and a lack of foresight resulted in detrimental ecological impacts. Each year, invasive species cost the global economy over a trillion dollars in damages (Roth, 2019).

Claimed ecological benefits of invasive species

Contrary to the name, invasive species can sometimes benefit the invaded ecosystem. Herbivores can reap the benefits of the introduced biodiversity, and native plants can increase their tolerance to herbivory (Brändle et al., 2008; Mullerscharer, 2004). Deer and goats aid in suppressing introduced grasses and inhibiting wildfires (Fornoni, 2010). Likewise, species such as foxes and cats have the capacity to regulate the numbers of rats and rabbits.
Furthermore, megafaunal extinctions have opened opportunities to fill empty niches; for example, camels could fill the ecological niche of a now-extinct giant marsupial (Chew et al., 1965; Weber, 2017). Studies likewise indicate the possibility of species evolving to fill vacant niches (Meachen et al., 2014). Below, I’ll explore the rise and downfall of invasive species in Australia.

Cane toad

Cane toads are notorious for their unforeseen invasion. Originally introduced in 1935 as a biological control for cane beetles, their newcomer status was advantageous to their proliferation and dominance over native species (Freeland & Martin, 1985). Several native predator populations were devastated, as native fauna in Australia lacked resistance to the poison the cane toad uses as a defence mechanism (Smith & Phillips, 2006). However, research suggests native predators are evolving adaptations to this poison (Phillips & Shine, 2006). There isn't a universal method to regulate cane toads, so efforts to eradicate them completely are futile. However, populations are kept low by continuously monitoring areas and targeting both cane toad eggs and adults.

Common Myna

The origins of the Common Myna introduced into New South Wales and Victoria are uncertain; however, it was introduced into Northern Queensland to predate on grasshoppers and cane beetles (Cayley & Lindsey, 2011), and into Mauritius to control locust plagues (Bauer, 2023). The Common Myna poses an alarming threat to ecosystems and mankind; its severity is elucidated by its position on the list of the world’s 100 worst invasive species (Lowe et al., 2000). It has spurred human health concerns, including spreading mites and acting as a vector for diseases destructive to humans and farm stock (Tidemann, 1998).
The Myna also has a vicious habit of competing with cavity-nesting native birds, forcing them and their eggs from their nests; however, the extent of this is unclear, and the influence of habitat destruction needs to be considered (Grarock et al., 2013). The bird’s impact lacks empirical evidence, so appropriate management remains undecided (Grarock et al., 2012). However, modifying habitats could be advantageous, since Mynas impact urban areas most, whereas intervening in their food resources would be rendered useless by their highly variable diet (Brochier et al., 2012).

Zebra mussels

Zebra mussels accidentally invaded Australia's waters when introduced via the ballast water of cargo ships. From an ecological perspective, zebra mussels overgrow the shells of native molluscs and create an imbalance within the ecosystem (Dzierżyńska-Białończyk et al., 2018). From a societal perspective, they colonise docks, ship hulls and water pipes, and damage power plants (Lovell et al., 2006). Methods of controlling the spread of zebra mussels include manual removal, chlorine, thermal treatment and more.

Control methods

It is crucial to deploy preventative methods to mitigate the spread of invasive species before it becomes irreversible. A few known control methods are employed for certain types of animals, but with no guarantee of success. Some places put bounties on catching the animals; however, the results of this technique are mixed. In 1893, foxes were the target of financial incentives, but the scheme was deemed ineffective (Saunders et al., 2010). Conversely, government bounties introduced for Tasmanian tigers in 1888 caused a drastic population decline and their eventual extinction (National Museum of Australia, 2019). Similarly, when the prevalence of Cane Toads became unbearable, armies were deployed and fences in rural communities were funded in response.
Moreover, in 2007, inspired by a local pub’s scheme to hand out beers in exchange for cane toads, the government staged a “Toad Day Out”, establishing a bounty for cane toads (Williams, 2011). Whether introduced intentionally or by accident, invasive species are detrimental to ecosystems, and their management is still a work in progress.

References

Lowe, S., Browne, M., Boudjelas, S., & De Poorter, M. (2000). 100 of the World’s Worst Invasive Alien Species: A selection from the Global Invasive Species Database. The Invasive Species Specialist Group (ISSG).
Bauer, I. L. (2023). The oral repellent – science fiction or common sense? Insects, vector-borne diseases, failing strategies, and a bold proposition. Tropical Diseases, Travel Medicine and Vaccines, 9(1), 7.
Brändle, M., Kühn, I., Klotz, S., Belle, C., & Brandl, R. (2008). Species richness of herbivores on exotic host plants increases with time since introduction of the host. Diversity and Distributions, 14(6), 905–912. https://doi.org/10.1111/j.1472-4642.2008.00511.x
Brochier, B., Vangeluwe, D., & Van den Berg, T. (2010). Alien invasive birds. Revue scientifique et technique, 29(2), 217.
Chew, R. M., & Chew, A. E. (1965). The Primary Productivity of a Desert-Shrub (Larrea tridentata) Community. Ecological Monographs, 35(4), 355–375. https://doi.org/10.2307/1942146
Contreras-Abarca, R., Crespin, S. J., Moreira-Arce, D., & Simonetti, J. A. (2022). Redefining feral dogs in biodiversity conservation. Biological Conservation, 265, 109434. https://doi.org/10.1016/j.biocon.2021.109434
Fornoni, J. (2010). Ecological and evolutionary implications of plant tolerance to herbivory. Functional Ecology, 25(2), 399–407. https://doi.org/10.1111/j.1365-2435.2010.01805.x
Freeland, W. J., & Martin, K. C. (1985).
The rate of range expansion by Bufo marinus in Northern Australia, 1980–84. Wildlife Research, 12(3), 555–559.
Grarock, K., Lindenmayer, D. B., Wood, J. T., & Tidemann, C. R. (2013). Does human-induced habitat modification influence the impact of introduced species? A case study on cavity-nesting by the introduced common myna (Acridotheres tristis) and two Australian native parrots. Environmental Management, 52, 958–970.
Smith, J. G., & Phillips, B. L. (2006). Toxic tucker: the potential impact of Cane Toads on Australian reptiles. Pacific Conservation Biology, 12(1), 40. https://doi.org/10.1071/pc060040
Hill, D. S. (1987). Agricultural Insect Pests of Temperate Regions and Their Control. CUP Archive. https://books.google.com.au/books?hl=en&lr=&id=3-w8AAAAIAAJ&oi=fnd&pg=PA27&dq=pests+definition&ots=90_-WiF_MZ&sig=pKxuVjDJ_bZ3iNMb5TpfXA16ENI#v=onepage&q=pests%20definition&f=false
Lovell, S. J., Stone, S. F., & Fernandez, L. (2006). The Economic Impacts of Aquatic Invasive Species: A Review of the Literature. Agricultural and Resource Economics Review, 35(1), 195–208. https://doi.org/10.1017/s1068280500010157
Meachen, J. A., Janowicz, A. C., Avery, J. E., & Sadleir, R. W. (2014). Ecological Changes in Coyotes (Canis latrans) in Response to the Ice Age Megafaunal Extinctions. PLoS ONE, 9(12), e116041. https://doi.org/10.1371/journal.pone.0116041
Mullerscharer, H. (2004). Evolution in invasive plants: implications for biological control. Trends in Ecology & Evolution, 19(8), 417–422. https://doi.org/10.1016/j.tree.2004.05.010
ANU. (n.d.). Myna problems. http://fennerschool-associated.anu.edu.au//myna/problem.html
National Museum of Australia. (2019).
Extinction of thylacine. Nma.gov.au. https://www.nma.gov.au/defining-moments/resources/extinction-of-thylacine
Cayley, N. W., & Lindsey, T. (2011). What bird is that?: a completely revised and updated edition of the classic Australian ornithological work. Walsh Bay, N.S.W.: Australia’s Heritage Publishing.
Phillips, B. L., & Shine, R. (2006). An invasive species induces rapid adaptive change in a native predator: cane toads and black snakes in Australia. Proceedings of the Royal Society B: Biological Sciences, 273(1593), 1545–1550. https://doi.org/10.1098/rspb.2006.3479
Roth, A. (2019, July 3). Why you should never release exotic pets into the wild. National Geographic. https://www.nationalgeographic.com/animals/article/exotic-pets-become-invasive-species
Sakai, A. K., Allendorf, F. W., Holt, J. S., Lodge, D. M., Molofsky, J., With, K. A., Baughman, S., Cabin, R. J., Cohen, J. E., Ellstrand, N. C., McCauley, D. E., O’Neil, P., Parker, I. M., Thompson, J. N., & Weller, S. G. (2001). The Population Biology of Invasive Species. Annual Review of Ecology and Systematics, 32(1), 305–332. https://doi.org/10.1146/annurev.ecolsys.32.081501.114037
Saunders, G. R., Gentle, M. N., & Dickman, C. R. (2010). The impacts and management of foxes (Vulpes vulpes) in Australia. Mammal Review, 40(3), 181–211.
Weber, L. (2013). Plants that miss the megafauna. Wildlife Australia, 50(3), 22–25. https://search.informit.org/doi/10.3316/ielapa.555395530308043
Williams, G. (2011). 100 Alien Invaders. Bradt Travel Guides. https://books.google.com.au/books?hl=en&lr=&id=qtS9TksHmOUC&oi=fnd&pg=PP1&dq=invasive+species+australia+bounty+

