- From the Editors-in-Chief | OmniSci Magazine
From the Editors-in-Chief by Caitlin Kane, Rachel Ko, Patrick Grave, Yvette Marris 1 July 2023 Edited by the Committee Illustrated by Gemma van der Hurk Scirocco, summer sun, shimmering on the horizon. Salt-caked channels spiderweb your lips, scored by rivulets of sweat. Shifting, hissing sands sting your legs. You are the explorer, the adventurer, the scientist. A rusted spring, you heave forward, straining for each step, hauling empty waterskins. ----- The lonely deserts of science provide fertile ground for mirages. An optical phenomenon that appears to show lakes in the distance, the mirage has long been a metaphor for foolhardy hopes and desperate quests. The allure of a sparkling oasis just over the horizon, however, is undeniable. The practice of science involves both kinds of stories. Some scientists set a distant goal and reach it — perhaps they are lucky, perhaps they have exactly the right skills. Other scientists yearn to crack a certain problem but never quite get there. In this issue of OmniSci Magazine, we chose to explore this quest for the unknown that may be bold, unlucky, or even foolhardy: chasing the ‘Mirage’. Each article was written by a student, edited by students, and accompanied by an illustration created by a student. We, as a magazine, exist to provide university students with a place to develop their science communication skills and share their work. If there’s a piece you enjoy, feel free to leave a comment or send us some feedback – we love to know that our work means something to the wider world. We’d like to thank all our contributors — our writers, designers, editors, and committee — who have each invested countless hours into crafting an issue that we are all incredibly proud of. We’d also like to thank you, our readers; we are incredibly grateful that people want to read student pieces and learn little bits from the work. That’s enough talking from us until next issue. 
Go and read some fantastic student writing!
- Peaks and Perspectives: A Word from the Editors-in-Chief | OmniSci Magazine
Peaks and Perspectives: A Word from the Editors-in-Chief by the Editors-in-Chief 22 October 2024 illustrated by Ingrid Sefton In geometry, an apex may refer to the highest point of a solid figure, such as a pyramid. Move to the fields of ecology and evolution, and we find apex predators, overseeing population dynamics atop the food chain. We too find ourselves situated at an apex position in society – observing, experimenting with, and utilising the world at our feet for scientific innovation and headway. Common amongst these apexes in science is, unsurprisingly, the emphasis on reaching soaring heights and breathtaking summits. We strive to reach these peaks, endpoints that are perceived to signal scientific greatness and knowledge. We create, we innovate, we explore – all with this vision in mind. Yet, this is not, or rather, should not be the “why” for scientific endeavour. Implicit in reaching the highest point of something is the notion that there is no further to climb. That upon reaching an apex, all that remains is to precariously balance upon this peak and hope not to misstep, tumbling down from great heights. Scientific curiosity and a yearning to understand the science underpinning our existence is not about reaching the envisioned apex. It is instead defined by the steps climbed by us and our predecessors in our journey towards discovery, and in turn, the steps that remain untrod and paths that remain uncharted. The routes we are yet to take will be forever changing. Piloted by the evolving foci of our society, where and how we may next seek to innovate remains undetermined. Infinite possibilities abound. With a bird’s-eye view, Apex visualises the new levels of human-tech connectivity, the ills of antimicrobial resistance, and the fringes of outer space that loom on the horizon; in doing so, it encourages readers to envisage where the next steps may lie. 
Yet alongside these perspectives of the expansive, limitless world, Apex invites reflection and hypotheticals. Taking time to pause from the unfaltering upward march of innovation, this issue embraces the breathtaking view of where we are now. Apex guides us to consider time-old traditions and technicalities from a new perspective, celebrating those who have paved the way to the peaks of modern science. Wandering within, across and between disciplines of Science, it is these ruminations along the way that enrich the journey. After all, what is scientific advancement without knowing what we do not know? In the words of Sir Isaac Newton, it is by standing on the shoulders of giants that we hope to see further. So come along, and revel in the expansive view. Let the heights of scientific innovation inspire you, but don’t let such peaks constrain you.
- Cosmic Carbon Vs Artificial Intelligence | OmniSci Magazine
Cosmic Carbon Vs Artificial Intelligence by Gaurika Loomba 28 May 2024 Edited by Rita Fortune Illustrated by Semko van de Wolfshaar “There are many peculiar aspects of the laws of nature that, had they been slightly different, would have precluded the existence of life” - Paul Davies, 2003 Almost 14 billion years ago, there was nothing but an incredibly hot, dense speck of matter. This speck exploded, and the universe was born. Within the first hundredth of a billionth of a trillionth of a trillionth of a second, the universe began expanding at an astronomical rate. For the next 400 million years, the universe was made of hydrogen, helium, and a dash of lithium – until I was born. And thus began all life as you know it. So how did I, the element of life, the fuel of industries, and the constituent of important materials, originate? Stars. Those shiny, mystical dots in the night sky are giant balls of hot hydrogen and helium gas. Only in their centres are temperatures high enough to facilitate the collision of three helium-4 nuclei within a tiny fraction of a second. I am carbon-12, the element born out of this extraordinary reaction. My astronomical powers come from my atomic structure; I have six electrons, six protons, and six neutrons. The electrons form teardrop-shaped clouds, spread tetrahedrally around my core, my nucleus, where the protons and neutrons reside. My petite size and my outer electrons allow my nucleus to exert a balanced force on other atoms that I bond with. This ability to make stable bonds makes me a major component of proteins, lipids, nucleic acids, and carbohydrates, the building blocks of life. The outer electrons also allow me to form chains, sheets, and blocks of matter, such as diamond, with other carbon-12 atoms. Over the years of evolution, organic matter buried in Earth formed fossil fuels, so I am also the fuel that runs the modern world. 
As if science wasn’t enough, my spiritual significance reiterates my importance for the existence of life. According to Hindu philosophy, the divine symbol ‘Aum’ is the primordial sound of the Cosmos and the ‘Swastika’ its visual embodiment. ‘Alpha’ and ‘Omega’, the first and last letters of the Greek alphabet, represent the beginning and the end, the ‘Eternal’, in Christian spirituality. When scientists photographed my atomic structure, spiritual leaders saw the ‘Aum’ in my three-dimensional view and the ‘Swastika’ in my two-dimensional view. Through other angles, the ‘Alpha’ and ‘Omega’ have also been visualised (Knowledge of Reality, 2001). I am the element of life, and within me is the divine consciousness. I am the beginning and I am the end. My greatness has been agreed upon by science and spirituality. In my absence, there would be no life, an idea humans call carbon chauvinism. This ideology and my greatness remained unquestioned for billions of years, until the birth of Artificial Intelligence. I shaped the course of evolution for humans to be self-conscious and intelligent life forms. With the awareness of self, I aspired for humans to connect back to the Cosmos. But now my intelligent toolmakers, aka humans, are building intelligent tools. Intelligence and self-consciousness, which took nature millions of years to generate, are losing their uniqueness. Unfortunately, if software can be intelligent, there is nothing to stop it becoming conscious in the future. Soon, the earth will be populated by silicon-based entities that can compete with my best creation. Does this possibility compromise my superiority? A lot of you may justifiably think so. The truth is that I am the beginning. Historically, visionaries foresaw asteroid attacks as the end to human life. These days, climate change, which is an imbalance of carbon in the environment, is another prospective end. Now, people believe that conscious AI will outlive humans. 
They suggest that I will not be the end; that my powers and superiority will be snatched by AI. So the remaining question is: who will be the end? I could tell you the truth, but I want to see who is with me at the end. The choice is yours.

References
Davies, P. (2003). Is anyone out there? https://www.theguardian.com/education/2003/jan/22/highereducation.uk
Knowledge of Reality (2001). Spiritual Secrets in the Carbon Atom. https://www.sol.com.au/kor/11_02.htm
- Fungal Pac Man | OmniSci Magazine
Fungal Pac Man by Ksheerja Srivastava 3 June 2025 Edited by Rita Fortune Illustrated by Esme MacGillivray We live in a world where a fungus would probably beat you at Pac-Man. While playing, the average person just follows the dots, but fungi are playing a whole different game. Despite having no central brain, they navigate complex mazes, optimise routes, and even communicate across vast networks. To do so, fungi use such efficient strategies that scientists are studying them as a means to improve everything from city planning to biosensors. Nature was perfecting pathfinding long before we put a quarter in the arcade. The elongated bodies of fungi, known as mycelia, build vast and complex networks. These structures emerge from natural algorithms - specifically, a process called collision-induced branching (1). In this process, new growth divides into new paths upon meeting an obstacle. When fungal hyphae hit a wall (literally or figuratively), they don’t just stop; they branch out, adapt, and keep moving. Traditional path-finding algorithms like Depth-First Search (DFS) or Breadth-First Search (BFS) methodically crawl through paths, moving step by step without reacting to obstacles (2). Fungi, on the other hand, adjust on the fly, often landing on the most resource-efficient routes way faster. Imagine reaching a junction in Pac-Man and instead of choosing just one path, Pac-Man splits into two, each clone taking a different route to cover more ground. This is exactly why fungal networks often end up looking eerily like optimised transport systems, such as railway lines or power grids! (3) Some fungi aren’t just clever in how they grow - they can quite literally compute. Certain species, like Basidiomycete fungi, communicate through spikes of electrical activity pulsing through their mycelial networks, processing information in ways surprisingly reminiscent of neural systems (4). 
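For readers who like to tinker, the branching rule described above can be sketched in a few lines of Python. This is a toy illustration of the collision-induced branching idea, not the actual algorithm from the cited papers: each “hyphal tip” keeps growing straight ahead, and only branches into the open side passages when it collides with a wall or an already-colonised cell.

```python
# Toy sketch of collision-induced branching on a grid maze.
# A tip is (position, heading, steps grown); tips grow straight until blocked,
# then branch into the two perpendicular directions.

def hyphal_search(maze, start, goal):
    """Return the path length a branching "hypha" finds to goal, or None.

    maze: list of equal-length strings, '#' marks a wall.
    start, goal: (row, col) tuples on open cells.
    """
    rows, cols = len(maze), len(maze[0])

    def open_cell(r, c):
        return 0 <= r < rows and 0 <= c < cols and maze[r][c] != '#'

    # Seed one tip per open direction out of the start cell.
    tips = [(start, d, 0)
            for d in ((-1, 0), (1, 0), (0, -1), (0, 1))
            if open_cell(start[0] + d[0], start[1] + d[1])]
    visited = {start}
    best = None

    while tips:
        grown = []
        for (r, c), (dr, dc), steps in tips:
            ahead = (r + dr, c + dc)
            if open_cell(*ahead) and ahead not in visited:
                candidates = [(ahead, (dr, dc))]  # keep growing straight
            else:
                # Collision: branch into the two perpendicular directions.
                candidates = [((r + pdr, c + pdc), (pdr, pdc))
                              for pdr, pdc in ((dc, dr), (-dc, -dr))]
            for pos, heading in candidates:
                if open_cell(*pos) and pos not in visited:
                    visited.add(pos)
                    if pos == goal:
                        best = steps + 1 if best is None else min(best, steps + 1)
                    else:
                        grown.append((pos, heading, steps + 1))
        tips = grown
    return best


maze = [
    "#####",
    "#S..#",
    "###.#",
    "#G..#",
    "#####",
]
print(hyphal_search(maze, (1, 1), (3, 1)))  # -> 6 (six steps along the corridor)
```

Unlike BFS, which fans out in every direction at every cell, the tips here only spend effort on decisions when they hit something, which is the flavour of economy the fungal studies describe.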
What makes them even more intriguing is their hypersensitivity to the world around them. These organisms can detect subtle shifts in their environment - both chemical and physical. It’s like they’ve memorised every path they’ve taken, so when a new pellet appears on the far side of the board, they don’t need to search blindly. They already know the fastest way there, no matter where the original Pac-Man started. Endophytic fungi, fungi that live inside plants without causing harm, have been used to create biosensors - devices that can detect environmental contaminants like pollutants or pesticides (5). When these fungi encounter harmful chemicals, they react, making them perfect for monitoring things like toxins in the environment. Scientists have even developed yeast-based biosensors to specifically detect chemicals like tebuconazole, a common pesticide (6). Fungi don’t stop at chemistry and computations. It turns out they’re mechanically perceptive too. In one study, oyster fungi incorporated into fungal insoles responded to compressive stress, hinting at applications in wearable tech or even seismic sensing systems (7). Mycelium-based composites also exhibit unique patterns of electrical activity as moisture levels shift, making them promising candidates for humidity-responsive technologies. As if that weren’t enough, some fungi have the incredible ability to glow in the dark, a phenomenon known as bioluminescence. This natural light can be harnessed in special sensors, which use the glow to indicate the presence of specific substances. Essentially, when the fungi detect certain chemicals, they light up, providing an easy way to spot pollutants or toxins (8). These properties make fungi wildly efficient. No random turns, no wasted loops, just constant feedback powering smarter decisions. They know where they’ve been, sense what’s coming, and find the fastest route every time. 
It’s Pac-Man with a built-in optimisation engine, and that’s exactly how fungi behave in the wild. How well do you think you’d do against this version of Pac-Man? Probably not great. Let’s face it: they’re not only outsmarting us, they’re doing it with no brain at all. As we look toward smarter and more sustainable technologies, fungi might just be the key to a new era of bio-inspired computing and environmental monitoring. Researchers are already tapping into their natural brilliance to create more efficient systems for everything from biosensors to sustainable materials. The next time you see a mushroom, remember: it’s not just a fungus, it’s part of a vast, intelligent network playing the ultimate game of survival, one optimised move at a time. In a world where efficiency and adaptability are paramount, fungi might just be the unsung heroes we need to help us solve some of the biggest challenges ahead.

References
1. Asenova E, Lin HY, Fu E, Nicolau DV, Nicolau DV. Optimal Fungal Space Searching Algorithms. IEEE Trans Nanobioscience. 2016;15(7):613-618. doi: 10.1109/TNB.2016.2567098.
2. Hanson KL, Nicolau DV Jr, Filipponi L, Wang L, Lee AP, Nicolau DV. Fungi use efficient algorithms for the exploration of microfluidic networks. Small. 2006;2(10):1212-20. doi: 10.1002/smll.200600105.
3. Asenova E, Fu E, Nicolau DV Jr, Lin HY, Nicolau DV. Space searching algorithms used by fungi. In: Proceedings of the 9th EAI International Conference on Bio-inspired Information and Communications Technologies (BICT'15). European Alliance for Innovation; 2016.
4. Adamatzky A. Towards fungal computers. Interface Focus. 2018;8(6):20180029.
5. Khanam Z, Gupta S, Verma A. Endophytic fungi-based biosensors for environmental contaminants - a perspective. South African Journal of Botany. 2020;134:401-6.
6. Mendes F, Miranda E, Amaral L, Carvalho C, Castro BB, Sousa MJ, Chaves SR. Novel yeast-based biosensor for environmental monitoring of tebuconazole. Applied Microbiology and Biotechnology. 2024;108(1):10.
7. Nikolaidou A, Phillips N, Tsompanas MA, Adamatzky A. Reactive fungal insoles. In: Fungal Machines: Sensing and Computing with Fungi. Cham: Springer Nature Switzerland; 2023. pp. 131-147.
8. Singh S, Kumar V, Dhanjal DS, Thotapalli S, Singh J. Importance and recent aspects of fungal-based biosensors. In: New and Future Developments in Microbial Biotechnology and Bioengineering. Elsevier; 2020. pp. 301-309.
- Behind the Scenes of COVID-19 | OmniSci Magazine
Conversations in Science Behind the Scenes of COVID-19 with Dr Julian Druce By Zachary Holloway What will our future with COVID-19 look like? How do we live with it? How could it have been managed better? In conversation with Dr Julian Druce, a renowned expert in the field of virology. Edited by Caitlin Kane & Breana Galea Issue 1: September 24, 2021 Illustration by Janna Dingle Interview with Dr Julian Druce, head of the Virus Identification Laboratory at the Victorian Infectious Diseases Reference Laboratory. Before the middle of 2021, it seemed Australia was finally seeing the back of the COVID-19 pandemic: case numbers were down, the vaccine rollout was gaining momentum and Victoria had defeated the Delta variant twice. Fast forward to today, and the outlook doesn’t appear to be as rosy. Over a year and a half from when the pandemic began, it is still dominating headlines around the world. But like many in Australia, I still had many questions regarding the state of the pandemic, our path out of it and how scientists behind the scenes were shaping our public health response. I sat down in conversation with Dr Julian Druce hoping to find some of the answers to these questions. Zachary Holloway: What was the work you were conducting at the Victorian Infectious Diseases Reference Laboratory (VIDRL) before the COVID-19 pandemic? Dr Julian Druce: VIDRL itself is a public health reference laboratory, with a large focus on virology. For virology there are four main labs: one is a big serology laboratory which tests for antibodies and the footprints that a virus leaves after your immune system has interrogated that pathogen. The other labs are more focused on direct detection of some specific viruses: there’s an HIV-specific lab, a hepatitis-specific lab and then my lab, which focuses on all other viruses. These mostly use very specific PCR (polymerase chain reaction) tests for the detection of the virus. 
Another option for rapidly detecting viruses that might be new is by having tests that, rather than detecting a specific virus, detect a family of viruses at once. They’re called consensus PCRs or pan-viral PCRs. One of those tests was a pan-coronavirus PCR, and that had been sitting in a freezer for thirteen years, only to be brought out at the start of 2020 when SARS-CoV-2 emerged, and that was the test we used to verify that we had the virus by sequencing the PCR product. ZH: I know that VIDRL was the first lab outside of China to grow SARS-CoV-2 in culture. What was the process for this, and how did this help in developing a standardised test for COVID-19? JD: My boss, Dr Mike Catton, and I had been on WHO [World Health Organisation] teleconference calls all through the preceding weeks where everyone was clamouring for someone to grow the virus. So I immediately put it up for culture on the Friday night when we detected it. This process puts a small amount of patient sample onto cells that may get infected with the virus. I came in on Sunday to check it, and thought something might be happening so put the flask of cells onto a camera that took photos every fifteen minutes. As soon as I checked this on Monday, I knew that it was growing because there was an obvious pattern in the cells that showed they were changing. In terms of having the cultured virus, it was then just a process of getting it out to other labs and collaborators. We gamma-irradiated some material and that material, which is killed, was a good positive control material for other laboratories to use to verify and validate their testing algorithms. Because at that point, there were only self-designed tests for COVID-19 in a few labs. This material was used to help validate all the labs around Melbourne and Australia as commercial tests became available to get them ready for testing. 
ZH: How important was genome sequencing for our contact tracers to be better able to track and trace the spread of the virus? JD: In general, roughly every two weeks the virus will generate one mutation somewhere. That mutation can be used to track the lineage – a bit like a family tree – and once that mutation goes from, say, me to you, you might get a new mutation when you pass it on to someone else. That mutation then becomes a key identifier for that strain. That really helped in tracking and tracing in the early days, to understand who was probably giving it to whom even though contact tracing can often work that out. Importantly though, at that very early stage we closed our borders to China, but we left our borders open to America and Europe. So as cases were coming in from those countries, we had to do genomic sequencing to verify what strain, or lineage if you like, with key mutations were showing up. We could then readily identify whether the samples were from Europe, America or the Ruby Princess, or from wherever there were new cases coming in. ZH: Has the increased infectivity of the Delta variant of SARS-CoV-2 beaten contact tracers and made Australia’s “COVID zero” strategy unachievable? JD: In terms of “COVID zero”, the national pandemic plan has always been to suppress the virus and flatten the curve, and the public health aim of that is to push the volume of samples down and stretch it out along a timeline axis. You might end up with the same numbers, but it’s stretched out across a year rather than one or two months, which shatters your health system. But what we found early was that with a lot of goodwill and effort from the public, we did eliminate the virus. We didn’t necessarily expect to do that, so that was a lucky event. But with the Delta variant, it does seem that it spreads more efficiently: the calculated reproduction rate for this variant is about 3-4 or more, and about 2-3 for the original wild-type. 
So this makes it much harder to eliminate. ZH: I think millions of people around the country want to know the answer to this question, but when will lockdowns stop being a viable strategy for containing this virus? Does it come with increasing vaccination, or could it continue after that? JD: It very much depends on what happens as we move forward. Of course, vaccination is the pathway out of this. As more people become vaccinated and less susceptible to serious disease and death, we will slowly transform this virus into a common cold, or at least that’s what is likely to happen. But I suspect that as we open up, if it all goes badly, we may have to have some level of restrictions to mitigate transmission. Some of this is already being discussed with entry passports, and people not being allowed into pubs, theatres, or wherever else there is close confinement in a natural or urban setting, unless they’re double-dosed. ZH: In retrospect, how will we rate the response to this pandemic? Was it proportional to the dangers it posed? JD: I think that will be debated for years. Every country has done it a little bit differently, from the worst end of the scale to the best end of the scale. Australia is probably on the better end, in terms of suppressing and eliminating the virus, but we haven’t done as well with the vaccine rollout. We’re getting there now – we’re catching up – but I think, generally, Australia will be viewed favourably as having had a good response. In Australia there’s a double-edged sword with vaccination uptake because we didn’t have the carnage that other countries had. But now that we’ve got the virus circulating again, that has prompted a greater uptake of the vaccine, which is a good thing. 
Outside of Australia, I imagine the World Health Organisation will do an analysis of the generalised responses of different countries: from some of the poorer performers – like America and other countries that decided to let it rip, thinking that herd immunity was the best option – to the responses of other countries, mainly severe lockdowns, who suppressed and eliminated the virus. There are still many types of parameters to look at, from economic and socioeconomic to virological and epidemiological, a lot of elements still to tease apart when this is all done. Dr Julian Druce is the head of the Virus Identification Laboratory at the Victorian Infectious Diseases Reference Laboratory, where he works with a team to detect many of the viruses that infect humans and devises new ways to detect novel viruses. We would like to thank Dr Druce for taking the time to meet with us and discuss his work.
- Microbic Mirror of The Self | OmniSci Magazine
Microbic Mirror of The Self by Sarah Ibrahimi 3 June 2025 Edited by Jax Soon-Legaspi Illustrated by Noah Chen For decades, we did not fully understand the functional purposes of many parts of the human body. The spleen was once thought of as dispensable, earwax merely as dirty waste and the appendix as a useless leftover from the course of human evolution. But science has a habit of humbling us, and we now know that all of these components serve essential purposes in the human body. Our understanding of the gut microbiome is following a similar pattern. However, beyond knowing that it plays a role, we still lack a full understanding of the true nature and mechanisms of this mysterious system. Given the average person's current understanding of microbes, it is unsurprising that they are often associated with disease, capable of causing some of the most deadly disorders. They are thought of as entirely foreign figures that should remain separate from us. Nevertheless, just as they occupy our skin, our gut is home to them too. When we think of our own identities, we tend to boil ourselves down to a singular body, a singular self. Typically, we define ourselves by our jobs, the activities we enjoy and the values we admire - elements all tied to a single individual. Yet, within us lives an entire biosphere that hosts a whole community of microbes. These minute beings govern our guts in symbiosis with other systems of the human body and outnumber human cells ten to one (1). It is a wonder that we are home to trillions of bacteria and are barely conscious of their existence. How do these seemingly fatal organisms operate cooperatively with the body? Can we construe the self as a singular individual when our body is a complex community with seemingly precarious organisms living within us? 
“What lies behind us and what lies before us are tiny matters compared to what lies within us” - Ralph Waldo Emerson The community that is composed of bacteria, fungi, viruses and archaea plays a significant role in many aspects of our lives, affecting everything from the way we digest food to the regulation of our mental health. We understand the digestive system to be composed of the mouth, stomach, intestines and other vital organs as the main drivers of digestion. Similarly, the immune system depends on the bone marrow, spleen, white blood cells and antibodies to suppress an infection. Yet, the microbes sequestered within our gut assist extensively in driving the actions associated with these systems. In digestion, the range of their skill extends from the ability to synthesise vitamin K to using cross-feeding mechanisms - a phenomenon where one bacterium breaks down parts of plant compounds and passes the byproducts to others, resulting in boosted health (2,3). They have also been shown to promote gut barrier integrity to prevent the entry of harmful pathogens, while also helping regulate immune system homeostasis and enabling a strengthened immune response in the face of infection (3). Although there has been extensive research conducted to investigate the role of gut microbes in our physical health, their effects on our mental health have often been overlooked. Yet, they play a fundamental role in its regulation and the promotion of positive wellbeing. This contribution is most evident in the context of the gut-brain axis, which consists of two-way signalling between the central nervous system and enteric nervous system, serving the emotional and cognitive domains of the brain. 
Working hand-in-hand, the mental state of an individual can cause harmful alterations to the composition of healthy gut microbes and, in a reciprocal manner, a dysregulated gut flora can adversely affect the brain through pathways such as immune activation and the production of neuroactive substances (4). Such imbalances in the gut microbiota have been linked to the emergence of depressive-like behaviours (5), and have also been associated with an increased prevalence of other psychiatric disorders such as bipolar disorder, schizophrenia and anxiety (6). The last decade of science has seen a dramatic increase in our understanding of the gut microbiome as we know it today. Like in any field, however, there is still more to be discovered. Similar to the well-known genome-wide association studies that statistically link certain genetic markers to particular diseases or traits, metagenome-wide association studies are being conducted to identify associations between microbiome structures and several major diseases (7). Research in this field has already allowed for the detection of shifts in gut compositions and how these changes functionally contribute to many metabolic diseases. However, small sample sizes in such research highlight the need for further development within the field. “The self is not something ready-made, but something in continuous formation through choice of action” - John Dewey The human body has a mutual relationship with the gut microbiome, as exemplified by the gut-brain axis. So when one of these systems is not functioning at its peak, the performance of the other is also derailed. Dysbiosis of the gut's natural flora contributes to clinical conditions such as Irritable Bowel Syndrome (IBS), Autism Spectrum Disorder (ASD) and anxiety (4). However, microbial imbalance is mediated through the actions and behaviours of the individual at hand. 
Both chronic and acute stressors can increase gut barrier permeability, resulting in a “leaky” gut, allowing bacteria to seep into the cracks and trigger an array of physiological responses like inflammation. It is safe to say that there is no single, definitive state that our individual guts exist in. In a world driven by antimicrobial usage, fluctuating diets and the invisible weight of daily stress, the gut microbiome remains in a state of constant transformation. Ever-changing, it mirrors the conscious and unconscious choices we make, ultimately shaping our health in ways we are only beginning to imagine.

References
1. National Institutes of Health (NIH). NIH Human Microbiome Project defines normal bacterial makeup of the body [Internet]. 2015 [cited 2025 Jun 1]. Available from: https://www.nih.gov/news-events/news-releases/nih-human-microbiome-project-defines-normal-bacterial-makeup-body
2. Mueller C, Macpherson AJ. Layers of mutualism with commensal bacteria protect us from intestinal inflammation. Gut. 2006;55(2):276. Available from: https://gut.bmj.com/content/55/2/276
3. Zhang YJ, Li S, Gan RY, Zhou T, Xu DP, Li HB. Impacts of Gut Bacteria on Human Health and Diseases. International Journal of Molecular Sciences. 2015;16(4):7493-519.
4. Carabotti M, Scirocco A, Maselli MA, Severi C. The gut-brain axis: interactions between enteric microbiota, central and enteric nervous systems. Ann Gastroenterol. 2015;28(2):203-9.
5. Bested AC, Logan AC, Selhub EM. Intestinal microbiota, probiotics and mental health: from Metchnikoff to modern advances: Part I - autointoxication revisited. Gut Pathogens. 2013;5(1):5.
6. Nikolova VL, Smith MRB, Hall LJ, Cleare AJ, Stone JM, Young AH. Perturbations in Gut Microbiota Composition in Psychiatric Disorders: A Review and Meta-analysis. JAMA Psychiatry. 2021;78(12):1343-54.
7. Wang J, Jia H. Metagenome-wide association studies: fine-mining the microbiome. Nat Rev Microbiol. 2016;14(8):508-22. 
- Believing in aliens... A science?
< Back to Issue 3 Believing in aliens... A science? By Juulke Castelijn 10 September 2022 Edited by Tanya Kovacevic and Ashleigh Hallinan Illustrated by Quynh Anh Nguyen

The question of the existence of ‘intelligent life forms’ on a planet other than ours has always been one of belief. And I did not believe. It was probably the image of a green blob with multiple arms and eyes squelching across the ground and emitting noises unidentifiable as any form of language which turned me off the whole idea. But a book I read one day completely changed my mind; it wasn’t about space at all, but about evolution.

‘Science in the Soul’ is a collection of works written by the inimitable Richard Dawkins, a man who has argued on behalf of evolutionary theory for decades. Within its pages, you will find essays, articles and speeches from throughout his career, all aimed at inspiring deep rational thought in the field of science. A single essay gives enough food for thought to last the mind many days, but the ease and magnificence of Dawkins’ prose encourages the devouring of many pages in a single sitting. The reader becomes engulfed in scientific argument, quickly and completely.

Dawkins shows that a proper understanding of evolution is critical not just to biology, but to society at large. Take, for instance, ‘Speaking up for science: An open letter to Prince Charles’, in which he argues against modelling agricultural practices on natural processes as a way of combating climate change. Even if agriculture could be in itself a natural practice (it can’t), nature, Dawkins argues, is a terrible model for longevity. Instead, nature is ‘a short-term Darwinian profiteer’. Here he refers to the mechanism of natural selection, where offspring have an increased likelihood of carrying the traits which favoured their parents’ survival. Natural selection is a retrospective process.
At a population level, it highlights those genetic traits that increased the chances of survival in the past. There is no guarantee those traits will benefit the current generation at all, let alone future generations. Instead, Dawkins argues, science is the method by which new solutions to climate change are found. Whilst we cannot see the future, a rational application of a wealth of knowledge gives us a far more sensible approach than crude nature.

Well, perhaps not crude per se. If anyone is an advocate for the beauty and complexity of natural life, it is surely Dawkins. But a true representation of nature, he argues, rests on the appreciation of evolution as a blind process, with no aim or ambition, and certainly no pre-planned design. With this stance, Dawkins directly opposes Creationism as an explanation of how the world emerged, a battle from which he does not shy away.

Evolution is often painted as a theory in which things develop by chance, randomly. When you consider the complexity of a thing such as the eye, it is no wonder people prefer to believe in an intelligent designer, like a god, instead. But evolution is not dependent on chance at all, a fact Dawkins argues many times throughout his collection. There is nothing random about the body parts that make up modern humans, or any other living thing - they have been passed down from generation to generation because they enhanced our ancestors’ survival. The underlying logic is unrivalled, even by religion.

But that doesn’t mean Dawkins is not a man of belief. Dawkins believes in the existence of intelligent extraterrestrial life, and for one reason above all: given the billions upon billions of planets in our universe, our own evolution would have to be exceedingly improbable for it to have occurred only once. In other words, we believe there is life out there because we do not believe our own evolution to be so rare as to occur only once.
Admittedly, it is not a new argument, but it had not clicked for me before. Perhaps it was Dawkins’ poetic phrasing. At this stage it is a belief, underscored by a big ‘if’. How could we ever know if there are intelligent life forms on a planet other than Earth? Dawkins provides an answer here too. You probably won’t be surprised that the answer is science, specifically a knowledge of evolution. We do not have to discover life itself, only a sign of something that marks intelligence - a machine or language, say. Evolution remains our only plausible theory of how such a thing could be created, because it can explain the formation of an intelligent being capable of designing such things. We become the supporting evidence of life somewhere else in the universe. That’s satisfying enough for me.
- Tip of the Iceberg: An Overview of Cancer Treatment Breakthroughs | OmniSci Magazine
< Back to Issue 7 Tip of the Iceberg: An Overview of Cancer Treatment Breakthroughs by Arwen Nguyen-Ngo 22 October 2024 edited by Zeinab Jishi illustrated by Louise Cen

Throughout the history of science, there have been many firsts. Anaximander, a Greek scholar, was the first person to suggest the idea of evolution. Contrary to popular belief, the pioneers of human flight were not the Wright brothers but another pair of brothers, the Montgolfiers, who invented the hot air balloon. In 1796, the first ever vaccine was created by an English doctor, who tested his theory in a rather peculiar manner that would not be approved by today’s ethics guidelines (Rocheleau, 2020). Alongside these extraordinary discoveries, there continue to be many firsts and breakthroughs that have paved the way for the next steps in research. Of particular note is research into ground-breaking treatments for cancer patients.

1890s: Radiotherapy (Gianfaldoni et al., 2017)
In the last decade of the 19th century, Wilhelm Conrad Röntgen discovered X-rays, drastically changing the medical treatment of many diseases. Following this discovery, Emil Herman Grubbe commenced the first X-ray treatment for breast cancer, while Antoine Henri Becquerel delved deeper into radioactivity and its natural sources. In the same year that Röntgen discovered X-rays, Maria Sklodowska-Curie and Pierre Curie shared their vows, and only three years later they discovered radium as a source of radiation. At a time when skin cancers were frequently treated, this discovery kick-started research into X-rays and their use in medicine. Scientists and clinicians have since gained a far greater understanding of radiation as a treatment for disease, but the research does not stop there, and the advancement of radiotherapy continues to thrive.
1940s: First Bone Marrow Transplant (Morena & Gatti, 2011)
Following World War II, the physical consequences of war accelerated research into tissue transplantation. Skin grafts were needed for burn victims, blood transfusions required ABO blood typing, and high doses of radiation led to marrow failure and death. During this time, Peter Medawar began researching skin graft rejection at the request of the Medical Research Council, for whom the treatment of burn victims was a wartime priority. Medawar concluded that graft rejection was an immunological phenomenon related to histocompatibility antigens: cell surface glycoproteins that play critical roles in interactions with immune cells. They are unique to every individual, essentially flagging one’s cells as one’s own and thereby making every individual immunologically unique.

1953: First Human Tumour Cured
In 1953, Roy Hertz and Min Chiu Li used the drug methotrexate to achieve the first cure of a human tumour — in a patient with choriocarcinoma. Choriocarcinoma is an aggressive neoplastic trophoblastic disease that can be categorised into two types — gestational and non-gestational (Bishop & Edemekong, 2023). The cancer primarily affects women, as it grows aggressively in the uterus (MedlinePlus, 2024), but it can also occur in men as part of a mixed germ cell tumour (Bishop & Edemekong, 2023). Methotrexate is commonly used in chemotherapy because it acts as an antifolate antimetabolite that induces a cytotoxic effect on cells. Once methotrexate is taken up by cells, it forms methotrexate-polyglutamate, which in turn inhibits dihydrofolate reductase, an enzyme important for DNA and RNA synthesis (Hanoodi & Mittal, 2023). By inhibiting DNA synthesis, the drug exerts its cytotoxic effect on the cancerous cells.
Since that first cure of choriocarcinoma, methotrexate has been widely used both in chemotherapy and in other applications, including as an immunosuppressant for autoimmune diseases (Hanoodi & Mittal, 2023).

1997: First ever targeted drug: rituximab (Pierpont, Limper, & Richards, 2018)
Jumping ahead a few decades, 1997 was the year that JK Rowling published Harry Potter and the Philosopher’s Stone. It was also the year that the first targeted anti-cancer drug, rituximab, was approved by the U.S. Food and Drug Administration (FDA). Ronald Levy created rituximab to target malignant B cells. B cells express an antigen – CD20 – which allows them to develop and differentiate. Rituximab is an anti-CD20 monoclonal antibody, meaning that it targets the CD20 antigen expressed on malignant B cells. It has improved the progression-free survival and overall survival rates of many patients diagnosed with B cell leukemias and lymphomas (Pavlasova & Mraz, 2020). Much like the Philosopher’s Stone, rituximab can be thought of as extending the longevity of patients diagnosed with B cell cancers.

Although Levy created this drug, his predecessors should not be ignored. The research and development of monoclonal antibodies dates back to the mid-1970s, when César Milstein and Georges J. F. Köhler developed the first monoclonal antibody and described the first method for generating large amounts of monoclonal antibodies (Leavy, 2016; Pavlasova & Mraz, 2020). Milstein and Köhler achieved this by producing a hybridoma – “a cell that can be grown in culture and that produces immunoglobulins that all have the same sequence of amino acids and consequently the same affinity for only one epitope on an antigen that has been chosen by the investigator” (Crowley & Kyte, 2014).
They produced this cell by fusing a myeloma cell line with spleen cells from mice immunised against sheep red blood cells (Leavy, 2016).

Going forward: CAR T Cells
The most recent and exciting development in cancer research has been the development and use of chimeric antigen receptor (CAR) T cells. CAR T cell therapy is customised to each individual patient, as the CAR T cells used are derived from the patient’s own T cells. The process begins with leukapheresis, in which the patient’s T cells are collected; these T cells are then re-engineered to include the CAR gene. The patient’s own CAR T cells are thereby produced, expanded and subsequently infused back into the patient. The CAR T cell concept was first described in 1987 by Yoshihisa Kuwana and colleagues in Japan. Since then, successive generations of CAR T cells have been developed and trialled, leading to the FDA’s first two approvals of CAR T cell therapies (Wikipedia Contributors, 2024). This research avenue has only scratched the surface, with many groups now exploring the best collection methods and how best to stimulate the “fittest” T cells - the apex predators of immune cells.

A recent paper trialled CAR T cells as a second-line therapy for blood cancers previously treated with ibrutinib. The phase 2 TARMAC study used anti-CD19 CAR T cells to treat patients with relapsed mantle cell lymphoma (MCL) who had been exposed to ibrutinib, a drug that treats B cell cancers by targeting Bruton’s tyrosine kinase (BTK), found in B cells. The study showed that 80% of patients who had previous exposure to ibrutinib and were treated with CAR T cells as a second-line therapy achieved a complete response. Furthermore, at the 13-month follow-up, the 12-month progression-free survival rate was estimated at 75% and the overall survival rate at 100% (Minson et al., 2024)!

It is without a doubt that as humans, we are naturally curious creatures.
It is with this curiosity that we have journeyed through so many scientific breakthroughs and innovations. And within each special nook and cranny of countless fields of science, from flight to evolution, from vaccines to cancer treatments, there have been multitudes of discoveries. There is no doubt that the number of innovations will only continue to grow.

References
Bishop, B., & Edemekong, P. (2023). Choriocarcinoma. StatPearls.
Crowley, T., & Kyte, J. (2014). Section 1 - Purification and characterization of ferredoxin-NADP+ reductase from chloroplasts of S. oleracea. In Experiments in the Purification and Characterization of Enzymes (pp. 25–102).
Gianfaldoni, S., Gianfaldoni, R., Wollina, U., Lotti, J., Tchernev, G., & Lotti, T. (2017). An overview on radiotherapy: From its history to its current applications in dermatology. Open Access Macedonian Journal of Medical Sciences, 5(4), 521–525. https://doi.org/10.3889/oamjms.2017.122
Hanoodi, M., & Mittal, M. (2023). Methotrexate. StatPearls.
Leavy, O. (2016). The birth of monoclonal antibodies. Nature Immunology, 17(Suppl 1), S13. https://doi.org/10.1038/ni.3608
MedlinePlus. (2024). Choriocarcinoma. https://medlineplus.gov/ency/article/001496.htm#:~:text=Choriocarcinoma%20is%20a%20fast%2Dgrowing,pregnancy%20to%20feed%20the%20fetus
Minson, A., Hamad, N., Cheah, C. Y., Tam, C., Blombery, P., Westerman, D., Ritchie, D., Morgan, H., Holzwart, N., Lade, S., Anderson, M. A., Khot, A., Seymour, J. F., Robertson, M., Caldwell, I., Ryland, G., Saghebi, J., Sabahi, Z., Xie, J., Koldej, R., & Dickinson, M. (2024). CAR T cells and time-limited ibrutinib as treatment for relapsed/refractory mantle cell lymphoma: The phase 2 TARMAC study. Blood, 143(8), 673–684. https://doi.org/10.1182/blood.2023021306
Morena, M., & Gatti, R. (2011). A history of bone marrow transplantation. Haematology/Oncology Clinics, 21(1), 1–15.
Pavlasova, G., & Mraz, M. (2020). The regulation and function of CD20: An "enigma" of B-cell biology and targeted therapy. Haematologica, 105(6), 1494–1506. https://doi.org/10.3324/haematol.2019.243543
Pierpont, T. M., Limper, C. B., & Richards, K. L. (2018). Past, present, and future of rituximab: The world’s first oncology monoclonal antibody therapy. Frontiers in Oncology, 8, 163. https://doi.org/10.3389/fonc.2018.00163
Rocheleau, J. (2020). 50 famous firsts from science history. Stacker. https://stacker.com/environment/50-famous-firsts-science-history
Wikipedia contributors. (2024, October 6). CAR T cell. In Wikipedia, The Free Encyclopedia. Retrieved October 17, 2024, from https://en.wikipedia.org/w/index.php?title=CAR_T_cell&oldid=1249695600
- In conversation with Paul Beuchat
< Back to Issue 3 In conversation with Paul Beuchat By Renee Papaluca 10 September 2022 Edited by Zhiyou Low and Andrew Lim Illustrated by Ravon Chew

Paul is currently a postdoctoral teaching fellow in the Faculty of Engineering and Information Technology. In his spare time, he enjoys overnight hikes, fixing bikes, and rock climbing. Note: The following exchange has been edited and condensed.

What was the ‘lightbulb moment’ that prompted you to study science?
I often say that I chose engineering a little bit by not wanting to choose anything else. I think it also played into my strengths back in high school. I wasn't particularly into English, history or languages, but I really enjoyed physics, chemistry and maths. So, that already drew me to science broadly. What ended up directing me towards engineering, and particularly mechanical engineering, was just always tinkering at home. My dad was always tinkering and building things. We had a garage with all of the tools necessary, and I had free rein to pull things apart and put them back together. Mechanical engineering was a way of taking a more formal route into the hobby.

Why did you choose to pursue a research pathway?
After I finished my double degree in Science and Engineering, I got a job, which I enjoyed. It was fun working with a bigger team. In this case, it was an oil and gas company with some pretty big equipment involved. This wasn’t just tinkering with something little in the garage, but something on an industrial scale. At some stage, though, I felt like there was a bit missing. There was a research arm as part of the company, but that wasn't somewhere that I could get to. I was excited by the kind of work being done in that area, and I saw a PhD as a way of pursuing that love so that I could then work on those sorts of exciting things.

What advice would you give to students considering a research pathway?
Certainly, while I was doing my PhD, all the postdocs would say that the PhD was the best time of their life. Then the PhD students would say that the Masters was the best. So, be prepared for it to be hard. My advice is to be passionate about the topic and not be fearful of uncertainty or of not knowing the exact topic straightaway. Also, you will likely need a lot of support to get through the hard parts. It’s nice to have tangential input in the form of seminars, visiting academics from other institutions or even PhD students in the same group or department. This input brings you new knowledge, exciting new fields and new industry connections.

What sparked your love of teaching?
My original intention was to complete my PhD, gain the relevant skills and return to industry. My passion for teaching was sparked during my PhD, when I got to supervise Masters students who were working on a larger project with me. It was a close collaboration with someone, where you start the process of teaching them the topic. You work on it together, and eventually, the student becomes the master. They can then guide you along, and you have vibrant discussions together. That's what I find exciting about tertiary education more broadly - we are all pushing the limits of engineering to achieve better outcomes together.

What does your day-to-day life as a teaching fellow look like?
One of the focuses of my position was to include more project-based teaching, i.e. more hands-on education and work in the classroom, which was not included previously. I got the opportunity to create a new subject, and initially spent a lot of time developing what it was going to be. My day-to-day work included choosing new topics to add to the subject and linking them to a hands-on project, like a ground robot. There's a whole bunch of work that goes into designing a robot and the relevant software on top of preparing lecture slides and delivery—all these bits and pieces that make up a subject.
Scattered throughout all this is teaching research; the teaching team assesses the students, and I need to assess the teaching itself. For instance, I need to understand what is being attempted in a particular class, what we are intending to achieve and how this aligns with current best practice in education research.

What advice would you give to students considering academic teaching as a career?
One of the very nice things here at the University of Melbourne is the support teaching staff can receive through the Graduate Certificate in University Teaching. This gives you insight into and guidance on how to tackle the whole field. For instance, one of the lecturers mentioned that you have to be passionate about teaching because it has its ups and downs. Certainly, while developing a new subject, I found it to be quite stressful. It’s a different way of thinking, with all-new terminology, which is exciting and scary, and that took me a little bit by surprise. Where I shot myself in the foot the most was trying to do too much. I was in a very lucky position where I had free rein to make a subject as hands-on as possible, which opened the floodgates of possibilities. Prioritising was extremely important. It's not that you shouldn’t try new things, but trying too many exciting new ideas at the same time means they will probably all fail or take an exorbitant amount of time to implement properly. Being realistic about what I could deliver was important. Also, having a mentor or someone you can talk very openly with was helpful.

What are your future plans?
For now, my intention is to stay in teaching. I’d like to push this position to the limits of what I can achieve and see where it takes me. I can also imagine contributing to the curriculum redesign involved in shifting whole courses to project-based learning.
Current reports, such as those from the Council of Engineering Deans, are pushing for all engineering education to shift to project-based learning within the next five to ten years. I’d like to continue teaching, with a view to contributing to higher-level curriculum development.
- Understanding The Mysterious Science... | OmniSci Magazine
Understanding the Mysterious Science of Sleep By Evelyn Kiantoro Sleeping is just something we do at the end of the day, but why? It’s a daily routine we rarely question! Check out this article for a brief review of the current research out there on sleep and dreams. Edited by Katherine Tweedie, Juulke Castelijn & Niesha Baker Issue 1: September 24, 2021 Illustration by Casey Boswell

“Today I don’t feel like doing anything, I just wanna lay in my bed,” sings Bruno Mars in The Lazy Song. That is exactly what our inner narrative says every Monday morning, right? After a long weekend of partying or catching up on work, there is nothing worse than getting back into the weekday grind. All we want is an eternity of rest and sleep because – for the majority of us – sleep is a way to relax; it takes us away from the stressful reality of life.

However, our physical state when we sleep suggests that sleeping is not actually very safe. When we sleep, we are in a mysterious state; we lie down, vulnerable to predators and without any defence. To minimise the dangers of sleeping, humans built houses that provide warmth, shelter from the weather and protection from predators. But sleeping is seen in various other lifeforms, not just us humans – and species that live in the wild face conditions that are far more dangerous. Dreams are an even bigger mystery in the science of sleep; they do not seem to have any obvious benefits, and their purpose is largely unknown. However, as with everything that is passed on from generation to generation, sleep and dreams must confer a significant evolutionary advantage for our fitness and survival.

Because different species face different obstacles and routines, they sleep in different ways. Generally, predatory animals such as humans can sleep for long periods of time (1). Conversely, prey animals are constantly vigilant; instead of sleeping for a long time, they only rest for short periods (2).
A particularly interesting example is that of dolphins and seals, which have evolved to keep half of their brain “asleep” while the other half stays “awake” (3). This shows us that sleep really is important for survival: various organisms have evolved mechanisms to work around the obstacles to sleeping. So, the cost of sleeping must be worth it, right? The answer is “yes” – but scientists are unsure of exactly why.

Why do we sleep?
The various theories in the literature on the purpose of sleep can be broadly categorised into two groups: adaptive and restorative theories. One line of reasoning behind the adaptive theories proposes that creatures that are inactive at night have increased chances of survival due to a lower risk of injury (4). Another perspective suggests that humans sleep at night to conserve energy for the day, when it is more efficient to hunt for food (5). This theory is supported by the fact that human metabolism decreases by about 10 per cent during sleep (6). However, both theories were proposed in relation to our ancient lifestyle, when we needed to physically hunt for food. Looking at our present lifestyle, this reasoning may not be as applicable – but it is still embedded in our system.

Other theories explore the reasoning behind sleep from the perspective of restoration. The restorative theory speculates that sleep allows us to repair cellular components that were used throughout the day, as many important growth hormones are released during sleep (7). This theory is also supported by the most widely accepted explanation for why we sleep: that sleep is necessary for the growth and maintenance of the brain’s structure and function, and that it is crucial for optimising memory consolidation (8, 9). Sleep also affects other physiological processes, such as immune function, endocrine function, cardiovascular health and mood (10, 11, 12).
Sleep disorders are associated with cardiovascular disease, and sleep reportedly enhances immune defences against pathogens. The fact that there are various theories explaining why we sleep shows that there is no single perfect explanation.

Regardless of why we sleep, we still get into bed at the end of the day. This is mainly because of our circadian rhythm, which controls our desire for sleep. The circadian rhythm is controlled via the hypothalamus: an area at the centre of the brain that receives sensory inputs from various parts of the body, including from our eyes, which detect light levels (13). When we are exposed to high levels of light in the morning, the circadian rhythm promotes wakefulness (14). At night, when there is less exposure to light, the circadian rhythm promotes sleep through increased production of the sleep-regulating hormone melatonin (15).

Even though we have a central control system that regulates when we sleep, there is still large variation in sleeping time among humans; some people sleep for only five hours, and others for ten or more (16). Sleep duration is affected by factors such as the physical and social environment, diet, activity, body mass index, comorbidities and mental health (17). Beyond these lifestyle differences, some studies have shown that human sleep duration and timing are also influenced by genetic factors, alongside regulation by the circadian rhythm and brain activity (18). Currently, little is known about the specific genes and genetic mechanisms involved in sleep duration, and more research is being done in the area (19). Together, these factors could explain why people often feel sleepy throughout the day, as well as the variation in sleeping patterns across the population.
However, as is so often the case in science, no one specific factor is likely to be responsible for differences within the population – instead, a combination of these factors is.

The phases of sleep
Did you know that there are different kinds of sleep? All humans go through two different sleep phases: non-rapid eye movement (NREM) sleep and rapid eye movement (REM) sleep (20). NREM takes up approximately 75–80 per cent of our total sleep duration, whereas REM takes up 20–25 per cent (21). Sleep normally progresses from NREM stages 1–4 through to REM, and this cycle occurs four to five times each night (22) - for more details on sleep phases, check out Table 1! Most of the restorative processes in the body are believed to take place during NREM 3, as well as during REM. However, one particular question often stands out when it comes to sleep stages: when do we dream?

Dreams: what are they, anyway?
While there are some exceptions, it is widely believed that dreaming most frequently occurs during the REM stage of sleep (25). Some sleepers sometimes have difficulty distinguishing between reality and the dreaming state. This can be explained by the fact that we are consciously aware in dreams, and we often experience perception and emotion (26). Dreams are in fact richer than our waking consciousness – they can create scenarios that would be impossible in conscious reality (27). They are highly visual, contain sounds and are often an experience rather than a mere thought (28). Interestingly, the striking similarities between consciousness and dreams may indicate that dreams reflect the organisation and function of our brain (29)! Various evidence suggests that dreams are more likely to be a product of our imagination.
One argument states that blended characters and the bizarre properties of our dreams are more likely to be produced by our imaginations, as these are not something an individual would experience in the conscious state (30). Furthermore, the fact that dreams rarely contain smells or pain may be a result of us having difficulties imagining those sensations while awake (31). Looking at dreams as a higher form of our imagination may explain our uncertainty, poor recall, disconnection from the environment and lack of control over the situation while dreaming (32). However, it is interesting to keep in mind that our imagination is a result of the knowledge we already have. This knowledge is based on what we learn from our conscious reality, explaining why our dreams sometimes feel so realistic. An unsolved mystery Did you realise that sleep is one of the few activities you were not taught to do? As newborns, we only know how to digest and excrete food, breathe, show emotions and sleep. We digest food as an energy source; we excrete food to prevent the build-up of toxic substances; we breathe to supply our organs with oxygen; and we show emotions to communicate how we feel. So why is sleep one of these essential activities? And why is dreaming such a universal human experience? Despite extensive research, the answer remains buried in us like a secret in a mystery novel. This answer is not so far away – but unfortunately for us, it is not the type of book you can finish in a day. Instead, it is one with an infinite number of chapters. References: 1, 2. Purves, Dale, George J. Augustine, David Fitzpatrick, William C. Hall, Anthony-Samuel LaMantia, and Leonard E. White, Neuroscience (5th Edition). Sunderland, MA: Sinauer Associates, 2012, 627. 3. Siegel, Jerome M., “Do All Animals Sleep?”, Trends in Neurosciences 31, no. 4 (2008): 208-213. doi: 10.1016/j.tins.2008.02.001. 4. Siegel, Jerome M., “Sleep Viewed as a State of Adaptive Inactivity”, Nature Reviews 10, no. 
10 (2009): 747-753. doi: 10.1038/nrn2697. 5. Freiberg, Andrew S., “Why We Sleep: A Hypothesis for an Ultimate or Evolutionary Origin for Sleep and Other Physiological Rhythms,” Journal of Circadian Rhythms 18, no. 1 (2020): 1-5. doi: 10.5334/jcr.189. 6, 7, 8, 13, 15, 22, 23, 25. Brinkman, Joshua E., Vamsi Reddy, and Sandeep Sharma, Physiology of Sleep (Treasure Island, FL: StatPearls, 2021). 9. Rasch, Bjorn, and Jan Born, “About Sleep’s Role in Memory”, Physiological Reviews 93, no. 2 (2013): 681-766. doi: 10.1152/physrev.00032.2012. 10. Leproult, Rachel, and Eve Van Cauter, “Role of Sleep and Sleep Loss in Hormonal Release and Metabolism”, Endocrine Development 17 (2009): 11-21. doi: 10.1159/000262524. 11, 14, 24. Jawabri, Khalid H., and Avais Raja, Physiology, Sleep Patterns. Treasure Island, FL: StatPearls, 2021. 12. Ahmad, Adeel and S. Claudia Didia, “Effects of Sleep Duration on Cardiovascular Events,” Current Cardiology Reports 22, no. 4 (2020): 18. doi: 10.1007/s11886-020-1271-0. 16, 19. Keene, Alex C., and Erik R. Duboue, “The Origins and Evolution of Sleep,” Journal of Experimental Biology 221, no. 11 (2018): 1-14. doi: 10.1242/jeb.159533. 17. Billings, Martha E., Lauren Hale, and Dayna A. Johnson, “Physical and Social Environment Relationship with Sleep Health and Disorders,” Chest 157, no. 5 (2020): 1305-1308. doi: 10.1016/j.chest.2019.12.002. 18. Porkka-Heiskanen, T., “Sleep regulatory factors,” Italiennes de Biologie 152, no. 2-3 (2014): 57-65. doi: 10.12871/000298292014231. 20. Miyazaki, Shinichi, Chih-Yao Liu, and Yu Hayashi, “Sleep in Vertebrate and Invertebrate Animals, and Insights Into the Function and Evolution of Sleep,” Neuroscience Research 118 (2017): 3-12. doi: 10.1016/j.neures.2017.04.017. 21. Troynikov, Olga, Christopher G. Watson, and Nazia Nawaz, “Sleep Environments and Sleep Physiology,” Journal of Thermal Biology 78, (2018): 192-203, doi: 10.1016/j.jtherbio.2018.09.012. 26, 27. 
Hobson, Allan J., “REM Sleep and Dreaming: Towards a Theory of Protoconsciousness,” Nature Reviews Neuroscience 10 (2009): 803-813. doi: 10.1038/nrn2716. 28, 31, 32. Nir, Yuval, and Giulio Tononi, “Dreaming and the Brain: From Phenomenology to Neurophysiology,” Trends in Cognitive Sciences 14, no. 2 (2010): 88-100. doi: 10.1016/j.tics.2009.12.001. 30. Ichikawa, Jonathan, “Dreaming and Imagination,” Mind & Language 24, no. 1 (2009): 103-121. doi: 10.1111/j.1468-0017.2008.01355.x.
Fool Me Once by Julia Lockerd 1 July 2023 Edited by Tanya Kovacevic and Elijah McEvoy Illustrated by Sonia Santosa I have rabies. I’m absolutely sure of it. I mean, I can’t really tell, but that’s the silent killer, right? You don’t know you’re rabid till it’s all over, and you’re foaming at the mouth and biting your student tutor on the leg. Despite being completely safe here in Australia with its complete lack of rabies-having animals, I’m still pretty sure I’ve managed to catch it. Next week it will all be over for me and my tutor. Sorry, James. Of course, it’s not actually rabies that I’ve contracted, but a much more common condition: Medical Student Syndrome (1). Last week in my lectures, we learned all the ins, outs, and symptoms of the rabies virus. So, naturally, now we all have it. This health-related anxiety is a prime example of how our human brains can trick us into experiencing phantom symptoms. The same cognitive veil is used in clinical trials all over the world to test the efficacy of new drugs. We’ve all felt it. That moment when you question, ‘Is this real, or is my mind making it real?’ We call this the placebo effect. The placebo effect is crucial to modern and historical experimental design. The ‘trickable’ nature of the human mind has changed the course of drug development as we know it. The effect’s success hinges on a patient’s belief that they are receiving treatment for their ailment. The simple belief in a cure can often result in real physiological changes in an individual. This makes the placebo effect a very powerful tool in the development of new drugs for the market. In a placebo-controlled trial, half of the sample population is blindly given a placebo, and the other half the drug being tested. For a potential treatment to be considered effective, it must produce significantly better results than the placebo group (2). Even so, there is still room to improve how we design and test these hypotheses.
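The comparison described above can be sketched in a few lines of code. The following is a minimal, illustrative simulation only — the `simulate_trial` helper, the group size, and the effect sizes are all invented for the sketch, not drawn from the article or any real trial. The key point it demonstrates is that both arms improve (the placebo effect), so a drug is judged effective only if it outperforms the placebo arm, not merely if its recipients improve.

```python
import random
import statistics

def simulate_trial(n_per_arm=100, drug_effect=2.0, placebo_effect=1.0,
                   noise=1.5, seed=42):
    """Simulate pain-relief scores (arbitrary units) for a
    placebo-controlled trial with two blinded arms.

    Both arms report some improvement -- that is the placebo
    effect -- but the drug arm is given a larger true effect.
    """
    rng = random.Random(seed)
    placebo = [rng.gauss(placebo_effect, noise) for _ in range(n_per_arm)]
    drug = [rng.gauss(drug_effect, noise) for _ in range(n_per_arm)]
    return placebo, drug

placebo, drug = simulate_trial()
diff = statistics.mean(drug) - statistics.mean(placebo)

# The drug is only considered effective if it beats the placebo arm,
# not merely if its recipients improve relative to baseline.
print(f"placebo arm mean relief: {statistics.mean(placebo):.2f}")
print(f"drug arm mean relief:    {statistics.mean(drug):.2f}")
print(f"difference (drug - placebo): {diff:.2f}")
```

A real analysis would follow this with a significance test on the difference between arms; the sketch just makes the design visible: the placebo arm sets the bar the drug must clear.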
Can we use what we know about the placebo effect to make more accurate claims about modern pharmaceutical development? Well, in 2017, Dr. Sara Vambheim of the Arctic University of Norway published a study that considered the possible effects of differing sexual characteristics on placebo efficacy (3). This idea could restructure the way experiments are designed going forward and prompt a review of drugs already on the market. Is it possible that traditionally marginalised groups are underrepresented in the clinical trial process? Can we restructure experiments to be more inclusive? Are changes even really necessary? These questions were investigated by compiling placebo and nocebo effects on men and women across multiple previously conducted studies, mostly centred on physical pain and the administration of analgesia. The term ‘nocebo’ defines the antithesis of a placebo (4), referring to adverse side effects a subject feels when given an inert version of the test drug. While placebos tend to have an analgesic effect, nocebos often cause negative effects or emotions when subjects are told to expect them. Before discussing any of these questions, it is worth noting that the Norwegian study focuses solely on classic sexual differences between cisgender men and women. Though both keywords ‘gender’ and ‘sex’ were included in the study, research into the specific effects of gender identity and gender-affirming therapies on placebos had not been thoroughly conducted as of 2023.
It is with this focus that the following hypotheses are stated (3): “1) placebo responses would be stronger or more frequently observed in males than in females, 2) nocebo responses would be stronger or more frequently observed in females than in males, 3) verbally induced placebo responses would be more frequently observed in males than in females, and 4) conditioned nocebo responses would be more frequently observed in females than in males.” The results showed that there was indeed a significant correlation between sex and placebo/nocebo effects with respect to pain relief. But what is truly fascinating is that while men experienced stronger placebo effects, such as reduced symptoms and analgesia, women were more susceptible to hyperalgesia and negative emotions. Those supposed ‘side effects’ appear to weigh more heavily on women (3). What does this say about how men and women process pain and information? The Norwegian study discusses the role of ‘psychophysiological mechanisms’ in pain pathways. Or, more simply, how stress and anxiety can affect the pain the brain perceives. In 8 of the 12 studies, men experienced significantly stronger analgesic effects from the placebo than women (3). It is plausible that men react more strongly to pain induced by stress hormones. This would explain why, when taking a placebo, their anxiety levels would decrease and they would receive higher levels of analgesia than their female counterparts (3). Another study, upon which the Norwegian argument builds, investigates placebo delivery methods and their effect on perceived pain in men and women. In this study, men relied far more on verbal cues for analgesia, whereas women received a more significant effect from classical conditioning (5). These studies bring into question both the methodological and physiological effects of placebos on different sexes. What do these differences tell us about how men and women perceive the world?
And what does this mean for the future of the placebo? The point of all of these studies is not to show whether placebos are good or bad, reliable or unreliable, but to highlight the differences in the physiological and psychological links when looking at different groups of people. At its core, a placebo is simply a trick of the brain, a psychological mirage. While the basis and reliability of placebos can be debated at length, their effect on the human brain teaches us something about ourselves as a society. In all areas of medicine, the inclusion of people from all different backgrounds, genders, ethnicities, and ages is crucial so professionals know how to identify and treat various manifestations of a disease with grace and care. Now that I know James responds better to verbal cues, I’ll be sure to tell him he has rabies the next time I see him. References 1. Henning Schumann J. I contracted medical student syndrome. You probably will too. [Internet]. AAMC. [cited 2023 Jun 22]. Available from: https://www.aamc.org/news/i-contracted-medical-student-syndrome-you-probably-will-too 2. Harvard Health Publishing. The power of the placebo effect [Internet]. Harvard Health; 2021. Available from: https://www.health.harvard.edu/mental-health/the-power-of-the-placebo-effect 3. Vambheim S, Flaten MA. A systematic review of sex differences in the placebo and the nocebo effect. Journal of Pain Research. 2017;10:1831–9. 4. National Cancer Institute. Definition of nocebo effect [Internet]. www.cancer.gov. 2011. Available from: https://www.cancer.gov/publications/dictionaries/cancer-terms/def/nocebo-effect 5. Enck P, Klosterhalfen S. Does Sex/Gender Play a Role in Placebo and Nocebo Effects? Conflicting Evidence From Clinical Trials and Experimental Studies. Frontiers in Neuroscience. 2019 Mar 4;13.
Friend or Foe?: The Mechanisms Behind Facial Recognition by Mishen De Silva 3 June 2025 Edited by Luci Ackland Illustrated by Aisyah Mohammad Sulhanuddin Among the many mysteries that surround us lies a complex interaction right under our nose, or perhaps… right above it. In the labyrinth of human consciousness, we rely on the seemingly arbitrary judgements made from the combination of two eyes, a nose, and a mouth to discern who might be a friend or foe. Facial recognition gives a snapshot into the intricate dance between our perception and cognition, which allows us to cultivate a more detailed understanding of those around us, and their thoughts, feelings and emotions. In those fleeting moments when you recognise your parents in a sea of unfamiliar faces, spot your friends ensconced among the rows of the lecture theatre, or simply bump into an old friend in a crowd of strangers, your brain is able to identify faces in a fraction of a second, a remarkable feat of human cognitive capacity. But what enables us to distinguish one face from another? How do the faces of those we know stand out from the countless other noses, eyes and mouths we see? To understand what makes these interactions so meaningful, we need to take a closer look at the mechanisms behind facial recognition and decoding within the brain. The Brain’s Blueprint To be human is to seek meaning, even when none may exist. The mind has transformed what is two eyes above a nose, and a nose above a mouth, into its own pattern for classifying the identities and expressions we see around us. Many studies have suggested that facial processing is holistic, with the featural patterns of the eyes, nose and mouth perceived together as a whole rather than as individual parts (1,2). This mechanism of holistic facial processing explains the interesting phenomenon of pareidolia, where the brain projects the characteristics of human faces onto everyday objects.
It’s the reason why a bowling ball may appear surprised at a glance (3), or why some have sworn they have seen a face on Mars (4)! Figure 1. Bowling balls with surprised facial expressions! (3) In pursuit of meaning for the patterns around us, the brain has developed specialised regions for processing the features of a face to help us recognise individual identities. Facial processing operates through a hierarchical mechanism where distinct aspects of the face are interpreted by different regions of the brain. The unchanging elements of the face, such as gender, age, ethnicity and features related to someone’s identity, are analysed by the inferior occipital gyrus and fusiform face area (FFA), while the changing aspects, such as eye gaze, lip movements and facial expressions, are analysed by the superior temporal sulcus and orbitofrontal cortex (5,6). Of these face-selective regions, the FFA is particularly important for facial recognition as it helps us recognise who a person is (5). Through the activation of our FFA, simple patterns shift from meaningless shapes into familiar visages representing our friends, family, or even our own reflection. Studies have uncovered the importance of the FFA for facial recognition by examining what happens when this brain region malfunctions (7,8). A unique example of this is prosopagnosia, which impairs the ability to recognise faces. Its acquired form results from damage to the FFA in the right hemisphere of the brain, while its developmental form affects about 1 in 50 people (9). Imagine if every face you observed looked the same or unfamiliar… even your own reflection! It is through the brain and its specialised regions for facial recognition that we can appreciate the essence of human connection as a product of our neural hardware. These mechanisms responsible for transforming patterns into faces are the reason we can tell our neighbour from a stranger, a friend from a classmate, or our parents from a teacher.
Often overlooked amidst the fleeting and impermanent nature of our social interactions, this complex system guides us along the fragile line of human relationships, between familiarity and estrangement, a friend or foe. It highlights how deeply rooted our connection and sense of identity are in the faces we see. The Brain’s Threat Detection With each neuron, synapse and pathway, our brains are machines wired for connection, not just in how we think, but also in how we perceive and interact with our surroundings. From the brief exchange of smiles with a stranger, to the furtive glare from someone across the room, one of the hallmarks of our emotional understanding is the ability to decode the thoughts and intentions of others, even from the most subtle of expressions. In the vast and intricate web of neural connectivity, it can be difficult to isolate a singular brain region or connection to explain complex cognitive functions. Brain imaging studies have found a strong bidirectional link between the FFA and amygdala, making this a likely candidate for explaining our remarkable decoding ability (10,11). As the FFA picks up on who a person is or what facial expression is being made, it is the amygdala which then evaluates the emotional salience, or importance, of this face. The amygdala then signals back to the FFA to either increase or decrease the facial processing activity accordingly (10,12). Consider how the visibility of teeth in a bared expression can signal anger, the whiteness of someone’s eyes can hint at fear or surprise, and the shape of a person’s eyebrows can indicate the intensity of their emotion, all of which guide the brain to prioritise and interpret socially and emotionally relevant cues – almost like a survival filter! (13,14,15).
From an evolutionary perspective, the FFA-amygdala feedback loop serves as an important tool for rapidly and accurately interpreting the intentions of others, a central function in the architecture of our physical and social survival (16). The ability to recognise whether someone is a friend or foe has been a survival mechanism and evolutionary advantage for millennia. The role of our facial processing network, from the amygdala and FFA to the other brain regions discussed, offers a microcosm of our nature as social beings, and of the evolutionary changes that have enhanced our ability to sense, respond to, and connect with those around us (17). In this way, maybe the most profound mysteries lie not in distant galaxies or ancient ruins, but hidden in plain sight, within the faces we walk past every day. Our brain’s ability to read them is not merely a mechanism for decoding emotion, but a mirror into the nature of what it means to be human, where connection, trust, and survival have long been written in the expressions of those around us. References 1. Farah M, Wilson K, Drain M, Tanaka J. What is “special” about face perception?. Psychological Review [Internet]. 1998 Aug [cited 2025 May 14]; 105(3):482–98. Available from: https://pmc.ncbi.nlm.nih.gov/articles/PMC5561817/ 2. Richler J, Gauthier I. A meta-analysis and review of holistic face processing. Psychological Bulletin [Internet]. 2014 Sep [cited 2025 May 14]; 140(5):1281–302. Available from: https://pubmed.ncbi.nlm.nih.gov/24956123/ 3. What do you think these bowling balls saw to leave them so surprised & shocked?. Reddit [Internet]. 2022 [cited 2025 May 31]. Available from: https://www.reddit.com/r/Pareidolia/comments/zc12jo/what_do_you_think_these_bowling_balls_saw_to/#lightbox 4. Gilbert L. Why the brain is programmed to see faces in everyday objects. UNSW Sites [Internet]. 2020 Aug [cited 2025 May 14].
Available from: https://www.unsw.edu.au/newsroom/news/2020/08/why-brain-programmed-see-faces-everyday-objects 5. Kanwisher N, Yovel G. The fusiform face area: a cortical region specialized for the perception of faces. Philosophical Transactions of the Royal Society: Biological Sciences [Internet]. 2006 Dec 29 [cited 2025 May 14]; 361(1476):2109–28. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1857737/ 6. Zhen Z, Fang H, Liu J. The Hierarchical Brain Network for Face Recognition. Ptito M, editor. PLoS ONE [Internet]. 2013 Mar [cited 2025 May 14]; 8(3):e59886. Available from: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0059886 7. Hadjikhani N, de Gelder B. Neural basis of prosopagnosia: An fMRI study. Human Brain Mapping [Internet]. 2002 [cited 2025 May 14]; 16(3):176–82. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/hbm.10043 8. Sorger B, Goebel R, Schiltz C, Rossion B. Understanding the functional neuroanatomy of acquired prosopagnosia. NeuroImage [Internet]. 2007 Apr [cited 2025 May 14] ;35(2):836–52. Available from: https://www.sciencedirect.com/science/article/pii/S1053811906009906 9. Prosopagnosia | Psychology Today Australia [Internet]. www.psychologytoday.com . [cited 2025 May 14]. Available from: https://www.psychologytoday.com/au/basics/prosopagnosia 10. Herrington J, Taylor J, Grupe D, Curby K, Schultz R. Bidirectional communication between amygdala and fusiform gyrus during facial recognition. NeuroImage [Internet]. 2011 Jun [cited 2025 May 14]; 56(4):2348–55. Available from: https://pubmed.ncbi.nlm.nih.gov/21497657/ 11. Said C, Dotsch R, Todorov A. The amygdala and FFA track both social and non-social face dimensions. Neuropsychologia [Internet]. 2010 Oct [cited 2025 May 14]; 48(12): 3596–605. Available from: https://pubmed.ncbi.nlm.nih.gov/20727365/ 12. Šimić G, Tkalčić M, Vukić V, Mulc D, Španić E, Šagud M, et al. Understanding Emotions: Origins and Roles of the Amygdala. Biomolecules [Internet]. 
2021 May [cited 2025 May 14]; 11(6):823. Available from: https://pmc.ncbi.nlm.nih.gov/articles/PMC8228195/ 13. Jacobs R, Renken R, Aleman A, Cornelissen F. The amygdala, top-down effects, and selective attention to features. Neuroscience & Biobehavioral Reviews [Internet]. 2012 Oct [cited 2025 May 14]; 36(9):2069–84. Available from: https://pubmed.ncbi.nlm.nih.gov/22728112/ 14. Horstmann G, Lipp O, Becker S. Of toothy grins and angry snarls – Open mouth displays contribute to efficiency gains in search for emotional faces. Journal of Vision [Internet]. 2012 May [cited 2025 May 14]; 12(5):7. Available from: https://jov.arvojournals.org/article.aspx?articleid=2192034 15. Hasegawa H, Unuma H. Facial Features in Perceived Intensity of Schematic Facial Expressions. Perceptual and Motor Skills [Internet]. 2010 Feb [cited 2025 May 14]; 110(1):129–49. Available from: https://pubmed.ncbi.nlm.nih.gov/20391879/ 16. Schmidt K, Cohn J. Human facial expressions as adaptations: Evolutionary questions in facial expression research. American Journal of Physical Anthropology [Internet]. 2001 [cited 2025 May 14]; 116(S33):3–24. Available from: https://pubmed.ncbi.nlm.nih.gov/11786989/ 17. Carter E, Pelphrey K. Friend or foe? Brain systems involved in the perception of dynamic signals of menacing and friendly social approaches. Social Neuroscience [Internet]. 2008 Jun [cited 2025 May 14]; 3(2):151–63. Available from: https://pubmed.ncbi.nlm.nih.gov/18633856/