Search Results
- ISSUE 6 | OmniSci Magazine
Issue 6: Elemental
28 May 2024
This issue explores the building blocks that comprise the world we live in. Our talented writers braved the elements - have a read below!
Editorial by Ingrid Sefton & Rachel Ko - A word from our Editors-in-Chief.
Fire and Brimstone by Jesse Allen - The world has long been subject to the fury of fire and volcanic eruptions. Technology to predict seismic activity may allow us to tame this elemental force.
Hidden in Plain Sight: The dangerous chemicals in our everyday products by Kara Miwa-Dale - Drink bottles, tinned food, receipts: a recipe for disaster? Interviewing A/Prof Mark Green, Kara exposes the hidden dangers of endocrine disrupting chemicals.
A Frozen Odyssey: Shackleton's Trans-Antarctic Expedition by Ethan Bisogni - A pursuit of knowledge and a testament to survival, Ethan navigates the enthralling legacy of Sir Ernest Shackleton's Trans-Antarctic Expedition.
Everything, Everywhere, All at Once: The Art of Decomposition by Arwen Nguyen-Ngo - Arwen breaks down the intricacies of decomposition, leading us to consider the fundamental power not only in creation, but destruction.
Out of our element by Serenie Tsai - Following the industrial revolution, humankind has exploited and degraded the Earth's natural resources. Serenie shows how nature resists, maintaining the capacity to restore what humans have destroyed.
Cosmic Carbon Vs Artificial Intelligence by Gaurika Loomba - Carbon constitutes life and death, shaping conscious human existence. What threat could AI hold to the power of this element?
Proprioception: Our Invisible Sixth Sense by Ingrid Sefton - Our mysterious, yet omnipresent sixth sense - proprioception is the reason we know where our body and limbs are, even in the dark.
A Brief History of the Elements: Finding a Seat at the Periodic Table by Xenophon Papas - There's hydrogen and helium, then lithium, beryllium - or is there? The periodic table we know today was not always so, as Xen recounts.
- Issue 3 | OmniSci Magazine
Issue 3: Alien
10 September 2022
This issue is about exploring all things exotic, unfamiliar, unknown. Dive into the column and feature articles by our talented writers below!
Columns
The Body, Et Cetera - "Blink and you'll miss it": A Third Eyelid? by Rachel Ko - This article unpacks the fascinating evidence for evolution reflected within our very own eyes, connecting us to our reptilian ancestors.
Chatter - Belly bugs: the aliens that live in our gut by Lily McCann - In this issue we explore how microbes influence our health and emotions, and what this means for our concept of identity.
Humans of UniMelb - In conversation with Paul Beuchat by Renee Papaluca - I caught up with Paul Beuchat to learn more about his research journey and his potentially 'alien' methods of teaching.
Our Past, Present & Future - Waving Hello to the Aliens by Reah Shetty - Our interaction with the idea of aliens has evolved. The question is how far have we come, and how far will we go?
Science Books - Believing in aliens... A science? by Juulke Castelijn - I wasn't expecting to be persuaded of the existence of life beyond the confines of Earth.
Ethics in Science - The Ethics of Space Travel by Monica Blasioli - With research into the impacts of space travel only just beginning, can turning space travel into a monopoly truly be justified?
Wonders of the Landscape - Space exploration in Antarctica by Ashleigh Hallinan - What makes Antarctica special when it comes to meteorite discovery?
Science in the Age of Politics - Hope, Humanity and the Starry Night Sky by Andrew Lim - This second feature in the 'Science in the Age of Politics' series considers the importance of the stars, and scientific diplomacy, amidst rising global tensions.
Features
Death of the Scientific Hero by Clarisse Sawyer - How do we teach scientific history without promoting historical bigots?
Mighty Microscopic Warriors! by Gaurika Loomba - Equipped with a plethora of signalling chemicals and cells with different features, our heroic immune system fights wars daily without us realising it.
Love and Aliens by Gavin Choong - First Nations' perspectives are profound, and must be recognised by the Australian legal system.
Existing in an Alien World: Navigating Neurodiversity in a System Built for Someone Else by Hazel Theophania - Autism isn't some inscrutable mystery - we're people, and learning how we operate will help dismantle the barriers built up around us.
AI and a notion of 'artificial humanity' by Mia Horsfall - We still consider AI as other (or 'alien') to us, but ideal utility would be gained from toeing the precarious line between humanity and machine.
- Life Story of a Drug | OmniSci Magazine
Life Story of a Drug
by Elijah McEvoy
3 June 2025
Edited by Weilena Liu
Illustrated by Aisyah Mohammad Sulhanuddin
From the mythical visions of churchgoers who took mushrooms in the infamous 'Good Friday Experiment' to the extreme self-reflection of those 'tripping' off the traditional South American hallucinogenic tea Ayahuasca (1,2), humans have been painting the extraordinary narratives of psychedelics for thousands of years in thousands of settings. Put simply, psychedelics are a class of psychoactive drugs that can alter your thoughts and senses, inducing wild experiences not thought possible in your brain's ground state (3). One of the most famous of these drugs is LSD. 'Lucy in the Sky with Diamonds' is said to have inspired entire Beatles albums and shown Steve Jobs "that there's another side to the coin" of life (4,5). LSD is also a psychedelic that stands as an enigma in many regards. It is both naturally derived and synthetically created. It has been tested in psychological therapy and psychological warfare. Even the 'trips' experienced by its users entail both unexplainable hallucinations and scientifically proven phenomena. Though less well understood, the stories of LSD's enigmatic origins, uses and effects are just as interesting as those told by its users.
The Origins
Lysergic Acid Diethylamide (LSD), or 'acid' for short, is a semi-synthetic chemical compound with humble biological beginnings. LSD is derived from a class of alkaloid metabolite molecules that are naturally produced by the fungus commonly known as ergot. Ergot fungi are members of the parasitic genus Claviceps, which have been infecting staple crops and shaping society since long before acid came to distort shapes in the eyes of its users (6). Epidemics of ergotism, a disease caused by ingesting these ergot alkaloids in contaminated crops, swept across medieval Europe and led to the deaths of tens of thousands of people (7). Despite credible arguments to the contrary, some historians have even suggested that the Salem Witch Trials may have been sparked by a form of this disease known as convulsive ergotism. Not only were the environmental conditions in 1691 Salem reported to be optimal for ergot growth in the town's rye, but convulsive ergotism also induces distinct muscle contractions, paranoia and audiovisual hallucinations (8). These symptoms all would have lent credence to the claims of bewitchment made by the young girls who instigated the accusations of witchcraft in the town. Aside from death and dark magic, this fungus has also been used as an effective therapeutic across several eras of history. Its use as a medication for childbirth was recorded as early as 1100 BCE in China, with midwives using ergot or its alkaloids to reduce bleeding during birth, expedite delivery or induce an abortion (6,7). It wasn't until modern pharmacology advanced in the 20th century that scientists began to chemically characterise these ergot alkaloids and use them as the basis for potent drugs. The story of how LSD was first created and consumed is one that has been immortalised in history books and unofficial holidays. Dr Albert Hofmann, a Swiss biochemist working for the pharmaceutical company Sandoz, first synthesised LSD in 1938 as the 25th substance in a series of lysergic acid derivatives being evaluated by the company (9). Initial testing of this compound indicated it had no unique pharmacological uses beyond those of pre-existing drugs derived from ergot alkaloids (9).
However, Hofmann couldn't shake the nagging feeling that LSD-25 had more to offer. After making another batch of the compound five years later, Hofmann's suspicions grew stronger when he was forced to leave the lab early after entering a "dream-like state… [with] a kaleidoscope-like play of colours" (9). A few days later, in a moment that demonstrated both admirable scientific curiosity and a blatant rejection of OH&S, Hofmann took a large dose of LSD himself and settled in for the trip of a lifetime (9). Like all good scientists, he recorded his experience in a journal, writing at 3pm on 19 April 1943: "visual distortions, symptoms of paralysis, desire to laugh" (9). Hofmann's notes for the day stopped there.
The Uses
April 19th has come to be celebrated as 'Bicycle Day', commemorating the seemingly endless and surreal bike ride home Hofmann undertook after this self-experimentation. However, a wacky trip was not the only thing that followed this discovery. After Hofmann distributed the drug to his superiors to try for themselves, LSD was sold on the market by Sandoz under the name Delysid. This drug was employed by psychiatrists throughout the 1950s as a treatment for alcoholism, or simply as 'psychotherapy-in-a-pill' for patients suffering psychological trauma (10,11). LSD not only garnered therapeutic interest from scientists but also more nefarious intrigue from the CIA. Seeking an upper hand in mental warfare during the Cold War, the CIA bought up 40,000 doses of LSD from Sandoz and, under the MKUltra project, performed a variety of unethical experiments on unknowing prisoners, heroin addicts and even other CIA agents in an attempt to understand the drug's potential for 'mind control' (12). Moving into the 60s, LSD's use amongst budding leaders of the Hippie and Yippie movements gave the drug its countercultural status. Harvard Professor Timothy Leary, who was dismissed from his position due to experimenting (literally) with LSD, promoted the drug as an agent of revolution that allowed the youth of America to "turn on, tune in, drop out" (10) of repressive society. Due to its increasing association with these disruptive movements and its eventual outlawing by the US government in 1966 (11), acid's place in culture shifted out of labs and psychologists' offices and into illicit recreational use by experimental hippies and enlightened artists.
The Trip
Whether accompanied by an experienced monitor or listening to some soothing vinyl records by yourself, the experience of taking LSD is predictably unpredictable. 'Dropping acid' is unique in that mere micrograms of the drug are enough to elicit a palpable psychedelic experience (13), with most users diluting the dosage on tabs of blotting paper or sugar cubes (11). Following consumption, it takes as little as 1.5 hours for LSD to cross the blood-brain barrier, dilate the pupils and bring users to the peak intensity of the drug's psychological effects (13). The bizarre experiences perceived by those 'tripping' on LSD are rooted in a now well-characterised receptor binding interaction in the brain. The nitrogen-based chemical groups of the LSD molecule first anchor themselves within the 5-HT2A serotonin receptors found in the synapses of neurons (14). While the serotonin neurotransmitter typically helps regulate brain activities like mood and memory, LSD binding instead causes the activation of distinct intracellular cascades within these brain cells (3).
The importance of this interaction was demonstrated in experiments that proved blocking this receptor can cancel the acid trip altogether (3). Recent studies that have further characterised the chemical structure of this interaction have also shown that 5-HT2A forms a lid-like structure that locks LSD into this receptor protein's binding site and sets the user in for a long trip (14). From these individual cellular interactions, LSD ignites a burst of brain activity. Modern brain scanning technology has revealed that LSD first disrupts the capacity of the thalamus to filter and pass on sensory stimuli from the body to the cortex of the brain. Upon injection of LSD, patients' brains demonstrated both an overflow of information running between the thalamus and posterior cingulate cortex and a restriction of signals going to the temporal cortex (15). Not only does LSD modify the brain's ability to sort out important stimuli from the outside world, but this small molecule has also been found to temporarily form new connections between different parts of the brain. Hofmann's account of how "every sound generated a vividly changing image" (9) on the first Bicycle Day can be explained by the increased connectivity of the brain's visual cortex on LSD. This draws areas of the brain responsible for other senses or emotions into creating the images perceived in the user's head, producing visual hallucinations and geometric distortions that have no basis in real stimuli coming from the eyes (16). In contrast, Hofmann's feeling of being "outside [his] body" (9) likely came from decreased connectivity between the parahippocampus and retrosplenial cortex, two regions of the brain responsible for cognition. This severance has been correlated with the greater meaning that those tripping on LSD find in objects, events or music, along with their characteristic 'ego dissolution' (16). This is a phenomenon where users no longer see the world through the lens of their own 'self' and instead feel an increased sense of unity with everything around them (17). Very hippie ideas with a very scientific explanation.
The Comedown and Beyond
The float back down from the peak of an LSD trip takes up to 10 hours and leaves its users with a variety of stories and outcomes. Contrary to the fearmongering of parents and politicians, LSD does not leave holes in the brain, does not lead to addiction and has not directly led to the death of anyone as a result of overdose (3). While the risk of a 'bad trip', and the feelings of severe anxiety, fear and despair that come with it, may be traumatic, these are typically experienced when taking LSD in unsupportive environments without proper mental preparation (13). In fact, when LSD is taken in a manner closer to the controlled ritual practices surrounding psychedelics of old (3), acid is suggested to have long-lasting positive impacts on the user's attitude and personality (13). It is these experiences that have rejuvenated the field of LSD research after its abrupt stop in the 60s. Modern investigations have picked up where the scientists of that era left off and are evaluating the potential of LSD-assisted therapy to alleviate anxiety and depression. Studies have focused particular attention on addressing these mental health conditions in those suffering from life-threatening illnesses like cancer (18).
While some of these experiments lack the controls or data to support strong generalised conclusions, several studies have demonstrated that patients given LSD reported lasting decreases in anxiety surrounding their condition, greater responsiveness to their families and improved quality of life (3,18). All of this is not to promote LSD as a harmless wonder drug. While rare, LSD has been linked to Hallucinogen Persisting Perception Disorder, a condition in which people experience distressing 'flashbacks' to the effects and experiences of past psychedelic trips in a normal setting. Additionally, the changes in visual perception, emotion and thought while one is tripping can cause users to make reckless decisions in dangerous situations (18). However, continuing to wage war against controlled experiments and supervised therapeutic trials with LSD only limits scientists' attempts to better understand the balance between this drug's risks and benefits. While our trip through the life of LSD may end here, there is still much to explore. The greater story of how we use it, how we view it and how it fits into our society is far from over.
References
1. Illing S. The brutal mirror: what the psychedelic drug ayahuasca showed me about my life. Vox. 2018 [cited 2024 Oct 23]. Available from: https://www.vox.com/first-person/2018/2/19/16739386/ayahuasca-retreat-psychedelic-hallucination-meditation
2. Majić T, Schmidt TT, Gallinat J. Peak experiences and the afterglow phenomenon: When and how do therapeutic effects of hallucinogens depend on psychedelic experiences? J Psychopharmacol. 2015 Mar 1;29(3):241–53.
3. Nichols DE. Psychedelics. Barker EL, editor. Pharmacol Rev. 2016 Apr 1;68(2):264–355.
4. Gilmore M. Beatles' Acid Test: How LSD Opened the Door to "Revolver". Rolling Stone. 2016 [cited 2024 Oct 23]. Available from: https://www.rollingstone.com/feature/beatles-acid-test-how-lsd-opened-the-door-to-revolver-251417/
5. Hsu H. The Lingering Legacy of Psychedelia. The New Yorker. 2016 May 17 [cited 2024 Oct 23]. Available from: https://www.newyorker.com/books/page-turner/the-lingering-legacy-of-psychedelia
6. Haarmann T, Rolke Y, Giesbert S, Tudzynski P. Ergot: from witchcraft to biotechnology. Molecular Plant Pathology. 2009 Jul;10(4):563–77.
7. Schiff PLJ. Ergot and Its Alkaloids. American Journal of Pharmaceutical Education. 2006 Oct 15;70(5):98.
8. Woolf A. Witchcraft or Mycotoxin? The Salem Witch Trials. Journal of Toxicology: Clinical Toxicology. 2000 Jan;38(4):457–60.
9. Hofmann A. How LSD Originated. Journal of Psychedelic Drugs. 1979 Jan 1;11(1–2):53–60.
10. Massari P. A Long, Strange Trip. Harvard Griffin GSAS News. 2021 [cited 2024 Sep 28]. Available from: https://gsas.harvard.edu/news/long-strange-trip
11. Stork CM, Henriksen B. Lysergic Acid Diethylamide. In: Wexler P, editor. Encyclopedia of Toxicology (Third Edition). Oxford: Academic Press; 2014 [cited 2024 Sep 28]. p. 120–2. Available from: https://www.sciencedirect.com/science/article/pii/B9780123864543007442
12. Stuff You Should Know. Did the CIA test LSD on unsuspecting Americans? [podcast episode]. [cited 2024 Aug 25]. Available from: https://www.iheart.com/podcast/1119-stuff-you-should-know-26940277/episode/did-the-cia-test-lsd-on-29468397/
13. Passie T, Halpern JH, Stichtenoth DO, Emrich HM, Hintzen A. The Pharmacology of Lysergic Acid Diethylamide: A Review. CNS Neurosci Ther. 2008 Nov 11;14(4):295–314.
14. Wacker D, Wang S, McCorvy JD, Betz RM, Venkatakrishnan AJ, Levit A, et al. Crystal structure of an LSD-bound human serotonin receptor. Cell. 2017 Jan 26;168(3):377.
15. Sample I. Study shows how LSD interferes with brain's signalling. The Guardian. 2019 Jan 28 [cited 2024 Nov 10]. Available from: https://www.theguardian.com/science/2019/jan/28/study-shows-how-lsd-messes-with-brains-signalling
16. Carhart-Harris RL, Muthukumaraswamy S, Roseman L, Kaelen M, Droog W, Murphy K, et al. Neural correlates of the LSD experience revealed by multimodal neuroimaging. Proceedings of the National Academy of Sciences. 2016 Apr 26;113(17):4853–8.
17. Sample I. LSD's impact on the brain revealed in groundbreaking images. The Guardian. 2016 Apr 11 [cited 2024 Nov 10]. Available from: https://www.theguardian.com/science/2016/apr/11/lsd-impact-brain-revealed-groundbreaking-images
18. Liechti ME. Modern Clinical Research on LSD. Neuropsychopharmacology. 2017 Oct;42(11):2114–27.
- The Rise of The Planet of AI | OmniSci Magazine
The Rise of The Planet of AI
By Ashley Mamuko
When discussing AI, our minds instinctively leap to fears of sentience and robotic uprising. However, is our focus misplaced on the "inevitable" humanoid future when AI has become ubiquitous and undetectable in our lives?
Edited by Hamish Payne & Katherine Tweedie
Issue 1: September 24, 2021
Illustration by Aisyah Mohammad Sulhanuddin
On August 19th 2021, Tesla announced a bold project on its AI Day: the company plans to introduce humanoid robots for consumer use. These machines are expected to perform basic, mundane household tasks and streamline easily into our everyday lives, with prototypes expected to launch by 2022. With this new release, the future of AI seems to be closing in. No longer do we stand idle, expecting an inevitable humanoid-impacted future; it seems inevitable that our future will include AI. We have already familiarised ourselves with this emerging technology in the media we enjoy. Wall-E, Blade Runner, The Terminator and Ex Machina are only a few examples of the endless list of AI-related movies, spanning decades and detailing both our apprehension and our acceptance. Most of these movies portray machines as sentient yet intrinsically evil, bent on pursuing human destruction. But to further understand the growing field of AI, it is important to first briefly introduce its history before turning to the concerns played up in Hollywood blockbusters.
The first fundamental interpretations of artificial intelligence span a vast period of time. The earliest may be attributed to the Catalan poet and theologian Ramon Llull, whose work Ars generalis ultima (The Ultimate General Art), completed around 1308, advanced a paper-based mechanical process for creating new knowledge from combinations of concepts. Llull aimed to create a method of deducing logical religious and philosophical truths numerically. In 1642, French mathematician Blaise Pascal invented the first mechanical calculating machine: the first iteration of the modern calculator (1). The Pascaline, as it is now known, could only add or subtract values, using a dial-and-spoke system (2). Though these two early ideas do not match our modern perceptions of what AI is, they laid the foundation for pushing mechanical processes towards more than mere calculation; these two instances in history foreshadow the use of mechanical devices to perform human cognitive functions.
Not until the 1940s and early 1950s did we finally obtain the means for more complex data processing systems. With the introduction of computers, algorithms made the storing, computing and producing of data far more streamlined. In 1943, Warren McCulloch and Walter Pitts founded the idea of artificial neural networks in their paper "A Logical Calculus of the Ideas Immanent in Nervous Activity" (3). This presented the notion of computers behaving similarly to a human mind, seeding what would become the subfield of "deep learning". Alan Turing then proposed a test of a human's ability to differentiate between human and machine behaviour: in 1950, the Turing Test (originally called the Imitation Game) asked participants to identify whether the dialogue they were engaging in was with another person or a machine (4). Despite these breakthroughs, the term "artificial intelligence" wasn't coined until 1955, by John McCarthy.
Later on, McCarthy, along with many other budding experts, would hold the famous 1956 Dartmouth College workshop (5). This meetup of a few scientists would later be pinpointed in history as the birth of the AI field. As the field continued to grow, public concerns were raised alongside a boom of science fiction literature and movies. The notorious 1968 movie 2001: A Space Odyssey so shaped public perception of the field that, through the late 1960s and 1970s, an "AI winter" occurred: very little notable progress was made, as fear dried up funding (6). Then, after some time had passed and further advances were made in algorithmic technology, came the notable Deep Blue chess match against Garry Kasparov. In May 1997, the Deep Blue machine beat the world chess champion, an event some read as quietly ushering in a "decline in human society" at the hands of the machine.
Fast forward to now: AI has traversed leaps and bounds to achieve a much more sophisticated level of algorithms and machine learning techniques. To further understand the uses of AI, I interviewed Dr Liz Sonenberg, a professor in the School of Computing and Information Systems at The University of Melbourne and Pro Vice-Chancellor (Research Infrastructure and Systems) in Chancellery Research and Enterprise. She is an expert in the field with an extensive research record. "Machine learning is simply a sophisticated algorithm to detect patterns in data sets that has a basis in statistics." Such algorithms have been implemented across a variety of our daily tech encounters: AI sits behind the driving force of Google Maps and navigation, as well as voice control. It can easily be found anywhere. "Just because these examples do not exhibit super intelligence, does not mean they are not useful," Dr Sonenberg explains.
Dr Sonenberg suggests that the real problem with AI lies in its fairness. These "pattern generating algorithms" at times "learn from training sets not representative of the whole population, which can end up with biased answers." With a flawed training set, a flawed system is in place. This can be harmful to certain demographics and sway consumer habits. Nor does AI-aided advice come with explanations for its outcomes and decisions: algorithms can only mechanically produce an output, not explain it. As more high-stakes decisions are entrusted to the reliability of AI, the issue of flawed algorithms becomes more pronounced. Throughout my interview with Dr Sonenberg, not once were fears of super-intelligence, robot uprisings and the like brought up...
Armed with this new-found knowledge of AI's current concerns, I conducted another interview with Dr Tim Miller, a Professor of Computer Science in the School of Computing and Information Systems at The University of Melbourne, and Dr Jeannie Paterson, a Professor teaching subjects in law and emerging technologies in the School of Law at The University of Melbourne. Both are also Co-Directors of The Centre for Artificial Intelligence and Digital Ethics (CAIDE). As we began the interview, Dr Miller explained again that AI "is not magic" and implements the use of "math and statistics". Dr Paterson was quick to point out that anti-discrimination laws have long been in place, but as technology evolves and embeds itself further into the public domain, it must be scrutinised.
The deployment of AI can easily cause harm to people because the systems are not public, making sources of harm difficult to identify and causally attribute. With the prospect of biased algorithms, a real dilemma emerges. Dr Miller elaborated on the use of AI in medical imaging in private hospitals. As private hospitals tend to attract a certain echelon of society, the training set is not wholly representative of the greater population. "A dilemma occurs with racist algorithms… if it is not used [outcomes] could be worse." When the idea of a potential super-intelligent robot emerging in the future was brought into the conversation, the two didn't seem very impressed. "Don't attribute superhuman qualities [to it]," says Dr Paterson. Dr Miller states that the trajectory of AI's future is difficult to map. Past predictions of how AI's abilities would progress have come to pass, but much later than expected… easily decades later. The idea of super-intelligence also poses the question of how to define intelligence. "Intelligence is multidimensional, it has its limits," says Dr Miller. In this mystical future world of AI, a distinction is placed not just on "what will machines be able to do but what will [we] not have them do," states Dr Miller. "This regards anything that requires social interaction, creativity and leadership"; so the future is aided by AI, not dictated by it. However, in the nearer future, some very real concerns are posed: job security, influence on consumer habits, transparency, legal approaches and accountability are only a few. With more and more jobs being replaced by machines, every industry is at stake. "Anything repetitive can be automated," says Dr Miller. But this is not inherently negative, as more jobs will be created to further aid the use of AI. And not all functions of a job can be replaced by AI. Dr Paterson explains, with the example of radiology, that AI is able to diagnose and interpret scans, but a radiologist does more than just diagnose and interpret on a daily basis. "The AI is used to aid the already existing profession, not simply overtake it." Greater transparency is needed in showing how AI uses our data. "It shouldn't be used to collect data unlimitedly," says Dr Paterson, "is it doing what's being promised, is it discriminating [against] people, is it embedding inequality?" With this in mind, Dr Paterson suggests that more legal authorities should be educated on how to approach topics regarding AI. "There needs [to be] better explanation… [We] need to educate judges and lawyers." With the notorious Facebook-Cambridge Analytica scandal of 2018, the big question of accountability was raised. The scandal involved the unwarranted use of data from 87 million Facebook users by Cambridge Analytica, which served to support the Trump campaign. It brought to light how our data can be exploited nonconsensually and used to influence our behaviour, as this particular example seemed to sway the American presidential election. Simply put, our information can be easily exploited and sent off to data analytics firms to further influence our choices. This creates the defence that apps "merely provide a [service], but people use [these services] in that way," as said by Dr Miller. The blame thus becomes falsely shifted onto the users for the spread of misinformation.
The onus, however, should lie with social networking sites to give their users more transparency about how their data is used and stored, as well as adequate protection of that data. To be frank, the future of robotic humanoid AI integrating seamlessly into human livelihoods will not occur within our lifetimes, or potentially even our grandchildren's. The forecast seems at best unpredictable and at worst unattainable, given the complexity of what constitutes full "sentience". However, this does not mean that AI lies dormant within our lives. The fundamental technology, based in computing, statistics and information systems, lays the groundwork for most transactions we conduct online, whether monetary, social or otherwise. AI and its promises should not be shunted aside because of the misleading media surrounding its popularised definition and "robot uprisings", but rather taught more broadly to all audiences. So perhaps Elon Musk's fantastical ideas of robotic integration will not occur by 2022, but the presence of AI in modern technologies should not go unnoticed.
References:
1. "A Very Short History of Artificial Intelligence (AI)." 2016. Forbes. https://www.forbes.com/sites/gilpress/2016/12/30/a-very-short-history-of-artificial-intelligence-ai/?sh=38106456fba2.
2. "Blaise Pascal Invents a Calculator: The Pascaline." n.d. Jeremy Norman's HistoryofInformation.com. https://www.historyofinformation.com/detail.php?id=382.
3, 4, 6. "History of Artificial Intelligence." n.d. Council of Europe. https://www.coe.int/en/web/artificial-intelligence/history-of-ai.
5. Smith, Chris, Brian McGuire, Ting Huang, and Gary Yang. 2006. "The History of Artificial Intelligence." A file for a class called History of Computing offered at the University of Washington. https://courses.cs.washington.edu/courses/csep590/06au/projects/history-ai.pdf.
- Hiccups | OmniSci Magazine
Hiccups
Evolution might be a theory, but if it's evidence you're after, there's no need to look further than your own body. The human form is full of fascinating parts and functions that hold hidden histories - from the column that brought you a deep-dive into ear wiggling in Issue 1, here's an exploration of why we hiccup!
by Rachel Ko
10 December 2021
Edited by Katherine Tweedie and Ashleigh Hallinan
Illustrated by Gemma Van der Hurk
Hiccups bring a special brand of chaos to a day. It's one that lingers, rendering us helpless and in suspense; a subtle, internal chaos of quiet frustration that forces us to drop what we're doing to monitor each breath – in and out, in and out – until the moment they abruptly decide to stop. It's an experience we've all had – one that can hit anyone at any time – and for most of us, hiccups are a concentrated episode of inconvenience; best ignored, and overcome. Yet, despite our haste to get rid of them when they interrupt our day, hiccups seem to have mystified humans for generations. Historically, the phenomenon has been the source of many superstitions, both good and bad. A range of cultures associate them with the concept of remembrance: in Russia, hiccups mean someone is missing you (1), while an Indian myth suggests that someone is remembering you negatively for the evils you have committed (2). Likewise, in Ancient Greece, hiccups were a sign that you were being complained about (3), while in Hungary, they mean you are currently the subject of gossip. On a darker note, a Japanese superstition prophesies death to one who hiccups 100 times (4). Clearly, the need to justify everything, even things as trivial as hiccups, has always been an inherent human characteristic, transcending culture and time. As such, science has more recently made its attempt at objectively identifying a reason behind the strange phenomenon of hiccups. After all, if you take a step back and think about it, hiccups are indeed quite strange. Anatomically, hiccups (known scientifically as singultus) are involuntary spasms of the diaphragm (5): the dome-like sheet of muscle separating the chest and abdominal cavities (6). The inspiratory muscles, including the intercostal and neck muscles, also spasm, while the expiratory muscles are inhibited (7). These sudden contractions cause a rapid intake of air ("hic"), followed by the immediate closure of the glottis or vocal cords ("up") (8). As many of us have probably experienced, a range of stimuli can cause these involuntary contractions. The physical stimuli include anything that stretches and bloats the stomach (9), such as overeating, rapid food consumption and gulping, especially of carbonated drinks (10). Emotionally, intense feelings and our responses to them, such as laughing, sobbing, anxiety and excitement, can also be triggers (11). This list is not at all exhaustive; in fact, the range of stimuli is so large that hiccups might be considered the common thread between a drunk man, a Parkinson's disease patient and anyone who watches The Notebook. The one thing that alcohol (12), some neurological drugs (13) and intense sobbing (14) do have in common is that they exogenously stimulate the hiccup reflex arc (15). This arc involves the vagal and phrenic nerves that stretch from the brainstem to the abdomen, which cause the diaphragm to contract involuntarily (16).
According to Professor Georg Petroianu from the Herbert Wertheim College of Medicine (17), many familiar home remedies for hiccupping – being scared, swallowing ice, drinking water upside down – interrupt this reflex arc, actually giving these solutions a somewhat scientific rationale. While modern research has successfully mapped out the process of hiccups, their purpose is still unclear. As of now, the hiccup reflex arc and the resulting diaphragmatic spasms seem to be effectively useless. Of the existing theories for the function of hiccups, the most prominent seems to be that they are a remnant of our evolutionary development (18), essentially 'vestigial': in this case, a feature that once served our amphibian ancestors millions of years ago, but now retains little of its original function (19). In particular, hiccups are believed to be a relic of the ancient transition of organisms from water to land (20). When early fish lived in stagnant waters with little oxygen, they developed lungs to take advantage of the air overhead, in addition to using gills while underwater (21). In this system, inhalation would allow water to move over the gills, during which a rapid closure of the glottis – which we see now in hiccupping – would prevent water from entering the lungs. It is theorised that when descendants of these fish moved onto land, gills were lost, but the neural circuit for this glottis-closing mechanism was retained (22). This neural circuit is indeed observable in human beings today, in the form of the hiccup central pattern generator (CPG) (23). CPGs exist for other oscillating actions like breathing and walking (24), but a particular cross-species CPG stands out as a link to human hiccupping: the neural CPG that is also used by tadpoles for gill ventilation. Tadpoles "breathe" in a recurring, rhythmic pattern that shares a fundamental feature with hiccups: both involve inspiration with closing of the glottis (25). This phenomenon strengthens the idea that the hiccup CPG may be left over from a previous stage in evolution and has been retained in both humans and frogs. However, the CPG in frogs is still used for ventilation, while in humans, the evolution of lungs to replace gills has rendered it useless (26). Based on this information, it seems hiccupping lost its function with time and the development of the human lungs, remaining as nothing more than an evolutionary remnant. However, we cannot discredit hiccupping as having become entirely useless as soon as gills were lost. Interestingly, hiccupping has only been observed in mammals – not in birds, lizards or other air-breathing animals (27). This suggests that there must have been some evolutionary advantage to hiccupping at some point, at least in mammals. A popular theory for this function stems from the uniquely mammalian trait of nursing (28). Considering that human babies hiccup in the womb even before birth, this theory considers hiccupping to be almost a glorified burp, intended to remove air from the stomach. This becomes particularly advantageous when closing the glottis prevents milk from entering the lungs, aiding the act of nursing (29). Today, we reduce hiccups to the disorder and disarray they bring to our day. But, next time you are hit with a bout of hiccups, take a second to find some calm amidst the chaos and appreciate yet another fascinating evolutionary fossil, before you hurry to dismiss them.
After that, feel free to eat those lemons or gargle that salty water to your diaphragm's content.
References
1. Sonya Vatomsky, "7 Cures For Hiccups From World Folklore," Mentalfloss.com, 2017, https://www.mentalfloss.com/article/500937/7-cures-hiccups-world-folklore.
2. Derek Lue, "Indian Superstition: Hiccups | Dartmouth Folklore Archive," Journeys.Dartmouth.Edu, 2018, https://journeys.dartmouth.edu/folklorearchive/2018/11/14/indian-superstition-hiccups/.
3. Vatomsky, "7 Cures For Hiccups From World Folklore".
4. James Mundy, "10 Most Interesting Superstitions In Japanese Culture | Insidejapan Tours," Insidejapan Blog, 2013, https://www.insidejapantours.com/blog/2013/07/08/10-most-interesting-superstitions-in-japanese-culture/.
5. Paul Rousseau, "Hiccups," Southern Medical Journal 88, no. 2 (1995): 175-181, doi:10.1097/00007611-199502000-00002.
6. Bruno Bordoni and Emiliano Zanier, "Anatomic Connections Of The Diaphragm: Influence Of Respiration On The Body System," Journal Of Multidisciplinary Healthcare 6 (2013): 281, doi:10.2147/jmdh.s45443.
7. Christian Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," Bioessays 25, no. 2 (2003): 182-188, doi:10.1002/bies.10224.
8. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188.
9. John Cameron, "Why Do We Hiccup?," filmed for TedEd, 2016, TED video, https://ed.ted.com/lessons/why-do-we-hiccup-john-cameron#watch.
10. Monika Steger, Markus Schneemann, and Mark Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," Alimentary Pharmacology & Therapeutics 42, no. 9 (2015): 1037-1050, doi:10.1111/apt.13374.
11. Lien-Fu Lin and Pi-Teh Huang, "An Uncommon Cause Of Hiccups: Sarcoidosis Presenting Solely As Hiccups," Journal Of The Chinese Medical Association 73, no. 12 (2010): 647-650, doi:10.1016/s1726-4901(10)70141-6.
12. Steger, Schneemann and Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," 1037-1050.
13. Unax Lertxundi et al., "Hiccups In Parkinson's Disease: An Analysis Of Cases Reported In The European Pharmacovigilance Database And A Review Of The Literature," European Journal Of Clinical Pharmacology 73, no. 9 (2017): 1159-1164, doi:10.1007/s00228-017-2275-6.
14. Lin and Huang, "An Uncommon Cause Of Hiccups: Sarcoidosis Presenting Solely As Hiccups," 647-650.
15. Peter J. Kahrilas and Guoxiang Shi, "Why Do We Hiccup?," Gut 41, no. 5 (1997): 712-713, doi:10.1136/gut.41.5.712.
16. Steger, Schneemann and Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," 1037-1050.
17. Georg A. Petroianu, "Treatment Of Hiccup By Vagal Maneuvers," Journal Of The History Of The Neurosciences 24, no. 2 (2014): 123-136, doi:10.1080/0964704x.2014.897133.
18. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188.
19. Cameron, "Why Do We Hiccup?"
20. Michael Mosley, "Anatomical Clues To Human Evolution From Fish," BBC News, published 2011, https://www.bbc.com/news/health-13278255.
21. Michael Hedrick and Stephen Katz, "Control Of Breathing In Primitive Fishes," Phylogeny, Anatomy And Physiology Of Ancient Fishes (2015): 179-200, doi:10.1201/b18798-9.
22. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188.
23. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188.
24. Pierre A. Guertin, "Central Pattern Generator For Locomotion: Anatomical, Physiological, And Pathophysiological Considerations," Frontiers In Neurology 3 (2013), doi:10.3389/fneur.2012.00183.
25. Hedrick and Katz, "Control Of Breathing In Primitive Fishes," 179-200.
26. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188.
27. Daniel Howes, "Hiccups: A New Explanation For The Mysterious Reflex," Bioessays 34, no. 6 (2012): 451-453, doi:10.1002/bies.201100194.
28. Howes, "Hiccups: A New Explanation For The Mysterious Reflex," 451-453.
29. Howes, "Hiccups: A New Explanation For The Mysterious Reflex," 451-453.
- Wicked Invaders of the Wild | OmniSci Magazine
Wicked Invaders of the Wild
Serenie Tsai
24 October 2023
Edited by Krisha Darji
Illustrated by Jennifer Nguyen
Since the beginning of time, there has been a continuous flow of species in and out of regions that establishes a foundation for ecosystems. When species are introduced into new environments and replicate excessively enough to interfere with native species, they become invasive: species that spread into new areas and pose a threat to those already there. Factors contributing to their menacing status include overfeeding on native species, a lack of predators, and outcompeting native species (Sakai et al., 2001). Invasive species shouldn't be confused with feral species, which are domestic animals that have reverted to their wild state, or pests, which are organisms harmful to human activity (Contreras-Abarca et al., 2022; Hill, 1987). Furthermore, not all introduced species are invasive; crops such as wheat, tomato and rice have been integrated into native agriculture successfully. Many species were introduced accidentally and turned invasive; however, some were intentionally introduced to manage other species, and a lack of foresight resulted in detrimental ecological impacts. Each year, invasive species cost the global economy over a trillion dollars in damages (Roth, 2019).
Claimed ecological benefits of invasive species
Contrary to the name, invasive species can potentially benefit the invaded ecosystem. Herbivores can reap the benefits of the introduced biodiversity, and native plants can increase their tolerance (Brändle et al., 2008; Müller-Schärer, 2004). Deer and goats aid in suppressing introduced grasses and inhibiting wildfires (Fornoni, 2010). Likewise, species such as foxes and cats have the capacity to regulate the numbers of rats and rabbits. Furthermore, megafaunal extinction has opened opportunities to fill empty niches; for example, camels could fill the ecological niche of a now-extinct giant marsupial (Chew & Chew, 1965; Weber, 2013). Thus, studies indicate the possibility of species evolving to fill vacant niches (Meachen et al., 2014). Below, I'll explore the rise and downfall of invasive species in Australia.
Cane toad
Cane toads are notorious for their unforeseen invasion. Originally introduced in 1935 as a biological control for cane beetles, their newcomer status was advantageous to their proliferation and dominance over native species (Freeland & Martin, 1985). Several native predators were devastated, as native fauna in Australia lacked resistance to the poison the cane toad uses as a defence mechanism (Smith & Phillips, 2006). However, research suggests native predators are evolving adaptations to this poison (Phillips & Shine, 2006). There is no universal method of regulating cane toads, so efforts to completely eradicate them are futile; however, populations are kept low by continuously monitoring areas and targeting cane toad eggs or their adult form.
Common Myna
The origins of the Common Myna introduced into New South Wales and Victoria are uncertain; however, it was introduced into Northern Queensland to predate on grasshoppers and cane beetles (Cayley & Lindsey, 2011), and into Mauritius to control locust plagues (Bauer, 2023). The Common Myna poses an alarming threat to ecosystems and humankind; its severity is underscored by its position on the list of the world's 100 worst invasive species (Lowe et al., 2000).
It has spurred human health concerns, including the spread of mites, and acts as a vector for diseases destructive to humans and farm stock (Tidemann, 1998). The Myna also has a vicious habit of fostering competition with cavity-nesting native birds, forcing them and their eggs from their nests; however, the extent of this is unclear, and the influence of habitat destruction needs to be considered (Grarock et al., 2013). The impact of this bird lacks empirical evidence, so appropriate management is undecided (Grarock et al., 2012). However, modification of habitats could be advantageous, as Mynas impact urban areas more, whereas intervening in their food resources would be rendered useless by their highly variable diet (Brochier et al., 2010).
Zebra mussels
Zebra mussels accidentally invaded Australian waters when introduced via the ballast water of cargo ships. From an ecological perspective, zebra mussels overgrow the shells of native molluscs and create an imbalance within the ecosystem (Dzierżyńska-Białończyk et al., 2018). From a societal perspective, they colonise docks, ship hulls and water pipes, and damage power plants (Lovell et al., 2006). Methods of controlling the spread of zebra mussels include manual removal, chlorine, thermal treatment and more.
Control methods
It is crucial to deploy preventative methods to mitigate the spread of invasive species before it becomes irreversible. A few known control methods are employed for certain types of animals, but with no guarantee of success. Some places put bounties on catching the animals; however, the results of this technique are conflicting. In 1893, foxes were the target of financial incentives, but the scheme was deemed ineffective (Saunders et al., 2010). Government bounties introduced for Tasmanian tigers in 1888, by contrast, caused a drastic population decline and their eventual extinction (National Museum of Australia, 2019). Similarly, when the prevalence of cane toads became unbearable, armies were deployed and fences in rural communities were funded in response. Moreover, in 2007, inspired by a local pub's scheme to hand out beers in exchange for cane toads, the government staged a "Toad Day Out" to establish a bounty for cane toads (Williams, 2011). Invasive species are detrimental to ecosystems; whether they were introduced intentionally or by accident, their management is still a work in progress.
References
Bauer, I. L. (2023). The oral repellent – science fiction or common sense? Insects, vector-borne diseases, failing strategies, and a bold proposition. Tropical Diseases, Travel Medicine and Vaccines, 9(1), 7.
Brändle, M., Kühn, I., Klotz, S., Belle, C., & Brandl, R. (2008). Species richness of herbivores on exotic host plants increases with time since introduction of the host. Diversity and Distributions, 14(6), 905–912. https://doi.org/10.1111/j.1472-4642.2008.00511.x
Brochier, B., Vangeluwe, D., & Van den Berg, T. (2010). Alien invasive birds. Revue Scientifique et Technique, 29(2), 217.
Cayley, N. W., & Lindsey, T. (2011). What bird is that?: A completely revised and updated edition of the classic Australian ornithological work. Walsh Bay, N.S.W.: Australia's Heritage Publishing.
Chew, R. M., & Chew, A. E. (1965). The primary productivity of a desert-shrub (Larrea tridentata) community. Ecological Monographs, 35(4), 355–375. https://doi.org/10.2307/1942146
Contreras-Abarca, R., Crespin, S. J., Moreira-Arce, D., & Simonetti, J. A. (2022). Redefining feral dogs in biodiversity conservation. Biological Conservation, 265, 109434. https://doi.org/10.1016/j.biocon.2021.109434
Fornoni, J. (2010). Ecological and evolutionary implications of plant tolerance to herbivory. Functional Ecology, 25(2), 399–407. https://doi.org/10.1111/j.1365-2435.2010.01805.x
Freeland, W. J., & Martin, K. C. (1985). The rate of range expansion by Bufo marinus in Northern Australia, 1980-84. Wildlife Research, 12(3), 555-559.
Grarock, K., Lindenmayer, D. B., Wood, J. T., & Tidemann, C. R. (2013). Does human-induced habitat modification influence the impact of introduced species? A case study on cavity-nesting by the introduced common myna (Acridotheres tristis) and two Australian native parrots. Environmental Management, 52, 958-970.
Hill, D. S. (1987). Agricultural insect pests of temperate regions and their control. CUP Archive. https://books.google.com.au/books?hl=en&lr=&id=3-w8AAAAIAAJ&oi=fnd&pg=PA27&dq=pests+definition&ots=90_-WiF_MZ&sig=pKxuVjDJ_bZ3iNMb5TpfXA16ENI#v=onepage&q=pests%20definition&f=false
Lovell, S. J., Stone, S. F., & Fernandez, L. (2006). The economic impacts of aquatic invasive species: A review of the literature. Agricultural and Resource Economics Review, 35(1), 195–208. https://doi.org/10.1017/s1068280500010157
Lowe, S., Browne, M., Boudjelas, S., & De Poorter, M. (2000). 100 of the world's worst invasive alien species: A selection from the Global Invasive Species Database. The Invasive Species Specialist Group (ISSG).
Meachen, J. A., Janowicz, A. C., Avery, J. E., & Sadleir, R. W. (2014). Ecological changes in coyotes (Canis latrans) in response to the Ice Age megafaunal extinctions. PLoS ONE, 9(12), e116041. https://doi.org/10.1371/journal.pone.0116041
Müller-Schärer, H. (2004). Evolution in invasive plants: implications for biological control. Trends in Ecology & Evolution, 19(8), 417–422. https://doi.org/10.1016/j.tree.2004.05.010
ANU. (n.d.). Myna problems. Fennerschool-associated.anu.edu.au. http://fennerschool-associated.anu.edu.au//myna/problem.html
National Museum of Australia. (2019). Extinction of thylacine. Nma.gov.au. https://www.nma.gov.au/defining-moments/resources/extinction-of-thylacine
Phillips, B. L., & Shine, R. (2006). An invasive species induces rapid adaptive change in a native predator: cane toads and black snakes in Australia. Proceedings of the Royal Society B: Biological Sciences, 273(1593), 1545–1550. https://doi.org/10.1098/rspb.2006.3479
Roth, A. (2019, July 3). Why you should never release exotic pets into the wild. National Geographic. https://www.nationalgeographic.com/animals/article/exotic-pets-become-invasive-species
Sakai, A. K., Allendorf, F. W., Holt, J. S., Lodge, D. M., Molofsky, J., With, K. A., Baughman, S., Cabin, R. J., Cohen, J. E., Ellstrand, N. C., McCauley, D. E., O'Neil, P., Parker, I. M., Thompson, J. N., & Weller, S. G. (2001). The population biology of invasive species. Annual Review of Ecology and Systematics, 32(1), 305–332. https://doi.org/10.1146/annurev.ecolsys.32.081501.114037
Saunders, G. R., Gentle, M. N., & Dickman, C. R. (2010). The impacts and management of foxes (Vulpes vulpes) in Australia. Mammal Review, 40(3), 181-211.
Smith, J. G., & Phillips, B. L. (2006). Toxic tucker: the potential impact of cane toads on Australian reptiles. Pacific Conservation Biology, 12(1), 40. https://doi.org/10.1071/pc060040
Weber, L. (2013). Plants that miss the megafauna. Wildlife Australia, 50(3), 22–25. https://search.informit.org/doi/10.3316/ielapa.555395530308043
Williams, G. (2011). 100 Alien Invaders. Bradt Travel Guides. https://books.google.com.au/books?hl=en&lr=&id=qtS9TksHmOUC&oi=fnd&pg=PP1&dq=invasive+species+australia+bounty+
- Knot Theory and Its Applications. Why Knot? | OmniSci Magazine
Knot Theory and Its Applications. Why Knot?
by Ryan Rud
28 October 2025
Illustrated by Saraf Ishmam
Edited by Elijah McEvoy
Knot theory is a theoretical field of mathematics in which your brain pictures an imaginary knot and manipulates it to your heart's desire. Yes, the kind of knot you are probably thinking of now: it might be a shoelace, a knot in a piece of string or some utility knot. Good job, but it's missing one detail: the knot needs to be tied at its ends. Think of this as a string with both ends tied together so that it can't come undone when you play with it. Now you can pull at and twist this knot, as long as you don't break it. Congratulations, you now understand the basics of knot theory. (1)
So why should we care about a niche field of maths that you will probably never use in your everyday life? Well, the first answer to that is simply 'for the love of the game'. For some people, problem-solving is an endless endeavour that satisfies an urge to understand and be intellectually stimulated. But that's not for everyone. So then we remember all the times when random elements of pure mathematics became essential when applied to seemingly unrelated topics: how number theory came to be applied to information transmission, cryptography and computing (2); how quaternions made for more efficient digital transformations in computer science (3); or how graph theory was used to strongly conjecture that any two people have six degrees of separation between them (4). Although we may not routinely ponder these discoveries, it is because of the work of pure mathematicians that we can admire certain facts we could not otherwise prove, and appreciate how they silently helped to build all the digital devices in your home.
But before we get into the applications, it is good to be familiar with some general terminology. That knot which you pictured earlier with its ends tied is called a standard knot. In 1867, Lord Kelvin proposed the revolutionary idea that what we know as elements - the ones made of protons and neutrons - are actually types of standard knots. (5) He wasn't right, but it inspired his assistant Peter Guthrie Tait to begin the rigorous study of knots, and we have been trying to find applications ever since. Here are the first knots in the greater sequence of the periodic table of knots (see cover image for more!):
Figure 1. An ordered table of the first 15 prime knots. (6)
There are knots made from one piece of string (prime knots) and knots made from multiple knots joined end-to-end (composite knots) (Fig. 2b). There are also links, where two closed knots are combined without gluing the string (Fig. 2a). Understanding any further implications of this terminology is not necessary here, but it may help to have a visual understanding of it for the next part.
Figure 2. a) Showcasing types of mathematical links: unlink on the left, Hopf link in the centre and Whitehead link on the right. b) Demonstrating how two prime knots are combined into a composite knot. c) Demonstrating chirality in trefoil knots; notice the overlapping pattern.
Lastly, like many things in mathematics, we need a way to systematically and efficiently describe how we manipulate knots. Luckily, Kurt Reidemeister had the pleasure of providing us with a knot-manipulating moveset in the 1930s through rigorous proofs. These are the legal set of moves that can be performed on a knot without changing the knot's structure.
If, instead, we were to cut the knot, twist or untwist the string, and then reattach the ends, that is called a crossing switch, and it does change the knot. Again, this is not an exhaustive course, but it helps to know the terminology and to be able to visualise it. Feel free to dig into the details of these topics using the references below!

Figure 3. A depiction of the Reidemeister moves.

DNA and knot theory

Deoxyribonucleic acid (DNA) is the most important and relevant knotting molecule. Each cell nucleus contains a strand of DNA on the order of a million times longer than the nucleus itself, constantly knotting, coiling and compressing to fit into that tight space. The neatest application of knot theory, though, is to the closed, circular DNA of bacteria. During DNA replication, unwinding the DNA at one point creates immense torsional strain elsewhere on the loop: enough supercoiling to stall replication and kill the cell. To counter this, bacteria use an enzyme known as type II topoisomerase, which makes a double-stranded cut in the DNA, rearranges the tangle and reconnects the strands - a crossing switch! Without this adaptation, all cellular life would have evolved differently.

If you handed this DNA to a mathematician and asked where the enzyme should cut to best untangle it, they could spend a lifetime performing Reidemeister moves and contemplating, never knowing where or how many cuts to make. In contrast to the world's best mathematicians, topoisomerase is incredibly efficient in where it cuts. We have yet to understand what mechanism allows such accurate cuts, but practical research into topoisomerase could help knot theorists crack the stubbornly difficult question of the minimum number of crossing switches needed to simplify any knot. Furthermore, if the mechanisms of topoisomerases in bacteria and humans can be understood, humanity gains a new form of control over DNA: topoisomerases have been speculated as a way to inhibit cancer growth, or as a revolutionary treatment for bacterial disease. We do not have that knowledge yet, but this is one of the ways knot theory could become integral to the applied sciences; given time and research funding, it may well prove itself useful. (7-8) A toy model of the crossing switch is sketched below.
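To make the crossing switch concrete, here is a deliberately tiny Python sketch - my own illustration, not something from the article or its references. It models a knot diagram as nothing more than a list of crossing signs, so it cannot tell whether a diagram is the unknot in general; it only shows what the enzyme's cut-swap-reseal operation does in knot-theoretic terms:

    def crossing_switch(crossings, i):
        """Pass the strand at crossing i through itself: over becomes under."""
        switched = list(crossings)
        switched[i] = -switched[i]  # flip the sign of one crossing
        return switched

    # The standard diagram of the right-handed trefoil has three positive crossings.
    trefoil = [+1, +1, +1]
    print(crossing_switch(trefoil, 0))  # [-1, 1, 1]
    # After this single switch, the diagram can be undone by Reidemeister moves
    # alone, which is why the trefoil's unknotting number is 1.

Topoisomerase performs exactly this operation on physical DNA; the open question is how it 'chooses' the index i so efficiently.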
Knots in chemistry

So what other molecules can form knots? Chemists have been creating molecules that realise the basic knots and links since the 1960s (see Fig. 4), when topological isomerism was discovered and characterised. Topological isomers are chemicals that share many properties but differ in spatial arrangement. We can think of it like chirality for knots (see Fig. 2c), chirality being the property of an object not being the same as its mirror image, like a right and a left hand. These molecules were made through a technique called 'templating', where a metal ion or some other template structure is used to steer the reactants towards the desired product. There is also another category of knot called a ravel (Fig. 4h), in which multiple strings are joined at vertices. Altogether, the experimental desire to produce these beautiful molecules has advanced both the study of topological isomerism and templating techniques, and this in turn indirectly contributes to new molecules and drugs with real-world impact. (9)

Figure 4. a) The first molecular trefoil knot, produced in 1989. c) The first molecular pentafoil knot, produced in 2011. d) The first molecular Borromean rings, a type of link, produced in 2004. f) The first molecular Solomon link, produced in 2013. h) The first molecular ravel, produced in 2011. (9)

The recent breakthrough in knot theory

I admit, progress in knot theory is slow, and perhaps you have not found here the scientific revelation you were hoping for. But that does not mean current research is ineffective. As recently as June of this year, there was a groundbreaking proof. Think back to the prime and composite knots (scroll up if you have to). Every knot has an unknotting number: the minimum number of crossing switches needed to reduce it to the unknot - exactly the operation the topoisomerase performs. If we merge two prime knots into a composite knot, it seems obvious that untangling the composite should take as many crossing switches as the two primes combined; in other words, to untangle a composite knot, you cut and reglue it as many times as you would for each of its prime factors in turn. The breakthrough was a proof that some composite knots can be untangled with fewer crossing switches than the sum over their prime factors. This may sound like a setback, but it disproves a widely believed conjecture, and theorists are now one step closer to answering the question of the minimum number of crossing switches needed to simplify any knot. (10)
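In symbols - standard notation, not used in the article itself: write u(K) for the unknotting number of a knot K, and # for the connected sum that joins two knots end-to-end - the situation is:

    \begin{align*}
    u(K_1 \# K_2) &\le u(K_1) + u(K_2) && \text{always: untangle each factor separately}\\
    u(K_1 \# K_2) &=   u(K_1) + u(K_2) && \text{the long-standing conjecture}\\
    u(K_1 \# K_2) &<   u(K_1) + u(K_2) && \text{for some knots, per the 2025 proof (10)}
    \end{align*}

The first line is elementary; the 2025 result shows the second line fails in general, because the third case can occur.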
Conclusion

I will end with a quote from Dr Arunima Ray, a mathematician who specialises in knot theory and low-dimensional topology at the University of Melbourne, and a dear professor of mine. Hopefully it is just more proof (pun intended) that the work we mathematicians do is tangible:

"I had never imagined that mathematics could be used to describe something so abstract as knot theory, but to me the appeal was its tangibility. No matter who you are, there really is something in mathematics for you."

References
1. Pencovitch M. What's knot to love? [Internet]. Mathematics Today. 2021. Available from: https://ima.org.uk/17434/whats-knot-to-love/
2. Koblitz N. A course in number theory and cryptography. 2nd ed. Springer Science & Business Media; 1994.
3. Jeremiah. Understanding quaternions. 3D Game Engine Programming [Internet]. June 25, 2012. Available from: https://www.3dgep.com/understanding-quaternions/
4. Zhang L, Tu W. Six degrees of separation in online society [Internet]. ResearchGate. 2009. Available from: https://www.researchgate.net/publication/255614427_Six_Degrees_of_Separation_in_Online_Society
5. Wilson RM. Holograms tie optical vortices in knots. Physics Today. 2010. doi: 10.1063/1.3366639
6. Li M, Wang T, Kau A, George W, Petrenko A. Knots. Brilliant [Internet]. 2025. Available from: https://brilliant.org/wiki/knots/
7. Catherine. All tangled up: an introduction to knot theory [Internet]. Gleammath. April 28, 2021. Available from: https://www.gleammath.com/post/all-tangled-up-an-introduction-to-knot-theory
8. Skjeltorp AT, Clausen S, Helgesen G, Pieranski P. Knots and applications to biology, chemistry and physics. In: Riste T, Sherrington D, editors. Physics of Biomaterials: Fluctuations, Selfassembly and Evolution. Dordrecht: Springer Netherlands; 1996. p. 187–217. doi: 10.1007/978-94-009-1722-4_8
9. Horner KE, Miller MA, Steed JW, Sutcliffe PM. Knot theory in modern chemistry. Chemical Society Reviews. 2016;45(23). Available from: https://durham-repository.worktribe.com/output/1394834
10. Brittenham M, Hermiller S. Unknotting number is not additive under connected sum [Internet]. arXiv. 2025. Available from: https://arxiv.org/html/2506.24088v1
- Why Do We Gossip? | OmniSci Magazine
Why Do We Gossip? by Lily McCann 24 October 2023 Edited by Celina Kumala Illustrated by Rachel Ko

Have you ever heard of the 'scold's bridle'? A metal restraint, fitted with a gag, that was strapped about the face as a medieval punishment for excessive chatter; gossip, it seems, was not received too fondly in the Middle Ages. While the bridle may have gone out of fashion long ago, the word gossip still carries negative connotations today. The Oxford Dictionary, for instance, defines gossip as "informal talk or stories about other people's private lives, that may be unkind or not true" (Oxford Learner's Dictionaries, 2023). Entries in the Urban Dictionary use still stronger terms, going so far as to describe gossip as the "garbage of stupid silly ignorant people" (Lorenzo, 2006).

Is this too harsh? Dores Cruz et al. (2021) propose a much more neutral definition in their analysis of frameworks for studying gossip, concluding that gossip is "a sender communicating to a receiver about a target who is absent or unaware of the content". Whether the gossip conveys positive or negative content - otherwise known as its valence - is not part of the definition itself. Gossip, then, is not always "unkind" (Oxford Learner's Dictionaries, 2023) or "garbage" (Lorenzo, 2006). In fact, with a bit of further reading, we can see that this "informal talk" has played an important part in our evolution and even serves positive purposes in society.

In the first sense, gossip is an important facilitator of safety. It allows dangerous situations to be identified: spreading the knowledge that a certain individual is prone to violence, for instance, ensures the rest of a community takes care of their own safety around that individual. On a different note, passing on the fact that another individual is skilled in certain aspects of resource procurement allows wider access to those resources. It is easy to see how, in these examples, gossip could confer a selective advantage on societies.

But the influence of gossip goes further than this. It has been shown that gossip in fact encourages cooperation and generosity (Wu et al., 2015). How? The crucial mediator is reputation (Nowak, 2006). Reputation is incredibly important - see Taylor Swift's 2017 album for more. A poor reputation leads to ostracisation, and for an individual in a prehistoric society, this could be fatal. Cultivating a good reputation among peers, thousands of years ago as today, improves the chances of success in life by increasing access to resources and the willingness of others to help you. Positive gossip can facilitate all this.

So, how do we foster positive gossip? What will encourage someone to put in a good word for us? The most effective approach is to act in a way that benefits that individual. It predisposes them to spread word of our generosity, helping to build a reputation for goodness that will in turn have positive outcomes for ourselves. Thus, it's easy to see how behaviours that foster good gossip are incentivised in our everyday lives. This propensity to spread knowledge of how individuals treat others has been incredibly impactful in the development of human societies. For our species to flourish and sustain itself in such immense populations requires a high level of cooperation - one which enables us to share resources and productivity - even with people we do not know. Otherwise known as indirect reciprocity, this ability to work with strangers is enabled by reputation (Nowak, 2006).
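Nowak's paper, cited above, compresses this idea into one tidy condition (the notation is his, not the article's): indirect reciprocity can promote cooperation only if

    q > \frac{c}{b}

where q is the probability of knowing a stranger's reputation, c is the cost of the helpful act and b is the benefit to the recipient (Nowak, 2006). On this reading, gossip is the machinery that raises q: the more reliably reputations circulate, the more easily cooperation pays off.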
How else do we know that it is safe to interact with a stranger, other than through gossip, which informs us of their reliability and trustworthiness?

But what about when gossip is incorrect? The Oxford definition hints at the possibility that information spread through gossip "may be…not true". Can untrue gossip hinder our progress, by limiting interactions with individuals who may have the potential to help us, or by promoting interactions that would better have been avoided? And if gossip can be incorrect, does that not render reputation meaningless? What is the incentive to be good, if gossip could label you a bad egg regardless (Nieper et al., 2022)? Unfairly negative gossip can have a severe impact on its subject. Studies have shown that it decreases productivity and prosocial behaviour - not to mention burdening victims with the psychological effects of ostracisation, injustice and loneliness (Kong, 2018; Martinescu et al., 2021). Through gossip, we can exert immense power over other beings. It is understandable, then, that we fear gossip and try to discount it by painting it as "garbage" (Lorenzo, 2006), "unkind" or "not true" (Oxford Learner's Dictionaries, 2023).

And yet, whilst negative gossip can be a detriment, positive gossip can yield great benefits: reinforcing prosocial behaviour, fostering cooperation and promoting generosity. So, rather than fearing gossip, perhaps we ought to acknowledge its benefits and harness it for good. Perhaps it's worth considering how we can each use gossip to exert a bit of good upon our world.

References
Dores Cruz, T. D., Nieper, A. S., Testori, M., Martinescu, E., & Beersma, B. (2021). An integrative definition and framework to study gossip. Group & Organization Management, 46(2), 252-285. http://doi.org/10.1177/1059601121992887
Kong, M. (2018). Effect of perceived negative workplace gossip on employees' behaviors. Frontiers in Psychology, 9(2728). http://doi.org/10.3389/fpsyg.2018.01112
Lorenzo, A. (2006). Gossip. Urban Dictionary. Accessed October 10, 2023. https://www.urbandictionary.com/define.php?term=gossip
Martinescu, E., Jansen, W., & Beersma, B. (2021). Negative gossip decreases targets' organizational citizenship behavior by decreasing social inclusion: A multi-method approach. Group and Organization Management, 46(3), 463-497. http://doi.org/10.1177/1059601120986876
Nieper, A. S., Beersma, B., Dijkstra, M. T. M., & van Kleef, G. A. (2022). When and why does gossip increase prosocial behavior? Current Opinion in Psychology, 44, 315-320. http://doi.org/10.1016/j.copsyc.2021.10.009
Nowak, M. A. (2006). Five rules for the evolution of cooperation. Science, 314(5805), 1560-1563. http://doi.org/10.1126/science.1133755
Oxford Learner's Dictionaries. (2023). Gossip - definition. Accessed October 10, 2023. https://www.oxfordlearnersdictionaries.com/definition/american_english/gossip_1
Wu, J., Balliet, D., & Van Lange, P. A. M. (2015). When does gossip promote generosity? Indirect reciprocity under the shadow of the future. Social Psychological and Personality Science, 6(8), 923-930. http://doi.org/10.1177/1948550615595272
- A Coral’s Story: From thriving reef to desolation | OmniSci Magazine
A Coral's Story: From thriving reef to desolation by Nicola Zuzek-Mayer 22 October 2024 edited by Arwen Nguyen-Ngo illustrated by Amanda Agustinus

The sun is shining. Shoals of fish are zooming past me, leaving their nests where I let them stay for protection from bigger fish. I look to my right, and the usual fish have come to dine from me, filling their bellies with vital nutrients. I feel proud of our coexistence: I feed the big fish and provide shelter to small fish, while they clean algae off me. I am the foundation of the reef. I am the architect of the reef. Without me, there would be nothing.

I can't help but think that the reef is looking vibrant today. A wide variety of differently coloured corals surround me, with some of my closest friends a stone's throw away. We've all known each other our entire lives, and it's such a close-knit community of diverse corals. Life is sprawling in this underwater metropolis, and it reminds me of how much I love my home. But recently, I've heard some gossip among the city's inhabitants that this paradise may change soon - and for the worse. Something about the land giants destroying our home. I refuse to believe such rumours - why would they want to destroy us? Our home is so beautiful, and we have done nothing to hurt them. Our beauty attracts many of them to come visit us, and most never hurt us. But sometimes I feel pain when they visit on a particularly sunny day, when I see white particles drift down to the reef and pierce my branches, polluting the city. My friends have told me that these giants wear something called 'sunscreen' to protect themselves from the sun, but their 'protection' is actually poisoning us. I hope they realise that soon. Another thing I've noticed recently is that the ocean feels slightly warmer than before, and my growth is slowing. Yes, I'm concerned, but I don't think the issue will get worse.

30 years later…

The sun is blisteringly hot. I feel sick, and the water around me is scorching. The vibrant colours of the reef are disappearing, and there are fewer organisms around. We used to be so diverse, but so many species of fish have died out. It's eerie to see the area so desolate. My body is deteriorating, and I feel so much more fragile than before. I feel tired all the time, after using so much energy to repair my body in the acidic water. I sense myself becoming paler, losing all colour in my body. I struggle to breathe. My coral friends and family are long gone, perished in the acidity of the ocean. I am the last one remaining.

In my last moments, I can only wish to relive the past. I wish the land giants had done more to help not only my city, but the other reef cities around the world. All the other cities are empty now, and their ecosystems long gone. If only someone had helped our dying world.
- Enter . . . the Anthropocene? | OmniSci Magazine
Enter . . . the Anthropocene? by Rita Fortune 28 October 2025 Illustrated by Zara Burk Edited by Kylie Wang

We live in a time when humanity's impact on the world around us is clearly visible. From the never-ending barrage of information about climate change, to extinction and habitat loss, the consequences of our actions are impossible to avoid. There's no denying that the world around us is changing, but what if there are deeper implications? What if our impact on the planet will be apparent thousands, even millions of years into the future? Have we changed our planet's systems to such an extent that our species has defined a new geological epoch?

The geological timescale is how we understand the relative timing of past events. From the advent of life to mass extinctions, all of it is documented in the rock record. Our geological past is divided into formalised time periods: eons, eras, periods, epochs and ages. These are generally delimited by major changes visible in the rock record, such as mass extinctions, major climate shifts, or changes in magnetic polarity, with absolute ages determined by radioactive dating (1) (the decay arithmetic behind such dating is sketched at the end of this passage). Currently, we formally sit in the Holocene Epoch, which began around 11.7 thousand years ago with the end of the last glacial period and the beginning of the current, warmer interglacial phase (2). However, given the enormity of humanity's impact on Earth's systems, especially since the dawn of the industrial revolution, some scientists are pushing for the formalisation of a new epoch: the Anthropocene.

The concept of the Anthropocene was first coined by Paul Crutzen and Eugene Stoermer in 2000 (3). Initially, it was used to recognise the exploitation of Earth's resources by humankind, including the emission of greenhouse gases, the urbanisation of land, and the increase in species extinction rates. Crutzen and Stoermer suggested the Anthropocene began in the late 18th century, as, in the last 200 years, the "global effects of human activities have become clearly noticeable" (3). The concept has remained the same at its core since then, but there has been debate around formal definitions and informal uses of the term. The Anthropocene has been adopted in popular culture, its broad use encompassing humanity's interactions with the Earth, but its formal use remains contested. Furthermore, although the idea traces its origins to Earth system science, efforts to formalise the Anthropocene have been multidisciplinary, involving not only stratigraphers and palaeontologists but also experts from various other scientific backgrounds (4).

Formalising the Anthropocene as an epoch distinct from the Holocene relies on finding stratal evidence in the rock record for where this transition took place (4). There are countless pieces of evidence for our impact on Earth's systems. Yet there is still debate around which can be used to define the Anthropocene. The Anthropocene Working Group identified several potential markers for the beginning of the Anthropocene: increased sedimentation and erosion rates; changes to the carbon, nitrogen and phosphorus cycles; climate change and sea-level rise; and biotic changes such as the unprecedented spread of species across the Earth (4). Many of these impacts will leave permanent evidence in the geological record, indicating our existence long after our civilisations have crumbled.
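As promised, the decay law behind radiometric dating - textbook material rather than anything specific to reference (1). A parent isotope with half-life t_{1/2} decays as

    N(t) = N_0 e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{t_{1/2}},

so measuring the surviving fraction N(t)/N_0 in a mineral gives its age:

    t = \frac{t_{1/2}}{\ln 2} \, \ln\frac{N_0}{N(t)}.

For instance, a sample retaining a quarter of its original parent isotope is two half-lives old.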
There are many possible ways to define the beginning of the Anthropocene. Crutzen suggested the crucial moment was the invention of the steam engine, which led to the industrial revolution - often used as the baseline against which our current climate is compared (3). However, evidence of industrialisation from this time is really only visible in Europe, with sediments from the Southern Hemisphere showing no change (5). More recently, it has been posited that the detonation of the first atomic bomb in 1945 should be the official marker of the Anthropocene, as it deposited a thin stratal layer of radionuclides that do not otherwise occur naturally in the environment (6).

While it's clear that humans are a major source of change on Earth, some say this does not necessarily mean we've entered a new epoch. Although geological time periods are often delineated by environmental change, not every environmental change necessitates the creation of a new epoch. There have been past periods of (relatively) rapid climate change that are not associated with new time periods. An example is the Palaeocene-Eocene Thermal Maximum (PETM). During this time, there was significant global warming, habitat change and species migration. The warm period lasted approximately 100,000 years, but there were no mass extinctions; once temperatures returned to normal, ecosystems essentially returned to how they were before the event (7). Geologically speaking, the proposed Anthropocene is a minuscule amount of time. Although the effects are extreme, if we stopped all emissions right now, it is possible that within 5,000 years the climate could return to pre-industrial levels (8).

Another argument presented by some authors is that the stratigraphic basis for the Anthropocene doesn't exist yet, and is merely expected to exist in the future. Many structures of anthropogenic origin, such as excavations, boreholes and mine dumps, are not yet geological strata. Additionally, in strata that have recorded anthropogenic change, such as speleothems, marshes, and lake and ocean floor sediments, the layers representing the Anthropocene would be so thin as to be difficult to distinguish from the underlying Holocene sediments (6). Without the gift of hindsight that has allowed scientists to examine previous epochs, it is difficult to say whether the change we currently see will be significant enough, on a geological scale, to officially move us into a new epoch. It has been suggested that, instead of a new epoch, the Anthropocene could be a Sub-Age or an Age within the Holocene Epoch (4): acknowledging our profound impact on the Earth, while expecting the Earth system eventually to return to pre-industrial conditions.

Further complicating the matter, there are suggestions that humans have been altering the Earth's climate since long before the industrial revolution. Evidence shows that a rise in CO2 accompanied the advent of farming by early humans 7,000 years ago. Around the same time, there was also a rise in atmospheric methane, which has been attributed to rice paddies and livestock (9). As the human population grew, land clearance increased in step, to accommodate both dwellings and farming. Even though these emissions and clearances were tiny by today's standards, they may have been enough to steer our climate away from its next glacial period, priming the warmer conditions we experience today.
Some have argued that irreversible human impact stretches back further still, to the Pleistocene extinctions of megafauna across multiple continents (10).

There is no doubt that humans have had, and are having, a massive impact on the environment. The atmosphere and oceans will take thousands of years to recover from their current level of warming. However, these massive changes do not necessarily mean that we have entered a new epoch. Although there will likely be ample stratigraphic records of our impacts on this planet, without hindsight it is difficult to see just how much change we have created. In the context of geological time, humans have been around for a vanishingly short period. Although what's happening today might seem dramatic to us, it is possible that millions of years in the future, all we will have left behind is a few centimetres of ocean floor sediment. Either way, the Anthropocene is valuable as an informal term for our current time: an acknowledgement of the consequences of our actions, and a reminder of the permanence of our record.

References
1. University of Calgary. Geologic time scale. Energy Education. 2024. Accessed October 21, 2025. https://energyeducation.ca/encyclopedia/Geologic_time_scale#cite_note-GTS-3
2. Walker M, Johnsen S, Rasmussen SO, Popp T, Steffensen JP, Gibbard P, et al. Formal definition and dating of the GSSP (Global Stratotype Section and Point) for the base of the Holocene using the Greenland NGRIP ice core, and selected auxiliary records. J. Quaternary Sci. 2009;24(1):3–17. doi: 10.1002/jqs.1227
3. Crutzen PJ, Stoermer EF. The 'Anthropocene' (2000) [Internet]. Benner S, Lax G, Crutzen PJ, Pöschl U, Lelieveld J, Brauch HG, editors. Cham: Springer International Publishing; 2021. 3 p. (Paul J. Crutzen and the Anthropocene: A New Epoch in Earth's History). Available from: https://doi.org/10.1007/978-3-030-82202-6_2
4. Zalasiewicz J, Waters CN, Summerhayes CP, Wolfe AP, Barnosky AD, Cearreta A, et al. The Working Group on the Anthropocene: Summary of evidence and interim recommendations. Anthropocene. 2017;19:55–60. doi: 10.1016/j.ancene.2017.09.001
5. Pare S. Nuclear bombs set off new geological epoch in the 1950s, scientists say. Live Science. 2023. Accessed October 21, 2025. https://www.livescience.com/planet-earth/nuclear-bombs-set-off-new-geological-epoch-in-the-1950s-scientists-say
6. Finney S, Edwards L. The "Anthropocene" epoch: Scientific decision or political statement? GSA Today. 2016;26:4–10. doi: 10.1130/GSATG270A.1
7. The Editors of Encyclopaedia Britannica. Paleocene-Eocene Thermal Maximum (PETM). Britannica. 2023. Accessed October 21, 2025. https://www.britannica.com/science/Paleocene-Eocene-Thermal-Maximum
8. The Royal Society. If emissions of greenhouse gases were stopped, would the climate return to the conditions of 200 years ago? The Royal Society. 2020. Accessed October 21, 2025. https://royalsociety.org/news-resources/projects/climate-change-evidence-causes/question-20/
9. Ruddiman WF, He F, Vavrus SJ, Kutzbach JE. The early anthropogenic hypothesis: A review. Quaternary Science Reviews. 2020;240:106386. doi: 10.1016/j.quascirev.2020.106386
10. Doughty CE, Wolf A, Field CB. Biophysical feedbacks between the Pleistocene megafauna extinction and climate: The first human-induced global warming? Geophys. Res. Lett. 2010;37(15). doi: 10.1029/2010GL043985
- Unravelling the Threads: From the Editors-in-Chief & Cover Illustrator | OmniSci Magazine
Unravelling the Threads: From the Editors-in-Chief & Cover Illustrator by Ingrid Sefton, Aisyah Mohammad Sulhanuddin & Anabelle Dewi Saraswati 28 October 2025 Illustrated by Anabelle Dewi Saraswati Edited by the Editors-in-Chief

Innovation evolves, and perhaps what once made headlines becomes embodied in ourselves and in our universe. The science that we once saw is no longer visible, yet no less integral in the ways it governs our world. Like the strings of a puppet, scientific principles guide us and coordinate the patterns and movements which shape our daily lives. Yet equally, science encourages us to look behind the curtain in order to unravel the forces which pull on the strings of our universe. Following these rich threads of knowledge, so often taken for granted, this issue brings to the fore and celebrates the science that keeps our world running.

An introspective chat with the brain, a journey along the production line that creates our much-loved daily cup of matcha, fundamental questions about how we seek and create knowledge: Entwined seeks to make explanations explicit and start conversations about the scientific mechanisms embedded in our lives. When we take the time to focus our gaze, encourage awe at the everyday and seek reflection over reaction - that's when we start to disentangle the science that binds us; that which keeps us Entwined.

Begin your immersion in the world of Entwined with Issue 9's Cover Illustrator, Anabelle Dewi Saraswati, as she explains the vision and rationale behind her work.

"I found myself drawn to the world of Art Nouveau for these cover illustrations, captivated by the way forms seem to grow into each other, sharing meaning and life, much like the theme of 'Entwined' itself. There is something magical about that moment in history, where art, architecture and science all seemed to bleed into one another, each discipline borrowing and lending, rooted in an emphasis on the beauty of nature after the coldness created by the Industrial Revolution. That sense of crossover felt like the perfect encapsulation for this issue, derived from pictorial history. The way feminine figures and flowing hair seem to melt into vines and leaves, everything tangled together in a quiet conversation. The motion and sense of growth, but also the hidden mathematical precision required to produce such beautiful curving forms. Art Nouveau captured how the artificial and natural worlds are always weaving into each other, inseparable. I wanted to draw from that imagery in a way that acknowledges its history.

I return to my architectural roots in structure, composition and line with my approach to building these pieces. The signage piece is fully hand-drawn and deliberate - reflecting the craft and typographic precision of the era. The collage is a layering of textures and fragments, letting ideas overlap and bleed into each other, much like memories and histories do. A way to begin the issue visually, to trace the growth of worlds as they intertwine. Paying homage to the harmony between the natural and the human-made, to reflect on how we are shaped by the places we inhabit, the histories we inherit, and the stories we choose to keep alive."
- Conferring with Consciousness | OmniSci Magazine
Conferring with Consciousness by Ingrid Sefton 28 October 2025 Illustrated by Heather Sutherland Edited by Steph Liang

Down the rabbit hole

Indulge me for a moment, will you? I value your opinion. Your opinion, as in, one which has arisen from your mind. I would assume. It would seem unusual to consider that, perhaps, your thoughts are not your own. Stranger still to ponder the possibility that they did not arise from your mind. I digress - or maybe not. For it is this dilemma on which I wish to pick your brain. The mind. The brain. You. Are they one and the same; entwined? What do you think? Again, assuming it is you thinking. Assuming you feel certain enough to agree with this. Really, with what certainty can we say anything?

You may be wondering who "I" am. I am but you, of course! I kid, but not entirely. Think of me as the brain; your brain, if you wish. An excellent name I gave myself, if you ask me. Before we spiral any deeper into this chasm that is consciousness - because that is what this is about; is that not what this, life, is all about? - I must disclose a few things. One: I do not expect you to have answers to the questions I pose. Because, two: we do not have answers. I apologise that I have not come bearing the answers to our existence, that I have not yet unpicked these questions of "who?", "how?", "why?". I come offering an alternative. I wish to present to you these entangled threads of consciousness: what we currently know, what we hope to know, and where we can proceed from here. Then it's back to you. You get to decide what you think (again, with the thinking). Maybe, for you and the workings of your inner mind, consciousness and all it entails will be revealed in full clarity. Maybe not. You certainly won't know unless you try.

A brief neural memoir

Many a Nobel Prize has been awarded for discoveries relating to the nervous system: from the morphology of neurons (Golgi and Cajal, 1906) and their electrical signalling properties (Eccles, Hodgkin and Huxley, 1963), to the nature of information processing in the visual system (Hubel and Wiesel, 1981) (1). Despite some obvious gaps remaining in what is known about the brain (ahem, that slight issue of consciousness), the field of neuroscience has progressed rapidly over the last century. Gone are the days of thinking I was nothing more than a cooling mechanism for the blood, as the Greek philosopher Aristotle once believed (2). How dismissive of my intellect! I assure you, I have far more important things to be doing. Generating the experience of "you", as one small matter.

The techniques developed to study the brain have also rapidly advanced. It was not until advances in microscopy in the 19th century that the neuron doctrine even came about. Pioneered by Santiago Ramón y Cajal, this is the (now) well-accepted concept that the nervous system is made up of discrete cells known as neurons, challenging older theories which proposed a continuous neural network (3). Today, neuroscientists can appreciate my anatomical and functional complexity at a huge range of temporal and spatial resolutions. Whole-brain connectivity can be studied using functional magnetic resonance imaging (fMRI), while the electrical activity of single neurons can be recorded using patch-clamp electrode technology.
Not to mention optogenetics, chemogenetics and viral transduction: while the available experimental techniques are still unable to address all our brainy questions, the field of neuroscience has never been in a better position to get closer to answers.

The potential of neurons

Neurons: those special, excitable cells that make up the squishy entity I seem to be. The mechanisms by which neurons detect, generate and transmit signals have been described with utmost precision. When I talk of excitable cells, I am not referring to a bunch of cheerful, eager neurons. Excitability, in this context, refers to the fact that neurons can respond to a stimulus by generating and propagating electrical signals, known as action potentials (a toy simulation of this excitability is sketched at the end of this passage). Clearly, I am made up of slightly more than two neurons cheerfully signalling back and forth to each other. Try 86 billion, between the cortex and cerebellum combined (4). Yet, despite our deep understanding of neural signalling mechanisms, this has yet to reveal an explanation for consciousness. Individual neurons in isolation, it would appear, don't hold the answers we want.

In turn, a focus of neuroscience research has been the wider "neuronal correlates of consciousness": the minimal neuronal mechanisms that are sufficient to generate a conscious experience (5). This relates broadly to the generation of consciousness itself, but also to the neural underpinnings of specific conscious experiences - for example, which collective neural substrates support visual object recognition. This is often the focus of fMRI studies, which examine brain activity in an attempt to pinpoint where in the brain a particular cognitive function may be performed. Fancy techniques aside, some of the most fundamental insights into my regional specialisations have arisen from careful observation following selective lesions or damage to the brain. The critical yet specific role of Broca's area in speech production was discovered in 1861 through surgeon Paul Broca's observations of his patient "Tan". Tan had lost his ability to produce meaningful speech, yet was still able to comprehend it; Broca identified a lesion in Tan's left frontal lobe post-mortem, concluding that this region is selectively involved in speech production (6).

But what does all of this show us? Perhaps the only thing neuroscientists can agree on is that conscious experience is fundamentally, in some way, somehow, related to my activity: the brain. In turn, the activity of the brain is related to the activity of neurons; firing and signalling and transforming information. A lot is known about neurons. Less can be said about specific cognitive functions, yet we can see correlations between regional brain activity and particular conscious experiences. Here lies my problem. The elephant in the room. How do we get from individual neurons to conscious experience?

A map with no destination

Enter "The Connectome" and the Human Connectome Project: a collective attempt to map the neuronal connections of the human brain, in an effort to connect structure to function (7). And in turn, for our purposes, to ideally connect this to consciousness. The rationale is that by modelling and trying to "build" a brain using a bottom-up approach, we may come to understand the mechanisms by which cognitive functions arise. I'm sure it will come as no surprise that this isn't the simplest of tasks.
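First, the toy model of excitability promised above: a minimal leaky integrate-and-fire neuron in Python. This is my own illustrative sketch with made-up-but-plausible constants, not a model used by any of the projects cited here; real neurons (and Hodgkin-Huxley-style models of them) are far richer. It captures only the basic rhythm of the action potential: charge up, cross threshold, fire, reset.

    def simulate_lif(input_current=3e-9, dt=1e-4, t_max=0.1):
        """Leaky integrate-and-fire neuron; returns spike times in seconds."""
        v_rest, v_thresh, v_reset = -0.070, -0.050, -0.070  # volts
        tau_m = 0.020   # membrane time constant (s)
        r_m = 1e7       # membrane resistance (ohms)
        v, spikes, t = v_rest, [], 0.0
        while t < t_max:
            # The membrane leaks back towards rest while the input charges it.
            v += ((v_rest - v) + r_m * input_current) * (dt / tau_m)
            if v >= v_thresh:          # threshold crossed: an action potential
                spikes.append(round(t, 4))
                v = v_reset            # the membrane resets after each spike
            t += dt
        return spikes

    print(simulate_lif())  # a steady train of spikes, roughly every 22 ms

Stack 86 billion of these, wire them together with some 10^14 synapses, and you have the scale of the mapping problem described next.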
To measure, record and model billions of neurons and synapses requires techniques, time and resources that are incredibly hard to come by in sufficient quantities. Excitingly, scientists have recently managed to map a whole brain. That is, the brain of a larval fruit fly (8). With 3,016 neurons and 548,000 synapses, this was no simple feat. In case you had forgotten my own complexity, however, let me remind you of my 86 billion neurons, and an estimated 1.5 × 10^14 total synapses in the cortex alone (4). Progress has nonetheless been made on the human front. It was recently announced that a cubic millimetre of human temporal cortex has been completely reconstructed using electron microscopy, involving 1.4 petabytes of data (a petabyte being 1,000 terabytes, or one quadrillion bytes) (9). One cubic millimetre down, approximately a million to go. (At 1.4 petabytes per cubic millimetre, the roughly one million cubic millimetres of a whole human brain would come to something like 1.4 zettabytes of data.)

Putting practicalities aside, let us suppose we do, one day, manage to map and model an entire human brain in all its intricacies. What now? What does one actually do with this data, and how would it allow us to better understand how consciousness arises? Up until now, we have been following the train of thought that consciousness somehow results from the activity of neurons, yet does not arise from the activity of individual neurons. This leads to the notion that consciousness is perhaps due to the collective, computational activity of neurons working together - that with enough complexity and enough information processing, this will collectively give rise to the first-person experience of being "you". Does this actually make sense? You tell me.

Wishful thinking and conscious rocks

The notion that, at a certain level of complex neuronal signal processing, a first-person perspective of "being you" (i.e. consciousness) arises is often termed "strong emergence" or "magical emergence" (10). With what we currently know about the properties of neurons, there is fundamentally no reason why this should happen. The "property" of consciousness, which cannot be predicted from the principles of how individual neurons function, seemingly just emerges. Consciousness, therefore, must somehow be greater than the sum of its parts, only emerging when neurons interact as a wider network. Maybe the answer is merely that we don't understand the mechanisms of neurons as well as we think we do. It could be that we have missed a fundamental property of how neurons operate, and that upon its discovery it would suddenly be completely explicable how consciousness arises. Or maybe computation and neural signalling are not all there is to it.

An alternative line of thinking is that rather than consciousness being a property that "arises", it is a basic constituent of the universe that is missing from our current model of standard physics (11). That is, consciousness has been present all along and exists in everything. The philosophical view of 'panpsychism' embraces this idea to the extreme, proposing that everything within the universe is, to some degree, conscious (12). As in, yes, that rock over there might just be conscious. Other theories suggest that consciousness only emerges in a recognisable form under certain conditions or at some critical threshold; myself and all my neurons apparently being one example of the "right" conditions. Theories of consciousness don't stop at computation and fundamental properties of the universe, either. Quantum physics, microtubule computations, electromagnetic fields: all have been proposed as part of this web of "why" (13).
While some theories arguably veer more towards pseudoscience than well-founded scholarship, they all make one thing clear: at this stage, just about every idea remains fair game in the quest for answers.

Pondering hard, or hardly pondering?

The question of consciousness is far from limited to the field of neuroscience. Philosophers, too, have long racked their brains in an attempt to rationalise and unpick this problem. What unites the work of neuroscientists and philosophers alike, along with the many theories of consciousness, is that nothing provides a satisfactory explanation for why consciousness should emerge from the activity of neurons. Philosopher David Chalmers has termed this the "hard problem". "Why should physical processing give rise to a rich inner life at all? It seems objectively unreasonable that it should, and yet it does" (14). If consciousness is simply the result of high-level processing and the computational activity of neurons, why would we even need to be conscious? If all the brain is doing is computation, and thus everything can be done via computation, there would appear to be no purpose in having a subjective experience of being "you". Whichever side of consciousness we may be inclined to take - computational, fundamental, or otherwise - the fact remains: we cannot seem to move beyond mere description to explanation. We have not solved the "hard problem".

A final conundrum, and a sole certainty

Physicist Emerson M. Pugh once made the somewhat sceptical remark that "if the human brain were so simple that we could understand it, we would be so simple that we couldn't" (15). Is the reason we have yet to understand consciousness simply, frustratingly, that we are not meant to? Logical conundrums aside, I rest my case. I hope I have given you some food for thought, or at the very least, not set off too dramatic an existential crisis. Somewhere between the neural wirings of the brain and the experience of consciousness lies an answer, regardless of whether we are destined to find it. Make of this what you will. And if nothing else, let me try reassuring you once again with the wisdom of René Descartes: "Cogito, ergo sum" - "I think, therefore I am" (16). If you are here, and you are thinking, you are conscious. You, my friend, are you.

References
1. Nobel Prizes in nerve signaling. Nobel Prize Outreach. September 16, 2009. Accessed October 18, 2025. https://www.nobelprize.org/prizes/themes/nobel-prizes-in-nerve-signaling-1906-2000/
2. Rábano A. Aristotle's "mistake": the structure and function of the brain in the treatises on biology. Neurosciences and History. 2018;6(4):138-43.
3. Golgi C. The neuron doctrine - theory and facts. 1906. p. 190–217. https://www.nobelprize.org/uploads/2018/06/golgi-lecture.pdf
4. Herculano-Houzel S. The human brain in numbers: a linearly scaled-up primate brain. Front Hum Neurosci. 2009;3:31. doi: 10.3389/neuro.09.031.2009
5. Koch C, Massimini M, Boly M, Tononi G. Neural correlates of consciousness: progress and problems. Nature Reviews Neuroscience. 2016;17(5):307-21.
6. Broca area. Encyclopedia Britannica; 2025. Accessed October 18, 2025. https://www.britannica.com/science/Broca-area
7. Elam JS, Glasser MF, Harms MP, Sotiropoulos SN, Andersson JLR, Burgess GC, et al. The Human Connectome Project: A retrospective. NeuroImage. 2021;244. doi: 10.1016/j.neuroimage.2021.118543
8. Winding M, Pedigo BD, Barnes CL, Patsolic HG, Park Y, Kazimiers T, et al. The connectome of an insect brain. Science. 2023;379(6636). doi: 10.1126/science.add9330
9. Shapson-Coe A, Januszewski M, Berger DR, Pope A, Wu Y, Blakely T, et al. A petavoxel fragment of human cerebral cortex reconstructed at nanoscale resolution. Science. 2024;384(6696). doi: 10.1126/science.adk4858
10. Chalmers D. Strong and weak emergence. In: Clayton P, Davies P, editors. The Re-Emergence of Emergence: The Emergentist Hypothesis from Science to Religion. Oxford University Press; 2008.
11. Kitchener PD, Hales CG. What neuroscientists think, and don't think, about consciousness. Frontiers in Human Neuroscience. 2022;16. doi: 10.3389/fnhum.2022.767612
12. Goff P, Seager W, Allen-Hermanson S. Panpsychism. The Stanford Encyclopedia of Philosophy. Summer 2022.
13. Seth AK, Bayne T. Theories of consciousness. Nature Reviews Neuroscience. 2022;23(7):439-52. doi: 10.1038/s41583-022-00587-4
14. Chalmers D. Facing up to the hard problem of consciousness. In: Shear J, editor. Explaining Consciousness: The Hard Problem. MIT Press; 1997.
15. Pugh GE. The Biological Origin of Human Values. Routledge & Kegan Paul; 1978.
16. Descartes R. Principles of Philosophy. 1644.