
Search Results


  • Staying at the Top of Our Game: the Evolutionary Arms Race | OmniSci Magazine

    < Back to Issue 7 Staying at the Top of Our Game: the Evolutionary Arms Race by Aizere Malibek 22 October 2024 edited by Rita Fortune illustrated by Aizere Malibek Organisms have been competing for biological domination since the beginning of life. Evolutionary adaptations arise from genetic mutations, which propel biodiversification and allow organisms with favourable traits to survive and reproduce. This is the foundation of Charles Darwin's Theory of Evolution, explaining the rise of antimicrobial resistance and contagious viruses, while also offering solutions to these threats in public health and medicine. Mutations in the DNA of pathogens allow them to adapt to our immunological defences and invade our bodies. Conversely, the variation in our immune cells allows us to detect and defend against pathogens as a counter-adaptation. Medicine has advanced dramatically in recent decades, with novel vaccines, antivirals and antibiotics being developed more quickly than ever before. Unfortunately, persistent pathogens have found a way to survive attacks from our immune systems and drugs, making it difficult to devise an effective cure for these infections. Take HIV, for instance: the virus activates programmed cell death in our CD4+ T immune cells and alters their metabolism as a survival mechanism (Gougeon, 2003; Palmer et al., 2016). In turn, this directly reduces the immune system's ability to defend against the virus. This is further complicated by the high mutation rate of HIV, leading to rapid resistance to various treatment options (Gupta et al., 2018). Fortunately, scientific discoveries are helping us develop solutions for infectious diseases. HIV has been found to be susceptible to immune responses in its initial immature stages, which has become a target of current vaccine development efforts (Picker et al., 2012). Vaccines are beneficial in these cases because they expose memory cells, key cells in our adaptive immune responses, to inactivated microbial antigens. This allows our bodies to tackle the pathogens more efficiently, reducing the symptoms and long-term effects of infection. Another emerging treatment option is CRISPR-Cas9 technology. Originally discovered as a bacterial defence system against viruses, CRISPR allows scientists to precisely edit genes. This technology is being explored not only for its potential to correct genetic disorders, but also as a weapon against pathogens. Researchers are looking into using CRISPR to target viral DNA in infected human cells, cutting it out before the virus can replicate (Mengstie & Wondimu, 2021). If successful, CRISPR could be a game-changer in the fight against diseases like HIV, influenza, and even the next pandemic. However, HIV is just one example of this ongoing evolutionary arms race between pathogens and humans. The phenomenon isn't restricted to just viruses; bacteria and fungi have also become significant opponents. The rise of antibiotic resistance in bacteria is an alarming and growing public health issue today. Antibiotics are increasingly losing their efficacy due to misuse and overprescription. Pathogens like Escherichia coli (E. coli) and Staphylococcus aureus (S. aureus) have developed multiple resistance mechanisms, including the production of enzymes that break down antibiotic molecules before they can exert their effect (Reygaert, 2018). Methicillin-resistant Staphylococcus aureus (MRSA) is a prime example of antibiotic resistance.
Initially, methicillin was developed to treat penicillin-resistant strains of bacteria. However, as methicillin became widely used, new strains of S. aureus emerged that could resist the potent drug. MRSA infections are now incredibly difficult to treat and pose a serious public health threat, particularly in hospitals and healthcare settings where immunocompromised patients are most vulnerable (Collins et al., 2010). Vaccines are not as effective against bacteria and fungi due to the more complex structures of these organisms. So how do we stay ahead in this race? One promising area of research is the development of next-generation antibiotics and antivirals. Researchers are now investigating bacteriophages—viruses that specifically infect bacteria—as a potential solution to antibiotic-resistant infections. These phages, which evolve alongside bacteria, could be used to target and destroy harmful bacterial strains without the collateral damage caused by traditional antibiotics (Plumet et al., 2022). While scientific innovation is key to staying ahead in the evolutionary arms race, public health policies play an equally important role. Misuse of antibiotics, for instance, has significantly accelerated the rise of antibiotic-resistant bacteria outside healthcare settings (David & Daum, 2010). Governments and healthcare organisations are now pushing for stricter regulations on antibiotic prescriptions and promoting the responsible use of these drugs. Global collaboration is also essential. Pathogens don't respect national borders, and the spread of infectious diseases is a global issue. Initiatives like the World Health Organisation's Global Antimicrobial Resistance Surveillance System (GLASS) are crucial in monitoring and controlling the spread of resistant pathogens worldwide. By sharing data and resources, countries can better coordinate their responses to emerging threats, mitigating the risks posed to global health. The dynamic shifts in power between humans and pathogens continue to unfold in this evolutionary arms race. While scientific innovation is enabling the development of new tools, from vaccines to gene-editing technologies, we must also adopt policies that promote responsible drug use and global cooperation. In this race, staying at the top of our game requires constant vigilance, innovation, and adaptation—because pathogens certainly aren't slowing down. The stakes are high, but with continued research and collaboration, we have the potential to maintain the upper hand in this ever-evolving battle for survival. References Collins, J., Rudkin, J., Recker, M., Pozzi, C., O'Gara, J. P., & Massey, R. C. (2010). Offsetting virulence and antibiotic resistance costs by MRSA. The ISME Journal, 4(4), 577-584. https://doi.org/10.1038/ismej.2009.151 David, M. Z., & Daum, R. S. (2010). Community-Associated Methicillin-Resistant Staphylococcus aureus: Epidemiology and Clinical Consequences of an Emerging Epidemic. Clinical Microbiology Reviews, 23(3), 616-687. https://doi.org/10.1128/cmr.00081-09 Gougeon, M. L. (2003). Apoptosis as an HIV strategy to escape immune attack. Nature Reviews Immunology, 3, 392-404. https://doi.org/10.1038/nri1087 Gupta, R. K., Gregson, J., Parkin, N., Haile-Selassie, H., Tanuri, A., Forero, L. A., Kaleebu, P., Watera, C., Aghokeng, A., Mutenda, N., Dzangare, J., Hone, S., Hang, Z. Z., Garcia, J., Garcia, Z., Marchorro, P., Beteta, E., Giron, A., Hamers, R., . . . Bertagnolio, S. (2018).
HIV-1 drug resistance before initiation or re-initiation of first-line antiretroviral therapy in low-income and middle-income countries: a systematic review and meta-regression analysis. Lancet Infectious Diseases, 18(3), 346-355. https://doi.org/10.1016/s1473-3099(17)30702-8 Mengstie, M. A., & Wondimu, B. Z. (2021). Mechanism and Applications of CRISPR/Cas-9-Mediated Genome Editing. Biologics: Targets & Therapy, 15, 353-361. https://doi.org/10.2147/btt.S326422 Palmer, C. S., Cherry, C. L., Sada-Ovalle, I., Singh, A., & Crowe, S. M. (2016). Glucose Metabolism in T Cells and Monocytes: New Perspectives in HIV Pathogenesis. EBioMedicine, 6, 31–41. https://doi.org/10.1016/j.ebiom.2016.02.012 Picker, L. J., Hansen, S. G., & Lifson, J. D. (2012). New Paradigms for HIV/AIDS Vaccine Development. Annual Review of Medicine, 63, 95-111. https://doi.org/10.1146/annurev-med-042010-085643 Plumet, L., Ahmad-Mansour, N., Dunyach-Remy, C., Kissa, K., Sotto, A., Lavigne, J. P., Costechareyre, D., & Molle, V. (2022). Bacteriophage Therapy for Staphylococcus aureus Infections: A Review of Animal Models, Treatments, and Clinical Trials. Frontiers in Cellular and Infection Microbiology, 12, 907314. https://doi.org/10.3389/fcimb.2022.907314 Reygaert, W. C. (2018). An overview of the antimicrobial resistance mechanisms of bacteria. AIMS Microbiology, 4(3), 482-501. https://doi.org/10.3934/microbiol.2018.3.482

  • Real Life Replicants | OmniSci Magazine

    < Back to Issue 4 Real Life Replicants by Elijah McEvoy 1 July 2023 Edited by Yasmin Potts and Megane Boucherat Illustrated by Jolin See Hal, Ultron and (of course) the Terminator. Comparisons between these fictional, world-destroying, artificial intelligence systems and those in our current age of AI are seemingly never-ending. As a child born with a lightsaber in hand, I find these sensationalist remarks endlessly entertaining. Not only because it baffles me to see concepts once relegated to the realm of science fiction be discussed as serious news topics, but also because they’ve got their references all mixed up. The current challenge posed by the new wave of generative artificial intelligence doesn’t come in the form of a ruthless, gun-toting Arnie. It comes in the form of replicants. Just like these uncannily human androids from Ridley Scott’s cult classic Blade Runner, the rapidly increasing capacity of AI to talk, look and create like humans is beginning to blur the line between what is authentically human and what is the product of an algorithm. From the posh C3P0 to the snarky Cortana, having a friendly AI sidekick has always been a childhood dream of mine. This dream has now become a reality with the rise in AI chat-bots. At the forefront of these is Replika, an app that enables users to talk to their own personalized AI via the use of text-like messages. For its two million users (1), Replika provides a variety of functions. For some, Replika acts as a friend in times of loneliness; a feature that contributed to its spike in users during the height of the COVID-19 pandemic (2). For others, as founder Eugenia Kuyda suggests, it provides a space for users to “open up” about personal or mental health issues and “feel accepted” by a human-like figure (1). For many though, Replika is a digital romantic partner. While it is easy to snicker at the concept of an AI girlfriend, those with past relationship trauma or those living in environments that may be hostile towards their sexuality have used Replika as an outlet to explore genuine feelings of love in a safe setting (3). However, with such attachment comes the chance for exploitation. As stated by Nir Eisikovits, Director of the Applied Ethics Centre at the University of Massachusetts, his concern is “not whether machines are sentient” but rather our own tendency “to imagine that they are” (4). Like the holographic billboards for the AI “JOI” in Blade Runner 2049, suggestive advertisements and aggressive flirting by the AI itself have all been employed by Replika to encourage users to stay on the app and pay a premium subscription for explicit content (5). While Replika has since removed sexual material, the large backlash from users at this decision (6) highlights the unethically coercive power such mimicry of human personality could have on consumers. For years, we’ve been warned of the danger of manipulative TV advertisements encouraging excessive junk food consumption and gambling. Imagine what could be done when that ad is no longer a 30 second video but instead an anthropomorphized AI tailored exactly to you, your interests and your vulnerabilities. Not only is AI replicating the way we talk, but also how we look. From videos of an animated Tom Cruise to convincing photos of a Balenciaga-wearing Pope (7), advanced deepfake videos and prompt-generated images from AI systems like DALL-E are becoming easier to create by the day (8). 
While the most prominent use of this technology is currently in the form of harmless memes, it can be, and has been, used for more sinister means. Women across the world have had their faces used in non-consensual deepfake pornography, often as a form of revenge or blackmail (9). Furthermore, a fabricated video of Volodymyr Zelensky surrendering to Vladimir Putin that spread on social media last year demonstrates AI's unsettling potential in political disinformation (8). While fakes like that of Zelensky may have been taken down quickly due to easily identifiable tells, in many cases the damage has already been done the moment people see these videos or images. Mistrust in the news is heightened and real evidence can be accused of being AI-generated, a strategy already implemented by Donald Trump to dismiss evidence of his misogyny (8). Although the current usage of this technology is concerning enough, the degradation of truth within society will only worsen as these replicants become increasingly accurate and faster to produce (8). Still, it is the ability of AI to complete jobs once thought to be uniquely human that will result in the largest change to the current status quo. The latest estimates from Goldman Sachs state that close to 300 million jobs globally could be automated by the current AI wave (10). The threat of job losses due to automation is far from new, stretching all the way back to 1811 with the infamous Luddites protesting factory machines (11). However, generative AI is placing a greater variety of jobs in jeopardy due to its ability to emulate human creativity, giving rise to what Stanford Professor Victor R. Lee calls an "authenticity crisis" (12). One of those jobs is that of writers. A common phrase amongst movie reviewers today is "this could have been written by an AI". While usually used as a jab against the latest Marvel movie, large language models like ChatGPT that are capable of identifying and mimicking patterns in writing make it more than just a joke. Amongst calls for better conditions for screenwriters, a key demand from the Writers Guild of America in this year's Los Angeles writers' strike was that AI not be used to write or rewrite scripts (13). When you combine the growing authenticity of these AI with the greedy desires of major studios, it is not a stretch to suggest that producers may use AI to quickly generate scripts for generic soap operas and cash-grab Netflix movies, leaving the human creatives to simply 'clean up' these stories at a cut pay rate. Despite all these concerns, generative AI does have the ability to immeasurably improve society. The capacity of this technology to increase workplace efficiency (10), accelerate scientific progress (14) and constantly amuse us with clips of a rapping Joe Biden is undeniable. With the cat out of the bag, innovation in these areas cannot, and should not, be halted completely. However, if sci-fi movies have taught me anything useful, it's that we should not be blinded by the potential of scientific progress. Whether it be through governmental action to regulate the use of AI in industry or the scientific development of better deepfake-spotting technology to help stifle disinformation, implementing safeguards around AI is crucial in avoiding its "ethical debt" (15). Whilst looking to the world of science fiction as an indication of our future may be a bit far-fetched, it may also be a needed reminder of the world scientists should try not to replicate. References Tong A.
AI company restores erotic role play after backlash from users ‘married’ to their bots [Internet]. The Sydney Morning Herald. 2023 [cited 2023 May 14]. Available from: https://www.smh.com.au/world/north-america/ai-company-restores-erotic-roleplay-after-backlash-from-users-married-to-their-bots-20230326-p5cvao.html Clarke L. ‘I learned to love the bot’: meet the chatbots that want to be your best friend. The Observer [Internet]. 2023 Mar 19 [cited 2023 May 14]; Available from: https://www.theguardian.com/technology/2023/mar/19/i-learned-to-love-the-bot-meet-the-chatbots-that-want-to-be-your-best-friend The rise and fall of replika [Internet]. [cited 2023 May 14]. Available from: https://www.youtube.com/watch?v=3WSKKolgL2U Eisikovits N. AI isn’t close to becoming sentient – the real danger lies in how easily we’re prone to anthropomorphize it [Internet]. The Conversation. 2023 [cited 2023 May 14]. Available from: http://theconversation.com/ai-isnt-close-to-becoming-sentient-the-real-danger-lies-in-how-easily-were-prone-to-anthropomorphize-it-200525 Cole S. ‘My ai is sexually harassing me’: replika users say the chatbot has gotten way too horny [Internet]. Vice. 2023 [cited 2023 May 14]. Available from: https://www.vice.com/en/article/z34d43/my-ai-is-sexually-harassing-me-replika-chatbot-nudes ‘My wife is dead’: How a software update ‘lobotomised’ these online lovers. ABC News [Internet]. 2023 Feb 28 [cited 2023 May 14]; Available from: https://www.abc.net.au/news/science/2023-03-01/replika-users-fell-in-love-with-their-ai-chatbot-companion/102028196 How to spot an ai-generated image like the ‘balenciaga pope’ [Internet]. Time. 2023 [cited 2023 May 14]. Available from: https://time.com/6266606/how-to-spot-deepfake-pope/ Wong M. We haven’t seen the worst of fake news [Internet]. The Atlantic. 2022 [cited 2023 May 14]. Available from: https://www.theatlantic.com/technology/archive/2022/12/deepfake-synthetic-media-technology-rise-disinformation/672519/ Atillah IE. AI could make deepfake porn an even bigger threat for women [Internet]. euronews. 2023 [cited 2023 May 14]. Available from: https://www.euronews.com/next/2023/04/22/a-lifelong-sentence-the-women-trapped-in-a-deepfake-porn-hell Toh M. 300 million jobs could be affected by latest wave of AI, says Goldman Sachs | CNN Business [Internet]. CNN. 2023 [cited 2023 May 14]. Available from: https://www.cnn.com/2023/03/29/tech/chatgpt-ai-automation-jobs-impact-intl-hnk/index.html McClelland C. The impact of artificial intelligence - widespread job losses [Internet]. IoT For All. 2023 [cited 2023 May 14]. Available from: https://www.iotforall.com/impact-of-artificial-intelligence-job-losses Hollywood writers are on strike over an AI threat that some are warning is coming for you next. ABC News [Internet]. 2023 May 5 [cited 2023 May 14]; Available from: https://www.abc.net.au/news/2023-05-06/hollywood-writer-s-strike-over-pay-and-artificial-intelligence/102296704 Lee VR. Generative AI is forcing people to rethink what it means to be authentic [Internet]. The Conversation. 2023 [cited 2023 May 14]. Available from: http://theconversation.com/generative-ai-is-forcing-people-to-rethink-what-it-means-to-be-authentic-204347 The AI revolution in science [Internet]. [cited 2023 May 14]. Available from: https://www.science.org/content/article/ai-revolution-science Fiesler C. AI has social consequences, but who pays the price? Tech companies’ problem with ‘ethical debt’ [Internet]. The Conversation. 2023 [cited 2023 May 14]. 
Available from: http://theconversation.com/ai-has-social-consequences-but-who-pays-the-price-tech-companies-problem-with-ethical-debt-203375

  • Big Bang To Black Holes: Probing the Illusionary Nature of Time | OmniSci Magazine

    < Back to Issue 4 Big Bang To Black Holes: Probing the Illusionary Nature of Time by Mahsa Nabizada 1 July 2023 Edited by Elijah McEvoy and Caitlin Kane Illustrated by Aisyah Mohammad Sulhanuddin Time is ubiquitous: it governs our daily lives, marking our existence from birth to death. We measure time in seconds, minutes, hours, days or years, using man-made tools like clocks and calendars which reinforce the perception that it is tangible and objective. In fact, the most used noun in English is 'time' (1). However, delving into the realms of science and philosophy, the true nature of time becomes illusionary. We can acknowledge that our personal perception of time is inherently subjective. Our experiences of time vary depending on our surroundings, emotional state and physical state. For example, while time may seem to drag on when we're bored or anxious, it can pass quickly when we're having a good time. Although we imagine time to be objective, it could be merely an illusion resulting from the limitations of our perceptions and the conditions of our observation. Exploring these questions requires scientific perspectives, so let's delve into the enigmatic physics of time. In three-dimensional space, physical locations are fixed, meaning that we can revisit the same place repeatedly. For example, we may visit our favourite restaurant as many times as we wish. However, this is not the case with time. Time only moves forward, and we cannot go back to a previous moment; it belongs to the past and cannot be retrieved (2). This unidirectional nature of time is referred to as the arrow of time. Time is believed to originate from the Big Bang, the event that marked the beginning of the universe (3). From that point, time has progressed towards the present, where you are currently reading this article, and it continues to move into the future. The second law of thermodynamics, which governs entropy, plays a crucial role in explaining the forward movement of this arrow of time (4). Entropy refers to the degree of disorder, uncertainty, or randomness in a system; it can be thought of as a measure of the disorder present in the universe. At the moment of the Big Bang, the universe had low entropy, with matter and energy concentrated and organised. However, since that initial state, the universe has been expanding and matter has been spreading apart, leading to an increase in entropy and transforming the universe into a high-entropy system. The concepts of the arrow of time and entropy, guided by the second law of thermodynamics, allow for a distinction between the past and the future and play a pivotal role in the existence of life. Without entropy and the resulting change, there would be no discernible difference between events that occurred 1000 years ago and events happening in the present. Furthermore, the progression of life from birth to death can be explained through the phenomenon of entropy, as governed by the second law. However, on the quantum level, the behaviour of particles becomes more complex. Just as there is no inherent forward or backward direction in vast space, at the molecular level, the concept of entropy is not as apparent. While time appears to have a clear direction on the macroscopic level, when observing the particles that make up the universe, time can flow and operate in multiple directions. The laws of physics that govern these particles do not distinguish between the past and the future. They describe the behaviours of physical systems without differentiating between temporal directions.
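To make the arrow-of-time discussion above concrete, the second law the article refers to is usually written as a simple inequality. The statement below is a standard textbook form rather than anything given in the original piece; the symbols S (entropy), t (time), k_B (Boltzmann's constant) and W (the number of microscopic arrangements, or microstates) are introduced here purely for illustration.

```latex
% Standard statement of the second law for an isolated system:
% entropy S never decreases as time t moves forward.
\[
\frac{dS}{dt} \geq 0
\]
% Boltzmann's statistical definition relates entropy to the number of
% microstates W compatible with a macroscopic state (k_B is Boltzmann's constant):
\[
S = k_B \ln W
\]
```

Because the early universe corresponds to a very low-W, highly ordered state, the inequality singles out one direction of time as "forward", which is the distinction between past and future that the article describes.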
The theory of general relativity, proposed by Albert Einstein, provides a fundamental framework for understanding the workings of spacetime (5). According to the theory of general relativity, the presence of mass or energy causes a distortion in the fabric of spacetime, which in turn affects the motion of other objects. For example, it describes gravity as the curvature of spacetime caused by the presence of mass and energy. Essentially, spacetime can be thought of as a flexible fabric that is influenced by both gravity and velocity. This theory has not only illuminated the behaviour of celestial bodies and the vast structure of the universe, but also enhanced our understanding of the intricate interplay between space, time, and matter. Within Einstein's theories, time dilation is a scientific phenomenon that can be explored through a thought experiment known as the twin paradox (6). It demonstrates how the perception of time can vary between two individuals who experience different levels of motion or gravitational forces. Time dilation is not limited to the twin paradox or space travel; it is a fundamental concept in understanding the relationship between time, motion, and gravity. It has been experimentally confirmed and plays a significant role in our understanding of the universe. Imagine you, Twin A, are stationary on Earth while your sister, Twin B, is travelling in a rocket at a constant speed. Because of the rocket's motion, Twin B's clock will appear slower to Twin A, since her path through spacetime is longer; this is the effect of special relativity and time dilation. Therefore, from Twin A's perspective on Earth, time seems to pass slower on the moving rocket. However, from Twin B's perspective, Twin A is the one in motion and therefore Twin A's clock appears slower to her. Each frame of reference seems to indicate that the other's clock is slower, which appears contradictory. In reality, both observations are correct because the laws of physics remain the same in both frames of reference. Now, the question arises: who is actually younger? According to each twin's viewpoint, the other twin is younger. However, in reality, only one twin can have aged less than the other. Fortunately, there is a resolution to this paradox. When Twin B turns around to return to Earth, she undergoes acceleration, which breaks the symmetry between the two frames of reference: the twins are no longer equivalent, uniformly moving observers. As a result, Twin B will be younger than her Earth-bound sister, Twin A, upon returning to Earth due to the effects of acceleration. To explain this effect during the period of acceleration, we need to consider that general relativity causes time dilation in the presence of gravitational fields. Gravitational time dilation means that clocks run slower in stronger gravitational fields compared to clocks in weaker gravitational fields. During the acceleration phase, as Twin B's rocket turns back towards Earth, her time appears to run slower, while the clock on Earth appears to run faster. This phenomenon is similar to the extreme time dilation experienced near the boundary of a black hole, known as the event horizon (7). From the observer's frame of reference outside the black hole, time slows as an object approaches the event horizon, until it appears time has stopped. Hence an object falling into the black hole would appear to stop, completely frozen. Even though it governs our daily lives and despite our ability to measure it with great accuracy, there is no definitive answer to what time truly is.
From the subjective experiences of our daily lives to the enigmatic physics of the Big Bang and black holes, the illusionary nature of time unveils an array of complexities, reminding us that this fundamental concept remains one of the most captivating mysteries of our existence. As famously stated by Einstein: "For us believing physicists, the distinction between past, present, and future is only a stubbornly persistent illusion" (8). References Study: "Time" Is Most Often Used Noun [Internet]. www.cbsnews.com. 2006. Available from: https://www.cbsnews.com/news/study-time-is-most-often-used-noun/ Davies P. The arrow of time. Astronomy & Geophysics [Internet]. 2005 Feb 1 [cited 2023 Jun 4];46(1):1.26–9. Available from: https://academic.oup.com/astrogeo/article/46/1/1.26/253257 University of Western Australia. Evidence for the Big Bang [Internet]. 2014 p. 1–4. Available from: https://www.uwa.edu.au/study/-/media/Faculties/Science/Docs/Evidence-for-the-Big-Bang.pdf Hall N. Second Law - Entropy [Internet]. Glenn Research Center | NASA. 2023. Available from: https://www1.grc.nasa.gov/beginners-guide-to-aeronautics/second-law-entropy/ Norton JD. General Relativity [Internet]. sites.pitt.edu. 2001 [cited 2022 Feb]. Available from: https://sites.pitt.edu/~jdnorton/teaching/HPS_0410/chapters/general_relativity/ Perkowitz S. Twin paradox | physics | Britannica. In: Encyclopædia Britannica [Internet]. 2020 [cited 2023 Jun 14]. Available from: https://www.britannica.com/science/twin-paradox Hadi H, Atazadeh K, Darabi F. Quantum time dilation in the near-horizon region of a black hole. Physics Letters B [Internet]. 2022 Nov 10 [cited 2023 Jun 11];834:137471. Available from: https://www.sciencedirect.com/science/article/pii/S0370269322006050 A Debate Over the Physics of Time | Quanta Magazine [Internet]. Quanta Magazine. 2016. Available from: https://www.quantamagazine.org/a-debate-over-the-physics-of-time-20160719/

  • Bionics: Seeing into the Future | OmniSci Magazine

    Exciting technological leaps are being made in the futuristic field of visual prostheses. Australians suffering from visual impairment can be helped by emerging treatments including Bionic Eyes: a sight for sore eyes. This piece takes a look at the prevalent impairments and our ocular opportunities to treat them. Bionics: Seeing into the Future By Joshua Nicholls While the Bionic Eye might seem like a technology of the far future, exciting advancements are being made in the field of visual prostheses. This piece points a keen eye at emerging treatments for some of the most prominent diseases, along with their possible bionic treatments. Issue 1: September 24, 2021 Illustration by Friday Kennedy Visual prostheses, colloquially known as bionic eyes, are a set of experimental devices designed to restore — or partially restore — vision to those with varying levels of blindness (1). While once viewed as "science fiction", these technologies are becoming a reality for thousands of Australians with visual impairments. Since its inception in 1956 by the Australian inventor Graham Tassicker (2), the idea of restoring vision using electronics has undergone several developments, ranging from rudimentary cortical stimulation to modern advancements in state-of-the-art retinal implants. As of 2018, it was estimated that over 13 million Australians have some form of visual impairment. Of these 13 million, 411,000 have cataracts, or clouding of the lens; 244,000 have macular degeneration, which degrades fine detail vision; and 133,000 are either partially or entirely blind (3,4). The economic burden of blindness in Australia is substantial. In 2009, it was estimated that the total cost of vision loss per person aged 40 and over was $28,905 — a nationwide total of 16.6 billion AUD (5). [Figure 1: Categorisation of Total Economic Cost of Vision Loss in 2009 (5)] Age-related macular degeneration (AMD) is one condition for which visual prosthetics may be applicable. AMD refers to the irreversible loss of high-acuity, colour-sensitive cone cells in the centre field of vision. This region of the retina, the macula, is responsible for reading, recognising faces, driving, and other visual tasks that require sharp focal vision. In fact, you are using these cells to read this article right now. Its typical onset is later in life, affecting 12% of people aged 80 or over (6). As the leading chronic eye condition for elderly Australians (7), it accounts for 48% of all cases of blindness nationwide (8). According to the AIHW (4), there is also a higher prevalence amongst females than males — between 4.9–6.8% and 3.6–5.1%, respectively. Macular degeneration exists in two forms: dry and wet. Dry macular degeneration is caused by thinning of the macula; it is the most common form of the disease and progresses slowly over many years. Wet macular degeneration is a potentially more severe variation of the disease which is caused by the sudden development of leaky blood vessels around the macula (9). With no known cure — and most treatments being directed towards prevention and delaying progression — interventions relying on prosthetics may be the best hope for the restoration of lost eyesight (10). Graham Tassicker was the first to realise the potential utility of electrical stimulation in restoring sight to those with vision loss.
In 1956, Tassicker developed a photosensitive selenium cell which, when placed behind the retina, resulted in phosphene visualisation — the phenomenon of seeing light without light actually entering the eye (2). This was the first evidence of non-cortical stimulation to elicit visual experience. It was in the 1990s that visual prostheses underwent radical development; sophisticated retinal surgeries and the creation of new biomaterials led to a surge of novel inventions, including miniaturised cortical implants and artificial retinas — the latter of which is the most advanced to date. One state-of-the-art retinal bionic system has recently undergone clinical trial research: the Argus II Retinal Stimulation System. The Argus is an epiretinal (above the retina) implant designed by SecondSight; as of 2013, it was FDA-approved for retinitis pigmentosa (RP), but it also has potential utility for dry AMD. It consists of a device implanted in the patient's eye and an external processing unit worn by the user. The implant carries sixty electrodes, each 200 micrometres in diameter. Images captured by a small camera mounted on glasses are converted into electrical impulses that stimulate surviving ganglion cells on the retina. It is currently the most widely used retinal prosthetic system in the world, with more than 350 RP patients being treated to date. The cost of this device is 150,000 USD — a price that excludes surgery and post-operative training (11). [Figure 2: The design of the Argus II (12)] In 2015, a study was performed by the Argus II study group on the impact the implant would have on restoring visual function to subjects who had complete blindness from RP. The results from this study were quite promising; all 30 patients who received the Argus II system performed significantly better on a white square test than they did without the prosthesis. (None of the subjects scored any points with the device absent.) The Argus also proved reliable: 29 subjects still had functioning devices after three years (13). In 2020, a clinical trial of this device for dry AMD was completed. The study, which consisted of five patients, assessed the safety and feasibility of the device. According to Mills et al. (14), no patients reported confusion when operating the Argus alongside their healthy peripheral vision. Adverse events occurred in two patients who experienced proliferative vitreoretinopathy — or tractional retinal detachment. However, due to recent events surrounding the COVID-19 pandemic, the company declared that they would be performing "an orderly wind-down of the company's operations". SecondSight is now focusing on a new device: the Orion. This device is designed to stimulate the visual cortex of the brain — a return to the original conception of visual prosthetics. The Orion is planned to expand the pool of patients who are eligible for visual prosthetics. It essentially bypasses the need for healthy ganglion cells and a functioning optic nerve, which retinal prosthetics require. The only forms of blindness not addressed by this technique are congenital blindness and cortical blindness, in which the visual cortex area V1 itself has been damaged. The Orion is modelled after the Argus II, with its 60 cortical-stimulating electrodes receiving input from a camera on the user's glasses.
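To give a feel for what a 60-electrode array implies for image detail, the sketch below block-averages a grayscale camera frame down to roughly 60 values. It is purely illustrative: the 6x10 grid, the numpy-based averaging and the idea of mapping one averaged intensity to one electrode are assumptions made here for explanation, not a description of the Argus II's or Orion's actual image-processing pipeline.

```python
# Illustrative sketch only: reduce a grayscale camera frame to a coarse grid of
# ~60 values, one per hypothetical electrode. Not the real device's algorithm.
import numpy as np

def downsample_to_electrode_grid(frame: np.ndarray, rows: int = 6, cols: int = 10) -> np.ndarray:
    """Average a 2D grayscale frame into a coarse (rows x cols) grid of intensities."""
    h, w = frame.shape
    # Trim the frame so its dimensions divide evenly into the grid.
    frame = frame[: h - h % rows, : w - w % cols]
    # Split into (rows x cols) blocks and take the mean of each block.
    blocks = frame.reshape(rows, frame.shape[0] // rows, cols, frame.shape[1] // cols)
    return blocks.mean(axis=(1, 3))

if __name__ == "__main__":
    camera_frame = np.random.rand(480, 640)  # stand-in for a captured grayscale image
    grid = downsample_to_electrode_grid(camera_frame)
    print(grid.shape)  # (6, 10): one averaged value per assumed electrode position
```

Each of the 60 averaged values would correspond, very loosely, to one electrode's stimulation level, which is why the article later contrasts 60 "pixels" with the roughly 96 million photoreceptors of a healthy retina.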
Under the Breakthrough Device Pathway, the FDA approved Orion for an early feasibility study. Six human subjects have been fitted with the device — one woman and five men between the ages of 29 and 57. Of these six, one had endophthalmitis, two had glaucoma, and three suffered trauma. After one year of wearing the device, four of the patients could accurately discern the location of a palm-sized white square on a computer screen, and five could locate its movement in space. The Orion has shown a good safety profile after 12 months of use, and follow-ups on its progress will occur for five years (15). Visual prostheses have a promising and bright future of development ahead of them. While the field is still in its infancy, the results of ongoing clinical trials show promise for sight restoration. With multiple models and modes of intervention available, artificial vision is slowly becoming a reality for the visually impaired, but further developments in the field are still required. It would be exciting to see advancements from mere two-dimensional grey-scale images to the rich, three-dimensional, and full-colour experience that we take for granted as normal vision. For now, two essential factors need to be improved for the full realisation of artificial vision: cost and electrode density. The Argus costs 150,000 USD — an expense that excludes surgery and training. This figure may be prohibitive for the thousands of Australians who would benefit from such a device. If the current trend of Moore's Law continues, electrode density will increase whilst the cost of the device will decrease — a trend analogous to the increase in power and fall in price of computers over the last century. This pixel density will hopefully improve to the point of achieving near-normal visual acuity. The 60 pixels, while helpful in regaining some functionality, cannot compare to the some 96 million photoreceptor cells in the retina — 5 million of which are located in the cone-dense macula. Nevertheless, artificial vision is an exciting and innovative technology currently under development. While much research is still needed, further advancements in bionics will one day make visual prosthetics a ubiquitous and affordable technology for those in need. About the writer: Joshua Nicholls was the 2021 winner of the Let's Torque competition. Joshua: I am a 5th-year neuroscience and biochemistry student at the Swinburne University of Technology. I finished my Health Science degree a few years ago, majoring in neuroscience. I am now completing my final few subjects in my Bachelor of Science, with biochemistry as my major. For the state-wide Let's Torque competition, I changed my pitch to artificial vision, hence its title, Bionics: Seeing into the Future—a catchy pun, if I do say so myself. I made the rather complex topic of visual prosthetics approachable and understandable to the general audience by, as stated previously, conveying a story. I asked my audience to consider losing vision, if not completely, at least partially. Considering this, I then asked them to imagine what life must be like for the some 13 million Australians who suffer from some form of visual impairment. This exercise brought home the very real phenomenon of visual impairment, by which many of us have been—or will be—impacted. The solution for currently untreatable vision loss is already underway: the Bionic Eye, as it is colloquially known.
While it may sound like science fiction, bionics (or prosthetics) are nothing new; artificial hearing through the cochlear implant and artificial limbs are becoming rather ubiquitous. I briefly detailed a few diseases for which visual prosthetics may be appropriate, such as age-related macular degeneration and retinitis pigmentosa, and spoke about past and current clinical trials demonstrating their efficacy. To end my pitch, I talked about the lasting impact these devices will have on people’s lives and the future developments required. In doing so, I relayed the past, present, and future of the bionic eye, which detailed a coherent and relatable story to my audience. I was successful in my pitch and won first place among the state! It was an absolute privilege even to have been a part of this competition; coming first place was an added honour and will remain one of the highlights of my life. I believe this experience will serve as a footstone toward my career in science and science communication. If anyone has any desires to get their foot in the door of this field, get your name and face out there and just go for it! References: Ong, J. M., & da Cruz, L. (2012). The bionic eye: a review. Clinical & experimental ophthalmology, 40(1), 6-17. Tassicker, G. (1956). Preliminary report on a retinal stimulator. The British journal of physiological optics, 13(2), 102-105. Australian Bureau of Statistics. (2018). National Health Survey: First Results, 2017–18. Canberra: ABS Retrieved from https://www.abs.gov.au/statistics/health/health-conditions-and-risks/national-health-survey-first-results/latest-release Australian Institute of Health and Welfare. (2021). Eye health. Canberra: AIHW Retrieved from https://www.aihw.gov.au/reports/eye-health/eye-health Taylor, P., Bilgrami, A., & Pezzullo, L. (2010). Clear focus: The economic impact of vision loss in Australia in 2009. Vision2020. Retrieved from https://www.vision2020australia.org.au/wp-content/uploads/2019/06/Access_Economics_Clear_Focus_Full_Report.pdf Mehta, S. (2015). Age-related macular degeneration. Primary Care: Clinics in Office Practice, 42(3), 377-391. Foreman, J., Xie, J., Keel, S., van Wijngaarden, P., Sandhu, S. S., Ang, G. S., . . . Taylor, H. R. (2017). The prevalence and causes of vision loss in Indigenous and non-Indigenous Australians: the National eye health survey. Ophthalmology, 124(12), 1743-1752. Taylor, H. R., Keeffe, J. E., Vu, H. T. V., Wang, J. J., Rochtchina, E., Mitchell, P., & Pezzullo, M. L. (2005). Vision loss in Australia. Med J Aust, 182(11), 565-568. doi:10.5694/j.1326-5377.2005.tb06815.x Calabrese, A., Bernard, J.-B., Hoffart, L., Faure, G., Barouch, F., Conrath, J., & Castet, E. (2011). Wet versus dry age-related macular degeneration in patients with central field loss: different effects on maximum reading speed. Investigative ophthalmology & visual science, 52(5), 2417-2424. Cheung, L. K., & Eaton, A. (2013). Age‐related macular degeneration. Pharmacotherapy: The Journal of Human Pharmacology and Drug Therapy, 33(8), 838-855. Luo, Y. H.-L., & Da Cruz, L. (2016). The Argus® II retinal prosthesis system. Progress in retinal and eye research, 50, 89-107. SecondSight. (2021). SecondSight: Life in a New Light. Retrieved from https://secondsight.com/ Ho, A. C., Humayun, M. S., Dorn, J. D., Da Cruz, L., Dagnelie, G., Handa, J., . . . Hafezi, F. (2015). Long-term results from an epiretinal prosthesis to restore sight to the blind. Ophthalmology, 122(8), 1547-1554. Mills, J., Jalil, A., & Stanga, P. (2017). 
Electronic retinal implants and artificial vision: journey and present. Eye, 31(10), 1383-1398. Pouratian N., Yoshor D., & Greenberg R. (2019). Orion Visual Cortical Prosthesis System Early Feasibility Study: Interim Results. Paper presented at American Academy of Ophthalmology Annual Meeting.

  • Serial Killers | OmniSci Magazine

    < Back to Issue 5 Serial Killers Selin Duran 24 October 2023 Edited by Yasmin Potts Illustrated by Aditya Dey Serial killers. Do we love them or hate them? It's hard to know, especially as the media surrounding them is increasing. From fictional to real-life killers, our society is obsessed with giving a voice and perspective to these people. We have movies, documentaries, TV series and even YouTube videos recounting the lives and stories of killers. Despite this, people rarely stop to ask themselves why they enjoy this style of media - some of the most wicked and gruesome acts, glorified for the interest of many. Yet, every day we are met with new shows highlighting the lives of cold-blooded killers. But why are we interested in them? It's mostly a morbid curiosity; as humans, we are drawn to crime. We want to know why people choose to kill and how they do it. Jack Haskins, a University of Tennessee journalism professor, noted that "humans [are] drawn to public spectacles involving bloody death...Morbid curiosity, if not inborn, is at least learned at a very early age" (UPI Archives, 1984). As a collective, we have always wanted to explore the horrid acts of those who kill. But it's only with the help of modern media that people enjoy them. Media loves a good story - and what makes a good story? A crazy serial killer on the loose. One of the earliest movies about a serial killer is Fritz Lang's 1931 film M. Set in Berlin, the film details a killer who targets children. Since then, a wave of fictional serial killer movies has taken society by storm. Serial killer media were all the rage during the mid-80s and 90s, when the largest amount of such content was produced. One of the most popular works is director Alfred Hitchcock's iconic Psycho, which was nominated for four Academy Awards (IMDb, 2021). What is truly disturbing is the story of this film. Norman Bates, our killer, is deemed mentally insane and suffers from Dissociative Identity Disorder. Through his personality changes, he proceeds to kill two people during the film, in addition to multiple murders not depicted. Yet, when he is jailed, we learn that his actions were the result of abuse he endured when he was younger. Suddenly, we're forced to feel sympathetic towards his situation. How can that be a reasonable justification for murder, and why do we applaud the film for this? As a society, accepting murder based on mental insanity seems more than unreasonable - but no one has questioned it thus far. This unfortunately happens not only with fictional killers, but with real ones. Our interest in killers turns into a way to inform ourselves of these situations (Harrison, 2023). We look to these documentaries and podcasts that tell the stories of the most notorious serial killers to learn something and prevent the situation from happening to us. All whilst indulging in content that emphasises these killers as being regular people, not evil individuals, who committed crimes for personal pleasure. We don't need to see a biopic about the ventures of Ted Bundy and Jeffrey Dahmer. Yet the second you search their names on Google, an all-star cast portraying the life of a man who tortured his victims fills your screen. This is certainly not an ethical thing to endorse. Despite this, not a single person thinks twice about it due to how common it is. Directors are profiting off victims and as a society, we are allowing it because of our curiosity. What happened to compassion? Because I certainly believe we have lost it.
We have become so infatuated with killers that their actions seem unimportant to us. We yearn to discover more about their lives and forget that real people were implicated in these events. These killer stories provide bursts of short-lived adrenaline and then we return to our normal lives. In forgetting the consequences of these real stories, we are in many ways as bad as the killers themselves. And that is truly wicked. References Harrison, M. A. (2023). Why Are We Interested in Serial Killers? Just as Deadly: The Psychology of Female Serial Killers. Cambridge: Cambridge University Press, 17–31. https://www.cambridge.org/core/books/just-as-deadly/why-are-we-interested-in-serial-killers/B35C2243B387273749EA164318C27623 IMDb. (2021). Psycho (1960) - Awards. https://www.imdb.com/title/tt0054215/awards/ UPI Archives. (1984). Few answers on origin of morbid curiosity. UPI. https://www.upi.com/Archives/1984/04/07/Few-answers-on-origin-of-morbid-curiosity/7976450162000/

  • Hiccups | OmniSci Magazine

    < Back to Issue 2 Hiccups Evolution might be a theory, but if it’s evidence you’re after, there’s no need to look further than your own body. The human form is full of fascinating parts and functions that hold hidden histories - from the column that brought you a deep-dive into ear wiggling in Issue 1, here’s an exploration of why we hiccup! by Rachel Ko 10 December 2021 Edited by Katherine Tweedie and Ashleigh Hallinan Illustrated by Gemma Van der Hurk Hiccups bring a special brand of chaos to a day. It’s one that lingers, rendering us helpless and in suspense; a subtle, internal chaos of quiet frustration that forces us to drop what we’re doing to monitor each breath – in and out, in and out – until the moment they abruptly decide to stop. It’s an experience we’ve all had – one that can hit anyone at any time – and for most of us, hiccups are a concentrated episode of inconvenience; best ignored, and overcome. Yet, despite our haste to get rid of them when they interrupt our day, hiccups seem to have mystified humans for generations. Historically, the phenomenon has been the source of many superstitions, both good and bad. A range of cultures associate them with the concept of remembrance: in Russia, hiccups mean someone is missing you (1), while an Indian myth suggests that someone is remembering you negatively for the evils you have committed (2). Likewise, in Ancient Greece, hiccups were a sign that you were being complained about (3), while in Hungary, they mean you are currently the subject of gossip. On a darker note, a Japanese superstition prophesises death to one who hiccups 100 times. (4) Clearly, the need to justify everything, even things as trivial as hiccups, has always been an inherent human characteristic, transcending culture and time. As such, science has more recently made its attempt at objectively identifying a reason behind the strange phenomenon of hiccups. After all, if you take a step back and think about it, hiccups are indeed quite strange. Anatomically, hiccups (known scientifically as singultus) are involuntary spasms of the diaphragm (5): the dome-like sheet of muscle separating the chest and abdominal cavities. (6) The inspiratory muscles, including the intercostal and neck muscles, also spasm, while the expiratory muscles are inhibited. (7) These sudden contractions cause a rapid intake of air (“hic”), followed by the immediate closure of the glottis or vocal cords (“up”). (8) As many of us have probably experienced, a range of stimuli can cause these involuntary contractions. The physical stimuli include anything that stretches and bloats the stomach, (9) such as overeating, rapid food consumption and gulping, especially of carbonated drinks. (10) Emotionally, intense feelings and our responses to them, such as laughing, sobbing, anxiety and excitement, can also be triggers. (11) This list is not at all exhaustive; in fact, the range of stimuli is so large that hiccups might be considered the common thread between a drunk man, a Parkinson’s disease patient and anyone who watches The Notebook. The one thing that alcohol, (12) some neurological drugs (13) and intense sobbing (14) do have in common is that they exogenously stimulate the hiccup reflex arc. (15) This arc involves the vagal and phrenic nerves that stretch from the brainstem to the abdomen which cause the diaphragm to contract involuntarily. 
(16) According to Professor Georg Petroianu from the Herbert Wertheim College of Medicine, (17) many familiar home remedies for hiccupping – being scared, swallowing ice, drinking water upside down – interrupt this reflex arc, actually giving these solutions a somewhat scientific rationale. While modern research has successfully mapped out the process of hiccups, their purpose is still unclear. As of now, the hiccup reflex arc and the resulting diaphragmatic spasms seem to be effectively useless. Of the existing theories for the function of hiccups, the most prominent seems to be that they are a remnant of our evolutionary development, (18) essentially ‘vestigial’; in this case, a feature that once served our amphibian ancestors millions of years ago, but now retain little of their original function. (19) In particular, hiccups are believed to be a relic of the ancient transition of organisms from water to land. (20) When early fish lived in stagnant waters with little oxygen, they developed lungs to take advantage of the air overhead, in addition to using gills while underwater. (21) In this system, inhalation would allow water to move over the gills, during which a rapid closure of the glottis – which we see now in hiccupping – would prevent water from entering the lungs. It is theorised that when descendants of these fish moved onto land, gills were lost, but the neural circuit for this glottis closing mechanism was retained. (22) This neural circuit is indeed observable in human beings today, in the form of the hiccup central pattern generator (CPG). (23) CPGs exist for other oscillating actions like breathing and walking, (24) but a particular cross-species CPG stands out as a link to human hiccupping: the neural CPG that is also used by tadpoles for gill ventilation. Tadpoles “breathe” in a recurring, rhythmic pattern that shares a fundamental characteristic feature with hiccups: both involve inspiration with closing of the glottis. (25) This phenomenon strengthens the idea that the hiccup CPG may be left over from a previous stage in evolution and has been retained in both humans and frogs. However, the CPG in frogs is still used for ventilation, while in humans, the evolution of lungs to replace gills has rendered it useless. (26) Based on this information, it seems hiccupping lost its function with time and the development of the human lungs, remaining as nothing more than an evolutionary remnant. However, we cannot discredit hiccupping as having become entirely useless as soon as gills were lost. Interestingly, hiccupping has only been observed in mammals – not in birds, lizards or other air-breathing animals. (27) This suggests that there must have been some evolutionary advantage to hiccupping at some point, at least in mammals. A popular theory for this function stems from the uniquely mammalian trait of nursing. (28) Considering the fact that human babies hiccup in the womb even before birth, this theory considers hiccupping to be almost a glorified burp, intended to remove air from the stomach. This becomes particularly advantageous when closing the glottis prevents milk from entering the lungs, aiding the act of nursing. (29) Today, we reduce hiccups to the disorder and disarray they bring to our day. But, next time you are hit with a bout of hiccups, take a second to find some calm amidst the chaos and appreciate yet another fascinating evolutionary fossil, before you hurry to dismiss them. 
After that, feel free to eat those lemons or gargle that salty water to your diaphragm’s content. References Sonya Vatomsky, "7 Cures For Hiccups From World Folklore," Mentalfloss.Com, 2017, https://www.mentalfloss.com/article/500937/7-cures-hiccups-world-folklore. Derek Lue, "Indian Superstition: Hiccups | Dartmouth Folklore Archive," Journeys.Dartmouth.Edu, 2018, https://journeys.dartmouth.edu/folklorearchive/2018/11/14/indian-superstition-hiccups/. Vatomsky, "7 Cures For Hiccups From World Folklore". James Mundy, "10 Most Interesting Superstitions In Japanese Culture | Insidejapan Tours," Insidejapan Blog, 2013, https://www.insidejapantours.com/blog/2013/07/08/10-most-interesting-superstitions-in-japanese-culture/. Paul Rousseau, "Hiccups," Southern Medical Journal, no. 88, 2 (1995): 175-181, doi:10.1097/00007611-199502000-00002. Bruno Bordoni and Emiliano Zanier, "Anatomic Connections Of The Diaphragm Influence Of Respiration On The Body System," Journal Of Multidisciplinary Healthcare, no. 6 (2013): 281, doi:10.2147/jmdh.s45443. Christian Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," Bioessays no. 25, 2 (2003): 182-188, doi:10.1002/bies.10224. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188. John Cameron, “Why Do We Hiccup?,” filmed for TedEd, 2016, TED Video, https://ed.ted.com/lessons/why-do-we-hiccup-john-cameron#watch. Monika Steger, Markus Schneemann, and Mark Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," Alimentary Pharmacology & Therapeutics 42, no. 9 (. 2015): 1037-1050, doi:10.1111/apt.13374. Lien-Fu Lin, and Pi-Teh Huang, "An Uncommon Cause Of Hiccups: Sarcoidosis Presenting Solely As Hiccups," Journal Of The Chinese Medical Association 73, no. 12 (2010): 647-650, doi:10.1016/s1726-4901(10)70141-6. Steger, Schneemann and Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," 1037-1050. Unax Lertxundi et al., "Hiccups In Parkinson’s Disease: An Analysis Of Cases Reported In The European Pharmacovigilance Database And A Review Of The Literature," European Journal Of Clinical Pharmacology 73, no. 9 (2017): 1159-1164, doi:10.1007/s00228-017-2275-6. Lin and Huang, "An Uncommon Cause Of Hiccups: Sarcoidosis Presenting Solely As Hiccups," 647-650. Peter J. Kahrilas and Guoxiang Shi, "Why Do We Hiccup?" Gut 41, no. 5 (1997): 712-713, doi:10.1136/gut.41.5.712. Steger, Schneemann and Fox, "Systemic Review: The Pathogenesis And Pharmacological Treatment Of Hiccups," 1037-1050. Georg A. Petroianu, "Treatment Of Hiccup By Vagal Maneuvers," Journal Of The History Of The Neurosciences 24, no. 2 (2014): 123-136, doi:10.1080/0964704x.2014.897133. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188. Cameron, “Why Do We Hiccup?” Michael Mosley, "Anatomical Clues To Human Evolution From Fish," BBC News, published 2011, https://www.bbc.com/news/health-13278255. Michael Hedrick and Stephen Katz, "Control Of Breathing In Primitive Fishes," Phylogeny, Anatomy And Physiology Of Ancient Fishes (2015): 179-200, doi:10.1201/b18798-9. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188. Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188. Pierre A. Guertin, "Central Pattern Generator For Locomotion: Anatomical, Physiological, And Pathophysiological Considerations," Frontiers In Neurology 3 (2013), doi:10.3389/fneur.2012.00183. Hedrick and Katz, "Control Of Breathing In Primitive Fishes," 179-200. 
Straus et al., "A Phylogenetic Hypothesis For The Origin Of Hiccough," 182-188. Daniel Howes, "Hiccups: A New Explanation For The Mysterious Reflex," Bioessays 34, no. 6 (2012): 451-453, doi:10.1002/bies.201100194. Howes, "Hiccups: A New Explanation For The Mysterious Reflex," 451-453. [1] Howes, "Hiccups: A New Explanation For The Mysterious Reflex," 451-453. Previous article back to DISORDER Next article

  • Thinking Outside the Body: The Consciousness of Slime Moulds | OmniSci Magazine

< Back to Issue 8 Thinking Outside the Body: The Consciousness of Slime Moulds by Jessica Walton 3 June 2025 Edited by Han Chong Illustrated by Ashlee Yeo

Imagine yourself as an urban planner for Tokyo's public transport system in 1927. Imagine mapping out the most efficient paths through dense urban sprawl, around obstructing rivers and mountains. And imagine meticulously designing the most efficient possible model, after years of study and expertise… only to find your design prowess, 83 years later, matched by a slime mould: a creature with no eyes, no head nor limbs, nor nervous system.

Of course, this is anachronistic. For one, the Tokyo railroad system developed over time, not all at once. But it was designed to meet the needs of the city and maximise efficiency. Yet in 2010, when researchers exposed the slime mould Physarum polycephalum to a plate laid out like Tokyo (with population density represented by oat flakes), it almost exactly reproduced the Tokyo railroad network (1). This became one of the most iconic slime mould experiments, ushering in a flood of research about biological urban design and asking the question: could a slime mould, or another similar organism, map out human cities for us?

But a slime mould doesn't know what cities are. They're single-celled organisms; they don't understand urban planning, or public transport, or humans. They are classified as protists, largely because we're not sure how else to categorise them, not because they're particularly 'protist-y.' They have no brain and are single-celled for most of their life; so they can't plan routes, have preferences, or make memories. Right?

Except, perhaps they can. Slime moulds are extremely well-studied organisms because they exhibit precisely these behaviours. But how do they think? And what does it mean, to think?

Slime moulds have shown evidence of memory and learning. The protoplasmic network they form is really just one huge cell, a plasmodium, which grows and eventually releases spores. While plasmodial slime moulds (like P. polycephalum) form this structure as part of their normal life cycle and reproduction, cellular slime moulds (dictyostelids) aggregate into a similar collective body only when food is scarce or environments are difficult (meaning they must be able to detect and evaluate whether these things are true). Most slime mould behaviour is understood through cell signalling and extracellular interaction mechanisms: receptors along the membrane detect chemical gradients and signal the cell to move up the concentration gradient of a chemoattractant molecule and away from a chemorepellent. This makes sense; bacteria (like almost every other living organism) do this all the time, and it's the chief way that they make decisions. But what about memory and preferences? What about stimuli beyond the immediately detected chemicals? Slime moulds can, for example, anticipate repeated events and avoid simple traps to reach food hidden behind a U-shaped barrier (2,3). These behaviours go beyond simple input-to-output; something more complex must be happening. Something conscious? Thinking?

The idea that consciousness requires complex neuronal processes is rapidly becoming outdated as we observe patterns of thinking in organisms that, according to classical definitions, really should not be able to think. Using the slime mould as an example, Sims and Kiverstein (2022) argue against the 'neurocentric' assumption that an organism must have a brain to be cognisant. Instead, they suggest that P. polycephalum exhibits spatial memory, and that cognition can sometimes include external elements (3). They showed it may undergo simple, habitual learning and hypothesised that it uses an oscillation-based mechanism within the cell (3). Consistent with this, oscillator units along the slime mould's extending tendrils oscillate at a higher frequency at higher concentrations of food-source molecules (like some tasty glucose), signalling to the slime mould to move in that direction (4).

Sims and Kiverstein (2022) also posit that the slime trail left behind by a slime mould could function as an external memory mechanism. P. polycephalum avoids slime trails, as they represent places it has already been, suggesting a method of spatial memory (4). That this is not a pure input-output response was shown by the fact that the avoidance can be overridden when food is placed on or near slime trails (5). The slime mould appears able to balance multiple inputs, including oscillation levels and slime-trail signals, exhibiting simple decision-making.
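This balancing of inputs can be made concrete with a small thought experiment in code. The sketch below is a toy grid-world forager, not the model used in any of the cited papers: each neighbouring cell gets a score that rises with a made-up attractant field, falls if the cell is coated in the agent's own trail, and the trail penalty is waived when the local food signal is strong enough. The grid, food positions, weights and thresholds are all invented for illustration.

```python
# Toy "slime-style" forager: decisions weigh a chemoattractant gradient
# against an externalised memory (its own slime trail), and a strong food
# signal can override trail avoidance. Purely illustrative; the field,
# weights and thresholds are made up, not taken from the cited studies.
import random

GRID = 15
# A made-up attractant field: two "oat flake" food sources diffusing outward.
FOOD = [(3, 12), (11, 2)]

def attractant(x, y):
    """Attractant concentration at a cell, decaying with distance from food."""
    return sum(1.0 / (1 + abs(x - fx) + abs(y - fy)) for fx, fy in FOOD)

def step(pos, trail, trail_penalty=0.3, override_level=0.4):
    """Pick the neighbouring cell with the best combined score."""
    x, y = pos
    best, best_score = pos, float("-inf")
    for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
        nx, ny = x + dx, y + dy
        if not (0 <= nx < GRID and 0 <= ny < GRID):
            continue
        score = attractant(nx, ny)
        # Avoid places already visited -- unless the food signal there is
        # strong, in which case the trail "memory" is overridden.
        if (nx, ny) in trail and score < override_level:
            score -= trail_penalty
        score += random.uniform(0, 0.01)   # tiny noise to break ties
        if score > best_score:
            best, best_score = (nx, ny), score
    return best

if __name__ == "__main__":
    random.seed(0)
    pos, trail = (7, 7), set()
    for _ in range(40):
        trail.add(pos)
        pos = step(pos, trail)
    print("finished at", pos, "after visiting", len(trail), "cells")
```

Even this cartoon captures the flavour of the behaviour described above: a gradient pulls the agent towards food, while its self-made trail discourages retracing old ground unless the food signal is strong enough to override it.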
Should we count these processes as thinking? This topic is debated by philosophers as much as by biologists. Sims and Kiverstein (2022) use the Hypothesis of Extended Cognition, the idea that mind sometimes extends into the environment outside of the brain and body, to argue firmly that it does count. But at the end of the day, despite understanding the chemical and electrical processes of neuronal signalling and the cellular makeup of the brain, we still don't understand how electrical signals passing along a series of axons make the leap to complex consciousness. Rudimentary and external cognition pathways, as seen in the slime mould, may also be an evolutionary link in the building blocks of more complex, nerve-based consciousness and decision-making (3). We don't yet understand the phenomenon inside our own skulls; how can we hope to define it across all other organisms?

Slime moulds clearly have something beyond simple chemical reactions. This raises the question: aren't our own minds also fundamentally just made of simple chemical reactions? And if a slime mould is able to evaluate multiple inputs, how wonderfully complex must such processes be inside (and outside) a sea anemone, a cockroach or a cat? There's no way to know what such a consciousness would look like or feel like from our frame of reference. When a slime mould, moving as a network around an agar plate, 'looks up' (or performs an equivalent slime mould action) and perceives unfathomable entities, how does it process that? What does the slime mould think of us?

Bibliography

1. Kay R, Mattacchione A, Katrycz C, Hatton BD. Stepwise slime mould growth as a template for urban design. Sci Rep. 2022 Jan 25;12(1):1322.
2. Saigusa T, Tero A, Nakagaki T, Kuramoto Y. Amoebae Anticipate Periodic Events. Phys Rev Lett. 2008 Jan 3;100(1):018101.
3. Sims M, Kiverstein J. Externalized memory in slime mould and the extended (non-neuronal) mind. Cognitive Systems Research. 2022 Jun 1;73:26–35.
4. Reid CR, Latty T, Dussutour A, Beekman M. Slime mold uses an externalized spatial "memory" to navigate in complex environments. Proc Natl Acad Sci U S A. 2012 Oct 23;109(43):17490–4.
5. Reid CR, Beekman M, Latty T, Dussutour A. Amoeboid organism uses extracellular secretions to make smart foraging decisions. Behavioral Ecology. 2013 Jul;24(4):812–8.

  • Building the Lightsaber | OmniSci Magazine

< Back to Issue 2 Building the Lightsaber Some of the most iconic movie gadgets are the oldest ones. For this issue we look at how the lightsaber was brought to life. by Manthila Ranatunga 10 December 2021 Edited by Sam Williams and Tanya Kovacevic Illustrated by Rohith S Prabhu

Star Wars: A New Hope was a massive success when it hit cinemas back in 1977. It was a groundbreaking sensation in science fiction filmmaking and in the use of computer-generated imagery (CGI). What really caught many fans' eyes was, of course, the lightsaber. Also referred to as a "laser sword", it is described as "an elegant weapon, for a more civilised age". Now, in our civilised age, we have decided to replicate this dangerous weapon. Lightsabers have already been built by a few enthusiasts. For this piece, we will be focusing on Hacksmith Industries' lightsaber build from 2020, as it is the closest to the real deal.

Fig. 1. "Hacksmith Industries' latest lightsaber build", Hacksmith Industries, 4000° PLASMA PROTO-LIGHTSABER BUILD, 2020.

Hacksmith Industries was founded by James Hobson, an engineer who builds real-life versions of film and video game gadgets. After multiple attempts, the team managed to fabricate a retractable, plasma-based lightsaber. However, this is not a real lightsaber but more so a protosaber in the Star Wars universe. We will get back to this point later on.

How do they work?

Let us first talk about how lightsabers work in the movies. A lightsaber consists of three parts: the hilt, the Kyber crystal and the blade itself. Similar to a traditional sword, the hilt is the handle and is made of a durable metal such as aluminium. It contains the Kyber crystal, which is a rare crystal found in the Star Wars universe and is the power source of the lightsaber. Moving on to the more interesting part, the blade is a beam of plasma. Often called "the fourth state of matter", plasma is created by heating gas to temperatures as high as 2,500 degrees Celsius. A battery inside the hilt activates the crystal. The produced plasma is then focused through a lens and directed outwards. An electromagnetic field, essentially a force field, generated at the hilt contains the plasma in a defined beam and directs it back into the hilt. The crystal absorbs the energy and recycles it. Hence lightsabers are extremely energy-efficient, allowing Jedi Knights to use them for their whole lifetimes.

Fig. 2. Robert W. Schönholz, Blue Lightsaber, c.2016.

Of course, the lightsaber breaks the laws of physics. Electromagnetic fields do not work as they do on fictional planets like Coruscant. Energy-dense power sources such as Kyber crystals do not exist in real life, which leads us to the protosaber. In Star Wars lore, a protosaber is a lightsaber with an external power source. It was the predecessor to the lightsaber when Kyber crystals could not be contained inside the hilt. Since real-life high-energy sources cannot be squished into the hilt, Hacksmith Industries' lightsaber build is reminiscent of the early protosaber.

The build

The engineers at Hacksmith Industries settled on liquefied petroleum gas (LPG) as the power source, the same gas used for home heating systems and barbecues. This gas is fed through the brass and copper hilt, and is burnt continuously to keep producing plasma. To form the beam shape of the blade, they relied on laminar flow of the gas. Ever seen videos of "frozen" water coming out of taps? Laminar flow occurs when layers of fluid molecules, in this case LPG, flow in parallel without mixing. In this instance, a smooth beam is created. Unlike actual lightsabers, the beam does not return to the hilt to be absorbed.
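Whether a gas stream stays laminar or breaks up into turbulence is conventionally estimated with the Reynolds number, Re = ρvD/μ (density × velocity × nozzle diameter ÷ dynamic viscosity), with pipe-like flow usually treated as laminar below roughly Re ≈ 2300. The snippet below simply evaluates that formula: the density and viscosity are rough textbook-style values for propane gas, while the nozzle diameter and exit velocity are invented placeholders rather than figures from the Hacksmith build.

```python
# Back-of-the-envelope laminar-flow check using the Reynolds number,
# Re = rho * v * D / mu. The gas properties are approximate values for
# propane at room temperature; the nozzle diameter and velocity are
# hypothetical placeholders, not measurements from the actual build.

def reynolds_number(density, velocity, diameter, viscosity):
    """Dimensionless Reynolds number for flow through a circular nozzle."""
    return density * velocity * diameter / viscosity

if __name__ == "__main__":
    rho = 1.8       # kg/m^3, approx. density of propane gas at ~25 °C
    mu = 8.0e-6     # Pa·s, approx. dynamic viscosity of propane gas
    d = 0.01        # m, assumed nozzle diameter (placeholder)
    v = 0.5         # m/s, assumed exit velocity (placeholder)

    re = reynolds_number(rho, v, d, mu)
    regime = "laminar" if re < 2300 else "turbulent"  # common rule of thumb for pipes
    print(f"Re ≈ {re:.0f} -> likely {regime}")
```

With these made-up numbers the flow comes out comfortably laminar; pushing the velocity or nozzle diameter up quickly tips the value past the threshold, which is why nozzle design matters for keeping the "blade" looking like a solid beam.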
Of course, to be a lightsaber, it has to function like one, too. The plasma is extremely hot, reaching up to 2,200 degrees Celsius. Therefore, it can cut through metal and other objects much like we see in the movies. This also means contact with the blade can lead to serious or even fatal injuries. The external power supply is in the form of a backpack, with mounted LPG canisters and supporting electronics. Overall, the build looks, feels and works like a real lightsaber, which makes it a pretty accurate replica. However, we do not have the Force or ancient Jedi wisdom, so there are some notable imperfections in the design.

Fig. 3. "Finished lightsaber build", Hacksmith Industries, 4000° PLASMA PROTO-LIGHTSABER BUILD, 2020.

Colours

Lightsabers come in a variety of colours, each reflecting the wielder's moral values in Star Wars canon. Blue, for example, represents justice and protection. Green, blue and red are the most commonly seen in the movies, but lightsabers also come in purple, orange, yellow, white and black. If you did high school science, you may remember colouring Bunsen burner flames with salts. The same principle applies here; salts can be mixed into the plasma to colour the blade. For example, strontium chloride gives a red colour, so you can finally live out your Sith fantasies.

Fig. 4. "Lightsaber colours by mixing salts", Hacksmith Industries, 4000° PLASMA PROTO-LIGHTSABER BUILD, 2020.

Improvements

The downside of using plasma is that we cannot fight with it: blades would pass right through each other without clashing. To fix this, a metal rod that can withstand high temperatures, such as tungsten, could form the core of the blade with a beam of plasma around it. However, this means the lightsaber would not be retractable, which defeats the purpose. To keep the blade coloured, salts have to be continuously fed through the hilt. This can be done with another pressurised canister alongside the LPG, although it requires extra space.

Despite the imperfections, the protosaber by Hacksmith Industries is the closest prototype to a real-life lightsaber. With constantly evolving technology, we may eventually be able to build a more compact model that more closely resembles those in the movies. Makers all around the world are building cool movie gadgets like the lightsaber, so keep a lookout for your favourite ones. You never know what the nerds may bring!

References

1. Amy Tikkanen, "Star Wars", Britannica, published April 10, 2008, https://www.britannica.com/topic/Star-Wars-film-series.
2, 4, 7. Hacksmith Industries, "4000° PLASMA PROTO-LIGHTSABER BUILD (RETRACTABLE BLADE!)", October 2020, YouTube video, 18:15, https://www.youtube.com/watch?v=xC6J4T_hUKg.
3. Joshua Sostrin, "Keeping it real with the Hacksmith", YouTube Official Blog (blog), November 12, 2020, https://blog.youtube/creator-and-artist-stories/the-hacksmith-10-million-subscribers/.
5. Daniel Kolitz, "Are Lightsabers Theoretically Possible?", Gizmodo, published August 10, 2021, https://www.gizmodo.com.au/2021/08/are-lightsabers-theoretically-possible/.
6. Richard Rogers, "Lightsaber Battery Analysis", Arbin Instruments: News, published October 3, 2019, https://www.arbin.com/lightsaber-battery-analysis/.
8. Phil Edwards, "Star Wars lightsaber colors, explained", Vox, published May 4, 2015, https://www.vox.com/2015/5/31/8689811/lightsaber-colors-star-wars.

  • Peaks and Perspectives: A Word from the Editors-in-Chief | OmniSci Magazine

Issue 7: Apex | 22 October 2024
This issue surveys our world from above. So come along, and revel in the expansive view - have a read below!

• Editorial | Peaks and Perspectives: A Word from the Editors-in-Chief, by the Editors-in-Chief. A word from our Editors-in-Chief.
• Corals | A Coral's Story: From thriving reef to desolation, by Nicola Zuzek-Mayer. Nicola sheds light on the devastating future faced by our coral reefs, with the effects of anthropogenic climate change far from having reached their peak.
• Humans vs Pathogens | Staying at the Top of Our Game: the Evolutionary Arms Race, by Aizere Malibek. As nations vie for military supremacy, Aizere covers a microscopic competition between humans and the microbes evolving strategies against our defences.
• Seeing Space | Interstellar Overdrive: Secrets of our Distant Universe, by Sarah Ibrahimi. Embark on an epic journey as Sarah explores the cosmic mysteries being revealed by NASA's James Webb Space Telescope.
• Fossil Markets | Fossil Markets: Under the Gavel, Under Scrutiny, by Jesse Allen. Diving into the wild world of fossil auctions, Jesse prompts us to ask: who is the real apex predator, the T. rex or hedge-fund billionaires?
• Cancer Treatments | Tip of the Iceberg: An Overview of Cancer Treatment Breakthroughs, by Arwen Nguyen-Ngo. Icebreakers. Follow Arwen as she recounts the countless stories of the giants before us, who carved a path for our cancer research today.
• Triangles | Pointing the Way: A Triangular View of the World, by Ingrid Sefton. Guiding us through land, seas and screens, Ingrid explores this humble three-sided shape as a vital tool of modern society and its many fascinating uses.
• Anti-ageing Science | Timeless Titans: Billionaires defying death, by Holly McNaughton. From billionaire-backed pills to young blood transfusions, Holly traverses the futuristic world of anti-ageing and asks: what happens when death is no longer inevitable?
• Brain-computer Implants | Neuralink: Mind Over Matter?, by Kara Miwa-Dale. Would the ability to control a computer with your mind bolster possibilities or bring harm? Kara visualises a possible future under the Neuralink implant.
• Fish Morphology | Designing the perfect fish, by Andy Shin. With a splash of creativity, Andy concocts the ultimate 'Frankenfish' by investigating the traits that allow fish to flourish in their aquatic environments.
• Commercial Aviation | Soaring Heights: An Ode to the Airliner, by Aisyah Mohammad Sulhanuddin. Settle in and take a round trip with Aisyah through the evolution of commercial aviation, from the secrets of aircraft cuisine to the mechanics of staying afloat.

  • ISSUES | OmniSci Magazine

Issues
Check out all our issues of OmniSci Magazine!

• Cover: Anabelle Dewi Saraswati, 28 October 2025
• Issue 8 – Cover: May Du, 3 June 2025
• Issue 7: Apex – Cover: Ingrid Sefton, 22 October 2024
• Issue 6: Elemental – Cover: Louise Cen, 28 May 2024
• Issue 5: Wicked – Cover: Aisyah Mohammad Sulhanuddin, 24 October 2023
• ISSUE 4: MIRAGE – Cover: Gemma van der Hurk, 1 July 2023
• ISSUE 3: ALIEN – Cover: Ravon Chew, 10 September 2022
• SUMMER ISSUE 2022: A Year In Science – Cover: Quynh Anh Nguyen, 23 March 2023
• ISSUE 2: DISORDER – Cover: Janna Dingle, 10 December 2021
• ISSUE 1: Science is Everywhere – Cover: Cheryl Seah, 24 December 2021

  • Echidnas: Gentle Courters In The Competitive Animal Kingdom | OmniSci Magazine

< Back to Issue 4 Echidnas: Gentle Courters In The Competitive Animal Kingdom by Emily Siwing Xia 1 July 2023 Edited by Maddison Moore and Arwen Nguyen-Ngo Illustrated by Christy Yung

When we think of animals or nature in competition, we picture aggression and savagery over resources such as food, territory and mates. Beyond aggression, however, the variety of animal behaviour associated with competition for resources is immense. A gentle form of competition is the bizarre mating ritual of one member of our own unique Australian fauna: the echidna.

Known scientifically as Tachyglossus aculeatus and commonly as spiny anteaters, echidnas are quill-covered animals living in Australia and New Guinea. Since Australia is so isolated from other continents, our fauna has often been regarded by outsiders with an air of mystery and awe. To start with, echidnas belong to the same group as the famed platypus: the monotremes (egg-laying mammals). Surviving monotreme species can only be found in Australia and New Guinea. The four species of echidnas, along with their duck-billed cousin, are the very few surviving members of this classification. Despite the similarities in name and appearance, both being covered with hollow, spiny quills, these spiny anteaters are not actually closely related, genetically or evolutionarily, to the more well-known anteaters of the Americas. Echidnas feed on a diet of ants and termites, using their electroreceptive beaks to find burrowing prey before digging them out with their hind claws. These powerful claws are long and curved backwards, specially designed for digging. Funnily, when the British Museum received an echidna specimen, they switched the backward claws frontwards, thinking that it was a mistake.

As mentioned before, mating rituals can be a violent (even bloody) ordeal in nature. From barbed penises in cats and deadly fights for females in elephant seals, straight to sexual cannibalism in praying mantises, there seem to be endless examples of brutality in the animal world. However, behind these brutal images is another side of nature that seems gentle and even humorous at times: for example, the ritual of our spiny suitors.

Echidna mating rituals begin with the formation of a mating train. From June to September in Australia, male echidnas court by lining up, from beak tip to spiny bottom, behind a single female. These trains can have more than 10 males in line and last for days, even weeks, at a time. During the mating season, male echidnas may leave a train to join or form a different train behind another eligible female. Their mating efforts often lead males to travel long distances, even beyond their own home ranges. If the males get interrupted and lose track of the female, they reform their train by picking up her scent with their snouts in the air. They are such determined suitors that it is extremely difficult for a female echidna to evade them. Usually, one male remains through the long-winded process, and he gets to mate with the female.

The reason behind forming echidna trains is unknown, but scientists generally agree that it is connected to some type of selection process. One theory is that it aids the female in weeding out the weaker males by tiring them out until the last one remains. Another is that the female is waiting for the right male, the one she is interested in, to get behind her. Either way, it is a process of determination and perseverance.
On the exceedingly rare occasions where multiple suitors are still left at the end, the males dig a trench surrounding the female and compete through head bumping. Although there is still much we do not understand about head bumping, because it occurs so rarely, it is generally considered an echidna social behaviour that serves to maintain dominance. Head bumps are generally only given by dominant echidnas to subordinate echidnas who haven't recognised their dominance status and moved away. This rarely happens and is a relatively peaceful affair compared to conflicts in other animals. The winner of the head-bumping ritual then digs until the previously mentioned trench is deep enough for him to lie below the female so they can mate through their cloacas.

Around 23 days after copulation, the female lays a soft-shelled, leathery egg into a temporary pouch, where it continues to incubate for 10 more days before a tiny puggle (a baby echidna or platypus) hatches. The puggle drinks milk from the female's special mammary hairs until it is capable of feeding itself and is fully covered in spines and fur. At last, the young echidna leaves its mother's burrow to live independently.

The mating rules and practices amongst echidnas are a demonstration of patience and courtesy. This contrasts with the common public perception of nature as merciless, characterised by brutal competition for food, social status and mating opportunities. Although they are competing for a mate all the same, the lines of waddling echidnas are polite, organised and humorous. Behind the mask of brutality, nature continues to have its pleasant secrets.

References

Morrow G, Nicol SC. Cool Sex? Hibernation and Reproduction Overlap in the Echidna. PLoS One. 2009 Jun 29;4(6):e6070.
Echidna [Internet]. AZ Animals. [cited 2023 Jun 22]. Available from: https://a-z-animals.com/animals/echidna/
Anne Marie Musser. Echidna | Britannica [Internet]. 2023 [cited 2023 Jun 22]. Available from: https://www.britannica.com/animal/echidna-monotreme
Echidna trains: explained [Internet]. Australian Geographic. August 6, 2021 [cited 2023 Jun 22]. Available from: https://www.australiangeographic.com.au/topics/wildlife/2021/08/echidna-trains-explained/
Lindenfors P, Tullberg BS. Evolutionary aspects of aggression: the importance of sexual selection. Adv Genet. 2011;75:7–22.
Warm Your Heart With Videos of 'Echidna Love Trains' [Internet]. Atlas Obscura. September 1, 2017 [cited 2023 Jun 22]. Available from: http://www.atlasobscura.com/articles/echidna-love-trains

  • ​Meet OmniSci Writer Mahsa Nabizada | OmniSci Magazine

Doubting time is real? We spoke to first-year uni student Mahsa Nabizada about her upcoming article on this very topic, plus advice for starting university and why thorium has a special place in her heart.

Meet OmniSci writer Mahsa Nabizada

Mahsa is a writer at OmniSci and a first-year university student planning to study mathematical physics. For Issue 4: Mirage, she is writing about the illusion of time.

interviewed by Caitlin Kane

What are you studying?
I'm studying a Bachelor of Science, and I'm in my first year so I haven't majored yet, but what I'm looking to major in right now is mathematical physics.

Do you have any advice for yourself at the beginning of semester, the start of your uni journey?
First of all, take it easy. This is a new experience, not only moving out of home, but transitioning from high school to university. I think take your time adjusting to everything and be kind to yourself. Also, really be open to different opportunities, whether that's meeting new people or learning new topics and new areas. In high school, the fields you're exposed to are very limited but in university it's much broader. Just like the amount of clubs that are available or opportunities to meet people from different industries.

What first got you interested in science?
I have always found a natural inclination towards science subjects, and the amount of growth in the industry, whether advancements in technology or health… All of those things I can see the impact in society on the day to day and how it would impact the average person. There are new job descriptions being developed, areas that will be opened in five years. I guess the opportunities that are available, and the excitement and impact that STEM can make in society and to the average person.

Do you have a dream role as a scientist, like something that you've always imagined doing or that you're working towards?
I don't have a role in mind, but I do have things I'd love to be involved in. One of those things is research… development in any area, especially STEM areas. I think I'd love to be involved in some sort of research in a future role, no matter what area. I would love to be involved personally or professionally in some kind of community service, like volunteering to work with kids or high school students who are interested in STEM. In high school, I had people who spoke to me about STEM and I found that really helpful. Things like that do make a big impact on students and what they choose or what they are encouraged towards going forward. I would love to be working with a team of diverse professionals solving issues that affect people in society day-to-day. When diverse minds come together, there is opportunity for great things to come out of that. I think that is how I would like to make a positive impact.

What is your role at OmniSci?
I am a writer, and basically I'm given a platform to write an article on the theme, about something that I'm interested in. There's quite a lot of flexibility to that and part of the great thing about this role is that I'm also supported by an editor to help me with my ideas.

How did you get involved with OmniSci? What made you want to get involved?
In O-Week, I met someone who mentioned the club. It stuck in my head.
During week two or three, I was like I really want to join some clubs, ones that I can contribute in and make some friends, ones that would have some like-minded students in them. Hence, I became a member and I heard about the role of writer in the email.

Are there other roles or article ideas that you would be interested in trying in the future?
I definitely would like to keep writing. There is just so much in the astrophysics area that I'm interested in, but also in the STEM area in general. Moving forward I'd like to contribute as a writer interviewing really interesting people at our university, the University of Melbourne. I think we have some great researchers, amazing talented people, on different projects. As I've been supported by my editor and Editor-in-Chief, I would like to in the future also support other writers as an editor, or as part of another role in the club, to help other writers and members develop their ideas.

Can you give us a sneak peek of what you're working on this issue?
Examining the illusion of time is something that I've thought about before, how our perception of time on a day-to-day basis is subjective. Sometimes it flies by, sometimes it goes so slowly, and why we feel that. Because I come from a physics background, I wanted to bring physics into this and examine those experiences. Right now, I am at the writing stage on the experience of time, how it varies based on our surroundings, emotional state and physical state. It is possible that it's nothing more than an illusion created by the limitations of our perception and conditions of our observation. Moving forward I would like to explore this (it's a fascinating topic) and interview someone in the field of astrophysics more on the theory of relativity and how time moves relative to the observer, time's connection with gravity… that's where I'm at right now.

What do you like doing in your spare time (when you're not contributing at OmniSci)?
I enjoy reading about a variety of different topics, whether that's fiction, physics, different science areas, but also philosophy. I enjoy sometimes playing chess, hanging out with my friends, and I'm also into watching different plays. I watched Macbeth recently and I'm going to watch another play soon.

Do you have any recommendations for any books, articles, plays, other kinds of things that you've been getting into?
With plays I would say it can depend on what you like. If you find that a play is hard to read, I would suggest not giving up, and going and seeing if you can watch it. Sometimes that can be more engaging. With philosophy I just like researching… there's lots of different philosophical resources out there. I learn a lot when I'm talking to someone and they don't agree with me and I go in with an open mind. By the end of the conversation my opinion might have changed, or I might have learnt a completely new philosophical idea that might have changed my view on a certain issue.

Which chemical element would you name your firstborn child (or pet) after?
I would say... uranium or thorium. In grade eleven or grade twelve, my physics assignment was on nuclear power so I spent a lot of time researching uranium and thorium, and nuclear fusion, nuclear fission and nuclear power in general. I spent a lot of time, not just on my assignment, but in my own time learning about nuclear power and its future. Either of those, just because I've spent a lot of time researching it. I don't think a child, but potentially a pet if I run out of other ideas.
Is there anything else that you wanted to share with the OmniSci community?
I think the club in general is quite inspiring. The fact that most people are volunteers and students are taking initiative and time out of their schedule to be a part of this.

Read Mahsa's articles: Big Bang to Black Holes: Illusionary Nature of Time

OmniSci Magazine acknowledges the Traditional Owners and Custodians of the lands on which we live, work, and learn. We pay our respects to their Elders past and present.
