
  • Mental Time Travel: How Far Can I Remember? | OmniSci Magazine

Mental Time Travel: How Far Can I Remember? by Sophie Potvin 3 June 2025 Edited by Kara Miwa-Dale Illustrated by Elena Pilo Boyl Trigger warning: This article mentions mental illness and trauma... If at any point the content is distressing, please contact any of the support services listed at the end of the article. Mental Time Travel: How Far Can I Remember? I like to go back in time. Travel to places I have been to. See faces I have not seen in a while. Meet my younger self. See the world as new. As every memory slips through my fingers, I write the pages hoping not to forget anymore. How far can I remember? She opens her eyes, her head hammering as she puts her glasses on to ease the pain. The room is uncommonly empty; it almost echoes her thoughts. In the centre of the room is a teal box in the shape of a seahorse with the label “Recreate your favourite scenes!” This box is the hippocampus, the seahorse-shaped structure in the medial temporal lobe (MTL) of the brain that encodes the space and context of a memory. It is essential for associating information from the sensory cortices, binding it to context and sending it on to the rest of the brain. Confusion makes its way through her mind as a sheet appears on top of the box as if by magic. It says: “Pick a book, read the recipe, and put the right items in the teal seahorse box.” Did you know that every memory is a reconstruction, that a scene is made up every time you remember an event? She does not know it yet, but she will certainly learn that when these fragile pieces are brought back together in the hippocampus, she can relive a moment. Endless shelves of books and objects suddenly appear in rows and columns, like a grid, a playground. She notices that the shelf in front of her, the one wearing the tag “2025”, is half empty. The one next to it, with the sticker “2024”, is full.
She walks through a few rows, imagining what secrets are held in the books and between their lines. Her hand chooses the blue book “Costa Rica: Camaronal” and flips through the pages. These words are written in her handwriting: “starry sky, moonlight, high tide, sunburn, hammocks, turtles, beach, sunrise, sand, meetings, deck of cards”. She finds the objects at the end of the shelf and runs to the teal box. She can feel the air sticking to her skin and hear the waves crashing on the shore. This is the power of mental time travel: recollecting episodes of her life. The objects disappear from the box, the feeling goes away, but she wants more. She runs like a child and stops in front of the “2019” shelf to experience a Dungeons & Dragons Friday night with her high school friends. She is surprised to see that the list of objects for that memory is so short. She brings back the objects, but the hippocampus can only take her to a blurry place. Moments from six years ago are already a faint memory. Her curiosity takes over as she wonders how far she can remember. She finds the recipe for her last night of summer camp in 2013: “‘I Love It (feat. Charli XCX)’, dance, lights”. She sighs at the short list because she hates to forget, she really does. Her heart starts beating fast. Is her memory failing her? How bad can it be? She continues to wander down the aisles, but her eyes tear up as she thinks that she might be nothing without her memories; only a few objects are left, and most did not stand the test of time. As she reaches her early years, she notices the label “cognitive self” and the floor colour changes under her feet. The cognitive self is a knowledge structure that helps to integrate and bind memories from personal experiences. These experiences are added to an evolving self-consciousness.
Along with neurobiological changes in brain structures and the acquisition of language, this helps memories last longer and shapes a sense of being. At least she knows that she is someone. Intrigued, she brings all the objects she can find on the “2004” shelf, but there is no recipe to guide her, no story to be made. All the pieces are in the box, but nothing happens: no feelings, no breeze, no music. The memories made in the first two years of her life took the form of beliefs, habits or procedures; there is nothing she can consciously recollect. This inability to consciously recollect memories from one’s own early years of life is known as infantile amnesia. While waiting for the hippocampus box to work its magic, she loses patience and hits the box a few times, begging it to give her back her memories. She does not know that it is universal: cognitively healthy adults and nonhuman species like mice or birds all experience infantile amnesia. During infantile neurodevelopment, humans and other species like birds and rats undergo a critical period of learning for memory. Throughout critical periods, different functions, like language, sensory functions or memory (in this case, the hippocampal memory system), mature with experience. The presence of specific stimuli is essential for functional development, because without them, competence will be forever impaired. Her hippocampal system must have been exposed to a great number of experiences to ensure its maturation. It is working as it should. Inside her, a void of hopelessness sits in her chest because she feels like her brain is failing her; it is her against biology. She looks for clues in the fuller shelves, wondering where the memories could be hidden. Were memories ever stored or created? They were created, but the information was stored in latent form due to the immature mechanisms of the young hippocampus.
They can be activated under particular circumstances, but not recollected consciously. It is a failure of memory retrieval, not a failure of memory storage. She finds a trapdoor in the green floor, thinking pieces might be hidden in the basement. Events leave traces, whether full-fledged memories or only remnants, and during the critical period, deleterious experiences can have lifelong consequences on behaviour, affect and the development of psychopathologies. The trapdoor is too small for her to enter, warning her that she should not go down this road. She understands that some things are not meant to be found. The moments she cannot recollect are hiding in plain sight; they are embedded in her. Somehow, she learned from them. For a second, she hates the teal seahorse box. Then she looks at it in awe, terrified and amazed, at peace with herself. The hippocampus box starts to turn and Joe Dassin plays. Threads of light bind items and books together. It takes her back as far as she can go. Feelings. Moments. People. Episodes. Magic. Her. She opens her eyes, teal ink pen in her hand, as she writes these words. Some things I will never remember; My first steps on my two feet. The first time I met my sisters. Just old stories or memories handpicked from a field of photos; And in the end, I would be a stranger.
Support resources
  • Grief Australia (counselling services, support groups): https://www.grief.org.au/ga/ga/Get-Support.aspx?hkey=2876868e-8666-4ed2-a6a5-3d0ee6e86c30
  • Griefline (free telephone support, community forum and support groups): https://griefline.org.au/
  • Better Health Channel (coping strategies, list of support services, education on grief): https://www.betterhealth.vic.gov.au/health/servicesandsupport/grief
  • Beyond Blue (understanding grief, resources, support, counselling): https://www.beyondblue.org.au/mental-health/grief-and-loss
  • Lifeline (real stories, techniques & strategies, apps & tools, support guides): https://toolkit.lifeline.org.au/topics/grief-loss/what-is-grief?gclid=CjwKCAjw-KipBhBtEiwAWjgwrE1pJaaBabh3pT_UR0PlVBZTFMEA26NVJe2ue8sqCF0BLg2rMI4i2xoCp5IQAvD_BwE
  • Reach Out Australia (coping strategies): https://au.reachout.com/articles/working-through-grief?gclid=CjwKCAjw-KipBhBtEiwAWjgwrKXLb9w-wXXVLIbhZDkPumIF6ebe-0Pk77Hv7-cK4dLDrHJxCRkyRBoC2B4QAvD_BwE
  • Find a Helpline (international/country-specific helplines): https://findahelpline.com/
References
1. Li S, Callaghan BL, Richardson R. Infantile amnesia: forgotten but not gone. Learn Mem [Internet]. 2014 Mar [cited 2025 Mar 27];21(3):135–9. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3929851/
2. Donato F, Alberini CM, Amso D, Dragoi G, Dranovsky A, Newcombe NS. The Ontogeny of Hippocampus-Dependent Memories. J Neurosci [Internet]. 2021 Feb 3 [cited 2025 Mar 27];41(5):920–6. Available from: https://doi.org/10.1523/JNEUROSCI.1651-20.2020
3. Howe ML. Early Childhood Memories Are not Repressed: Either They Were Never Formed or Were Quickly Forgotten. Topics in Cognitive Science [Internet]. 2022 Jul 11 [cited 2025 Mar 27];16(4):707–17. Available from: https://onlinelibrary.wiley.com/doi/10.1111/tops.12636
4. Bauer PJ. Amnesia, Infantile. In: Benson JB, editor. Encyclopedia of Infant and Early Childhood Development (Second Edition) [Internet]. Oxford: Elsevier; 2020. p. 45–55 [cited 2025 Mar 27]. Available from: https://www.sciencedirect.com/science/article/pii/B9780128093245212078
5. Stoencheva B, Stoyanova K, Stoyanov D. Infantile Amnesia can be Operationalized as a Psychological Meta Norm in the Development of Memory. JIN [Internet]. 2025 Feb 10 [cited 2025 Mar 27];24(2):1–11. Available from: https://www.imrpress.com/journal/JIN/24/2/10.31083/JIN25889

  • Behind the Scenes of COVID-19 | OmniSci Magazine

Conversations in Science Behind the Scenes of COVID-19 with Dr Julian Druce By Zachary Holloway What will our future with COVID-19 look like? How do we live with it? How could it have been managed better? In conversation with Dr Julian Druce, a renowned expert in the field of virology. Edited by Caitlin Kane & Breana Galea Issue 1: September 24, 2021 Illustration by Janna Dingle Interview with Dr Julian Druce, head of the Virus Identification Laboratory at the Victorian Infectious Diseases Reference Laboratory. Before the middle of 2021, it seemed Australia was finally seeing the back of the COVID-19 pandemic: case numbers were down, the vaccine rollout was gaining momentum and Victoria had defeated the Delta variant twice. Fast forward to today, and the outlook doesn’t appear to be as rosy. Over a year and a half from when the pandemic began, it is still dominating headlines around the world. But like many in Australia, I still had many questions regarding the state of the pandemic, our path out of it and how scientists behind the scenes were shaping our public health response. I sat down in conversation with Dr Julian Druce hoping to find some of the answers to these questions. Zachary Holloway: What was the work you were conducting at the Victorian Infectious Diseases Reference Laboratory (VIDRL) before the COVID-19 pandemic? Dr Julian Druce: VIDRL itself is a public health reference laboratory, with a large focus on virology. For virology there are four main labs: one is a big serology laboratory which tests for antibodies, the footprints that a virus leaves after your immune system has interrogated that pathogen. The other labs are more focused on direct detection of some specific viruses: there’s an HIV-specific lab, a hepatitis-specific lab and then my lab, which focuses on all other viruses. These mostly use very specific PCR (polymerase chain reaction) tests for the detection of the virus.
Another option for rapidly detecting viruses that might be new is by having tests that, rather than detecting a specific virus, detect a family of viruses at once. They’re called consensus PCRs or pan-viral PCRs. One of those tests was a pan-coronavirus PCR, and that had been sitting in a freezer for thirteen years, only to be brought out at the start of 2020 when SARS-CoV-2 emerged, and that was the test we used to verify that we had the virus by sequencing the PCR product. ZH: I know that VIDRL was the first lab outside of China to grow SARS-CoV-2 in culture. What was the process for this, and how did this help in developing a standardised test for COVID-19? JD: My boss, Dr Mike Catton, and I had been on WHO [World Health Organisation] teleconference calls all through the preceding weeks where everyone was clamouring for someone to grow the virus. So I immediately put it up for culture on the Friday night when we detected it. This process puts a small amount of patient sample onto cells that may get infected with the virus. I came in on Sunday to check it, and thought something might be happening so put the flask of cells onto a camera that took photos every fifteen minutes. As soon as I checked this on Monday, I knew that it was growing because there was an obvious pattern in the cells that showed they were changing. In terms of having the cultured virus, it was then just a process of getting it out to other labs and collaborators. We gamma-irradiated some material and that material, which is killed, was a good positive control material for other laboratories to use to verify and validate their testing algorithms. Because at that point, there were only self-designed tests for COVID-19 in a few labs. This material was used to help validate all the labs around Melbourne and Australia as commercial tests became available to get them ready for testing. 
ZH: How important was genome sequencing for our contact tracers to be better able to track and trace the spread of the virus? JD: In general, roughly every two weeks the virus will generate one mutation somewhere. That mutation can be used to track the lineage – a bit like a family tree – and once that mutation goes from, say, me to you, you might get a new mutation when you pass it on to someone else. That mutation then becomes a key identifier for that strain. That really helped in tracking and tracing in the early days, to understand who was probably giving it to whom, even though contact tracing can often work that out. Importantly though, at that very early stage we closed our borders to China, but we left our borders open to America and Europe. So as cases were coming in from those countries, we had to do genomic sequencing to verify which strain, or lineage if you like, with which key mutations was showing up. We could then readily identify whether the samples were from Europe, America or the Ruby Princess, or from wherever there were new cases coming in. ZH: Has the increased infectivity of the Delta variant of SARS-CoV-2 beaten contact tracers and made Australia’s “COVID zero” strategy unachievable? JD: In terms of “COVID zero”, the national pandemic plan has always been to suppress the virus and flatten the curve, and the public health aim of that is to push the volume of cases down and stretch it out along a timeline. You might end up with the same numbers, but it’s stretched out across a year rather than one or two months, which would otherwise shatter your health system. But what we found early was that with a lot of goodwill and effort from the public, we did eliminate the virus. We didn’t necessarily expect to do that, so that was a lucky event. But with the Delta variant, it does seem that it spreads more efficiently: the calculated reproduction rate for this variant is about 3-4 or more, compared with about 2-3 for the original wild-type.
So this makes it much harder to eliminate. ZH: I think millions of people around the country want to know the answer to this question, but when will lockdowns stop being a viable strategy for containing this virus? Does it come with increasing vaccination, or could it continue after that? JD: It very much depends on what happens as we move forward. Of course, vaccination is the pathway out of this. As more people become vaccinated and less susceptible to serious disease and death, we will slowly transform this virus into a common cold, or at least that’s what is likely to happen. But I suspect that as we open up, if it all goes badly, we may have to have some level of restrictions to mitigate transmission. Some of this is already being discussed with entry passports, and people not being allowed into pubs, theatres, or wherever else there is close confinement in a natural or urban setting, unless they’re double-dosed. ZH: In retrospect, how will we rate the response to this pandemic? Was it proportional to the dangers it posed? JD: I think that will be debated for years. Every country has done it a little bit differently, from the worst end of the scale to the best end of the scale. Australia is probably on the better end, in terms of suppressing and eliminating the virus, but we haven’t done as well with the vaccine rollout. We’re getting there now – we’re catching up – but I think, generally, Australia will be viewed favourably as having had a good response. In Australia there’s a double-edged sword with vaccination uptake because we didn’t have the carnage that other countries had. But now that we’ve got the virus circulating again, that has prompted a greater uptake of the vaccine, which is a good thing.
Outside of Australia, I imagine the World Health Organisation will do an analysis of the generalised responses of different countries: from some of the poorer performers – like America and other countries that decided to let it rip, thinking that herd immunity was the best option – to the responses of other countries, mainly severe lockdowns, who suppressed and eliminated the virus. There are still many types of parameters to look at, from economic and socioeconomic to virological and epidemiological, a lot of elements still to tease apart when this is all done. Dr Julian Druce is the head of the Virus Identification Laboratory at the Victorian Infectious Diseases Reference Laboratory, where he works with a team to detect many of the viruses that infect humans and devises new ways to detect novel viruses. We would like to thank Dr Druce for taking the time to meet with us and discuss his work.

  • The Rise of The Planet of AI | OmniSci Magazine

The Rise of The Planet of AI By Ashley Mamuko When discussing AI, our minds instinctively jump to fears of sentience and robotic uprising. However, is our focus misplaced on an “inevitable” humanoid future when AI has already become ubiquitous and undetectable in our lives? Edited by Hamish Payne & Katherine Tweedie Issue 1: September 24, 2021 Illustration by Aisyah Mohammad Sulhanuddin On August 19th 2021, Tesla announced a bold project on its AI Day: the company plans to introduce humanoid robots for consumer use. These machines are expected to perform basic, mundane household tasks and slot easily into our everyday lives. With this announcement, the future of AI seems to be closing in. No longer do we stand idle, awaiting an inevitable humanoid-shaped future: the prototypes are expected to launch by 2022. It seems inevitable that our future will include AI. We have already familiarised ourselves with this emerging technology through the media we enjoy. WALL-E, Blade Runner, The Terminator and Ex Machina are only a few examples of an endless list of AI-related movies, spanning decades and detailing both our apprehension and our acceptance. Most of these movies portray machines as sentient yet intrinsically evil, bent on human destruction. But to understand the growing field of AI, it is important to first briefly introduce its history and development before turning to the concerns played up in Hollywood blockbusters. The first fundamental interpretations of artificial intelligence span a vast period of time. Its first acknowledgement may be attributed to the Catalan poet and theologian Ramon Llull. His 1308 work Ars generalis ultima (The Ultimate General Art) advanced a paper-based mechanical process that creates new knowledge from combinations of concepts. Llull aimed to create a method of deducing logical religious and philosophical truths mechanically.
In 1642, the French mathematician Blaise Pascal invented the first mechanical calculating machine, the first iteration of the modern calculator (1). The Pascaline, as it is now known, could only add or subtract values, using a dial-and-spoke system (2). Though these two early ideas do not match our modern perception of what AI is, they laid the foundation for pushing logical processes beyond purely mechanical means. These two instances in history foreshadow the use of mechanical devices to perform human cognitive functions. Not until the 1940s and early 1950s did we finally obtain the means for more complex data processing. With the introduction of computers, algorithms offered a more streamlined way of storing, computing and producing information. In 1943, Warren McCulloch and Walter Pitts founded the idea of artificial neural networks in their paper “A Logical Calculus of the Ideas Immanent in Nervous Activity” (3). This presented the notion of computers behaving similarly to the human mind and introduced what would become the subfield of “deep learning”. In 1950, Alan Turing proposed a test of whether a person could differentiate between human and machine behaviour: the Turing Test (also known as the Imitation Game) asked participants to identify whether the dialogue they were engaging in was with another person or with a machine (4). Despite these breakthroughs, the term “artificial intelligence” was not coined until 1955, by John McCarthy. McCarthy, along with many other budding experts, would later hold the famous 1956 Dartmouth College workshop (5). This meetup of a few scientists would later be pinpointed as the birth of the AI field. As the field continued to grow, more public concerns were raised alongside the boom of science fiction literature and film.
The notorious 1968 movie 2001: A Space Odyssey so shaped public perception of the field that by the 1970s an AI Winter had set in: very little notable progress was made, due in part to a fear-driven lack of funding (6). After some time had passed and further advances were made in algorithms, the notable Deep Blue chess match against Garry Kasparov took place. The event, in May 1997, in which the Deep Blue machine beat world chess champion Garry Kasparov, was read by some as quietly ushering in a “decline of human society” at the hands of the machine. Fast forward to now, and AI has advanced in leaps and bounds to achieve far more sophisticated algorithms and machine learning techniques. To further understand the uses of AI, I interviewed Dr Liz Sonenberg, a professor in the School of Computing and Information Systems at The University of Melbourne and Pro Vice-Chancellor (Research Infrastructure and Systems) in Chancellery Research and Enterprise. She is an expert in the field and has conducted a multitude of research projects. “Machine learning is simply a sophisticated algorithm to detect patterns in data sets that has a basis in statistics.” This kind of algorithm has been implemented in a variety of our daily tech encounters: AI sits behind Google Maps and navigation, as well as voice control. It can be found almost anywhere. “Just because these examples do not exhibit super intelligence, does not mean they are not useful,” Dr Sonenberg explains. Dr Sonenberg suggests that the real problem with AI lies in its fairness. These “pattern generating algorithms” at times “learn from training sets not representative of the whole population, which can end up with biased answers.” With a flawed training set, a flawed system is put in place. This can harm certain demographics and sway consumer habits.
With AI-aided advice, the explanations behind outcomes and decisions are not supported either. Algorithms can only mechanically produce an output, not explain it. As more high-stakes decisions are entrusted to the reliability of AI, the issue of flawed algorithms becomes more pronounced. In my interview with Dr Sonenberg, not once was the fear of super-intelligence, robot uprisings and the like brought up. Armed with this knowledge of AI’s current concerns, I conducted another interview with Dr Tim Miller, a Professor of Computer Science in the School of Computing and Information Systems at The University of Melbourne, and Dr Jeannie Paterson, a Professor teaching subjects in law and emerging technologies in the School of Law at The University of Melbourne. Both are also Co-Directors of the Centre for Artificial Intelligence and Digital Ethics (CAIDE). As we began the interview, Dr Miller explained again that AI “is not magic” but uses “math and statistics”. Dr Paterson was quick to point out that anti-discrimination laws have long been in place, but that as technology evolves and embeds itself further into the public domain, it must be scrutinised. The deployment of AI can easily cause harm, because the systems involved are not public, making sources of harm difficult to identify and causally attribute. With the prospect of biased algorithms, a difficult tension arises. Dr Miller elaborated with the example of AI in medical imaging in private hospitals: as private hospitals tend to attract a certain echelon of society, the training set is not representative of the greater population. “A dilemma occurs with racist algorithms… if it is not used [outcomes] could be worse.” When the idea of a super-intelligent robot emerging in the future was brought into the conversation, the two did not seem very impressed. “Don’t attribute superhuman qualities [to it],” says Dr Paterson.
Dr Miller states that the trajectory of AI’s future is difficult to map. Past predictions about how AI’s abilities would progress have come true, but much later than expected… easily decades later. The idea of super-intelligence also raises the question of how to define intelligence. “Intelligence is multidimensional, it has its limits,” says Dr Miller. In this mystical future world of AI, the distinction is not just “what will machines be able to do, but what will we not have them do,” states Dr Miller. “This regards anything that requires social interaction, creativity and leadership”; so the future is aided by AI, not dictated by it. In the nearer future, however, some very real concerns are posed: job security, influence on consumer habits, transparency, legal approaches and accountability, to name a few. With more and more jobs being replaced by machines, every industry is at stake. “Anything repetitive can be automated,” says Dr Miller. But this is not inherently negative, as more jobs will be created to support the use of AI, and not all functions of a job can be replaced by it. Dr Paterson explains with the example of radiology: AI is able to diagnose and interpret scans, but a radiologist does more than diagnose and interpret on a daily basis. “The AI is used to aid the already existing profession, not simply overtake it.” Greater transparency is needed in showing how AI uses our data. “It shouldn’t be used to collect data unlimitedly,” says Dr Paterson. “Is it doing what’s being promised, is it discriminating [against] people, is it embedding inequality?” With this in mind, Dr Paterson suggests that more legal authorities should be educated on how to approach topics regarding AI. “There needs [to be] better explanation… [We] need to educate judges and lawyers.” With the notorious Facebook-Cambridge Analytica scandal of 2018, the big question of accountability was raised.
The scandal involved the unauthorised use of data from 87 million Facebook users by Cambridge Analytica, which served to support the Trump campaign. It brought to light how the data we generate can be exploited without consent and used to influence our behaviour; this particular example appeared to sway an American presidential election. Simply put, our information can easily be exploited and sent off to data analytics firms to further influence our choices. This invites the defence that apps “merely provide a [service], but people use [these services] in that way,” as Dr Miller puts it. In other words, the blame is falsely shifted onto users for the spread of misinformation. The onus, however, should lie with social networking sites to give their users more transparency about their data usage and history, and to provide adequate protection of their data. To be frank, the future of humanoid robotic AI integrating seamlessly into human life will not arrive within our lifetimes, or potentially even our grandchildren’s. The forecast seems at best unpredictable and at worst unattainable, given the complexity of what constitutes full “sentience”. However, this does not mean that AI lies dormant in our lives. The underlying technology, grounded in computing, statistics and information systems, does the groundwork for most of the transactions we conduct online, whether monetary, social or otherwise. AI and its promise should not be shunted aside because of the misleading media surrounding its popularised definition and “robot uprisings”, but rather taught more broadly to all audiences. So perhaps Elon Musk’s fantastical ideas of robotic integration will not occur by 2022, but the presence of AI in modern technologies should not go unnoticed. References: 1. “A Very Short History of Artificial Intelligence (AI).” 2016. Forbes. https://www.forbes.com/sites/gilpress/2016/12/30/a-very-short-history-of-artificial-intelligence-ai/?sh=38106456fba2.
2. “Blaise Pascal Invents a Calculator: The Pascaline.” n.d. Jeremy Norman's HistoryofInformation.com. https://www.historyofinformation.com/detail.php?id=382. 3, 4, 6. “History of Artificial Intelligence.” n.d. Council of Europe. https://www.coe.int/en/web/artificial-intelligence/history-of-ai. 5. Smith, Chris, Brian McGuire, Ting Huang, and Gary Yang. 2006. “The History of Artificial Intelligence.” Project report for the History of Computing course, University of Washington. https://courses.cs.washington.edu/courses/csep590/06au/projects/history-ai.pdf.

  • Existing in an Alien World: Navigating Neurodiversity in a System Built for Someone Else

Existing in an Alien World: Navigating Neurodiversity in a System Built for Someone Else By Hazel Theophania 10 September 2022 Edited by Breana Galea and Ruby Dempsey Illustrated by Janna Dingle Content warnings: Ableism, mental illness. Have you ever read something that just makes everything click into place? For me, it was that autism is characterised by a difficulty in forming and understanding ‘second-order representations’ (1). Let me explain: a ‘first-order representation’ is the face value, the direct interpretation of an object or event. A ‘second-order representation’ is the underlying meaning, the non-literal association with an object or event. Autistic people struggle with the latter. Allistic (non-autistic) people don’t, and for them it is intrinsic to a large part of communication: nonverbal cues and gestures, sarcasm, undertones, passive aggression, politeness and more complex events like the communication of social hierarchy all take place beneath the veneer of explicit communication. They rely on the ability to interpret another’s actions by extrapolating their perspective. Rather than being automatic, for autistic people doing so is a learned, active behaviour, and one that is taxing to maintain and use. Reading this explanation was epiphanous for me for two reasons: it concisely explained why I and other autistic people I knew had such trouble navigating and communicating in social interactions, and it clarified why conflict and miscommunication arose so frequently. It contextualised and validated the way I experience and understand the world. Autistic communication is direct, predominantly using first-order representation. It doesn’t soften or hide meaning with subtext; conversely, it has difficulty picking up on inference and implication from others.
So many times I have answered questions or followed instructions ‘incorrectly’, because I’ve addressed the words and not the implied meaning underneath. Much of boundary setting and emotional communication in social relationships is implicit - are they ‘acting’ interested? Does it ‘feel’ like they are reciprocating? Can you ‘tell’ that they want to be friends? - and difficulty in reading those complex second order representations makes navigating those situations painful and confusing. These struggles and anxieties make it much harder for autistic individuals to make and maintain friendships (3). Sedgewick and Pellicano (3) found that both autistic girls and boys report weaker friendships with more conflict than their neurotypical peers. They experience more victimisation, autistic girls especially, from bullying and other relational aggression, and experience far more insecurity around their friendships. The authors note that “both autistic and neurotypical girls alluded to wanting to fit in, but in different ways.” The neurotypical girls in the study were more concerned with securing a place in the social hierarchy – appearing cool and fitting in with the popular crowd - whether through dating or other means. For the autistic girls it was about finding people who actually accepted them as themselves; fitting in was not about adhering to social expectations, but about finding friends where they didn’t have to. Bury and Hedley (5) found much the same issues in analysing the problems autistic people face in the workplace. While the work itself was no more trouble for autistic individuals than for their neurotypical counterparts, navigating the social aspects of a workplace drastically increased the stress and drain on autistic employees. Issues can arise from relative trivialities like dealing with food or birthday wishes, up to serious conflicts that jeopardise their employment. 
The same communication and relational issues that lead to autistic individuals struggling socially can have more serious consequences when the miscommunication and conflict arise in interactions with an authority, such as a boss or supervisor. Problems stem from unclear instructions, not adhering to unwritten or unspoken rules (social and otherwise), interrupting and socialising at the wrong times – everything that relies on being able to determine and pick up on implicit communication. In other words, being autistic has career consequences. Now, anxiety and depression aren’t intrinsic to being autistic (6). They’re not part of the same dysfunction in development. However, something about being expected to negotiate a minefield of implicit communication that others grasp intuitively leads to an extreme co-occurrence of autism with both anxiety and depression. The social ostracism and punishment for violating rules you’ve never been taught casts a shadow over every interaction. The starkly increased incidences of bullying and victimisation autistic youth go through may also contribute to mental illness. Mayes, Murray and their team (7) write: “It is quite possible that youth with ASD (Autism Spectrum Disorder) face considerable challenges during the transition from childhood to adolescence. Social difficulties and awareness of being different from others, especially during the teen years, may lead to problems with anxiety, depression, or hostility.” They reported anxiety in autistic children ranging from 67% to 79% depending on the severity of their traits, and depression affecting between 42% and 54% likewise – in comparison to anxiety occurring in 8% of children and adolescents (8) and depression in 5% of children, 17% of adolescents (13), and 5% of adults (12) overall. Similar figures are reported by Susan White and her colleagues in their meta-analysis “Anxiety in children and adolescents with autism spectrum disorders”. 
The social deficits autistic individuals endure lead to social anxiety by increasing the likelihood of negative interactions (9), and then that anxiety makes interaction with others more difficult, perpetuating the cycle. It’s clear there’s an issue here. Despite no biological link, autistic people suffer far greater rates of depression and anxiety than their neurotypical counterparts. They find friendships more taxing, worrying, and less fulfilling due to unrealistic expectations of allistic communication and understanding. They’re far more likely to be the target of bullying and victimisation than their neurotypical counterparts. Autistic adults suffer in their careers and employment due to a lack of accommodation and recognition. But it doesn’t have to be this way. Growing up neurodivergent shouldn’t be traumatic. Existing as an autistic person shouldn’t be fraught with conflict. I don’t know how we will get to that point. It feels like there are a hundred facets to the issue, each their own problem needing their own solution. That being said, all solutions need to stem from an understanding of autism and autistic individuals. So, what does it mean to be autistic and how can we navigate those communicative differences? The social aspect of autism arises from a deficit in ‘Theory of Mind’, which is the capacity to interpret and conceptualise another’s thoughts, beliefs, emotions, and intentions (1, 2, 9, 10). Second order representations are the events in which Theory of Mind is used to interpret meaning – and so a disorder in Theory of Mind development affects the ability of an individual to use and understand those second order representations. Essentially: autistic individuals struggle to interpret and conceptualise other people’s thoughts, beliefs, emotions and intentions. What does that mean for communication? 
As mentioned earlier, it leads to a twofold miscommunication between autistic and allistic people, where autistic people don’t see meaning where it is, and allistic people see meaning where it isn’t. This is known as the ‘double empathy problem’ (2). But it isn’t just a communication deficit on the part of the autistic person – the disconnect is due to two entirely different communication styles. Allistic people use second order representations readily and frequently. They’re able to infer others’ perspectives with ease and conversation is based around these assumptions. Gestures, body language and inference are used to convey meaning and assess receptiveness. If the wrong assumptions are made, it can lead to ‘fragmenting’, where there is a cost to getting it wrong and the conversation is disrupted (2). It may not be relationship-damaging every time, but people do pick up on misread cues or intentions, and often the only indication a mistake has been made is given through those same implicit communications. The creation of a shared understanding is known as ‘intersubjectivity’ (1, 2). Allistic intersubjectivity is managed through these second order representations, where the shared understanding is outlined and defined implicitly. Autistic people don’t have the same ability to interpret second order representations, so rather than probing or assessing what others have in common, they essentially have to guess. As a result, autistic people can appear egotistical or self-interested (2) when they spontaneously talk about an interest of theirs, or suddenly change the topic of conversation. In actuality, they’re trying to find common ground. Because finding that initial mutuality is harder, autistic individuals also place far less of a social cost on getting it wrong (2), and so while intersubjectivity may be harder to initially reach, there’s far less penalty for trying and failing. 
If these bids for connection are reciprocated, it can create a “rich intersubjective space for shared understanding” (2). These two elements of autistic communication come together to form a coherent communication style. Heasman writes “The generous assumption of common ground and the low demand for coordination are more than two isolated features; they potentially fit together into a functional system that allows rich forms of social relating” (2). The autistic communication style only appears to be dysfunctional when “[placed] against the cultural backdrop of neurotypical norms and expectations” (2). Another way to look at that is that autistic people don’t need ‘extra’ accommodation or compensation compared to allistic people – allistic people just have all their needs already met. They’re already accommodated for, but it’s such a cultural norm that it’s not even perceived as being so. A metaphor for the two types of communication is that of an allistic person and an autistic person trying to set up fishing rods along a river. The allistic person knows where the fish are - perhaps from reading the movement of the water - and sets up all their poles in that spot. The autistic fisherperson has no such information and sets up their rods all up and down the river to try to find them. Once they’ve got a few bites and know where the fish are, great! They can move all their rods and set up in whatever spot they’ve found. They just don’t have the same ability to determine where to set up in the first place. They’re not any worse at fishing (i.e., communicating) – they just have trouble knowing where to start. Autism is only a disability in an environment that doesn’t support it. As Bury noted, the only deficits in the workplace are from a lack of social accommodation – autistic individuals don’t struggle with the work itself. 
In fact, both Bury and Hurley-Hanson and her co-authors report that autistic individuals perform better in a multitude of areas: they have greater problem-solving, pattern-recognition and decision-making skills and a greater tolerance for repetition (5, 11). And that’s great! It’s wonderful to be recognised for the talents you have and the effort you put in. But it shouldn’t have to be justified that autistic people deserve employment and equitable treatment. It’s depressing to have your life and experience boiled down to your marketability and employability. But there is still a disconnect between autistic and allistic people. The perception of autistic people as defective rather than different prevents the integration and acceptance of autistic people into the social space and workforce. To work towards an autism-friendly society, education and awareness of the ways communication and understanding differ in neurodivergent individuals need to be ubiquitous. The hardships autistic people face aren’t because we’re autistic – they’re because everyone else isn’t. Instead of us continuing to assimilate to an allistic worldview, perhaps it’s time to meet us halfway and learn how we operate instead. References Frith, U. (1989) A new look at language and communication in autism. Heasman, B. (2018) Neurodivergent intersubjectivity: Distinctive features of how autistic people create shared understanding. Sedgewick, F., Pellicano, E. (2018) ‘It’s different for girls’: Gender differences in the friendships and conflict of autistic and neurotypical adolescents. Happé, F., Leslie, A. (1989) Autism and ostensive communication: The relevance of metarepresentation. Bury, S. et al. (2020) Workplace Social Challenges Experienced by Employees on the Autism Spectrum: An International Exploratory Study Examining Employee and Supervisor Perspectives. White, S. et al. (2009) “Anxiety in children and adolescents with autism spectrum disorders.” Mayes, S.D., Calhoun, S.L., Murray, M.J. 
et al. (2011) Variables Associated with Anxiety and Depression in Children with Autism. Bernstein, G. A., & Borchardt, C. M. (1991) Anxiety disorders in childhood and adolescence: A critical review. Journal of the American Academy of Child and Adolescent Psychiatry. Bellini, S. (2004) Social Skill Deficits and Anxiety in High-Functioning Adolescents With Autism Spectrum Disorders. Focus on Autism and Other Developmental Disabilities. Brewer, N., Young, R.L. & Barnett, E. (2017) ‘Measuring Theory of Mind in Adults with Autism Spectrum Disorder’. Hurley-Hanson, A. (2020) ‘Autism in the Workplace’, Palgrave Macmillan. Institute of Health Metrics and Evaluation. Global Health Data Exchange (GHDx). Selph, S. (2019) Depression in Children and Adolescents: Evaluation and Treatment.

  • Mighty Microscopic Warriors!

By Gaurika Loomba < Back to Issue 3 Mighty Microscopic Warriors! By Gaurika Loomba 10 September 2022 Edited by Niesha Baker and Khoa-Anh Tran Illustrated by Rachel Ko It’s a fine Saturday afternoon. You’re sitting in your backyard sipping on coffee and losing your mind over the daily Wordle. While you’re so engrossed, an unusual, blue-coloured creature pulls up another chair and solves the Wordle for you. Just as you look up and try to process the condescending smirk of this creature, your daily news notification pops up. It’s true! The whole world has been invaded by aliens! Thankfully this is a figment of our imagination, but would you believe me if I told you that alien invasions are constantly happening unnoticed in the microscopic world of our bodies? Every day, our cells face new ‘alien invasions’, thanks to unhygienic eating, or even just from breathing! In the external world, such an invasion would unsettle the entire human population and adversely impact the lives of everyone. It’s amazing how such invasions inside our bodies are usually defeated daily. So who are these tiny ‘soldiers’ that fight them off, silently and efficiently? It’s time to introduce the two brothers of our story – the innate immune system and the adaptive immune system, the former being the more enthusiastic and energetic one, while the latter is calmer and wiser. Although different in nature, the two systems coordinate efficiently to eliminate our enemies and help us go on about our lives. The innate immune system acts first when a pathogen (a disease-causing microorganism) manages to enter our bodies by getting around our physical barriers, like the skin and the mucus in the respiratory, gastric, urinary, and sexual tracts. The innate immune system consists of cells like macrophages and dendritic cells (DCs), which are constantly looking out for incoming invaders. 
These cells recognise pathogens through common foreign attributes that our native cells don’t possess. In order to defend us from the harmful effects of the pathogen, our innate cells engulf them. In fact, the word ‘macrophage’ literally means ‘big eater.’ Inside our cells, the pathogens’ end is inevitable: smashed and broken into pieces, which are mounted on our soldier cells’ surfaces, informing other soldier cells that an invasion has occurred. Our innate cells also release chemicals called cytokines that help recruit more of our soldier cells to the site of invasion. So, when we get the flu, the secreted cytokines are why we run a fever, cough and sneeze, and the influx of our soldier cells to the throat area is why we may have swelling there. Similarly, if we bruise, our blood vessels dilate to allow entry of our soldier cells to the wounded area, which is then manifested as redness and swelling around it. Fortunately, this means of communication between our soldier cells is much faster than our internet connection, and so the whole process occurs in a matter of hours. On most days, the keen innate immune system is enough to control an invasion. However, it needs big brotherly advice from the adaptive immune system in case things get out of hand. The main players of this part of the immune system are the calm B- and T-cells. These can be found resting in the lymph nodes, unaware of the invasion in the body. The B- and T-cells are wise soldiers, which is evident in the way they respond to an invasion. Each of these cells has molecules called ‘receptors’, which uniquely recognise pathogen parts presented to them. These receptors, on an adaptive cell, can be thought of as padlocks, and the broken pathogen parts, mounted on an innate cell, as keys. In the lymph nodes, each resting B- and T-cell has a different type of padlock, unique to a different key. 
It is the job of a DC, with a broken pathogen part mounted on its surface, to enter the lymph nodes and search for the most accurate match for its key from the variety of B- and T-cell padlocks. The key varies based on the different types of pathogens that invade our bodies. Once the perfect match is found, that specific B- or T-cell is activated and rapidly multiplies. This lock-and-key method of activation of adaptive cells confers the specificity of their action. These activated cells move from the lymph nodes to the site of infection and perform different functions that halt the pathogen from spreading the disease, by either killing the pathogen or stopping its reproduction. At the site of infection, innate cells, with the key (broken pathogen part) mounted on their surface, wait for the brotherly advice: the incoming adaptive cells with the perfect match to the key. The activated T-cells uniquely interact with macrophages and signal them to start killing the pathogens that they have engulfed. This helps with clearance of the pathogen. Although B-cells are part of the adaptive immune system, they can also recognise the foreign pathogen products, break them down, and present these parts on their surface, just like the innate immune cells. So now B-cells also have a key to the activated T-cell padlocks. Their lock-and-key interaction prompts the B-cells to release antibodies. Finally, the antibodies, together with the macrophages and DCs, as well as the B- and T-cells of the adaptive immune system, successfully win the war and die peacefully, having completed their purpose. But a small portion of B- and T-cells go on to develop into long-lived memory cells. Over the span of our lives, we are infected and reinfected with pathogens all the time; however, not every encounter results in us falling sick. The credit goes to the B- and T-memory cells and their ability to remember the foreign attributes of the pathogen and kill it as soon as it re-invades. 
Adaptive cells’ memory is the principle behind vaccination. An inactive pathogen or a part of the pathogen is introduced into the body. This trains our soldier cells for a real pathogen invasion by triggering the B-cells to form memory and specialised antibodies against the pseudo-pathogen. If the real pathogen ever infects us, these pre-formed antibodies make fighting the war much easier and quicker. Correct training of immune cells is essential, since a pathogen invasion is a life-or-death situation for us. Any mistakes by our soldier cells can have devastating effects. For example, an important part of the training process is to ensure the immune cells aptly distinguish between civilian cells and foreign cells. This education occurs in the bone marrow for B-cells and in the thymus for T-cells. Here, any B- or T-cells that attack civilian cells or cell parts are evicted from the training process, so that only the most suitable cells go on to become soldiers (1). But even after a rigorous selection process, things can go wrong with our immune system. Instead of being our defending heroes, our immune cells can turn their backs on us and start identifying civilian cells as aliens and attacking them. Sadly, this is the reality for 5% of the Australian population, the majority being women. This condition, when the immune cells stop distinguishing internal cells from alien cells, is called an autoimmune disorder. The cause of this disorder is mostly unknown, with speculation that it is genetic or environmental. The repercussions can be mild, such as the dry mouth and dry eyes of Sjogren’s syndrome, or more severe, such as the joint pain and immobilisation of rheumatoid arthritis. These diseases are currently life-long and incurable because they involve our own cells fighting the healthy cells in our body (2). Nevertheless, the immune system plays a very important role in helping us lead normal lives. It fights the battle against the invaders daily, without us realising it. 
Thanks to the soldiers of the immune system, our daily activities, like solving a Wordle on a relaxing Saturday, are not hindered by an alien cell invasion in our bodies! References Kenneth Murphy, Casey Weaver. Basic concepts in Immunology. Janeway’s Immunobiology. 9th ed. United States: Garland Science, Taylor and Francis; 2017. p. 4-11. Overview of autoimmune diseases [Internet]. Healthdirect.

  • Meet OmniSci Writer and Editor Elijah McEvoy | OmniSci Magazine

Bored of that one topic you need to keep revising? Read our chat with Elijah McEvoy about getting inspired by all areas of science, his sci-fi movie recommendations, and hear about his upcoming article about artificial intelligence. Elijah is a writer and editor at OmniSci and a second-year Bachelor of Science student. For Issue 4: Mirage, he is writing about artificial intelligence that masquerades as human, and contributing to two articles as an editor. Meet OmniSci Writer and Editor Elijah McEvoy interviewed by Caitlin Kane What are you studying? Bachelor of Science, looking to major in infection and immunity. I still have some back-ups, but that’s looking to be the path. I’m in second year, first semester. Do you have any advice for younger students interested in what you’re studying or more generally? The Bachelor of Science is really, really good. That’s my suggestion. If you’re someone like me who loves all areas of science and was a bit unsure about what path I wanted to go down, then science is really great to explore all those opportunities. What first got you interested in science? I would say probably science fiction movies. I saw Jurassic Park when I was really young and my parents bought it for me on DVD. I found all that science-y background to it very interesting and obviously those stories get you engaged… What’s the scientific backing behind that? That would probably be, very early on, what got me interested in science. Did you always imagine that you would study science formally, or this kind of science? Not exactly. I’ve had the science pathway in mind for a long time, but there were a lot of things in high school that made me consider whether I did or didn’t want to do it. 
I found writing very interesting in high school, and I was considering whether I do science or I don’t do science… In the end, I’ve found everything that I’m learning so fascinating, and I love that I’m continuing to learn every day in science and that my perspective continues to grow. And the final pathway… is something that’s relatively new. COVID got me interested in studying viruses and microbiology and the management of those situations as well. That is a bit more of a new thing, but it all builds off continuing to learn and do things in science. What would be your dream role as a scientist? Do you have a job in mind after your studies? I’m a bit undecided… A dream role of mine would definitely involve learning new things, where I can communicate and work in a position that’s not just in a lab or doing continuous research. Something where I can take the stuff figured out in a laboratory and apply it to society as a whole, whether working in government or with organisations in public health, particularly infection and immunity. What is your role at OmniSci? I’m writing an article for the magazine… I’ve always loved writing and it’s given me an outlet to pursue a bit of writing in a scientific field, which is something very exciting that I’m passionate about. I would describe [editing] as a really great opportunity to work with someone else to hone their idea. I find it very interesting to see what other people’s ideas about other aspects of science are and get informed through them, to encourage their opinions and ideas, and the way they express that. Are there other roles you would be interested in trying in the future? Or any other topics you are interested in writing about? Yes, there probably would be. I’ve always found… if you go back to Jurassic Park, genetic engineering is always an interesting topic to cover. Particularly one that is growing and growing nowadays with greater access to it. 
I find all of this very interesting, the science behind genetic engineering… functional and ethical applications, all those questions. How did you get involved with OmniSci? I saw it on the initial club listing in first year, but I don’t think anything came out of it… I was trying to figure my way around university as a whole. Then at the start of the year, I made a commitment to myself that I wanted to get involved a bit more. I saw it again on the club listing website and I checked out the website and saw how many people were involved and had different roles and came from different science backgrounds, and I thought “oh, this looks like a very accepting club and organisation to get involved with” and just signed up! I saw the welcome night that you guys were having and went along to that and decided I wanted to get involved. What is your favourite thing about contributing at OmniSci so far, or something that you’re looking forward to? Giving myself an outlet to learn new things. What I’m writing about isn’t really within my field of science particularly, but it’s a topic I’ve chosen because I find it interesting and it’s encouraged me to go on and learn a lot more about that. But not only that, it’s encouraged me to talk with other people at OmniSci that do know a bit more and can share their opinions. It’s really helped me guide what research I do and where I go from there. That’s probably my favourite thing: giving myself an excuse to learn a bit more about science through writing. Can you give us a sneak peek or pitch of what you’re working on this issue? If there’s a lot to come, maybe just what stage you’re up to in the process? Within the theme of mirage, it’s specifically about artificial intelligence that is able to mimic human ability, whether that be human speech, human personality, or how we look through deepfake photos and generative AI technology. And looking at how that could potentially impact different wings of life, and how that can be exploited. 
I mainly go into general discussion of those sorts of things and the potential, but I do end on the idea of what needs to be done considering how fast this AI is progressing, and whether regulation is necessary in order to ensure that human work is protected and us as humans are not being exploited by some of the potential applications of this technology. What do you like doing in your spare time (when you’re not contributing at OmniSci)? I’m a big movie person. I watch as many movies as possible and I discuss movies with friends… making the most of the student movie nights and cheap deals. Seeing as many movies as possible from a variety of backgrounds. I also like writing. I do a bit of writing in my spare time, but mostly movies. Do you have any movie recommendations? Big question. I love horror movies, so if you’re looking for a horror movie I recommend ‘Hereditary’, it’s my favourite horror movie. I guess within the realm of sci-fi and even artificial intelligence, a really good one that I saw is Ex Machina. Which chemical element would you name your firstborn child (or pet) after? I should be able to think of one – I’m a biochemistry student! Fluorine sounds interesting. Fluora could be a nickname. Yeah, something that you can shorten down. Read Elijah’s articles Real Life Replicants

  • Behind the Mask

By Yvette Marris Behind the Mask By Yvette Marris 23 March 2022 Edited by Tanya Kovacevic Illustrated by Quynh Anh Nguyen It would be hard to write about A Year in Science without the obligatory COVID article. We hear constantly about the stresses of being a frontline healthcare worker, the signs and symptoms of long COVID, and the endless vaccine scepticism. I’d like to tell a slightly different story. During the COVID pandemic, other infections didn’t just take a holiday and cancers didn’t just stop growing. More ordinary illness and injury continued behind the headlines. As a consequence of the pandemic, healthcare workers are additionally dealing with an abundance of patients, delays with diagnosis and some very complex medical cases. Megan Gifford worked in a hospital that didn’t primarily treat COVID-19 patients, but still had to adapt to the constant changing of rules, regulations and policies put in place to protect staff and patients alike from the virus. Now at the Peter MacCallum Cancer Centre in Melbourne, Gifford spoke to me about her experiences working at Townsville University Hospital in the only bone marrow transplant ward servicing a large population across regional Queensland. Gifford experienced the stress and burden of trying not only to assuage her own anxieties but also to provide up-to-date information to patients and deliver high quality care. There were the frustrations of unavoidable logistical problems like border closures and stay-at-home orders preventing access to crucial materials and patient transport. There was the heartbreak of watching transplant patients deteriorate mentally, as their will to persist with treatments began to fade. Pathologists and haematologists also found themselves facing an unprecedented logistical nightmare, including re-allocation of diagnostic equipment and protective equipment for mass COVID testing. 
Access to essential biomedical material like blood and plasma became increasingly difficult, and many suffered as a result. While pandemic consequences like long COVID and the increased prevalence of affective disorders, like depression and anxiety, are well documented in media and academia, post-traumatic stress disorder (PTSD) hasn’t gotten the same amount of attention. Statistics and anecdotes alike are staggering, both for patients and healthcare workers. With stressors like an unprecedented number of critically ill patients, capricious disease progressions, high mortality, and ever-changing treatment guidelines, the world was sympathetic to healthcare workers’ struggles (3). Yet with the lockdowns and restrictions over, it would be naïve to think everything would just return to normal. It was found that 29% of healthcare workers had clinical or sub-clinical symptoms of PTSD (1), and that this figure was significantly higher for healthcare workers directly treating COVID patients (2). Gifford recalled anecdotes of “patients suffering anxiety attacks when they smell the hospital alcohol rub and hear the familiar beeping of the various equipment”. Even beyond the mental health scope, logistical issues like delayed learning for medical students or the backlog of elective procedures are still placing an enormous burden on healthcare workers, despite the immediate threat seemingly behind us. But to say that everything remains in shambles would frankly be insulting to healthcare workers, who are working tirelessly to deliver good quality healthcare. The speed at which pathologists and scientists have adapted to limited resources and supply shortages, and the way in which doctors and frontline workers have shifted their style of care and developed new problem-solving skills, are exceptional and should not go unnoticed or unappreciated. 
Importantly, the COVID-19 pandemic and its ripple effects have brought centre stage the consequences of under-resourced healthcare centres in a way that affected all people, irrespective of geography, class or reputation. The reality is that the conditions in which many metropolitan hospitals found themselves, with never enough staff or supplies, are conditions that some hospitals experienced long before COVID-19 ever appeared, particularly in rural settings. To say that every dark cloud has a silver lining would be horribly cliché, but in this case, there may be truth to it. This edition of A Year in Science is a chance for us to reflect on all that COVID-19 has called attention to and decide to do something about it. References Carmassi C, Foghi C, Dell’Oste V, Cordone A, Bertelloni CA, Bui E, et al. PTSD symptoms in healthcare workers facing the three coronavirus outbreaks: What can we expect after the COVID-19 pandemic. Psychiatry Research. 2020 Oct;113312. Janiri D, Carfì A, Kotzalidis GD, Bernabei R, Landi F, Sani G. Posttraumatic Stress Disorder in Patients After Severe COVID-19 Infection. JAMA Psychiatry. 2021 Feb. Johnson SU, Ebrahimi OV, Hoffart A. PTSD symptoms among health workers and public service providers during the COVID-19 outbreak. Vickers K, editor. PLOS ONE. 2020 Oct 21;15(10):e0241032.


Fossil Markets: Under the Gavel, Under Scrutiny by Jesse Allen 22 October 2024 Edited by Zeinab Jishi Illustrated by Jessica Walton At the crossroads between science and commerce, the trade in fossils has "developed into an organised enterprise" over the course of the twentieth century. With greater investment and heated competition between museums and private collectors, fossils increasingly took their place alongside “art, furniture, and fine wine” (Kjærgaard, 2012, pp.340-344). Fast forward to the twenty-first century, and this trend shows no signs of abating. On the contrary: in July 2024, a near-complete stegosaurus skeleton - nicknamed ‘Apex’, discovered by a commercial palaeontologist in Colorado - was purchased by “hedge-fund billionaire” Ken Griffin for US$44.6 million (Paul, 2024). This makes it the single most expensive dinosaur skeleton ever sold, eclipsing the previous record set in 2020 for a T. rex named ‘Stan’, which was snapped up for US$31.8 million (Paul, 2024). These sales came with their fair share of criticism and controversy, reigniting the long-standing debate about how fossils should be handled, and where these ancient remains rightfully belong. Fossils (from the Latin fossilis, meaning ‘unearthed’) are the “preserved remains of plants and animals” which have been buried in sediments or preserved underneath ancient bodies of water, and offer unique insights into the history and adaptive evolution of life on Earth (British Geological Survey, n.d.). Their value is by no means limited to biology, however: they are useful for geologists in correlating the age of different rock layers (British Geological Survey, n.d.), and reveal the nature and consequences of changes in Earth’s climate (National Park Service, n.d.). Though new discoveries are being made all the time, fossils are inherently a finite resource, which cannot be replaced. 
This is part of what makes the fossil trade so lucrative, but the forces of limited supply and high demand have also led to the emergence of a dark underbelly. Cases of fossil forgery go back “as far as the dawn of palaeontology itself” in the late 18th and 19th centuries (Benton, 2024). The latest “boom in interest” is massively inflating prices and “fuelling the illicit trade” in fossils (Timmins, 2019). Whereas the US has a ‘finders-keepers’ policy, according to which private traders have carte blanche to dig up and sell any fossils they find, countries such as Brazil, China, and Mongolia do not allow the export of specimens overseas (Timmins, 2019). Sadly, this does little to prevent illegal smuggling; the laws are sometimes vague, and enforcement can be difficult when no single government agency is responsible for monitoring palaeontological activities (Winters, 2024). According to David Hone, a reader in zoology at Queen Mary University of London, “not every fossil is scientifically valuable”; but they are all “objects…worthy of protection,” and too many “scientifically important fossils appear briefly on the auction house website” before “vanish[ing] into a collector’s house, never to be seen again” (Hone, 2024). Museums, universities, and other scientific organisations are finding it more and more difficult to “financially compete with wealthy, private purchasers” as they are simply being priced out of the market (Paul, 2024). As sales become less open to expert scrutiny, the risks of forgery and price distortion become greater. This lack of scrutiny also has negative implications for future research: private collectors might grant access to one scientist but not allow others to corroborate their findings, and if fossils aren’t open to all, many institutions simply won’t examine items in private collections as a matter of principle (Timmins, 2019). The general public also loses out in a world where dinosaur fossils are reduced to expensive conversation pieces. 
As Hone writes, “we might never dig up another Stegosaurus, or never find one nearly as complete as [Apex].” Having waited 150 million years to be unearthed, this latest fossil is one of many that may not see the light of day for a very long time. Bibliography Benton, M. (2024, September 5). Modern palaeontology keeps unmasking fossil forgeries – and a new study has uncovered the latest fake. The Conversation. https://theconversation.com/modern-palaeontology-keeps-unmasking-fossil-forgeries-and-a-new-study-has-uncovered-the-latest-fake-223501 British Geological Survey. (n.d.). Why do we study fossils? British Geological Survey. https://www.bgs.ac.uk/discovering-geology/fossils-and-geological-time/fossils/ Hone, D. (2024, June 10). The super-rich are snapping up dinosaur fossils – that’s bad for science. The Guardian. https://www.theguardian.com/commentisfree/article/2024/jun/10/super-rich-dinosaur-fossils-stegosaurus-illegal-trade-science Kjærgaard, P. C. (2012). The Fossil Trade: Paying a Price for Human Origins. Isis, 103(2), 340–355. https://doi.org/10.1086/666365 National Park Service. (n.d.). The significance of fossils. U.S. Department of the Interior. https://www.nps.gov/subjects/fossils/significance.htm Paul, A. (2024, July 18). Stegosaurus 'Apex' sold for nearly $45 million to a billionaire. Popular Science. https://www.popsci.com/science/stegosaurus-skeleton-sale/ Timmins, B. (2019, August 8). What’s wrong with buying a dinosaur? BBC News. https://www.bbc.com/news/business-48472588 Winters, G.F. (2024). International Fossil Laws. The Journal of Paleontological Sciences, 19. https://www.aaps-journal.org/Fossil-Laws.html


A Brief History of the Elements: Finding a Seat at the Periodic Table by Xenophon Papas 28 May 2024 Edited by Arwen Nguyen-Ngo Illustrated by Rachel Ko What are we made of and where did it all come from? Such questions have pervaded the minds of scientific thinkers since ancient times and have entered all fields of enquiry, from the physical to the philosophical. Our best scientific theory today asserts that we’re made of atoms, and these atoms come in different shapes and sizes. Fundamentally, they can be described by the number of subatomic particles (protons, neutrons, and electrons) they contain (Jefferson Lab, 2012). Neatly arranged in a grid, these different elements form the periodic table we know and love today; but it was not always this way. The story of how the periodic table of elements came to be harks back to Ancient Greece and winds its way through the Enlightenment into the 20th century. It is an unfinished story, at whose frontier we stand today: in search of dark matter and the ultimate answer to what the universe is made of. We may never know for sure exactly what everything in existence consists of, but it’s a pursuit our earliest ancestors would be proud to see us follow. Thales was the first in the ancient Greek-speaking world to speculate about the origins of all material things. He theorised that all matter in the universe was made up of just one type of substance – water – and any other forms of solids, liquids and gases were just derivatives thereof. This idea was not initially opposed, given Thales was one of the earliest of the Ancient Greeks to pursue such questions of a scientific nature. After all, he’s remembered today as the “Father of Science” in the Western world. As Thales was from Miletus, a city on the Aegean coast of modern-day Türkiye and part of Ionia in the 6th century BC, it is not hard to imagine that water was a crucial aspect in trade, agriculture, and daily life at the time. 
However, this seemed to oversimplify the matter to some of his contemporaries. Empedocles, who was considered more a magician than a philosopher, revised this mono-elemental theorisation in the 5th Century BC. He proposed four basic substances from which all others were made (Mee, 2020). We know them today famously as the four classical elements: Earth, Air, Water and Fire. This asserted a fundamental principle of “fourness”, echoing the four cardinal directions recognised in the Western world at the time. Interestingly, concurrent traditions such as those in China acknowledged five elements and five compass points instead. A generation after Empedocles, Plato embraced his four-element formulation. Being heavily influenced by mathematics as the medium through which we make reason of the natural world, Plato related each of these elements to a mathematical object: a convex, regular polyhedron in three-dimensional Euclidean space, otherwise known as a Platonic solid. Earth was associated with the cube, air with the octahedron, water with the icosahedron, and fire with the tetrahedron. Lastly, the most complicated solid, the dodecahedron – itself made up of composite regular polygons – was associated with the makeup of the constellations and the Heavens themselves, their workings said to be unfathomable by human minds (Ball, 2004). His student, Aristotle, ran with this idea and devised a clever way to break up the elements based on their "qualities”, akin to a first periodic table. These binary qualities were hot and cold, wet and dry, with each element possessing exactly two of them. According to Aristotle, each of these elements could be converted to the other by inverting one of their qualities, seemingly bringing about an early form of alchemy. To these four elements, he also appended a fifth - aether or “pure air” - to fill the expanses of the heavens, which also became associated with the fifth Platonic solid. 
In the Western World, Aristotle’s word was taken as doctrine for a very long time owing greatly to the fall of Rome and the cultural instability thereafter. Where Europe plummeted into the Dark Ages with a reverence for the scholars of antiquity, scientific and literary endeavour flourished in the Middle East – the word alchemy itself having etymologically Arabic roots. It was not until the 17th century that the likes of Galileo, Descartes, and Newton revived Western scientific pursuit, and sought to understand how the natural world arranged itself. In the 18th century, new discoveries were being made on the frontiers of science in major cities throughout Europe. In 1772, in Paris, Antoine Lavoisier began work on combustion of materials like phosphorus and sulphur. Lavoisier concluded that if something decomposes into simpler substances, then it is not an element. For example, water can be decomposed into hydrogen and oxygen when steam is passed over hot iron, and is therefore not an element; oxygen and hydrogen themselves, however, are indeed elemental. English chemist John Dalton built on Lavoisier’s work and in 1808 began to arrange elements spatially into a chart, accounting for their various properties. In Jena in 1829, Johann Wolfgang Döbereiner recognised that groups of three similarly behaving elements arose from the list, known as “Döbereiner's triads” (Free Animated Education, 2023). In 1866, John Newlands put forward the “Law of Octaves”: when the elements were arranged in order, those with similar properties recurred at regular intervals of eight – hence octaves. However, this method of dividing up the elements broke down in some special cases. Now turning to St. Petersburg, Russia, in February of 1869. Dmitri Mendeleev sits at his desk, with a mess of cards covering the surface of his working space. The professor of chemistry rearranges these elemental cards like a jigsaw puzzle, arranging and rearranging them to align them in accordance with their properties. 
A pattern emerged - one which, supposedly, came to him in a dream. Mendeleev saw the possibility of a simple tabulation of the elements based on their atomic weight and hence their common properties. This newfound tool, based on Lavoisier’s work a century prior, allowed for the prediction of properties of elements which had not even been discovered yet - elements which Mendeleev believed to exist, even though they appeared only as empty gaps in the grid structure of the periodic table. Within just twenty years, Mendeleev’s prediction of the existence of elements such as gallium, scandium, and germanium had been experimentally confirmed. All of this was predicted without knowledge of the true reason for the similarities in elemental properties – the electron shell arrangement at a subatomic level. Mendeleev had totally changed the way chemists viewed their discipline and has been immortalised for perhaps the greatest breakthrough work in the history of chemistry (Rouvray, 2019). Today we recognise that nearly all the elements heavier than hydrogen and helium have their origins in the high-pressure hearts of stars. Like a hot furnace, stars churn out heavier and heavier elements under their immense internal pressures. Once this life cycle comes to an end, the star erupts into a fiery supernova, releasing even more of the heavier elements we see further down the periodic table. In the last 75 years, scientists have added an additional 24 elements to the periodic table, some of which are so unstable that their half-lives span only fractions of a millisecond before they decay away to nothing (Charley, 2012). This raises the question: how do we find new elements? Elements can be created via either fission, splitting apart a heavier atom, or fusion, binding two lighter nuclei together. The heavier an element, that is, the more protons and neutrons in its nucleus, the more unstable it is. 
Hence it is with great difficulty that scientists attempt to churn out new elements from large particle accelerators, by colliding and combining elements into new ones (Chheda, 2023). The story of physical matter is just one aspect in the search for what “everything” is made of. Dark matter and dark energy – so named because they do not interact with light – are thought to govern the rotation speeds of galaxies and the accelerating expansion of the universe, respectively. We know remarkably little about these substances, given that they make up around 95% of the total mass-energy content of the universe! Without a doubt, we have only just begun the journey to find out what makes up the universe around us. References Chheda, R. (2023, March 31). Can we add new elements to the periodic table? Science ABC. https://www.scienceabc.com/pure-sciences/can-we-add-new-elements-to-the-periodic-table.html Charley, S. (2012). How to make an element. PBS. https://www.pbs.org/wgbh/nova/insidenova/2012/01/how-to-make-an-element.html Free Animated Education. (2023, February 10). Perfecting the periodic table [Video]. YouTube. https://www.youtube.com/watch?v=7tbMGKGgCRA&ab_channel=FreeAnimatedEducation Jefferson Lab. (2012, November 20). The origin of the elements [Video]. YouTube. Ball, P. (2004). The elements: A very short introduction. Oxford University Press. Mee, N. (2020). Earth, air, fire, and water. In Oxford University Press eBooks (pp. 16–23). https://doi.org/10.1093/oso/9780198851950.003.0003 Rouvray, D. (2019). Dmitri Mendeleev. New Scientist. https://www.newscientist.com/people/dmitri-mendeleev


Three-Parent Babies? The Future of Mitochondrial Donation in Australia by Kara Miwa-Dale 24 October 2023 Edited by Yasmin Potts Illustrated by Aisyah Mohammad Sulhanuddin Mitochondria are the ‘powerhouse of the cell’. Sound familiar? This fact was likely drilled into you during high school biology classes (or by looking at memes). Beyond this, you may not have given mitochondria a second thought - but you should! This organelle has been at the centre of some heated parliamentary debates relating to mitochondrial donation. This new IVF technology, which aims to prevent women from passing on mitochondrial disease, will reshape Australia’s approach to genetic and reproductive technologies. Mitochondrial donation was legalised in Australia last year when ‘Maeve’s Law’ was passed in the Senate. This law reform has generated a minefield of social and ethical questions that are yet to be fully answered. What is mitochondrial disease? Mitochondria are the small but mighty structures found in all our cells (except red blood cells) that produce more than 90% of the energy used by our bodies (Cleveland Clinic, 2023). This organelle is vital for the functioning of important organs such as the heart, brain and liver (Cleveland Clinic, 2023). Mitochondria also have their own DNA, with a relatively small genome of 37 genes (Garcia et al., 2017), compared to the roughly 20,000 genes in our nuclear DNA (Nurk et al., 2022). Mitochondrial disease refers to a group of disorders in which ‘faulty’ mitochondria result in a range of symptoms such as poor motor control, developmental delay, seizures and cardiac disease (Mito Foundation, 2023). Half of the cases of mitochondrial disease are caused by mutations in mitochondrial DNA. These mutations are transmitted through maternal inheritance, which means that all the mitochondria in your cells are passed on from your biological mother (Mito Foundation, 2023). 
It is believed that about 1 in 200 people have a mutation in their mitochondrial DNA, with 1 in 5000 people having some form of mitochondrial disease (Mito Foundation, 2023). There is currently no cure for this group of conditions. How does mitochondrial donation work? Mitochondrial donation, also known as Mitochondrial Replacement Therapy (MRT), is an IVF technology which aims to prevent women from passing on mitochondrial disease to their children. For individuals with mitochondrial disease, this technology is currently the only way to have biological children without the risk of passing on their disease. MRT is used to create an embryo containing the nuclear DNA from two parents, in addition to mitochondrial DNA from an egg donor. This process involves taking the nuclear DNA from an embryo (created using the mother’s egg and father’s sperm) and inserting it into a donor egg which has had its own nucleus removed and contains healthy mitochondria (NHMRC, 2023). The child will still inherit all of their unique characteristics, such as hair colour, through the nuclear DNA of their prospective parents. Therefore, it would be impossible to tell that an individual had been conceived through MRT simply by looking at them. Challenges in defining parenthood. Children conceived through MRT have been popularly referred to in the media as ‘three-parent babies’ since the technique creates an embryo containing DNA from three different individuals. However, this label is inaccurate and misleading. It suggests that all three parents make an equal contribution to the identity of the child, when in fact mitochondrial donors contribute only 0.1% of the child’s total genetic material. Since each parent ordinarily contributes around 50%, the donor’s 0.1% amounts to just 0.002 of a parent’s share - so, technically, the term ‘2.002-parent babies’ would be more accurate! Under Australian law, mitochondrial donors will not have legal status as parents since their genetic contribution is not thought to influence the unique characteristics of the child. 
However, there are some concerns about the potential psychological impacts on children conceived through MRT, as the definition of parenthood is becoming increasingly blurry. It is possible that children conceived through mitochondrial donation will regard their mitochondrial donor as significant to their identity, considering how different their life may have looked without them. As researchers learn more about the function of mitochondria, we may indeed find that mitochondrial DNA has a greater influence on a person’s characteristics than we once thought. More recent studies have linked mitochondrial DNA to athletic performance (Maruszak et al., 2014), psychiatric disorders (Sequeira et al., 2012), and ageing (Wallace, 2010). Should mitochondrial donors remain anonymous? If mitochondrial donors contribute such a tiny amount of DNA to a child, and do not influence any of their personal characteristics, should they be obligated to disclose their identity to the recipient? Australia no longer allows egg or sperm donors to remain anonymous in order to protect the rights of individuals to know their biological origins. Yet, in the case of mitochondrial donation, there is a much smaller proportion of DNA involved. Some experts have compared mitochondrial donation to organ donation, in the sense that the donation also provides someone with the organ (or organelle) that enables them to live a healthy life, without altering their unique characteristics. It has therefore been argued that mitochondrial donation should be treated in a similar way to organ donation, allowing donors to remain anonymous. Considering that donated eggs are often in low supply, permitting anonymous donors may provide a way of improving the availability of donor eggs. It is likely that Australia will follow the lead of the UK by permitting anonymous donation. Are we ‘playing God’ by altering the genome? 
By making heritable changes to an individual’s genome, we are heading into new and potentially dangerous territory. Opponents of mitochondrial donation have voiced fears about the ‘slippery slope’ between trying to eradicate mitochondrial disease and taking this technology too far into the realm of ‘designer babies’. Considering that mitochondrial donation does not involve making any changes to nuclear DNA, and can only be used for medical reasons, these statements seem a bit sensationalist. However, there are some genuine reasons to be concerned about the safety of this technology and its implications for the future of humankind. While MRT is generally considered to be safe based on clinical research, there are still some uncertainties about its efficacy in clinical practice. For example, clinical research has found that there is a chance of ‘carry-over’ of unhealthy mitochondria during the MRT process (Klopstock, Klopstock & Prokisch, 2016). If this carry-over occurs, there is a potential for the numbers of unhealthy mitochondria to gradually increase as the embryo develops, essentially undoing all the hard work of creating an embryo free from mitochondrial disease. However, the percentage of carry-over is usually less than 2% and is likely to become lower as the technology advances (Klopstock, Klopstock & Prokisch, 2016). Unfortunately, we won’t know about any negative long-term impacts of MRT until we are able to observe the development of children conceived through this technology. However, once these children reach adulthood, they cannot be compelled to keep participating in follow-up studies, which makes long-term outcomes more challenging to track. An important consideration is the privacy and autonomy of these individuals - that they are not over-medicalised or viewed as some sort of ‘spectacle’ to the public. The future of mitochondrial donation in Australia. 
‘Maeve’s Law’ was named in honour of Maeve Hood, a cheerful 7-year-old who was diagnosed with a rare mitochondrial disease at 18 months old. The law was passed with the aim of preventing the transmission of mitochondrial disease in Australia, which affects around fifty families each year. This revolutionary law permits the creation of a human embryo containing genetic material from three people and allows heritable changes to be made to the genome (although under strict guidelines). Such practices were previously illegal in Australia due to understandable concern that these technologies could be destructive in the wrong hands. Maeve’s Law introduces an exception to these prohibitions solely for the purpose of preventing serious mitochondrial disease. While MRT is no longer illegal in Australia, Maeve’s Law does not authorise the immediate use of MRT in clinical practice. The law outlines a two-stage approach in which the technology will be implemented, provided that clinical trials are successful. This initiative will be conducted by Monash University through the mitoHOPE (Healthy Outcomes Pilot and Evaluation) program, for which they received $15 million in funding (Monash University, 2023). Stage 1, which is expected to last around ten years, will involve clinical research aimed at improving MRT techniques and validating its safety. After an initial review, mitochondrial donation may become available in a clinical practice setting in Stage 2. Mitochondrial donation is an exciting technology which provides hope to the many Australians touched by the devastating effects of mitochondrial disease. However, it is important that more research is conducted into its safety and efficacy, as well as the long-term implications of its use. As is often the case with groundbreaking technologies such as this, the laws and policies lag behind the science. 
The passing of Maeve’s Law is only the start of what will be a long journey to the successful implementation of mitochondrial donation in Australia. The next ten years will be crucial in setting a precedent for how our society approaches the use of other novel genetic technologies in healthcare. The question is no longer ‘should we use mitochondrial donation?’ but ‘how can we implement this technology in a safe and ethical way?’ References Cleveland Clinic. (2023). Mitochondrial Diseases . https://my.clevelandclinic.org/health/diseases/15612-mitochondrial-diseases Garcia, I., Jones, E., Ramos, M., Innis-Whitehouse, W., & Gilkerson, R. (2017). The little big genome: The organization of mitochondrial DNA . Frontiers in Bioscience (Landmark Edition), 22, 710. Klopstock, T., Klopstock, B., & Prokisch, H. (2016). Mitochondrial replacement approaches: Challenges for clinical implementation . Genome Medicine, 8(1), 1-3. Maruszak, A., Adamczyk, J. G., Siewierski, M., Sozański, H., Gajewski, A., & Żekanowski, C. (2014). Mitochondrial DNA variation is associated with elite athletic status in the Polish population. Scandinavian Journal of Medicine & Science in Sports, 24(2), 311-318. Mito Foundation. (2023). Maybe Mito Patient Factsheet. https://www.mito.org.au/wp-content/uploads/2019/01/Maybe-Mito-Patient-Factsheet1.pdf Mito Foundation. (2023). Mitochondrial Disease: The Need For Mitochondrial Donation . https://www.mito.org.au/wp-content/uploads/2019/01/Brief-mitochondrial-donation-2.pdf Monash University. (2023). Introducing Mitochondrial Donation into Australia. The mitoHOPE Program. https://www.monash.edu/medicine/mitohope National Health and Medical Research Council. (2023). Mitochondrial Donation. https://www.nhmrc.gov.au/mitochondrial-donation Nurk, S., Koren, S., Rhie, A., Rautiainen, M., Bzikadze, A. V., Mikheenko, A., & Phillippy, A. M. (2022). The complete sequence of a human genome . Science, 376(6588), 44-53. Sequeira, A., Martin, M. V., Rollins, B., Moon, E. 
A., Bunney, W. E., Macciardi, F., & Vawter, M. P. (2012). Mitochondrial mutations and polymorphisms in psychiatric disorders. Frontiers in Genetics, 3, 103. Wallace, D. C. (2010). Mitochondrial DNA mutations in disease and aging. Environmental and Molecular Mutagenesis, 51(5), 440-450.


Where The Wild Things Were By Ashleigh Hallinan We may consider ourselves to be the most advanced species on the planet, but this has come at the cost of the natural world. Delve into this article to gain insight into how ecosystem restoration plays a role in nature-based solutions for biodiversity loss and climate change mitigation globally. Edited by Niesha Baker & Caitlin Kane Issue 1: September 24, 2021 Illustration by Jess Nguyen The scale of threats posed to humanity and the natural world is confronting and difficult to grasp. The natural world is being pushed towards its brink, but it’s not too late to act. Ecosystem restoration plays an important role in nature-based solutions for biodiversity loss, food insecurity, and climate change. Global discourse and action also need to continue moving towards greater acknowledgement of Traditional Owners and local communities in biodiversity conservation efforts and climate change resilience. Ecosystem degradation is an accelerating calamity of our own making. A recent study in Frontiers in Forests and Global Change shows that humans have altered 97 per cent of the Earth's land, meaning a mere 3 per cent of land remains untouched, or ‘ecologically intact’ (1). ‘Ecosystem degradation’ refers to the loss of natural productivity from environments as a result of human activity. Many of the world’s ecosystems have been pushed beyond the point of unassisted self-recovery due to a mix of stressors, most of which are human-induced. Ecosystems are made up of interacting organisms and the physical environment in which they are found, so disturbing the balance of an ecosystem can be disastrous for all the living things relying on it, including humans. If trends of ecosystem degradation continue, 95 per cent of the Earth’s land could become degraded by 2050 (2). In this scenario, we would face irreversible damage. But how does this affect you and me? 
Beyond the role ecosystem degradation plays in accelerating climate change and the loss of countless species from our planet, its impact on ecosystem services is also of great significance. Ecosystem services are the benefits humans derive from the natural environment. These range from the oxygen we breathe to aesthetic appreciation of the natural environments around us. These services are necessary for life to exist on Earth, and without them, our quality of life would decline drastically. Luckily for us, humans are capable of learning from their mistakes, and efforts are being made to address these global concerns. Ecosystem restoration is the process of reversing ecosystem degradation to regain environmental health and sustainability. This often involves re-introducing plant and animal populations that may have been lost, as well as restoring their habitats. Abandoned farmland is one example of where this can be achieved. Farmlands are one of the most vital ecosystems in sustaining humankind. Not only do they provide us with food, but they are also home to a variety of organisms within and above the soil. Many of these organisms play a critical role in soil health, which is essential for agriculture. Agriculture has transformed human societies and fuelled a global population that has grown from one billion to almost eight billion people since around 1804 (3). This has had significant consequences on natural systems worldwide, particularly as farmland has continuously expanded into surrounding landscapes. Agroecosystems now cover around 40 per cent of Earth's terrestrial surface (4). However, despite a growing demand for food due to the world’s rapidly increasing population, the amount of farmland being abandoned outweighs the amount of land being converted to farmland (5). There are an estimated 950 million to 1.1 billion acres of deserted farmland globally (6). 
This unproductive farmland could be converted to meet conservation goals and mitigate the impacts of climate change. For example, farmland could be regenerated with carbon-capturing forests. These would contribute to sequestering large amounts of anthropogenic CO2, water retention, soil fertility, and providing habitats for a variety of organisms. Abandoned farmland could also be re-established with native vegetation to provide habitats for animals. This was the case at the Monjebup Nature Reserves, located in south-west Western Australia (WA) on Noongar Country, established by Bush Heritage Australia between 2007 and 2014 (7). Despite being a biodiversity hotspot, animals and plants in the Monjebup Nature Reserves have faced many threats. These were mainly in the form of introduced species and land clearing for agriculture. Decades of land clearing resulted in a transition from deep-rooted woody vegetation systems to shallow-rooted annual cropping systems across the south-western Australian landscape. This caused a decrease in natural habitats and accumulation of salt in soil and water, which contributed significantly to biodiversity loss. In 2007, Bush Heritage Australia secured the Monjebup Nature Reserves in a bid to establish important conservation areas. Since then, they have restored nearly 1,000 acres of cleared land in the north of the Reserve (8). An important contributor to the success of this project was Indigenous knowledge, which reflects a long history of close connection with the land. These unique human-land relationships provide opportunities for learning in environmental research, particularly regarding land management and sustainability. The Monjebup Nature Reserves now protect a significant patch of native bushland on the land of the Noongar-Minang and Koreng people. 
This has been critical in restoring the heavily cleared landscape between WA's Stirling Ranges and Fitzgerald River National Parks, reconnecting remnant bush in the south with that of the Corackerup Nature Reserve further north. It has also provided habitat for vulnerable animal species such as the Malleefowl, Western Whipbird, Carnaby's Cockatoo, and Tammar Wallaby.

Local knowledge plays a critical role in re-introducing plants and animals by identifying which species suit a particular environment. In the Monjebup Nature Reserves, re-introduction of native plants involved research on local plant communities and soil conditions in the immediately surrounding areas. It also involved consultation with Traditional Owners, who had used the area for gathering raw materials, food processing, hunting, stone tool manufacturing, and seasonal movement over millennia (9). Seeds of suitable flora were then collected in and around the site for the restoration works.

It is crucial that consultation with Traditional Owners, like that seen in the Monjebup Nature Reserves project, becomes a more common practice. An estimated 37 per cent of all remaining natural lands are under Indigenous management (10). These lands protect 80 per cent of global biodiversity and the majority of intact forests, highlighting the value of Indigenous knowledge (11).

We have left ourselves a challenging yet attainable goal. Raising public awareness of the importance of ecosystems and improving our understanding of the interconnectedness of the natural world will be key to reducing our impacts on Earth's incredible ecosystems. In March 2019, the United Nations General Assembly declared 2021 to 2030 the Decade on Ecosystem Restoration (12). El Salvador's Minister of Environment and Natural Resources, Lina Pohl, had proposed the creation of the Decade in a speech to the General Assembly.
More than 70 countries from all latitudes quickly jumped on board, committing to safeguarding and restoring ecosystems globally (13). 2030 also happens to be the deadline for the Sustainable Development Goals, a collection of 17 interlinked global goals designed to address the challenges we face and provide a ‘blueprint to achieve a better and more sustainable future for all’ (14). It is also the year scientists have identified as the last chance to prevent catastrophic climate change (15).

As part of the Decade on Ecosystem Restoration, the United Nations has called on countries to pledge to restore at least 2.5 billion acres of degraded land, an area larger than China (16). This will require international cooperation, led by the UN Environment Programme and the Food and Agriculture Organisation.

Humans have an essential role in halting and reversing the damage that has been caused so far. Ecosystem restoration is not a quick or easy process. It requires deep, systematic changes to the economic, political, and social systems we currently have in place. But the natural world is finite, and it is important we continue taking steps towards a more sustainable future.

References:

1. Plumptre, Andrew J., Daniele Baisero, R. Travis Belote, Ella Vázquez-Domínguez, Soren Faurby, Włodzimierz Jȩdrzejewski, Henry Kiara, Hjalmar Kühl, Ana Benítez-López, Carlos Luna-Aranguré, Maria Voigt, Serge Wich, William Wint, Juan Gallego-Zamorano, and Charlotte Boyd. “Where Might We Find Ecologically Intact Communities?” Frontiers in Forests and Global Change 4 (15 April 2021): 1–13. https://doi.org/10.3389/ffgc.2021.626635.

2, 4. Scholes, Robert, L. Montanarella, Anastasia Brainich, and Nichole Barger. “The Assessment Report on Land Degradation and Restoration: Summary for Policymakers.” Bonn, Germany: Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), 2018. https://ipbes.net/sites/default/files/2018_ldr_full_report_book_v4_pages.pdf.

3. Food and Agriculture Organisation of the United Nations. “FAOSTAT.” Accessed 8 September 2021. http://www.fao.org/faostat/en/#home.

5, 6. Yang, Yi, Sarah E. Hobbie, Rebecca R. Hernandez, Joseph Fargione, Steven M. Grodsky, David Tilman, Yong-Guan Zhu, Yu Luo, Timothy M. Smith, Jacob M. Jungers, Ming Yang, and Wei-Qiang Chen. “Restoring Abandoned Farmland to Mitigate Climate Change on a Full Earth.” One Earth 3, no. 2 (August 2020): 176–86. https://doi.org/10.1016/j.oneear.2020.07.019.

7, 8, 9. Bush Heritage Australia. “Monjebup Nature Reserves (WA).” Accessed 8 September 2021. https://www.bushheritage.org.au/places-we-protect/western-australia/monjebup.

10. Garnett, Stephen T., Neil D. Burgess, Julia E. Fa, Álvaro Fernández-Llamazares, Zsolt Molnár, Cathy J. Robinson, James E. M. Watson, Kerstin K. Zander, Beau Austin, Eduardo S. Brondizio, Neil French Collier, Tom Duncan, Erle Ellis, Hayley Geyle, Micha V. Jackson, Harry Jonas, Pernilla Malmer, Ben McGowan, Amphone Sivongxay, and Ian Leiper. “A Spatial Overview of the Global Importance of Indigenous Lands for Conservation.” Nature Sustainability 1, no. 7 (July 2018): 369–74. https://doi.org/10.1038/s41893-018-0100-6.

11. Ogar, Edwin, Gretta Pecl, and Tero Mustonen. “Science Must Embrace Traditional and Indigenous Knowledge to Solve Our Biodiversity Crisis.” One Earth 3, no. 2 (August 2020): 162–65. https://doi.org/10.1016/j.oneear.2020.07.006.

12, 13, 14, 15. United Nations Environment Programme and the Food and Agriculture Organization of the United Nations. “About the UN Decade.” Accessed 8 September 2021. http://www.decadeonrestoration.org/about-un-decade.

16. United Nations Environment Management Group. “The UN Sustainable Development Goals – UN Environment Management Group.” Accessed 8 September 2021. https://unemg.org/our-work/supporting-the-sdgs/the-un-sustainable-development-goals/.

OmniSci Magazine acknowledges the Traditional Owners and Custodians of the lands on which we live, work, and learn. We pay our respects to their Elders past and present.
