
    When artificial neural networks spend time not learning, they learn better

    • Last Update: 2023-01-05
    • Source: Internet
    • Author: User

    Depending on age, humans need 7 to 13 hours of sleep every 24 hours. During this time, a lot happens: heart rate, breathing, and metabolism ebb and flow; hormone levels adjust; the body relaxes. Not so much in the brain.

    "The brain is very busy when we sleep, repeating what we have learned during the day," said Dr. Maxim Bazhenov, a professor of medicine and sleep researcher at the University of California San Diego School of Medicine. "Sleep helps reorganize memories and presents them in the most efficient way."

    In a previously published paper, Bazhenov and his colleagues reported how sleep builds relational memory, the ability to remember arbitrary or indirect associations between objects, people, or events, and protects against forgetting old memories.

    Artificial neural networks borrow the architecture of the human brain to improve numerous technologies and systems, from basic science and medicine to finance and social media. In some ways they have achieved superhuman performance, such as computational speed, but they fail in one key respect: when artificial neural networks learn sequentially, new information overwrites previous information, a phenomenon known as catastrophic forgetting.
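
    To make the failure mode concrete, here is a minimal, hypothetical sketch (not taken from the paper): a tiny numpy classifier trained first on task A and then on task B. Because training on B freely overwrites the weights fitted to A, accuracy on A collapses.

    # Minimal sketch of catastrophic forgetting (hypothetical toy, not
    # from the paper): a tiny numpy classifier trained sequentially on
    # two tasks. Accuracy on task A collapses after training on task B.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_task(shift, n=200):
        # Two Gaussian blobs; `shift` moves both blobs, so each task
        # needs a different decision boundary.
        x0 = rng.normal(loc=-1 + shift, scale=0.5, size=(n, 2))
        x1 = rng.normal(loc=+1 + shift, scale=0.5, size=(n, 2))
        return np.vstack([x0, x1]), np.array([0.0] * n + [1.0] * n)

    def train(w, b, X, y, lr=0.1, epochs=300):
        for _ in range(epochs):
            p = 1 / (1 + np.exp(-(X @ w + b)))   # sigmoid
            g = p - y                            # log-loss gradient
            w, b = w - lr * X.T @ g / len(y), b - lr * g.mean()
        return w, b

    def accuracy(w, b, X, y):
        return ((X @ w + b > 0) == (y == 1)).mean()

    (Xa, ya), (Xb, yb) = make_task(0.0), make_task(4.0)
    w, b = np.zeros(2), 0.0
    w, b = train(w, b, Xa, ya)
    print("task A after learning A:", accuracy(w, b, Xa, ya))  # high
    w, b = train(w, b, Xb, yb)
    print("task A after learning B:", accuracy(w, b, Xa, ya))  # drops sharply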

    "In contrast, the human brain learns continuously and incorporates new data into its existing body of knowledge," Bazhenov said, "and it typically learns best when new training is interleaved with periods of sleep that consolidate memories."

    In the November 18, 2022 issue of PLOS Computational Biology, senior author Bazhenov and colleagues discuss how biological models can help mitigate the threat of catastrophic forgetting in artificial neural networks, increasing their utility across a range of research areas. The scientists used spiking neural networks, which artificially mimic natural nervous systems: instead of communicating information continuously, they transmit it as discrete events (spikes) at specific points in time.
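
    As a rough illustration of what "discrete events" means here, below is a textbook leaky integrate-and-fire neuron, a standard simplification that is not necessarily the paper's exact neuron model. The cell integrates its input over time and communicates only at the moments its voltage crosses a threshold.

    # Textbook leaky integrate-and-fire neuron (a standard simplification,
    # not necessarily the paper's neuron model). The voltage integrates
    # input with a leak; crossing the threshold emits a spike and resets.
    import numpy as np

    def lif(input_current, v_thresh=1.0, v_reset=0.0, leak=0.95):
        v, spike_times = 0.0, []
        for t, i_t in enumerate(input_current):
            v = leak * v + i_t           # leaky integration of the input
            if v >= v_thresh:            # threshold crossing -> spike
                spike_times.append(t)    # the information is *when* it fires
                v = v_reset
        return spike_times

    rng = np.random.default_rng(1)
    print(lif(rng.uniform(0.0, 0.3, size=100)))   # list of spike times
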
    They found that catastrophic forgetting was mitigated when the spiking networks were trained on a new task but given occasional offline periods that mimicked sleep. Just as in the human brain, the study's authors said, "sleep" allowed the networks to replay old memories without explicitly using old training data.
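
    The schedule itself is easy to picture. Below is a runnable skeleton in plain numpy showing only that structure: stretches of supervised "wake" training punctuated by offline phases with no labels and no stored data. This is a loose analogy of my own; in the paper, the offline phase runs the spiking network on noise under local, Hebbian-style plasticity, which this toy's self-reinforcement stand-in does not reproduce.

    # Skeleton of the interleaved wake/sleep schedule (structure only;
    # hypothetical stand-ins, not the paper's algorithm).
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy task data: two Gaussian blobs, as in the earlier sketch.
    X0 = rng.normal(loc=-1, scale=0.5, size=(200, 2))
    X1 = rng.normal(loc=+1, scale=0.5, size=(200, 2))
    X_new = np.vstack([X0, X1])
    y_new = np.array([0.0] * 200 + [1.0] * 200)

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    def wake_step(model, X, y, lr=0.1):
        # Wake: an ordinary supervised gradient step on the new task.
        g = sigmoid(X @ model["w"] + model["b"]) - y
        model["w"] -= lr * X.T @ g / len(y)
        model["b"] -= lr * g.mean()

    def sleep_phase(model, steps=20, lr=0.01, batch=64):
        # Sleep: offline, label-free, and without stored old data. Noise
        # drives the network and its own confident responses are
        # reinforced -- a crude stand-in for spontaneous replay. (The
        # paper instead runs the spiking network under local plasticity.)
        for _ in range(steps):
            Xn = rng.normal(scale=2.0, size=(batch, model["w"].size))
            p = sigmoid(Xn @ model["w"] + model["b"])
            g = p - (p > 0.5)             # self-generated replay targets
            model["w"] -= lr * Xn.T @ g / batch
            model["b"] -= lr * g.mean()

    # Interleave: stretches of wake learning punctuated by "sleep".
    model = {"w": np.zeros(2), "b": 0.0}
    for step in range(200):
        wake_step(model, X_new, y_new)
        if step % 20 == 19:               # periodically go offline
            sleep_phase(model)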

    Memory in the human brain is represented by patterns of synaptic weights, the strength or amplitude of the connections between neurons.

    "When we learn new information," Bazhenov said, "neurons fire in a specific order, and this strengthens the synapses between them. During sleep, the spiking patterns learned while we are awake are repeated spontaneously. This is called reactivation or replay. Synaptic plasticity, the capacity of synapses to be altered or molded, is still in place during sleep, and it can further enhance the synaptic weight patterns that represent a memory, helping to prevent forgetting or to transfer knowledge from old tasks to new ones."
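
    One textbook form of such plasticity is spike-timing-dependent plasticity (STDP), sketched below as an illustration; the paper's exact plasticity rule may differ. Synapses where the presynaptic neuron reliably fires just before the postsynaptic one are strengthened, and the reverse ordering weakens them, so a replayed firing sequence deepens the weight pattern that encodes it.

    # Illustrative STDP rule (a standard textbook form, not necessarily
    # the paper's plasticity rule). Pre-before-post spike pairs
    # strengthen a synapse; post-before-pre pairs weaken it.
    import math

    def stdp_delta(pre_times, post_times, a_plus=0.01, a_minus=0.012, tau=20.0):
        """Total weight change for one synapse given two spike trains (ms)."""
        dw = 0.0
        for t_pre in pre_times:
            for t_post in post_times:
                dt = t_post - t_pre
                if dt > 0:      # pre fired first: potentiate
                    dw += a_plus * math.exp(-dt / tau)
                elif dt < 0:    # post fired first: depress
                    dw -= a_minus * math.exp(dt / tau)
        return dw

    # A replayed sequence in which the presynaptic cell reliably leads
    # by 5 ms: the net change is positive, so the synapse strengthens.
    print(stdp_delta([10, 50, 90], [15, 55, 95]))   # > 0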

    When Bazhenov and his colleagues applied this approach to artificial neural networks, they found that it helped the networks avoid catastrophic forgetting.

    "This means that these networks can learn
    continuously just like humans or animals.
    " Understanding how the human brain processes information during sleep helps enhance the memory
    of human subjects.
    Enhancing sleep rhythm can improve memory
    .
    In other projects, we have used computer models to develop optimal strategies for applying stimuli, such as auditory tones, during sleep to enhance sleep rhythm and improve learning capacity
    .
    This can be especially important when memory is not optimal, such as when memory grows with age or in some cases, such as Alzheimer's
    .

    Ryan Golden, Jean Erik Delanois, Pavel Sanda, Maxim Bazhenov. Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation. PLOS Computational Biology, 2022; 18(11): e1010628. DOI: 10.1371/journal.pcbi.1010628


