Predicting Bipolar Mood Disorder using Long Short-Term Memory Neural Networks
Dalarna University, School of Information and Engineering.
2022 (English). Independent thesis, Advanced level (degree of Master (One Year)), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

Bipolar mood disorder is a severe mental condition marked by recurring episodes of two types: manic or depressive. These phases can lead patients to become hyperactive, hyper-sexual, or lethargic, or even to commit suicide, all of which seriously impair patients' quality of life. Predicting these phases would help patients manage their lives better and improve our ability to apply medical interventions. Traditionally, interviews are conducted in the evening to predict potential episodes in the following days. While machine learning approaches have been used successfully before, the data was limited to a few self-reported parameters measured each day. Biometrics recorded at short intervals over many months present a new opportunity for machine learning approaches. However, phases of unrest and hyperactivity, which might serve as predictive signals, are often experienced long before the onset of manic or depressive phases and are separated from them by several uneventful days. This delay and its aperiodic occurrence are a challenge for deep learning. In this thesis, a synthetic dataset that mimics long and irregular delays is created and used to test the effects of such delays and rare events. LSTMs, RNNs, and GRUs are the go-to deep learning models in this situation, but they differ in how well they can be trained over long time spans. As their name (Long Short-Term Memory) suggests, LSTMs are believed to be easier to train and better at remembering than their simpler RNN counterparts, while GRUs represent a compromise in complexity between RNNs and LSTMs. Here, I show that, contrary to this common assumption, LSTMs are surprisingly forgetful and that RNNs generalize much better over longer delays with shorter sequences. At the same time, I could confirm that LSTMs are easily trained on tasks with more prolonged delays.
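
This record does not include the thesis code, but the setup described above, a rare informative event that must be remembered across a long and irregular delay, can be illustrated with a small sketch. The sketch below uses PyTorch; the dataset shape, delay range, and hyperparameters are illustrative assumptions rather than the thesis's actual design, and it only shows how the three recurrent architectures compared in the abstract could be trained on such a task.

import torch
import torch.nn as nn

def make_batch(batch_size=32, seq_len=200, n_features=4):
    # Each sequence carries one rare "event" at a random early step; the
    # label is that event's value, which must be recalled at the end of
    # the sequence after a long, otherwise uneventful delay.
    x = torch.zeros(batch_size, seq_len, n_features)
    x[:, :, 0] = 0.1 * torch.randn(batch_size, seq_len)          # background noise
    event_time = torch.randint(0, seq_len // 4, (batch_size,))   # irregular onset
    event_value = torch.randint(0, 2, (batch_size,)).float()
    x[torch.arange(batch_size), event_time, 1] = 1.0             # rare event marker
    x[torch.arange(batch_size), event_time, 2] = event_value     # value to recall
    return x, event_value

class RecurrentClassifier(nn.Module):
    def __init__(self, cell, n_features=4, hidden=32):
        super().__init__()
        rnn_cls = {"lstm": nn.LSTM, "gru": nn.GRU, "rnn": nn.RNN}[cell]
        self.rnn = rnn_cls(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.rnn(x)            # out: (batch, seq_len, hidden)
        return self.head(out[:, -1])    # predict from the final time step

# One training step per architecture; an actual comparison would train to
# convergence and vary the delay and sequence lengths.
for cell in ("lstm", "gru", "rnn"):
    model = RecurrentClassifier(cell)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x, y = make_batch()
    loss = nn.BCEWithLogitsLoss()(model(x).squeeze(-1), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(cell, float(loss))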

Place, publisher, year, edition, pages
2022.
Keywords [en]
LSTM, LSTM Memory, Memorization vs. Delays, RNN, GRU
National Category
Social Sciences Interdisciplinary
Identifiers
URN: urn:nbn:se:du-39639
OAI: oai:DiVA.org:du-39639
DiVA, id: diva2:1639527
Subject / course
Microdata Analysis
Available from: 2022-02-21. Created: 2022-02-21.

Open Access in DiVA

fulltext (3170 kB), 130 downloads
File information
File name: FULLTEXT01.pdf. File size: 3170 kB. Checksum: SHA-512
f737e95ef1d944b051177fede5284aeb73314ddbfe3e9c7948bb8d0ca3474259f8364fc74d5daeea0fcb5141d16b9c9dd5062c45b636d87ee06e067d4810bc0c
Type: fulltext. Mimetype: application/pdf


Total: 130 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 487 hits