Show simple item record

dc.contributor.advisor: Yang, Yimin
dc.contributor.author: Paul, Adhri Nandini
dc.date.accessioned: 2020-07-07T16:49:35Z
dc.date.available: 2020-07-07T16:49:35Z
dc.date.created: 2020
dc.date.issued: 2020
dc.identifier.uri: http://knowledgecommons.lakeheadu.ca/handle/2453/4673
dc.description.abstract: A neural network aims to minimize its cost function and thereby provide better performance. This optimization procedure is widely recognized as gradient descent, a form of iterative learning that starts from a random point on a function and travels down its slope, in steps, until it reaches the lowest point; this process is time-consuming and slow to converge. Over the last couple of decades, several non-iterative neural network training algorithms have been proposed, such as random forest and QuickNet. However, these non-iterative training algorithms do not support online training, so given very large training data, enormous computing resources are needed to train the network. In this thesis, a non-iterative learning strategy with online sequential learning is exploited. In Chapter 3, a single-layer online sequential sub-network node (OS-SN) classifier is proposed that can provide competitive accuracy by pulling the residual network error and feeding it back into the hidden layers. In Chapter 4, a multilayer network is proposed whose first portion is built by transforming a multi-layer autoencoder into an online sequential autoencoder (OS-AE), with OS-SN used for classification. In Chapter 5, OS-AE is utilized as a generative model that can construct new data based on subspace features and performs better than conventional data augmentation techniques on real-world image and tabular datasets. [en_US]
dc.language.iso: en_US [en_US]
dc.subject: Neural networks [en_US]
dc.subject: Network training algorithm [en_US]
dc.subject: Machine learning [en_US]
dc.subject: Online sequential learning [en_US]
dc.subject: Autoencoder [en_US]
dc.title: Online sequential learning with non-iterative strategy for feature extraction, classification and data augmentation [en_US]
dc.type: Thesis [en_US]
etd.degree.name: Master of Science [en_US]
etd.degree.level: Master [en_US]
etd.degree.discipline: Computer Science [en_US]
etd.degree.grantor: Lakehead University [en_US]
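
The abstract above describes gradient descent as iterative learning that starts from a random point on a function and travels down its slope, in steps, until the minimum is reached. The following Python sketch illustrates only that generic idea on a toy one-dimensional cost; the cost function, learning rate, and stopping threshold are assumptions for illustration and are not taken from the thesis.

    import random

    # Illustrative sketch only (not code from the thesis): a minimal
    # gradient-descent loop on the toy cost f(w) = (w - 3)^2, matching the
    # abstract's description of starting from a random point and stepping
    # down the slope until the minimum is reached.

    def grad(w):
        return 2.0 * (w - 3.0)        # derivative of (w - 3)^2

    w = random.uniform(-10.0, 10.0)   # random starting point
    lr = 0.1                          # step size (assumed)
    for step in range(1000):
        g = grad(w)
        if abs(g) < 1e-8:             # slope near zero: minimum reached
            break
        w -= lr * g                   # travel down the slope in steps

    print(f"converged to w = {w:.6f} after {step} steps")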

