Light-Weight Federated Learning with Augmented Knowledge Distillation for Human Activity Recognition

View/Open
gadg2023m-1b.pdf (12.87 MB)
Date
2023
Author
Gad, Gad
Abstract
The field of deep learning has experienced significant growth in recent years across domains where data can be collected and processed. However, as data plays a central role in the deep learning revolution, moving data from where it is produced to central servers and data centers for processing carries risks. To address this issue, federated learning (FL) was introduced as a framework for collaboratively training a global model on distributed data. Deploying FL, however, comes with several unique challenges, including communication overhead and system and statistical heterogeneity. While FL is inherently private in that clients do not share local data, privacy remains a concern in the FL context because sensitive data can be leaked from the exchanged gradients. To address these challenges, this thesis proposes incorporating techniques such as knowledge distillation (KD) and differential privacy (DP) into FL. Specifically, a model-agnostic FL algorithm based on KD is proposed, called the Federated Learning algorithm based on Knowledge Distillation (FedAKD). FedAKD utilizes a shared dataset as a proxy dataset to calculate and transfer knowledge in the form of soft labels, which are sent to the server for aggregation and broadcast back to the clients, which train on them in addition to performing local training. Additionally, we elaborate on applying local differential privacy (LDP), where clients apply gradient clipping and noise injection according to differentially private stochastic gradient descent (DP-SGD). The FedAKD algorithm is evaluated on human activity recognition (HAR) datasets in terms of accuracy and communication efficiency.
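The two mechanisms named in the abstract — server-side aggregation of client soft labels on a shared proxy dataset, and client-side DP-SGD gradient clipping with noise injection — can be sketched roughly as follows. This is a minimal illustrative sketch, not the thesis's actual implementation; the function names and default parameter values are assumptions.

```python
import math
import random

def aggregate_soft_labels(client_soft_labels):
    """Server-side step (sketch): average the soft labels each client
    computed on the shared proxy dataset, producing the global soft
    labels that are broadcast back to clients.

    client_soft_labels: list of per-client matrices, each a list of
    per-example class-probability lists of identical shape.
    """
    n_clients = len(client_soft_labels)
    n_examples = len(client_soft_labels[0])
    n_classes = len(client_soft_labels[0][0])
    return [
        [sum(c[i][k] for c in client_soft_labels) / n_clients
         for k in range(n_classes)]
        for i in range(n_examples)
    ]

def dp_sgd_clip_and_noise(per_example_grads, clip_norm=1.0,
                          noise_multiplier=1.1, rng=None):
    """Client-side DP-SGD step (sketch): clip each per-example gradient
    to L2 norm clip_norm, sum the clipped gradients, add Gaussian noise
    with standard deviation noise_multiplier * clip_norm, and average.
    """
    rng = rng or random.Random(0)
    clipped = []
    for g in per_example_grads:
        norm = math.sqrt(sum(x * x for x in g))
        scale = min(1.0, clip_norm / (norm + 1e-12))  # clip, never amplify
        clipped.append([x * scale for x in g])
    summed = [sum(col) for col in zip(*clipped)]
    sigma = noise_multiplier * clip_norm
    return [(s + rng.gauss(0.0, sigma)) / len(per_example_grads)
            for s in summed]
```

Because only soft labels on the proxy dataset are exchanged (rather than model weights or raw gradients), the per-round payload scales with the proxy-dataset size and class count instead of the model size, which is the communication advantage the abstract refers to.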
URI
https://knowledgecommons.lakeheadu.ca/handle/2453/5175
Collections
• Electronic Theses and Dissertations from 2009 [1612]

Library
Contact Us | Send Feedback