MD-GAN: Multi-Discriminator Generative Adversarial Networks for Distributed Datasets

Bibliographic Details
Published in: Proceedings - IEEE International Parallel and Distributed Processing Symposium, pp. 866 - 877
Main Authors: Hardy, Corentin; Le Merrer, Erwan; Sericola, Bruno
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2019
Subjects
Online Access: Get full text
ISSN: 1530-2075
DOI: 10.1109/IPDPS.2019.00095

Abstract A recent technical breakthrough in the domain of machine learning is the discovery and the many applications of Generative Adversarial Networks (GANs). These generative models are computationally demanding, as a GAN is composed of two deep neural networks and is trained on large datasets. A GAN is generally trained on a single server. In this paper, we address the problem of distributing GANs so that they can train over datasets spread across multiple workers. We present MD-GAN as the first solution to this problem: a novel learning procedure that fits GANs to this distributed setup. We then compare the performance of MD-GAN to an adaptation of federated learning to GANs, using the MNIST, CIFAR10 and CelebA datasets. MD-GAN halves the learning complexity on each worker node while providing performance better than or equal to that of the federated-learning adaptation. We finally discuss the practical implications of distributing GANs.
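The abstract describes the setup only at a high level. As an illustration, the following is a minimal single-process sketch of the general idea suggested by the title: one generator held by a central server and one discriminator per data-holding worker, each worker training only on its own shard. All names, network sizes, and the synthetic data shards are illustrative assumptions; this does not reproduce the actual MD-GAN learning procedure (e.g., how discriminator feedback is communicated) described in the paper.

# Illustrative sketch only; assumes PyTorch. Not the MD-GAN algorithm itself.
import torch
import torch.nn as nn

LATENT, DATA_DIM, K, BATCH = 16, 64, 4, 32

# Central generator, held by the "server".
G = nn.Sequential(nn.Linear(LATENT, 128), nn.ReLU(), nn.Linear(128, DATA_DIM))
g_opt = torch.optim.Adam(G.parameters(), lr=2e-4)

# One discriminator and one private (here synthetic) data shard per worker.
workers = []
for _ in range(K):
    D = nn.Sequential(nn.Linear(DATA_DIM, 128), nn.ReLU(),
                      nn.Linear(128, 1), nn.Sigmoid())
    workers.append({
        "D": D,
        "opt": torch.optim.Adam(D.parameters(), lr=2e-4),
        "data": torch.randn(256, DATA_DIM),  # stand-in for a local dataset
    })

bce = nn.BCELoss()
ones, zeros = torch.ones(BATCH, 1), torch.zeros(BATCH, 1)

for step in range(100):
    # The server generates one fake batch per worker.
    fakes = [G(torch.randn(BATCH, LATENT)) for _ in workers]

    # Each worker updates its own discriminator using only its local shard.
    for w, fake in zip(workers, fakes):
        idx = torch.randint(0, w["data"].size(0), (BATCH,))
        real = w["data"][idx]
        d_loss = (bce(w["D"](real), ones) +
                  bce(w["D"](fake.detach()), zeros))
        w["opt"].zero_grad()
        d_loss.backward()
        w["opt"].step()

    # The server updates the generator using feedback from every discriminator.
    g_loss = sum(bce(w["D"](fake), ones) for w, fake in zip(workers, fakes))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

Note that only the generator's parameters live on the server in this sketch; each worker's real data never leaves the worker, which is the property motivating the distributed setting discussed in the abstract.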
Author Sericola, Bruno
Hardy, Corentin
Le Merrer, Erwan
Author_xml – sequence: 1
  givenname: Corentin
  surname: Hardy
  fullname: Hardy, Corentin
  email: Corentin.Hardy@technicolor.com
  organization: Technicolor and INRIA Rennes
– sequence: 2
  givenname: Erwan
  surname: Le Merrer
  fullname: Le Merrer, Erwan
  email: erwan.le-merrer@inria.fr
  organization: INRIA Rennes
– sequence: 3
  givenname: Bruno
  surname: Sericola
  fullname: Sericola, Bruno
  email: bruno.sericola@inria.fr
  organization: INRIA Rennes
ContentType Conference Proceeding
DOI 10.1109/IPDPS.2019.00095
Discipline Computer Science
EISBN 172811246X
9781728112466
EISSN 1530-2075
EndPage 877
ExternalDocumentID 8821025
Genre orig-research
Language English
OpenAccessLink https://inria.hal.science/hal-01946665
PageCount 12
PublicationCentury 2000
PublicationDate 2019-05-01
PublicationDateYYYYMMDD 2019-05-01
PublicationDate_xml – month: 05
  year: 2019
  text: 2019-05-01
  day: 01
PublicationDecade 2010
PublicationTitle Proceedings - IEEE International Parallel and Distributed Processing Symposium
PublicationTitleAbbrev IPDPS
PublicationYear 2019
Publisher IEEE
Publisher_xml – name: IEEE
StartPage 866
SubjectTerms Computational modeling
Deep Learning
Distributed Datasets
Gallium nitride
Generative Adversarial Network
Generative adversarial networks
Generators
Machine learning
Servers
Training
Title MD-GAN: Multi-Discriminator Generative Adversarial Networks for Distributed Datasets
URI https://ieeexplore.ieee.org/document/8821025