
Distributed Learning Systems with First-Order Methods

Ji Liu, Ce Zhang

Paperback

SEK 1,369


Estimated delivery time: 5-10 working days

Free shipping for members on orders of at least SEK 249

  • 108 pages
  • 2020
Scalable and efficient distributed learning is one of the main driving forces behind the recent rapid advancement of machine learning and artificial intelligence. One prominent feature of this development is that progress has been made by researchers in two communities: (1) the systems community, spanning databases, data management, and distributed systems, and (2) the machine learning and mathematical optimization community. The interaction and knowledge sharing between these two communities have led to the rapid development of new distributed learning systems and theory. This monograph provides a brief introduction to three recently developed distributed learning techniques: lossy communication compression, asynchronous communication, and decentralized communication. These techniques have had a significant impact on work in both communities, but to fully realize their potential, researchers in each community need to understand the whole picture. This monograph provides the bridge between the two: a simplified introduction to the essential aspects of each community enables researchers to gain insight into the factors influencing both, and gives students and researchers the groundwork for developing faster and better research results in this dynamic area.
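As a taste of the first of the three techniques, the sketch below shows lossy communication compression in its simplest form: top-k gradient sparsification, in which a worker transmits only the k largest-magnitude entries of its local gradient instead of the full dense vector. This is a minimal illustrative sketch, not code from the monograph; the function names are hypothetical, and top-k is just one member of the family of compression schemes (alongside quantization and others) that the book treats.

```python
# Minimal sketch of lossy communication compression via top-k sparsification.
# Assumption: function names and the toy setup are illustrative, not from the book.
import numpy as np

def top_k_sparsify(grad: np.ndarray, k: int):
    """Keep only the k largest-magnitude entries of a gradient vector.

    Returns the indices and values a worker would transmit,
    rather than the full dense gradient.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]  # indices of the k largest |g_i|
    return idx, grad[idx]

def decompress(idx: np.ndarray, vals: np.ndarray, dim: int) -> np.ndarray:
    """Rebuild a dense gradient on the receiving side (zeros elsewhere)."""
    dense = np.zeros(dim)
    dense[idx] = vals
    return dense

# Toy usage: a worker compresses its local gradient before "sending" it.
rng = np.random.default_rng(0)
g = rng.normal(size=1000)
idx, vals = top_k_sparsify(g, k=10)        # ~99% fewer numbers on the wire
g_hat = decompress(idx, vals, dim=g.size)  # server-side reconstruction
```

The communication saving comes at the cost of an inexact gradient, which is precisely the kind of accuracy-versus-communication trade-off that the convergence theory surveyed in the monograph is meant to address.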
  • Authors: Ji Liu, Ce Zhang
  • Format: Paperback
  • ISBN: 9781680837001
  • Language: English
  • Pages: 108
  • Publication date: 2020-06-24
  • Publisher: now publishers Inc