Computational Performance

:label:chap_performance

In deep learning, datasets and models are usually large, so training and inference involve heavy computation. Computational performance therefore matters a great deal. This chapter focuses on the major factors that affect it: imperative programming, symbolic programming, asynchronous computation, automatic parallelism, and multi-GPU computation. With what you learn here, you may further improve the computational performance of the models implemented in the previous chapters, for example by reducing training time without affecting accuracy.

```toc
:maxdepth: 2

hybridize
async-computation
auto-parallelism
hardware
multiple-gpus
multiple-gpus-concise
parameterserver
```