Like the Google framework MapReduce, MR describes a way of implementing parallelism using the Map function, which splits a large dataset into multiple key-value pairs.
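To make that Map step concrete, below is a minimal host-side sketch in the MapReduce style: it turns one chunk of text into (word, 1) key-value pairs that could later be shuffled and reduced. The word-count framing and the name map_chunk are illustrative assumptions, not part of any particular framework.

```cuda
// Host-side sketch of a MapReduce-style Map step (word-count framing is an
// illustrative assumption). Map splits a large input into (key, value) pairs;
// independent chunks can be mapped in parallel and reduced afterwards.
#include <cstdio>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Emit one (word, 1) pair for every token in the input chunk.
std::vector<std::pair<std::string, int>> map_chunk(const std::string& chunk) {
    std::vector<std::pair<std::string, int>> pairs;
    std::istringstream in(chunk);
    std::string word;
    while (in >> word) {
        pairs.emplace_back(word, 1);  // key = word, value = partial count
    }
    return pairs;
}

int main() {
    // Each chunk could be handled by a separate worker in parallel.
    for (const auto& kv : map_chunk("to be or not to be")) {
        std::printf("(%s, %d)\n", kv.first.c_str(), kv.second);
    }
    return 0;
}
```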
How does the latency differ when a kernel function is launched from the CPU host versus launched from the GPU using dynamic parallelism?
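A minimal CUDA sketch of the two launch paths in question is given below, assuming dynamic parallelism is available (compute capability 3.5+, compiled with -rdc=true). The kernel names are hypothetical, and the sketch only sets up the comparison; the actual latency difference would still have to be measured, for example with cudaEvent timers.

```cuda
// Two ways to launch the same kernel: from the CPU host, and from a parent
// kernel on the GPU via dynamic parallelism (no host round trip).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void child_kernel(int* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] += 1;  // trivial work
}

__global__ void parent_kernel(int* data, int n) {
    // Dynamic parallelism: a kernel launched from device code.
    if (threadIdx.x == 0 && blockIdx.x == 0) {
        child_kernel<<<(n + 255) / 256, 256>>>(data, n);
    }
}

int main() {
    const int n = 1 << 20;
    int* d_data = nullptr;
    cudaMalloc(&d_data, n * sizeof(int));
    cudaMemset(d_data, 0, n * sizeof(int));

    // Path 1: kernel launched from the CPU host.
    child_kernel<<<(n + 255) / 256, 256>>>(d_data, n);

    // Path 2: kernel launched from another kernel (dynamic parallelism).
    parent_kernel<<<1, 1>>>(d_data, n);

    cudaDeviceSynchronize();
    cudaFree(d_data);
    return 0;
}
```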
Based on the idea of data parallelism, a parallel training model for an RBF (radial basis function) neural network in time-series prediction is proposed to improve training speed.
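As a rough illustration of data parallelism in this setting (an assumption on my part, not the specific model proposed in the cited work), the sketch below assigns one GPU thread per training sample to evaluate the RBF network's forward pass; the per-sample errors would then be aggregated into a weight update. The sizes, names, and Gaussian basis are assumed for the example.

```cuda
// Data-parallel RBF forward pass: one thread per training sample (illustrative).
#include <cmath>
#include <cstdio>
#include <cuda_runtime.h>

#define N_SAMPLES 1024   // training samples, one per thread
#define N_CENTERS 16     // RBF hidden units
#define DIM 8            // length of each time-series input window

__global__ void rbf_forward(const float* x,        // [N_SAMPLES * DIM]
                            const float* centers,  // [N_CENTERS * DIM]
                            const float* weights,  // [N_CENTERS]
                            float sigma,
                            float* y)              // [N_SAMPLES]
{
    int s = blockIdx.x * blockDim.x + threadIdx.x;
    if (s >= N_SAMPLES) return;

    float out = 0.0f;
    for (int j = 0; j < N_CENTERS; ++j) {
        float d2 = 0.0f;
        for (int k = 0; k < DIM; ++k) {
            float diff = x[s * DIM + k] - centers[j * DIM + k];
            d2 += diff * diff;
        }
        out += weights[j] * expf(-d2 / (2.0f * sigma * sigma));
    }
    y[s] = out;  // prediction for sample s; its error feeds the weight update
}

int main() {
    float *x, *c, *w, *y;
    cudaMallocManaged(&x, N_SAMPLES * DIM * sizeof(float));
    cudaMallocManaged(&c, N_CENTERS * DIM * sizeof(float));
    cudaMallocManaged(&w, N_CENTERS * sizeof(float));
    cudaMallocManaged(&y, N_SAMPLES * sizeof(float));
    for (int i = 0; i < N_SAMPLES * DIM; ++i) x[i] = 0.0f;
    for (int i = 0; i < N_CENTERS * DIM; ++i) c[i] = 0.0f;
    for (int i = 0; i < N_CENTERS; ++i) w[i] = 1.0f / N_CENTERS;

    rbf_forward<<<(N_SAMPLES + 255) / 256, 256>>>(x, c, w, 1.0f, y);
    cudaDeviceSynchronize();
    std::printf("y[0] = %f\n", y[0]);

    cudaFree(x); cudaFree(c); cudaFree(w); cudaFree(y);
    return 0;
}
```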
Comparison with results for the classical stability problem of parallel boundary-layer flow shows the key role of non-parallelism under certain conditions.
Finally, some implications and suggestions for teaching the concept of function are given in terms of historical parallelism.