Preconditioner
Generic information on the test problems, including n and m, is given in Table 1.

Table 1: The sizes of the matrices A, B, C and D.

In Table 2 we report the results of the Schur complement approximation in terms of CPU times; a sketch of one common way to form such an approximation is given after the table captions below.

Table 2: Numerical results of the Schur complement approximation.

Table 3: Numerical results for the three preconditioned GMRES methods.

Table 4: Numerical results for the three preconditioned FGMRES methods.

Table 5: Numerical results for the three preconditioned BICGSTAB methods.
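The exact construction of the approximate Schur complement evaluated in Table 2 is not reproduced in this excerpt. As an illustration only, the sketch below (in Python with SciPy, an assumed language since none is specified here) forms the common diagonal-based approximation Ŝ = D − C diag(A)⁻¹ B for a 2×2 block matrix whose blocks are named after Table 1; the block sizes and values are placeholders, not the paper's test problems.

```python
# Minimal sketch (assumption: Python/SciPy; the paper's implementation and the
# exact definition of its approximate Schur complement are not shown in this excerpt).
import numpy as np
import scipy.sparse as sp

def approximate_schur(A, B, C, D):
    """Return S_hat = D - C * diag(A)^{-1} * B, a cheap Schur complement
    approximation that avoids an exact factorization of the (1,1) block A."""
    inv_diag_A = sp.diags(1.0 / A.diagonal())   # assumes A has a nonzero diagonal
    return (D - C @ inv_diag_A @ B).tocsr()

# Tiny placeholder blocks (sizes and values are illustrative, not the Table 1 problems).
n, m = 8, 4
A = (sp.eye(n) + 0.1 * sp.random(n, n, density=0.2, random_state=0)).tocsr()
B = (0.1 * sp.random(n, m, density=0.3, random_state=1)).tocsr()
C = (0.1 * sp.random(m, n, density=0.3, random_state=2)).tocsr()
D = sp.eye(m).tocsr()

S_hat = approximate_schur(A, B, C, D)
print(S_hat.shape)   # (m, m), i.e. the size of the (2,2) block
```

Replacing diag(A) with a more accurate but still cheap approximation of A (for instance an incomplete factorization) trades additional setup cost for a better Ŝ and, typically, fewer Krylov iterations.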
In Tables 3, 4 and 5 we report the results for the preconditioned GMRES, FGMRES and BICGSTAB iterative methods. From the numerical results listed in these tables, we can conclude that the P_{α,Ŝ}-preconditioned GMRES, FGMRES and BICGSTAB methods require fewer iterations and less CPU time than the P_T- and P_D-preconditioned methods in all trials. The P_D and P_T preconditioners do not converge for Test 2, whereas the P_{α,Ŝ} preconditioner converges for both Test 1 and Test 2. (A minimal sketch of how a block preconditioner of this kind can be applied inside a Krylov solver is given after the Conclusion.)

Conclusion

In the present work, we have developed and studied numerically a parallel block preconditioner for a class of linear systems arising from the IVS model. The parallel block preconditioner is based on an approximate Schur complement (Block-Schur) and on a regularization technique. Several numerical experiments have been conducted on a parallel computer architecture in order to study the performance of the iterative solvers in terms of Krylov subspace iterations and computational time. The numerical results of Section 5 (Tables 3, 4 and 5) reveal that the regularized parallel block preconditioned Krylov subspace methods with a suitable parameter are clearly superior to the P_D- and P_T-preconditioned Krylov subspace methods in terms of iterations and CPU times, and illustrate that the regularized parallel block preconditioned Krylov method is a very efficient method for solving (1). We should mention, however, that the new preconditioner involves the parameter α; how to choose the optimal parameter for the regularized preconditioner is a practical and interesting problem that deserves further in-depth study.
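As an illustration of how the preconditioners compared above enter the Krylov solvers, the following minimal serial sketch (again Python/SciPy, an assumed language) applies a block upper-triangular preconditioner built from A and the diagonal-based approximate Schur complement Ŝ inside GMRES. It is not the parallel implementation used in the experiments, and neither the exact regularized form P_{α,Ŝ} nor the value of α from the paper is reproduced here; the matrices are small placeholders.

```python
# Minimal serial sketch (assumption: Python/SciPy; not the parallel solver used in
# the paper, and not the exact regularized preconditioner P_{alpha, S_hat}).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Small placeholder blocks standing in for the A, B, C, D of Table 1.
n, m = 8, 4
A = (sp.eye(n) + 0.1 * sp.random(n, n, density=0.2, random_state=0)).tocsr()
B = (0.1 * sp.random(n, m, density=0.3, random_state=1)).tocsr()
C = (0.1 * sp.random(m, n, density=0.3, random_state=2)).tocsr()
D = sp.eye(m).tocsr()

# Diagonal-based approximate Schur complement, as in the earlier sketch.
S_hat = (D - C @ sp.diags(1.0 / A.diagonal()) @ B).tocsr()

# Block upper-triangular preconditioner P = [[A, B], [0, S_hat]]:
# applying P^{-1} costs one solve with S_hat and one solve with A.
A_lu = spla.splu(A.tocsc())
S_lu = spla.splu(S_hat.tocsc())

def apply_precond(r):
    r1, r2 = r[:n], r[n:]
    z2 = S_lu.solve(r2)                 # S_hat * z2 = r2
    z1 = A_lu.solve(r1 - B @ z2)        # A * z1 = r1 - B * z2
    return np.concatenate([z1, z2])

M = spla.LinearOperator((n + m, n + m), matvec=apply_precond)

# Solve the full block system K x = b with preconditioned GMRES.
K = sp.bmat([[A, B], [C, D]], format="csr")
b = np.ones(n + m)
x, info = spla.gmres(K, b, M=M, restart=30)
print("GMRES convergence flag:", info)  # 0 indicates convergence
```

FGMRES and BICGSTAB can be supplied with the same preconditioner; FGMRES additionally tolerates a preconditioner that changes from one iteration to the next, for example when the solve with Ŝ is itself performed by an inner iterative method.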