* '''GPU computing does not work well for linear programming.''' The biggest reason is [https://groups.google.com/forum/?fromgroups#!searchin/gurobi/gpu/gurobi/KTP6zDvodII/oPPQT4-mofMJ that GPUs do not handle well the sparse linear algebra used by most LP solvers]. A [https://parallellp.wordpress.com/author/parallellp/ case study] found that "the overhead of communication between CPU and GPU grows faster than the benefit of parallelizing matrix operations via CUDA." A [https://www.researchgate.net/publication/308967833_GPU_Computing_Applied_to_Linear_and_Mixed_Integer_Programming technical report] likewise concludes that "all algorithms (for solving LP and MIP problems)... do not fit to the SIMT paradigm".
 
* For the above reasons, Gurobi '''does not''' support GPU computing.
 
* MATLAB '''does not''' provide built-in GPU-accelerated versions of linprog or ga.
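The sparse-algebra argument above can be illustrated with a minimal sketch (Python/NumPy; the CSR layout and the row-per-thread mapping are illustrative assumptions, not taken from any particular solver). LP constraint matrices are overwhelmingly sparse, so solvers store only the nonzeros in compressed formats such as CSR; the resulting kernels have irregular, per-row work that maps poorly onto the lockstep SIMT execution model of GPUs:

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    """y = A @ x for a matrix stored in CSR form (data, indices, indptr)."""
    m = len(indptr) - 1
    y = np.zeros(m)
    for i in range(m):  # think of each row as one GPU "thread"
        # Rows have wildly different nonzero counts, so this inner loop
        # runs a different number of iterations per row. On a GPU, the
        # threads of a warp would diverge and stall here, which is the
        # SIMT mismatch the sources above describe.
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

# A 4x4 matrix with 5 nonzeros (real LP matrices are far sparser):
# [[1, 0, 0, 2],
#  [0, 3, 0, 0],
#  [0, 0, 0, 0],
#  [4, 0, 5, 0]]
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
indices = np.array([0, 3, 1, 0, 2])
indptr = np.array([0, 2, 3, 3, 5])
x = np.ones(4)

print(csr_matvec(data, indices, indptr, x))  # [3. 3. 0. 9.]
```

The same irregularity also explains the communication-overhead finding: shipping a compressed matrix to the GPU and back costs more than the uneven arithmetic saves.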
==References==
1. [https://www.mathworks.com/discovery/matlab-gpu.html MATLAB GPU Computing Support for NVIDIA CUDA-Enabled GPUs] <br>
2. [https://www.mathworks.com/help/distcomp/getting-started-with-parallel-computing-toolbox.html Getting Started with Parallel Computing Toolbox] <br>
