The name mxnet-mkl was used for the MKL2017 integration before it was replaced by the MKL-DNN integration in the 1.2.0 release. To provide a consistent experience to users, we reused this name to deliver the MXNet + MKL-DNN pip package.
Actually, MKL-DNN doesn't fulfill BLAS requirements. MKL-DNN itself doesn't provide any BLAS functionality, so we still need a BLAS library (currently, in the Makefile, one of OpenBLAS/Atlas/MKL) to build MXNet even when MKL-DNN is enabled. We talked about building the mxnet-mkl package with USE_BLAS=mkl instead, but we ran into MKL's licensing issue if we want to redistribute it with MXNet. Hope this answers your questions.

-tao

-----Original Message-----
From: Aaron Markham [mailto:[email protected]]
Sent: Thursday, September 20, 2018 11:26 PM
To: [email protected]
Subject: Re: Remove MKLML as dependency

I find it unintuitive that mxnet-mkl doesn't actually ship with MKL. Why isn't it called mxnet-mkldnn instead?

Side note: if mkldnn fulfills BLAS requirements, then why can't we strip out OpenBLAS from the "mxnet-mkl" package? Is there no way to make the submodules conform to using mkldnn? All in the spirit of simplifying things... limiting the deps...

On Sep 20, 2018 07:41, "Lv, Tao A" <[email protected]> wrote:

Hah, it seems this is a little confusing. I think the "Intel MKL" in the first statement includes both the full MKL and the MKLML library, and the "dynamic library" there obviously means MKLML, which is delivered in the MKL-DNN repo.

MKLML is a subset of the full MKL and includes all BLAS functions, for both single precision and double precision. From this point of view, I think it can be used as a BLAS library, but it cannot be used as the full MKL.

-tao

-----Original Message-----
From: Chris Olivier [mailto:[email protected]]
Sent: Thursday, September 20, 2018 9:36 PM
To: [email protected]
Subject: Re: Remove MKLML as dependency

Thanks for the info. I am still a little confused: your statement said "MKL" and not "MKLML", so my question is still the same. Are GEMMs in MKLML or just MKL? I know MKLML doesn't have a BLAS library like the main MKL.
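As background to the GEMM question in the exchange above: the "BLAS functions" under discussion are the standard dense matrix-multiply routines such as SGEMM (single precision) and DGEMM (double precision), which MKLML ships but MKL-DNN alone does not. A minimal sketch of what calling SGEMM looks like, using SciPy purely for illustration (SciPy is not mentioned in the thread; it simply exposes the GEMM routine of whichever BLAS backend it was built against, be that OpenBLAS, MKL, or MKLML):

```python
# Illustration (not from the thread): SGEMM is the single-precision
# matrix-multiply routine that a BLAS library such as MKL/MKLML provides,
# and that MKL-DNN delegates to for certain convolution/inner-product shapes.
import numpy as np
from scipy.linalg.blas import sgemm  # GEMM from the linked BLAS backend

a = np.array([[1.0, 2.0], [3.0, 4.0]], dtype=np.float32)
b = np.array([[5.0, 6.0], [7.0, 8.0]], dtype=np.float32)

# Computes C = alpha * A @ B in single precision.
c = sgemm(alpha=1.0, a=a, b=b)
print(c)
```

Which library actually services this call depends on how the Python stack was built, which is exactly the linkage question the thread is wrestling with.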
On Wed, Sep 19, 2018 at 11:49 PM Lv, Tao A <[email protected]> wrote:
> Hi Chris, please kindly check the statements here:
> https://github.com/intel/mkl-dnn#installation
>
> "Intel MKL-DNN can take advantage of the optimized matrix-matrix
> multiplication (GEMM) function from Intel MKL. The dynamic library
> with this functionality is included in the repository."
>
> "You can choose to build Intel MKL-DNN without the binary dependency.
> The resulting version will be fully functional, however the performance
> of certain convolution shapes and sizes, and of inner product relying
> on the SGEMM function, may be suboptimal."
>
> -tao
>
> -----Original Message-----
> From: Chris Olivier [mailto:[email protected]]
> Sent: Thursday, September 20, 2018 11:20 AM
> To: [email protected]
> Subject: Re: Remove MKLML as dependency
>
> Maybe I missed it, but what does MKLML have that mkldnn doesn't have
> that makes it necessary?
>
> What's the motivation for removing it?
>
> On Tue, Sep 18, 2018 at 11:31 PM Lv, Tao A <[email protected]> wrote:
> >
> > If you just want to test the performance, I think you need to link MKL
> > for BLAS and MKL-DNN for NN. Also, MKL-DNN should link MKL for better
> > performance.
> >
> > Here are some ways to install the full MKL library if you don't have one:
> >
> > 1. Register and download from the Intel website:
> >    https://software.intel.com/en-us/mkl
> > 2. Apt-get/yum: currently this requires configuring Intel's repositories.
> >    a. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-yum-repo
> >    b. https://software.intel.com/en-us/articles/installing-intel-free-libs-and-python-apt-repo
> > 3. pip install mkl / mkl-devel: the 'mkl' package has the runtime and
> >    'mkl-devel' includes everything plus the headers.
> >    a.
> >    https://software.intel.com/en-us/articles/installing-the-intel-distribution-for-python-and-intel-performance-libraries-with-pip-and
> > 4. conda install: also has mkl and mkl-devel.
> >    a. https://anaconda.org/intel/mkl
> >    b. https://anaconda.org/intel/mkl-devel
> >
> > If you want to redistribute MKL with MXNet, you may need to take care
> > of the license issue. Currently, MKL is distributed under the ISSL
> > (https://software.intel.com/en-us/license/intel-simplified-software-license).
> >
> > -----Original Message-----
> > From: Zai, Alexander [mailto:[email protected]]
> > Sent: Wednesday, September 19, 2018 12:49 PM
> > To: [email protected]
> > Subject: Re: Remove MKLML as dependency
> >
> > Will test it out tomorrow.
> >
> > On the side, what is the best way to test the MKL build of MXNet? MKL
> > is licensed?
> >
> > Best,
> > Alex
> >
> > On 9/18/18, 7:50 PM, "Lv, Tao A" <[email protected]> wrote:
> >
> > Hi Alex,
> >
> > Thanks for bringing this up.
> >
> > The original intention of MKLML was to provide a light and
> > easy-to-access library for the ML/DL community. It's released with
> > MKL-DNN under the Apache-2.0 license.
> >
> > AFAIK, MKL-DNN still relies on it for better performance, so I'm
> > afraid there will be a performance regression in the MKL pip packages
> > if MKLML is simply removed.
> >
> > Have you tried the build without MKLML, and how does the performance
> > look?
> >
> > -tao
> >
> > -----Original Message-----
> > From: Alex Zai [mailto:[email protected]]
> > Sent: Wednesday, September 19, 2018 4:49 AM
> > To: [email protected]
> > Subject: Remove MKLML as dependency
> >
> > On our build-from-source page we have a list of recommended BLAS
> > libraries:
> >
> > https://mxnet.incubator.apache.org/install/build_from_source.html
> >
> > MKL-DNN
> > MKL
> > MKLML
> > Apple Accelerate
> > OpenBlas
> >
> > MKLML is a subset of MKL (https://github.com/intel/mkl-dnn/issues/102),
> > so MKLML users can just use MKL instead. Does anyone see an issue with
> > me removing this? It would simplify our doc page and build file.
> >
> > Alex
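Alex's side question in the thread, "what is the best way to test the MKL build", can be partially answered at the Python level. A hypothetical sketch, using NumPy as a stand-in (checking MXNet's own linkage would instead require inspecting libmxnet.so with a tool like ldd; nothing below is prescribed by the thread):

```python
# Hypothetical check (not from the thread): inspect which BLAS backend a
# NumPy installation was linked against. An MKL-backed build mentions
# 'mkl' libraries in its build config; an OpenBLAS build mentions 'openblas'.
import io
from contextlib import redirect_stdout

import numpy as np

buf = io.StringIO()
with redirect_stdout(buf):
    np.show_config()  # prints the build/linkage configuration
config = buf.getvalue()

print("MKL-linked:", "mkl" in config.lower())
print("OpenBLAS-linked:", "openblas" in config.lower())
```

The same idea (grep the build configuration for the BLAS vendor) is a cheap sanity check before running any performance comparison between MKL and non-MKL builds.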
