If you did not already know

Model Management Deep Neural Network (MMdnn) MMdnn is a set of tools to help users inter-operate among different deep learning frameworks, e.g. for model conversion and visualization. It converts models between Caffe, Keras, MXNet, TensorFlow, CNTK, PyTorch and CoreML, providing a comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models. The ‘MM’ in MMdnn stands for model management and ‘dnn’ is an acronym for deep neural network. In essence, it converts DNN models trained with one framework into the formats of others. The major features include:
· Model File Converter: converting DNN models between frameworks
· Model Code Snippet Generator: generating training or inference code snippets for frameworks
· Model Visualization: visualizing DNN network architecture and parameters for frameworks
· Model compatibility testing (on-going)
This project is designed and developed by Microsoft Research (MSR). We also encourage researchers and students to leverage this project to analyze DNN models, and we welcome any new ideas to extend it. …
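
To make the converter idea concrete, here is a small conceptual sketch in Python. It is not MMdnn's actual API; every name below is hypothetical. It only illustrates the two-stage design such tools use: read a framework-specific model description into a framework-neutral intermediate representation (IR), then emit code for another framework from that IR, mirroring the Model File Converter and Model Code Snippet Generator features listed above.

```python
# Hypothetical sketch of the intermediate-representation (IR) idea behind
# cross-framework converters such as MMdnn -- not MMdnn's actual API.

def keras_config_to_ir(layers):
    """Map a minimal Keras-style layer config into a framework-neutral IR."""
    ir = []
    for layer in layers:
        if layer["class_name"] == "Dense":
            ir.append({"op": "FullyConnected",
                       "units": layer["config"]["units"],
                       "activation": layer["config"].get("activation", "linear")})
        else:
            raise NotImplementedError(f"unsupported layer: {layer['class_name']}")
    return ir

def ir_to_pytorch_code(ir, in_features):
    """Emit a PyTorch-style code snippet from the IR (the code-generation step)."""
    lines = ["import torch.nn as nn", "", "model = nn.Sequential("]
    prev = in_features
    for node in ir:
        lines.append(f"    nn.Linear({prev}, {node['units']}),")
        if node["activation"] == "relu":
            lines.append("    nn.ReLU(),")
        prev = node["units"]
    lines.append(")")
    return "\n".join(lines)

# A two-layer Keras-style config converted into a PyTorch-style snippet.
keras_layers = [
    {"class_name": "Dense", "config": {"units": 64, "activation": "relu"}},
    {"class_name": "Dense", "config": {"units": 10}},
]
print(ir_to_pytorch_code(keras_config_to_ir(keras_layers), in_features=784))
```

The real tool handles full models with trained weights and many more layer types; the sketch only shows why a shared IR lets one reader and one emitter per framework replace pairwise converters.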

Dopamine Dopamine is a research framework for fast prototyping of reinforcement learning algorithms. It aims to fill the need for a small, easily grokked codebase in which users can freely experiment with wild ideas (speculative research). Our design principles are:
• Easy experimentation: Make it easy for new users to run benchmark experiments.
• Flexible development: Make it easy for new users to try out research ideas.
• Compact and reliable: Provide implementations for a few, battle-tested algorithms.
• Reproducible: Facilitate reproducibility in results. …
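
As a hedged illustration of the "easy experimentation" principle, the snippet below follows the usage pattern of Dopamine's published training script (dopamine/discrete_domains/train.py): load a gin configuration, create a runner, and run the experiment. The config path, base directory and exact signatures are assumptions and may differ between versions of the package.

```python
# Sketch of launching a benchmark run with Dopamine, assuming the dopamine-rl
# package and an Atari environment are installed; paths are placeholders.
from dopamine.discrete_domains import run_experiment

BASE_DIR = "/tmp/dopamine_runs/dqn"               # where logs and checkpoints go
GIN_FILE = "dopamine/agents/dqn/configs/dqn.gin"  # assumed location of a DQN config

run_experiment.load_gin_configs([GIN_FILE], [])   # gin files, gin bindings
runner = run_experiment.create_runner(BASE_DIR)   # builds agent + environment from the config
runner.run_experiment()                           # train/evaluate per the gin schedule
```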

Partition Set Cover Problem Various $O(\log n)$ approximations are known for the Set Cover problem, where $n$ is the number of elements. We study a generalization of the Set Cover problem, called the Partition Set Cover problem. Here, the elements are partitioned into $r$ color classes, and we are required to cover at least $k_t$ elements from each color class $\mathcal{C}_t$, using the minimum number of sets. We give a randomized LP rounding algorithm that is an $O(\beta + \log r)$ approximation for the Partition Set Cover problem. Here $\beta$ denotes the approximation guarantee for a related Set Cover instance obtained by rounding the standard LP. As a corollary, we obtain improved approximation guarantees for various set systems, for which $\beta$ is known to be sublogarithmic in $n$. …
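
To make the LP-rounding idea concrete, the following Python sketch (function names are hypothetical, and scipy is assumed to be available) solves the natural LP relaxation, with set variables $x_S$, element variables $z_e \le \sum_{S \ni e} x_S$, and per-class constraints $\sum_{e \in \mathcal{C}_t} z_e \ge k_t$, and then applies plain independent randomized rounding with an oversampling factor. The paper's algorithm is more refined and is what achieves the stated $O(\beta + \log r)$ guarantee; this sketch only illustrates the relaxation and the rounding step.

```python
# Simplified LP-relaxation + randomized-rounding sketch for Partition Set Cover.
# Illustrative only; not the paper's algorithm or its approximation guarantee.
import random
import numpy as np
from scipy.optimize import linprog

def partition_set_cover_lp_rounding(sets, classes, demands, alpha=2.0,
                                    max_rounds=100, seed=0):
    """sets: list of frozensets; classes[t]: elements of colour class C_t;
    demands[t]: k_t elements of C_t that must be covered."""
    rng = random.Random(seed)
    elements = sorted({e for S in sets for e in S})
    eidx = {e: i for i, e in enumerate(elements)}
    m, n = len(sets), len(elements)

    # Variables: x_0..x_{m-1} (sets), then z_0..z_{n-1} (elements).
    c = np.concatenate([np.ones(m), np.zeros(n)])
    rows, rhs = [], []
    # z_e <= sum_{S ni e} x_S  rewritten as  z_e - sum x_S <= 0
    for e in elements:
        row = np.zeros(m + n)
        row[m + eidx[e]] = 1.0
        for j, S in enumerate(sets):
            if e in S:
                row[j] = -1.0
        rows.append(row)
        rhs.append(0.0)
    # sum_{e in C_t} z_e >= k_t  rewritten as  -sum z_e <= -k_t
    for C, k in zip(classes, demands):
        row = np.zeros(m + n)
        for e in C:
            if e in eidx:
                row[m + eidx[e]] = -1.0
        rows.append(row)
        rhs.append(-float(k))

    res = linprog(c, A_ub=np.array(rows), b_ub=np.array(rhs),
                  bounds=[(0.0, 1.0)] * (m + n), method="highs")
    if not res.success:
        raise ValueError("LP relaxation is infeasible or failed to solve")
    x = res.x[:m]

    def satisfied(chosen):
        covered = set().union(*(sets[j] for j in chosen)) if chosen else set()
        return all(len(covered & set(C)) >= k for C, k in zip(classes, demands))

    # Independent randomized rounding, repeated until every class demand is met.
    for _ in range(max_rounds):
        chosen = [j for j in range(m) if rng.random() < min(1.0, alpha * x[j])]
        if satisfied(chosen):
            return chosen
    return list(range(m))  # fallback: take every set

# Tiny example: cover at least one element of {1, 2} and both elements of {3, 4}.
sets = [frozenset({1, 2}), frozenset({2, 3}), frozenset({3, 4})]
print(partition_set_cover_lp_rounding(sets, classes=[[1, 2], [3, 4]], demands=[1, 2]))
```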
