By Dr. Ricardo Baeza-Yates, CTO, NTENT
A Neural Architecture for Bayesian Compressive Sensing over the Simplex via Laplace Techniques
Igor (noreply@blogger.com)
Posted on
If you did not already know
Optimistic Lower Bounds Optimization (OLBO)
While model-based reinforcement learning has empirically been shown to significantly reduce the sample complexity that hinders model-free RL, the theoretical understanding of such methods has been rather limited. In this paper, we introduce a novel algorithmic framework for designing and analyzing model-based RL algorithms with theoretical guarantees, together with a practical algorithm, Optimistic Lower Bounds Optimization (OLBO). In particular, we derive a theoretical guarantee of monotone improvement for model-based RL with our framework. We iteratively build a lower bound of the expected reward based on the estimated dynamical model and sample trajectories, and maximize it jointly over the policy and the model. Assuming the optimization in each iteration succeeds, the expected reward is guaranteed to improve. The framework also incorporates an optimism-driven perspective, and reveals the intrinsic measure for the model prediction error. Preliminary simulations demonstrate that our approach outperforms the standard baselines on continuous control benchmark tasks. …
Job: Postdoctoral Researcher in Small Data Deep Learning and Explainable Machine Learning, Livermore, CA
Tidyverse 'Starts_with' in M/Power Query
As a heavy R and Tidyverse user, I've been playing with Microsoft's M/Power Query language, included in Excel and Power BI, from that perspective: looking for the functions that make my life easier, developing small code pipelines for my processing, and trying to get a smooth, clear, and maintainable data manipulation process in place.
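The Tidyverse idiom the post is porting — selecting columns by name prefix with `starts_with()` — has close analogues outside R as well. As a point of comparison, here is a minimal pandas sketch (hypothetical data; the Power Query M code itself is in the original post):

```python
# Selecting columns by name prefix, the pandas analogue of
# dplyr's select(starts_with("sales")). Hypothetical example data.
import pandas as pd

df = pd.DataFrame({
    "sales_q1": [1, 2],
    "sales_q2": [3, 4],
    "region":   ["north", "south"],
})

# Boolean mask over column names, comparable to starts_with("sales").
sales = df.loc[:, df.columns.str.startswith("sales")]
print(list(sales.columns))  # ['sales_q1', 'sales_q2']
```

The same effect can be had with `df.filter(regex="^sales")`; either way, the selection is driven by the column names rather than hard-coded positions, which is what makes the Tidyverse pattern maintainable.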
Document worth reading: “Big Data Systems Meet Machine Learning Challenges: Towards Big Data Science as a Service”
Recently, we have been witnessing huge advancements in the scale of data we routinely generate and collect in pretty much everything we do, as well as in our ability to exploit modern technologies to process, analyze, and understand this data. The intersection of these trends is what is nowadays called Big Data Science. Cloud computing represents a practical and cost-effective solution for supporting Big Data storage and processing and for sophisticated analytics applications. We analyze in detail the building blocks of the software stack for supporting big data science as a commodity service for data scientists. We provide various insights about the latest ongoing developments and open challenges in this domain. Big Data Systems Meet Machine Learning Challenges: Towards Big Data Science as a Service
Bayesian inference and religious belief
We’re speaking here not of Bayesianism as a religion but of the use of Bayesian inference to assess or validate the evidence regarding religious belief, in short, the probability that God !=0 or the probability that the Pope is Catholic or, as Tyler Cowen put it, the probability that Lutheranism is true.
If you did not already know
Macroblock Scaling (MBS)
We estimate the proper channel (width) scaling of Convolutional Neural Networks (CNNs) for model reduction. Unlike the traditional scaling method that reduces every CNN channel width by the same scaling factor, we address each CNN macroblock adaptively depending on its information redundancy measured by our proposed effective flops. Our proposed macroblock scaling (MBS) algorithm can be applied to various CNN architectures to reduce their model size. These applicable models range from compact CNN models such as MobileNet (25.53% reduction, ImageNet) and ShuffleNet (20.74% reduction, ImageNet) to ultra-deep ones such as ResNet-101 (51.67% reduction, ImageNet) and ResNet-1202 (72.71% reduction, CIFAR-10) with negligible accuracy degradation. MBS also performs better reduction at a much lower cost than does the state-of-the-art optimization-based method. MBS's simplicity and efficiency, its flexibility to work with any CNN model, and its scalability to work with models of any depth make it an attractive choice for CNN model size reduction. …
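The contrast the abstract draws — one global width multiplier versus a per-macroblock factor driven by measured redundancy — can be illustrated with a toy sketch. The channel counts and redundancy scores below are made-up numbers, and the proportional shrinking rule is a stand-in for the paper's effective-flops measure, not its actual algorithm:

```python
# Toy comparison of uniform vs. per-macroblock adaptive channel scaling
# (hypothetical numbers; not the paper's effective-flops criterion).

# Channels per macroblock, with an assumed redundancy score in [0, 1].
blocks = [
    {"channels": 64,  "redundancy": 0.6},
    {"channels": 128, "redundancy": 0.2},
    {"channels": 256, "redundancy": 0.5},
]

def uniform_scale(blocks, factor):
    # Traditional approach: every block shrinks by the same factor.
    return [int(b["channels"] * factor) for b in blocks]

def adaptive_scale(blocks):
    # Adaptive approach: shrink each block in proportion to how
    # redundant its channels are judged to be.
    return [int(b["channels"] * (1.0 - b["redundancy"])) for b in blocks]

print(uniform_scale(blocks, 0.75))  # [48, 96, 192]
print(adaptive_scale(blocks))       # [25, 102, 128]
```

The point of the adaptive variant is that a low-redundancy block (here the 128-channel one) keeps most of its width while highly redundant blocks absorb most of the reduction, instead of every block paying the same uniform cut.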
R Packages worth a look
Interface to ‘MLflow’ (mlflow): R interface to ‘MLflow’, an open-source platform for the complete machine learning life cycle; see <htt …
Sunday Morning Video (in French): Grothendieck's work on Banach spaces ("Les travaux de Grothendieck sur les espaces de Banach"), Gilles Pisier (Lectures grothendieckiennes)