We at STATWORX work a lot with R, and we often use the same little helper functions within our projects. These functions ease our daily work by reducing repetitive code or by creating overviews of our projects. At first, there was no plan to make a package, but I soon realised that it would be much easier to share and improve those functions if they were within a package. Until the 24th of December, I will present one function each day from helfRlein.
So, on the 9th day of Christmas my true love gave to me…
If you did not already know
Kernel Regression With Sparse Metric Learning (KR-SML) Kernel regression is a popular non-parametric fitting technique. It aims at learning a function that estimates the targets for test inputs as precisely as possible. Generally, the function value for a test input is estimated by a weighted average of the surrounding training examples. The weights are typically computed by a distance-based kernel function and strongly depend on the distances between examples. In this paper, we first review the latest developments in sparse metric learning and kernel regression. Then a novel kernel regression method involving sparse metric learning, called kernel regression with sparse metric learning (KR-SML), is proposed. The sparse kernel regression model is established by enforcing a mixed $(2,1)$-norm regularization over the metric matrix. It learns a Mahalanobis distance metric by a gradient descent procedure, which can simultaneously conduct dimensionality reduction and lead to good prediction results. Our work is the first to combine kernel regression with sparse metric learning. To verify the effectiveness of the proposed method, it is evaluated on 19 data sets for regression. Furthermore, the new method is also applied to the practical problem of forecasting short-term traffic flows. In the end, we compare the proposed method with three other related kernel regression methods on all test data sets under two criteria. Experimental results show that the proposed method is much more competitive. …
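To make the "weighted average of surrounding training examples" concrete, here is a minimal sketch of plain Nadaraya-Watson kernel regression with a Gaussian kernel. It is not the authors' KR-SML method: in KR-SML the squared distance would be a learned Mahalanobis distance, with the metric matrix estimated by gradient descent under a mixed $(2,1)$-norm penalty; in this sketch the metric is simply the identity, and the function name and toy data are purely illustrative.

```r
# Plain Nadaraya-Watson kernel regression with a Gaussian kernel.
# KR-SML would replace the squared Euclidean distance below with a learned
# Mahalanobis distance (x - x_i)' M (x - x_i); here M is the identity.
kernel_regression <- function(X_train, y_train, X_test, bandwidth = 1) {
  apply(X_test, 1, function(x) {
    d2 <- rowSums(sweep(X_train, 2, x)^2)   # squared distances to training points
    w  <- exp(-d2 / (2 * bandwidth^2))      # Gaussian kernel weights
    sum(w * y_train) / sum(w)               # weighted average of training targets
  })
}

# Toy usage on a one-dimensional sine curve
set.seed(1)
X_train <- matrix(runif(100, 0, 2 * pi), ncol = 1)
y_train <- sin(X_train[, 1]) + rnorm(100, sd = 0.1)
X_test  <- matrix(seq(0, 2 * pi, length.out = 50), ncol = 1)
y_hat   <- kernel_regression(X_train, y_train, X_test, bandwidth = 0.3)
```

The bandwidth plays the role that the learned metric generalises: it controls how quickly a training example's influence decays with distance from the test input.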
What's new on arXiv
Unsupervised Deep Slow Feature Analysis for Change Detection in Multi-Temporal Remote Sensing Images
Smartly select and mutate data frame columns, using dict
Motivation
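As a rough illustration of the idea in the title, here is a small sketch in R, assuming a named character vector plays the role of the dict: its names give the new column names and its values the existing ones. The column names and the transformation are made up for the example.

```r
library(dplyr)

# A named vector acting as the "dict": new name -> existing column
col_dict <- c(sepal_length = "Sepal.Length", species = "Species")

iris %>%
  select(all_of(col_dict)) %>%              # select and rename in one step
  mutate(sepal_length = sepal_length * 10)  # mutate via the new names
```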
Magister Dixit
“People share and put billions of connections into this big graph every day. We don’t want to just add incrementally to that. We want, over the next five or ten years, to take on a road map to try to understand everything in the world semantically and map everything out. These are the big themes for us and is what we are going to try and do over the next five or ten years. That is what I have tried to focus us on …” Mark Zuckerberg ( September 11, 2013 )
An 8-hour course on R and Data Mining
I will run an 8-hour course on R and Data Mining at Black Mountain, CSIRO, Australia on 10 & 13 December 2018.
Document worth reading: “What Do We Understand About Convolutional Networks”
This document will review the most prominent proposals using multilayer convolutional architectures. Importantly, the various components of a typical convolutional network will be discussed through a review of different approaches that base their design decisions on biological findings and/or sound theoretical bases. In addition, the different attempts at understanding ConvNets via visualizations and empirical studies will be reviewed. The ultimate goal is to shed light on the role of each layer of processing involved in a ConvNet architecture, distill what we currently understand about ConvNets and highlight critical open problems. What Do We Understand About Convolutional Networks
Distilled News
Using Semantic Web technologies in the development of data warehouses: A systematic mapping
Interesting packages taken from R/Pharma
A few months ago I joined the R/Pharma conference in Cambridge, MA.