I haven’t posted a ton this year. It turns out, however, that this has been a symptom of a very eventful year more so than of disinterest.
“Thus, a loss aversion principle is rendered superfluous to an account of the phenomena it was introduced to explain.”
What better day than Christmas, that day of gift-giving, to discuss “loss aversion,” the purported asymmetry in utility, whereby losses are systematically more painful than gains are pleasant?
The Need for Speed Part 2: C++ vs. Fortran vs. C
Magister Dixit
“The question is no longer whether AI is going to fundamentally change the workplace. According to a recent survey, 85 percent of executives believe that AI will be transformative for their companies, enabling them to enter a new business or gain a competitive advantage. Now, the true question lies in how companies can successfully leverage AI in ways that join, not replace, the human workforce.” Rohit Adlakha (November 3, 2017)
A Guide to Decision Trees for Machine Learning and Data Science
By George Seif, AI / Machine Learning Engineer
If you did not already know
Snake A regularized optimization problem over a large unstructured graph is studied, where the regularization term is tied to the graph geometry. Typical regularization examples include the total variation and the Laplacian regularizations over the graph. When applying the proximal gradient algorithm to solve this problem, there exist quite affordable methods to implement the proximity operator (backward step) in the special case where the graph is a simple path without loops. In this paper, an algorithm, referred to as ‘Snake’, is proposed to solve such regularized problems over general graphs by taking advantage of these fast methods. The algorithm consists of properly selecting random simple paths in the graph and performing the proximal gradient algorithm over these simple paths. This algorithm is an instance of a new general stochastic proximal gradient algorithm, whose convergence is proven. Applications to trend filtering and graph inpainting, among others, are provided. Numerical experiments are conducted over large graphs. …
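For readers unfamiliar with the forward-backward scheme the abstract builds on, here is a minimal R sketch of the plain proximal gradient iteration; the random path selection and path-wise proximity operator that make Snake fast are abstracted behind a user-supplied prox function, and an l1 soft-thresholding prox stands in for the path-wise prox purely for illustration (all names below are hypothetical).

```r
# Minimal sketch of the proximal gradient ("forward-backward") iteration.
# prox_fn is a placeholder for the backward step; in the paper's setting it
# would be a fast 1D prox applied along a randomly selected simple path.
proximal_gradient <- function(grad_f, prox_fn, x0, step, n_iter = 200) {
  x <- x0
  for (k in seq_len(n_iter)) {
    x <- prox_fn(x - step * grad_f(x), step)  # forward (gradient) step, then backward (prox) step
  }
  x
}

# Toy usage: a quadratic fit with an l1 prox (soft-thresholding) standing in
# for the path-wise prox; lambda is an illustrative constant.
grad_f <- function(x) x - c(3, -2, 0.5)
soft_threshold <- function(v, step, lambda = 1) sign(v) * pmax(abs(v) - step * lambda, 0)
proximal_gradient(grad_f, soft_threshold, x0 = numeric(3), step = 0.5)
```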
4 Reasons Santa Needs Machine Learning & AI
Lynn Heidmann
You know how the song goes - he’s making a list, he’s checking it twice. But if you ask us, all Santa Claus should really be doing come December is some light maintenance on his machine learning models before heading off on his ML-optimized route on Christmas Eve. It might take a few elves-turned-data-scientists, but here are four ways that Santa could optimize with data.
Object types and some useful R functions for beginners
All objects in R have a given type. You already know most of them, as these types are also used in mathematics. Integers, floating point numbers (or floats), matrices, etc., are all objects you are already familiar with. But R has other, maybe lesser-known data types (that you can find in a lot of other programming languages) that you need to become familiar with. But first, we need to learn how to assign a value to a variable. This can be done in two ways:
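The two ways in question are the arrow operator and the equals sign, sketched minimally below:

```r
# The two usual ways to assign a value to a variable in R
a <- 3   # the arrow operator, the idiomatic choice in most R style guides
b = 3    # the equals sign also works at the top level

# Both produce ordinary R objects whose type you can then inspect
class(a)    # "numeric"
typeof(a)   # "double"
```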
Dreaming of a white Christmas – with ggmap in R
With the holidays approaching, one of the most discussed questions at STATWORX was whether we’ll have a white Christmas or not. And what better way to get our hopes up than by taking a look at the DWD Climate Data Center’s historical data on snow depth on the past ten Christmas Eves?
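The post itself pulls the DWD station records and maps them with ggmap. As a rough idea of that approach, here is a hedged sketch assuming a hypothetical data frame snow_df with station coordinates and a snow-depth column; the values below are purely illustrative, not DWD data, and the background map is fetched with ggmap's get_stamenmap().

```r
library(ggmap)     # background map tiles and ggmap()
library(ggplot2)

# Hypothetical stations: longitude, latitude and snow depth (cm) on a past
# Christmas Eve. In the post this comes from the DWD Climate Data Center;
# here the numbers are made up for illustration only.
snow_df <- data.frame(
  lon        = c(13.40, 11.58, 9.99),
  lat        = c(52.52, 48.14, 53.55),
  snow_depth = c(0, 12, 3)
)

# Fetch a background map of Germany and overlay the stations
germany <- get_stamenmap(
  bbox = c(left = 5.5, bottom = 47.0, right = 15.5, top = 55.5),
  zoom = 6, maptype = "toner-lite"
)
ggmap(germany) +
  geom_point(data = snow_df, aes(x = lon, y = lat, colour = snow_depth), size = 3) +
  scale_colour_gradient(low = "grey80", high = "steelblue", name = "Snow depth (cm)") +
  labs(title = "Snow depth on Christmas Eve (illustrative data)")
```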
University of Virginia: Faculty, Open Rank Model and Simulation at the Human-Technology Frontier [Charlottesville, VA]
At: University of Virginia
Location: Charlottesville, VA
Web: virginia.edu
Position: Faculty, Open Rank Model and Simulation at the Human-Technology Frontier