SunJackson Blog


  • Home

  • Categories

  • About

  • Archive

  • Tags

  • Sitemap

  • Charity 404

Single Neuron Gradient Descent

Reposted from: https://cavaunpeu.github.io/2016/05/06/single-neuron-gradient-descent/

Will Wolf


Published on 2016-05-06

In my experience, the gap between a conceptual understanding of how a machine learning model “learns” and a concrete, “I can do this with a pencil and paper” understanding is large. This gap is further exacerbated by the nature of popular machine learning libraries which allow you to use powerful models without knowing how they really work. This isn’t such a bad thing. But knowledge is power. In this post, I aim to close the gap above for a vanilla neural network that learns by gradient descent: we will use gradient descent to learn a weight and a bias for a single neuron. From there, when learning an entire network of millions of neurons, we just do the same thing a bunch more times. The rest is details. The following assumes a cursory knowledge of linear combinations, activation functions, cost functions, and how they all fit together in forward propagation. It is math-heavy with some Python interspersed.
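The single-neuron setup the excerpt describes can be sketched in a few lines. This is a minimal illustration, not the post's actual code: the sigmoid activation, squared-error cost, toy data, and learning rate are all assumptions made here for the sketch.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: learn to map x -> y with one sigmoid neuron (illustrative values).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 0.3, 0.7, 0.9]

w, b = 0.0, 0.0   # the weight and bias we want to learn
lr = 0.5          # learning rate

for _ in range(5000):
    for x, y in zip(xs, ys):
        a = sigmoid(w * x + b)   # forward pass: activation of the neuron
        # Squared-error cost C = (a - y)^2 / 2; the chain rule gives:
        dC_da = a - y            # derivative of cost w.r.t. activation
        da_dz = a * (1 - a)      # derivative of sigmoid w.r.t. its input
        delta = dC_da * da_dz
        w -= lr * delta * x      # dC/dw = delta * x
        b -= lr * delta          # dC/db = delta * 1
```

Training a full network repeats exactly this per-weight update, with the chain rule extended backward through the layers.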

Read more »

Akka Stream

Reposted from: http://rnduja.github.io/2016/05/06/akka_stream_source_flow_sink/

Unknown


Published on 2016-05-06

Reactive Stream vs Akka Stream

Read more »

Google's NHS deal does not bode well for the future of data-sharing

Reposted from: http://inverseprobability.com/2016/05/05/google-nhs-deal

Unknown


Published on 2016-05-05

Originally appeared in the Guardian’s Media and Tech Network

Read more »

The structure of Mafia syndicates

Reposted from: http://www.emilio.ferrara.name/2016/05/04/the-structure-of-mafia-syndacates/

admin


Published on 2016-05-04

S Agreste, S Catanese, P De Meo, E Ferrara, G Fiumara. Network structure and resilience of Mafia syndicates. Information Sciences, 2016

Read more »

White House launches workshops to prepare for Artificial Intelligence

Reposted from: https://aimatters.wordpress.com/2016/05/04/white-house-launches-workshops-to-prepare-for-artificial-intelligence/

Stephen Oman


Published on 2016-05-04

The White House. Photo: Zach Rudisin (own work) [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons. It looks like Artificial Intelligence has really gone mainstream, with the White House taking notice and starting to act.

Read more »

Similar pages for Wikipedia

Reposted from: https://blog.lateral.io/2016/05/similar-pages-wikipedia/

Felix


Published on 2016-05-03

Wikipedia is one of the ten most widely used websites globally and a rich source of information. There are pages for every conceivable topic; the English version alone has over 5 million. Wikipedia makes it possible both to get information about a topic quickly and to delve deeper into the details.

Read more »

A wild dataset has appeared! Now what?

Reposted from: http://kldavenport.com/a-wild-dataset-has-appeared-now-what/

Kevin Davenport


Published on 2016-05-02

Read more »

Becoming a Data Scientist Podcast Episode 10: Trey Causey

Reposted from: https://www.becomingadatascientist.com/2016/05/01/becoming-a-data-scientist-podcast-episode-10-trey-causey/

Renee


Published on 2016-05-01

Read more »

Baseball Card Collecting

Reposted from: http://datagenetics.com/blog/april32016/index.html

Unknown


Published on 2016-04-29

Gotta Catch ’Em All!

Read more »

Rolling and Unrolling RNNs

Reposted from: https://shapeofdata.wordpress.com/2016/04/27/rolling-and-unrolling-rnns/

Jesse Johnson


Published on 2016-04-28

A while back, I discussed Recurrent Neural Networks (RNNs), a type of artificial neural network in which some of the connections between neurons point “backwards”. When a sequence of inputs is fed into such a network, the backward arrows feed information about earlier input values back into the system at later steps. One thing that I didn’t describe in that post was how to train such a network. So in this post, I want to present one way of thinking about training an RNN, called unrolling.
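Unrolling, as the excerpt describes it, replaces the backward connections with ordinary forward connections between copies of the network, one copy per time step. A rough sketch of that idea (the tanh activation, single hidden unit, and all parameter values are illustrative assumptions, not details from the post):

```python
import math

def rnn_unrolled(inputs, w_in, w_rec, b):
    """Forward pass of a one-unit RNN, unrolled over the input sequence.

    The recurrent ("backward") connection becomes a forward connection
    from the copy of the network at step t-1 to the copy at step t, so
    the unrolled network is a plain feed-forward chain.
    """
    h = 0.0       # hidden state before the first step
    states = []
    for x in inputs:
        # Each iteration is one "copy" of the network in the unrolled
        # picture; h carries information from earlier inputs forward.
        h = math.tanh(w_in * x + w_rec * h + b)
        states.append(h)
    return states

states = rnn_unrolled([1.0, 0.5, -0.5], w_in=0.8, w_rec=0.3, b=0.0)
```

Because the unrolled chain is feed-forward, ordinary backpropagation can be applied to it, which is what training through time amounts to.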

Read more »
SunJackson

3974 posts
5 categories
© 2018 - 2019 SunJackson