Web Picks (week of 25 January 2016)

Every two weeks, we find the most interesting data science links from around the web and collect them in Data Science Briefings, the DataMiningApps newsletter. Subscribe now for free if you want to be the first to get up to speed on interesting resources.

  • Number of legal Go positions
    On Jan 20, 2016, the number of legal positions on a standard-size (19×19) Go board was determined to be 208168199381979984699478633344862770286522453884530548425639456820927419612738015378525648451698519643907259916015628128546089888314427129715319317557736620397247064840935.
  • OpenFace
    OpenFace is a Python and Torch implementation of face recognition with deep neural networks, based on the CVPR 2015 paper FaceNet: A Unified Embedding for Face Recognition and Clustering by Florian Schroff, Dmitry Kalenichenko, and James Philbin at Google. Torch allows the network to be executed on a CPU or with CUDA.
  • Kaggle Datasets
    Kaggle's new home for discovering and seamlessly analyzing publicly available datasets.
  • The Unreasonable Reputation of Neural Networks
    “It is hard not to be enamoured by deep learning nowadays, watching neural networks show off their endless accumulation of new tricks. There are, as I see it, at least two good reasons to be impressed.”
  • Understanding Deep Convolutional Networks [pdf]
    Deep convolutional networks provide state-of-the-art classification and regression results over many high-dimensional problems. This article reviews their architecture, which scatters data with a cascade of linear filter weights and non-linearities. A mathematical framework is introduced to analyze their properties. Computations of invariants involve multiscale contractions, the linearization of hierarchical symmetries, and sparse separations. Applications are discussed.
  • Visualizing CNN architectures side by side with mxnet
    Convolutional Neural Networks can be visualized as computation graphs with input nodes where the computation starts and output nodes where the result can be read. Here the models that are provided with mxnet are compared using the mx.viz.plot_network method. The output node is at the top and the input node is at the bottom.
  • A ‘Brief’ History of Neural Nets and Deep Learning
    “This is the first part of ‘A Brief History of Neural Nets and Deep Learning’. Part 2 is here, and parts 3 and 4 are here and here. In this part, we shall cover the birth of neural nets with the Perceptron in 1958, the AI Winter of the 70s, and neural nets’ return to popularity with backpropagation in 1986.”
  • Experiments with style transfer
    “Since the original Artistic style transfer and the subsequent Torch implementation of the algorithm by Justin Johnson were released I’ve been playing with various ways to use the algorithm in other ways.”
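As a sanity check on what "legal position" means in the Go result above (every chain of like-colored stones must have at least one liberty, i.e. an adjacent empty point), here is a minimal brute-force sketch in Python that reproduces the known counts for tiny boards. It enumerates all 3^(n²) colorings, which is only feasible for n ≤ 3; the full 19×19 count of course required far more sophisticated methods.

```python
from itertools import product

def legal_positions(n):
    """Count legal n x n Go positions by brute force: a position is
    legal iff every chain of stones has at least one liberty."""
    cells = [(r, c) for r in range(n) for c in range(n)]

    def neighbors(r, c):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= r + dr < n and 0 <= c + dc < n:
                yield (r + dr, c + dc)

    count = 0
    # 0 = empty, 1 = black, 2 = white
    for board in product((0, 1, 2), repeat=n * n):
        pos = dict(zip(cells, board))
        seen = set()
        legal = True
        for cell in cells:
            if pos[cell] == 0 or cell in seen:
                continue
            # flood-fill the chain containing `cell`, noting any liberty
            color = pos[cell]
            stack, chain = [cell], {cell}
            has_liberty = False
            while stack:
                cur = stack.pop()
                for nb in neighbors(*cur):
                    if pos[nb] == 0:
                        has_liberty = True
                    elif pos[nb] == color and nb not in chain:
                        chain.add(nb)
                        stack.append(nb)
            seen |= chain
            if not has_liberty:
                legal = False
                break
        count += legal
    return count

print(legal_positions(1))  # 1: only the empty 1x1 board is legal
print(legal_positions(2))  # 57, matching Tromp's published small-board counts
```

On a 1×1 board a lone stone has no liberties, so only the empty board is legal; the 2×2 count of 57 agrees with the published small-board table.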