115-Dssm for Recommender System

  • Learning Deep Structured Semantic Models for Web Search using Clickthrough Data
  • Mixed Negative Sampling for Learning Two-tower Neural Networks in Recommendations
  • Sampling-Bias-Corrected Neural Modeling for Large Corpus Item Recommendations
  • Embedding-based Retrieval in Facebook Search

Read More

111-TensorFlow Gradient Descent Training Linear Regression

jupyter-notebook command

$ nohup jupyter-notebook &

TensorFlow-BasicTrainingLoop

Solving a machine learning problem

    1. Obtain training data
    2. Define the model
    3. Define a loss function
    4. Run through the training data, calculating loss from the ideal value
    5. Calculate the gradients for that loss and use an optimizer to adjust the variables to fit the data
    6. Evaluate your results

Gradient Descent

  • Batch Gradient Descent
  • Stochastic Gradient Descent
  • Mini-batch Gradient Descent
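The three variants differ only in how much data feeds each update, so one sketch can cover all of them: with `batch_size` equal to the dataset size you get batch gradient descent, `batch_size=1` gives stochastic gradient descent, and anything in between is mini-batch. (The function name and the noiseless toy data are assumptions for illustration.)

```python
import numpy as np

def gradient_descent(x, y, batch_size, lr=0.1, epochs=100, seed=0):
    """Fit y ~ w*x + b with MSE; batch_size selects the GD variant."""
    rng = np.random.default_rng(seed)
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        idx = rng.permutation(n)            # reshuffle each epoch
        for start in range(0, n, batch_size):
            sel = idx[start:start + batch_size]
            err = w * x[sel] + b - y[sel]   # residuals on this batch
            w -= lr * 2.0 * np.mean(err * x[sel])
            b -= lr * 2.0 * np.mean(err)
    return w, b

# Toy noiseless data: y = 3x + 2
x = np.linspace(-1.0, 1.0, 64)
y = 3.0 * x + 2.0

w_batch, b_batch = gradient_descent(x, y, batch_size=64)  # batch GD
w_sgd, b_sgd = gradient_descent(x, y, batch_size=1)       # stochastic GD
w_mini, b_mini = gradient_descent(x, y, batch_size=16)    # mini-batch GD
```

All three converge to roughly (3, 2) here; they trade off the smooth but expensive full-batch gradient against the cheap but noisy per-sample one.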

Read More

Alias Method: Random Sampling from a Discrete Distribution

Goal: reduce the time complexity of drawing a sample from a discrete distribution from O(n) (or O(log n)) to O(1).

Wikipedia:
“In computing, the alias method is a family of efficient algorithms for sampling from a discrete probability distribution, published in 1974 by A. J. Walker.”

Darts, Dice, and Coins: Sampling from a Discrete Distribution
Alias Method: Random Sampling from a Discrete Distribution
The Alias Method: Efficient Sampling with Many Discrete Outcomes
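A minimal Python sketch of the method (Vose's variant, as described in the "Darts, Dice, and Coins" article linked above): preprocessing builds a probability table and an alias table in O(n), after which each draw is O(1) — pick a uniformly random column, then flip one biased coin. Function names are mine, not from the linked posts.

```python
import random

def build_alias(probs):
    """O(n) preprocessing: build the prob/alias tables (Vose's variant)."""
    n = len(probs)
    prob = [0.0] * n
    alias = [0] * n
    scaled = [p * n for p in probs]  # scale so the average is 1
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s] = scaled[s]          # column s keeps scaled[s] of itself...
        alias[s] = l                 # ...and the rest points to l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:          # leftovers equal 1 up to float error
        prob[i] = 1.0
    return prob, alias

def alias_sample(prob, alias):
    """O(1) per draw: uniform column choice, then one biased coin flip."""
    i = random.randrange(len(prob))
    return i if random.random() < prob[i] else alias[i]

prob, alias = build_alias([0.5, 0.3, 0.2])
sample = alias_sample(prob, alias)   # returns 0, 1, or 2
```

Over many draws the empirical frequencies match the input distribution [0.5, 0.3, 0.2].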

Read More