pydata: Huiming's learning notes

Keep Looking, Don't Settle

TensorFlow Introduction--05: Multivariate Regression with Stochastic Gradient Descent

In part 3 we introduced how to do multivariate regression with TensorFlow. This post walks through a concrete example, with figures, to show how it is done in detail. The next post will use the approach described in part 4 to cover logistic regression. Linear regression and logistic regression are not TensorFlow's strong suit; its strength is deep learning with all kinds of neural networks, which we will cover over the next several posts.
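As a rough illustration of the topic (not the post's own code), here is a minimal sketch of multivariate linear regression trained with stochastic gradient descent, assuming the TF 1.x placeholder/Session API used throughout this series; the toy data and variable names are my own:

```python
import numpy as np
import tensorflow as tf  # assumes the TF 1.x API (placeholders, Session)

# toy data: 2 features (e.g. house size, number of rooms) and 1 target (price)
X_data = np.random.rand(100, 2).astype(np.float32)
y_data = X_data @ np.array([[3.0], [5.0]], dtype=np.float32) + 2.0

X = tf.placeholder(tf.float32, [None, 2])
y = tf.placeholder(tf.float32, [None, 1])
W = tf.Variable(tf.zeros([2, 1]))
b = tf.Variable(tf.zeros([1]))

pred = tf.matmul(X, W) + b                      # linear model y_hat = XW + b
loss = tf.reduce_mean(tf.square(pred - y))      # mean squared error
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1000):
        # stochastic gradient descent: feed one randomly chosen example per step
        i = np.random.randint(len(X_data))
        sess.run(train_op, feed_dict={X: X_data[i:i+1], y: y_data[i:i+1]})
    print(sess.run([W, b]))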

weighted average, aggregate functions with apply and agg

Suppose we want to calculate the weighted average probability of default (PD) across all risk ratings, weighted by the number of borrowers in each PD risk rating (or, one step further, grouped by industry or portfolio). We need to apply this weighted-average function to each group. Pandas provides groupby to split the data, and then apply to run the calculation and summarize each group.
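A minimal sketch of that split-apply pattern, using a hypothetical toy portfolio (column names and figures are made up for illustration):

```python
import numpy as np
import pandas as pd

# hypothetical data: one row per risk rating within each industry
df = pd.DataFrame({
    "industry":  ["Retail", "Retail", "Energy", "Energy"],
    "rating":    ["A", "B", "A", "B"],
    "pd":        [0.01, 0.03, 0.02, 0.05],   # probability of default per rating
    "borrowers": [120,  80,   60,   40],     # number of borrowers per rating
})

def weighted_pd(g):
    # weight each rating's PD by its number of borrowers within the group
    return np.average(g["pd"], weights=g["borrowers"])

# groupby splits the data into groups; apply runs the weighted-average function on each
result = df.groupby("industry").apply(weighted_pd)
print(result)
```

The same idea extends to agg when each group needs several summary statistics at once.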

TensorFlow Introduction--04

In the previous parts we discussed how, given some features such as house size, to predict an outcome such as the house price with linear regression in TF. Next we will discuss logistic regression, which classifies inputs based on their features, for example classifying input images of digits into the classes 0-9.
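For orientation only, here is a hedged sketch of how that classifier is usually set up as softmax (multiclass logistic) regression, again assuming the TF 1.x API; the 784/10 shapes assume flattened 28x28 digit images and one-hot labels, which may differ from the post's own example:

```python
import tensorflow as tf  # TF 1.x style, matching the series

# flattened 28x28 images and one-hot labels for the digits 0-9 (assumed shapes)
x = tf.placeholder(tf.float32, [None, 784])
y = tf.placeholder(tf.float32, [None, 10])

W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(x, W) + b

# softmax cross-entropy turns the linear model into a 10-class classifier
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss)
```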

TensorFlow Introduction--03

Having covered the single-feature linear model, the loss function, and gradient descent (part 1), plus epochs, the learning rate, and variants of gradient descent (part 2), we can now move on to multi-feature linear regression in TF.
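In my notation (not necessarily the post's), the only change from the single-feature case is that the weight becomes a vector, while the squared-error loss keeps the same form:

```latex
\hat{y} = w_1 x_1 + w_2 x_2 + \dots + w_n x_n + b = \mathbf{x}^\top \mathbf{w} + b,
\qquad
L(\mathbf{w}, b) = \frac{1}{m} \sum_{i=1}^{m} \bigl(\hat{y}^{(i)} - y^{(i)}\bigr)^2
```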

TensorFlow Introduction--02

The previous chapter discussed how to train a machine learning model in TF, along with the basic TensorFlow code. This paves the way for discussing training variations such as stochastic/mini-batch/batch gradient descent and adaptive learning rates.
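The three variants differ only in how many rows are used per update step. This is a minimal NumPy sketch of that difference (toy data and an explicit gradient step of my own, not the post's code):

```python
import numpy as np

# toy regression data for illustration
X_data = np.random.rand(100, 2).astype(np.float32)
y_data = X_data @ np.array([[3.0], [5.0]], dtype=np.float32) + 2.0
w, b, lr = np.zeros((2, 1)), 0.0, 0.1

def grad_step(Xb, yb):
    """One gradient-descent update on the rows (Xb, yb) under MSE loss."""
    global w, b
    err = Xb @ w + b - yb
    w -= lr * 2 * (Xb.T @ err) / len(Xb)
    b -= lr * 2 * err.mean()

# batch: every step uses all rows
grad_step(X_data, y_data)

# mini-batch: every step uses a small random subset (here 32 rows)
idx = np.random.choice(len(X_data), 32, replace=False)
grad_step(X_data[idx], y_data[idx])

# stochastic: every step uses a single randomly chosen row
i = np.random.randint(len(X_data))
grad_step(X_data[i:i+1], y_data[i:i+1])
```

Adaptive learning-rate methods then replace the fixed lr with a per-parameter, step-dependent rate; in TF 1.x this amounts to swapping the optimizer, e.g. tf.train.AdamOptimizer in place of tf.train.GradientDescentOptimizer.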