In 3 we introduced how to do multivariate regression with TensorFlow. This post works through a concrete example, with figures, to show in detail how it is done. The next post will follow the approach described in 4 to show how to do logistic regression. Linear regression and logistic regression are not tf's strong suit; tf's strength is deep learning with all kinds of neural networks, which we will cover in the posts that follow.
test math formula in github pages
A quick way to enable LaTeX math formulas in GitHub Pages.
convolve, correlate and image processing in numpy
The convolution matrix is the most important concept in a CNN. Here is a simple introduction to convolve, correlate and some basic image processing techniques in numpy.
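As a taste of what the post covers, here is a minimal sketch (the signal and kernel values are made up for illustration) contrasting np.convolve and np.correlate:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])        # a tiny 1-D "signal"
k = np.array([0.0, 1.0, 0.5])        # a tiny kernel

# convolution flips the kernel before sliding it over the signal,
# correlation slides it as-is; the two only agree for symmetric kernels.
print(np.convolve(a, k, mode='full'))     # [0.   1.   2.5  4.   1.5]
print(np.correlate(a, k, mode='full'))    # [0.5  2.   3.5  3.   0. ]
```

The same distinction carries over to the 2-D case used for images, e.g. scipy.signal.convolve2d versus scipy.signal.correlate2d.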
variable selection in linear regression: 2
This is an example of variable selection in linear regression (and it can easily be applied to logistic regression). It has two steps:
jupyter and pandas display
tips on ipython display and pandas options
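For a flavour of the kind of tips the post collects, here is a small sketch of standard pandas display options (the particular values are just examples):

```python
import pandas as pd

# widen the notebook output instead of truncating rows/columns with '...'
pd.set_option('display.max_rows', 200)
pd.set_option('display.max_columns', 50)
pd.set_option('display.width', 120)
pd.set_option('display.float_format', '{:.4f}'.format)

# undo a single option, or all of them
pd.reset_option('display.max_rows')
# pd.reset_option('all')
```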
weighted average, aggregated function with apply and agg
Suppose we want to calculate the weighted average probability of default across all risk ratings, weighted by the number of borrowers in each PD risk rating (or, a little more, grouped by industry or portfolio); we need this weighted average function on each group. Pandas has groupby to split the data, and then apply functions to calculate and summarize.
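A minimal sketch of the idea on a made-up toy portfolio (the column names pd, borrowers and industry are hypothetical):

```python
import numpy as np
import pandas as pd

# toy data: one row per risk rating bucket within an industry (invented numbers)
df = pd.DataFrame({
    'industry':  ['A', 'A', 'A', 'B', 'B'],
    'rating':    [1, 2, 3, 1, 2],
    'pd':        [0.01, 0.03, 0.10, 0.02, 0.05],   # probability of default
    'borrowers': [100, 50, 10, 80, 40],            # the weights
})

# weighted average PD per industry, weighted by borrower counts
wavg = df.groupby('industry').apply(
    lambda g: np.average(g['pd'], weights=g['borrowers']))

# the same result without apply: aggregate the pieces, then divide
tmp = df.assign(pd_x_n=df['pd'] * df['borrowers'])
wavg2 = (tmp.groupby('industry')['pd_x_n'].sum()
         / tmp.groupby('industry')['borrowers'].sum())
```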
tensorflow intro--04
In the previous chapters we discussed how, given some features such as the floor area of a house, TF's linear regression can predict an outcome such as the house price. Next we will discuss logistic regression, which uses the input features to classify, for example classifying an input image as one of the digits 0-9.
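As a preview, a minimal sketch of what such a classifier looks like, written against the TF 1.x API used throughout this series; the random arrays stand in for real images and labels:

```python
import numpy as np
import tensorflow as tf   # TF 1.x style, as used in this series

# stand-in data: 100 fake "images" of 784 pixels, random one-hot digit labels
images = np.random.rand(100, 784).astype(np.float32)
labels = np.eye(10, dtype=np.float32)[np.random.randint(0, 10, size=100)]

x = tf.placeholder(tf.float32, [None, 784])
y = tf.placeholder(tf.float32, [None, 10])

W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))

# logistic (softmax) regression: a linear model followed by softmax
y_hat = tf.nn.softmax(tf.matmul(x, W) + b)
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y * tf.log(y_hat + 1e-10), axis=1))
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op, feed_dict={x: images, y: labels})
    print(sess.run(cross_entropy, feed_dict={x: images, y: labels}))
```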
tensorflow intro--03
Having covered the single-feature linear model, the loss function and gradient descent (part 1), and epochs, learning rate and variable gradient descent (part 2), we can now move on to multi-feature linear regression in TF.
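For reference, a bare-bones multi-feature linear regression in TF 1.x style; the two-feature toy data is invented and already scaled to comparable ranges:

```python
import numpy as np
import tensorflow as tf   # TF 1.x style, as used in this series

# toy data: 2 features per sample (e.g. scaled area and room count), 1 target
X_train = np.array([[0.5, 0.1], [0.8, 0.2], [1.2, 0.3]], dtype=np.float32)
y_train = np.array([[1.0], [1.7], [2.5]], dtype=np.float32)

X = tf.placeholder(tf.float32, [None, 2])
y = tf.placeholder(tf.float32, [None, 1])

W = tf.Variable(tf.zeros([2, 1]))   # one weight per feature
b = tf.Variable(tf.zeros([1]))

pred = tf.matmul(X, W) + b                    # multi-feature linear model
loss = tf.reduce_mean(tf.square(pred - y))    # mean squared error
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(1000):
        sess.run(train_op, feed_dict={X: X_train, y: y_train})
    print(sess.run([W, b]))
```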
tensorflow intro--02
The previous chapter discussed how to train a machine learning model in TF, along with the basic tensorflow code for it. This paves the way for the upcoming discussion of training variants such as stochastic/mini-batch/batch gradient descent and adaptive learning rate gradient descent.