
TensorFlow regularization functions tf.contrib.layers.l1_regularizer() and tf.contrib.layers.l2_regularizer()

L1 regularization formula: $R(w) = \lambda \sum_i |w_i|$

L2 regularization formula: $R(w) = \frac{\lambda}{2} \sum_i w_i^2$

tf.contrib.layers.l1_regularizer() and tf.contrib.layers.l2_regularizer() are the TensorFlow APIs for L1 and L2 regularization, respectively.

Their basic usage is as follows:

import tensorflow as tf
import os

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

w = tf.constant([[1., -2.], [-3., 4.]])
# w is the weight tensor to be regularized; 0.5 is the regularization coefficient λ
l1_regular = tf.contrib.layers.l1_regularizer(0.5)(w)
l2_regular = tf.contrib.layers.l2_regularizer(0.5)(w)
with tf.Session() as sess:
	sess.run(tf.global_variables_initializer())
	l1_result, l2_result = sess.run([l1_regular, l2_regular])
	# The L1 value is (|1|+|-2|+|-3|+|4|)*0.5 = 5
	print(l1_result)
	# The L2 value is ((1²+(-2)²+(-3)²+4²)/2)*0.5 = 7.5
	# The L2 regularization loss is divided by 2 by default; the factor 0.5 is the λ we set
	print(l2_result)

The output is as follows:

5.0
7.5

Process finished with exit code 0
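As a quick cross-check, here is a minimal sketch (assuming NumPy is available; not part of the original post) that reproduces the two values by hand:

import numpy as np

w = np.array([[1., -2.], [-3., 4.]])
lam = 0.5  # the regularization coefficient λ passed to the regularizers

# L1: λ * Σ|w_i|
l1_by_hand = lam * np.sum(np.abs(w))          # 0.5 * (1 + 2 + 3 + 4) = 5.0
# L2: (λ/2) * Σ w_i², since TensorFlow's l2_regularizer divides by 2
l2_by_hand = lam * np.sum(np.square(w)) / 2   # 0.5 * (1 + 4 + 9 + 16) / 2 = 7.5

print(l1_by_hand, l2_by_hand)  # 5.0 7.5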

In a TensorFlow model, if we want to add a regularization term to the loss function, we can do it like this:

loss = tf.reduce_mean(tf.square(y - y_pred) + tf.contrib.layers.l2_regularizer(0.5)(w))

That is, the regularization term is added to each sample's squared error before taking the mean. Since the regularization term is the same scalar for every sample, this is equivalent to computing the mean squared error first and then adding the regularization term once.
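For a model with more than one weight matrix, a common TF 1.x pattern is to attach the regularizer to each variable and let TensorFlow collect the penalty terms. The following is a minimal sketch under that assumption; the variable names and shapes are illustrative and not from the original post:

import tensorflow as tf

l2_reg = tf.contrib.layers.l2_regularizer(0.5)

x = tf.placeholder(tf.float32, [None, 4])
y = tf.placeholder(tf.float32, [None, 1])

# Passing regularizer= to tf.get_variable adds the penalty for that variable
# to the tf.GraphKeys.REGULARIZATION_LOSSES collection automatically.
w1 = tf.get_variable("w1", shape=[4, 8], regularizer=l2_reg)
w2 = tf.get_variable("w2", shape=[8, 1], regularizer=l2_reg)

hidden = tf.nn.relu(tf.matmul(x, w1))
y_pred = tf.matmul(hidden, w2)

mse = tf.reduce_mean(tf.square(y - y_pred))
# Sum of all collected regularization terms (one per regularized variable)
reg_loss = tf.losses.get_regularization_loss()
loss = mse + reg_loss

This keeps the data-fitting loss and the regularization loss separate in the graph, which makes it easy to monitor each of them during training.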