
TensorFlow: calling low-level op computations through raw_ops

TensorFlow's low-level operator definitions live in tensorflow-1.15.0\tensorflow\core\ops\ops.pbtxt.

raw_ops calls must use keyword arguments, as in a = tf.raw_ops.Add(x=x, y=y); the x= part cannot be omitted. When the argument names and the data arrive separately, you can pair them up and pass them via **kwargs:

inputs = dict(zip(input_names, input_data))

result_tf = tf_op(**inputs, **attr_settings)
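As a minimal sketch of this pairing trick, here is the same pattern with a plain Python function standing in for the raw op (fake_op, input_names, input_data, and attr_settings are all hypothetical names for illustration):

```python
# Stand-in for a raw op: two inputs plus one attribute.
def fake_op(x, y, scale=1):
    return (x + y) * scale

input_names = ['x', 'y']
input_data = [10, 20]
attr_settings = {'scale': 2}

# Pair parameter names with data, then unpack everything as keyword arguments.
inputs = dict(zip(input_names, input_data))
result = fake_op(**inputs, **attr_settings)
print(result)  # 60
```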

Example 1: simple addition

import tensorflow as tf


x = 10
y = 20

a = tf.raw_ops.Add(x=x, y=y)

with tf.Session() as sess:

    # Evaluate the op
    print(sess.run(a))  # 30

Note that keyword arguments are required, i.e. the call takes the form Add(x=x, y=y). To find out which keyword arguments an op expects, check its documentation page, e.g. the tensorflow::ops::Add page.
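The keyword-only restriction can be reproduced in plain Python with a bare `*` in the signature; this is a sketch of the behavior, not TensorFlow's actual wrapper code:

```python
# '*' makes x and y keyword-only, mirroring how tf.raw_ops.Add
# rejects positional arguments.
def add(*, x, y):
    return x + y

print(add(x=10, y=20))  # 30

try:
    add(10, 20)  # positional call is rejected
except TypeError as e:
    print('TypeError:', e)
```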

Example 2: calling the gradient-descent update op

import tensorflow as tf


shape = (1, 8)

var = tf.Variable(tf.ones(shape=shape), name='var')


alpha = 0.5
delta = tf.Variable(tf.ones(shape=shape), name='delta')


# Updates var in place: var -= alpha * delta
output = tf.raw_ops.ApplyGradientDescent(var=var, alpha=alpha, delta=delta)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    print(sess.run(output))  # every element becomes 0.5
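ApplyGradientDescent applies the update var ← var − alpha · delta in place, so the printed result can be checked with a few lines of NumPy (a verification sketch, not TensorFlow code):

```python
import numpy as np

var = np.ones((1, 8))
alpha = 0.5
delta = np.ones((1, 8))

# Same rule the op applies in place.
var = var - alpha * delta
print(var)  # [[0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5]]
```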

Example 3: calling the ApplyMomentum op (momentum-based gradient update)

import tensorflow as tf


shape = (1, 8)
lr = 1.0
momentum = 0.9

var = tf.Variable(tf.ones(shape=shape), name='var')
grad = tf.Variable(tf.ones(shape=shape), name='grad')
accum = tf.Variable(tf.ones(shape=shape), name='accum')

# Updates in place: accum = accum * momentum + grad; var -= lr * accum
output = tf.raw_ops.ApplyMomentum(
         var=var,
         accum=accum,
         lr=lr,
         grad=grad,
         momentum=momentum)


with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    print(sess.run(output))  # the updated var
    print(sess.run(var))
    print(sess.run(accum))
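Per the op's documented update rule (accum = accum · momentum + grad, then var −= lr · accum), the values printed above can again be checked with NumPy:

```python
import numpy as np

lr, momentum = 1.0, 0.9
var = np.ones((1, 8))
grad = np.ones((1, 8))
accum = np.ones((1, 8))

accum = accum * momentum + grad  # every element ~ 1.9
var = var - lr * accum           # every element ~ -0.9
print(accum[0, 0], var[0, 0])
```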

Another simple approach is to locate the generated op module inside TensorFlow's Python package and import the op directly, for example:

from tensorflow.python.training.gen_training_ops import apply_proximal_adagrad
