
Forward-mode Automatic Differentiation for TensorFlow

# Installation

## Prerequisites

- tensorflow 0.12+ (compatible with 1.0+)
- Cython
- NumPy
- Python 2 or 3

## Option 1: using pip

.. code:: bash

    pip install tensorflow_forward_ad

## Option 2: building from source

.. code:: bash

    git clone https://github.com/renmengye/tensorflow-forward-ad.git
    cd tensorflow-forward-ad
    python setup.py install

## Option 3: building from source using Bazel

.. code:: bash

    git clone https://github.com/renmengye/tensorflow-forward-ad.git
    cd tensorflow-forward-ad
    # Run all unit tests (with tensorflow-gpu only).
    bazel test //...

# Usage

.. code:: python

    import tensorflow as tf
    from tensorflow_forward_ad import forward_gradients

    # Automatic differentiation.
    x = tf.constant(1.0)
    y = tf.square(x)
    dydx = forward_gradients(y, x)
    sess = tf.Session()
    print(sess.run(dydx))  # [2.0].

    # Computes Jacobian-vector product.
    x = tf.ones([5, 10])
    y = tf.square(x)
    v = tf.ones([5, 10]) * 2
    Jv = forward_gradients(y, x, v)
    sess = tf.Session()
    print(sess.run(Jv))  # [array([[ 4.,  4.,  4.,  4., ...

    # A list of inputs.
    x1 = tf.ones([5, 10])
    x2 = tf.ones([10, 8])
    y1 = tf.square(tf.matmul(x1, x2))
    y2 = tf.sigmoid(tf.matmul(x1, x2))
    v1 = tf.ones([5, 10]) * 0.5
    v2 = tf.ones([10, 8]) * 2.0
    J1v, J2v = forward_gradients([y1, y2], [x1, x2], [v1, v2])
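The quantity computed above can be illustrated without TensorFlow: forward-mode AD carries a tangent (direction) vector through the computation alongside the value, so a single forward pass yields the Jacobian-vector product. A minimal dual-number sketch in plain NumPy, reproducing the elementwise ``y = x**2`` example (the ``Dual`` class and ``jvp_square`` helper are illustrative, not part of this library):

```python
import numpy as np

class Dual:
    """Dual number: a primal value plus a tangent (directional derivative)."""
    def __init__(self, val, dot):
        self.val = np.asarray(val, dtype=float)
        self.dot = np.asarray(dot, dtype=float)

def square(d):
    # Value rule: x -> x**2.  Tangent rule (chain rule): dot -> 2*x*dot.
    return Dual(d.val ** 2, 2.0 * d.val * d.dot)

def jvp_square(x, v):
    """Jacobian-vector product of y = x**2 in direction v, in one forward pass."""
    return square(Dual(x, v)).dot

x = np.ones((5, 10))
v = np.ones((5, 10)) * 2.0
Jv = jvp_square(x, v)  # elementwise 2*x*v = 4.0, matching the output above
```

Because the Jacobian of an elementwise op is diagonal, the product never needs to be materialized; the tangent is simply rescaled at each step, which is why forward mode costs only a constant factor over the primal evaluation.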

# Second-Order Matrix Vector Product

.. code:: python

    import tensorflow as tf
    from tensorflow_forward_ad.second_order import (
        hessian_vec_fw, fisher_vec_fw, gauss_newton_vec)

    x = tf.ones([5, 10])
    w = tf.ones([10, 8])
    z = tf.square(tf.matmul(x, w))
    v = tf.ones_like(x)
    y_ = tf.range(5)
    f = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y_, logits=z)

    # Hessian-vector product.
    Hv = hessian_vec_fw(f, x, v)

    # Fisher-vector product.
    Fv = fisher_vec_fw(f, x, v)

    # Gauss-Newton vector product.
    Gv = gauss_newton_vec(f, z, x, v)
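For a quick sanity check of a Hessian-vector product independent of TensorFlow, the same quantity can be approximated by a central difference of the gradient: Hv ≈ (∇f(x + εv) − ∇f(x − εv)) / 2ε. A hypothetical NumPy sketch for f(x) = Σ x², whose gradient is 2x and whose Hessian is 2I (the ``hvp_fd`` helper is illustrative only):

```python
import numpy as np

def f_grad(x):
    # Gradient of f(x) = sum(x**2) is 2*x.
    return 2.0 * x

def hvp_fd(grad_fn, x, v, eps=1e-5):
    """Hessian-vector product via a central difference of the gradient."""
    return (grad_fn(x + eps * v) - grad_fn(x - eps * v)) / (2.0 * eps)

x = np.ones(8)
v = np.arange(8, dtype=float)
Hv = hvp_fd(f_grad, x, v)  # Hessian is 2*I, so Hv == 2*v
```

Finite differencing is useful for testing but numerically fragile and needs one extra gradient evaluation per direction; forward-over-reverse AD, as used by ``hessian_vec_fw`` above, computes the product exactly at similar cost.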
