Minimize a function of one variable in Tensorflow

ghz 7 months ago ⋅ 148 views

I am new to Tensorflow and was wondering whether it would be possible to minimize a function of one variable using Tensorflow.

For example, can we use Tensorflow to minimize 2*x^2 - 5*x + 4 using an initial guess (say x = 1)?

I am trying the following:

import tensorflow as tf
import numpy as np

X = tf.placeholder(tf.float32, shape=())
xvar = tf.Variable(np.random.randn())
f = 2*tf.multiply(X, X) - 5*X + 4

opt = tf.train.GradientDescentOptimizer(0.5).minimize(f)

with tf.Session() as sess:
    tf.global_variables_initializer().run()
    y = sess.run(opt, feed_dict = {X : 5.0}) #initial guess = 5.0
    print(y)

But this gives the following error:

ValueError: No gradients provided for any variable, check your graph for ops that do not support gradients, between variables

Please help me understand what I am doing wrong here.

Answers

To minimize a function using TensorFlow, you need to set up the computational graph so that the function you are minimizing depends on a TensorFlow variable. The error occurs because your function f depends only on the placeholder X, not on any tf.Variable (xvar is never used), so minimize() finds no trainable variables to compute gradients for.

Here's how you can set up and minimize the function f(x) = 2x^2 - 5x + 4 using TensorFlow:

  1. Define the variable x that will be optimized.
  2. Define the function f(x) in terms of this variable.
  3. Use an optimizer to minimize f(x).

Below is the corrected version of your code:

import tensorflow as tf
import numpy as np

# Define the variable to be optimized
x = tf.Variable(1.0, name='x')  # Initial guess

# Define the function
f = 2 * tf.square(x) - 5 * x + 4

# Define the optimizer
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)
train = optimizer.minimize(f)

# Create a session and run the optimization
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        sess.run(train)
        current_x, current_f = sess.run([x, f])
        print(f"Step {step}: x = {current_x}, f(x) = {current_f}")

Explanation:

  1. Variable Definition:

    x = tf.Variable(1.0, name='x')
    

    Here, x is defined as a TensorFlow variable with an initial value of 1.0. This variable will be optimized.

  2. Function Definition:

    f = 2 * tf.square(x) - 5 * x + 4
    

    The function f(x) = 2x^2 - 5x + 4 is defined in terms of the variable x.

  3. Optimizer:

    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)
    train = optimizer.minimize(f)
    

    The GradientDescentOptimizer is used to minimize the function f. The minimize method adds operations to the graph that compute the gradient of f with respect to x and apply the update.

  4. Session and Optimization:

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for step in range(100):
            sess.run(train)
            current_x, current_f = sess.run([x, f])
            print(f"Step {step}: x = {current_x}, f(x) = {current_f}")
    

    A TensorFlow session is created to run the operations. The variables are initialized, and the optimization is run for 100 steps. The current values of x and f(x) are printed at each step. With a learning rate of 0.1, the iterates converge to the analytic minimum x = 5/4 = 1.25, where f(x) = 0.875.
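    As a sanity check, the same gradient-descent update can be written in a few lines of plain Python, since the gradient of this particular function is easy to derive by hand:

    ```python
    # f(x) = 2x^2 - 5x + 4 has gradient f'(x) = 4x - 5, so the
    # analytic minimum is at x = 5/4 = 1.25 with f(1.25) = 0.875.

    def f(x):
        return 2 * x**2 - 5 * x + 4

    def grad_f(x):
        return 4 * x - 5

    x = 1.0  # initial guess
    learning_rate = 0.1
    for _ in range(100):
        x -= learning_rate * grad_f(x)  # same update rule as gradient descent in TF

    print(x, f(x))  # converges toward x = 1.25, f(x) = 0.875
    ```

    This is exactly the update that GradientDescentOptimizer performs; TensorFlow's value is that it derives grad_f automatically for functions where a hand computation is impractical.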

By following these steps, TensorFlow can correctly compute the gradients and perform the optimization.
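Note that the code above uses the TensorFlow 1.x API. If you are on TensorFlow 2.x, where tf.Session and tf.train.GradientDescentOptimizer no longer exist, the same optimization can be sketched with tf.GradientTape and tf.keras.optimizers.SGD (learning rate and step count chosen to match the example above):

```python
import tensorflow as tf

x = tf.Variable(1.0)  # initial guess
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

def f():
    return 2 * tf.square(x) - 5 * x + 4

for step in range(100):
    with tf.GradientTape() as tape:
        loss = f()
    # Compute df/dx and apply one gradient-descent update to x.
    grads = tape.gradient(loss, [x])
    opt.apply_gradients(zip(grads, [x]))

print(x.numpy(), f().numpy())  # approaches x = 1.25, f(x) = 0.875
```

Eager execution makes the loop read like ordinary Python: no placeholders, no feed_dict, and no session to manage.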