Optimization Tools#

minimize(fun, x0, args=(), method='COBYLA', options={})[source]#

Minimization of scalar functions of one or more variables via gradient-free solvers.

The API for this function matches SciPy with some minor deviations.

  • Various optional arguments in the SciPy interface have not yet been implemented.

  • maxiter defines the maximum number of iterations to perform, not the maximum number of function evaluations.
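To illustrate the second point, here is a toy SPSA-style loop (a sketch for illustration only, not Qrisp's implementation): each iteration evaluates the objective twice, so bounding iterations with maxiter does not bound function evaluations by the same number.

```python
import numpy as np

def toy_spsa(fun, x0, maxiter=10, a=0.1, c=0.1, seed=0):
    # Toy SPSA-style minimizer: per iteration, estimate the gradient from
    # two perturbed evaluations, then take a small descent step.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    nfev = 0
    for _ in range(maxiter):
        delta = rng.choice([-1.0, 1.0], size=x.shape)  # random +/-1 perturbation
        grad = (fun(x + c * delta) - fun(x - c * delta)) / (2 * c) * delta
        nfev += 2  # two function evaluations per iteration
        x = x - a * grad
    return x, nfev

x, nfev = toy_spsa(lambda x: np.sum(x**2), np.array([1.0]), maxiter=10)
print(nfev)  # 20: maxiter=10 iterations, but 20 evaluations
```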

Parameters:
fun : callable

The objective function to be minimized, fun(x, *args) -> float, where x is a 1-D array with shape (n,) and args is a tuple of parameters needed to specify the function.

x0 : jax.Array

Initial guess. Array of real elements of size (n,), where n is the number of independent variables.

args : tuple

Extra arguments passed to the objective function.

method : str, optional

The solver type. Supported methods are 'SPSA' and 'COBYLA'.

options : dict, optional

A dictionary of solver options. All methods accept the following generic options:

  • maxiter : int

    Maximum number of iterations to perform. Depending on the method, each iteration may use several function evaluations.
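An objective compatible with this interface only needs to map a 1-D array of shape (n,), plus any extra arguments supplied via args, to a scalar. A minimal classical sketch (the function name shifted_quadratic is hypothetical, chosen for illustration):

```python
import numpy as np

def shifted_quadratic(x, center):
    # fun(x, *args) -> float: x is a 1-D array of shape (n,),
    # extra parameters such as `center` arrive through args.
    return float(np.sum((x - center) ** 2))

x0 = np.zeros(2)
print(shifted_quadratic(x0, np.array([1.0, -1.0])))  # 2.0
```

Such a function would be passed as minimize(shifted_quadratic, x0, args=(center,)).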

Returns:
results : OptimizeResults

An OptimizeResults object.

Examples

We prepare the state

\[\ket{\psi_{\theta}} = \cos(\theta)\ket{0} + \sin(\theta)\ket{1}\]
from qrisp import QuantumFloat, ry
from qrisp.jasp import expectation_value, minimize, jaspify
import jax.numpy as jnp

def state_prep(theta):
    qv = QuantumFloat(1)
    ry(theta[0], qv)
    return qv

Next, we define the objective function, which calculates the expectation value of the prepared state

def objective(theta, state_prep):
    return expectation_value(state_prep, shots=100)(theta)

Finally, we use minimize to find the optimal choice of the parameter \(\theta_0\) that minimizes the objective function

@jaspify(terminal_sampling=True)
def main():

    x0 = jnp.array([1.0])

    return minimize(objective, x0, args=(state_prep,))

results = main()
print(results.x)
print(results.fun)