Using Contributed Plugins in Pyxu#
In this notebook, we will explore how to use contributed plugins in the Pyxu software framework. We will focus on the pyxu-gradient-descent plugin.
Step 1: Attempting to Import Before Installation#
Before installing the plugin, let’s try importing an object from the pyxu-gradient-descent namespace to see what happens.
try:
    from pyxu.opt.solver import GradientDescent
except ImportError as e:
    print(e)
cannot import name 'GradientDescent' from 'pyxu.opt.solver' (pyxu/opt/solver/__init__.py)
You should see an ImportError, indicating that the objects are not available because the plugin is not installed yet.
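To confirm that the failure really comes from the missing plugin (and not, say, a typo), you can also check whether the distribution is installed. This is a minimal sketch using only the standard library; the distribution name pyxu-gradient-descent matches the pip command used in the next step.

from importlib import metadata

# Check whether the pyxu-gradient-descent distribution is installed
try:
    print("Installed version:", metadata.version("pyxu-gradient-descent"))
except metadata.PackageNotFoundError:
    print("pyxu-gradient-descent is not installed yet")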
Step 2: Installing the Plugin#
To install the pyxu-gradient-descent plugin, run the following command from a notebook cell:
!pip install pyxu-gradient-descent
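After installation, you can inspect the entry points that the distribution advertises. Pyxu’s plugin machinery typically discovers contributed objects through such entry points; treat the snippet below as an exploratory check rather than a documented API, since the exact group names are plugin-specific.

from importlib import metadata

# List the entry points advertised by the freshly installed distribution
dist = metadata.distribution("pyxu-gradient-descent")
for ep in dist.entry_points:
    print(f"{ep.group}: {ep.name} -> {ep.value}")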
Step 3: Importing the Necessary Objects#
Now that we have installed the plugin, let’s import the necessary objects:
from pyxu.operator import SquaredL2Norm
from pyxu.opt.solver import GradientDescent
import numpy as np
pyxu/info/plugin.py:69: ContributionWarning: Plugin GradientDescent loaded, use with caution.
  warnings.warn(f"Plugin {name} loaded, use with caution.", pxw.ContributionWarning)
The warning signals that the imported class comes from external, contributed code that has been registered into the core Pyxu namespace, and should therefore be used with caution.
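If you prefer not to see this notice on every import, you can filter it out. The sketch below matches the warning by its message text, so it does not assume where the ContributionWarning class lives in the Pyxu package tree; note that the filter must be installed before the import that triggers the warning.

import warnings

# Silence the plugin-loading notice by matching its message text
warnings.filterwarnings("ignore", message=r"Plugin .* loaded, use with caution")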
Step 4: Using the Gradient Descent Algorithm#
In this step, we will use the gradient descent algorithm to minimize a squared L2 loss centered on some data y. Since the unique minimizer of this loss is y itself, we can easily check that the solver converges to the right point. Here is how you can do it:
# Define the dimension
N = 100
# Generate random data
rng = np.random.default_rng(seed=0)
y = rng.uniform(size=N)
# Define the loss f(x) = ||x - y||^2 (squared L2 distance to the data)
loss = SquaredL2Norm(dim_shape=(N, )).asloss(y)
# Initialize the gradient descent algorithm
gd = GradientDescent(f=loss)
# Define the initial point (drawn from the same seeded generator for reproducibility)
x0 = rng.standard_normal(N)
# Run the gradient descent algorithm
gd.fit(x0=x0, acceleration=True)
# Check the solution
assert np.allclose(gd.solution(), y)
INFO – [2023-09-13 09:28:52.933466] Iteration 0
iteration: 0
RelError[x]: 0.0
INFO – [2023-09-13 09:28:52.934287] Iteration 1
iteration: 1
RelError[x]: 1.1798071385920126
INFO – [2023-09-13 09:28:52.934882] Iteration 2
iteration: 2
RelError[x]: 9.55411637565644e-17
INFO – [2023-09-13 09:28:52.935515] Stopping Criterion satisfied -> END
If the assertion passes, the gradient descent algorithm has converged to a solution that matches the input data y.
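To make the mechanics concrete, here is a minimal plain-NumPy sketch of the same idea: gradient descent on f(x) = ||x - y||^2, whose gradient is 2 * (x - y). This is only an illustration of the algorithm with a fixed step size, not the plugin’s implementation, which additionally handles acceleration and stopping criteria.

import numpy as np

# Gradient descent on f(x) = ||x - y||^2 with a fixed step size
rng = np.random.default_rng(seed=0)
N = 100
y = rng.uniform(size=N)
x = rng.standard_normal(N)   # initial point
step = 0.25                  # safe step for a gradient with Lipschitz constant 2
for _ in range(100):
    grad = 2.0 * (x - y)     # gradient of the squared L2 distance
    x = x - step * grad
assert np.allclose(x, y)     # the iterates converge to the data y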
Conclusion#
In this notebook, we have seen how to use contributed plugins in the Pyxu software framework. We learned how to install a plugin and use it to run a gradient descent algorithm. We hope this guide has been helpful, and encourage you to explore other plugins available for the Pyxu framework.