400 refactor network class to interface #402
Conversation
Pull request overview
This PR refactors the neural-network “Network” implementation toward an interface-based design and updates multiple learners/tests to construct FeedForward networks directly. It also introduces new NDArray/NumPower-based neural-net components (e.g., Networks\FeedForward) plus a new MLPRegressor and SwissRoll generator in the new namespace layout.
Changes:
- Convert the legacy `NeuralNet\Network` class into an interface and update callers to instantiate `FeedForward` directly.
- Add NDArray/NumPower-oriented network contracts/implementations (`Networks\Base\Contracts\Network`, `Networks\FeedForward\FeedForward`) and update snapshot tests accordingly.
- Add a new NDArray-based `MLPRegressor` + `SwissRoll` generator, plus new/updated PHPUnit coverage and a Parameter cloning workaround for NumPower stability.
Reviewed changes
Copilot reviewed 24 out of 24 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| tests/Regressors/MLPRegressors/MLPRegressorTest.php | Adds integration-style coverage for the new NDArray-based MLP regressor behavior (train/partial/predict/serialization). |
| tests/NeuralNet/Snapshots/SnapshotTest.php | Updates snapshot test to use Networks\FeedForward\FeedForward instead of the removed Networks\Network. |
| tests/NeuralNet/SnapshotTest.php | Updates legacy snapshot test to instantiate NeuralNet\FeedForward. |
| tests/NeuralNet/NumPower/NumPowerTest.php | Adds a targeted regression test for NumPower transpose behavior. |
| tests/NeuralNet/Networks/NetworkTest.php | Removes tests targeting the deleted NeuralNet\Networks\Network class. |
| tests/NeuralNet/Networks/FeedForward/FeedForwardTest.php | Updates namespace/imports and adds tests (including normalization helper coverage). |
| tests/NeuralNet/NetworkTest.php | Switches construction from Network to FeedForward to reflect the interface refactor. |
| tests/NeuralNet/Layers/Swish/SwishTest.php | Tweaks an expected numeric value (precision/implementation change). |
| tests/Datasets/Generators/SwissRoll/SwissRollTest.php | Adds unit coverage for the new NDArray-based SwissRoll generator. |
| src/Regressors/MLPRegressor/MLPRegressor.php | Introduces a new NDArray-based MLP regressor implementation. |
| src/Regressors/MLPRegressor.php | Updates legacy MLPRegressor to construct FeedForward instead of Network. |
| src/Regressors/Adaline.php | Updates Adaline to construct FeedForward instead of Network. |
| src/NeuralNet/Snapshots/Snapshot.php | Retargets snapshots to the new Networks\Base\Contracts\Network contract. |
| src/NeuralNet/Parameters/Parameter.php | Changes cloning strategy to avoid NumPower native memory corruption. |
| src/NeuralNet/Networks/Network.php | Removes the old concrete Networks\Network implementation. |
| src/NeuralNet/Networks/FeedForward/FeedForward.php | Converts to implements Network and adds new inference/normalization helpers. |
| src/NeuralNet/Networks/Base/Contracts/Network.php | Adds a minimal network interface for the new NDArray-based stack. |
| src/NeuralNet/Network.php | Converts legacy NeuralNet\Network to an interface. |
| src/NeuralNet/FeedForward.php | Updates legacy FeedForward to implement the new legacy NeuralNet\Network interface. |
| src/Datasets/Generators/SwissRoll/SwissRoll.php | Adds NDArray/NumPower-based SwissRoll generator (new namespace layout). |
| src/Classifiers/SoftmaxClassifier.php | Updates to instantiate FeedForward instead of Network. |
| src/Classifiers/MultilayerPerceptron.php | Updates to instantiate FeedForward instead of Network. |
| src/Classifiers/LogisticRegression.php | Updates to instantiate FeedForward instead of Network. |
| CHANGELOG.md | Updates changelog entries related to the Network refactor/NDArray migration. |
Comments suppressed due to low confidence (3)
src/NeuralNet/Networks/FeedForward/FeedForward.php:229
- normalizeSamples() is private and currently not called by infer()/roundtrip(), so the packed-array normalization isn't applied in production. If NumPower requires packed arrays here, call this helper from the hot path (or remove it if not needed) to avoid unused code.

src/NeuralNet/Networks/FeedForward/FeedForward.php:201
- inferNew() adds an empty-dataset guard plus sample/shape normalization, but inference callers still go through infer(), so these fixes are effectively unused. This can leave empty-dataset inference and NumPower shape edge cases unhandled in production; consider moving this logic into infer() and removing/renaming inferNew().

tests/NeuralNet/Networks/FeedForward/FeedForwardTest.php:83
- These newly added tests (testLayers/testInput/testHidden/testNumParams) duplicate coverage that already exists later in this same test class (layers/input/hidden/numParams). Consider removing one set to keep the suite lean and reduce maintenance burden.
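On the unused normalizeSamples() point above: the helper's actual body isn't shown in this review, but a standalone sketch of what packed-array normalization for a native extension like NumPower typically looks like (function name taken from the review; the implementation is an assumption):

```php
<?php

// Hypothetical standalone version of the normalizeSamples() helper discussed
// in the review: re-index each row and cast every value to float, so the
// resulting arrays are packed (list-shaped) with a uniform numeric dtype,
// which native ndarray extensions generally require.
function normalizeSamples(array $samples) : array
{
    $normalized = [];

    foreach ($samples as $sample) {
        // array_values() drops string/sparse keys to produce a packed list;
        // the array_map() casts each feature to float.
        $normalized[] = array_map('floatval', array_values($sample));
    }

    return $normalized;
}

$samples = [
    ['a' => 1, 'b' => '2.5'],
    [3 => 4, 7 => 5],
];

var_dump(normalizeSamples($samples) === [[1.0, 2.5], [4.0, 5.0]]); // bool(true)
```

Whether the real helper does exactly this is unknown from the diff shown here; the review's point stands either way — if infer() never calls it, the normalization never runs in production.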
```markdown
- Remove Network interface
- RBX Serializer only tracks major library version number
- Convert NeuralNet classes to use NDArray instead of Matrix
- Turn back Network interface
```
Changelog entry "Turn back Network interface" is grammatically unclear. Consider rephrasing to something like "Restore Network interface" / "Bring back Network interface" so readers understand what changed.
Suggested change:
```diff
- - Turn back Network interface
+ - Restore Network interface
```
```php
interface Network
{
    /**
     * The input layer to the network.
     *
     * @var Input
     */
    protected Input $input;

    /**
     * The hidden layers of the network.
     *
     * @var list<Layers\Hidden>
     */
    protected array $hidden = [
        //
    ];

    /**
     * The pathing of the backward pass through the hidden layers.
     *
     * @var list<Layers\Hidden>
     */
    protected array $backPass = [
        //
    ];

    /**
     * The output layer of the network.
     *
     * @var Output
     */
    protected Output $output;

    /**
     * The gradient descent optimizer used to train the network.
     *
     * @var Optimizer
     */
    protected Optimizer $optimizer;

    /**
     * @param Input $input
     * @param Layers\Hidden[] $hidden
     * @param Output $output
     * @param Optimizer $optimizer
     */
    public function __construct(Input $input, array $hidden, Output $output, Optimizer $optimizer)
    {
        $hidden = array_values($hidden);

        $backPass = array_reverse($hidden);

        $this->input = $input;
        $this->hidden = $hidden;
        $this->output = $output;
        $this->optimizer = $optimizer;
        $this->backPass = $backPass;
    }

    /**
     * Return the input layer.
     *
     * @return Input
     */
    public function input() : Input
    {
        return $this->input;
    }

    /**
     * Return an array of hidden layers indexed left to right.
     *
     * @return list<Layers\Hidden>
     */
    public function hidden() : array
    {
        return $this->hidden;
    }

    /**
     * Return the output layer.
     *
     * @return Output
     */
    public function output() : Output
    {
        return $this->output;
    }

    /**
     * Return the layers of the network.
     *
```
This Network interface only requires layers(), but multiple call sites type-hint Network and rely on richer methods such as input(), hidden(), initialize(), numParams(), etc. Either expand the interface to reflect the required contract, or tighten type hints where those methods are needed to avoid an unsafe/misleading abstraction.
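One way the expanded contract suggested above could look, as a minimal sketch: the method names are taken from this review, but the real library's parameter and return types are simplified here, so this is an illustration of the shape of the fix rather than the project's actual API.

```php
<?php

// Hypothetical expansion of the Network contract per the review comment:
// declare the richer methods that Network-typed call sites actually rely on,
// so the abstraction matches how it is used. Types are simplified stand-ins.
interface Network
{
    /** Return the input layer. */
    public function input() : object;

    /** Return the hidden layers indexed left to right. */
    public function hidden() : array;

    /** Return all the layers of the network. */
    public function layers() : array;

    /** Initialize the parameters of each layer. */
    public function initialize() : void;

    /** Return the total number of trainable parameters. */
    public function numParams() : int;
}

var_dump(interface_exists(Network::class)); // bool(true)
```

The alternative the review mentions — tightening type hints at the call sites to the concrete FeedForward class — avoids widening the interface but couples callers to one implementation.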
```php
protected const float MIN_SCORE = 0.9;

/**
 * Constant used to see the random number generator.
```
The docblock says "Constant used to see the random number generator" but this appears to be a typo; it should refer to seeding the RNG. Fixing the wording helps avoid confusion about the purpose of RANDOM_SEED.
Suggested change:
```diff
- * Constant used to see the random number generator.
+ * Constant used to seed the random number generator.
```
```php
$this->estimator->setLogger(new BlackHole());

srand(self::RANDOM_SEED);
```
The test seeds the RNG with srand(), but SwissRoll::generate() uses mt_rand()/mt_getrandmax(), which are not affected by srand(). This can make the training/score assertions non-deterministic and flaky; seed the same RNG used by the generator (e.g. mt_srand) or inject a deterministic RNG into SwissRoll.
Suggested change:
```diff
-srand(self::RANDOM_SEED);
+mt_srand(self::RANDOM_SEED);
```
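The suggestion matters because srand() and mt_srand() seed two independent generators: srand() feeds rand(), while mt_rand() draws from the Mersenne Twister state that only mt_srand() resets. A standalone check of the deterministic behavior:

```php
<?php

// mt_rand() is reproducible only when seeded via mt_srand(); srand() seeds
// the separate rand() generator and leaves the Mersenne Twister untouched.
mt_srand(42);
$first = [mt_rand(), mt_rand(), mt_rand()];

mt_srand(42);
$second = [mt_rand(), mt_rand(), mt_rand()];

// Re-seeding with the same value reproduces the exact same sequence.
var_dump($first === $second); // bool(true)
```

So seeding with srand() while the generator under test calls mt_rand() leaves the test's randomness effectively unseeded, which is what makes the score assertions flaky.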
```php
}

/**
 * @return array{0: Unlabeled}
```
The return annotation says this helper returns Unlabeled, but it actually returns $testing from Labeled::take() and callers use labels(). Update the annotation (or the helper) so the documented return type matches reality.
Suggested change:
```diff
- * @return array{0: Unlabeled}
+ * @return array{0: Labeled}
```
```php
 *
 * @var int
 */
protected $evalInterval;
```
$evalInterval is missing a property type declaration even though the constructor enforces an int and the docblock specifies int. Add the int type (or adjust the docblock) to keep the class internally consistent and improve static analysis.
Suggested change:
```diff
-protected $evalInterval;
+protected int $evalInterval;
```

No description provided.