
Neural Network Framework for Modelling, Control, and Estimation of Physical Systems

Modeling, control, and estimation of physical systems are central to many engineering disciplines. While data-driven methods like neural networks offer powerful tools, they often struggle to incorporate prior domain knowledge, limiting their interpretability, generalizability, and safety.

To bridge this gap, we present nnodely (where "nn" can be read as "m," forming Modely), a framework that facilitates the creation and deployment of Model-Structured Neural Networks (MS-NNs).
MS-NNs combine the learning capabilities of neural networks with structural priors grounded in physics, control, and estimation theory, enabling:

  • Reduced training data requirements
  • Generalization to unseen scenarios
  • Real-time deployment in real-world applications

In short:

nnodely is not a replacement for general-purpose deep learning frameworks; it is a structured layer on top of them, purpose-built for physical systems.


📖 Documentation • 🔬 Case Studies • 🚀 Other Applications

Table of Contents

  1. Getting Started
  2. Structure of the Repository
  3. How to contribute
  4. License
  5. References

Getting Started

Installation

You can install nnodely from PyPI via:

pip install nnodely

Alternatively, you can build it from source by cloning the repository and then installing the requirements and the nnodely library itself:

git clone https://github.com/tonegas/nnodely.git
cd nnodely
pip install -r requirements.txt
pip install .

Hello, World!

To check that nnodely is installed correctly, try running the following script.

from nnodely import Input, Output, nnodely, Parameter

x = Input("x")                                      # declare the network input
l = x.last()                                        # most recent sample of x
# two-tap relation: A times the window [-2,-1] plus B times the last sample,
# fed back into x (closed loop)
o = (Parameter('A')*l.sw([-2,-1]) + Parameter('B')*l).closedLoop(x)
f = Output("fib", o)
model = nnodely()
model.addModel("Fibonacci", f)
model.addMinimize("target", l.sw([-2,-1]) + l, f)   # minimize the error between the target relation and f
model.neuralizeModel(1)                             # discretize with sample time 1
model.loadData("data", {"x": list(range(100))})
model.trainModel(prediction_samples = 2, lr = 0.5, num_of_epochs = 500)
model.exportPythonModel(models = "Fibonacci")
print(model({"x": [1]}, prediction_samples = 20, num_of_samples = 20))

In this example, the neural network is trained to mimic the Fibonacci series. Finally, a native PyTorch network is exported to a file.
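The target relation can be checked independently of nnodely: with coefficients A = B = 1, the two-tap recurrence x[n] = A·x[n-2] + B·x[n-1] generates the Fibonacci series, which is what training should drive the parameters toward. A minimal sketch (the helper `recurrence` is illustrative, not part of the library):

```python
# Plain-Python check of the recurrence the Hello World network is trained to learn:
# x[n] = a*x[n-2] + b*x[n-1]; with a = b = 1 this is the Fibonacci series.

def recurrence(a, b, x0, x1, steps):
    """Roll the two-tap recurrence forward `steps` times from the seed (x0, x1)."""
    seq = [x0, x1]
    for _ in range(steps):
        seq.append(a * seq[-2] + b * seq[-1])
    return seq

print(recurrence(1.0, 1.0, 0.0, 1.0, 8))  # [0.0, 1.0, 1.0, 2.0, 3.0, 5.0, 8.0, 13.0, 21.0, 34.0]
```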

(back to top)

Structure of the Repository

nnodely/              # root directory
├── nnodely/          # source code
│   ├── basic/        # core low-level classes
│   ├── exporter/     # model export utilities
│   ├── layers/       # supported layers
│   ├── operators/    # core operators
│   ├── support/      # utility functions
│   └── visualizer/   # visualization tools
├── case-studies/     # main case studies
├── docs/             # documentation
├── tests/            # unit and integration tests
├── imgs/             # images used in the documentation
└── mplplots/         # utilities for Matplotlib

(back to top)

More info about repository structure

nnodely Folder

This folder contains all the nnodely library source files, together with the related references.

The main nnodely class, defined in nnodely.py, holds all the main properties of the nnodely object and derives from five main operators, contained in the folder operators/:

  1. composer.py contains all the functions to build the network: addModel, neuralizeModel, addConnection, addClosedLoop, etc.
  2. loader.py contains the functions for managing datasets; the main one is loadData.
  3. trainer.py contains the functions for training the network, such as trainModel.
  4. exporter.py contains all the functions for import and export: saveModel, loadModel, exportONNX, etc.
  5. validator.py contains all the functions for validating the model, such as resultsAnalysis.
  6. All the operators derive from Network, defined in network.py, which contains the shared support functions for all the operators.

The folder basic/ contains the main classes for the low-level functionalities:

  1. model.py contains the PyTorch template model for the structured network.
  2. modeldef.py contains the operations for working with the JSON model definition.
  3. loss.py contains the loss functions.
  4. optimizer.py contains the optimizer class.
  5. relation.py contains the main classes from which all the layers are derived.

The other folders are:

  1. exporter/, which contains the classes for the export functions.
  2. support/, which contains the support functions.
  3. visualizer/, which contains all the classes related to visualization.
  4. And finally the layers/ folder.

The layers/ folder contains all the layers that can be used in an MS-NN. In particular, a model-structured NN is defined by Inputs, Outputs, and Parameters:

  1. input.py contains the Input class, used to create an input for the network.
  2. output.py contains the Output class, used to create an output for the network.
  3. parameter.py contains the logic to create generic parameters and constants.

The main basic layers without parameters are:

  1. activation.py contains all the activation functions, mainly based on the PyTorch ones.
  2. arithmetic.py contains the arithmetic operations: +, -, /, *, **.
  3. trigonometric.py contains all the trigonometric functions.
  4. part.py contains the operations for selecting part of the data.
  5. fuzzify.py contains the operations for the fuzzification of a variable, commonly used in local models as an activation function, as in [1] with rectangular activation functions or in [3], [4], and [5] with triangular activation functions. Using fuzzification it is also possible to create a channel coding, as presented in [2].
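To illustrate the fuzzification idea in plain Python (this is a sketch of the concept, not the nnodely API; the names `triangular` and `fuzzify` are illustrative): overlapping triangular membership functions map a scalar into a vector of activation degrees.

```python
# Sketch of fuzzification with triangular membership functions.
# Not the nnodely API: helper names here are illustrative only.

def triangular(x, center, width):
    """Triangular membership: 1 at the center, decaying linearly to 0 at center +/- width."""
    return max(0.0, 1.0 - abs(x - center) / width)

def fuzzify(x, centers, width):
    """Map a scalar to a vector of activation degrees, one per membership function."""
    return [triangular(x, c, width) for c in centers]

# With centers spaced one width apart, neighbouring activations sum to 1
# for any x inside the covered range (a partition of unity).
print(fuzzify(1.5, [0.0, 1.0, 2.0, 3.0], 1.0))  # [0.0, 0.5, 0.5, 0.0]
```

In a local model, each submodel's output is weighted by these degrees, so only the submodels whose regions contain the current input contribute to the result.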

The main basic layers with parameters are:

  1. fir.py contains the finite impulse response (FIR) filter. It is a linear operation on the time dimension (second dimension). This filter was introduced in [1].
  2. linear.py contains the linear function, the typical W*x + b operation applied on the space dimension (third dimension). This operation is presented in [1].
  3. localmodel.py contains the logic to build a local model. This operation is presented in [1], [3], [4], and [5].
  4. parametricfunction.py contains user-defined custom functions, which can use the PyTorch syntax. A parametric function is presented in [3], [4], and [5].
  5. equationlearner.py contains the logic for the equation learner, which learns an input-output relation through a list of activation functions. The first implementation is presented in [6].
  6. timeoperation.py contains the time operation functions, used to extract a time window from a signal. The derivative operation can be used to implement physics-informed neural networks [7] and Sobolev learning [8].
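As a sketch of what the FIR layer computes (plain Python, not the nnodely API; the helper name `fir` is illustrative): the output is a weighted sum of the samples in a time window, i.e. a linear operation along the time axis, where the weights are the learnable parameters.

```python
# Sketch of a finite impulse response (FIR) operation on a time window.
# Not the nnodely API: the helper name is illustrative only.

def fir(window, weights):
    """Weighted sum over a time window (oldest sample first): a linear map on the time axis."""
    assert len(window) == len(weights)
    return sum(w * s for w, s in zip(weights, window))

print(fir([1.0, 2.0, 3.0], [0.0, 0.0, 1.0]))  # 3.0: these weights select the newest sample
```

Equal weights give a moving average, while learned weights let the network shape an arbitrary linear response over the window.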

Case Studies Folder

In the case studies folder you can find the main case studies cited in the paper. Each case study is a Jupyter notebook that explains the main functionalities of the library.

Docs Folder

This folder contains all files used to automatically generate the documentation.

Tests Folder

This folder contains the unit tests of the library. Each file tests a specific functionality.

Matplotlib Folder

This folder contains the utilities for Matplotlib.

Images Folder

This folder contains the images used in the documentation.

(back to top)

How to Contribute

To contribute to the nnodely framework, you can:

  • Open a pull request if you have a new feature or bug fix.
  • Open an issue if you have a question or suggestion.

We welcome contributions and collaborations.

(back to top)

License

This project is released under the MIT License.

(back to top)

References

[1] Mauro Da Lio, Daniele Bortoluzzi, Gastone Pietro Rosati Papini. (2019). Modelling longitudinal vehicle dynamics with neural networks. Vehicle System Dynamics. https://doi.org/10.1080/00423114.2019.1638947 (see the [code])

[2] Alice Plebe, Mauro Da Lio, Daniele Bortoluzzi. (2019). On Reliable Neural Network Sensorimotor Control in Autonomous Vehicles. IEEE Transactions on Intelligent Transportation Systems. https://doi.org/10.1109/TITS.2019.2896375

[3] Mauro Da Lio, Riccardo Donà, Gastone Pietro Rosati Papini, Francesco Biral, Henrik Svensson. (2020). A Mental Simulation Approach for Learning Neural-Network Predictive Control (in Self-Driving Cars). IEEE Access. https://doi.org/10.1109/ACCESS.2020.3032780 (see the [code])

[4] Edoardo Pagot, Mattia Piccinini, Enrico Bertolazzi, Francesco Biral. (2023). Fast Planning and Tracking of Complex Autonomous Parking Maneuvers With Optimal Control and Pseudo-Neural Networks. IEEE Access. https://doi.org/10.1109/ACCESS.2023.3330431 (see the [code])

[5] Mattia Piccinini, Sebastiano Taddei, Matteo Larcher, Mattia Piazza, Francesco Biral. (2023). A Physics-Driven Artificial Agent for Online Time-Optimal Vehicle Motion Planning and Control. IEEE Access. https://doi.org/10.1109/ACCESS.2023.3274836 (see [code basic] and [code extended])

[6] Hector Perez-Villeda, Justus Piater, Matteo Saveriano. (2023). Learning and extrapolation of robotic skills using task-parameterized equation learner networks. Robotics and Autonomous Systems. https://doi.org/10.1016/j.robot.2022.104309 (see the [code])

[7] M. Raissi, P. Perdikaris, G.E. Karniadakis. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics. https://doi.org/10.1016/j.jcp.2018.10.045 (see the [example Burgers' equation])

[8] Wojciech Marian Czarnecki, Simon Osindero, Max Jaderberg, Grzegorz Świrszcz, Razvan Pascanu. (2017). Sobolev Training for Neural Networks. arXiv. https://doi.org/10.48550/arXiv.1706.04859 (see the [code])

[9] Mattia Piccinini, Matteo Zumerle, Johannes Betz, Gastone Pietro Rosati Papini. (2025). A Road Friction-Aware Anti-Lock Braking System Based on Model-Structured Neural Networks. IEEE Open Journal of Intelligent Transportation Systems. https://doi.org/10.1109/OJITS.2025.3563347 (see the [code])

[10] Mauro Da Lio, Mattia Piccinini, Francesco Biral. (2023). Robust and Sample-Efficient Estimation of Vehicle Lateral Velocity Using Neural Networks With Explainable Structure Informed by Kinematic Principles. IEEE Transactions on Intelligent Transportation Systems. https://doi.org/10.1109/TITS.2023.3303776

(back to top)
