Selective drop #1

Merged
merged 28 commits on Nov 13, 2023
Commits (28)
ac65d5e
restructure; __init__ to each module
ccomkhj Nov 3, 2023
eb41bd6
introduce unit test for MultiConstraintLinearRegression
ccomkhj Nov 3, 2023
6ffc9f2
introduce base class
ccomkhj Nov 3, 2023
87bdd82
inherit from base
ccomkhj Nov 3, 2023
9fb6c79
modularizing shape checks
ccomkhj Nov 3, 2023
f90dd46
docstring; formatting; private method;
ccomkhj Nov 3, 2023
ba8de29
delete overlapping methods from super class;
ccomkhj Nov 3, 2023
7f03cf9
introduce test; comparison between single and multi constrained LR; …
ccomkhj Nov 3, 2023
044ed75
update tutorial of multiconstrainedLR based on updates
ccomkhj Nov 3, 2023
6425224
build constrained_mlp; check if it works the same as sklearn's MLP
ccomkhj Nov 6, 2023
f74bd21
introduce clipping coefs
ccomkhj Nov 6, 2023
0bfb432
introduce multi coefficients
ccomkhj Nov 6, 2023
257f725
introduce test for mlp
ccomkhj Nov 6, 2023
c14c9d9
fix typo and error
ccomkhj Nov 6, 2023
92c5ad1
formatting
ccomkhj Nov 7, 2023
e5b05bc
gradient update only once per iteration
ccomkhj Nov 7, 2023
d446157
include loss to track model learning
ccomkhj Nov 7, 2023
50b92f8
grad update after beta iteration; equal to basemodel
ccomkhj Nov 8, 2023
384cf5f
lbfgs works with clipping
ccomkhj Nov 8, 2023
978d1a9
include training loss in mlp
ccomkhj Nov 8, 2023
84a1e04
update docs
ccomkhj Nov 8, 2023
ecb6544
minor font
ccomkhj Nov 8, 2023
4cb5b8b
minor font2
ccomkhj Nov 8, 2023
d796262
introduce selective drop LR
ccomkhj Nov 10, 2023
c97630f
introduce test for selective drop LR
ccomkhj Nov 10, 2023
65d29bd
mlp with bounds passed directly to the optimizer
ccomkhj Nov 10, 2023
575100b
introduce selective drop module
ccomkhj Nov 10, 2023
95b2791
introduce selective_drop_positive_lr; utilize positive flag from the …
ccomkhj Nov 13, 2023
82 changes: 35 additions & 47 deletions README.md
@@ -1,54 +1,42 @@
# constrained-linear-regression
[![PyPI version](https://badge.fury.io/py/constrained-linear-regression.svg)](https://badge.fury.io/py/constrained-linear-regression)
# Multi-Constrained Regression and Neural Network Repository

This is a Python implementation of constrained linear regression in the scikit-learn style.
The current version supports an upper and a lower bound for each slope coefficient.
## Overview

It was developed in response to this Stack Overflow question: https://stackoverflow.com/questions/50410037
This repository hosts techniques for constraining the weights of selected inputs in regression and multi-layer perceptron models. Inspired by the robust scikit-learn library, we have reverse engineered and extended its capabilities to fit custom requirements for specific learning problems.

Installation:
```pip install constrained-linear-regression```
## Purpose

You can use this model, for example, if you want all coefficients to be non-negative:
This repository is a resource for machine learning practitioners who need to impose constraints on the weights of input features, which can be critical in domains such as finance, healthcare, and operational research. The reverse-engineered solutions here give greater control over a model's behavior, ensuring that the influence of selected features stays within desired boundaries.

```Python
from constrained_linear_regression import ConstrainedLinearRegression
from sklearn.datasets import load_boston
from sklearn.linear_model import LinearRegression
X, y = load_boston(return_X_y=True)
model = ConstrainedLinearRegression(nonnegative=True)
model.fit(X, y)
print(model.intercept_)
print(model.coef_)
```
The output will look like
```commandline
-36.99292986145538
[0. 0.05286515 0. 4.12512386 0. 8.04017956
0. 0. 0. 0. 0. 0.02273805
0. ]
```
You can also impose arbitrary bounds for any coefficients you choose
```Python
import numpy as np

model = ConstrainedLinearRegression()
min_coef = np.repeat(-np.inf, X.shape[1])
min_coef[0] = 0
min_coef[4] = -1
max_coef = np.repeat(4, X.shape[1])
max_coef[3] = 2
model.fit(X, y, max_coef=max_coef, min_coef=min_coef)
print(model.intercept_)
print(model.coef_)
```
The output will be
```commandline
24.060175576410515
[ 0. 0.04504673 -0.0354073 2. -1. 4.
-0.01343263 -1.17231216 0.2183103 -0.01375266 -0.7747823 0.01122374
-0.56678676]
```
## Tutorials

We provide detailed tutorials for the following topics:

- **Multi-Constrained Linear Regression**: This tutorial takes you through the steps of creating a linear regression model that allows constraints to be placed on the weights of multiple input features (a minimal sketch of the underlying interface follows this list).
- [Multi-Constrained Linear Regression Tutorial](tutorial/MultiConstrainedLinearRegression.md)

- **Multi-Constrained Multi-Layer Perceptron**: Explore the implementation of a multi-layer perceptron (neural network) that incorporates constraints on the weights corresponding to specific input features.
- [Multi-Constrained Multi-Layer Perceptron Tutorial](tutorial/MultiConstrainedMultiLayerPerceptron.md)
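
For orientation before diving into the tutorials, here is a minimal sketch of the per-feature bound interface shown earlier in this README, using the base `ConstrainedLinearRegression` estimator. The synthetic data and the specific bound values are illustrative assumptions only.

```Python
import numpy as np
from sklearn.datasets import make_regression
from constrained_linear_regression import ConstrainedLinearRegression

# Illustrative synthetic data; the tutorials walk through real use cases.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

# Per-feature box constraints: feature 0 must be non-negative,
# and the coefficient of feature 3 may not exceed 2.
min_coef = np.repeat(-np.inf, X.shape[1])
min_coef[0] = 0
max_coef = np.repeat(np.inf, X.shape[1])
max_coef[3] = 2

model = ConstrainedLinearRegression()
model.fit(X, y, max_coef=max_coef, min_coef=min_coef)
print(model.intercept_)
print(model.coef_)
```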

## Features

- Reverse engineering techniques applied to scikit-learn's Linear Regression and MLP models
- Custom weight constraint functionalities (a generic sketch of the coefficient-clipping idea follows this list)
- Step-by-step tutorials for implementing the above models
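
The constraint features above rest on a simple mechanism, suggested by the commit history of this PR: clip (project) the coefficients back into their allowed range after each optimizer update. The sketch below illustrates that idea in plain NumPy; it is a generic, assumed illustration, not this repository's implementation.

```Python
import numpy as np

def constrained_lr_gd(X, y, min_coef, max_coef, lr=0.01, n_iter=5000):
    """Projected gradient descent for least squares with box-constrained coefficients."""
    n_samples, n_features = X.shape
    coef = np.zeros(n_features)
    intercept = 0.0
    for _ in range(n_iter):
        residual = X @ coef + intercept - y
        grad_coef = X.T @ residual / n_samples
        grad_intercept = residual.mean()
        coef -= lr * grad_coef
        intercept -= lr * grad_intercept
        # Project (clip) the coefficients back into their allowed box after each step.
        coef = np.clip(coef, min_coef, max_coef)
    return intercept, coef
```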

## Getting Started

To get started with these tutorials and code, clone the repository and navigate to the `tutorial` directory, where you will find the markdown files with detailed explanations and code samples.

```bash
git clone https://github.com/your-github-username/multi-constrained-models.git
cd multi-constrained-models/tutorial
```
### Contributing
We welcome contributions from the community! Whether it's improving the tutorials, extending the features of the models, or fixing bugs, please feel free to fork the repo, make your changes, and submit a pull request.

You can also set the `lasso` and `ridge` coefficients to apply the corresponding penalties. For `lasso`, however, the output might not exactly match the result of `sklearn.linear_model.Lasso` due to differences in the optimization algorithm.
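
A hedged sketch of that option follows; whether `lasso` and `ridge` are constructor arguments (as assumed here) should be checked against the class signature in this package.

```Python
from constrained_linear_regression import ConstrainedLinearRegression

# Assumption: the penalty strengths are passed to the constructor.
model = ConstrainedLinearRegression(ridge=1.0, lasso=0.1, nonnegative=True)
model.fit(X, y)  # X, y as in the examples above
print(model.coef_)
```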
### Acknowledgments
Thanks to the scikit-learn developers for their work on creating a comprehensive machine learning library.
This project was inspired by the need for industry-specific machine learning models that require tailored constraints.
### Contact
If you have any questions or feedback, please open an issue in the repository, and we'll get back to you as soon as possible.