Commit
Added examples
leschultz committed Apr 18, 2024
1 parent 0282d29 commit 72f2810
Showing 27 changed files with 126 additions and 12,352 deletions.
21 changes: 20 additions & 1 deletion README.md
Original file line number Diff line number Diff line change
@@ -1,2 +1,21 @@
# multilearn
Pytorch wrapper for multi-task learning
PyTorch wrapper for multi-task learning. The GitHub repo can be found [here](https://github.com/leschultz/multilearn.git).

## Coding Style

Python scripts follow PEP 8 guidelines. A useful tool for checking coding style is flake8.

```
flake8 <script>
```

## Authors

### Graduate Students
* **Lane Schultz** - *Main Contributor* - [leschultz](https://github.com/leschultz)

## Acknowledgments

* The [Computational Materials Group (CMG)](https://matmodel.engr.wisc.edu/) at the University of Wisconsin - Madison
* Professor Dane Morgan [ddmorgan](https://github.com/ddmorgan) and Dr. Ryan Jacobs [rjacobs914](https://github.com/rjacobs914) for computational materials science guidance

39 changes: 39 additions & 0 deletions examples/materials/asr/fit.py
@@ -0,0 +1,39 @@
from sklearn.preprocessing import StandardScaler
from multilearn import datasets, models, utils
from torch import optim, nn


def main():

save_dir = 'outputs'
lr = 1e-4
batch_size = 32
n_epochs = 1000
tasks = ['asr']

# Data
X, y = datasets.load(tasks)
data = datasets.splitter(X, y, tasks, train_size=1)

for k, v in data.items():
data[k]['scaler'] = StandardScaler()
data[k]['loss'] = nn.L1Loss()

model = models.MultiNet(tasks=tasks, input_arch={500: 1})
optimizer = optim.Adam

out = utils.train(
model,
optimizer,
data,
n_epochs=n_epochs,
batch_size=batch_size,
lr=lr,
save_dir=save_dir,
)

print(out['df_loss'])


if __name__ == '__main__':
main()
94 changes: 0 additions & 94 deletions examples/materials/asr/run.py

This file was deleted.

4 changes: 2 additions & 2 deletions examples/materials/asr/run.sh
@@ -1,2 +1,2 @@
export PYTHONPATH=../../:$PYTHONPATH
python3 run.py
export PYTHONPATH=../../../src/:$PYTHONPATH
python3 fit.py
39 changes: 39 additions & 0 deletions examples/materials/combined/fit.py
@@ -0,0 +1,39 @@
from sklearn.preprocessing import StandardScaler
from multilearn import datasets, models, utils
from torch import optim, nn


def main():

save_dir = 'outputs'
lr = 1e-4
batch_size = 32
n_epochs = 1000
tasks = ['asr', 'opband', 'stability']

# Data
X, y = datasets.load(tasks)
data = datasets.splitter(X, y, tasks, train_size=1)

for k, v in data.items():
data[k]['scaler'] = StandardScaler()
data[k]['loss'] = nn.L1Loss()

model = models.MultiNet(tasks=tasks, input_arch={500: 1})
optimizer = optim.Adam

out = utils.train(
model,
optimizer,
data,
n_epochs=n_epochs,
batch_size=batch_size,
lr=lr,
save_dir=save_dir,
)

print(out['df_loss'])


if __name__ == '__main__':
main()
94 changes: 0 additions & 94 deletions examples/materials/combined/run.py

This file was deleted.

4 changes: 2 additions & 2 deletions examples/materials/combined/run.sh
@@ -1,2 +1,2 @@
export PYTHONPATH=../../:$PYTHONPATH
python3 run.py
export PYTHONPATH=../../../src/:$PYTHONPATH
python3 fit.py
Empty file added src/multilearn/__init__.py
Empty file.
File renamed without changes.
32 changes: 0 additions & 32 deletions src/multilearn/data/clean.py

This file was deleted.

File renamed without changes.

0 comments on commit 72f2810