Merge pull request #1473 from rstudio/retether-3.6.0
Retether to Keras v3.6.0
t-kalinowski authored Oct 17, 2024
2 parents f78909c + c70bcba commit 6729243
Showing 472 changed files with 14,767 additions and 199 deletions.
3 changes: 2 additions & 1 deletion .tether/man/activation_elu.txt
Original file line number Diff line number Diff line change
@@ -3,7 +3,7 @@ keras.activations.elu(x, alpha=1.0)
__doc__
Exponential Linear Unit.

The exponential linear unit (ELU) with `alpha > 0` is define as:
The exponential linear unit (ELU) with `alpha > 0` is defined as:

- `x` if `x > 0`
- `alpha * (exp(x) - 1)` if `x < 0`
@@ -23,3 +23,4 @@ Args:
Reference:

- [Clevert et al., 2016](https://arxiv.org/abs/1511.07289)

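The two-branch definition above can be sketched as a scalar Python function (an illustration of the formula only, not the actual `keras.activations.elu` implementation, which operates on tensors):

```python
import math

def elu(x, alpha=1.0):
    """Exponential Linear Unit: x if x > 0, else alpha * (exp(x) - 1)."""
    if x > 0:
        return x
    return alpha * (math.exp(x) - 1.0)
```

For large negative `x` the output saturates smoothly toward `-alpha`, which is the property the Clevert et al. reference motivates.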
2 changes: 1 addition & 1 deletion .tether/man/application_densenet121.txt
@@ -53,7 +53,7 @@ Args:
be applied.
classes: optional number of classes to classify images
into, only to be specified if `include_top` is `True`, and
if no `weights` argument is specified.
if no `weights` argument is specified. Defaults to 1000.
classifier_activation: A `str` or callable.
The activation function to use
on the "top" layer. Ignored unless `include_top=True`. Set
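The constraint documented above (`classes` is honored only with `include_top=True` and no pretrained `weights`, defaulting to 1000) can be sketched as a small validation helper. `resolve_classes` is a hypothetical name for illustration, not part of the Keras API:

```python
def resolve_classes(include_top, weights, classes=None):
    # A custom class count only makes sense when we build the "top"
    # classifier ourselves and don't load pretrained (1000-class) weights.
    if classes is not None and (not include_top or weights is not None):
        raise ValueError(
            "`classes` can only be set when `include_top=True` "
            "and `weights=None`"
        )
    return classes if classes is not None else 1000  # documented default
```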
2 changes: 1 addition & 1 deletion .tether/man/application_densenet169.txt
@@ -53,7 +53,7 @@ Args:
be applied.
classes: optional number of classes to classify images
into, only to be specified if `include_top` is `True`, and
if no `weights` argument is specified.
if no `weights` argument is specified. Defaults to 1000.
classifier_activation: A `str` or callable.
The activation function to use
on the "top" layer. Ignored unless `include_top=True`. Set
2 changes: 1 addition & 1 deletion .tether/man/application_densenet201.txt
@@ -53,7 +53,7 @@ Args:
be applied.
classes: optional number of classes to classify images
into, only to be specified if `include_top` is `True`, and
if no `weights` argument is specified.
if no `weights` argument is specified. Defaults to 1000.
classifier_activation: A `str` or callable.
The activation function to use
on the "top" layer. Ignored unless `include_top=True`. Set
2 changes: 1 addition & 1 deletion .tether/man/application_resnet101.txt
@@ -53,7 +53,7 @@ Args:
- `max` means that global max pooling will be applied.
classes: optional number of classes to classify images into, only to be
specified if `include_top` is `True`, and if no `weights` argument is
specified.
specified. Defaults to `1000`.
classifier_activation: A `str` or callable. The activation function to
use on the "top" layer. Ignored unless `include_top=True`. Set
`classifier_activation=None` to return the logits of the "top" layer.
2 changes: 1 addition & 1 deletion .tether/man/application_resnet152.txt
@@ -53,7 +53,7 @@ Args:
- `max` means that global max pooling will be applied.
classes: optional number of classes to classify images into, only to be
specified if `include_top` is `True`, and if no `weights` argument is
specified.
specified. Defaults to `1000`.
classifier_activation: A `str` or callable. The activation function to
use on the "top" layer. Ignored unless `include_top=True`. Set
`classifier_activation=None` to return the logits of the "top" layer.
2 changes: 1 addition & 1 deletion .tether/man/application_resnet50.txt
@@ -53,7 +53,7 @@ Args:
- `max` means that global max pooling will be applied.
classes: optional number of classes to classify images into, only to be
specified if `include_top` is `True`, and if no `weights` argument is
specified.
specified. Defaults to `1000`.
classifier_activation: A `str` or callable. The activation function to
use on the "top" layer. Ignored unless `include_top=True`. Set
`classifier_activation=None` to return the logits of the "top" layer.
2 changes: 1 addition & 1 deletion .tether/man/application_xception.txt
@@ -59,7 +59,7 @@ Args:
be applied.
classes: optional number of classes to classify images
into, only to be specified if `include_top` is `True`, and
if no `weights` argument is specified.
if no `weights` argument is specified. Defaults to `1000`.
classifier_activation: A `str` or callable. The activation function to
use on the "top" layer. Ignored unless `include_top=True`. Set
`classifier_activation=None` to return the logits of the "top"
2 changes: 1 addition & 1 deletion .tether/man/clone_model.txt
@@ -87,7 +87,7 @@ def clone_function(layer):
config["seed"] = 1337
return layer.__class__.from_config(config)

new_model = clone_model(model)
new_model = clone_model(model, clone_function=clone_function)
```

Using a `call_function` to add a `Dropout` layer after each `Dense` layer
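The `clone_function` contract that the corrected call above exercises can be sketched independently of Keras: the function receives each layer (reduced here to a plain config dict, an assumption for illustration) and returns what the clone should use. Both helper names below are illustrative, not Keras API:

```python
def clone_layers(layer_configs, clone_function):
    # Minimal analogue of clone_model's traversal: apply the
    # user-supplied clone_function to every layer in order.
    return [clone_function(config) for config in layer_configs]

def fix_seed(config):
    # Like the docstring example: pin any `seed` entry to 1337
    # without mutating the caller's config.
    config = dict(config)
    if "seed" in config:
        config["seed"] = 1337
    return config
```

Forgetting to pass `clone_function=` (the bug the diff fixes) would leave every config untouched.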
@@ -2,7 +2,8 @@ __signature__
keras.Model.export(
self,
filepath,
format='tf_saved_model'
format='tf_saved_model',
verbose=True
)
__doc__
Create a TF SavedModel artifact for inference.
@@ -22,6 +23,7 @@ entirely standalone.
Args:
filepath: `str` or `pathlib.Path` object. Path where to save
the artifact.
verbose: whether to print all the variables of the exported model.

Example:

15 changes: 10 additions & 5 deletions .tether/man/get_file.txt
@@ -35,14 +35,18 @@ path_to_downloaded_file = get_file(
```

Args:
fname: Name of the file. If an absolute path, e.g. `"/path/to/file.txt"`
is specified, the file will be saved at that location.
fname: If the target is a single file, this is your desired
local name for the file.
If `None`, the name of the file at `origin` will be used.
If downloading and extracting a directory archive,
the provided `fname` will be used as extraction directory
name (only if it doesn't have an extension).
origin: Original URL of the file.
untar: Deprecated in favor of `extract` argument.
boolean, whether the file should be decompressed
Boolean, whether the file is a tar archive that should
be extracted.
md5_hash: Deprecated in favor of `file_hash` argument.
md5 hash of the file for verification
md5 hash of the file for file integrity verification.
file_hash: The expected hash string of the file after download.
The sha256 and md5 hash algorithms are both supported.
cache_subdir: Subdirectory under the Keras cache dir where the file is
@@ -51,7 +55,8 @@ Args:
hash_algorithm: Select the hash algorithm to verify the file.
Options are `"md5"`, `"sha256"`, and `"auto"`.
The default `"auto"` detects the hash algorithm in use.
extract: True tries extracting the file as an Archive, like tar or zip.
extract: If `True`, extracts the archive. Only applicable to compressed
archive files like tar or zip.
archive_format: Archive format to try for extracting the file.
Options are `"auto"`, `"tar"`, `"zip"`, and `None`.
`"tar"` includes tar, tar.gz, and tar.bz files.
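The `"auto"` hash detection described above can be illustrated by digest length: md5 hex digests are 32 characters, sha256 digests are 64. A sketch of that heuristic under those assumptions (not the Keras internals):

```python
import hashlib

def detect_algorithm(file_hash):
    # md5 -> 32 hex chars, sha256 -> 64 hex chars.
    if len(file_hash) == 32:
        return "md5"
    if len(file_hash) == 64:
        return "sha256"
    raise ValueError(f"cannot infer algorithm from hash: {file_hash!r}")

def verify_bytes(data, file_hash):
    """Check downloaded bytes against an expected md5/sha256 hex digest."""
    algorithm = detect_algorithm(file_hash)
    return hashlib.new(algorithm, data).hexdigest() == file_hash
```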
67 changes: 67 additions & 0 deletions .tether/man/get_state_tree.txt
@@ -0,0 +1,67 @@
__signature__
keras.Model.get_state_tree(self, value_format='backend_tensor')
__doc__
Retrieves tree-like structure of model variables.

This method allows retrieval of different model variables (trainable,
non-trainable, optimizer, and metrics). The variables are returned in a
nested dictionary format, where the keys correspond to the variable
names and the values are the nested representations of the variables.

Args:
    value_format: One of `"backend_tensor"`, `"numpy_array"`.
        The kind of array to return as the leaves of the nested
        state tree.

Returns:
    dict: A dictionary containing the nested representations of the
        requested variables. The keys are the variable names, and the
        values are the corresponding nested dictionaries.

Example:

```python
model = keras.Sequential([
keras.Input(shape=(1,), name="my_input"),
keras.layers.Dense(1, activation="sigmoid", name="my_dense"),
], name="my_sequential")
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(np.array([[1.0]]), np.array([[1.0]]))
state_tree = model.get_state_tree()
```

The `state_tree` dictionary returned looks like:

```
{
'metrics_variables': {
'loss': {
'count': ...,
'total': ...,
},
'mean_absolute_error': {
'count': ...,
'total': ...,
}
},
'trainable_variables': {
'my_sequential': {
'my_dense': {
'bias': ...,
'kernel': ...,
}
}
},
'non_trainable_variables': {},
'optimizer_variables': {
'adam': {
'iteration': ...,
'learning_rate': ...,
'my_sequential_my_dense_bias_momentum': ...,
'my_sequential_my_dense_bias_velocity': ...,
'my_sequential_my_dense_kernel_momentum': ...,
'my_sequential_my_dense_kernel_velocity': ...,
}
}
}
```
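A nested tree like the one above is easy to post-process, for example by flattening it into slash-delimited variable paths. This is a generic helper sketch, not a Keras API:

```python
def flatten_state_tree(tree, prefix=""):
    """Flatten a nested variable dict into {'outer/inner/leaf': value}."""
    flat = {}
    for key, value in tree.items():
        path = f"{prefix}/{key}" if prefix else key
        if isinstance(value, dict) and value:
            flat.update(flatten_state_tree(value, path))
        else:
            # Leaves (and empty sections like non_trainable_variables)
            # are kept as-is under their full path.
            flat[path] = value
    return flat
```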

2 changes: 2 additions & 0 deletions .tether/man/initializer_glorot_normal.txt
@@ -37,6 +37,7 @@ class GlorotNormal(VarianceScaling)
| Method resolution order:
| GlorotNormal
| VarianceScaling
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -51,3 +52,4 @@ class GlorotNormal(VarianceScaling)
| Returns:
| A JSON-serializable Python dict.
|

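The `VarianceScaling` base that `GlorotNormal` resolves through comes down to one statistic: samples are drawn (truncated) around zero with `stddev = sqrt(2 / (fan_in + fan_out))`. A sketch of that scaling rule; the helper name is illustrative, not Keras API, and the actual initializer also truncates the distribution:

```python
import math

def glorot_normal_stddev(fan_in, fan_out):
    # Glorot/Xavier scaling keeps activation variance roughly constant
    # across layers: stddev = sqrt(2 / (fan_in + fan_out)).
    return math.sqrt(2.0 / (fan_in + fan_out))
```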
2 changes: 2 additions & 0 deletions .tether/man/initializer_glorot_uniform.txt
@@ -36,6 +36,7 @@ class GlorotUniform(VarianceScaling)
| Method resolution order:
| GlorotUniform
| VarianceScaling
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -50,3 +51,4 @@ class GlorotUniform(VarianceScaling)
| Returns:
| A JSON-serializable Python dict.
|

2 changes: 2 additions & 0 deletions .tether/man/initializer_he_normal.txt
@@ -36,6 +36,7 @@ class HeNormal(VarianceScaling)
| Method resolution order:
| HeNormal
| VarianceScaling
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -50,3 +51,4 @@ class HeNormal(VarianceScaling)
| Returns:
| A JSON-serializable Python dict.
|

2 changes: 2 additions & 0 deletions .tether/man/initializer_he_uniform.txt
@@ -36,6 +36,7 @@ class HeUniform(VarianceScaling)
| Method resolution order:
| HeUniform
| VarianceScaling
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -50,3 +51,4 @@ class HeUniform(VarianceScaling)
| Returns:
| A JSON-serializable Python dict.
|

2 changes: 2 additions & 0 deletions .tether/man/initializer_lecun_normal.txt
@@ -40,6 +40,7 @@ class LecunNormal(VarianceScaling)
| Method resolution order:
| LecunNormal
| VarianceScaling
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -54,3 +55,4 @@ class LecunNormal(VarianceScaling)
| Returns:
| A JSON-serializable Python dict.
|

2 changes: 2 additions & 0 deletions .tether/man/initializer_lecun_uniform.txt
@@ -36,6 +36,7 @@ class LecunUniform(VarianceScaling)
| Method resolution order:
| LecunUniform
| VarianceScaling
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -50,3 +51,4 @@ class LecunUniform(VarianceScaling)
| Returns:
| A JSON-serializable Python dict.
|

4 changes: 3 additions & 1 deletion .tether/man/initializer_orthogonal.txt
@@ -1,6 +1,6 @@
Help on class OrthogonalInitializer in module keras.src.initializers.random_initializers:

class OrthogonalInitializer(keras.src.initializers.initializer.Initializer)
class OrthogonalInitializer(RandomInitializer)
| OrthogonalInitializer(gain=1.0, seed=None)
|
| Initializer that generates an orthogonal matrix.
@@ -37,6 +37,7 @@ class OrthogonalInitializer(keras.src.initializers.initializer.Initializer)
|
| Method resolution order:
| OrthogonalInitializer
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -66,3 +67,4 @@ class OrthogonalInitializer(keras.src.initializers.initializer.Initializer)
| Returns:
| A JSON-serializable Python dict.
|

4 changes: 3 additions & 1 deletion .tether/man/initializer_random_normal.txt
@@ -1,6 +1,6 @@
Help on class RandomNormal in module keras.src.initializers.random_initializers:

class RandomNormal(keras.src.initializers.initializer.Initializer)
class RandomNormal(RandomInitializer)
| RandomNormal(mean=0.0, stddev=0.05, seed=None)
|
| Random normal initializer.
@@ -33,6 +33,7 @@ class RandomNormal(keras.src.initializers.initializer.Initializer)
|
| Method resolution order:
| RandomNormal
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -63,3 +64,4 @@ class RandomNormal(keras.src.initializers.initializer.Initializer)
| Returns:
| A JSON-serializable Python dict.
|

4 changes: 3 additions & 1 deletion .tether/man/initializer_random_uniform.txt
@@ -1,6 +1,6 @@
Help on class RandomUniform in module keras.src.initializers.random_initializers:

class RandomUniform(keras.src.initializers.initializer.Initializer)
class RandomUniform(RandomInitializer)
| RandomUniform(minval=-0.05, maxval=0.05, seed=None)
|
| Random uniform initializer.
@@ -33,6 +33,7 @@ class RandomUniform(keras.src.initializers.initializer.Initializer)
|
| Method resolution order:
| RandomUniform
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -63,3 +64,4 @@ class RandomUniform(keras.src.initializers.initializer.Initializer)
| Returns:
| A JSON-serializable Python dict.
|

4 changes: 3 additions & 1 deletion .tether/man/initializer_truncated_normal.txt
@@ -1,6 +1,6 @@
Help on class TruncatedNormal in module keras.src.initializers.random_initializers:

class TruncatedNormal(keras.src.initializers.initializer.Initializer)
class TruncatedNormal(RandomInitializer)
| TruncatedNormal(mean=0.0, stddev=0.05, seed=None)
|
| Initializer that generates a truncated normal distribution.
@@ -36,6 +36,7 @@ class TruncatedNormal(keras.src.initializers.initializer.Initializer)
|
| Method resolution order:
| TruncatedNormal
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -66,3 +67,4 @@ class TruncatedNormal(keras.src.initializers.initializer.Initializer)
| Returns:
| A JSON-serializable Python dict.
|

4 changes: 3 additions & 1 deletion .tether/man/initializer_variance_scaling.txt
@@ -1,6 +1,6 @@
Help on class VarianceScaling in module keras.src.initializers.random_initializers:

class VarianceScaling(keras.src.initializers.initializer.Initializer)
class VarianceScaling(RandomInitializer)
| VarianceScaling(scale=1.0, mode='fan_in', distribution='truncated_normal', seed=None)
|
| Initializer that adapts its scale to the shape of its input tensors.
@@ -45,6 +45,7 @@ class VarianceScaling(keras.src.initializers.initializer.Initializer)
|
| Method resolution order:
| VarianceScaling
| RandomInitializer
| keras.src.initializers.initializer.Initializer
| builtins.object
|
@@ -76,3 +77,4 @@ class VarianceScaling(keras.src.initializers.initializer.Initializer)
| Returns:
| A JSON-serializable Python dict.
|
