Add Pipeline class #2140

Closed

Conversation

@andrey-churkin (Contributor) commented Sep 15, 2023

Changes

The primary goals of this pull request are to:

  • Add the Pipeline class (a sketch of the abstraction follows this list)
  • Add support for pipelines within the HyperparameterTuner algorithm
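
For orientation, a minimal sketch of what such a Pipeline abstraction might look like. The stand-in types below are placeholders for illustration and are not the actual NNCF classes; only the names Pipeline, PipelineStep, and run are taken from this PR's diff.

from abc import ABC, abstractmethod
from typing import Any, List, TypeVar

TModel = TypeVar("TModel")  # placeholder for NNCF's model type variable


class Algorithm(ABC):
    """Stand-in for NNCF's algorithm interface."""

    @abstractmethod
    def apply(self, model: TModel, graph: Any, statistic_points: Any) -> TModel:
        """Apply the algorithm to the model and return the transformed model."""


# A pipeline step is an ordered list of algorithms that share a single
# statistics-collection pass over the model.
PipelineStep = List[Algorithm]


class Pipeline(ABC):
    """Base interface: transform an initial model into a result model."""

    @abstractmethod
    def run(self, model: TModel, dataset: Any) -> TModel:
        """Run the pipeline on the given model using the given dataset."""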

Reason for changes

Related tickets

Ref: 117471

Tests

@github-actions bot added the NNCF PTQ and NNCF OpenVINO labels Sep 15, 2023
@andrey-churkin marked this pull request as ready for review September 18, 2023 08:50
@andrey-churkin requested a review from a team as a code owner September 18, 2023 08:50
@andrey-churkin (Contributor, Author) commented:

@alexsu52 @KodiaqQ Guys, could you please take a look at this PR? Are there any changes that I should make before I start fixing the tests?

@github-actions bot added the NNCF PT, experimental, and NNCF ONNX labels Sep 18, 2023
@andrey-churkin force-pushed the ac/pipelines branch 2 times, most recently from 8041e5d to c0ce9ae September 18, 2023 12:06
@github-actions bot added the NNCF Common label Sep 18, 2023
@andrey-churkin force-pushed the ac/pipelines branch 5 times, most recently from 750ad55 to 55fcd6c September 19, 2023 07:28
codecov bot commented Sep 19, 2023

Codecov Report

Merging #2140 (37c8666) into develop (631a7b8) will increase coverage by 0.04%.
Report is 4 commits behind head on develop.
The diff coverage is 64.46%.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #2140      +/-   ##
===========================================
+ Coverage    36.24%   36.29%   +0.04%     
===========================================
  Files          477      478       +1     
  Lines        42570    42617      +47     
===========================================
+ Hits         15428    15466      +38     
- Misses       27142    27151       +9     
Files Changed Coverage Δ
.../experimental/torch/quantization/quantize_model.py 0.00% <0.00%> (ø)
nncf/openvino/quantization/quantize_model.py 0.00% <0.00%> (ø)
...tization/algorithms/channel_alignment/algorithm.py 24.70% <8.33%> (-0.72%) ⬇️
...ization/pipelines/hyperparameter_tuner/pipeline.py 58.00% <11.62%> (ø)
...ation/pipelines/hyperparameter_tuner/param_grid.py 55.88% <54.16%> (ø)
nncf/quantization/quantize_model.py 52.63% <60.00%> (ø)
...f/quantization/pipelines/post_training/pipeline.py 95.00% <95.00%> (ø)
nncf/onnx/quantization/quantize_model.py 96.42% <100.00%> (-0.24%) ⬇️
.../quantization/algorithms/smooth_quant/algorithm.py 31.92% <100.00%> (+4.07%) ⬆️
nncf/quantization/pipelines/pipeline.py 100.00% <100.00%> (ø)

@@ -116,8 +116,7 @@ def native_quantize_impl(
         advanced_parameters=advanced_parameters,
     )
 
-    graph = GraphConverter.create_nncf_graph(model)
-    quantized_model = quantization_algorithm.apply(model, graph, dataset=calibration_dataset)
+    quantized_model = quantization_algorithm.run(model, calibration_dataset)
A Collaborator commented:

quantization_pipeline.run? For consistency.
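
To make the suggestion concrete, a toy sketch of the naming convention being proposed, where a pipeline exposes run() and keeps graph construction internal. The class and variable names here are illustrative, not from the diff.

from typing import Any, Iterable


class QuantizationPipeline:
    """Toy stand-in: a pipeline hides graph construction behind run()."""

    def run(self, model: Any, calibration_dataset: Iterable[Any]) -> Any:
        # A real pipeline would build the NNCFGraph here and execute its
        # steps; the caller no longer creates the graph itself.
        return model


quantization_pipeline = QuantizationPipeline()
quantized_model = quantization_pipeline.run(model=object(), calibration_dataset=[])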

        calibration_dataset: Dataset,
        validation_fn: Callable[[Any, Iterable[Any]], Tuple[float, Union[None, List[float], List[List[TTensor]]]]],
        subset_size: int,
        initial_metric_results: MetricResults,
        quantized_metric_results: MetricResults,
    ):
        """
        :param algorithm_cls: Class of algorithm.
        :param pipeline_cls: Class of pipeline.
A Collaborator commented:
Pipeline or StepwisePipeline after all?


algorithm = self._algorithms[best_combination_key]
result_model = algorithm.apply(model, initial_graph, self._statistic_points)
# TODO(andrey-churkin): Show final best settings
A Collaborator commented:
How are the final best_settings applied to the model?

    if bias_correction_params.threshold is not None:
        threshold = bias_correction_params.threshold

    pipeline_steps[-1].append(
A Collaborator commented:
Why -1?
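
For readers following along, a hedged illustration of the structure in question: pipeline_steps is a list of steps, and each step is a list of algorithms, so index -1 appends the algorithm to the last step. The step contents below are assumptions for illustration, not the actual pipeline layout.

from typing import Any, List

# Hypothetical layout: earlier steps prepare the model, and the final
# step holds quantization plus anything that must run right after it.
# Placeholder strings stand in for algorithm instances.
pipeline_steps: List[List[Any]] = [
    ["SmoothQuant"],
    ["ChannelAlignment"],
    ["MinMaxQuantization"],
]

# pipeline_steps[-1] is the final step, so the appended algorithm runs
# after everything already scheduled in that step.
pipeline_steps[-1].append("FastBiasCorrection")
assert pipeline_steps[-1] == ["MinMaxQuantization", "FastBiasCorrection"]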

PipelineStep = List[Algorithm]


def get_statistic_points(pipeline_step: PipelineStep, model: TModel, graph: NNCFGraph) -> StatisticPointsContainer:
A Collaborator commented:
Why not keep these methods in the StepwisePipeline class?
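
A minimal sketch of what such a free function could do, assuming each algorithm exposes its own get_statistic_points. The container and protocol below are simplified stand-ins; the real NNCF signatures may differ.

from typing import Any, Dict, List, Protocol


class StatisticPointsContainer(dict):
    """Toy stand-in: maps target node names to lists of statistic points."""

    def add_statistic_point(self, name: str, point: Any) -> None:
        self.setdefault(name, []).append(point)


class SupportsStatistics(Protocol):
    def get_statistic_points(self, model: Any, graph: Any) -> Dict[str, List[Any]]:
        ...


def get_statistic_points(
    pipeline_step: List[SupportsStatistics], model: Any, graph: Any
) -> StatisticPointsContainer:
    # Merge the statistic points requested by every algorithm in the step,
    # so the whole step needs only one statistics-collection pass.
    container = StatisticPointsContainer()
    for algorithm in pipeline_step:
        for name, points in algorithm.get_statistic_points(model, graph).items():
            for point in points:
                container.add_statistic_point(name, point)
    return container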

# Run current pipeline step
step_model = run_pipeline_step(pipeline_step, step_statistics, step_model, step_graph)

step_graph = None # We should rebuild the graph for the next pipeline step
A Collaborator commented:
Why can't we drop it and rebuild NNCFGraph in each step? Any optimizations here?
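
A hedged reading of the optimization being asked about: setting step_graph to None makes the next iteration rebuild the graph only when it actually runs, while the very first step can reuse a graph the caller already built. A toy sketch of that lazy-rebuild pattern, with illustrative names:

from typing import Any, Callable, List, Optional


def run_steps(
    model: Any,
    steps: List[Callable[[Any, Any], Any]],
    create_graph: Callable[[Any], Any],
    initial_graph: Optional[Any] = None,
) -> Any:
    step_model = model
    step_graph = initial_graph  # reuse the caller's graph for the first step
    for step in steps:
        if step_graph is None:
            # Build a graph only when we don't already have one that
            # matches the current model.
            step_graph = create_graph(step_model)
        step_model = step(step_model, step_graph)
        # The step may have transformed the model, so the cached graph is
        # stale; drop it and rebuild lazily on the next iteration.
        step_graph = None
    return step_model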

Comment on lines 126 to 129
for algorithm in pipeline_step[:-1]:
    current_model = algorithm.apply(current_model, current_graph, pipeline_step_statistics)
    current_graph = NNCFGraphFactory.create(current_model)
current_model = pipeline_step[-1].apply(current_model, current_graph, pipeline_step_statistics)
A Collaborator commented:
I'm not sure that identifying major algorithms such as MinMax quantization by their position is a good idea. We would always have to keep in mind the position of every algorithm, even outside of PostTrainingQuantization.

-algo = PostTrainingQuantization(target_device=target_device)
-min_max_algo = algo.algorithms[0]
+pipelines = PostTrainingQuantization(target_device=target_device)
+min_max_algo = pipelines.pipeline_steps[-1][0]
A Collaborator commented:
This hardcoded index doesn't explain clearly how the MinMax algorithm should be retrieved.
It is a good example of the increased pipeline complexity and of the need for MinMax as an independent unit.
cc @alexsu52
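
As a hedged illustration of the reviewers' point, one alternative to the hardcoded index is to look the algorithm up by type instead of by position. find_algorithm is a hypothetical helper, not part of this PR, and MinMaxQuantization below is a stand-in for the real class.

from typing import Any, List, Optional, Type


class MinMaxQuantization:  # stand-in for the real NNCF algorithm class
    pass


def find_algorithm(
    pipeline_steps: List[List[Any]], algorithm_cls: Type[Any]
) -> Optional[Any]:
    # Search every step so callers don't need to know where the
    # algorithm happens to sit in the pipeline.
    for step in pipeline_steps:
        for algorithm in step:
            if isinstance(algorithm, algorithm_cls):
                return algorithm
    return None


pipeline_steps = [["SmoothQuant"], [MinMaxQuantization()]]
min_max_algo = find_algorithm(pipeline_steps, MinMaxQuantization)
assert isinstance(min_max_algo, MinMaxQuantization)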

@andrey-churkin (Contributor, Author) commented:

run openvino pre-commit tests

@@ -175,7 +180,7 @@ def find_best_combination(
     return best_combination_key
 
 
-class HyperparameterTuner:
+class HyperparameterTuner(Pipeline):
A Contributor commented:
I couldn't find which interface I should implement to support my algorithm with the HyperparameterTuner algorithm.
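
A hedged sketch of the kind of contract the reviewer is asking to see documented: presumably the tuner needs a way to build a pipeline for each parameter combination and then run it. Everything below is an assumption for illustration, not the interface this PR actually defines.

from abc import ABC, abstractmethod
from typing import Any, Dict


class TunablePipeline(ABC):
    """Hypothetical contract for pipelines usable with HyperparameterTuner."""

    @abstractmethod
    def run(self, model: Any, dataset: Any) -> Any:
        """Apply the pipeline and return the transformed model."""


def create_pipeline(params: Dict[str, Any]) -> TunablePipeline:
    # The tuner would call a factory like this for every point in its
    # parameter grid, then score each resulting model with validation_fn.
    raise NotImplementedError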

@andrey-churkin changed the title from "Add Pipeline and StepwisePipeline classes" to "Add Pipeline class" Sep 21, 2023