
Conversation


@justinchuby justinchuby commented Jan 15, 2026

This pull request refactors how operator schemas are handled throughout the autocast, converter, and evaluator modules. The main change is replacing direct usage of OpSchema with a new _schemas.OpSignature abstraction, leading to more consistent and modular code when dealing with operator signatures, especially for input casting and evaluation. Several related methods are renamed and refactored for clarity and encapsulation.

Important changes

  • The Evaluator interface now defines eval_op for evaluating ONNX ops. The old eval method was removed in favor of the more flexible eval_op; the exporter's eval continues to work through compatible logic in the Op class (see the sketch after this list).
  • The op_schema property was removed from Function objects.
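
A minimal sketch of the new entry point, assuming illustrative parameter names and types (the actual onnxscript signatures may differ):

```python
import abc
from typing import Any, Mapping, Sequence


class Evaluator(abc.ABC):
    """Illustrative skeleton of the evaluator interface, not the real class."""

    @abc.abstractmethod
    def eval_op(
        self,
        op: Any,                    # the Op being evaluated (assumed type)
        args: Sequence[Any],        # positional inputs
        kwargs: Mapping[str, Any],  # attribute values
    ) -> Any:
        """Evaluate a single ONNX operator and return its output(s)."""


class Op:
    """Compatibility shim: `eval` keeps working by delegating to `eval_op`."""

    def eval(self, evaluator: Evaluator, *args: Any, **kwargs: Any) -> Any:
        return evaluator.eval_op(self, args, kwargs)
```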

Operator signature abstraction and autocast refactor:

  • Replaced usage of OpSchema with _schemas.OpSignature in onnxscript/_internal/autocast.py, updating all relevant function signatures and internal logic to use the new abstraction. This includes changing how input parameters are filtered and type constraints are accessed.
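
For illustration only, a hedged sketch of input casting driven by an OpSignature; the fields used here (params, is_input, type_constraint) and the dtype choice are assumptions based on the description, not the exact _schemas API:

```python
from typing import Any, Sequence

import numpy as np


def static_cast_inputs(signature: Any, args: Sequence[Any]) -> list[Any]:
    """Cast bare Python constants in `args` guided by the op signature."""
    # Keep only declared inputs; attributes are filtered out elsewhere.
    input_params = [p for p in signature.params if p.is_input]
    cast_args: list[Any] = []
    for param, arg in zip(input_params, args):
        if isinstance(arg, (int, float)) and param.type_constraint is not None:
            # Promote the Python constant to a tensor with a dtype that the
            # parameter's type constraint allows (simplified heuristic).
            dtype = np.float32 if isinstance(arg, float) else np.int64
            arg = np.array(arg, dtype=dtype)
        cast_args.append(arg)
    return cast_args
```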

AST Converter integration:

  • Updated the converter (onnxscript/_internal/converter.py) to pass op_signature instead of op_schema to autocast functions, ensuring compatibility with the new signature abstraction.

Evaluator refactor and encapsulation:

  • Refactored the evaluator (onnxscript/_internal/evaluator.py) to use _adapt_inputs, _adapt_attributes, and _adapt_outputs methods, encapsulating the logic for adapting inputs, attributes, and outputs, and removing unused or redundant methods. Operator signatures are now consistently derived from OpSchema via _schemas.OpSignature.
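
A rough sketch of the adapt-then-run flow this enables; apart from the _adapt_inputs, _adapt_attributes, and _adapt_outputs names, everything here (including the from_opschema constructor) is an assumption for illustration:

```python
from typing import Any, Mapping, Sequence


class SketchEvaluator:
    """Illustrates the flow; hooks raise until a concrete evaluator overrides them."""

    def eval_op(self, op: Any, args: Sequence[Any], kwargs: Mapping[str, Any]) -> Any:
        signature = self._signature_for(op)                     # e.g. OpSignature.from_opschema(...)
        inputs = self._adapt_inputs(signature, args)            # cast / wrap tensor inputs
        attributes = self._adapt_attributes(signature, kwargs)  # build attribute objects
        outputs = self._run(op, inputs, attributes)             # backend-specific execution
        return self._adapt_outputs(op, outputs)                 # unwrap single outputs, etc.

    def _signature_for(self, op: Any) -> Any:
        raise NotImplementedError

    def _adapt_inputs(self, signature: Any, args: Sequence[Any]) -> Sequence[Any]:
        raise NotImplementedError

    def _adapt_attributes(self, signature: Any, kwargs: Mapping[str, Any]) -> Mapping[str, Any]:
        raise NotImplementedError

    def _run(self, op: Any, inputs: Sequence[Any], attributes: Mapping[str, Any]) -> Any:
        raise NotImplementedError

    def _adapt_outputs(self, op: Any, outputs: Any) -> Any:
        raise NotImplementedError
```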

Additionally:

  • Cleaned up typing-annotation utilities
  • IR attributes are now supplied directly when creating attributes, avoiding proto serialization and the loss of subgraph references (see the sketch below)
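
A hedged sketch of supplying IR attributes directly, assuming the onnx_ir Attr and Graph constructors behave as shown; a GRAPH-typed attribute holds the live ir.Graph object, so values referenced from the subgraph are preserved without a proto round-trip:

```python
import onnx_ir as ir

# Plain attribute built as an IR object rather than an AttributeProto.
axis = ir.Attr("axis", ir.AttributeType.INT, 0)

# Graph-valued attribute: the subgraph stays a live ir.Graph, so references
# to outer-scope values are not lost to serialization.
body = ir.Graph(inputs=[], outputs=[], nodes=[], name="then_branch")
then_branch = ir.Attr("then_branch", ir.AttributeType.GRAPH, body)
assert then_branch.value is body  # same object, no proto round-trip
```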


codecov bot commented Jan 15, 2026

❌ 2 Tests Failed:

Tests completed: 13671 | Failed: 2 | Passed: 13669 | Skipped: 1424
View the full list of 2 ❄️ flaky test(s)
onnxscript.tools.transformers_models.phi_test.TestExportPhi::test_phi_export_cpu

Flake rate in main: 96.35% (Passed 19 times, Failed 502 times)

Stack Traces | 19.5s run time
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:250: in call
    pass_result = pass_(model)
                  ^^^^^^^^^^^^
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:127: in __call__
    result = self.call(model)
             ^^^^^^^^^^^^^^^^
onnxscript/version_converter/__init__.py:77: in call
    raise ValueError(
E   ValueError: The model contains functions. The version conversion pass does not support functions. Please use `common_passes.InlinePass` to inline the functions before applying this pass (_ConvertVersionPassRequiresInline).

The above exception was the direct cause of the following exception:
onnxscript/_internal/version_utils.py:94: in call_f
    return fct(self)
           ^^^^^^^^^
.../tools/transformers_models/phi_test.py:30: in test_phi_export_cpu
    proto = onnxscript.tools.transformers_models.export_to_onnx(model, *input_tensors)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../tools/transformers_models/__init__.py:41: in export_to_onnx
    prog = torch.onnx.export(model, args, dynamo=True)  # pylint: disable=no-value-for-parameter
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
..../test_onnx_ir_git/lib/python3.11.../torch/onnx/__init__.py:364: in export
    return _compat.export_compat(
..../test_onnx_ir_git/lib/python3.11.../_internal/exporter/_compat.py:180: in export_compat
    onnx_program.model = onnxscript_apis.convert_version(
onnxscript/_framework_apis/torch_2_6.py:43: in convert_version
    version_converter.convert_version(model, target_version, fallback=True)
onnxscript/version_converter/__init__.py:173: in convert_version
    ConvertVersionPass(target_version=target_version, fallback=fallback)(model)
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:127: in __call__
    result = self.call(model)
             ^^^^^^^^^^^^^^^^
onnxscript/version_converter/__init__.py:52: in call
    return self.convert_pass(model)
           ^^^^^^^^^^^^^^^^^^^^^^^^
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:127: in __call__
    result = self.call(model)
             ^^^^^^^^^^^^^^^^
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:253: in call
    raise PassError(
E   onnx_ir.passes.PassError: An error occurred when running the '<onnxscript.version_converter._ConvertVersionPassRequiresInline object at 0x112190bd0>' pass after the following passes: ['<onnx_ir.passes.common.inliner.InlinePass object at 0x112115450>']
onnxscript.tools.transformers_models.phi_test.TestExportPhi::test_phi_export_cpu_export_api

Flake rate in main: 96.35% (Passed 19 times, Failed 502 times)

Stack Traces | 13.1s run time
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:250: in call
    pass_result = pass_(model)
                  ^^^^^^^^^^^^
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:127: in __call__
    result = self.call(model)
             ^^^^^^^^^^^^^^^^
onnxscript/version_converter/__init__.py:77: in call
    raise ValueError(
E   ValueError: The model contains functions. The version conversion pass does not support functions. Please use `common_passes.InlinePass` to inline the functions before applying this pass (_ConvertVersionPassRequiresInline).

The above exception was the direct cause of the following exception:
onnxscript/_internal/version_utils.py:94: in call_f
    return fct(self)
           ^^^^^^^^^
.../tools/transformers_models/phi_test.py:48: in test_phi_export_cpu_export_api
    proto = onnxscript.tools.transformers_models.export_to_onnx(
.../tools/transformers_models/__init__.py:41: in export_to_onnx
    prog = torch.onnx.export(model, args, dynamo=True)  # pylint: disable=no-value-for-parameter
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
..../test_onnx_ir_git/lib/python3.11.../torch/onnx/__init__.py:364: in export
    return _compat.export_compat(
..../test_onnx_ir_git/lib/python3.11.../_internal/exporter/_compat.py:180: in export_compat
    onnx_program.model = onnxscript_apis.convert_version(
onnxscript/_framework_apis/torch_2_6.py:43: in convert_version
    version_converter.convert_version(model, target_version, fallback=True)
onnxscript/version_converter/__init__.py:173: in convert_version
    ConvertVersionPass(target_version=target_version, fallback=fallback)(model)
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:127: in __call__
    result = self.call(model)
             ^^^^^^^^^^^^^^^^
onnxscript/version_converter/__init__.py:52: in call
    return self.convert_pass(model)
           ^^^^^^^^^^^^^^^^^^^^^^^^
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:127: in __call__
    result = self.call(model)
             ^^^^^^^^^^^^^^^^
..../test_onnx_ir_git/lib/python3.11.../onnx_ir/passes/_pass_infra.py:253: in call
    raise PassError(
E   onnx_ir.passes.PassError: An error occurred when running the '<onnxscript.version_converter._ConvertVersionPassRequiresInline object at 0x112b18bd0>' pass after the following passes: ['<onnx_ir.passes.common.inliner.InlinePass object at 0x112910510>']


Contributor

Copilot AI left a comment


Pull request overview

This PR refactors how operator schemas are handled by replacing direct usage of onnx.defs.OpSchema with a new _schemas.OpSignature abstraction throughout the autocast, converter, and evaluator modules. This change improves modularity and encapsulation when dealing with operator signatures.

Changes:

  • Replaced OpSchema with OpSignature in autocast functions for input type casting
  • Updated converter to pass op_signature instead of op_schema to autocast functions
  • Refactored evaluator methods to be private (_adapt_inputs, _adapt_attributes, _adapt_outputs) and removed the use_graph_attribute method along with its associated conditional logic

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.

File descriptions:
  • onnxscript/_internal/autocast.py: Updated to use OpSignature instead of OpSchema, including parameter filtering and type constraint access changes
  • onnxscript/_internal/converter.py: Changed to pass op_signature instead of op_schema to autocast functions
  • onnxscript/_internal/evaluator.py: Made adapter methods private, removed the use_graph_attribute method, and simplified _adapt_attributes to always use graph attributes
Comments suppressed due to low confidence (1)

onnxscript/_internal/evaluator.py:548

  • The use_graph_attribute method in ORTMixedEvaluator is now dead code since it's no longer called after removing the conditional logic in _adapt_attributes. This method should be removed from the ORTMixedEvaluator class.
    def use_graph_attribute(self, schema: onnx.defs.OpSchema) -> bool:
        return _schema_id(schema) not in self._python_ops

@justinchuby justinchuby marked this pull request as draft January 16, 2026 01:24
@justinchuby justinchuby marked this pull request as ready for review January 16, 2026 19:44
Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 8 out of 8 changed files in this pull request and generated 2 comments.

Comments suppressed due to low confidence (1)

onnxscript/_internal/values.py:364

  • The docstring still mentions a 'functions' parameter that was removed from the method signature. This outdated documentation should be removed.
            functions: A list of functions to include in the model.
                By default, all functions called at least once are included.

@justinchuby justinchuby marked this pull request as ready for review January 23, 2026 02:49
Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 13 out of 13 changed files in this pull request and generated 4 comments.

)

# Duplicate the graph to create the model
main_graph = self.function_ir.graph.clone()
Collaborator

I wonder if this is necessary. I guess this is being used to handle the functions down below better. But, in practice, I don't think I have seen uses where functions are passed in, but we do call to_model_proto commonly.

Aren't we making the common case more expensive (cloning the graph before serializing it instead of just serializing it) in order to handle a usage I have not seen?

I guess two other options are: (a) handle the special case where no functions are specified more efficiently, or (b) break compatibility and even drop the functions parameter or design a better API.

Collaborator Author

to_model_proto is called very rarely, so I don't think this will be costly. I cloned because the graph was subsequently modified (with updated opset imports etc.)

I do agree we should do (b). But that can be independent of copying.
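
As a plain-Python illustration of the clone-then-mutate rationale above (not the onnxscript code):

```python
import copy

# Mutate a deep copy so the cached object that produced it stays pristine.
cached = {"opset_imports": {"": 17}}
working = copy.deepcopy(cached)
working["opset_imports"][""] = 18       # update only the copy
assert cached["opset_imports"][""] == 17
```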

Collaborator

I call to_model_proto all the time ... admittedly that is not production code, but creating test models for various testing purposes. But I think it is better to get the API right before trying to fix any internal implementation issue, especially if it is going to affect the implementation choice. In this case, the API change can make the copy unnecessary. (If and when we expand the scope of onnxscript to define entire models, I think conversion to Model (proto or IR) will become more important.)

Collaborator Author

Still, copying the IR object is cheap. I don't think it will make any material runtime difference, but the duplication makes the process more robust.

Collaborator Author

Removed clone and the functions parameter.

justinchuby and others added 5 commits January 22, 2026 22:49
@justinchuby
Collaborator Author

@gramalingam all tests passed
