
[AutoBump] Merge with b790061b (Sep 05) (39) #364

Open · wants to merge 4 commits into base: bump_to_b3942ff9
Commits on Sep 3, 2024

  1. [fximporter] Avoid importing from _torchMlir (llvm#3685)

    Downstream projects don't necessarily register this C++ module. This
    patch removes the dependency and uses `torch.iinfo` to access the max
    and min values instead. (A short sketch of the `torch.iinfo` approach
    follows this commit list.)
    zjgarvey authored Sep 3, 2024 · commit 2960538
  2. Add a canonicalization pattern for aten.unflatten.int (llvm#3656)

    Addresses an issue in <llvm#3651>
    where some unflatten ops generated from onnx models weren't propagating
    static shape information. It may be necessary to add further
    optimizations for the more general case when some static information is
    present in the unflatten (or possibly reshape/view) op's `sizes` list,
    but not reflected in the output shape. These ops will only successfully
    infer shapes if the `sizes` list comes from a list of constant ints
    (with possibly one -1). A common example where this fails is when some
    of the `sizes` are determined from `aten.size.int` ops on dynamic
    tensors, and other `sizes` are known statically.

    This PR includes:
    - a canonicalizer for `aten.unflatten.int` which converts to
    `aten.unsqueeze` when it is expanding one dim into two and one of the
    new dims is statically 1 (see the second sketch after this commit list);
    - an improvement to the folder for `aten.__or__.bool` which does not
    rely on *both* operands being static.
    zjgarvey authored Sep 3, 2024 · commit 295bf41
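A minimal Python sketch of the `torch.iinfo` approach the first commit describes: integer dtype bounds are read from `torch.iinfo` instead of from a compiled `_torchMlir` helper, so downstream projects don't need to register any C++ module. The `dtype_bounds` helper is hypothetical and only illustrates the pattern; it is not code from the commit.

```python
import torch

# Hypothetical helper illustrating the pattern from llvm#3685: read integer
# bounds from torch.iinfo rather than querying a compiled C++ extension that
# downstream projects may not have registered.
def dtype_bounds(dtype: torch.dtype) -> tuple[int, int]:
    """Return the (min, max) representable values for an integer dtype."""
    info = torch.iinfo(dtype)
    return info.min, info.max

print(dtype_bounds(torch.int8))   # (-128, 127)
print(dtype_bounds(torch.int64))  # (-9223372036854775808, 9223372036854775807)
```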
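And a small Python illustration (not the actual C++ canonicalizer added in llvm#3656) of why the second commit's rewrite preserves semantics: when `unflatten` expands one dimension into two and one of the new sizes is statically 1, the result is the same tensor an `unsqueeze` would produce.

```python
import torch

x = torch.randn(4, 6, 5)

# Expanding dim 1 (size 6) into (1, 6) inserts a size-1 axis before dim 1,
# which is exactly unsqueeze(1).
assert torch.equal(x.unflatten(1, (1, 6)), x.unsqueeze(1))

# Expanding dim 1 into (6, 1) puts the size-1 axis after dim 1, i.e. unsqueeze(2).
assert torch.equal(x.unflatten(1, (6, 1)), x.unsqueeze(2))

print(x.unflatten(1, (1, 6)).shape)  # torch.Size([4, 1, 6, 5])
```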

Commits on Sep 5, 2024

  1. commit b790061

Commits on Sep 23, 2024

  1. commit d954130