
[wip] Add cosine similarity #157

Draft: wants to merge 16 commits into base `main`
Conversation

jandremarais (Contributor):

#136

- [x] add norm functions
- [ ] add cosine similarity function

@jandremarais (Contributor, Author):

@edgarriba I want to give the cosine similarity another shot. I started with the norm functions and thought I'd ask for an interim review before going further.
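For context, cosine similarity is the dot product of two vectors divided by the product of their L2 norms, which is why the norm functions come first. A minimal free-standing sketch (plain `f32` slices stand in for kornia's `Tensor`; all function names here are hypothetical, not the PR's API):

```rust
// Sketch only: flat slices instead of kornia's Tensor type.

fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn l2_norm(a: &[f32]) -> f32 {
    dot(a, a).sqrt()
}

// cos(a, b) = <a, b> / (||a|| * ||b||)
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    dot(a, b) / (l2_norm(a) * l2_norm(b))
}

fn main() {
    let a = [1.0, 0.0];
    let b = [0.0, 1.0];
    // Parallel vectors have similarity 1, orthogonal vectors 0.
    assert!((cosine_similarity(&a, &a) - 1.0).abs() < 1e-6);
    assert!(cosine_similarity(&a, &b).abs() < 1e-6);
}
```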

/// # Returns
///
/// A new tensor with the data raised to the power.
pub fn powf(&self, n: T) -> Tensor<T, N, A>
edgarriba (Member):

The point of having kornia-core-ops was to start reducing the number of operators in the tensor definition. Does it make sense to have a TensorOps trait that implements all of these map-based operators?
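The map-based idea suggested here could look roughly as follows; this is a sketch with a `Vec`-backed stand-in type, not kornia's actual `Tensor` or the trait that landed in the PR:

```rust
// Stand-in for kornia's Tensor; the real type is generic over dtype,
// rank, and allocator.
struct Tensor {
    data: Vec<f32>,
}

trait TensorOps {
    // One primitive the trait requires from each implementor.
    fn map<F: Fn(f32) -> f32>(&self, f: F) -> Tensor;

    // Elementwise operators then become one-line default methods.
    fn powf(&self, n: f32) -> Tensor {
        self.map(|x| x.powf(n))
    }

    fn abs(&self) -> Tensor {
        self.map(|x| x.abs())
    }
}

impl TensorOps for Tensor {
    fn map<F: Fn(f32) -> f32>(&self, f: F) -> Tensor {
        Tensor {
            data: self.data.iter().map(|&x| f(x)).collect(),
        }
    }
}

fn main() {
    let t = Tensor { data: vec![1.0, -2.0, 3.0] };
    assert_eq!(t.powf(2.0).data, vec![1.0, 4.0, 9.0]);
    assert_eq!(t.abs().data, vec![1.0, 2.0, 3.0]);
}
```

The design payoff is that new elementwise operators only add default methods to the trait, keeping the tensor definition itself small.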

jandremarais (Contributor, Author):

I understand. I was unsure how you would like to go about moving the old operators over, and since powf was so close to the existing operators, I thought I would rather put it with them. I can move them over to a TensorOps trait if we are ready for that.

edgarriba (Member):

Yes, we can do it step by step and iterate on the API.

@jandremarais (Contributor, Author):

This is what it looks like wrapped in a TensorOps trait. Open to feedback before moving the older ops over.

@edgarriba (Member) left a comment:

Looking good!

///
/// ```
/// use kornia_core::{Tensor, CpuAllocator};
/// use kornia_core_ops::ops::TensorOps;
edgarriba (Member):

Suggested change:
- /// use kornia_core_ops::ops::TensorOps;
+ /// use kornia_core_ops::TensorOps;

Maybe we can expose it at the crate root?

T: std::ops::Add<Output = T> + Float,
{
let p_inv = T::one() / p;
Ok(self.powf(p).sum_elements(dim)?.powf(p_inv))
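The line above computes the lp-norm as (Σ xᵢᵖ)^(1/p). A minimal sketch of the same reduction on a flat slice; note that this sketch takes |xᵢ| before raising to p, which the snippet above does not (for negative entries and odd p the two differ, so this is an assumption about intent, not a restatement of the PR's code):

```rust
// lp_norm(x, p) = (sum |x_i|^p)^(1/p), on a flat slice instead of a Tensor.
fn lp_norm(data: &[f32], p: f32) -> f32 {
    let p_inv = 1.0 / p;
    data.iter()
        .map(|x| x.abs().powf(p))
        .sum::<f32>()
        .powf(p_inv)
}

fn main() {
    // p = 2 recovers the Euclidean norm: sqrt(3^2 + 4^2) = 5.
    assert!((lp_norm(&[3.0, 4.0], 2.0) - 5.0).abs() < 1e-5);
    // p = 1 is the sum of absolute values.
    assert!((lp_norm(&[1.0, -2.0, 3.0], 1.0) - 6.0).abs() < 1e-5);
}
```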
edgarriba (Member):

Should we provide a reduction argument?

jandremarais (Contributor, Author):

Do you mean to drop the dimension that the operation was run over?
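To make the question concrete, here is a hypothetical illustration of reducing a row-major (rows × cols) buffer over either dimension, dropping the reduced dimension from the result. Shapes and names are illustrative only, not kornia's `sum_elements` API:

```rust
// Sum a row-major rows x cols buffer over `dim`, dropping that dimension.
fn sum_dim(data: &[f32], rows: usize, cols: usize, dim: usize) -> Vec<f32> {
    match dim {
        // Reduce over rows: result has `cols` entries.
        0 => (0..cols)
            .map(|c| (0..rows).map(|r| data[r * cols + c]).sum())
            .collect(),
        // Reduce over cols: result has `rows` entries.
        _ => (0..rows)
            .map(|r| (0..cols).map(|c| data[r * cols + c]).sum())
            .collect(),
    }
}

fn main() {
    let m = [1.0, 2.0, 3.0, 4.0]; // 2x2: [[1, 2], [3, 4]]
    assert_eq!(sum_dim(&m, 2, 2, 0), vec![4.0, 6.0]); // column sums
    assert_eq!(sum_dim(&m, 2, 2, 1), vec![3.0, 7.0]); // row sums
}
```

A "keep the reduced dimension as size 1" variant (PyTorch's `keepdim`) is the usual alternative to dropping it.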

@edgarriba (Member):

@jandremarais consider some changes from #156

@edgarriba (Member):

@jandremarais any updates here?

2 participants