[wip] Add cosine similarity #157
base: main
Conversation
@edgarriba I want to give the cosine similarity another shot. I started with the norm functions and thought I'd ask for an interim review before I go further.
crates/kornia-core/src/tensor.rs
Outdated
```rust
    /// # Returns
    ///
    /// A new tensor with the data raised to the power.
    pub fn powf(&self, n: T) -> Tensor<T, N, A>
```
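For clarity on the element-wise semantics discussed below, here is a minimal plain-Rust illustration (no kornia types involved) of what raising each element to a power means:

```rust
fn main() {
    // Element-wise power on a plain Vec, mirroring what the documented
    // `powf` tensor method is described to do: raise every element to `n`.
    let data = vec![1.0_f32, 2.0, 3.0, 4.0];
    let n = 2.0_f32;
    let powered: Vec<f32> = data.iter().map(|x| x.powf(n)).collect();
    assert_eq!(powered, vec![1.0, 4.0, 9.0, 16.0]);
}
```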
The point of having kornia-core-ops was to start reducing the number of operators in the tensor definition. Does it make sense to have a `TensorOps` trait that implements all these map-based operators?
I understand. I was unsure how you would like to go about moving the old operators over, and the `powf` function was so close to the existing operators that I thought I would rather put it with them. I can move them over to a `TensorOps` trait if we are ready for that.
Yes, we can do it step by step and iterate on the API.
This is what it looks like wrapped in a `TensorOps` trait. Open to feedback before moving the older ops over.
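A rough sketch of what that wrapping might look like, purely for discussion: the `TensorAllocator` re-export path, the trait bounds, and the element-wise `map` helper used in the impl are assumptions based on this thread, not the final API.

```rust
use kornia_core::{Tensor, TensorAllocator}; // re-export paths assumed
use num_traits::Float;

/// Sketch of a trait collecting map-based operators outside the core
/// tensor definition, as discussed above (not the final API).
pub trait TensorOps<T, const N: usize, A: TensorAllocator> {
    /// Return a new tensor with every element raised to the power `n`.
    fn powf(&self, n: T) -> Tensor<T, N, A>
    where
        T: Float;
}

impl<T, const N: usize, A: TensorAllocator> TensorOps<T, N, A> for Tensor<T, N, A> {
    fn powf(&self, n: T) -> Tensor<T, N, A>
    where
        T: Float,
    {
        // Assumes an element-wise `map` helper on `Tensor`, as implied by the
        // "map-based operators" discussion; adjust to the real API as needed.
        self.map(|x| x.powf(n))
    }
}
```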
Looking good !
crates/kornia-core-ops/src/ops.rs
Outdated
```rust
    ///
    /// ```
    /// use kornia_core::{Tensor, CpuAllocator};
    /// use kornia_core_ops::ops::TensorOps;
```
Suggested change:

```diff
- /// use kornia_core_ops::ops::TensorOps;
+ /// use kornia_core_ops::TensorOps;
```
Maybe we can expose it at the crate root?
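For reference, exposing the trait at the crate root would be a one-line re-export in the crate's `lib.rs`; this is a sketch of the suggestion, not the merged code:

```rust
// crates/kornia-core-ops/src/lib.rs
pub mod ops;

// Re-export so callers can write `use kornia_core_ops::TensorOps;`
// instead of the longer `kornia_core_ops::ops::TensorOps` path.
pub use ops::TensorOps;
```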
```rust
        T: std::ops::Add<Output = T> + Float,
    {
        let p_inv = T::one() / p;
        Ok(self.powf(p).sum_elements(dim)?.powf(p_inv))
```
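For context, the quoted expression computes the p-norm along `dim`: raise to the power `p`, sum, then take the `1/p` power. A standalone plain-slice illustration of the same reduction (not the kornia API):

```rust
/// What the snippet above computes for one slice along `dim`:
/// raise each element to `p`, sum, then take the `1/p` power.
fn p_norm(xs: &[f32], p: f32) -> f32 {
    let sum: f32 = xs.iter().map(|x| x.powf(p)).sum();
    sum.powf(1.0 / p)
}

fn main() {
    // For p = 2 this is the Euclidean norm: sqrt(3^2 + 4^2) = 5.
    assert!((p_norm(&[3.0, 4.0], 2.0) - 5.0).abs() < 1e-6);
}
```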
Should we provide a reduction argument?
Do you mean to drop the dimension that the operation was run over?
@jandremarais consider some changes from #156
@jandremarais any updates here?
#136

- [x] add norm functions
- [ ] add cosine similarity function (see the sketch below)
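Since the cosine similarity function is the remaining item, here is a minimal plain-slice sketch of the formula the norm functions build toward, dot(a, b) / (‖a‖ · ‖b‖); this is an illustration only, not the kornia API:

```rust
/// Cosine similarity between two vectors: dot(a, b) / (||a|| * ||b||).
/// Plain-slice illustration of the checklist item above; the actual kornia
/// implementation would operate on tensors along a chosen dimension.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    assert_eq!(a.len(), b.len());
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm_a * norm_b)
}

fn main() {
    // Parallel vectors have similarity 1.0; orthogonal vectors 0.0.
    assert!((cosine_similarity(&[1.0, 0.0], &[2.0, 0.0]) - 1.0).abs() < 1e-6);
    assert!(cosine_similarity(&[1.0, 0.0], &[0.0, 1.0]).abs() < 1e-6);
}
```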