
build Python wheel #521

Closed
wants to merge 15 commits into from
Conversation

pca006132
Collaborator

Attempt at #362.

Now the problematic part is Windows...

@codecov

codecov bot commented Aug 4, 2023

Codecov Report

Patch coverage: 100.00% and no project coverage change.

Comparison is base (1514c15) 91.50% compared to head (28e085c) 91.50%.

Additional details and impacted files
@@           Coverage Diff           @@
##           master     #521   +/-   ##
=======================================
  Coverage   91.50%   91.50%           
=======================================
  Files          35       35           
  Lines        4521     4521           
=======================================
  Hits         4137     4137           
  Misses        384      384           
Files Changed                              Coverage Δ
src/manifold/src/impl.cpp                  96.44% <ø> (ø)
src/manifold/src/properties.cpp            83.66% <ø> (ø)
src/manifold/src/sort.cpp                  90.58% <ø> (ø)
src/cross_section/src/cross_section.cpp    79.94% <100.00%> (ø)


@pca006132 pca006132 force-pushed the python-wheel branch 2 times, most recently from 0e979d0 to 2f63a62 Compare August 12, 2023 11:23
@pca006132
Collaborator Author

@elalish maybe you can have a look at this later? Both Windows and macOS have trouble finding the installed package. You can use setup.py to build and inspect the generated wheel file.

@elalish
Owner

elalish commented Aug 12, 2023

Could it be this?

@pca006132
Collaborator Author

Could it be this?

I don't think so; we only have a dynamic library file in there, and there should be no Python files.

@elalish
Owner

elalish commented Aug 13, 2023

I don't know; it seems like the package installation didn't go well. Did it work on Linux? Did you ever look into scikit-build: #362 (comment)? It seems to be designed specifically to streamline our type of build process. A wild guess says that if it were easy, they wouldn't have bothered making a package for it...

@pca006132
Collaborator Author

No, I haven't tried scikit-build, but I tried another CMake extension for setuptools (https://github.com/diegoferigo/cmake-build-extension) without success.

@elalish
Owner

elalish commented Aug 14, 2023

Did you see this, regarding the mac issue?

@pca006132
Collaborator Author

Did you see this, regarding the mac issue?

Yeah, I saw that, but we are not calling cibuildwheel from a Python script, so this should not be an issue; we are running the tests from within cibuildwheel (so the virtualenv should be the same).

@pca006132
Collaborator Author

pca006132 commented Aug 14, 2023

I guess I should try scikit-build later.

@pca006132
Collaborator Author

OK, it seems scikit-build works.
I think we should merge #533 first.
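For reference, a scikit-build-based setup typically boils down to a small pyproject.toml; the sketch below is illustrative only (the backend shown is scikit-build-core, and the package name and requirements are assumptions, not taken from this PR):

```toml
# pyproject.toml -- minimal sketch of a scikit-build-core + pybind11 build
# (illustrative; the package name and pins are assumptions, not this PR's config)
[build-system]
requires = ["scikit-build-core", "pybind11"]
build-backend = "scikit_build_core.build"

[project]
name = "manifold3d"  # hypothetical name for illustration
version = "0.0.1"
```

The appeal is that the backend drives CMake itself, so the CMake build and the wheel layout stay in one place instead of being glued together in setup.py.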

@pca006132
Collaborator Author

@elalish https://cibuildwheel.readthedocs.io/en/stable/deliver-to-pypi/

We probably want to run the wheel-build actions only on tag/release as well, since they are too time-consuming.

@pca006132
Collaborator Author

The cibuildwheel errors here are weird... I will have a look at them later.

@Eric-Vin
Contributor

Thanks so much for your work on this; it would be great to use this tool from Python! The distutils error on 3.12 is probably because of this: https://peps.python.org/pep-0632/ (distutils was removed in Python 3.12). I've managed to work around it by specifying the following numpy versions:

  'numpy; python_version<"3.12.0rc1"',
  'numpy>=1.26.0b1; python_version>="3.12.0rc1"',

in the following PR for a different library: BerkeleyAutomation/python-fcl#70
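For context, such PEP 508 environment markers would typically live in the project's dependency list; a hedged sketch of where they could go (layout assumed, not copied from python-fcl#70):

```toml
# pyproject.toml -- sketch of where these markers could go (assumed layout)
[project]
dependencies = [
    'numpy; python_version<"3.12.0rc1"',
    'numpy>=1.26.0b1; python_version>="3.12.0rc1"',
]
```

The marker selects a numpy new enough to have dropped its distutils dependency when installing under Python 3.12 pre-releases and later.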

@pca006132
Collaborator Author

'numpy; python_version<"3.12.0rc1"',

Thanks a lot!

@pca006132
Collaborator Author

I found that in order to make the Python source distribution work, we have to put everything in the top-level directory.

@pca006132
Collaborator Author

Oh, and we can switch to the `build` package as well, which verifies the sdist for us.

@pca006132
Collaborator Author

Ah, I have no idea why the macOS build keeps running for 2 hours. I guess it is stuck somewhere.

@pca006132
Collaborator Author

I am not entirely sure whether my current way of using a MANIFEST.in file to specify source files is idiomatic. It seems to be intended mainly for data files, but it works.
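For readers unfamiliar with it, MANIFEST.in controls what setuptools includes in the sdist via commands like `graft`, `include`, and `global-exclude`; a sketch with placeholder paths (not this PR's actual file):

```
# MANIFEST.in -- sketch only; the paths are placeholders, not this PR's file
graft src
include CMakeLists.txt
include LICENSE
global-exclude *.pyc
```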

@@ -291,7 +291,7 @@ std::shared_ptr<const PathImpl> CrossSection::GetPaths() const {
if (transform_ == glm::mat3x2(1.0f)) {
return paths_;
}
paths_ = shared_paths(transform(paths_->paths_, transform_));
paths_ = shared_paths(::transform(paths_->paths_, transform_));
Owner

What does :: do?

Collaborator Author

It searches for the `transform` function defined in the anonymous namespace, i.e. at global scope. By default, unqualified lookup finds the one in the current namespace, i.e. the `transform` function defined in `par`.
I found this error when I was experimenting with precompiled headers (a minor compile-time improvement).

@@ -12,10 +12,6 @@
// See the License for the specific language governing permissions and
// limitations under the License.

#include <thrust/count.h>
#include <thrust/logical.h>
#include <thrust/transform_reduce.h>
Owner

👍

auto m4 = Manifold::Cube({4, 1, 1}).Transform(
glm::translate(glm::mat4(1.0f), glm::vec3(3.0f, 0, 0)));
glm::mat4x3(glm::translate(glm::mat4(1.0f), glm::vec3(3.0f, 0, 0))));
Owner

Is there a reason we're using Transform instead of Translate directly?

Collaborator Author

Yeah, I found that we actually set GLM_FORCE_EXPLICIT_CTOR in public.h, and compiling with a precompiled header then causes errors. I think something related to include order lets it pass compilation in normal builds, but it errors when I tried using a precompiled header.

uses: pypa/[email protected]
- uses: actions/upload-artifact@v3
with:
path: ./wheelhouse/*.whl
Owner

So, we upload the artifact to the wheelhouse, but what do we do with it? Should we add a release action? Should we try TestPyPI first?
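One common shape for this, sketched with the official PyPA publish action pointed at TestPyPI first (illustrative only; this is not a workflow from this PR, and the artifact paths are assumptions):

```yaml
# Sketch of a release-gated publish job (illustrative, not from this PR)
on:
  release:
    types: [published]

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v3
        with:
          path: dist
      - uses: pypa/gh-action-pypi-publish@release/v1
        with:
          repository-url: https://test.pypi.org/legacy/  # try TestPyPI first
```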

@elalish
Copy link
Owner

elalish commented Aug 24, 2023

Excellent work! I can't wait to see this published as a proper Python package!

@pca006132
Copy link
Collaborator Author

I also looked at nanobind, which says it can compile ABI3 wheels (while pybind11 explicitly says it will not support ABI3) and claims to be faster in both compile time and runtime.

The port is quite simple, and their API seems a bit more flexible than the one provided by pybind11, although a bit less ergonomic (it does not perform list <-> array conversion automatically). The build is also very fast.
The problem is that their ABI3 support only covers Python 3.12...

@elalish
Copy link
Owner

elalish commented Aug 24, 2023

I don't know Python well enough to have an opinion on ABI3 or pybind11 (what is ABI3?). I trust your judgement here. Not sure if @Eric-Vin has any thoughts to add?

@pca006132
Copy link
Collaborator Author

https://pyo3.rs/v0.14.5/building_and_distribution#py_limited_apiabi3 has a good summary:

By default, Python extension modules can only be used with the same Python version they were compiled against. For example, an extension module built for Python 3.5 can't be imported in Python 3.8. PEP 384 introduced the idea of the limited Python API, which would have a stable ABI enabling extension modules built with it to be used against multiple Python versions. This is also known as abi3.
The advantage of building extension modules using the limited Python API is that package vendors only need to build and distribute a single copy (for each OS / architecture), and users can install it on all Python versions from the minimum version and up.

So basically the idea is forward compatibility: old builds can be used with newer Python versions without recompiling. As Python keeps evolving, we could then build only a limited set of wheels (currently we need to build one for every supported Python version, multiplied by every supported OS).

But the problem is that while nanobind can target abi3, it can only target abi3 on Python 3.12, which is not even released yet...
So it is more of a future benefit. For now, I think the main gain is just a slightly faster build: the pybind11 build takes quite some time to link, but nanobind can complete in about a second.
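For reference, with setuptools/wheel a limited-API build is opted into via the bdist_wheel `py_limited_api` option; a sketch (the cp312 tag matches nanobind's constraint discussed above, but this config is illustrative, not from this PR):

```ini
# setup.cfg -- sketch of opting into the stable ABI (illustrative)
[bdist_wheel]
py_limited_api = cp312
```

The resulting wheel is tagged e.g. cp312-abi3 instead of one tag per interpreter version, which is what cuts the build matrix down.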

@pca006132
Copy link
Collaborator Author

Another downside of nanobind is that it only supports Python 3.8+ (although 3.7 has already reached EOL).

@elalish
Copy link
Owner

elalish commented Aug 24, 2023

I have no way to judge how much work it would be to switch to nanobind, but if you think it's worthwhile, I'm fine with it. I'm not too concerned about the long builds, but short is nice. It's also funny that it says "By default Python extension modules can only be used with the same Python version they were compiled against." - Does that mean they can in fact work against multiple versions as is? Seems hard to believe a minor version bump would make every module break.

@pca006132
Copy link
Collaborator Author

It's also funny that it says "By default Python extension modules can only be used with the same Python version they were compiled against." - Does that mean they can in fact work against multiple versions as is? Seems hard to believe a minor version bump would make every module break.

It depends on the ABI. By default there is no guarantee, so yes, a version bump can make every module break. I think there are also performance concerns making libraries want to use the unstable APIs. For Python 3.12, it seems they introduced some very efficient stable APIs that nanobind can target. (Just my guess; I don't know enough about Python internals for this.)

@pca006132
Copy link
Collaborator Author

For pybind11 vs nanobind, I will just open another PR and see what others think. I think nanobind will be nicer in the long run, but I am not sure whether the subtle differences in the API will break existing code that depends on our bindings.

@pca006132 pca006132 mentioned this pull request Aug 28, 2023
@pca006132 pca006132 closed this Aug 28, 2023
@pca006132 pca006132 deleted the python-wheel branch August 29, 2023 07:33