
[Draft/Do Not Merge] Consolidate pyprima into the python/ folder#277

Open
nbelakovski wants to merge 29 commits into libprima:main from nbelakovski:consolidate2

Conversation

@nbelakovski
Contributor

This is a very large PR but the basic idea is to merge the pure Python implementation of COBYLA into the main 'prima' package as a "backend" which can be selected alongside Fortran.

I'd like to keep it in draft status until I have some time to take a look at how this affects the scipy integration.

But generally speaking it is ready for review. The CI successfully builds wheels for windows/mac-arm/mac-intel/linux-glibc.

Sadly the -ftrampoline-impl=heap did not solve the problems with wheels on musl even though I managed to have some success when testing locally. We can look into this further in a future PR.

@nbelakovski
Contributor Author

I'm not sure why the tests are failing, they were passing a minute ago until I rebased and pushed, and they're passing for me locally. Will investigate, hopefully a quick fix.

@nbelakovski
Copy link
Contributor Author

Tests are failing because pybind11 was downgraded again.

There are a number of TODOs outstanding, but basically pyprima has been
integrated into the prima package.

Additionally, the pyprima tests need to set 'fortran': False, and we may
want to consolidate some tests as well as parameterize some others to
use fortran True/False.
Zaikun suggested we always assume Fortran is available in order to keep
things simple. I decided to switch from a boolean to a string because I
didn't like that fortran=False implies Python; I think the choice should
be more explicit.
It is no longer accessible as an import from optiprofiler unless we use
an older commit. Optiprofiler looks like it's still a ways from release
so I would rather have a local copy of the code.

Changed some things in test_pycutest to reflect recent changes.

A number of the tests are failing. I don't know if this has to do with
my new installation of cutest or if it's due to the changes I've been
making.

I suppose I should test with a known working commit.
Instead of testing against hardcoded numbers, we can just test against
the Fortran implementation directly now that we are in the same
package.
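The cross-backend check described above might be sketched as follows (the `run_backend` helper is a hypothetical stand-in for calling prima with a given backend; the real tests would invoke the actual solver):

```python
def run_backend(backend, fun, x0):
    # stand-in for something like prima.minimize(fun, x0, backend=backend)
    return fun(x0)

def objective(x):
    return (x - 1.0) ** 2

# Rather than comparing against hardcoded expected values, run the same
# problem through both backends and require the results to agree.
result_fortran = run_backend("fortran", objective, 0.5)
result_python = run_backend("python", objective, 0.5)
assert abs(result_fortran - result_python) < 1e-10
```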

TODO: Need to make sure CI is building Fortran in debug mode for this
test.
It is not enabled by default (anymore?) so no need to skip it. Skipping
it raises a warning about it not being enabled.
pyprima uses PEP 604, allowing the writing of union types as `int |
float`, which is only supported from 3.10 up.
CIBW uses a RedHat-based container, which does this strange thing where
the libgcc_s.so for a newer version of gcc links against the system one
(?!) instead of just being provided by the package. As a result we get
errors about `__gcc_nested_func_ptr_created` not being found; the
solution is to explicitly link against gcc_eh (the flag -static-libgcc
works as well, I believe).

A bug report was filed, but it seems you need to be a customer to file
the bug report in the right place :(

https://bugzilla.redhat.com/show_bug.cgi?id=2437216
Since pycutest only works with Fortran in debug mode, it makes more sense to
run full tests and coverage separately from building the wheels.

The files were also renamed to start with python_ for better
organization.
Several commits for testing musl were squashed since it ultimately
didn't work, and a note was added.
We are now managing test requirements with uv
Hopefully this hierarchy makes it somewhat clear that bindings and
pyprima are both backends and effectively on the same level, and that
the main files in python/prima are the interface.
TENBARS1 is removed because it takes a very long time and is not
mentioned in the note at the top of the file as being a noteworthy test.

ERRINBAR runs quickly on arm but on x86 it takes 30-45 minutes, and
again it's not mentioned in the docstring so removing it should be fine.

--durations 0 is added to pytest so that it always displays how long
each test took to run.

Switched to pytest-cov in an attempt to get coverage with pytest-xdist,
but the nondeterminism introduced by xdist is not desirable, so I just
kept the pytest-cov implementation; it gives the same results as
coverage.
Added a couple tests to pycutest to increase coverage of
load_cutest_problem.

Renamed a test in test_basic_functionality to something more concise and
appropriate.
The old way seemed to be unreliable, so some new ways of modifying a
global variable were explored and the least annoying one was selected.
This addresses an intermittently failing test on ubuntu and also a
failing test on macos intel.

We also address the TODO question about an extra newline.
This enforces the architectural choice of having the Fortran and Python
implementations as "backends" which are not integral to the main python
package. See the test's docstring for details.

Since this test requires building with a separate env variable and
combining coverage, it makes more sense to switch from poe to make. poe
is nice, but it runs after the environment variables are evaluated and
the build is completed, and we need higher-level orchestration for
modifying those variables and rerunning.
Runs in under 10 seconds. Helps with development.
appropriate backends

Also adjust the tests to reflect the new requirements.

The test_miscellaneous and test_end_to_end tests were removed.
test_miscellaneous was removed because it doesn't cleanly fit into the
requirements at this time, and while we need it to improve coverage,
this branch is big enough without concerning ourselves with coverage.
Getting good coverage will have to be done as a later effort.

test_end_to_end was removed because in terms of functionality it's
redundant with some of the tests in test_basic_functionality and
test_options, and while it does provide some coverage that those don't,
specifically on the _project function, coverage efforts will have to
come later for sake of appropriate scoping, as mentioned above.
