
feat: Unify piecewise API behind add_piecewise_formulation (sign + LP dispatch) #638

Merged
FabianHofmann merged 67 commits into master from feat/piecewise-api-refactor
May 6, 2026

Conversation

@FBumann
Collaborator

@FBumann FBumann commented Apr 1, 2026

Summary

Follows up on #602. Replaces the descriptor pattern (PiecewiseExpression, PiecewiseConstraintDescriptor, piecewise()) with a single, stateless construction layer: add_piecewise_formulation.

add_piecewise_formulation becomes the one entry point for piecewise work — equality, one-sided inequality, N-variable linking, per-entity breakpoints, unit-commitment gating, and automatic method dispatch all live behind one call. tangent_lines() survives as a low-level helper for users who want to build chord expressions manually.

The API

Each (expression, breakpoints[, sign]) tuple links a variable to its breakpoints. All tuples share interpolation weights, coupling them on the same curve piece. The optional 3rd element marks one tuple as bounded by the curve instead of pinned to it.
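For intuition, the shared interpolation weights implement the standard piecewise-linear (SOS2) interpolation, sketched here in generic notation (the λ variables are linopy-internal, not user-facing):

```latex
x = \sum_i \lambda_i x_i, \qquad
y = \sum_i \lambda_i y_i, \qquad
\sum_i \lambda_i = 1, \qquad
\lambda_i \ge 0, \quad \lambda \in \mathrm{SOS2}
```

With N tuples, every expression gets its own breakpoint vector but all share the same λ, which is what couples them to the same curve piece.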

Equality (default)

# 2-variable: fuel = f(power)
m.add_piecewise_formulation(
    (power, [0, 30, 60, 100]),
    (fuel,  [0, 36, 84, 170]),
)

# N-variable (CHP plant — same pattern, more tuples)
m.add_piecewise_formulation(
    (power, [0, 30, 60, 100]),
    (fuel,  [0, 40, 85, 160]),
    (heat,  [0, 25, 55, 95]),
)

# Disjunctive (disconnected operating regions)
m.add_piecewise_formulation(
    (power, linopy.segments([(0, 0), (50, 80)])),
    (cost,  linopy.segments([(0, 0), (125, 200)])),
)

# Per-entity breakpoints (different curves per generator)
m.add_piecewise_formulation(
    (power, linopy.breakpoints({"gas": [0, 30, 60, 100], "coal": [0, 50, 100, 150]}, dim="gen")),
    (fuel,  linopy.breakpoints({"gas": [0, 40, 90, 180], "coal": [0, 55, 130, 225]}, dim="gen")),
)

# Breakpoints from slopes
m.add_piecewise_formulation(
    (power, [0, 30, 60, 100]),
    (fuel,  linopy.breakpoints(slopes=[1.2, 1.4, 1.7], x_points=[0, 30, 60, 100], y0=0)),
)

Inequality bounds — per-tuple sign

Append "<=" or ">=" as a third tuple element to mark one expression as bounded by the curve. The other tuples stay pinned. The bounded tuple need not be first — its role is visible at the call site.

# fuel ≤ f(power).  "auto" picks the cheapest correct formulation — a pure
# LP chord formulation when the curvature matches the sign (concave + "<=",
# or convex + ">="), SOS2/incremental otherwise.
m.add_piecewise_formulation(
    (fuel,  [0, 20, 30, 35], "<="),  # bounded
    (power, [0, 10, 20, 30]),         # pinned
)

On a matching-curvature curve this dispatches to method="lp": one chord inequality per piece plus a domain bound x_min ≤ x ≤ x_max, and no auxiliary variables. Mismatched curvature (convex + "<=" / concave + ">=") describes the wrong region (not just a looser one) — method="auto" detects this and falls back to SOS2/incremental with an explanatory info log; method="lp" raises with a clear error.
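The chord trick can be sketched with plain NumPy (illustrative only; linopy builds these inequalities as model constraints). For a concave curve, the pointwise minimum of the per-piece chord lines reproduces the curve, so one inequality per piece is exact:

```python
import numpy as np

# Sketch of the "<=" chord formulation on a concave curve
# (illustration only -- not linopy's actual code path).
x_pts = np.array([0.0, 10.0, 20.0, 30.0])
y_pts = np.array([0.0, 20.0, 30.0, 35.0])

slopes = np.diff(y_pts) / np.diff(x_pts)        # a_k, one per piece
intercepts = y_pts[:-1] - slopes * x_pts[:-1]   # b_k: chord_k(x) = a_k*x + b_k

# For concave y_pts, min_k chord_k(x) equals the curve on [x_min, x_max],
# so "y <= a_k*x + b_k for all k" plus the domain bound on x is exact.
def f_upper(x):
    return np.min(slopes * x + intercepts)

assert np.isclose(f_upper(10.0), 20.0)   # hits the breakpoint exactly
assert np.isclose(f_upper(15.0), 25.0)   # interior of piece 2: 1.0*15 + 10
```

For a convex curve and "<=", the same construction would cut away feasible points above some chords, which is the "wrong region" case the dispatcher rejects.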

Restrictions (current):

  • At most one tuple may carry a non-equality sign.
  • With 3+ tuples, all signs must be "==".

The N-input bounded case (e.g. z >= f(x, y)) is rejected with an error that invites a GitHub issue if anyone has a concrete use case — we don't promise a specific future API shape.


Unit-commitment gating — active

# active=1: power in [30, 100], fuel = f(power)
# active=0: power = 0, fuel = 0
m.add_piecewise_formulation(
    (power, [30, 60, 100]),
    (fuel,  [40, 90, 170]),
    active=commit,  # binary variable
)

Works with SOS2, incremental, and disjunctive. With a bounded tuple, deactivation only pushes the signed bound to 0 — set lower=0 on naturally non-negative outputs (fuel, cost, heat) and the combination forces y = 0 automatically.

Low-level: tangent_lines

# Returns a LinearExpression with one chord per piece — no variables.
# Most users should prefer add_piecewise_formulation with a bounded tuple,
# which builds on this helper and adds domain bounds + curvature checks.
t = linopy.tangent_lines(power, x_pts, y_pts)
m.add_constraints(fuel <= t)

Key changes

New public API

  • add_piecewise_formulation — one entry point for all piecewise constructions. Returns a PiecewiseFormulation carrying .method (resolved formulation), .convexity (when well-defined), .variables, .constraints.
  • Per-tuple sign — optional 3rd tuple element, default "==". At most one non-equality; 3+ tuples must all be equality.
  • method parameter — "auto" (default), "sos2", "incremental", "lp".
  • active parameter — binary gating for unit commitment.
  • linopy.breakpoints() / linopy.segments() factories.
  • linopy.slopes_to_points() utility.
  • linopy.tangent_lines() low-level helper.
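As a sanity check on the slopes workflow, the conversion can be sketched in a few lines of NumPy (a hypothetical reimplementation; the shipped linopy.slopes_to_points may differ in signature and edge handling):

```python
import numpy as np

def slopes_to_points_sketch(slopes, x_points, y0=0.0):
    """Integrate per-piece slopes along the x grid into y breakpoints.
    Sketch only -- not linopy's actual implementation."""
    dx = np.diff(np.asarray(x_points, dtype=float))
    dy = np.asarray(slopes, dtype=float) * dx
    return np.concatenate([[y0], y0 + np.cumsum(dy)])

# Mirrors the breakpoints(slopes=...) example above:
y = slopes_to_points_sketch([1.2, 1.4, 1.7], [0, 30, 60, 100], y0=0)
# y == [0.0, 36.0, 78.0, 146.0]
```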

Auto-dispatch (method="auto")

  1. 2-variable inequality on a matching-curvature curve → lp (chord + domain bounds, no aux vars)
  2. Strictly monotonic breakpoints → incremental (deltas + binaries)
  3. Otherwise → sos2 (lambdas + SOS2 constraint)
  4. Disjunctive (segments) → sos2 with binary segment selection

The resolved method is logged at INFO; when LP is skipped, the log explains why (curvature, NaN layout, tuple count, or active).

Correctness properties

  • _detect_convexity is direction-invariant — ascending and descending x give the same label for the same graph.
  • NaN-padded per-entity breakpoints are masked everywhere, including the LP chord constraints (padded pieces don't create spurious y ≤ 0 constraints).
  • method="lp" validates curvature + sign + tuple count + active absence upfront; mismatch raises with a clear pointer to method="auto".
  • PiecewiseFormulation.method and .convexity persist across netCDF round-trip.
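The direction-invariance property can be illustrated with a second-difference sketch (not the library's actual _detect_convexity, which may handle NaN padding and tolerances differently):

```python
import numpy as np

def detect_convexity_sketch(x_pts, y_pts):
    """Label a piecewise-linear graph by the sign of its slope changes.
    Sketch only; the real _detect_convexity may differ in edge handling."""
    x = np.asarray(x_pts, dtype=float)
    y = np.asarray(y_pts, dtype=float)
    order = np.argsort(x)                   # normalize direction before judging
    x, y = x[order], y[order]
    d2 = np.diff(np.diff(y) / np.diff(x))   # slope changes between pieces
    if np.all(d2 >= 0):
        return "convex"
    if np.all(d2 <= 0):
        return "concave"
    return None

# ascending and descending x describe the same graph -> same label
assert detect_convexity_sketch([0, 10, 20, 30], [0, 20, 30, 35]) == "concave"
assert detect_convexity_sketch([30, 20, 10, 0], [35, 30, 20, 0]) == "concave"
```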

Architecture (internal)

  • _PwlInputs dataclass categorizes the user's tuples into pinned_* and bounded_* slots — explicit role assignment, no positional convention. Carries through the whole dispatch chain.
  • _PwlLinks is the stacked link representation consumed by SOS2/incremental/disjunctive builders; _build_links() produces it from _PwlInputs.
  • _try_lp() / _resolve_sos2_vs_incremental() flatten the dispatch — _add_continuous is ~10 lines of real logic.
  • User tuple order is preserved end-to-end; the bounded one is found via _PwlInputs, not by assuming position 0.

Removed / deprecated

  • PiecewiseExpression, PiecewiseConstraintDescriptor, linopy.piecewise() — descriptor pattern gone.
  • Old add_piecewise_constraints shape that dispatched on Variable.__le__/__ge__/__eq__ return types.
  • Variable operator overloads (__le__, __ge__, __eq__) simplified — no more piecewise dispatch.

Design principles

  • API surface stays minimal: Model, Variables, Constraints, Expression, plus the single PiecewiseFormulation return type. No user-facing intermediate state.
  • Piecewise is a construction layer that produces regular linopy objects.
  • One entry point for equality and inequality — per-tuple sign reads as the relation it encodes ((y, y_pts, "<=") → y ≤ f(...)).
  • Variable names as link coords: the internal _pwl_var dimension shows power, fuel, heat instead of 0, 1, 2.

Alternative designs considered

A) Descriptor pattern (PR #602, previous master)

pw = linopy.piecewise(x, x_pts, y_pts)
m.add_piecewise_constraints(pw == y)

Rejected: Introduces PiecewiseExpression and PiecewiseConstraintDescriptor as user-facing state. Breaks the "only Variables, Constraints, Expressions" principle. Structurally limited to 2-variable x → y.

B) Dict of expressions + shared breakpoints DataArray

m.add_piecewise_formulation(exprs={"power": power, "fuel": fuel}, breakpoints=bp)

Considered: Supports N-variable. But requires string keys to match breakpoint coordinates — an indirection layer.

C) Global sign= keyword (initial implementation, refactored in #664)

m.add_piecewise_formulation((fuel, y_pts), (power, x_pts), sign="<=")

The original implementation used a single formulation-level sign= kwarg with a positional convention ("the first tuple is the bounded one when sign != '=='"). #664 refactored this to per-tuple sign because the role belongs to a specific tuple — encoding it globally + positionally was a wart that required a paragraph of docs to teach. The internal dispatch was rewritten from positional access to explicit categorization (_PwlInputs with bounded_* and pinned_* slots).

D) Chosen: per-tuple sign on the tuple it applies to

m.add_piecewise_formulation(
    (fuel,  y_pts, "<="),  # bounded (role visible at the call site)
    (power, x_pts),         # pinned
)

Future work this enables

  • 2-D triangulated piecewise (z ≥ f(x, y)). The reserved N≥3-inequality slot reads naturally as the epigraph of a bivariate function: (z, z_pts, ">="), (x, x_pts), (y, y_pts), triangulation=tris. Vertex coordinates stay as plain 1-D DataArrays; the simplex adjacency rides along as one shared sibling kwarg. _PwlInputs would grow a single optional field; the formulation builders for the triangulated path are new but consume the same input carrier.
  • Multi-bounded relations (if a use case appears) — the "at most one bounded tuple" check is a single line; the dispatch chain already handles per-tuple roles.

Out of scope / follow-ups

Directions the design enables or that were considered, but deliberately not shipped in this PR:

  • More polished slopes API, potentially with a new class. breakpoints(slopes=..., x_points=..., y0=...) is shipped today. What's deferred is letting slopes inherit the x grid from a sibling tuple (e.g. (fuel, Slopes([1.2, 1.4, 1.7], y0=0)) paired with a reference tuple that carries the x grid). Two viable shapes were discussed — extending breakpoints(slopes=...) to allow omitted x_points, or introducing a small frozen-dataclass Slopes value type — each with tradeoffs around type-system visibility and dispatch clarity. EvolvingAPIWarning covers either direction as a non-breaking follow-up.
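The frozen-dataclass option might look roughly like this (purely hypothetical; neither the name Slopes nor this shape is shipped):

```python
from dataclasses import dataclass

# Hypothetical value type for the deferred slopes API; nothing like this
# exists in linopy today -- it is one of the two shapes discussed above.
@dataclass(frozen=True)
class Slopes:
    values: tuple[float, ...]   # one slope per piece
    y0: float = 0.0             # y at the first breakpoint

# A tuple like (fuel, Slopes((1.2, 1.4, 1.7))) could then inherit the
# x grid from a sibling tuple at dispatch time.
s = Slopes((1.2, 1.4, 1.7))
```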

Notebook examples

examples/piecewise-linear-constraints.ipynb (main tutorial):

| # | Section | Feature |
|---|---------|---------|
| 1 | Getting started | Basic 2-variable equality + method inspection via pwf |
| 2 | Picking a method | auto vs sos2 vs incremental give the same optimum |
| 3 | Disjunctive segments | segments() for gaps in operation |
| 4 | Inequality bounds | Per-tuple "<=" (auto-dispatches LP) |
| 5 | Unit commitment | active parameter |
| 6 | N-variable linking | CHP plant (3 variables, joint equality) |
| 7 | Per-entity breakpoints | Fleet with different curves |

examples/piecewise-inequality-bounds.ipynb (deep-dive on per-tuple sign / LP):

  • Setup with a concave reference curve
  • Three methods (LP, SOS2, incremental) verified to produce the same feasible region
  • 2-D hypograph feasibility plot
  • Curvature × sign rule table — the "wrong region" cases and why
  • Auto-dispatch fallback demonstration (non-convex curve, mismatched sign, explicit method="lp" raise)

Formulation math lives in the reference page, not in the notebook.

Test plan

  • pytest test/test_piecewise_constraints.py test/test_piecewise_feasibility.py — 518 passed
  • Full suite (excl. solver/remote) — 1578 passed, 7 skipped
  • Both notebooks execute end-to-end against Gurobi
  • Ruff lint + format clean, mypy clean on the full repo
  • Round-trip tested: PiecewiseFormulation.method and .convexity persist across to_netcdf / read_netcdf

🤖 Generated with Claude Code

FBumann and others added 22 commits April 1, 2026 08:36
…on layer

Remove PiecewiseExpression, PiecewiseConstraintDescriptor, and the
piecewise() function. Replace with an overloaded add_piecewise_constraints()
that supports both a 2-variable positional API and an N-variable dict API
for linking 3+ expressions through shared lambda weights.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Change add_piecewise_constraints() to use keyword-only parameters
(x=, y=, x_points=, y_points=) instead of positional args.  Add
detailed docstring documenting the mathematical meaning of equality
vs inequality constraints.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The N-variable path was not broadcasting breakpoints to cover extra
dimensions from the expressions (e.g. time), resulting in shared
lambda variables across timesteps.  Also simplify CHP example to
use breakpoints() factory and add plot.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The plotting helper now accepts a single breakpoints DataArray with a
"var" dimension, supporting both 2-variable and N-variable examples.
Replaces the inline CHP plot with a single function call.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Document the N-variable core formulation with shared lambda weights,
explain how the 2-variable case maps to it, and detail the inequality
case (auxiliary variable + bound).  Remove all references to the
removed piecewise() function and descriptor classes.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add linopy.piecewise_envelope() as a standalone linearization utility
that returns tangent-line LinearExpressions — no auxiliary variables.
Users combine it with regular add_constraints for inequality bounds.

Remove sign parameter, LP method, convexity detection, and all
inequality logic from add_piecewise_constraints. The piecewise API
now only does equality linking (the core formulation).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
More accurate name — the function computes tangent lines per segment,
not necessarily a convex/concave envelope.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Single function doesn't justify a separate module. tangent_lines
lives next to breakpoints() and segments() — all stateless helpers
for the piecewise workflow.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add prominent section explaining the fundamental difference:
- add_piecewise_constraints: exact equality, needs aux variables
- tangent_lines: one-sided bounds, pure LP, no aux variables
- tangent_lines with == is infeasible (overconstrained)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Replace keyword-only (x=, y=, x_points=, y_points=) and dict-based
(exprs=, breakpoints=) forms with a single tuple-based API:

    m.add_piecewise_constraints(
        (power, [0, 30, 60, 100]),
        (fuel,  [0, 36, 84, 170]),
    )

2-var and N-var are the same pattern — no separate convenience API.
Internally stacks all breakpoints along a link dimension and uses
a unified formulation path.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The _pwl_var dimension now shows variable names (e.g. "power", "fuel")
instead of generic indices ("0", "1"), making generated constraints
easier to debug and inspect.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
… notebook

The piecewise() function was removed but api.rst still referenced it.
Also replace xr.concat with breakpoints() in plot cells to avoid
pandas StringDtype compatibility issue on newer xarray.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Silences xarray FutureWarning about default coords kwarg changing.
No behavior change — we concatenate along new dimensions where
coord handling is irrelevant.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add Example 8 (fleet of generators with per-entity breakpoints) to
the notebook.  Also drop scalar coordinates from breakpoints before
stacking to handle bp.sel(var="power") without MergeError.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The descriptor API was never released, so for users this is all new.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@FBumann FBumann marked this pull request as ready for review April 1, 2026 11:53
@FBumann FBumann changed the title Refactor piecewise API: stateless construction layer + tangent_lines utility refac: Refactor piecewise API to be a stateless construction layer + tangent_lines utility Apr 1, 2026
@FBumann FBumann changed the title refac: Refactor piecewise API to be a stateless construction layer + tangent_lines utility refac: Refactor piecewise API to a stateless construction layer + tangent_lines utility Apr 1, 2026
@FBumann FBumann requested a review from FabianHofmann April 1, 2026 11:54
FBumann and others added 3 commits April 1, 2026 14:03
Reorder: Quick Start -> API -> When to Use What -> Breakpoint
Construction -> Formulation Methods -> Advanced Features.
Add per-entity, slopes, and N-variable examples. Deduplicate
code samples. Fold generated-variables tables into compact lists.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@FBumann
Collaborator Author

FBumann commented Apr 1, 2026

@FabianHofmann This is pretty urgent to me, as the current piecewise API is already on master and should NOT be included in the next release in my opinion.
But I understand that the CSR PR is more important atm.

The current code can be reorganized a bit, but I'd like to hear your thoughts about the API first.

@FabianHofmann
Collaborator

this looks cool, well done @FBumann ! I will have a look in the afternoon!

Collaborator

@FabianHofmann FabianHofmann left a comment


first round of small requests mostly outside of the piecewise module.

Comment thread pyproject.toml
Comment thread linopy/piecewise.py Outdated
Comment thread linopy/constraints.py Outdated
Comment thread linopy/constraints.py
Comment thread linopy/model.py Outdated
Comment thread linopy/expressions.py Outdated
Comment thread linopy/io.py Outdated
Comment thread linopy/model.py Outdated
Comment on lines +541 to +542
names.update(pwl.variable_names)
names.update(pwl.constraint_names)
Collaborator


piecewise variables and constraints cannot have the same name right?

Collaborator Author


Yes, they can. As regular variables and constraints can share a name too. They do not currently though.

@FabianHofmann
Collaborator

@FBumann just went through the notebook, fantastic work. mind if I push some decorative changes to the notebook?

@FBumann
Collaborator Author

FBumann commented May 4, 2026

Of course not. Go ahead. You don't have to ask that going forward. If you keep a proper commit history, I can check what you did and complain if I need to :)

FBumann and others added 9 commits May 4, 2026 17:15
The descriptor pattern is gone, so the bare ``# type: ignore`` over-suppressed
mypy.  ``__eq__`` still needs to declare ``Constraint`` instead of ``bool`` to
preserve linopy's expression-equals-builds-constraint semantics, so the
override-mismatch is intrinsic — narrow the directive to ``[override]``.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…int_names

Use the same key names that ``PiecewiseFormulation`` uses internally
(``variable_names``/``constraint_names``) so the netcdf attribute layout
matches the dataclass field names.  Read path updated to match.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Promote the method and convexity string sets to Literal type aliases and
derive the runtime sets from ``get_args``.  ``piecewise.py`` now uses the
aliases for type annotations on ``add_piecewise_formulation``,
``_detect_convexity``, and ``PiecewiseFormulation.convexity`` instead of
inlining the string list at every site.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Move the piecewise summary block out of ``Model.__repr__`` into
``piecewise._repr_summary``; the model now adds the section with a
single call.  Also drops the ``_repr_filtered`` wrappers on
``Constraints``/``Variables`` — the model calls ``_format_items(exclude=...)``
directly, since that's all the wrappers were doing.

The variable and constraint name sets are returned separately from
``_grouped_names`` (variables and constraints live in independent
namespaces in the model, so each filter applies to its own collection).

Restores the missing docstrings on ``Constraints.__repr__`` and
``Variables.__repr__`` and corrects the wording — they describe the
respective container, not the model.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
A single model build often calls ``add_piecewise_formulation`` /
``tangent_lines`` hundreds of times, and each emit produces a
multi-line warning that drowns out other output.  Dedup per call site
via a module-level set keyed by a typed ``_EvolvingApiKey`` literal so
each entry point warns at most once per process.

Warning text now mentions the once-per-session behaviour so users know
they aren't seeing every call.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
feat: add jupyter and ipykernel to dev extension in installation
8a57437 added ipykernel/jupyter/matplotlib to [dev], which transitively
pulled requests in via the jupyter metapackage.  test/remote/test_oetc.py
guards itself with `pytest.importorskip("requests")` — on master that
skipped the whole file, but with requests now resolved the file got
collected and every test failed at OetcHandler() instantiation because
google-cloud-storage (the other half of the [oetc] extra) is still
absent.  Result: 23 failures + 14 errors across the entire test matrix.

Notebook execution lives in test-notebooks.yml, which already installs
[docs] (ipykernel + matplotlib are there).  No CI job needs notebook
deps in [dev].

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@FabianHofmann
Collaborator

Here is the review from my agents:

Critical

  • piecewise.py: _lp_eligibility (1172-1200) and _try_lp (1278-1308) duplicate every check. Two sources of truth for LP eligibility rules. Collapse into one helper returning (ok, reason) and raise from _try_lp when method == "lp".
  • __init__.py:60-66: __all__ no longer alphabetical. tangent_lines sits between breakpoints and segments; align/merge/options/read_netcdf come after slopes_to_points. The latter group was already broken on master, but the new tangent_lines insertion compounds it. Likely to trip ruff RUF022.
  • No tests for EvolvingAPIWarning at all. grep -rn "EvolvingAPIWarning" test/ returns zero matches. Per-key dedup, stacklevel=3, and the "no double-warn from _tangent_lines_impl inside _add_lp" claim are all unverified. Module-global dedup set will make tests order-dependent without a reset fixture.
  • NetCDF roundtrip is a single 2-tuple equality case (test_formulation_survives_netcdf, line 2048). Missing: multi-formulation models, disjunctive formulations, bounded <=/>= tuples, method="lp" (empty variable_names), and convexity != None.

High

Code quality / DRY

  • piecewise.py:648-659 _validate_breakpoint_shapes repeats the BREAKPOINT_DIM error twice for bp_a/bp_b. Loop the pair.
  • piecewise.py:1227-1267 _build_links has two near-identical equality-branch builders.
  • piecewise.py:1038-1059 _PwlInputs two-branch construction differs in 4 fields only.
  • piecewise.py:127-140 PiecewiseFormulation hand-rolls __slots__+__init__; use @dataclass(slots=True) to drop ~25 lines of boilerplate.
  • piecewise.py:154-188 _user_dims and __repr__ duplicate the dim-collection / _-prefix-skip loop.

Type / API

  • piecewise.py:131 method: str should be PWL_METHOD (or narrowed Literal); same for convexity.
  • piecewise.py:797-805 add_piecewise_formulation(**kwargs: object) defeats static type-checking on callers and conflicts with the no-getattr-style rule. Use explicit sign: str | None = None and raise.
  • io.py:1265-1269 Inconsistent fail-fast posture — convexity uses d.get("convexity") (silent None) while every other field uses d["..."] (KeyError). Pick one.

Tests

  • PiecewiseFormulation.variables / .constraints / __repr__ / _user_dims essentially unexercised. Only f.method/f.convexity are read in tests.
  • Auto-method success log never asserted (only the LP-skip caplog branch at line 1850 exists).
  • _lp_eligibility reason strings not individually asserted. Each branch returns a distinct user-facing diagnostic; parametrize directly.
  • _add_continuous resolution for method ∈ {"sos2","incremental","lp","auto"} not parametrized (only ["sos2","incremental"] at line 1732).

Medium

Code quality

  • piecewise.py:62-74 Custom _emitted_evolving_warnings set is over-engineered; warnings.simplefilter("once", EvolvingAPIWarning) would suffice.
  • piecewise.py:1364 _resolve_sos2_vs_incremental final raise ValueError("unknown method ...") is dead — add_piecewise_formulation validates against PWL_METHODS upstream.
  • piecewise.py:1156-1169 _PwlInputs.all_bps/all_coords/all_exprs three near-identical methods; collapse.
  • piecewise.py:1033 link-coord fallback to str(i) can collide with a user variable named "1". Use f"_pwl_{i}".
  • model.py:506 indirect _grouped_names import → consider exposing as model._piecewise_grouped_names().
  • io.py:1259-1269 move reconstruction into PiecewiseFormulation.from_dict for symmetry with to_netcdf.

Type / API

  • piecewise.py:1146 link_dim = "_pwl_var" is ad-hoc; promote to PWL_LINK_DIM constant alongside BREAKPOINT_DIM/SEGMENT_DIM.
  • piecewise.py:1448 piece_dim = f"{dim}_piece" recomputed in _add_continuous instead of reusing LP_PIECE_DIM (imported at line 28, used in _rename_to_pieces at 236-237). Two sources of truth.
  • piecewise.py:191 _grouped_names is opaque naming → _piecewise_member_names.
  • piecewise.py tangent_lines (~600-635) and add_piecewise_formulation (~900-925) docstrings are long; the EvolvingAPIWarning silencing recipe is duplicated verbatim across both sites.
  • variables.py:545 bare # type: ignore should be # type: ignore[override] (regression — was [override] before).
  • piecewise.py:1116-1125 _stack_along_link accepts Sequence[DataArray | xr.Dataset] but returns DataArray; either generic via TypeVar or split.

Tests

  • add_piecewise_formulation unexpected-kwargs TypeError path untested.
  • _breakpoints_from_slopes 3D slopes path, dict y0 missing key, per-entity length mismatch — direct tests missing.
  • _validate_breakpoint_shapes first-arg branch (mirror of the bp_b test at 1360) untested.
  • Public breakpoints / segments factories: empty list, single point, all-NaN, ±inf, and duplicate x (zero dx → div-by-zero in tangent_lines / _detect_convexity) all uncovered.
  • LP _add_lp piece_count == 1 shortcut (piece_mask all-True) not tested.
  • _broadcast_points "Cannot broadcast: missing dim" branch has no negative test.
  • SOS2/incremental masking branch of bounded path with NaN not directly verified.

Low

  • piecewise.py:229-231 _strip_nan returns list[float]; use arr[~np.isnan(arr)].tolist() (faster, type-consistent).
  • piecewise.py:1187, 1287 two assert ... is not None # narrowed by is_equality check comments — drop the trailing rationale comments.
  • piecewise.py:550-551 _tangent_lines_impl calls _coerce_breaks before the type check; reverse for fail-fast.
  • piecewise.py:216-217 _repr_summary uses len(pwl.variables) / len(pwl.constraints) (constructs views) — len(pwl.variable_names) is O(1).
  • piecewise.py:1239 _PwlLinks.eq_bp = stacked_bp is redundant in the equality case.
  • constants.py:58,60 PWL_METHODS / PWL_CONVEXITIES could be frozenset for immutability.
  • constants.py:84-107 EvolvingAPIWarning ~24-line docstring is verbose; two silencing examples could be one.
  • model.py:808 add_piecewise_formulation = add_piecewise_formulation mypy-invisible binding survives refactor (pre-existing).
  • test_piecewise_feasibility.py:274-356 TestHandComputedAnchors six near-identical methods — collapse via parametrization.
  • test_piecewise_constraints.py:1823 manual for sign in signs loop instead of pytest.parametrize.
  • test_piecewise_constraints.py:1933-2034 from linopy.piecewise import _detect_convexity repeated in every method — hoist to module scope.
  • test_formulation_survives_netcdf:2048 excludes _model from the slot comparison without affirming the back-reference is rebound — that back-reference is what makes .variables / .constraints work after read.
  • TestTangentLines:370-421 covers var/linexpr/dataarray/constraint paths but no BreaksLike size-mismatch or NaN tests.

Positive observations

  • *pairs variadic API at piecewise.py:797 is a clean upgrade over the old descriptor >=-overload trickery.
  • Per-tuple (expr, breakpoints, "<=") sense convention reads naturally and is well-checked.
  • EvolvingAPIWarning design (subclass FutureWarning at constants.py:84, per-key dedup at piecewise.py:66-74 with stacklevel=3, message-prefix targeting) is well thought out.
  • PiecewiseFormulation return value with .variables / .constraints views is a real improvement over returning a bare Constraint.
  • API asymmetry to consider: add_piecewise_formulation is a Model method (model.py:808) but tangent_lines is module-level (piecewise.py:573). Either expose model.tangent_lines(...) for discoverability or document the split.

Recommended action plan

  • Collapse _lp_eligibility / _try_lp duplication.
  • Restore alphabetical __all__ sort (consider also fixing the pre-existing ordering).
  • Add EvolvingAPIWarning tests with a fixture clearing _emitted_evolving_warnings.
  • Expand netCDF roundtrip tests (multi-formulation, bounded, LP, convexity).
  • Tighten types: method: PWL_METHOD; replace **kwargs: object with explicit sign: param.
  • Make io.py:1269 convexity loading consistent with other field reads (drop .get).
  • Cover PiecewiseFormulation.variables / .constraints / __repr__, parametrize _lp_eligibility reasons, assert the auto-resolution log.
  • DRY refactors in _validate_breakpoint_shapes (648-659), _build_links (1227-1267), _PwlInputs (1038-1059).

Defer to follow-up cleanup PR (Medium + Low): dataclass conversion, constants extraction (PWL_LINK_DIM, reuse LP_PIECE_DIM), docstring trimming, test parametrization cleanups, _strip_nan speedup, comment removal.

I would say nothing of this is really critical, but I think the recommended points should be easy to address. perhaps @FBumann you want to go through this and see what of it you find helpful. I would say nothing of this is a requirement.

FBumann and others added 5 commits May 5, 2026 15:06
…drop sign= migration helper

* ``__all__`` in ``linopy/__init__.py`` is now sorted (Constraint…tangent_lines).
* ``PiecewiseFormulation.method`` typed as ``PWL_METHOD`` instead of ``str``,
  matching ``convexity``.
* Drop ``**kwargs: object`` from ``add_piecewise_formulation``.  The
  pre-release ``sign=`` migration helper and the catch-all unknown-kwarg
  error were dev backwards-compat for an unreleased API; Python's default
  TypeError on unexpected kwargs covers the rest.
* Extract ``_user_dims_with_sizes`` to share the dim-collection loop
  between ``_user_dims`` and ``__repr__``.
* Loop the two identical ``BREAKPOINT_DIM`` checks in
  ``_validate_breakpoint_shapes``.

Removes ``test_old_sign_kwarg_raises_with_migration_help`` (covered the
removed migration helper).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
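The ``PWL_METHOD`` typing change can be sketched as follows. The member spellings ``"sos2"``, ``"incremental"``, and ``"lp"`` are inferred from names used elsewhere in this thread (``_resolve_sos2_vs_incremental``, ``_try_lp``, ``method='lp'``); the exact alias definition in linopy may differ.

```python
from typing import Literal, get_args

# Sketch of the alias; exact member spelling in linopy may differ.
PWL_METHOD = Literal["sos2", "incremental", "lp"]
PWL_METHODS = frozenset(get_args(PWL_METHOD))


def describe(method: PWL_METHOD) -> str:
    # With the Literal annotation, mypy rejects describe("typo") at
    # type-check time, so no runtime catch-all **kwargs machinery is
    # needed to diagnose misspelled arguments.
    return f"piecewise method: {method}"
```

Deriving the runtime set via `get_args(PWL_METHOD)` keeps the static type and the runtime validation set from drifting apart.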
Both ``_lp_eligibility`` (auto-dispatch) and ``_try_lp`` (explicit
``method='lp'``) re-implemented the same five checks (n_tuples,
is_equality, active, monotonicity, sign+curvature) — the latter just
raised instead of returning a reason.  ``_try_lp`` now always calls
``_lp_eligibility`` and translates the ``(False, reason)`` result into
either an INFO log (auto) or a ``ValueError`` (explicit lp), so
adding a new eligibility rule means editing one function instead of
two.

The raised-error wording is slightly more uniform — the eligibility
``reason`` is now embedded verbatim — but every existing assertion
matches.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
The ``to_netcdf`` writer always emits ``convexity`` (possibly ``None``)
for every persisted ``PiecewiseFormulation``, so the reader's
``d.get("convexity")`` was masking what is actually a required field.
Switching to ``d["convexity"]`` makes the schema mismatch explicit and
matches the fail-fast posture of every other key in the same loop
(``method``, ``variable_names``, ``constraint_names``).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…ility, more netCDF cases

Closes review gaps left by #638:

* ``TestEvolvingAPIWarning`` — first-call fires, dedup holds across
  repeats, ``tangent_lines`` and ``add_piecewise_formulation`` dedup
  independently, ``stacklevel=3`` reports the user's call site.
  Module-global dedup set is cleared by an autouse fixture so test
  order doesn't matter.
* ``TestPiecewiseFormulationAPI`` — the ``.variables`` / ``.constraints``
  properties and ``__repr__`` were essentially unexercised; smoke-tests
  for the equality and LP shapes (the latter has empty ``variable_names``).
* ``TestLPEligibilityReasons`` — direct unit test of ``_lp_eligibility``
  with handcrafted ``_PwlInputs``, parametrised over each branch (too
  many tuples / all equality / active / non-monotonic / wrong-curvature
  ``<=`` and ``>=``) plus the success path.  Uses ``_PwlInputs`` directly
  so branches hidden by the public API's short-circuits are still covered.
* ``TestPiecewiseNetCDFRoundtrip`` — parametrised over three shapes:
  the existing 2-var equality, plus a bounded ``<=``/LP case (empty
  ``variable_names``) and a 3-var equality (``convexity is None``), so
  the roundtrip exercises every reachable combination of persisted fields.
  Asserts the reloaded ``.variables`` / ``.constraints`` properties work,
  catching any future regression where the model back-reference isn't
  rebound.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…=True)

Replaces the hand-rolled ``__slots__`` + ``__init__`` boilerplate with
``@dataclass(slots=True, repr=False)`` (the custom ``__repr__`` is
preserved).  Net 17 lines removed.

Also renames the back-reference field from ``_model`` to ``model``.  The
underscore guarded an attribute that is perfectly fine to read:
``pwf.model`` is sensible public access.  This also lets the dataclass
init signature stay ``model=...`` without aliasing tricks — the two
existing callers (``add_piecewise_formulation`` and the netCDF reload in
``io.py``) already used ``model=`` keyword and don't need any change.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@FBumann
Collaborator Author

FBumann commented May 5, 2026

@FabianHofmann Seems reasonable.

A few comments on things I disagree on:

  1. kwargs: Good catch, but those should just go entirely. They are merely an artifact of iterations during dev...
  2. _build_links and _PwlInputs: I don't see how we could compress these meaningfully.

FBumann and others added 5 commits May 5, 2026 15:55
* Promote ``_pwl_var`` to ``PWL_LINK_DIM`` constant and reuse
  ``LP_PIECE_DIM`` in ``_add_lp`` instead of recomputing
  ``f"{dim}_piece"`` — one source of truth for each magic dim name.
* Tighten ``_resolve_sos2_vs_incremental`` and ``_add_continuous`` return
  types to ``Literal`` / ``PWL_METHOD`` so the chain producing
  ``PiecewiseFormulation.method`` is type-checked end-to-end.
* Drop the dead ``raise ValueError(f"unknown method ...")`` branch in
  ``_resolve_sos2_vs_incremental`` — ``add_piecewise_formulation``
  validates against ``PWL_METHODS`` upstream, and ``_try_lp`` already
  consumed ``"lp"`` before this is called.  Replaced with an ``assert``.
* ``PWL_METHODS`` / ``PWL_CONVEXITIES`` switched to ``frozenset`` to
  signal immutability of the value sets.
* ``_strip_nan`` switched to ``arr[~np.isnan(arr)].tolist()`` —
  vectorised, type-consistent regardless of input (sequence or
  ``ndarray``).
* ``_repr_summary`` uses ``len(pwl.variable_names)`` /
  ``len(pwl.constraint_names)`` instead of ``len(pwl.variables)`` — the
  latter constructs a Variables view just to ask its length.
* Link-coord fallback for unnamed expressions is now ``f"_pwl_{i}"``
  instead of ``str(i)``, so a user variable named e.g. ``"1"`` can't
  collide with the synthetic coord.
* ``variables.py:Variable.__eq__`` ``# type: ignore`` narrowed back to
  ``# type: ignore[override]`` — the bare form was a regression.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…IWarning

* ``_make_inputs`` return type narrowed from ``object`` to ``"_PwlInputs"``
  (with ``TYPE_CHECKING`` import) so callers don't need
  ``# type: ignore`` to dispatch on it.
* ``test_reason_string`` ``kwargs: dict[str, Any]`` instead of bare
  ``dict`` — the previous ``# type: ignore[type-arg]`` is unused under
  the project's mypy config.
* ``_reset_dedup`` autouse fixture annotated as
  ``Generator[None, None, None]`` since it ``yield``s.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
doc: adjust figsize in notebook
@FabianHofmann FabianHofmann merged commit 94a38cd into master May 6, 2026
22 checks passed
@FabianHofmann FabianHofmann deleted the feat/piecewise-api-refactor branch May 6, 2026 07:29
@FBumann
Collaborator Author

FBumann commented May 6, 2026

Great quality improvements @FabianHofmann 🎉
