feat: Unify piecewise API behind add_piecewise_formulation (sign + LP dispatch) #638
FabianHofmann merged 67 commits into master
Conversation
…on layer Remove PiecewiseExpression, PiecewiseConstraintDescriptor, and the piecewise() function. Replace with an overloaded add_piecewise_constraints() that supports both a 2-variable positional API and an N-variable dict API for linking 3+ expressions through shared lambda weights. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Change add_piecewise_constraints() to use keyword-only parameters (x=, y=, x_points=, y_points=) instead of positional args. Add detailed docstring documenting the mathematical meaning of equality vs inequality constraints. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The N-variable path was not broadcasting breakpoints to cover extra dimensions from the expressions (e.g. time), resulting in shared lambda variables across timesteps. Also simplify CHP example to use breakpoints() factory and add plot. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The plotting helper now accepts a single breakpoints DataArray with a "var" dimension, supporting both 2-variable and N-variable examples. Replaces the inline CHP plot with a single function call. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Document the N-variable core formulation with shared lambda weights, explain how the 2-variable case maps to it, and detail the inequality case (auxiliary variable + bound). Remove all references to the removed piecewise() function and descriptor classes. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add linopy.piecewise_envelope() as a standalone linearization utility that returns tangent-line LinearExpressions — no auxiliary variables. Users combine it with regular add_constraints for inequality bounds. Remove sign parameter, LP method, convexity detection, and all inequality logic from add_piecewise_constraints. The piecewise API now only does equality linking (the core formulation). Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
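Convexity detection, removed here and reintroduced later in this PR for LP dispatch, reduces to checking whether segment slopes are monotone. A minimal sketch, assuming nothing about linopy's actual `_detect_convexity` beyond the concept; the function name, signature, and return convention below are illustrative:

```python
from __future__ import annotations

import numpy as np

# Illustrative sketch of direction-invariant convexity detection for
# breakpoint data. NOT linopy's `_detect_convexity`; names are assumed.
def detect_convexity(x_pts, y_pts) -> str | None:
    order = np.argsort(x_pts)            # sort by x so ascending and
    x = np.asarray(x_pts, float)[order]  # descending inputs describe
    y = np.asarray(y_pts, float)[order]  # the same graph
    slope_diffs = np.diff(np.diff(y) / np.diff(x))
    if np.all(slope_diffs >= 0):
        return "convex"                  # segment slopes non-decreasing
    if np.all(slope_diffs <= 0):
        return "concave"                 # segment slopes non-increasing
    return None                          # mixed curvature: no label
```

Sorting by x first is what makes the label invariant to the input direction.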
More accurate name — the function computes tangent lines per segment, not necessarily a convex/concave envelope. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Single function doesn't justify a separate module. tangent_lines lives next to breakpoints() and segments() — all stateless helpers for the piecewise workflow. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add prominent section explaining the fundamental difference:
- add_piecewise_constraints: exact equality, needs aux variables
- tangent_lines: one-sided bounds, pure LP, no aux variables
- tangent_lines with == is infeasible (overconstrained)
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
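The difference can be seen numerically. A sketch with plain numpy and illustrative data (`tangent_lines` itself returns `LinearExpression`s, not arrays): the per-segment lines of a convex curve all lie at or below the piecewise interpolant, so each works as a one-sided bound, while equating y to every line at once is contradictory.

```python
import numpy as np

# Hypothetical convex curve f(x) = x**2 sampled at breakpoints.
x_pts = np.array([0.0, 1.0, 2.0, 4.0])
y_pts = x_pts**2

# One affine line per segment through consecutive breakpoints.
slopes = np.diff(y_pts) / np.diff(x_pts)
intercepts = y_pts[:-1] - slopes * x_pts[:-1]

x = 1.5                                # an interior test point
line_vals = slopes * x + intercepts    # every line evaluated at x
y_interp = np.interp(x, x_pts, y_pts)  # the piecewise-linear curve at x

# One-sided bound works: every line lies at or below the curve value...
assert np.all(line_vals <= y_interp + 1e-9)
# ...but the lines disagree with each other at x, so demanding
# y == every line simultaneously has no solution (overconstrained).
assert line_vals.max() - line_vals.min() > 0
```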
Replace keyword-only (x=, y=, x_points=, y_points=) and dict-based
(exprs=, breakpoints=) forms with a single tuple-based API:
m.add_piecewise_constraints(
(power, [0, 30, 60, 100]),
(fuel, [0, 36, 84, 170]),
)
2-var and N-var are the same pattern — no separate convenience API.
Internally stacks all breakpoints along a link dimension and uses
a unified formulation path.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The _pwl_var dimension now shows variable names (e.g. "power", "fuel")
instead of generic indices ("0", "1"), making generated constraints
easier to debug and inspect.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
… notebook The piecewise() function was removed but api.rst still referenced it. Also replace xr.concat with breakpoints() in plot cells to avoid pandas StringDtype compatibility issue on newer xarray. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Silences xarray FutureWarning about default coords kwarg changing. No behavior change — we concatenate along new dimensions where coord handling is irrelevant. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add Example 8 (fleet of generators with per-entity breakpoints) to the notebook. Also drop scalar coordinates from breakpoints before stacking to handle bp.sel(var="power") without MergeError. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
for more information, see https://pre-commit.ci
The descriptor API was never released, so for users this is all new. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Reorder: Quick Start -> API -> When to Use What -> Breakpoint Construction -> Formulation Methods -> Advanced Features. Add per-entity, slopes, and N-variable examples. Deduplicate code samples. Fold generated-variables tables into compact lists. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@FabianHofmann This is pretty urgent to me, as the current piecewise API is already on master and should NOT be included in the next release in my opinion. The current code can be reorganized a bit, but I'd like to hear your thoughts about the API first.
this looks cool, well done @FBumann! I will have a look in the afternoon!
FabianHofmann left a comment:
First round of small requests, mostly outside of the piecewise module.
names.update(pwl.variable_names)
names.update(pwl.constraint_names)
piecewise variables and constraints cannot have the same name, right?
Yes, they can — just as regular variables and constraints can share a name. They do not currently, though.
@FBumann just went through the notebook, fantastic work. mind if I push some decorative changes to the notebook?
Of course not. Go ahead. You don't have to ask going forward. If you keep a proper commit history, I can check what you did and complain if I need to :)
The descriptor pattern is gone, so the bare ``# type: ignore`` over-suppressed mypy. ``__eq__`` still needs to declare ``Constraint`` instead of ``bool`` to preserve linopy's expression-equals-builds-constraint semantics, so the override-mismatch is intrinsic — narrow the directive to ``[override]``. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…int_names Use the same key names that ``PiecewiseFormulation`` uses internally (``variable_names``/``constraint_names``) so the netcdf attribute layout matches the dataclass field names. Read path updated to match. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Promote the method and convexity string sets to Literal type aliases and derive the runtime sets from ``get_args``. ``piecewise.py`` now uses the aliases for type annotations on ``add_piecewise_formulation``, ``_detect_convexity``, and ``PiecewiseFormulation.convexity`` instead of inlining the string list at every site. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
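The pattern described here, sketched in isolation. The alias and set names (`PWL_METHOD`, `PWL_METHODS`, `PWL_CONVEXITIES`) come from this PR; the standalone validator below is an illustrative assumption, not linopy code:

```python
from typing import Literal, get_args

# The Literal aliases are the single source of truth for the value sets.
PWL_METHOD = Literal["auto", "sos2", "incremental", "lp"]
PWL_CONVEXITY = Literal["convex", "concave"]

# Runtime sets derived from the aliases via get_args, so annotations and
# runtime validation cannot drift apart.
PWL_METHODS = frozenset(get_args(PWL_METHOD))
PWL_CONVEXITIES = frozenset(get_args(PWL_CONVEXITY))

def check_method(method: str) -> PWL_METHOD:
    # Hypothetical validator showing how the runtime set is consumed.
    if method not in PWL_METHODS:
        raise ValueError(
            f"method must be one of {sorted(PWL_METHODS)}, got {method!r}"
        )
    return method  # type: ignore[return-value]
```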
Move the piecewise summary block out of ``Model.__repr__`` into ``piecewise._repr_summary``; the model now adds the section with a single call. Also drops the ``_repr_filtered`` wrappers on ``Constraints``/``Variables`` — the model calls ``_format_items(exclude=...)`` directly, since that's all the wrappers were doing. The variable and constraint name sets are returned separately from ``_grouped_names`` (variables and constraints live in independent namespaces in the model, so each filter applies to its own collection). Restores the missing docstrings on ``Constraints.__repr__`` and ``Variables.__repr__`` and corrects the wording — they describe the respective container, not the model. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
A single model build often calls ``add_piecewise_formulation`` / ``tangent_lines`` hundreds of times, and each emit produces a multi-line warning that drowns out other output. Dedup per call site via a module-level set keyed by a typed ``_EvolvingApiKey`` literal so each entry point warns at most once per process. Warning text now mentions the once-per-session behaviour so users know they aren't seeing every call. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
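A minimal sketch of the once-per-entry-point dedup described above. `EvolvingAPIWarning` and the `stacklevel=3` detail appear elsewhere in this PR; the helper name and key layout below are illustrative assumptions:

```python
import warnings

class EvolvingAPIWarning(UserWarning):
    """The piecewise API may still change between releases."""

_EMITTED: set[tuple[str, str]] = set()  # module-level dedup registry

def warn_once(entry_point: str, message: str) -> None:
    # Warn at most once per process for each entry point.
    key = ("evolving-api", entry_point)
    if key in _EMITTED:
        return
    _EMITTED.add(key)
    warnings.warn(
        f"{message} (shown once per session)",
        EvolvingAPIWarning,
        stacklevel=3,  # report the user's call site, not this helper
    )
```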
feat: add jupyter and ipykernel to dev extension in installation
8a57437 added ipykernel/jupyter/matplotlib to [dev], which transitively pulled requests in via the jupyter metapackage. test/remote/test_oetc.py guards itself with `pytest.importorskip("requests")` — on master that skipped the whole file, but with requests now resolved the file got collected and every test failed at OetcHandler() instantiation because google-cloud-storage (the other half of the [oetc] extra) is still absent. Result: 23 failures + 14 errors across the entire test matrix. Notebook execution lives in test-notebooks.yml, which already installs [docs] (ipykernel + matplotlib are there). No CI job needs notebook deps in [dev]. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Here is the review from my agents (headings only):
* Critical
* High — Code quality / DRY; Type / API; Tests
* Medium — Code quality; Type / API; Tests
* Low
* Positive observations
* Recommended action plan
Defer to follow-up cleanup PR (Medium + Low): dataclass conversion, constants extraction. I would say nothing of this is really critical, but the recommended points should be easy to address. Perhaps @FBumann you want to go through this and see what of it you find helpful. None of it is a requirement.
…drop sign= migration helper * ``__all__`` in ``linopy/__init__.py`` is now sorted (Constraint…tangent_lines). * ``PiecewiseFormulation.method`` typed as ``PWL_METHOD`` instead of ``str``, matching ``convexity``. * Drop ``**kwargs: object`` from ``add_piecewise_formulation``. The pre-release ``sign=`` migration helper and the catch-all unknown-kwarg error were dev backwards-compat for an unreleased API; Python's default TypeError on unexpected kwargs covers the rest. * Extract ``_user_dims_with_sizes`` to share the dim-collection loop between ``_user_dims`` and ``__repr__``. * Loop the two identical ``BREAKPOINT_DIM`` checks in ``_validate_breakpoint_shapes``. Removes ``test_old_sign_kwarg_raises_with_migration_help`` (covered the removed migration helper). Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Both ``_lp_eligibility`` (auto-dispatch) and ``_try_lp`` (explicit ``method='lp'``) re-implemented the same five checks (n_tuples, is_equality, active, monotonicity, sign+curvature) — the latter just raised instead of returning a reason. ``_try_lp`` now always calls ``_lp_eligibility`` and translates the ``(False, reason)`` result into either an INFO log (auto) or a ``ValueError`` (explicit lp), so adding a new eligibility rule means editing one function instead of two. The raised-error wording is slightly more uniform — the eligibility ``reason`` is now embedded verbatim — but every existing assertion matches. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
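The single-eligibility-function pattern can be sketched as follows. The checker returns `(ok, reason)`; auto-dispatch logs the reason, explicit `method="lp"` raises it. The function names mirror the commit, but the flattened boolean parameters and reason strings are illustrative assumptions:

```python
import logging

logger = logging.getLogger(__name__)

def lp_eligibility(n_tuples: int, all_equality: bool, has_active: bool,
                   monotonic: bool, curvature_ok: bool) -> tuple[bool, str]:
    # The five checks named in the commit, in one place.
    if n_tuples > 2:
        return False, "more than two linked tuples"
    if all_equality:
        return False, "no bounded tuple (pure equality linking)"
    if has_active:
        return False, "active= gating present"
    if not monotonic:
        return False, "breakpoints not monotonic in x"
    if not curvature_ok:
        return False, "sign does not match curve curvature"
    return True, ""

def dispatch(explicit_lp: bool, n_tuples: int, all_equality: bool,
             has_active: bool, monotonic: bool, curvature_ok: bool) -> str:
    ok, reason = lp_eligibility(n_tuples, all_equality,
                                has_active, monotonic, curvature_ok)
    if ok:
        return "lp"
    if explicit_lp:  # user asked for lp explicitly: hard error
        raise ValueError(f"method='lp' not applicable: {reason}")
    logger.info("LP skipped: %s; falling back", reason)
    return "sos2"  # simplified fallback for the sketch
```

Adding a new eligibility rule then means editing only `lp_eligibility`.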
The ``to_netcdf`` writer always emits ``convexity`` (possibly ``None``)
for every persisted ``PiecewiseFormulation``, so the reader's
``d.get("convexity")`` was masking what is actually a required field.
Switching to ``d["convexity"]`` makes the schema mismatch explicit and
matches the fail-fast posture of every other key in the same loop
(``method``, ``variable_names``, ``constraint_names``).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…ility, more netCDF cases Closes review gaps left by #638: * ``TestEvolvingAPIWarning`` — first-call fires, dedup holds across repeats, ``tangent_lines`` and ``add_piecewise_formulation`` dedup independently, ``stacklevel=3`` reports the user's call site. Module-global dedup set is cleared by an autouse fixture so test order doesn't matter. * ``TestPiecewiseFormulationAPI`` — the ``.variables`` / ``.constraints`` properties and ``__repr__`` were essentially unexercised; smoke-tests for the equality and LP shapes (the latter has empty ``variable_names``). * ``TestLPEligibilityReasons`` — direct unit test of ``_lp_eligibility`` with handcrafted ``_PwlInputs``, parametrised over each branch (too many tuples / all equality / active / non-monotonic / wrong-curvature ``<=`` and ``>=``) plus the success path. Uses ``_PwlInputs`` directly so branches that the public-API short-circuits hide are still covered. * ``TestPiecewiseNetCDFRoundtrip`` — parametrised over three shapes: the existing 2-var equality, plus a bounded ``<=``/LP case (empty ``variable_names``) and a 3-var equality (``convexity is None``), so the roundtrip exercises every reachable combination of persisted fields. Asserts the reloaded ``.variables`` / ``.constraints`` properties work, catching any future regression where the model back-reference isn't rebound. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…=True) Replaces the hand-rolled ``__slots__`` + ``__init__`` boilerplate with ``@dataclass(slots=True, repr=False)`` (the custom ``__repr__`` is preserved). Net 17 lines removed. Also renames the back-reference field from ``_model`` to ``model``. The underscore was protective of an attribute that's actually a fine read: ``pwf.model`` is a sensible public access. This also lets the dataclass init signature stay ``model=...`` without aliasing tricks — the two existing callers (``add_piecewise_formulation`` and the netCDF reload in ``io.py``) already used ``model=`` keyword and don't need any change. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@FabianHofmann Seems reasonable. A few comments on things I disagree on:
* Promote ``_pwl_var`` to ``PWL_LINK_DIM`` constant and reuse
``LP_PIECE_DIM`` in ``_add_lp`` instead of recomputing
``f"{dim}_piece"`` — one source of truth for each magic dim name.
* Tighten ``_resolve_sos2_vs_incremental`` and ``_add_continuous`` return
types to ``Literal`` / ``PWL_METHOD`` so the chain producing
``PiecewiseFormulation.method`` is type-checked end-to-end.
* Drop the dead ``raise ValueError(f"unknown method ...")`` branch in
``_resolve_sos2_vs_incremental`` — ``add_piecewise_formulation``
validates against ``PWL_METHODS`` upstream, and ``_try_lp`` already
consumed ``"lp"`` before this is called. Replaced with an ``assert``.
* ``PWL_METHODS`` / ``PWL_CONVEXITIES`` switched to ``frozenset`` to
signal immutability of the value sets.
* ``_strip_nan`` switched to ``arr[~np.isnan(arr)].tolist()`` —
vectorised, type-consistent regardless of input (sequence or
``ndarray``).
* ``_repr_summary`` uses ``len(pwl.variable_names)`` /
``len(pwl.constraint_names)`` instead of ``len(pwl.variables)`` — the
latter constructs a Variables view just to ask its length.
* Link-coord fallback for unnamed expressions is now ``f"_pwl_{i}"``
instead of ``str(i)``, so a user variable named e.g. ``"1"`` can't
collide with the synthetic coord.
* ``variables.py:Variable.__eq__`` ``# type: ignore`` narrowed back to
``# type: ignore[override]`` — the bare form was a regression.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
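The `_strip_nan` change mentioned above, as a self-contained sketch. The core expression is quoted from the commit; the wrapper name and signature are assumptions for illustration:

```python
import numpy as np

def strip_nan(values) -> list[float]:
    # Accepts a sequence or ndarray alike; output type is consistent.
    arr = np.asarray(values, dtype=float)
    return arr[~np.isnan(arr)].tolist()  # drop NaN entries, keep order
```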
…IWarning * ``_make_inputs`` return type narrowed from ``object`` to ``"_PwlInputs"`` (with ``TYPE_CHECKING`` import) so callers don't need ``# type: ignore`` to dispatch on it. * ``test_reason_string`` ``kwargs: dict[str, Any]`` instead of bare ``dict`` — the previous ``# type: ignore[type-arg]`` is unused under the project's mypy config. * ``_reset_dedup`` autouse fixture annotated as ``Generator[None, None, None]`` since it ``yield``s. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
doc: adjust figsize in notebook
Great quality improvements @FabianHofmann 🎉
Summary
Follows up on #602. Replaces the descriptor pattern (`PiecewiseExpression`, `PiecewiseConstraintDescriptor`, `piecewise()`) with a single, stateless construction layer: `add_piecewise_formulation`.

`add_piecewise_formulation` becomes the one entry point for piecewise work — equality, one-sided inequality, N-variable linking, per-entity breakpoints, unit-commitment gating, and automatic method dispatch all live behind one call. `tangent_lines()` survives as a low-level helper for users who want to build chord expressions manually.

The API
Each `(expression, breakpoints[, sign])` tuple links a variable to its breakpoints. All tuples share interpolation weights, coupling them on the same curve piece. The optional 3rd element marks one tuple as bounded by the curve instead of pinned to it.

Equality (default)
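A numeric sketch of the shared-weight equality formulation (plain numpy, not linopy code; the breakpoint numbers are borrowed from the tuple-API example earlier in this PR):

```python
import numpy as np

power_pts = np.array([0.0, 30.0, 60.0, 100.0])
fuel_pts = np.array([0.0, 36.0, 84.0, 170.0])

# One interpolation weight per breakpoint: non-negative, summing to 1,
# with (via the SOS2 condition) at most two adjacent non-zeros.
lam = np.array([0.0, 0.5, 0.5, 0.0])

# Every linked expression is the same weighted combination of its own
# breakpoints, so all land on the same piece of the shared curve:
power = lam @ power_pts  # midpoint of the 30-60 segment
fuel = lam @ fuel_pts    # midpoint of the matching 36-84 segment
assert power == 45.0 and fuel == 60.0
```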
Inequality bounds — per-tuple sign
Append `"<="` or `">="` as a third tuple element to mark one expression as bounded by the curve. The other tuples stay pinned. The bounded tuple need not be first — its role is visible at the call site.

On a matching-curvature curve this dispatches to `method="lp"`: one chord inequality per piece plus a domain bound `x_min ≤ x ≤ x_max`, and no auxiliary variables. Mismatched curvature (convex + `"<="` / concave + `">="`) describes the wrong region (not just a looser one) — `method="auto"` detects this and falls back to SOS2/incremental with an explanatory info log; `method="lp"` raises with a clear error.

Restrictions (current):
"==".The N-input bounded case (e.g.
z >= f(x, y)) is rejected with an error that invites a GitHub issue if anyone has a concrete use case — we don't promise a specific future API shape.Unit-commitment gating —
activeWorks with SOS2, incremental, and disjunctive. With a bounded tuple, deactivation only pushes the signed bound to
0— setlower=0on naturally non-negative outputs (fuel, cost, heat) and the combination forcesy = 0automatically.Low-level:
tangent_linesKey changes
New public API
* `add_piecewise_formulation` — one entry point for all piecewise constructions. Returns a `PiecewiseFormulation` carrying `.method` (resolved formulation), `.convexity` (when well-defined), `.variables`, `.constraints`.
* Per-tuple signs, defaulting to `"=="`. At most one non-equality; 3+ tuples must all be equality.
* `method` parameter — `"auto"` (default), `"sos2"`, `"incremental"`, `"lp"`.
* `active` parameter — binary gating for unit commitment.
* `linopy.breakpoints()` / `linopy.segments()` factories.
* `linopy.slopes_to_points()` utility.
* `linopy.tangent_lines()` low-level helper.

Auto-dispatch (`method="auto"`) resolves to one of:

* `lp` (chord + domain bounds, no aux vars)
* `incremental` (deltas + binaries)
* `sos2` (lambdas + SOS2 constraint)
* `sos2` with binary segment selection

The resolved method is logged at INFO; when LP is skipped, the log explains why (curvature, NaN layout, tuple count, or `active`).

Correctness properties

* `_detect_convexity` is direction-invariant — ascending and descending x give the same label for the same graph.
* … `y ≤ 0` constraints).
* `method="lp"` validates curvature + sign + tuple count + `active` absence upfront; a mismatch raises with a clear pointer to `method="auto"`.
* `PiecewiseFormulation.method` and `.convexity` persist across netCDF round-trip.

Architecture (internal)
* The `_PwlInputs` dataclass categorizes the user's tuples into `pinned_*` and `bounded_*` slots — explicit role assignment, no positional convention. Carries through the whole dispatch chain.
* `_PwlLinks` is the stacked link representation consumed by the SOS2/incremental/disjunctive builders; `_build_links()` produces it from `_PwlInputs`.
* `_try_lp()` / `_resolve_sos2_vs_incremental()` flatten the dispatch — `_add_continuous` is ~10 lines of real logic.
* The bounded tuple is located via `_PwlInputs`, not by assuming position 0.

Removed / deprecated
* `PiecewiseExpression`, `PiecewiseConstraintDescriptor`, `linopy.piecewise()` — descriptor pattern gone.
* The `add_piecewise_constraints` shape that dispatched on `Variable.__le__` / `__ge__` / `__eq__` return types.
* `Variable` operator overloads (`__le__`, `__ge__`, `__eq__`) simplified — no more piecewise dispatch.

Design principles
* Users touch only `Model`, `Variables`, `Constraints`, `Expression`, plus the single `PiecewiseFormulation` return type. No user-facing intermediate state.
* The sign sits on the tuple it applies to (`(y, y_pts, "<=")` ⇒ `y ≤ f(...)`).
* The `_pwl_var` dimension shows `power, fuel, heat` instead of `0, 1, 2`.

Alternative designs considered
A) Descriptor pattern (PR #602, previous master)
Rejected: introduces `PiecewiseExpression` and `PiecewiseConstraintDescriptor` as user-facing state. Breaks the "only Variables, Constraints, Expressions" principle. Structurally limited to 2-variable `x → y`.

B) Dict of expressions + shared breakpoints DataArray
Considered: supports N-variable linking, but requires string keys to match breakpoint coordinates — an extra indirection layer.
C) Global `sign=` keyword (initial implementation, refactored in #664)

The original implementation used a single formulation-level `sign=` kwarg with a positional convention ("the first tuple is the bounded one when `sign != '=='`"). #664 refactored this to a per-tuple sign because the role belongs to a specific tuple — encoding it globally and positionally was a wart that required a paragraph of docs to teach. The internal dispatch was rewritten from positional access to explicit categorization (`_PwlInputs` with `bounded_*` and `pinned_*` slots).

D) Chosen: per-tuple sign on the tuple it applies to
Future work this enables

* Multivariate piecewise (e.g. `z ≥ f(x, y)`). The reserved N≥3-inequality slot reads naturally as the epigraph of a bivariate function: `(z, z_pts, ">="), (x, x_pts), (y, y_pts), triangulation=tris`. Vertex coordinates stay as plain 1-D `DataArray`s; the simplex adjacency rides along as one shared sibling kwarg. `_PwlInputs` would grow a single optional field; the formulation builders for the triangulated path are new but consume the same input carrier.

Deferred features
Out of scope / follow-ups

Directions the design enables, or that were considered but deliberately not shipped in this PR:

* `breakpoints(slopes=..., x_points=..., y0=...)` is shipped today. What's deferred is letting slopes inherit the x grid from a sibling tuple (e.g. `(fuel, Slopes([1.2, 1.4, 1.7], y0=0))` paired with a reference tuple that carries the x grid). Two viable shapes were discussed — extending `breakpoints(slopes=...)` to allow an omitted `x_points`, or introducing a small frozen-dataclass `Slopes` value type — each with tradeoffs around type-system visibility and dispatch clarity. `EvolvingAPIWarning` covers either direction as a non-breaking follow-up.

Notebook examples
* `examples/piecewise-linear-constraints.ipynb` (main tutorial): the returned `pwf`; `auto` vs `sos2` vs `incremental` give the same optimum; `segments()` for gaps in operation; a `"<="` bound (auto-dispatches LP); the `active` parameter.
* `examples/piecewise-inequality-bounds.ipynb` (deep-dive on per-tuple sign / LP): the cases where `method="lp"` raises.

Formulation math lives in the reference page, not in the notebook.
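What the LP dispatch builds for a matching-curvature bound (convex data with a `">="`-style bound) can be checked numerically. A sketch with assumed breakpoint data, independent of linopy: one chord inequality per piece plus the domain bound describes exactly the epigraph of the piecewise-linear curve.

```python
import numpy as np

# Assumed convex breakpoint data (slopes 1.2, 1.6, 2.15 are increasing).
x_pts = np.array([0.0, 30.0, 60.0, 100.0])
y_pts = np.array([0.0, 36.0, 84.0, 170.0])

slopes = np.diff(y_pts) / np.diff(x_pts)
intercepts = y_pts[:-1] - slopes * x_pts[:-1]

def lp_feasible(x: float, y: float) -> bool:
    """y >= every chord line, with x inside the breakpoint domain."""
    in_domain = x_pts[0] <= x <= x_pts[-1]
    return bool(in_domain and np.all(y >= slopes * x + intercepts - 1e-9))

# For convex data the chords' maximum reproduces the interpolant, so
# these pure-LP rows carve out exactly the region above the curve:
for x in [0.0, 15.0, 45.0, 100.0]:
    f = float(np.interp(x, x_pts, y_pts))
    assert lp_feasible(x, f)            # points on the curve: feasible
    assert not lp_feasible(x, f - 1.0)  # points below it: cut off
```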
Test plan

* `pytest test/test_piecewise_constraints.py test/test_piecewise_feasibility.py` — 518 passed
* `PiecewiseFormulation.method` and `.convexity` persist across `to_netcdf` / `read_netcdf`

🤖 Generated with Claude Code