Fix numpy and scipy incompatibilities #149

Merged: 3 commits, Jun 27, 2024
10 changes: 6 additions & 4 deletions docs/source/mods/bipartite-matching.rst
@@ -182,10 +182,12 @@ Solution
.. doctest:: bipartite_matching_sp
:options: +NORMALIZE_WHITESPACE

>>> print(sp.triu(matching))
(0, 7) 1.0
(1, 6) 1.0
(3, 5) 1.0
>>> upper = sp.triu(matching)
>>> for edge, value in zip(zip(*upper.coords), upper.data):
... print(f"{edge}: {value}")
(0, 7): 1.0
(1, 6): 1.0
(3, 5): 1.0

.. group-tab:: networkx

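Note on the doctest change above: newer SciPy versions changed how sparse arrays print, so the example now iterates the nonzero entries explicitly. A minimal standalone sketch of the same pattern, assuming scipy >= 1.13 where coo_array exposes the .coords tuple of index arrays:

# Sketch (not part of the mod): iterate the nonzero entries of a COO array.
import scipy.sparse as sp

matching = sp.coo_array(([1.0, 1.0, 1.0], ([0, 1, 3], [7, 6, 5])), shape=(8, 8))
upper = sp.triu(matching)  # keep only the upper-triangular entries
for edge, value in zip(zip(*upper.coords), upper.data):
    print(f"{edge}: {value}")
# Expected output, as in the doctest above:
# (0, 7): 1.0
# (1, 6): 1.0
# (3, 5): 1.0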
38 changes: 17 additions & 21 deletions docs/source/mods/max-flow-min-cut.rst
@@ -126,17 +126,15 @@ An example of these inputs with their respective requirements is shown below.
>>> from gurobi_optimods import datasets
>>> G, capacities, _, _ = datasets.simple_graph_scipy()
>>> G.data = capacities.data # Copy capacity data
>>> G
<5x6 sparse array of type '<class 'numpy.int64'>'
with 7 stored elements in COOrdinate format>
>>> print(G)
(0, 1) 2
(0, 2) 2
(1, 3) 1
(2, 3) 1
(2, 4) 2
(3, 5) 2
(4, 5) 2
>>> for edge, value in zip(zip(*G.coords), G.data):
... print(f"{edge}: {value}")
(0, 1): 2
(0, 2): 2
(1, 3): 1
(2, 3): 1
(2, 4): 2
(3, 5): 2
(4, 5): 2

We only need the adjacency matrix for the graph (as a sparse array) where
each entry contains the capacity of the edge.
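If it helps to see that input format concretely, here is a minimal sketch that rebuilds the same 5x6 capacity matrix by hand; the values are copied from the printout above, and datasets.simple_graph_scipy() already provides this data, so the sketch is illustration only:

import numpy as np
import scipy.sparse as sp

# Edge list and capacities taken from the example above.
rows = np.array([0, 0, 1, 2, 2, 3, 4])
cols = np.array([1, 2, 3, 3, 4, 5, 5])
caps = np.array([2, 2, 1, 1, 2, 2, 2])
G = sp.coo_array((caps, (rows, cols)), shape=(5, 6))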
@@ -204,16 +204,14 @@ Let us use the data to solve the maximum flow problem.
>>> obj, sol = max_flow(G, 0, 5, verbose=False)
>>> obj
3.0
>>> sol
<5x6 sparse array of type '<class 'numpy.float64'>'
with 6 stored elements in COOrdinate format>
>>> print(sol)
(0, 1) 1.0
(0, 2) 2.0
(1, 3) 1.0
(2, 4) 2.0
(3, 5) 1.0
(4, 5) 2.0
>>> for edge, value in zip(zip(*sol.coords), sol.data):
... print(f"{edge}: {value}")
(0, 1): 1.0
(0, 2): 2.0
(1, 3): 1.0
(2, 4): 2.0
(3, 5): 1.0
(4, 5): 2.0
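As a quick sanity check (a sketch, not part of the mod's API), the flow leaving the source should match the reported objective:

# Sketch: total flow out of source node 0 equals the max-flow value.
# sol is the COO sparse array returned above; coords[0] holds the row
# (tail node) of each stored entry.
out_of_source = sol.data[sol.coords[0] == 0].sum()
# 1.0 + 2.0 == 3.0 == obj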

The ``max_flow`` function returns the value of the maximum flow as well as a
sparse array with the amount of non-zero flow in each edge in the
70 changes: 34 additions & 36 deletions docs/source/mods/min-cost-flow.rst
@@ -88,33 +88,33 @@ An example of these inputs with their respective requirements is shown below.

>>> from gurobi_optimods import datasets
>>> G, capacities, cost, demands = datasets.simple_graph_scipy()
>>> G
<5x6 sparse array of type '<class 'numpy.int64'>'
with 7 stored elements in COOrdinate format>
>>> print(G)
(0, 1) 1
(0, 2) 1
(1, 3) 1
(2, 3) 1
(2, 4) 1
(3, 5) 1
(4, 5) 1
>>> print(capacities)
(0, 1) 2
(0, 2) 2
(1, 3) 1
(2, 3) 1
(2, 4) 2
(3, 5) 2
(4, 5) 2
>>> print(cost)
(0, 1) 9
(0, 2) 7
(1, 3) 1
(2, 3) 10
(2, 4) 6
(3, 5) 1
(4, 5) 1
>>> for edge, value in zip(zip(*G.coords), G.data):
... print(f"{edge}: {value}")
(0, 1): 1
(0, 2): 1
(1, 3): 1
(2, 3): 1
(2, 4): 1
(3, 5): 1
(4, 5): 1
>>> for edge, value in zip(zip(*capacities.coords), capacities.data):
... print(f"{edge}: {value}")
(0, 1): 2
(0, 2): 2
(1, 3): 1
(2, 3): 1
(2, 4): 2
(3, 5): 2
(4, 5): 2
>>> for edge, value in zip(zip(*cost.coords), cost.data):
... print(f"{edge}: {value}")
(0, 1): 9
(0, 2): 7
(1, 3): 1
(2, 3): 10
(2, 4): 6
(3, 5): 1
(4, 5): 1
>>> print(demands)
[-2 0 -1 1 0 2]
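A quick consistency check on the inputs (a sketch, not part of the mod): total supply has to balance total demand, so the demands vector sums to zero. In this example the negative entries (nodes 0 and 2) act as net suppliers and the positive entries (nodes 3 and 5) as consumers:

# Sketch: supply and demand must balance for a feasible min-cost flow.
assert demands.sum() == 0   # -2 + 0 - 1 + 1 + 0 + 2 == 0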

@@ -206,15 +206,13 @@ formats.
>>> obj, sol = min_cost_flow_scipy(G, capacities, cost, demands, verbose=False)
>>> obj
31.0
>>> sol
<5x6 sparse array of type '<class 'numpy.float64'>'
with 5 stored elements in COOrdinate format>
>>> print(sol)
(0, 1) 1.0
(0, 2) 1.0
(1, 3) 1.0
(2, 4) 2.0
(4, 5) 2.0
>>> for edge, value in zip(zip(*sol.coords), sol.data):
... print(f"{edge}: {value}")
(0, 1): 1.0
(0, 2): 1.0
(1, 3): 1.0
(2, 4): 2.0
(4, 5): 2.0
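The objective can be cross-checked against the data above (a sketch, not part of the mod): summing cost times flow over all edges reproduces 31.0.

# Sketch: verify obj via the elementwise product of costs and flows.
# cost and sol are the sparse arrays from this example; multiply()
# performs elementwise multiplication of sparse arrays.
total_cost = sol.multiply(cost).sum()
# 1*9 + 1*7 + 1*1 + 2*6 + 2*1 == 31.0 == obj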

The ``min_cost_flow_scipy`` function returns the cost of the solution as
well as a ``sp.sparray`` that provides the amount of flow for each
3 changes: 2 additions & 1 deletion src/gurobi_optimods/min_cut.py
@@ -205,10 +205,11 @@ def _min_cut_scipy(G, source, sink, create_env):
return MinCutResult(0.0, (set(), set()), set())

queue = [source]
G = G.tocsr()
while len(queue) > 0:
node = queue.pop()
p1.add(node)
row = G.getrow(node)
row = G[[node]]
# Add successors of `node` that are not in the cutset
queue.extend(
[
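For context on replacing G.getrow(node) with G[[node]]: the SciPy sparse array classes do not provide the getrow method that the sparse matrix classes had, but indexing a CSR array with a one-element list returns the row as a sparse (1, n) array. A minimal sketch, independent of the mod:

import scipy.sparse as sp

A = sp.csr_array([[0, 2, 0], [1, 0, 3], [0, 0, 0]])
row = A[[1]]                   # sparse 1x3 row for node 1, stays sparse
successors = row.nonzero()[1]  # columns with nonzero entries: array([0, 2])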
2 changes: 1 addition & 1 deletion tests/test_bipartite_matching.py
@@ -34,7 +34,7 @@ def assert_is_unweighted_matching(self, matching):
assert_allclose(matching.data, np.ones(matching.data.shape))
adj = matching.todense()
assert_allclose(adj, adj.T)
self.assertTrue(np.alltrue(adj.sum(axis=0) <= 1))
self.assertTrue(np.all(adj.sum(axis=0) <= 1))

def test_empty(self):
# Matching of an empty graph is empty
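For reference, np.alltrue was removed in NumPy 2.0, and np.all is the drop-in replacement used in the test above. A trivial sketch with hypothetical data:

import numpy as np

col_sums = np.array([1, 0, 1, 1])  # hypothetical column sums of a matching
assert np.all(col_sums <= 1)       # each node is matched at most once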