## Two supply optimization problems

### October 28, 2013

1. Computer program power consumption
A programming language that “minimizes” power consumption by minimizing interconnect usage (e.g. memory accesses).
2. Food sourcing power consumption
Farmland supply to cities: how can land usage be optimized? What share of the produce can be made local, e.g. grown at the consumer’s end, or converted to hydroponics and its culture brought within the city itself?

Both these problems require a grammar of solutions, rather than a single instance, due to the diversity of the operating/boundary conditions that are encountered.
As such, I don’t think that a “proof of correctness” for either can be hoped for, but perhaps a number of heuristic checks might prove the point.
The former is addressed by a single technology, whereas the latter requires a diverse array of strategies.

General considerations

• Area and land usage
Arbitrary rearrangement of the resources is not trivial: CPUs are designed with CAD tools that favor periodicity and reuse, and farmland restricts supply due to physiological productivity/rest cycles.
• Time and flow
Time plays a part as well: the edges in these supply nets do not carry a constant flow. In the first case, storage is regulated by registers, queues and stacks; in the second, the flowing entities are subject to seasonal variation, degrade with time, etc.

This framework is intentionally generic in order to highlight similarities, and it is of course a work in progress.
Both these problems in fact have broad political implications, which leaves plenty of space for many juicy discussions. Looking forward.

Literature

1. An article from the NYT: A Balance Between the Factory and the Local Farm (Feb. 2010) highlights both the high costs of local (i.e. small-scale) green production, citing The \$64 Tomato, and the related climatic issues (e.g. cultivation on terrain located in the snow belt).
The article closes with “Localism is difficult to scale up enough to feed a whole country in any season. But on the other extreme are the mammoth food factories in the United States. Here, frequent E. coli and salmonella bacteria outbreaks […] may be a case of a manufacturing system that has grown too fast or too large to be managed well.
Somewhere, there is a happy medium.” — an optimum, if you will.

Side questions

• Why do large-scale economies “work better”, i.e. have a higher monetary efficiency that drives down prices for the end user? A more effective supply chain, waste minimization, minimization of downtime …

## Coffee, graphs and musings on scientific education

### October 3, 2013

Good morning!

Today I kick off with some more Algorithms basic theory. Tim Roughgarden, the instructor, has a precise yet captivating delivery style (not too informal, not too nerdy), which makes this worthwhile.
In general, I noticed the same pattern during all of my studies: the instructor has a large responsibility in keeping the students’ heads up, in both senses.
The presentation of even the most formal of subjects has to be enriched by “refreshing” elements. For example, my Calculus 1, 2, 3 AND 4 teacher, the never-enough-celebrated G. C. Barozzi, had a way of embedding anecdotes and biographical facts about mathematicians. This both put the subject in historical context, making it feel less abstract, and relieved the audience from continued focus. Seems too nerdy still? You’ve found the wrong blog then! Ha.

I can agree that some subjects might feel a bit “dogmatic”, as formal proof of all their aspects would require a disproportionate effort from a large part of the audience. In this case, it is up to the lecturer to select the more illuminating material beforehand: Longum iter est per praecepta, breve et efficax per exempla (“the way is long through precepts, short and effective through examples”, Seneca)

So, contraction algorithms for the MINCUT problem for graphs: given a graph $G = (V, E)$, with nodes $V$ and edges $E$, what is the smallest set of edges $\tilde{E}$ whose removal divides $G$ into two otherwise disconnected subsets of nodes $V_1$ and $V_2$ (i.e. the smallest cutset)?

Random contraction (Karger’s algorithm)
Sequentially pick edges uniformly at random and contract them, merging the two endpoints $v_i, v_j \rightarrow v_i$ (“edge contraction”) and rerouting to $v_i$ all the edges initially connected to $v_j$ (discarding the self-loops this creates).
The set of edges that remains when only two (super-)nodes are left is a cut of $G$, and thus a candidate solution to MINCUT.
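The contraction procedure can be sketched as follows; the edge-list graph representation and the union-find bookkeeping are implementation choices of mine, not part of the original description:

```python
import random


def karger_min_cut(edges, rng=None):
    """One run of random contraction on an undirected multigraph.

    `edges` is a list of (u, v) pairs; returns the size of the cut found.
    """
    rng = rng or random.Random()
    # Union-find tracks which super-node each original node belongs to.
    nodes = {u for e in edges for u in e}
    parent = {v: v for v in nodes}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    remaining = len(nodes)
    while remaining > 2:
        u, v = rng.choice(edges)
        ru, rv = find(u), find(v)
        if ru == rv:
            continue          # self-loop in the contracted graph: skip it
        parent[rv] = ru       # contract: merge the two super-nodes
        remaining -= 1

    # Edges whose endpoints lie in different super-nodes form the cut.
    return sum(1 for u, v in edges if find(u) != find(v))
```

Since a single run finds a minimum cut only with some probability, in practice it is repeated many times and the smallest cut found is kept, e.g. `min(karger_min_cut(edges) for _ in range(100))`.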

*(Figure: a single run of Karger’s algorithm, adapted from Wikipedia)*

If $N_E = |E|$ is the edge set size, there are up to $N_E!$ orderings in which the edges can be picked, although a run performs only $|V| - 2$ contractions. Only a subset of the possible contraction sequences yields the true MINCUT (i.e. the outcome is path-dependent).
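For context, the standard analysis (a well-known result, not derived here) quantifies how often a sequence succeeds: a single run returns any fixed minimum cut with probability at least

$$P(\text{success}) \;\ge\; \binom{n}{2}^{-1} = \frac{2}{n(n-1)}, \qquad n = |V|,$$

so repeating the contraction $T = \binom{n}{2}\ln n$ times independently and keeping the smallest cut found fails with probability at most

$$\left(1 - \binom{n}{2}^{-1}\right)^{T} \;\le\; e^{-\ln n} = \frac{1}{n}.$$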