The topic of the interplay between the "expected" and "emergent" behavior of the system reminds me a lot of Dijkstra's famous GOTO essay. The point he made in that essay was that GOTO was harmful because it prevented you from making inferences about the dynamic behavior of the program from its static syntax, whereas structured programming allowed you to relate the dynamic state of the system to a "coordinate system" derived from the control structures.
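
To make that "coordinate system" idea concrete, here is a minimal Python sketch (my own illustration, not an example from the essay; render_grid and draw_cell are hypothetical names): with structured control flow, the dynamic progress of the program is pinned down by the textual position plus the values of the enclosing loop indices, with no need to replay the execution history.

    # Illustrative sketch: in structured code, "where the program is" can be
    # described by a coordinate: the current line plus the index of each
    # enclosing loop.
    def draw_cell(r, c):
        print(f"cell ({r}, {c})")

    def render_grid(rows, cols):
        for r in range(rows):          # coordinate component 1: r
            for c in range(cols):      # coordinate component 2: c
                draw_cell(r, c)        # dynamic state == (this line, r, c)

    render_grid(2, 3)

    # With unrestricted GOTO, the textual position alone tells you much less:
    # control could have arrived from anywhere, so you would need the whole
    # execution trace to know what has already happened.
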
I think that appeals to human intuition are often misplaced when it comes to this aspect of code specifically, because code complexity is subject to exponential blowup. No matter how well you represent a boolean expression in terms of naming, code formatting, or organization, there is the fundamental fact that boolean satisfiability is NP-complete: working out which combinations of inputs can make a complicated condition true is, in general, a satisfiability question, and at some point your brain will be unable to scale that cliff. This remark doesn't apply to code that isn't of an NP-complete flavor, like, say, a flat list of drawing instructions.
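
A toy brute-force satisfiability check makes the cliff visible (the satisfiable helper and the example condition below are purely illustrative, not drawn from any real codebase): even a machine has to consider up to 2**n assignments in the worst case, which is the same explosion a reader faces when trying to hold every case of a complicated condition in their head.

    from itertools import product

    def satisfiable(condition, names):
        """Brute-force check: try every True/False assignment of the named
        variables. The loop runs up to 2**len(names) times, which is the
        exponential blowup in question."""
        for values in product([False, True], repeat=len(names)):
            if condition(dict(zip(names, values))):
                return True
        return False

    # Three variables already mean 8 cases; 30 variables would mean over a billion.
    cond = lambda v: (v["a"] or v["b"]) and (not v["a"] or v["c"]) and not v["b"]
    print(satisfiable(cond, ["a", "b", "c"]))  # True, e.g. a=True, b=False, c=True
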
I think "composability" is what is important for code, but "composability" is a somewhat vague word. More precisely, there should be a way to determine specific properties of the system by systematic logical deduction from the components of the system, and without simulating the whole dynamic trace of the system's execution. Composability can only be defined relative to the properties you are interested in: SQL is "composable" if you want to know if an element is in the relation defined by the SQL statement, but is not so "composable" if you want to know how long the statement will take to run.