It is important not to overstress the role of the flow loop. Physical simulation was necessary, but far from sufficient, for A1’s discovery. The model-building process is filled with choices: which mathematical representation to use, which modeling framework to adopt, whether mechanistic or agent-based, and whether to work with ordinary or partial differential equations. Fitting assumptions of all kinds are made during parameter estimation and optimization, and various diagnostics are run on the fitted model until the researchers feel they have something stable enough to count as an adequate representation of the phenomena. Once they have a final, stable fitted model, they can use it for exploration and possibly for control. Thus, building the model is primarily a task of managing complexity.
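To give a concrete flavor of those choices, here is a minimal sketch, assuming a mechanistic framework, ordinary differential equations, and an invented two-step pathway; the species, rate laws, data, and numbers are hypothetical rather than taken from A1’s model. The unknown rate constants are estimated by least-squares fitting, one simple form of the parameter estimation and optimization just described.

```python
# Hypothetical mechanistic model: substrate S converts to product P,
# which then degrades. Rate constants k1, k2 are fitted to noisy data.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def pathway(t, y, k1, k2):
    s, p = y
    ds = -k1 * s              # substrate consumed
    dp = k1 * s - k2 * p      # product formed, then degraded
    return [ds, dp]

def simulate(params, t_obs, y0=(1.0, 0.0)):
    sol = solve_ivp(pathway, (t_obs[0], t_obs[-1]), y0,
                    args=tuple(params), t_eval=t_obs)
    return sol.y[1]           # predicted product concentration

def residuals(params, t_obs, p_obs):
    return simulate(params, t_obs) - p_obs

# Stand-in (noisy) measurements over a timescale of hours.
rng = np.random.default_rng(1)
t_obs = np.linspace(0, 10, 20)
p_obs = simulate([0.8, 0.3], t_obs) + rng.normal(0, 0.02, t_obs.size)

fit = least_squares(residuals, x0=[1.0, 1.0], args=(t_obs, p_obs),
                    bounds=(0, np.inf))
print("estimated rate constants:", fit.x)
```

A real model would, of course, be followed by diagnostics on the fitted parameters before anyone treated it as stable.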
I call this way of building a model a “bird’s nesting process,” because what a bird does is grab anything it
possibly can to make a stable platform for its eggs and its chicks. In this particular case, A1
had no steady-state data, no rate-constant data on the enzyme reactions, and she had to make
lots of assumptions, such as that the timescale was hours. So she grabbed anything she could,
making judgments about its reasonableness along the way, in order to create a stable model.
She made simplifying assumptions; she chose a modeling framework and a mathematical framework; she had to decide on the theoretical elements to go into the model, such as the kinetic orders and the effects of cell boundaries, and then carry out parameter estimation; and she had to conduct sensitivity analysis. In this case, she had to consider three possible additions to one piece of the model, using Monte Carlo sampling to see which one made the best case. In the end, she built a stable, robust model that allowed the lab to deal with and control the phenomena they wanted to investigate.
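What the Monte Carlo step might look like in practice can be sketched as follows; the three candidate terms, the parameter ranges, and the stand-in measurements are all hypothetical, chosen only to show the pattern: each candidate addition is scored against the data under many random parameter draws, and the variant with the lowest average misfit is kept.

```python
# Hypothetical comparison of three candidate additions to one piece of a model
# by Monte Carlo sampling over uncertain parameters.
import numpy as np

rng = np.random.default_rng(0)

def score(candidate, params, t_obs, y_obs):
    """Sum-of-squares misfit of one candidate under one parameter draw."""
    k, n = params
    if candidate == "linear":          # addition 1: first-order term
        pred = 1.0 - np.exp(-k * t_obs)
    elif candidate == "saturating":    # addition 2: Michaelis-Menten-like term
        pred = t_obs / (k + t_obs)
    else:                              # addition 3: cooperative (Hill) term
        pred = t_obs**n / (k**n + t_obs**n)
    return np.sum((pred - y_obs) ** 2)

t_obs = np.linspace(0.1, 10, 25)
y_obs = t_obs**2 / (4.0 + t_obs**2)    # stand-in for the lab's measurements

results = {}
for candidate in ("linear", "saturating", "cooperative"):
    draws = [score(candidate, (rng.uniform(0.1, 5), rng.uniform(1, 4)),
                   t_obs, y_obs) for _ in range(2000)]
    results[candidate] = np.mean(draws)

best = min(results, key=results.get)
print(results, "-> best candidate:", best)
```

A sensitivity analysis of the chosen variant would follow a similar pattern, perturbing one parameter at a time and checking how much the fit degrades.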
This was a case in which A1 had very limited data, but not every case is like that. The
best-case scenario is when the engineers have a lot of robust data and are able to build a
computational simulation that enacts the phenomena, enabling them to make predictions and
to have the kind of control that they want. Another case involved lignin, a polymer that keeps
plants rigid. The ability to break down lignin would be valuable in biofuels production. A
modeler, A2, was given this question: Could we understand the lignin pathway model well enough to tweak it somewhere so that lignin breaks down more readily, yielding a better biofuel?
However, unlike in the previous case where A1 had to build the model nearly from scratch,
A2 was given reams and reams of data by the benchtop scientists. Furthermore, there was
already an accepted pathway to work with, so A2 was only expected to make small changes.
A2 built the initial model and added a few feedback loops. Nevertheless, he was
having trouble getting the model to work. The only explanation he could find was that there
was an entire component left out of the accepted pathway. Because A2 was not a biologist, he
had no idea what this component could possibly be, so he called it X. A2 had to wait six
months before he was able to do his final modeling because the biologists did not cooperate
with him sufficiently. He finally got their attention when he convincingly showed them that
something new needed to be incorporated into the lignin pathway, which of course was a
major discovery and far from the minor tweaking they had initially expected. So the
biologists rushed to do their experiments and discovered that it was salicylic acid that needed
to be added.
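The shape of A2’s inference, that the data could not be reproduced without an extra component, can be sketched roughly like this; the equations, the feedback term on X, and the stand-in measurements are invented for illustration and are not the lignin pathway: the accepted model and a variant with an unnamed species X are simulated and compared on how badly each misses the data.

```python
# Hypothetical pathway model A -> B, with and without an extra species X.
import numpy as np
from scipy.integrate import solve_ivp

def accepted(t, y, k1, k2):
    a, b = y
    return [-k1 * a, k1 * a - k2 * b]

def with_x(t, y, k1, k2, k3):
    a, b, x = y
    return [-k1 * a,                         # A consumed as before
            k1 * a - k2 * b - k3 * b * x,    # B also removed by feedback from X
            k2 * b - 0.1 * x]                # X produced from B, slowly degraded

def misfit(rhs, y0, params, t_obs, b_obs):
    sol = solve_ivp(rhs, (t_obs[0], t_obs[-1]), y0, args=params, t_eval=t_obs)
    return np.sum((sol.y[1] - b_obs) ** 2)

t_obs = np.linspace(0, 20, 30)
b_obs = 0.6 * np.exp(-0.15 * t_obs) * np.sin(0.4 * t_obs) ** 2  # stand-in data

print("accepted pathway misfit:",
      misfit(accepted, (1.0, 0.0), (0.5, 0.2), t_obs, b_obs))
print("pathway + X misfit:",
      misfit(with_x, (1.0, 0.0, 0.0), (0.5, 0.2, 0.8), t_obs, b_obs))
```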