LAWRENCE — If your car has an electronic parking brake, you’re asking for trouble.
“There are measures we should build into the design of computerized systems,” said John Symons, professor of philosophy at the University of Kansas. “A manual parking brake is a very simple example of the kind of fail-safe that needs to be included by creators of software-intensive consumer products.”
That concept is one of many ethical and practical matters explored by Symons and co-author Jack Horner in “Why There is no General Solution to the Problem of Software Verification.” Their new article appears in the current issue of Foundations of Science, a leading journal that probes fundamental questions about the nature of science.
Symons, who has researched this topic since 2014, argues in the paper that complete software verification is an unrealistic and unattainable goal. As a result, treating any software as fully reliable proves problematic.
Continuing with the automobile analogy, Symons cites the accelerator pedal recall that plagued Toyota a decade ago.
“Toyota was criticized for not ensuring their software was error-free. And their response was, ‘We did everything that is customarily expected of a builder of a piece of software, etc.’ Our results point to the idea that those expectations are beyond anyone’s capacity to achieve. The corporation simply can’t ensure the error-free nature of large-scale software,” he said.
A related challenge involves the philosophical considerations inherent to technology. Symons explains that when characterizing computers or programs, we often treat them as abstract objects. The actual computers in our daily lives, however, are physical objects that do the job of computing some function.
“But consider the notion of function. What is a function? Is it a mathematical notion? And what is it to compute a function? Those aren’t questions an engineer would normally need to pin down. From my view, physical computers are basically doing a good-enough job for their target audience or target market,” he said. “Knowing what counts as ‘good enough’ or what it means to ‘meet expectations’ drags engineering into the domain of normative questions.”
As in the plot of “The Terminator” or many “Star Trek” episodes, relying on technology without factoring in a moral principle often results in unexpected consequences.
Symons said, “If we think about the ethical responsibilities that corporations have with respect to the safety of consumers, or the safety of innocent bystanders, we tend to assume they have some responsibility for ensuring that their systems are free of error. But maybe it’s better to recognize that we have to live with error and to design fail-safe approaches to ensuring it will be possible for human agents to take control of ultimately unreliable technological systems.”
Another finding from Symons’ research is that fixing mistakes creates new vulnerabilities. (“There may be tradeoffs where repairing error is actually more costly in terms of new error than leaving a broken system,” he said.) And although model-based verification experts have developed provably correct pieces of software, those successes are limited to modestly sized programs that don’t scale.
“We can’t get that agenda scaled up to the point where they’ll play a meaningful role in ordinary commercial, consumer electronics. We’ve had a tradeoff. And the tradeoff is the presence of error in these technologies,” he said.
A native of Cork, Ireland, Symons is in his sixth year at KU. An expert in metaphysics and epistemology of science and philosophy of psychology, he has written or edited 12 books, including “Formal Reasoning: A Guide to Critical Thinking” (2017), “The Architecture of Cognition: Rethinking Fodor and Pylyshyn’s Systematicity Challenge” (2014) and “Daniel Dennett: Critical Assessments of Leading Philosophers” (2014).
His co-author, Jack Horner, is a philosopher and longtime veteran of the software engineering industry.
“Jack’s practical experience serves as a sort of sanity check over our theoretical speculations,” said Symons, who is currently writing a book with Horner that expands on this article.
“Why There is no General Solution to the Problem of Software Verification” confirms what many who work in the tech industry have long suspected, Symons said. But demonstrating it formally represents significant progress on the issue.
“We think that’s important, primarily for the ethics of technology,” Symons said. “Ultimately, we’re saying certain kinds of demands are unrealizable. The desire for complete elimination of error can never be satisfied.”