Years ago I was part of a forum discussing the use of technology in navigation. A question came up: what technology did we panelists think contributed most to modern navigation?

This was in the early 1990s, and many of my colleagues weighed in about the wondrous new technology of the Global Positioning System (GPS) with its sub-yardstick accuracy. When my turn came to speak, I said simply, “The Marine Chronometer.”

In other words, a clock.

Navigation on our globe involves determining both one’s latitude and one’s longitude. Figuring your latitude is easy: wait for local noon and measure the Sun’s angle above the horizon. Subtract that from ninety degrees, adjust for the season (the Sun’s declination), and you have your latitude.
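
In code, that noon sight is only a couple of lines. What follows is a minimal sketch, assuming the observer is north of the Sun and skipping the small corrections (refraction, height of eye, the Sun’s semidiameter) a real navigator would apply; the function name is mine, for illustration only.

```python
# Noon-sight latitude: the Sun's zenith distance plus its declination.
# Assumes the observer is north of the Sun; skips refraction, height of eye,
# and the other small corrections a real sight reduction would include.
def latitude_from_noon_sight(sun_altitude_deg: float, sun_declination_deg: float) -> float:
    zenith_distance = 90.0 - sun_altitude_deg        # Sun's angle from straight overhead
    return zenith_distance + sun_declination_deg     # degrees north of the equator

# Example: the Sun peaks 38 degrees above the horizon on a day when its
# declination is -10 degrees, putting the observer at about 42 degrees north.
print(latitude_from_noon_sight(38.0, -10.0))  # 42.0
```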

Determining longitude is also easy, in principle. The Earth rotates fifteen degrees an hour, so if you know what time it is at a fixed reference location at the moment the Sun is highest at your location, then the difference between that time and noon, in hours, multiplied by fifteen, gives your longitude in degrees.
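
Here is a minimal sketch of that arithmetic, assuming the clock keeps the time of the fixed reference location and ignoring the equation of time; the function name is just for illustration.

```python
# The clock trick: note the reference-location time at the moment of your local noon.
# Every hour of difference from the reference location's noon is fifteen degrees.
def longitude_from_local_noon(reference_hours_at_local_noon: float) -> float:
    hours_past_reference_noon = reference_hours_at_local_noon - 12.0
    return hours_past_reference_noon * 15.0   # positive = degrees west of the reference

# Example: the Sun is highest over the ship when the reference clock reads 16:20,
# which works out to roughly 65 degrees west of the reference
# (roughly Bermuda's longitude, if the reference happens to be Greenwich).
print(longitude_from_local_noon(16.0 + 20.0 / 60.0))  # ~65.0
```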

Knowing what time it is, well, that’s the hard part. You need a really accurate clock, and we didn’t have clocks that accurate until the eighteenth century, when the British invented them. Since the British also ruled the seas at the time, guess what? They got to pick the “fixed location” that serves as the reference for high noon: Greenwich, England.

Greenwich Mean Time. GMT. Zulu. Patient Zero in the longitudinal study of the Earth’s circumference.

The accurate marine chronometer meant the difference between sailing from London for Bermuda and actually landing at Bermuda rather than in Brazil. With the clock, maritime navigation instantly turned from a game of chance into something that could reliably put a ship within a mile of its intended destination, regardless of distance traveled or weather.

GPS improved on that a bit, but not much. A clock that gets you within a mile of your destination is really something. A GPS receiver that gets you within a couple of inches of the destination doesn’t really help much over that.

But it is much more expensive. 99% accuracy is relatively cheap; 99.9% accuracy is really expensive. That’s the law of diminishing returns. All those rockets, all those satellites, all that infrastructure and cost gain you something, but really, not much.

When my wife fell ill, the constant refrain was “Something’s going on but we don’t know what. Don’t worry, it’s not serious, but just in case let’s do another MRI/ECG/CT-Scan” (take your pick). From California to Ohio, clinic after clinic and hospital after hospital scratched their collective heads and said “Not sure, but let’s do some more imaging.”

She had image after image taken. No doctor, no radiologist, no clinician could ever make heads or tails of what they saw. And these were the experts.

It turns out that all this machinery of modern medicine is really good at showing us the obvious. We can palpate a tumor and we know what that feels like. We can X-ray a tumor and we know what that looks like. We can CT-scan a tumor: same thing.

When physicians first laid hands on their patients, it was in an attempt to see into the human body without having to cut it open. Ever since then doctors have been seeking newer and better ways of seeing into the body, of looking without invasion, in the hopes of doing a better job of diagnosis.

It’s not clear that our technology has risen to the challenge. An MRI machine is a lot more expensive than a pair of fingers, but the jury is out as to how much more effective it really is.

How much of our industry’s application of medical technology is for the purposes of marketing and competitive differentiation vs. delivering better patient care? How much of it just drives up costs and delivers little or no value?

Do we have the metrics that help us understand the efficacy of the machinery? Can we measure patient outcomes?

A pair of fingers gets us really close. Do we need machinery to get us closer and, if we do, can we understand what the machines are telling us?