Another, not dissimilar, attack is raging this very moment. It started with the brilliant result of Peter Shor [SIComp, 26(5) 1997]. He factors integers in polynomial time using an imaginary analog device, the Quantum Computer (QC), inspired by the laws of quantum physics taken to their formal extreme.

And what is the physical meaning of 500-digit-long numbers? What could one possibly mean by saying "This box has a remarkable property: its q-bits contain the Ten Commandments with the amplitude whose first 500 decimal places end with 666"? What physical interpretation could this statement have, even for just this one amplitude?

[*Remark (**not in the article**):
Subtleties and paradoxes of probability lead to widespread confusion.
I have heard questions about why similar issues do not plague tiny
probabilities. Exponentially small probabilities themselves have no
physical meaning. Their logarithms, i.e., entropies, do. Entropies are
falsifiable, e.g., by finding, for an experimental result, a code with
significantly fewer bits than its entropy under the predicted
distribution. But this link (analyzed by Kolmogorov) with (observable)
complexity works only because probabilities are always positive (or 0).
Quantum amplitudes have no such property, so they cannot carry this
meaning.*]
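The falsifiability claim in this remark can be made concrete with a minimal sketch (my illustration, with a hypothetical setup, not a construction from the article): a theory predicts uniformly random bytes, i.e., 8 bits of entropy per byte, but the observed data are far more regular, and a generic code exposes the discrepancy.

```python
import random
import zlib

# A toy falsification in the sense of the remark (hypothetical setup).
# A theory predicts bytes drawn uniformly from 0..255, i.e., 8 bits of
# entropy per byte. The experiment instead yields bytes from only 4
# values, so the true entropy is about 2 bits per byte.
rng = random.Random(0)
data = bytes(rng.randrange(4) for _ in range(10_000))

predicted_bits = 8 * len(data)                   # entropy if the prediction held
observed_bits = 8 * len(zlib.compress(data, 9))  # length of a generic code

# The code describes the result in far fewer bits than the predicted
# entropy, refuting the predicted distribution.
print(observed_bits < predicted_bits // 2)
```

Any off-the-shelf compressor serves here as the "code"; finding one that beats the predicted entropy by a wide margin is the observable event that falsifies the prediction.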

Close to the tensor product basis, one might have opportunities to restate the assertions using several short measurable numbers instead of one long one. Such opportunities may also exist for large systems, such as lasers or condensates, where individual states matter little. But QC factoring uses amplitudes of an exponential number of highly individualized basis states. I doubt that this use of huge precisions, the most generic and direct imaginable, would be easy to substitute. One can make the amplitudes more "physical" by choosing a less physical basis. Let us look into this.

Consider, however, this scenario. With a few q-bits, QC is eventually made to work. The progress stops, though, long before QC factoring starts competing with pencils. The QC people then demand a Nobel Prize for the correction to Quantum Mechanics. But the committee wants more specifics than simply a nonworking machine, so something like observing the state of the QC is needed. Then they find the Universe too small for observing individual states of the needed dimensions and accuracy. (Raising sufficient funds to compete with pencil factoring may justify a Nobel Prize in Economics.)

Let us make some calculations. In cryptography, the length
`n` of the integers to factor may be a thousand bits (and could
easily be millions). By `*n`, I will mean a reasonable power of
`n`. A `2^{*n}`-dimensional space
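The scale involved can be checked with elementary arithmetic; a back-of-envelope sketch (my assumptions: n = 1000 bits, and ~10^80 as the common estimate for the number of particles in the observable universe):

```python
import math

# Already 2^n, a lower bound for 2^{*n}, exceeds any physical count.
n = 1000
dim_digits = int(n * math.log10(2)) + 1  # decimal digits of 2^1000
print(dim_digits)     # → 302
print(2**n > 10**80)  # → True: the dimension dwarfs the particle count
```

A space of ~10^301 dimensions is thus "finite" only in the philosophical sense invoked below.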

There is a big difference between untested and untestable regimes.
Claims about individual megastates are untestable. I can imagine a
feasible way to separate any **two** QC states **from each
other**. Yet, as this calculation shows, no machine can separate a
generic QC state from the set of all states more distant from it than QC
tolerates. So, what thought experiments can verify that the QC is in the
state described, with the accuracy needed? I would allow using the
resources of the entire Universe, but **not more**!
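The easy half of this contrast, separating two fixed states, can be illustrated numerically. A toy sketch of mine (the 10-qubit dimension and the seed are arbitrary choices, not from the article), using the Helstrom bound, which gives the optimal one-shot probability of distinguishing two pure states:

```python
import math
import random

def random_state(dim, rng):
    # A Haar-like random pure state: i.i.d. complex Gaussians, normalized.
    v = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(dim)]
    norm = math.sqrt(sum(abs(a) ** 2 for a in v))
    return [a / norm for a in v]

def helstrom(psi, phi):
    # Optimal one-shot success probability of telling |psi> from |phi>.
    overlap = abs(sum(a.conjugate() * b for a, b in zip(psi, phi)))
    return 0.5 * (1 + math.sqrt(1 - min(1.0, overlap) ** 2))

rng = random.Random(0)
dim = 2 ** 10  # a 10-qubit toy; QC factoring would need dim ~ 2^{*n}
p = helstrom(random_state(dim, rng), random_state(dim, rng))
print(p > 0.99)  # generic high-dimensional states are nearly orthogonal
```

No analogous one-shot procedure separates a generic state from the whole set of states farther away than the QC tolerates, which is exactly the untestable regime described above.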

Archimedes made a great discovery: digital representations of numbers are exponentially more efficient than analog ones (sand-pile sizes). Many subsequent analog devices yielded unimpressive results. It is not clear why QCs should be an exception.
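Archimedes' observation admits a one-line check (a toy comparison of mine, not from the article): a sand-pile encoding of N costs N grains, while positional notation costs only about log10(N) symbols.

```python
# Digital vs. analog cost of representing the same number N.
N = 10 ** 12
analog_cost = N              # one grain of sand per unit
digital_cost = len(str(N))   # symbols in positional notation
print(digital_cost)          # → 13
```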

In infinite dimensions, the distinction between 0 and positive
separation between a point and a set varies with the topology. In finite
dimensions, the 0-vs.-positive distinction is too coarse: all topologies
agree. Since 2^{500} is finite only in a very philosophical
sense, one needs a quantitative refinement: some sort of
weak-topological (not metric) **depth** of a neighborhood,
polynomially related to the resources required for precision to a given
depth. Then, precision to reasonable depths would be physical, e.g.,
would allow one to generate points inside the neighborhood, distinguish
its center from the outside, etc.

A metric defines `e`-neighborhoods and is richer in this than a
topology, where the specific value of `e` is lost (only `e
> 0` is assured). However, a metric is restricted by the axiom
that the intersection of any set of `e`-neighborhoods is always
another `e`-neighborhood. Quantum proximity may require both
benefits: a defined depth `e` and the freedom to express it by
formulas violating the "intersection axiom." Here is an example of
such a violation, without pretense of relevance to our needs. Suppose a
neighborhood of 0 is given by a set of linear inequalities
`f_i(x) < 1`; then its depth may be taken as
1/

The rest of the article ignores any extravagant models and stands
fully committed to the Polynomial Overhead Church--Turing Thesis: Any
computation that takes `t` steps on an `s`-bit device
can be simulated by a Turing Machine in `s^{O(1)} t`
steps within