Is a Theory Needed Before We Exploit a New Phenomenon? (Doug Marker)

The following article was submitted by Doug Marker

Is a theory needed before we exploit a new phenomenon?:

The demand for a sound theory before LENR can be called real, or deserves investment, does not mean LENR doesn’t exist. Nor does it mean that LENR can’t deliver scalable, abundant, non-chemical anomalous heat through a new and not yet understood nuclear process.

What LENR suffers from is that it is a significant threat to some industries and existing fusion projects where investment and effort run into the billions of dollars. It is fair to argue that it is for this reason that LENR polarizes opinion in the extreme way that it has.

Quantum Information Processing (QIP):

If we compare LENR to quantum computing, we have the remarkable situation where quantum computing devices (D-Wave’s machines) are being built and sold that work, yet in ways that no scientist can adequately explain to the satisfaction of science in general. A big difference is that QIP (Quantum Information Processing) is not a threat to anyone; on the contrary, it is embraced, encouraged, and regarded by many in the IT field as the next big leap forward in computing power. You won’t destroy your career playing with qubits held in place in Bose–Einstein condensates and superconducting materials.

QM – The Copenhagen Interpretation:

The principles that QIP works on are based on exploiting ‘quantum entanglement’, the effect first outlined by Einstein, Podolsky & Rosen in their famous EPR paradox paper. Einstein used EPR to try to prove that entanglement, as explained by Niels Bohr and Werner Heisenberg (famous for his ‘Heisenberg uncertainty principle’), was wrong, and that the effect could only be explained by locality together with ‘hidden variables’.

Non-locality violated Einstein’s utter conviction in the need for locality, which in turn fitted his existing theories. The point here is that Einstein and colleagues first described quantum entanglement, but only in order to use it to prove that Bohr, Heisenberg and their followers were wrong. The subsequent history of experiments (Bell inequality tests) has shown that it was Einstein who had it wrong: ‘locality’ (with ‘hidden variables’) has been shown not to be involved in the entanglement effect. This discovery baffles science to this day. The famous scientist Richard Feynman once said with deep wisdom, “I think I can safely say that nobody understands quantum mechanics” (Richard Feynman, in The Character of Physical Law, 1965). Entanglement embodies this view to perfection.

So now we have computers being built, and used, based on that allegedly ‘wrong’ entanglement, and yet still, today, no scientist can explain it with any widely accepted, cohesive theory. It just works. The smart ones just move on and exploit the principle. The pedants argue that it shouldn’t be invested in, or time wasted on it, because there is no accepted theory or known science. In the LENR reality debate the pedants tend to be both vociferous and shrill.

So the situation we have is that QIP is accepted and in use despite science lacking a good or clear understanding of how and why it works, while LENR is widely attacked and blasted even though there is plenty of research evidence, for those who look, to show it is real. The difference is that LENR poses multiple threats; QIP poses none.

Explore and encourage rather than suppress or destroy: Tom Darden summed up his involvement very well, and in a strikingly mature way. What he has said and is doing (supporting Andrea Rossi when few others would) is what is needed.

The hope is that he now triggers an avalanche of interest and support, and that the shrill nature of the anti-LENR folk gets seen for what it is: an effort to suppress and destroy rather than explore and encourage. Hot fusionists do not have a monopoly on understanding the forces at play at the sub-atomic level. This field is opening up to new notions and realities.

Quantum Entanglement, what it is and isn’t, and how it can be exploited, is the living proof.

Doug Marker

  • quax

    My beating on poor ole Niels was tongue firmly planted in cheek. So if it made you chuckle then my mission was accomplished 🙂

    At the time the Copenhagen interpretation was the pragmatic thing to do, he couldn’t possibly anticipate what repercussions it would have.

    • doug marker

      Quax, if you want to write another tongue-in-cheek article, give Erwin Schrödinger a good dose. His farcical cat-in-a-box story deserves a lot of tongue – heaps in fact ! 🙂

      So much could be said about bright minds misinterpreting or deliberately distorting a concept. Even Einstein (whom I truly admire greatly and, at a purely logical level, believe was right more than he was wrong) was not innocent in regard to the emergence of the cat-in-the-box fallacy.

      Einstein first postulated the story, but with explosives being triggered. When Schrödinger retold the Einstein story he replaced the blast with a poisoned cat, as he realized that using explosives removed any mystery, and that a massive explosion was near impossible to describe in a superposition state. He needed the mystery of the cat’s fate to give credence to his otherwise implausible story.

      Even then he ignored the role of the trigger (a detector), which in effect was the ‘observer’ that detected the state, thus at that instant destroying the theoretical superposition.

      So I am proposing that a really good tongue-in-cheek story about Schrödinger is long overdue 🙂

      • quax

        Well, I think the joke is already on the jokester. After all the cat example was supposed to illustrate what Einstein and Schrödinger considered patently absurd QM predictions. And now it is taken as gospel.

  • quax

    Thanks Doug 🙂

    At the beginning D-Wave was sometimes called fraudulent in some blogs’ comment sections. But overall you are entirely right: the vitriol never reached LENR levels.

  • GreenWin

    quax! Like the ghost of Jacob Marley you return to the living. Welcome!

    • quax

      Thanks Greenwin. Like to think of it more like a phoenix from the LENR ashes.

      So my time will again be limited. Too many other things to do 🙂

  • Mark Underwood

    Thanks for the comment. A couple of things.

    I was surprised that the article from AMS had to do with measurement. Although uncertainty in measurement gives an intuition into Heisenberg’s Uncertainty Principle (HUP), it isn’t really what HUP is about. HUP is really about the *intrinsic* uncertainty of certain quantum attributes, independent of measurement. So for instance HUP would tell us that an electron really does not have a definite momentum and a definite position at the same time; rather, the electron has a conjoined momentum-position property.

    I have a problem with that. I won’t be dogmatic that it isn’t true, but I am wary of it. What we are doing is projecting the abstractness of our mathematics (in this case the ‘conjugate variable’ relationship between momentum and position) onto the particle, and saying that it *actually* has no real simultaneous momentum and position properties, only a mathematically abstract mixture of the two. So we are essentially making a physical system unphysical, to reflect our abstract mathematics. Today we hardly blink when we call an electron within an atom a ‘probability cloud’. Again, I’m wary of projecting our mathematics onto nature and thinking we’ve successfully described the wild beast. I don’t think so.

    Optics serves as a segue into an interesting story. From the Wiki entry on Charles Townes, the inventor of the maser (the forerunner of the laser):

    Theorists like Niels Bohr and John von Neumann doubted whether it was possible to create such a thing as a maser.[19] Nobel laureates Isidor Isaac Rabi and Polykarp Kusch received the budget for their research from the same source as Townes. Three months before the first successful experiment, they tried to stop him: “Look, you should stop the work you are doing. It isn’t going to work. You know it’s not going to work, we know it’s not going to work. You’re wasting money, Just stop!”[20]

    If I recall correctly, Bohr and von Neumann actually visited Townes in his lab. Why did they think a laser was not possible? Because of the Heisenberg Uncertainty Principle! They didn’t think light could be made to act in lockstep like that because it involved precision in position and momentum, simultaneously.

    Some food for thought anyway.

    • quax

      Originally Heisenberg also interpreted his principle in terms of measurement, but this is indeed a point of view that fell out of favor (and rightly so).

      To me the complementary aspect is clearest in the context of a femtosecond laser light pulse. You can only build up such a short light pulse by superimposing a wide range of the spectrum, which runs counter to the “normal” nature of lasers: they usually have a highly defined frequency.

      This immediately makes clear how time resolution is complementary to frequency resolution. You can only make the pulse shorter by widening the frequency window.

      https://en.wikipedia.org/wiki/Ultrashort_pulse
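This time/frequency trade-off is easy to check numerically. Below is a toy sketch (my own illustration, not from the comment above): a Gaussian pulse and a naive discrete Fourier transform in plain Python. For a Gaussian, the product of RMS duration and RMS bandwidth is the constant 1/(4π), so shortening the pulse must widen the spectrum to compensate. The function names and grid parameters are invented for the demo.

```python
import cmath
import math

def rms_width(xs, weights):
    """RMS width of a sampled intensity distribution."""
    total = sum(weights)
    mean = sum(x * w for x, w in zip(xs, weights)) / total
    var = sum((x - mean) ** 2 * w for x, w in zip(xs, weights)) / total
    return math.sqrt(var)

def time_bandwidth_product(sigma_t, n=512, span=60.0):
    """RMS duration times RMS bandwidth for the pulse exp(-t^2 / (2 sigma_t^2))."""
    dt = span / n
    ts = [(k - n // 2) * dt for k in range(n)]
    pulse = [math.exp(-t * t / (2 * sigma_t ** 2)) for t in ts]
    # Naive DFT of the pulse; frequency grid runs from -n/(2*span) to +n/(2*span).
    freqs = [(k - n // 2) / span for k in range(n)]
    spectrum = [
        abs(sum(p * cmath.exp(-2j * math.pi * f * t) for p, t in zip(pulse, ts))) ** 2
        for f in freqs
    ]
    intensity = [p * p for p in pulse]
    return rms_width(ts, intensity) * rms_width(freqs, spectrum)
```

Calling `time_bandwidth_product(1.0)` and `time_bandwidth_product(2.0)` gives roughly the same number (about 1/(4π) ≈ 0.08) even though the second pulse is twice as long: the spectrum narrows to compensate, which is exactly the complementarity being described.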

      • Mark Underwood

        Yes, yes, that would be an example of the energy-time conjugate. Narrower time window, more uncertain energy (frequency), and vice versa. This does indeed seem valid when applied to the ability of our measurement apparatus, or (in this case) the generating apparatus, to detect or create phenomena where one attribute is measured or generated more precisely, but at the expense of the precision of the other attribute.

        (However a few years ago some researchers seemed to have been able to increase the information measured by taking several “weak measurements” which as a whole disturbed the system less than a single strong measurement. Not sure how that has progressed though. )

        Anyway, I still think it’s a leap to project the fuzziness of what we are able to measure or produce onto the actual state of an individual quantum. But to each his own. Thanks for the dialogue!

  • Axil Axil

    Multiparticle quantum entanglement enables Hawking radiation.

    See

    Multiboundary Wormholes and Holographic Entanglement

    http://arxiv.org/abs/1406.2663#

  • Mark Underwood

    No, a theory is not needed to exploit a new phenomenon, but a (more or less correct) theory helps immensely. Still, without theory we can nonetheless notice trends and make logical extrapolations: informed trial and error. That seems to be the stage we are at with cold fusion.

    About entanglement: that quantum concept may be fundamentally flawed. Randell Mills has classical, more common-sense explanations for phenomena like entanglement and tunnelling. They follow from his fundamental idea that particles like electrons have real physical extent. To illustrate, imagine a high jumper going over a bar using the Fosbury flop. It is well known that the jumper’s centre of gravity can actually pass *under* the bar while the jumper drapes himself over the bar successfully. Now, quantum mechanics would see this as ‘tunnelling’: it sees only the centre of gravity, a point, which mysteriously traverses under the bar. Mills sees the electron as a 2D electromagnetic current surface draping over the bar, so to speak.

    So (not to offend), the typical quantum view of the electron as somehow being both a point of zero extent and a probability cloud spread out over space may simply be nonsense, and a sign we don’t know what in the hell we are talking about. Oh yes, we have mathematics, but it correlates to phenomena without necessarily describing them. For instance, the infinite series

    (4/1) – (4/3) + (4/5) – (4/7) + (4/9) – (4/11) + (4/13) – (4/15) + ….

    is a formula that successfully computes a certain well-known number. But is it any more physically descriptive to say it is the magnitude of a circle’s circumference divided by its diameter? (Pi, that is: 3.141592…)
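For the record, the series above is the Leibniz series for π, and the "computes without describing" point is easy to demonstrate. A minimal sketch (the helper name is my own):

```python
def leibniz_pi(terms):
    """Partial sum of 4/1 - 4/3 + 4/5 - 4/7 + ... (the series above)."""
    total = 0.0
    sign = 1.0
    for k in range(terms):
        total += sign * 4.0 / (2 * k + 1)
        sign = -sign
    return total
```

`leibniz_pi(1000)` already gives roughly 3.1406, yet nothing in the loop refers to a circle: the formula computes the number without depicting the geometry, which is exactly the distinction being drawn.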

    Mills would say that the D-Wave computer will not show any mystical quantum powers, because he does not believe in quantum states being simultaneously both on and off.

    My main point is that just because we have a mathematical theory that seems to correlate with what we observe in nature doesn’t mean we have a true understanding of nature. But theory can be so comforting to some that they will use it to deny new and real phenomena that conflict with their theory. We know that all too well, don’t we.

    • Axil Axil

      The mention of R. Mills’ technology and associated theory is a red herring. Mills disavows LENR in any form and states that his technology is not nuclear; on the contrary, any energy produced is produced by chemical means. He also says that his reactions produce no transmutation.

      You can’t have your cake and eat it too. Please correct your argument about quantum entanglement by removing the reference to Mills’ theory in this regard.

      • Mark Underwood

        A red herring Axil? Mills has his own ideas about entanglement, so it is quite on point actually. Separated particles can be correlated; the question is how.

        You’re right about Mills disavowing LENR. He’s got his reasons and he seems quite settled in his opinion. I happen to disagree with his opinion, but I still value his unique physical insights on other matters.

        Tonight was my wife’s birthday. We splurged and got a strawberry shortcake. I had my cake for a while, then I ate it. Then I no longer had it. (Except in my stomach.) But gosh if the piece of cake is small enough (sub atomic), we apparently have no problem believing it is both there and not there at the same time. Go figure!

        • GreenWin

          Mark, your cake conclusion is off. What you mean is, “I ate a piece of shortcake. But the GI scan shows it to be a piece of blueberry pie!”

    • Axil Axil

      Multiparticle quantum entanglement has been experimentally verified through the detection of Hawking radiation coming from an acoustic EMF black hole.

      • Mark Underwood

        Axil, a few points. One, that experiment can hardly be said to ‘verify’ entanglement. Its aim was to detect Hawking radiation, not entanglement, and its claim to have detected Hawking radiation is debatable. Other experiments deal directly with entanglement.

        Secondly, my understanding is that Bose–Einstein condensates have naught to do with entanglement per se. Recall that Einstein had a problem with ‘spooky action at a distance’ entanglement; but with Bose–Einstein condensates he was just fine, of course.

        Thirdly, black holes themselves are hardly something we should be confident we know all about, let alone Hawking radiation. Hawking radiation involves presuming the existence of Heisenberg’s Uncertainty Principle, virtual particles, negative energy (negative mass) particles, and so on. All very speculative in my opinion.

        But that’s all I want to say about that, too much off topic.

        • Axil Axil

          Research into LENR will soon allow the experimental exploration of Heisenberg’s Uncertainty Principle, virtual particles, negative-energy (negative-mass) particles, faster-than-light travel and warp drive, and so on. LENR is key to changing these speculative subjects into rock-bed science fundamentals.

          LENR research will also show that R. Mills is misinterpreting his experimental results (10 mm black light). What he is producing is LENR, but his theory is stopping him from developing that technology, and has been doing so for more than 20 years.

          Here is another researcher who has seen Hawking radiation emitted by acoustic black holes

          Prof. Daniele Faccio: “Black Holes, With A Twist” – Inaugural Lecture

          https://www.youtube.com/watch?v=YyMYcqxuZ_I

    • MasterBlaster7

      I think Peter Hagelstein said it best. “Once again, scientific theory triumphs over experimental result.”

      • Axil Axil

        He wasn’t talking about the theory of R. Mills.

        • MasterBlaster7

          I was just talking about his main point, not R. Mills.

  • http://www.lenrnews.eu/lenr-summary-for-policy-makers/ AlainCo

    In fact the known theory (BCS) was incompatible, and that is no worse than the situation for LENR.

    Theory doesn’t say it is impossible; it just cannot explain it.

    Note that there were observations reported as early as 1973

    http://www.mosaicsciencemagazine.org/pdf/m18_03_87_04.pdf

    but hidden as a footnote…

    Ten years of denial, and according to the legend the academic community took one year to accept the evidence…
    Let me laugh. As Kuhn says, history is rewritten.

    LENR will be invented by Caltech in 2015.

    • Omega Z

      With the Collaboration of MIT I assume…

  • Bernie Koppenhofer

    A financial controller of a company spending 5 million dollars on heat energy couldn’t care less about theory or engineering: if he can get that heat energy for 2.5 million, he is going to buy it.

    • Axil Axil

      Will the insurance industry insure his building? Will the nuclear regulators allow a reactor to be installed so near people, without containment? Will the financial controller risk the future of his company and its employees by installing a new and unproven technology, or does he wait for years while other early-adopter guinea-pig customers take the risks for him?

      • Mats002

        The first guinea pig is Mr Rossi. He seems to be fine, a little thinner but still going strong. He has a reason to live in that LENR container.

        • Bernie Koppenhofer

          Right. This quote from Darden, “I don’t want to say that cold fusion is real until we can absolutely prove it in ten different ways and then persuade our worst critics to join our camp,” coupled with the fact that he gave the interview, is very positive and answers a lot of questions, including about the yearlong test.

          • Omega Z

            “and then persuade our worst critics to join our camp”

            Darden is naive in this respect.
            The critics will not sway until working products are on the market, and only then because it will threaten their bottom line. Doing nothing is easy. Change is hard.

            • Bernie Koppenhofer

              I couldn’t care less about “the critics”. This decision will be made by corporate financial types, and Darden is right to be prepared to answer their questions.

      • Bernie Koppenhofer

        Why do you think they are running a year long test? Money talks, very loudly!!

      • LarryJ

        If the technology works as described then no one can afford not to adopt it; their competition will quickly eat their lunch otherwise. Once it is on the market I think adoption will be very quick.

        • Axil Axil

          When LENR reduces the demand for gas over time, the price for gas will drop to very low levels as gas producers will struggle to extract the last bit of profit from existing gas wells. Because of this drop in gas prices, gas will remain competitive with LENR for a very long time.

          As a homeowner who uses gas for heating, why spend thousands to replace a perfectly running gas heater while gas is dirt cheap? The bottom line: temper your expectations for rapid LENR adoption in the face of economic reality.

          • LarryJ

            If the device is priced to allow a one-year payback it really doesn’t matter how cheap gas is; this will be cheaper. There is also the high probability that the government will apply serious taxes to fossil fuels, especially if there is an economical alternative. Here in British Columbia, Canada, I already pay a carbon tax on home heating fuel. That will surely rise. Adoption will be very quick.

            • Omega Z

              I agree with Axil on this. The price of gas, if cheap enough, will act as a counterbalance. People will not change out a good high-efficiency gas furnace until it quits.

              The same would apply to IC engines. Electric cars will drive the price of gasoline down, and the need to switch becomes less urgent. People will use their IC-engine vehicles and replace them only as the need arises.

              “the government will apply serious taxes to fossil fuels”

              This proves Axil’s point is valid; otherwise such taxes wouldn’t be necessary. Of course, there is a downside to government getting involved this way: it eliminates the need for LENR to compete. They can charge as they see fit. “We’ll keep it 10% cheaper than fossil fuel plus taxes.”

              As to the deployment of LENR: it is going to take decades, regardless of what many think. It takes time and money to transition and build facilities. There are presently 1,000 power plants on a 10-year waiting list; some claim that if it weren’t for the long waiting period, that number would double. There are two primary reasons for the wait. The lesser is financial. The greater is a lack of skilled labor.

              There are a couple of million college grads in the U.S. unemployed or underemployed. A guy walks into human resources at a large contractor and says: I have 4 years of college, I hear you’re hiring. H.R. says: Can you weld? NO. Can you form or work with concrete & bricks? NO. Electrical knowledge or heavy equipment experience? NO.

              H.R.: Tell you what, I hear the Starbucks down the street needs a barista.
              Our Government in action…

  • Manuel Cruz

    Well, superconductivity allows electrons to move freely, indefinitely, in an electric circuit, meaning that the circuit is a perpetuum mobile of type A, something that has also been called theoretically impossible.

  • Zephir

    High-temperature superconductivity has been researched since 1986 without any accepted underlying theory. So there is absolutely no problem with LENR from this perspective.

  • pg

    NO

    • http://bobmapp.com.uk twobob

      “?”

      • doug marker

        No to the question posed in the title 😉
        D

        • http://bobmapp.com.uk twobob

          Oh!

    • Mats002

      Efficient, clear answer to a long tedious question, I like it!

  • http://bobmapp.com.uk twobob

    What’s the problem?
    Tesla had it right: have an idea or revelation, experiment with it,
    fiddle with it, fine-tune it, try to understand what you have done.
    All this cant about how you cannot put it on the market till you know
    every little wrinkle of what it does and how is just BS.
    If I had invented Flubber I would be rich, and over the moon.
    Mr Rossi has done just that. But from his past experiences, he knows he
    wants something that is irrefutable, and to give the English Archers’ Gesture.

  • Manuel Cruz

    Genetic algorithms are also something that we use and that scientists cannot explain. Not just why the method works, but sometimes we cannot even understand the solutions reached by those algorithms. Some papers go as far as to call it “magic”, and many circles consider it either cheating or an unorthodox way to solve problems, because you cannot write a document explaining the solution or the steps to reach it to ensure correctness. One funny case was a circuit design that had a segment not connected to anything else, and yet if you removed it, the machine would not work. Here is a link to the story: http://darwin200.christs.cam.ac.uk/pages/index.php?page_id=h7
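Since genetic algorithms came up: the mechanics are simple even when the solutions they find are inscrutable. Below is a minimal sketch of the technique on a toy "OneMax" problem (evolving a bit string toward all ones); the function name and parameter values are invented for illustration, and this is nothing like the hardware circuit experiment described above.

```python
import random

def evolve_onemax(n_bits=32, pop_size=40, generations=200, mutation=0.02, seed=1):
    """Toy genetic algorithm: evolve random bit strings toward all ones."""
    rng = random.Random(seed)

    def fitness(bits):
        return sum(bits)  # count of ones; maximum is n_bits

    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == n_bits:
            break  # perfect individual found
        parents = pop[: pop_size // 2]            # truncation selection: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)         # pick two distinct parents
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < mutation) for bit in child]  # per-bit mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

The selection/crossover/mutation loop is all there is. The "magic" referred to above is that the same loop, pointed at a physical fitness function (a real circuit), can exploit hardware quirks no designer anticipated, such as the disconnected-yet-essential segment in the linked story.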

  • Alan DeAngelis

    I don’t think so.
    “Today’s scientists have substituted mathematics for experiments, and they wander off through equation after equation, and eventually build a structure which has no relation to reality.”
    – Nikola Tesla

    • Axil Axil

      That quote is a gem. It should be posted on Rossi’s blog. Rossi thinks that mathematics must be used to formulate theory.

      Supersymmetry came out of math based predictions and will cost science billions and decades of wasted time. And then there is string theory.

      • Alan DeAngelis

        And we know that Tesla was mathematically proficient. So, it wasn’t a sour grapes statement. I think it was more of a reminder to not drift too far from the experimental evidence.

        • Omega Z

          Theories in themselves are worthless until someone actually does experiments to prove or disprove them.

  • Roland

    Thank you for an interesting analysis; you are being somewhat modest in one regard though. The total global investment in the energy production business in all its facets is US$48 trillion; the stakes are very, very high…

  • Axil Axil

    Yale physicist Rob Schoelkopf, Sterling Professor of Applied Physics and Physics, stated, “Ninety-nine percent of quantum computing will be correcting errors. Demonstrating error correction that actually works is the biggest remaining challenge for building a quantum computer.”

  • Ophelia Rump

    There are two “how’s” involved, and there is an innate pecking order for which is sufficient cause to engage.

    The first “how” is how can you make it work. If you know the answer to this, then you can get to business immediately.

    The second is how it works: the theory. If you have the answer to this, then you are justified in beginning to seek the answer to the first how.

    Keep in mind that theory is not fact. It is an aid and a tool. Theory is often discarded, or modified along the way to answering the first how. Theory which does not lead to the first how is clutter.

    • Axil Axil

      You are talking about engineering. Science is not interested in engineering. Engineering without a scientific theory is a trial-and-error process.

      • Zack Iszard

        Succinct and correct, except that the distinction between science and engineering disappears more each day. Materials science and computer science are shining examples.

        To clarify the semantics of your most astute observation: Ophelia’s “first how” is product design know-how, and the “second how” is a mechanistic understanding, “scientific theory” in the strict sense or not.

        *Mechanistic understanding is not interested in product development. Product development without a mechanistic understanding is a trial and error process. People invest in product development, not trial and error.*

        In short, to answer the question in the post’s title with Ophelia’s context, the “second how” is needed only as far as it is part of the “first how”.

      • Frechette

        The Romans built structures based on trial and error. They knew nothing about statics or strength of materials, yet some of their structures still stand today.

        • http://www.lenrnews.eu/lenr-summary-for-policy-makers/ AlainCo

          They had theories, but just simple rules, sometimes overkill.
          Medieval engineers built the cathedrals with improved phenomenological theory, not with differential calculus.

      • http://www.lenrnews.eu/lenr-summary-for-policy-makers/ AlainCo

        This is the position of JF Geneste.
        Engineers need a theory, even a tiny, phenomenological one, to improve the solution. Without improvement there cannot be industry, and thus no funding.

        The error, for me, is that people ask for a full theory when a medieval one could be enough: phenomenological, but validated by experience.

        We are paying for 50 years of success of the full theory…
        We abuse models to the point that we deny observations;
        we abuse upfront theory to the point of denying measurements.

        Theory is required, but not the gold standard we were lucky enough to have for building the A-bomb and the i8088.

        • Axil Axil

          Many have generated theories for LENR that are complete nonsense, but they are well funded through investments. Two examples that come to mind are Brillouin Energy and BlackLight Power. Even Rossi is working on his own malarkey theory; Rossi can’t very well sell product while saying he has no idea how it works.

      • Ophelia Rump

        “The following article was submitted by Doug Marker
        Is a theory needed before we exploit a new phenomenon?:”
        Exploiting a phenomenon sounds like engineering.
        Science is not interested in anything; without testing there is no science.
        More interesting is your disdain for trial and error. Without testing you can only find answers which are already implicitly known.
        New theory cannot be crafted from thin air; only religion can do that.

        • Axil Axil

          Dirac’s equation predicts antiparticles 2 January 1928

          In 1928, British physicist Paul Dirac wrote down an equation that combined quantum theory and special relativity to describe the behaviour of an electron moving at a relativistic speed. The equation would allow whole atoms to be treated in a manner consistent with Einstein’s relativity theory. Dirac’s equation appeared in his paper The quantum theory of the electron, received by the journal Proceedings of the Royal Society A on 2 January 1928. It won Dirac the Nobel prize in physics in 1933.

          But the equation posed a problem: just as the equation x² = 4 can have two possible solutions (x = 2 or x = −2), so Dirac’s equation could have two solutions, one for an electron with positive energy, and one for an electron with negative energy. But classical physics (and common sense) dictated that the energy of a particle must always be a positive number.
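The two-signed structure described above comes straight from the relativistic energy–momentum relation that Dirac's equation builds in (standard textbook form, added here for illustration):

```latex
E^2 = p^2 c^2 + m^2 c^4
\qquad\Longrightarrow\qquad
E = \pm\sqrt{p^2 c^2 + m^2 c^4}
```

The negative root is the "negative energy" branch that Dirac went on to reinterpret as the antiparticle.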

          Dirac interpreted the equation to mean that for every particle there exists a corresponding antiparticle, exactly matching the particle but with opposite charge. For the electron there should be an “antielectron” identical in every way but with a positive electric charge. In his 1933 Nobel lecture, Dirac explained how he arrived at this conclusion and speculated on the existence of a completely new universe made out of antimatter:

          If we accept the view of complete symmetry between positive and negative electric charge so far as concerns the fundamental laws of Nature, we must regard it rather as an accident that the Earth (and presumably the whole solar system), contains a preponderance of negative electrons and positive protons. It is quite possible that for some of the stars it is the other way about, these stars being built up mainly of positrons and negative protons. In fact, there may be half the stars of each kind. The two kinds of stars would both show exactly the same spectra, and there would be no way of distinguishing them by present astronomical methods.

          Discovering the positron

          First sighting of positron disregarded 6 May 1929

          While studying cosmic rays in a Wilson cloud chamber, the Soviet academic Dimitri Skobeltsyn noticed something unexpected among the tracks left by high-energy charged particles. Some particles would act like electrons but curve the opposite way in a magnetic field. In an independent experiment that same year, Caltech graduate student Chung-Yao Chao observed the same phenomenon. The results were inconclusive, and both scientists disregarded the anomaly.

          Paul Dirac predicts the positron 1 September 1931

          Paul Dirac published a paper mathematically predicting the existence of an antielectron that would have the same mass as an electron but the opposite charge. The two particles would mutually annihilate upon interaction.

          “This new development requires no change whatever in the formalism when expressed in terms of abstract symbols denoting states and observables, but is merely a generalization of the possibilities of representation of these abstract symbols by wave functions and matrices. Under these circumstances one would be surprised if Nature had made no use of it,” he wrote.

          Theoretical predictions confirmed 3 March 1933

          The discovery was confirmed soon after by Occhialini and Blackett, who published Some photographs of the tracks of penetrating radiation in the journal Proceedings of the Royal Society A; together with Anderson’s observations, this proved the existence of the antiparticles predicted by Dirac. For discovering the positron, Anderson shared the 1936 Nobel prize in physics with Victor Hess.

          For years to come, cosmic rays remained the only source of high-energy particles. The next antiparticle physicists were looking for was the antiproton. Much heavier than the positron, the antiproton is the antiparticle of the proton. It would not be confirmed experimentally for another 22 years.

        • Axil Axil

          This is how valid theory development and scientific prediction is supposed to work.

          Just like the scientific process exemplified by Dirac’s prediction of the positron and antimatter, Axil predicted the existence of mesons, via muon production, in LENR in mid-August 2014, in an explanation of the LENR reaction inspired by Ophelia Rump’s request for a theory describing it.

          Muon production has since been confirmed in the LENR reaction.

          Leif Holmlid detected mesons, through muons produced by the LENR reaction, in August 2015, one year after its prediction by Axil.

  • Axil Axil

    Regarding:

    “Quantum Entanglement, what it is and isn’t, and how it can be exploited, is the living proof.”

    How prophetic that the author brings up this subject in a discussion of LENR.

    When all the emergent properties of LENR are peeled away in thick layers, one after the other, to reveal LENR’s purest and incandescent heart of causation, we arrive at Quantum Entanglement.

    Quantum Entanglement keeps the universe together and permits Black Holes to evaporate; and it powers the fusion process that allows one nucleus to tunnel into another, thus fusing the two.

    Many devotees of LENR use the term tunneling in connection with LENR causation, but what that term means and how it works is never addressed.

    A theoretical understanding of a special kind of entanglement that allows matter to be removed from inside a black hole is what LENR uses to defeat the Coulomb barrier. It is called multiparticle entanglement, and it is only active as a mechanism found in Black Holes. Quantum Entanglement is usually monogamous; that is its first commandment: a one-to-one connection between two items. But this method of connection does not hold for Black Holes.

    LENR is caused by nanoscopic, light-based acoustic black holes called dark-mode surface plasmon polaritons. These black holes transfer energy (energy quantum teleportation) to and from the site of the fusion reaction through Quantum Entanglement, using something that Einstein discovered back in 1935 with Rosen: the wormhole.

    If we want to understand LENR we must understand Black Holes very well indeed.