Charting the Nickel-Lithium-Hydrogen Workspace; 3,600 Variables Involved (Lookingforheat.com)

Alan Smith of Lookingforheat.com has posted a short slideshow on the LFH website which shows the many variables involved in any possible nickel-lithium-hydrogen LENR reaction. The full document can be read here: http://www.lookingforheat.com/lenr-the-experimental-space/

Alan provides a list of some of the important variables in the categories of Reactor Parameters, Fuel and Pre-treatment, and Heat and Power, and calculates that there are around 3,600 variables involved in the build and experimentation of a Rossi-type reactor, making it a hard task to hit the sweet spot. He writes:
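Alan's figure of roughly 3,600 comes from multiplying parameter counts together. As a rough illustration (the grid below is an assumption chosen for illustration, not Alan's actual list of parameters), the size of a full factorial search space is the product of the number of values considered for each parameter:

```python
from math import prod

# Illustrative (assumed) parameter grid for a Rossi-type reactor experiment;
# these names and counts are placeholders, not Alan Smith's actual list.
parameter_levels = {
    "nickel_powder_type": 4,   # e.g. carbonyl, sponge, coated, milled
    "lithium_source": 3,       # e.g. LiAlH4, LiH, Li metal
    "hydrogen_pressure": 5,    # discrete pressure set-points
    "fuel_pretreatment": 3,    # e.g. none, baking, hydrogen cycling
    "peak_temperature": 5,     # discrete temperature set-points
    "heating_waveform": 4,     # e.g. steady, pulsed, ramped, AC-biased
}

# Size of the full factorial search space if every parameter is varied
# independently: the product of the per-parameter counts.
total_combinations = prod(parameter_levels.values())
print(total_combinations)  # → 3600 for this illustrative grid
```

Varying one parameter at a time, as Alan recommends, cuts the burden from a full factorial sweep (the product of the counts) to something closer to their sum.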

Thus it becomes obvious that without the luxury of a big team, a dedicated workspace and a large budget – the nuclear physics equivalent of an infinite number of monkeys with typewriters – the successful creation of a working LENR system involves attempting to minimise the unknowns by careful study of the available patents and papers, to create a practical and logical series of experiments where only one parameter is varied each time. Careful record keeping and data logging thus becomes a ‘must do’.

We have now seen many E-Cat replication attempts turn up either marginal excess heat or none at all, illustrating the point Alan is making. On the other hand, some experimenters (e.g. Parkhomov, Songsheng Jiang, Stepanov) have reported clear excess heat in some experiments, which indicates that it is not an impossible task. But Alan’s point about consistent, systematic experimentation with careful record keeping is critical if we are to narrow down the parameters for success. It’s a big task, and not an easy one for a dispersed community of replicators speaking different languages and with different motivations. Is it possible to have the coordination and cooperation needed for open-source researchers to crack the code behind the Rossi effect?

http://www.e-catworld.com/wp-content/uploads/2016/05/LENR-–THE-EXPERIMENTAL-SPACE.pdf

  • http://www.animpossibleinvention.com/ Mats Lewan

    Jimmy, the lack of a theory AND the general lack of acceptance of the field, is exactly why it probably had to be a person like Rossi, doing it the old style with systematic and tiresome step-by-step experimental work, changing one parameter at a time, who might have succeeded. Funny.

    What could be done with computers though (which has been suggested), would be using AI to gather all information and data in published LENR papers and working it through to come up with the essentials and even suggest new potentially viable approaches.

  • Warthog

    Sorry, but BS. Science is still an experimental endeavor. NO computer model is worth the electrons to run the computers UNLESS verified by experiment. Garbage in = garbage out is still very true.

  • Roland

    It would be of tremendous service to us E-catworlders, granting that your self-description is reasonably accurate, if you would take the time to examine a particular software suite for modelling molecules; it’s available, in late beta, as a free download, in the hope of engaging relevantly skilled individuals in extending the boundaries of understanding.

    What makes this particular software of interest to those gathered here are the wider implications of the underlying GUT-CP. This underlying theory has been substantiated in the following regard: it led to two cosmological predictions that were considered absurd at the time, but were subsequently validated by astronomical observations.

    To wit: that most of the mass in the universe would turn out to give no signature of its existence other than its gravitational effects on the visible universe (dark matter, as included in a complete explanation of what it is and why), and that the matter in the universe would be found to be still accelerating away from the putative ‘centre’ (the argument had previously circled around whether the inertia imparted by the Big Bang would lead to continuous expansion into infinity or an eventual collapse back to some order of singularity).

    On the surface of things there appear to be a number of interesting validations performed in support of the broader GUT-CP. A confirmation of the efficacy of the aforementioned software, by one or more persons with the background to properly evaluate the claims made for it (granting an open mind, as a few oxen may be gored along the way), would be of significance to many of us.

    http://www.millsian.com

    My, and hopefully our, thanks to those who take up this burden, as there just may be a very, very large dog attached to the tip of this particular tail.

  • Roland

    This is the tip of the iceberg; the potential variables in the single category of EM stimulation of the reaction, in the absence of some overarching theoretical guidance to narrow the search, dwarf the possible permutations of the physical parameters.

    I would take it one step further and posit that the correct EM stimulation will produce a result across a broad range of the physical parameters, and that a fairly complete understanding of the mechanisms of EM stimulation is foundational to the advanced E-Cat designs, especially as this relates to running in controlled self-sustain mode for lengthy periods.

  • Curbina

    I am aware of the importance of models and simulation, but I often see models become a surrogate for reality: researchers lose perspective to the point that, when the observations don’t fit, it is the observation, and not the model, that is put in doubt. I have witnessed people believing their models to that extent while remaining oblivious to the models’ weaknesses. LENR is a phenomenon that has been observed, and is still observed, yet is not predicted by any model, so the risk here is of confusing the model with reality. In this case we need to be doubtful of the model, not of the experimental observation.

    • Alan DeAngelis

      Yeah, like the simple-minded approach of the deuterium-palladium system theorists who assume that d-d fusion is taking place because deuterium is being consumed and helium is being created (even though no gamma rays are seen), completely ignoring the fact that deuterium can transmute heavier elements (the Mitsubishi experiments). Therefore, it may be going through an intermediate.

  • Alan Smith

    Bless you. You did notice I said ‘at least 10’ meaning 10X the whole group. If I had mentioned a need for 360k experiments I suspect everyone would pack up and go home. 🙂

    • Omega Z

      Yes, this is not for the faint of heart. It could easily be a lifetime commitment.

      • Alan Smith

        Well, if we knew what we were doing, it wouldn’t be research.

  • Alan DeAngelis

    Some of these optimization techniques might be useful. http://www.slideshare.net/biniyapatel/optimization-techniques-37632457
    But the standard operating procedure is for the “brilliant” to “borrow” the optimum conditions that the “idiot” came up with and create an ipso facto supercomputer model of how they “predicted” this optimum.

  • Ryan

    Maybe they could leverage some of the newer technology being used to figure out whether a specific combination of materials will work better than others for desired properties. It looks like Los Alamos has come up with a system like that, to help filter through possible variables and find likely good candidates for specific goals. http://www.kurzweilai.net/machine-learning-accelerates-the-discovery-of-new-materials

    With something like that they could even broaden the scope and see if other material compositions might garner better results. Might also be a good way to narrow down some of the ideas on LENR. If they factor in that they think materials need to be x and y to have the effects needed and the system pushes out a bunch of materials that should be optimal for that and they don’t work then they can determine that perhaps those factors are not important and move on to other ideas more quickly.

  • SG

    You make some good points. Bear in mind that many, if not most, LENR supporters and experimentalists hail from the computer and electrical engineering fields, much more so than from physics. Physicists as a whole have largely turned their backs on this phenomenon. I think initial Edisonian-type experiments are necessary until at least a rudimentary theory can be developed. Then, I believe you are right, computer modeling will be the way forward to optimize the effect.

  • Alan Smith

    You aren’t wrong at all. But when you slice a sausage so finely as that, it is hard to tell one slice from the others.

  • theBuckWheat

    “…The major problem with LENR is there is no theory. There are no equations to model on the computer. …”

    At this moment, we have a circle with no entry point, and only what seem to be ephemeral hints in the form of cranky desktop experiments that work sometimes but not others. Until a solid theory can be developed and rigorously tested, one problem for anyone who comes up with a working recipe is that their device could stop working for no known reason, or worse, slide into a previously unseen region of its reaction and start emitting dangerous levels of ionizing radiation.

    But, with every device that operates we will get more and more data and more and more hints about the fundamentals. The benefits of LENR are so vast that we must not give up.

    • GordonDocherty

      The same, of course, was (and is) true of fire. Yet we didn’t stop using it, as the benefits clearly outweighed the risks (cooking things was seen as beneficial, though no one could have told you about the bacteria, fungi and viruses that the cooking helped control or kill), and the more we used it, the better we got at controlling it and benefiting from it … most of the time. If we had waited around for a theory, however, we would never have “progressed” beyond the level of apes – but then, perhaps we still haven’t. 🙂

  • Omega Z

    While Computer modeling can be a big help, it overlooks one very valuable aspect.

    “Serendipity”.

    Discovering things we may never have imagined, such as penicillin. Science without experimenting eliminates half of science, and all those unexpected discoveries.

    • Roland

      I suspect that if we combine serendipity with an acute awareness of the behaviour of the apparatus, built up over years of rapt observation, we’ll recognize the turning point in Rossi’s endeavours: he was paying attention when a subtle shift occurred, and he locked on to it like a pit bull.

      The success of the Edisonian approach hinges on deeper aspects of focused consciousness and it is a misnomer to characterize it as a purely ‘brute force’ methodology.

  • AdrianAshfield

    What you suggest only works if you have a reasonable understanding of the process. While there are dozens of theories none has yet been accepted. The process may be based on a new phenomenon.

    The danger of believing in models when you don’t have the facts is well illustrated by the IPCC computer models of the climate. Many are adamant the “scientific” models MUST be right even when they are not supported by observations.

  • lkelemen

    What about Stoyan Sarg’s theory, and MFMP testing it?
    http://www.e-catworld.com/2014/02/18/mfmp-and-stoyan-sarg-team-up-to-experiment-with-nickel-powder/

    Jimmy: what’s your opinion of a simulation of S. Sarg’s theory?

  • sam

    There are some interesting comments from Jed Rothwell, Axil and others regarding the Penon ERV on Peter Gluck’s Ego Out blog, May 10–14.

  • Alan Smith

    Good plan! You write the sim and I will test it.

  • Ophelia Rump

    That seems irrefutable the way you put it, until you consider that the field would not exist today were it not for the good old-fashioned way. Using predictive methods to find something which your methods say does not exist is quite possibly an insurmountable variable all in itself.

  • georgehants

    I wonder if Mr. Rossi could give any help with this topic?

  • http://www.health-answers.co.uk Agaricus

    I’ll temporarily assume a new tag of ‘armchair_experimenter’ for this.

    ‘Proper’ science requires that only one variable at a time is altered, i.e., the experimental environment should remain constant except for one factor. IMHO, in the case of LENR, it seems important to develop a ‘test bed’ reactor which provides for heating of an inert capsule (quartz?) of reactant (‘fuel’) which can be easily swapped in and out of the apparatus, a separate field coil which can produce an EM field that pervades the reactant, and standardised, fixed instrumentation and logging facilities. Possibly a pair of electrodes fitted to the reactant capsules could be used to inject current into the fuel, modulated as required by the PC as for EM fields.

    The apparatus would allow systematic testing of reactants of differing composition, sealed into capsules at different pressures of H2 as required. A PC could be programmed to work its way through a preset series of frequencies and waveforms and log the results, which in turn would be used to evolve the controlling algorithm. All very tedious, but easy enough for me to suggest from my office swivel chair…
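    The PC-driven sweep described above could be sketched as a simple loop over stimulation settings with structured logging. Everything below is hypothetical: the frequencies, the waveforms and the read_temperature_c() stand-in are placeholders for real instrumentation, not an actual test-bed design.

```python
import csv
import itertools
import random
import time

# Hypothetical stimulation settings to sweep (placeholder values).
FREQUENCIES_HZ = [50, 500, 5_000, 50_000]
WAVEFORMS = ["sine", "square", "sawtooth"]

def read_temperature_c(freq_hz, waveform):
    """Stand-in for a real thermocouple readout; returns simulated data."""
    return 900.0 + random.uniform(-5.0, 5.0)

def run_sweep(log_path="sweep_log.csv", dwell_s=0.0):
    """Step through every frequency/waveform pair, logging one row each."""
    with open(log_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "freq_hz", "waveform", "temp_c"])
        for freq, wave in itertools.product(FREQUENCIES_HZ, WAVEFORMS):
            time.sleep(dwell_s)  # hold each setting before sampling
            temp = read_temperature_c(freq, wave)
            writer.writerow([f"{time.time():.3f}", freq, wave, f"{temp:.2f}"])

run_sweep()
```

    The same loop structure would extend naturally to current injection through the capsule electrodes: add the drive amplitude as a third itertools.product axis and log it alongside the rest.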

    Louis DeChiaro’s research notes, part of which were published on this blog a long time ago, suggest that NAVSEA have already done something like this, understood the mechanics, and established a method of predicting the stimuli necessary for driving LENR in any given metal matrix.

    • Alan Smith

      Hi Agaricus. You are quite right about the need for restraint when developing experimental variants, and equally correct about the need for a standardised test-bed. That is what led LFH to develop the ‘Model T’ low-cost reactor, and to continue with our own R&D and replication efforts. It is our hope that this very inexpensive, simple, fast, and flexible system will encourage more collective efforts. We are hoping to make LENR less of a spectator sport and more of a team game.
      In fact, this document about variables was created in response to a request from a well-known physicist (and client) to define the experimental space available to independent researchers. And it is of course far from comprehensive; LENR is a huge beach and we are only small boys playing with a few pebbles.
      I think (hope) Zephir was kidding when he said ‘we just sell experimental kit’ – in fact, sales take up only a small percentage of our working week; most of our time is spent reading, doing engineering work, and devising and running experiments. All of our interesting findings will be made freely available when we come up with something novel that works reliably. In the meantime we see little point in whipping up interest in experiments whose results are submerged beneath the error bars. When the going gets good, we will not be slow to let you know.

      • clovis ray

        Hi, Alan. I think you are absolutely correct, and I have been holding off for the very reasons you indicated. I too will be building a reactor, just for the knowledge of knowing how. As you know, I have built many devices, some with partial success, others failures. This reactor seems reasonably easy to construct, but then comes the hard part: how to configure it, how to mix the fuel, and how to operate the computer with the right data. Anyway, thanks Alan, you are always johnny-on-the-spot when it comes to helping our community discover new things.
        I will be buying my construction materials from you, and we all know you will get filthy rich off us garage mechanics, right? You’re a good scientist, I don’t care who says differently; I have worked with you on many projects and have long admired your work. Thanks for this great idea to help others find their dreams.
        I’ll probably wait until Dr. Rossi has his E-Cat well defined before I try it myself.

    • wpj

      As a chemist, I know there is a need for process optimisation in a plant, and many variables can exist, much as above. A complete run-through is rarely necessary.

      The best example I have seen was a lecture by a prof I expected to be extremely boring. They took six of what they believed to be the key factors and assigned five values to each. They then randomly drew a value for each factor from a hat and ran the experiment with those values. The results indicated the key factors in the process, which could then be further refined, such that the whole process was optimised in about 10 experiments.

      I’m sure that such an approach could also be applied to the LENR problem.
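      A minimal sketch of that randomized screening approach, with an assumed toy response standing in for a real experiment (the factor names, and the choice that two factors dominate, are illustrative assumptions, not anything from the lecture wpj describes):

```python
import random
from statistics import mean

random.seed(1)  # reproducible draws from the "hat"

# Assumed screening setup: six factors, five candidate levels each.
FACTORS = {f"F{i}": [1, 2, 3, 4, 5] for i in range(1, 7)}

def toy_response(settings):
    """Stand-in for a real experiment; here F2 and F5 dominate by design."""
    return 3.0 * settings["F2"] + 2.0 * settings["F5"] + random.gauss(0, 0.5)

# Draw ten random design points: one level per factor, pulled from the hat.
runs = []
for _ in range(10):
    settings = {name: random.choice(levels) for name, levels in FACTORS.items()}
    runs.append((settings, toy_response(settings)))

def main_effect(name):
    """Spread of mean response across the levels observed for one factor."""
    by_level = {}
    for settings, y in runs:
        by_level.setdefault(settings[name], []).append(y)
    level_means = [mean(ys) for ys in by_level.values()]
    return max(level_means) - min(level_means)

# Rank factors by estimated effect size; large spreads flag key factors.
ranked = sorted(FACTORS, key=main_effect, reverse=True)
print(ranked)
```

      With the key factors flagged, a second, smaller round of experiments over just those factors refines the optimum, which is how the whole process can converge in around ten runs.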

  • Zephir

    Work smarter, not harder – and listen to those who understand the subject.