<div dir="ltr"><br><div class="gmail_extra"><br><br><div class="gmail_quote">On Fri, Nov 29, 2013 at 10:49 AM, Joe Landman <span dir="ltr"><<a href="mailto:landman@scalableinformatics.com" target="_blank">landman@scalableinformatics.com</a>></span> wrote:<br>
> Actually that in and of itself bears a strong need for discussion.
> Non-repeatable results, or results that cannot be repeated due to an
> inability to run or rerun the simulation, should be treated as transient
> results, and handled with the appropriate skepticism that infrequent
> results are. Or if the investigator wants to leverage some of the
> statistical techniques used in HEP, there are techniques for doing so.
This reminds me of an idea I had years ago to start two new scientific journals: one called "The Journal of Reproducible Research" and its evil brother, "The Journal of Irreproducible Research". Early in my graduate studies I wasted quite some time trying to reproduce another researcher's work, only to find out later, through the grapevine, that their work was riddled with errors, yet there was nothing in the literature to indicate this. Such journals would also provide some additional motivation to keep one's results and conclusions realistic. In my experience, the peer-review process can at times be ineffective or easily manipulated. Perhaps the computational sciences would be a good test market for this, coupled with an associated online repository of code, data, and binary containers.
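
Just to make the repository idea concrete (purely a hypothetical sketch, not tied to any existing journal or archive): a submission could carry a small manifest of content hashes for the code and data behind a result, plus the interpreter version, so a reviewer can at least confirm they are rerunning the same artifacts before arguing about the science. Something along these lines in Python:

# Hypothetical sketch: record SHA-256 digests of the code/data files behind a
# result, plus the Python version used, as a reproducibility manifest.
import hashlib
import json
import sys
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(artifact_paths, out_file="manifest.json"):
    """Write a manifest of file digests and the interpreter version."""
    manifest = {
        "python_version": sys.version,
        "artifacts": {str(p): sha256_of(Path(p)) for p in artifact_paths},
    }
    Path(out_file).write_text(json.dumps(manifest, indent=2))
    return manifest

if __name__ == "__main__":
    # Example: hash every file named on the command line.
    print(json.dumps(write_manifest(sys.argv[1:]), indent=2))

Running that over the scripts and input files for a paper would give reviewers a fixed point of reference, even before anyone tries to rerun the simulation itself.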
-- 
Kevin Van Workum, PhD
Sabalcore Computing Inc.
Run your code on 500 processors.
Sign up for a free trial account.
www.sabalcore.com
877-492-8027 ext. 11