<div dir="ltr"><div dir="ltr">On Tue, Oct 13, 2020 at 8:25 AM Benson Muite <<a href="mailto:benson_muite@emailplus.org">benson_muite@emailplus.org</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">A typical university cluster can run a data science workload with Spark, <br>
Hadoop etc., just requires Admins to make this possible. Systems like <br>
Comet are made for this kind of work:<br>
<a href="https://portal.xsede.org/sdsc-comet" rel="noreferrer" target="_blank">https://portal.xsede.org/sdsc-comet</a></blockquote><div><br></div><div>Thank you, I was unaware of this.</div><br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
> Once jobs start using tens to hundreds of thousands of core hours,
> taxpayer money (and probably also the environment) is saved by writing in a
> low-level language.

That's an interesting thought, thanks.
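A minimal back-of-envelope sketch of the scale involved, assuming a $0.05 per core-hour rate and a 10x speedup from a low-level rewrite (both purely illustrative numbers, not figures from this thread):

# Illustrative assumptions only: the rate and speedup below are made up for scale.
core_hours = 100_000             # job size in the range mentioned above
cost_per_core_hour = 0.05        # assumed USD cost of one core hour
speedup = 10                     # assumed speedup from rewriting hot paths in a low-level language

baseline_cost = core_hours * cost_per_core_hour
optimized_cost = baseline_cost / speedup
print(f"baseline ${baseline_cost:,.0f} -> optimized ${optimized_cost:,.0f} "
      f"(saves ${baseline_cost - optimized_cost:,.0f})")

At those (assumed) numbers, a single 100,000 core-hour job drops from roughly $5,000 to $500, and the gap only widens as jobs scale up.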
> A small number of countries design entirely new systems and train their
> students to write/port software for them - much as happened with bleeding
> edge systems 20 years ago :)

Can you provide some examples? I am curious about this. Thanks!