supercomputer progress...

Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."



--

Anybody can count to one.

- Robert Widlar
 
jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."

I'm surprised they even noticed that detail. Too bad they never talked to
anybody over at the NOAA about how things work.
 
On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate
predictions. Using current supercomputer designs of combining
microprocessors used in personal computers, a system capable of making
such models would cost about $1 billion and use up 200 megawatts of
energy. A supercomputer using 20 million embedded processors, on the
other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?

John ;-#)#
 
On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
wrote:

On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate
predictions. Using current supercomputer designs of combining
microprocessors used in personal computers, a system capable of making
such models would cost about $1 billion and use up 200 megawatts of
energy. A supercomputer using 20 million embedded processors, on the
other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?

John ;-#)#

Does LBL measure energy in megawatts?

Do bigger computers predict climate better?

Oh dear.

--

If a man will begin with certainties, he shall end with doubts,
but if he will be content to begin with doubts he shall end in certainties.
Francis Bacon
 
On Wednesday, April 27, 2022 at 6:53:20 AM UTC+10, John Larkin wrote:
On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <sp...@flippers.com>
wrote:

On 2022/04/26 8:44 a.m., jla...@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate
predictions. Using current supercomputer designs of combining
microprocessors used in personal computers, a system capable of making
such models would cost about $1 billion and use up 200 megawatts of
energy. A supercomputer using 20 million embedded processors, on the
other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?

Probably don't have to bother. It's lost in the rounding errors.

> Does LBL measure energy in megawatts?

No, but the media department won't be staffed with people with degrees in physics (or any hard science).

> Do bigger computers predict climate better?

That remains to be seen, but modelling individual cloud masses at the 1km scale should work better than plugging in average cloud cover for regions broken up into 100km by 100km squares.
The IEEE Spectrum published an article on "Cloud computing" a few years ago that addressed this issue.
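
Back-of-the-envelope, and these are my numbers plus the standard CFL argument, not anything from the article: going from 100km cells to 1km cells multiplies the horizontal cell count by 10^4 and also forces roughly 100x shorter time steps, so the compute bill grows by about a million. Something like:

# Rough cost scaling for refining a climate-model grid (illustrative only).
# Assumes cost ~ (number of horizontal cells) x (number of time steps),
# with the time step shrinking linearly with cell size (CFL condition).

coarse_km = 100.0   # current-style grid cell edge
fine_km = 1.0       # cloud-resolving grid cell edge

cells_ratio = (coarse_km / fine_km) ** 2    # 2-D horizontal refinement
timestep_ratio = coarse_km / fine_km        # CFL: dt shrinks with dx

cost_ratio = cells_ratio * timestep_ratio
print(f"~{cells_ratio:.0e}x more columns, ~{timestep_ratio:.0f}x more steps")
print(f"total ~{cost_ratio:.0e}x more compute per simulated year")

Which is why they're proposing purpose-built hardware rather than waiting for commodity machines to catch up.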

> Oh dear.

John Larkin doesn't know much, and what he thinks he knows mostly comes from Anthony Watts' climate change denial web site.

--
Bill Sloman, Sydney
 
On a sunny day (Tue, 26 Apr 2022 16:56:33 -0000 (UTC)) it happened Cydrome
Leader <presence@MUNGEpanix.com> wrote in <t49881$clq$2@reader1.panix.com>:

jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."

I'm surprised they even noticed that detail. Too bad they never talked to
anybody over at the NOAA about how things work.

There is a lot of publish-or-perish pressure.
Somebody I knew did a PhD in psychology or something.
He got his doctorate with a paper about the sex life of some group living in the wild.
I asked him if he went there and experienced it...

No :)

if you read sciencedaily.com every day there are papers and things
discovered that are either too obvious to be worth reading
or too vague to be useful.
Do plants have feelings?
Do monkeys feel emotions?
sort of things
Of course they do.
Today:
Prehistoric People Created Art by Firelight
of course they did, no flashlights back then in a dark cave.
 
On a sunny day (Tue, 26 Apr 2022 13:53:08 -0700) it happened John Larkin
<jlarkin@highland_atwork_technology.com> wrote in
<fpmg6hhot88ajjqkcb6nv9mkbjm7s9q85k@4ax.com>:

On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
wrote:


On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate
predictions. Using current supercomputer designs of combining
microprocessors used in personal computers, a system capable of making
such models would cost about $1 billion and use up 200 megawatts of
energy. A supercomputer using 20 million embedded processors, on the
other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?

John ;-#)#

Does LBL measure energy in megawatts?

Do bigger computers predict climate better?

Oh dear.

I have read that CERN uses more power than all the windmills in Switzerland deliver together.
 
On 2022-04-27 12:19, Jan Panteltje wrote:
On a sunny day (Tue, 26 Apr 2022 13:53:08 -0700) it happened John Larkin
<jlarkin@highland_atwork_technology.com> wrote in
<fpmg6hhot88ajjqkcb6nv9mkbjm7s9q85k@4ax.com>:

On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
wrote:


On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate
predictions. Using current supercomputer designs of combining
microprocessors used in personal computers, a system capable of making
such models would cost about $1 billion and use up 200 megawatts of
energy. A supercomputer using 20 million embedded processors, on the
other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?

John ;-#)#

Does LBL measure energy in megawatts?

Do bigger computers predict climate better?

Oh dear.

I have read CERN uses more power than all windmills together deliver in Switzerland.

Yes, that sounds correct. CERN uses about 200MW when everything is
running. Switzerland has a little over 70MW of windmills installed.
Of course, those never actually deliver 70MW. More like 25% of that,
on average.
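
The arithmetic, using just the figures above (a sketch, and the 25% capacity factor is my round number):

# Average Swiss wind output vs CERN load, from the figures quoted above.
cern_load_mw = 200.0        # CERN, everything running
installed_wind_mw = 70.0    # Swiss installed wind capacity
capacity_factor = 0.25      # typical average output fraction

average_wind_mw = installed_wind_mw * capacity_factor
print(f"Average Swiss wind output: {average_wind_mw:.1f} MW")
print(f"CERN load is ~{cern_load_mw / average_wind_mw:.0f}x that")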

Most of CERN's electricity comes from the Genissiat dam in nearby
France.

Jeroen Belleman
 
On Wednesday, April 27, 2022 at 8:17:20 PM UTC+10, Jan Panteltje wrote:
On a sunny day (Tue, 26 Apr 2022 16:56:33 -0000 (UTC)) it happened Cydrome
Leader <pres...@MUNGEpanix.com> wrote in <t49881$clq$2...@reader1.panix.com>:
jla...@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

The conclusion from a senior scientist is that "it rains a lot more
during the worst storms."

I'm surprised they even noticed that detail. Too bad they never talked to
anybody over at the NOAA about how things work.
There is a lot of publish-or-perish pressure.
Somebody I knew did a PhD in psychology or something.
He got his doctorate with a paper about the sex life of some group living in the wild.
I asked him if he went there and experienced it...

No :)

if you read sciencedaily.com every day there are papers and things discovered that are either too obvious to be worth reading or too vague to be useful.

In Jan's ever-so-expert opinion.

Anything published in the peer-reviewed literature gets reviewed by people who do know something about the subject - the author's peers - who have to accept it as a useful and meaningful contribution.

Max Planck didn't bother sending out any of Einstein's 1905 papers for review. He had enough confidence in his own judgement, and he was right.

> Do plants have feelings?

Depends what you mean by feelings.

> Do monkeys feel emotions?

Obviously they do.

sort of things
Of course they do.
Today:
Prehistoric People Created Art by Firelight
of course they did, no flashlights back then in a dark cave.

That's all popular science. Peer-reviewed science is rather more technical.

--
Bill Sloman, Sydney
 
Anthony William Sloman <bill.sloman@ieee.org> wrote in
news:cc47cea9-5c27-411a-9793-f83274cfb007n@googlegroups.com:

On Wednesday, April 27, 2022 at 6:53:20 AM UTC+10, John Larkin
wrote:
On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson
<sp...@flippers.com> wrote:

On 2022/04/26 8:44 a.m., jla...@highlandsniptechnology.com
wrote:
Lawrence Berkeley Lab announced the results from a new
supercomputer analysis of climate change. They analyzed five
west coast \"extreme storms\" from 1982 to 2014.

https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to
make highly detailed, 1 kilometer scale cloud models to improve
climate predictions. Using current supercomputer designs of
combining microprocessors used in personal computers, a system
capable of making such models would cost about $1 billion and
use up 200 megawatts of energy. A supercomputer using 20 million
embedded processors, on the other hand, would cost about $75
million and use less than 4 megawatts of energy, according to
Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their
heat generation in the climate models?

Probably don't have to bother. It's lost in the rounding errors.

Does LBL measure energy in megawatts?

No, but the media department won't be staffed with people with
degrees in physics (or any hard science).

Do bigger computers predict climate better?

That remains to be seen, but modelling individual cloud masses at
the 1km scale should work better than plugging in average cloud
cover for regions broken up into 100km by 100km squares. The IEEE
Spectrum published an article on "Cloud computing" a few years ago
that addressed this issue.

Oh dear.

John Larkin doesn't know much, and what he thinks he knows mostly
comes from Anthony Watts' climate change denial web site.

1nm scale not kilometer.

I want to marry this woman...

<https://www.youtube.com/watch?v=0sUQkIyoF8M>
 
On Tue, 26 Apr 2022 13:53:08 -0700, John Larkin
<jlarkin@highland_atwork_technology.com> wrote:

On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com
wrote:


On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate
predictions. Using current supercomputer designs of combining
microprocessors used in personal computers, a system capable of making
such models would cost about $1 billion and use up 200 megawatts of
energy. A supercomputer using 20 million embedded processors, on the
other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?

John ;-#)#

Does LBL measure energy in megawatts?

Do bigger computers predict climate better?

Oh dear.

I think the jury has already returned: there is climate
change/global warming, and it is probably already too late to do much
about it, given the short time countries and people have to react.

Especially with all the global-warming denialists who don't care
about it, and given the state of the art and science of generating
non-greenhouse-gas energy.

I suppose that I won't be around to see how bad it will get, which
could be a good thing.

I would love to have a super computer to run LTspice.

boB
 
On Thu, 28 Apr 2022 09:26:40 -0700, boB <boB@K7IQ.com> wrote:

On Tue, 26 Apr 2022 13:53:08 -0700, John Larkin
<jlarkin@highland_atwork_technology.com> wrote:

On Tue, 26 Apr 2022 12:04:44 -0700, John Robertson <spam@flippers.com>
wrote:


On 2022/04/26 8:44 a.m., jlarkin@highlandsniptechnology.com wrote:
Lawrence Berkeley Lab announced the results from a new supercomputer
analysis of climate change. They analyzed five west coast "extreme
storms" from 1982 to 2014.

https://www.greenbiz.com/article/berkeley-lab-tensilica-collaborate-energy-efficient-climate-modeling-supercomputer

---------<quote>-----------------
Lawrence Berkeley National Laboratory scientists are looking to make
highly detailed, 1 kilometer scale cloud models to improve climate
predictions. Using current supercomputer designs of combining
microprocessors used in personal computers, a system capable of making
such models would cost about $1 billion and use up 200 megawatts of
energy. A supercomputer using 20 million embedded processors, on the
other hand, would cost about $75 million and use less than 4 megawatts
of energy, according to Lawrence Berkeley National Laboratory researchers.
-------------<end quote>--------------

4 megawatts/200 megawatts - do the computers factor in their heat
generation in the climate models?

John ;-#)#

Does LBL measure energy in megawatts?

Do bigger computers predict climate better?

Oh dear.


I think the jury has already returned: there is climate
change/global warming, and it is probably already too late to do much
about it, given the short time countries and people have to react.

At last! We'll all be dead in 8 years. I'd rather be drowned or blown
away than bored to death.



--

Anybody can count to one.

- Robert Widlar
 
On 4/28/22 11:26, boB wrote:

I would love to have a super computer to run LTspice.
I thought one of the problems with LTspice (and spice in general)
performance is that the algorithms don't parallelize very well.
 
On 2022-04-28 18:26, boB wrote:
[...]
I would love to have a super computer to run LTspice.

boB

In fact, what you have on your desk *is* a super computer,
in the 1970s meaning of the words. It's just that it's
bogged down running bloatware.

Jeroen Belleman
 
On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

On 4/28/22 11:26, boB wrote:

I would love to have a super computer to run LTspice.

I thought one of the problems with LTspice (and spice in general)
performance is that the algorithms don't parallelize very well.

LT runs on multiple cores now. I'd love the next gen LT Spice to run
on an Nvidia card. 100x at least.

--

If a man will begin with certainties, he shall end with doubts,
but if he will be content to begin with doubts he shall end in certainties.
Francis Bacon
 
On Thursday, April 28, 2022 at 1:47:11 PM UTC-4, Jeroen Belleman wrote:
On 2022-04-28 18:26, boB wrote:
[...]
I would love to have a super computer to run LTspice.

boB

In fact, what you have on your desk *is* a super computer,
in the 1970s meaning of the words. It's just that it's
bogged down running bloatware.

Even supercomputers from the 80s were not as fast as many of today's computers, and the memory was often 16,000 times smaller than that of a typical laptop today.

--

Rick C.

- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209
 
John Larkin wrote:
On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

On 4/28/22 11:26, boB wrote:

I would love to have a super computer to run LTspice.

I thought one of the problems with LTspice (and spice in general)
performance is that the algorithms don't parallelize very well.

LT runs on multiple cores now. I'd love the next gen LT Spice to run
on an Nvidia card. 100x at least.

The \"number of threads\" setting doesn\'t do anything very dramatic,
though, at least last time I tried. Splitting up the calculation
between cores would require all of them to communicate a couple of times
per time step, but lots of other simulation codes do that.

The main trouble is that the matrix defining the connectivity between
nodes is highly irregular in general.

Parallelizing that efficiently might well need a special-purpose
compiler, sort of similar to the profile-guided optimizer in the guts of
the FFTW code for computing DFTs. Probably not at all impossible, but
not that straightforward to implement.
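
To make that concrete, here's a minimal sketch of the sort of inner loop I mean, in Python with SciPy (the three-node RC ladder is invented for illustration; it's not how LTspice is actually written). Every accepted time step ends in a sparse solve, and nonlinear devices would put a refactorization inside a Newton loop on top of that, which is what's so hard to spread across cores.

# Minimal backward-Euler transient loop in the style of a nodal simulator.
# Linear 3-node RC ladder, so the matrix can be factored once; nonlinear
# devices would force a refactor every Newton iteration, which is the
# serial bottleneck discussed above.
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu

R, C, dt = 1e3, 1e-6, 1e-6          # ladder resistors, node caps, time step
g = 1.0 / R

# Nodal conductance matrix for the ladder, driven at node 0.
G = np.array([[2*g, -g, 0.0],
              [-g, 2*g, -g],
              [0.0, -g,  g]])
Cdiag = (C / dt) * np.eye(3)

A = csc_matrix(G + Cdiag)           # backward Euler: (G + C/dt) v_new = C/dt v + i
lu = splu(A)                        # sparse LU: building it is inherently sequential

v = np.zeros(3)
i_src = np.array([1e-3, 0.0, 0.0])  # 1 mA injected at node 0
for _ in range(1000):               # the per-step solve everything waits on
    v = lu.solve(Cdiag @ v + i_src)

print("node voltages after 1 ms:", v)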

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
Phil Hobbs <pcdhSpamMeSenseless@electrooptical.net> wrote:

John Larkin wrote:
On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

On 4/28/22 11:26, boB wrote:

I would love to have a super computer to run LTspice.

I thought one of the problems with LTspice (and spice in general)
performance is that the algorithms don't parallelize very well.

LT runs on multiple cores now. I'd love the next gen LT Spice to run
on an Nvidia card. 100x at least.


The \"number of threads\" setting doesn\'t do anything very dramatic,
though, at least last time I tried. Splitting up the calculation
between cores would require all of them to communicate a couple of times
per time step, but lots of other simulation codes do that.

The main trouble is that the matrix defining the connectivity between
nodes is highly irregular in general.

Parallelizing that efficiently might well need a special-purpose
compiler, sort of similar to the profile-guided optimizer in the guts of
the FFTW code for computing DFTs. Probably not at all impossible, but
not that straightforward to implement.

Cheers

Phil Hobbs

Supercomputers have thousands or hundreds of thousands of cores.

Quote:

\"Cerebras Systems has unveiled its new Wafer Scale Engine 2 processor with
a record-setting 2.6 trillion transistors and 850,000 AI-optimized cores.
It’s built for supercomputing tasks, and it’s the second time since 2019
that Los Altos, California-based Cerebras has unveiled a chip that is
basically an entire wafer."

https://venturebeat.com/2021/04/20/cerebras-systems-launches-new-ai-supercomputing-processor-with-2-6-trillion-transistors/


Man, I wish I were back living in Los Altos again.




--
MRM
 
On 28/04/2022 18:47, Jeroen Belleman wrote:
On 2022-04-28 18:26, boB wrote:
[...]
I would love to have a super computer to run LTspice.

boB

In fact, what you have on your desk *is* a super computer,
in the 1970s meaning of the words. It's just that it's
bogged down running bloatware.

Indeed. The Cray X-MP in its 4-CPU configuration had a 105MHz clock, a
whopping (for the time) 128MB of fast core memory, and 40GB of disk. The
one I used had an amazing (for the time) 1TB tape cassette backing store.
It did 600 MFLOPS with the right sort of parallel vector code.

That was back in the day when you needed special permission to use more
than 4MB of core on the timesharing IBM 3081 (approx 7 MIPS).

Current Intel 12th-gen desktop CPUs run at ~4GHz with 16GB of RAM and
>1TB of disk (and the upper limits are even higher). That combo does
~66,000 MFLOPS.
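
Putting ratios on those figures (just the numbers above):

# Ratios between the Cray X-MP figures and a current desktop, as quoted.
cray = {"clock_hz": 105e6, "ram_bytes": 128 * 2**20, "mflops": 600}
desktop = {"clock_hz": 4e9, "ram_bytes": 16 * 2**30, "mflops": 66_000}

for key in cray:
    print(f"{key}: {desktop[key] / cray[key]:.0f}x")

Roughly 38x on clock, 128x on memory and 110x on floating point.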

Spice simulation doesn't scale particularly well to large-scale
multiprocessor environments due to its many long-range interactions.
--
Regards,
Martin Brown
 
On 29/04/2022 07:09, Phil Hobbs wrote:
John Larkin wrote:
On Thu, 28 Apr 2022 12:01:59 -0500, Dennis <dennis@none.none> wrote:

On 4/28/22 11:26, boB wrote:

I would love to have a super computer to run LTspice.

I thought one of the problems with LTspice (and spice in general)
performance is that the algorithms don't parallelize very well.

LT runs on multiple cores now. I'd love the next gen LT Spice to run
on an Nvidia card. 100x at least.


The \"number of threads\" setting doesn\'t do anything very dramatic,
though, at least last time I tried.  Splitting up the calculation
between cores would require all of them to communicate a couple of times
per time step, but lots of other simulation codes do that.

If it is anything like chess problems then the memory bandwidth will
saturate long before all cores+threads are used to optimum effect. After
that point the additional threads merely cause it to run hotter.

I found setting max threads to about 70% of those notionally available
produced the most computing power with the least heat. After that the
performance gain per thread was negligible but the extra heat was not.

Having everything running full bore was actually slower and much hotter!
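
If you want to find that knee on your own machine, a crude probe like this works (Python sketch; it assumes numpy, whose large-array reductions release the GIL, so the threads really do run concurrently on a memory-bound job):

# Crude probe for the point where extra threads stop paying: time a
# memory-bandwidth-bound job at increasing thread counts.
import time
import numpy as np
from concurrent.futures import ThreadPoolExecutor

chunks = [np.random.rand(5_000_000) for _ in range(16)]  # ~640 MB total

def stream(a):
    return float(a.sum())  # one pass over memory, little arithmetic

for nthreads in (1, 2, 4, 8, 16):
    t0 = time.perf_counter()
    with ThreadPoolExecutor(max_workers=nthreads) as pool:
        list(pool.map(stream, chunks))
    dt = time.perf_counter() - t0
    print(f"{nthreads:2d} threads: {len(chunks) / dt:6.1f} chunks/s")

Throughput flattens once memory bandwidth saturates; past that point the extra threads just make heat.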
The main trouble is that the matrix defining the connectivity between
nodes is highly irregular in general.

Parallellizing that efficiently might well need a special-purpose
compiler, sort of similar to the profile-guided optimizer in the guts of
the FFTW code for computing DFTs.  Probably not at all impossible, but
not that straightforward to implement.

I'm less than impressed with profile-guided optimisers in compilers. The
only time I tried one in anger, the instrumentation code interfered with
the execution of the algorithms to such an extent that the profile was meaningless.

One gotcha I have identified in the latest MSC is that when it uses
wider SSE2, AVX, and AVX-512 types implicitly in its code generation it
does not align them on the stack properly, so that sometimes they are
split across two cache lines. I see two distinct speeds for each
benchmark code segment depending on how the cache alignment falls.

Basically the compiler forces stack alignment to 8 bytes and cache lines
are 64 bytes, but the compiler-generated objects in play are 16 bytes, 32
bytes or 64 bytes, giving alignment failure fractions of 1:4, 2:4 and 3:4.

If you manually allocate such objects you can use pragmas to force
optimal alignment, but when the code generator chooses to use them
internally you have no such control. Even so, the MS compiler does
generate blisteringly fast code compared to either Intel or GCC.
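
You can at least measure how often allocations happen to land on a cache-line boundary. A quick probe (numpy here purely for convenience; it says nothing about the MS compiler's stack layout, only about where the heap allocator puts things):

# Fraction of heap allocations whose data pointer is 64-byte
# (cache-line) aligned. Illustrates the alignment-fraction idea only.
import numpy as np

N = 10_000
arrays = [np.empty(32) for _ in range(N)]  # keep them all alive
aligned = sum(a.ctypes.data % 64 == 0 for a in arrays)
print(f"{aligned / N:.1%} of allocations were 64-byte aligned")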

--
Regards,
Martin Brown
 
