Is this Intel i7 machine good for LTSpice?

On Tue, 04 Nov 2014 09:08:33 -0800, DecadentLinuxUserNumeroUno
<DLU1@DecadentLinuxUser.org> Gave us:


Way too much waste in this nation and the world, for that matter.

This one looks real nice.

http://www.ebay.com/itm/HP-PROLIANT-DL360-G6-484184-B21-SERVER-2x-QC-E5504-2-0GHZ-4GB-2x-146GB-10K-RPM-/361075538793?pt=COMP_EN_Servers&hash=item5411c77b69

Not bad for a $5k computer.

I wish I had the extra cash right now.
 
On 11/04/2014 12:16 PM, DecadentLinuxUserNumeroUno wrote:
On Tue, 04 Nov 2014 09:08:33 -0800, DecadentLinuxUserNumeroUno
<DLU1@DecadentLinuxUser.org> Gave us:



Way too much waste in this nation and the world, for that matter.

This one looks real nice.

http://www.ebay.com/itm/HP-PROLIANT-DL360-G6-484184-B21-SERVER-2x-QC-E5504-2-0GHZ-4GB-2x-146GB-10K-RPM-/361075538793?pt=COMP_EN_Servers&hash=item5411c77b69

Not bad for a $5k computer.

I wish I had the extra cash right now.

Yowza. I paid $3800 for my 16-core box in 2011, which looks like it'll
run rings round the HP thing.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
 
On Tue, 04 Nov 2014 12:33:08 -0500, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> Gave us:

Yowza. I paid $3800 for my 16-core box in 2011, which looks like it'll
run rings round the HP thing.

Cheers

Operating speed? Actual CPUs used?

"Run rings" is pretty ambiguous, and the machine is older. They are
at G8 now. So the CPU speed and RAM are better now than then.

And all those great accessories, like multiple 1GbE ports, and an SAS
SCSI RAID interface and controller.

And SFP and XFP optical fabric can be added.

It is a pretty good deal for $600.
 
On 11/4/2014 9:55 AM, Joerg wrote:
rickman wrote:
On 11/3/2014 8:02 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 6:56 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 5:09 PM, Joerg wrote:
rickman wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com>
wrote:

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm
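
(For instance - a hedged sketch, assuming the LTspice IV executable name
scad3.exe and a made-up path - a shortcut or batch file could launch it as:

  scad3.exe -ini C:\LTspiceData\scad3.ini

so the preferences land somewhere writeable instead of C:\windows.)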




I need to note this somewhere. Writing to the Windows directory
is a
*very* bad idea.

It was standard procedure in Windoze 3.1, where almost all
applications dropped pick_a_name.ini files in the C:\Windows\
directory.

Yes, and Windows 3.1 crashed on a regular basis for about any reason
whatsoever just like 95, 98 and ME.

MS has been telling developers since Win2000 and maybe since NT
to not
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
put data files in the Windows or Program Files directories. Many
chose
to ignore this which wasn't enforced until Vista and became one
of the
things everyone loves to hate about Vista.


Maybe. But for us users only one thing counts: That stuff works.

Do you build your stuff so that if the user connects a different
computer it craps out? No, you design your interfaces *correctly* so
that it works now and it keeps working when some peripheral piece that
should have no impact is changed out.


I design it so that it also works correctly with legacy gear. In
aerospace that can mean equipment from before you and I were born.

And if that gear was not designed to spec you are screwed. You will
have to sit down and reverse engineer the unit so you can design the
interface. Do you really expect MS to do that with all the crappy
software that was designed poorly?


I expect them to provide a way that programs can write into their
install directories. What is so difficult about that?

You can do that. Don't install it in Program Files. Many programs do
just that if they don't want to play by the rules. The ones that can't
figure out that there *are* rules and just ignore the requirements of
the OS don't work so well.


I always install in another directory but some SW won't give you a choice.

[...]


I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze hide these
config files in either the registry, or bury them 5 directory
layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is
up to
the app to decide. Getting to these directories is easy if they
used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files and now with Win8 MS puts the files in the long
path
name you list, but I believe they can be reached transparently
through
the path C:\Program Files. So, the best of both worlds.

If the app puts them somewhere else, don't blame windows.


If it was allowed in old Windows, isn't in new Windows, and there
isn't
a user selector about this then I blame Windows.

"Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
developers were not designing according to best practices, no.


If it was not disallowed it was ok.

See, that is the BS that got you into the problem. Now you are trying
to justify the bad development practices. I surely hope you don't use
that philosophy in the stuff you design. If it works, it is ok, ship
it! Then someone changes a process a bit and the design stops working.


Nope. Software that writes into its program folder may not be ideal but
can be perfectly sound.

Oh, I'm sorry, I didn't realize you were an OS expert. I guess you can
consult with MS and help them fix their problems.


Sometimes it would behove them to listen to customers some more. It
would most certainly have prevented the failure of their RTOS efforts.
After an Embedsyscon I told them they'll fail with RT and why. And then
they did.

Same with the housing bubble. I still remember a top notch realtor
laughing at me. Then they lost their own house ...


Even today it's still this way.
Personally I also think it was wrong but it is what it is. Many CAD
programs still store their libraries in the program folder and,
naturally, libraries are meant to be modified and added to.

Then put that on the CAD designers, not MS. They told them not to do it
with W2k and XP, and made it hard to do with Vista and 7. Now with
Win8 they have found a way to fake it out and put the files somewhere
else. They are just trying to make the computer harder to hack, but no
one wants to work with them.


If this causes older CAD and other stuff not to run I do not want that
OS. My computer is a tool which I expect to be able to do the jobs that
I've been doing for decades. If it can't do that, it isn't very useful
to me. Then I will strive to buy another one which can do that. It's
that simple.

Ever wondered why industrial users hung on to XP for so long and are now
(grudgingly) upgrading to Win-7 while shunning Win-8?

Windows 7 has many of the same issues you don't like about Win 8.


I know :-(

But what can you do other than VMs?



I can't tell you how many developers do all sorts of
things they aren't supposed to under windows. That is the actual
cause
of many problems people have running older software under Windows.
They
don't listen to the people providing them with the OS!

LTspice (aka SwitcherCAD) is a rather old program, with many of the
traditions of Windoze 3.1 still present. If you don't like
that, try
running some of the various NEC antenna modeling programs, that
still
use the terms "card" and "deck" from the Hollerith punch card era.
The
common mantra is the same everywhere... if it works, don't touch
it.

These programs have been updated many, many times since Windows 3.1.
Windows NT, 2k, XP, Vista, 7, 8 and 8.1 aren't even the same OS
as the
3.1 tree which was ended when XP was released. Stick with the old
habits and blame yourself or your program maintainer.

I use some open source Windows software that does the same crap and
I am
very vocal about the cause and the fix for the problem. Few of the
developers are interested though. Now that 8 makes this (using
Program
Files for data) work adequately they no longer have a need to
change it.

If you are relying on programming habits from over 20 years ago,
then
you will have to stew in your own soup.


Easy to say for someone who probably never has to deal with beamfield
sims and such. Bottomline there are programs some of us have to use
where there is no alternative. Where the design teams have dissolved
decades ago and some of the folks are not with us on earth
anymore. My
record so far is a chunk of software that was stored on an 8" floppy.

Yeah, exactly. If you are that far back in time you may need to
rethink
your approach.


Any suggestions there? Should I tell my clients that this, that and the
other project can't be done because only older design tools are
available?

Yes, tell your clients that there has been no software written since
1975.


ROFL! You clearly could not do my job.

No, I would just get more current software.


So how do you convince a research group that disbanded in the '80s
or '90s, and whose professor is retired, to pull that off?

It must be some pretty special software that there is only one source in
the world. But then I guess there is only one person who seems to need
it.

Really? There isn't anything else produced in the last 20 years that
will do the job without requiring you to burden computer purchases for
the rest of your career?


I suppose you are still driving the car you had in the 70's too?


I would still be driving my 1987 Audi station wagon if it had been possible to
register it in California. My former neighbor has it now and it still
works fine. I drive a 1997 SUV and the way it goes I might still drive
that 10 years from now.

A former coworker tools around in one of two cars. A 1950's Chevy truck
or a 1950's Bel Air, sometimes depending on whether he has to pick up
heavier stuff at the hardware store. Both cars in impeccable shape. Why
should he "upgrade"?

Yes, he "tools around". Not exactly the same as doing work.


He commutes to and from the work place, gets all his materials, what
else can one ask? A farmer around here still uses one of those Chevys in
the field, every day. Of course, that one does not look showroom.

Great. But when something breaks he is down for days or weeks until it
can be fixed *if* he can find a part. Not so different from you having
to select your computer based on your need for older operating systems.


Win 7 will be around for a long time. MS has learned from the Vista
debacle, giving XP a long lifetime. They know that Win-8 is in a lot of
aspects a dud.

No, MS knows how to make money and Win 7 won't be around for a long time.

I just find it funny how every new Windows that comes out is spawn of
the devil.


Because some of them were.

Like XP?


... I remember when XP came out it was shunned as being far too
restrictive. lol I'm looking for bumper stickers, "you can have my Win
XP when you tear it from my cold, dead hands".


I liked it right from the start. And it's been good to me.


Win 7 was considered to be a poor choice, but became successful because
it was the only choice.


Right now it is but wasn't until recently.

Yes, and you are seeking high and low for Win 7 computers that were
shunned earlier. There really isn't much point in avoiding the current
OS as long as it has been out long enough to have wide peripheral
support and software support. I got Win 8 after it had been out a year
and I have seen no problems. Win 8.1 still doesn't show up in enough
systems requirements lists for me to upgrade, but I will likely do that
at some later point... assuming I can find a reason to do so. If it
ain't broke, don't fix it.

That goes both ways. I can't see where Win 8 is broke, so I ain't gonna
try to fix it with Win 7.

--

Rick
 
On 11/4/2014 10:07 AM, Joerg wrote:
Martin Brown wrote:
On 04/11/2014 01:02, Joerg wrote:
rickman wrote:
On 11/3/2014 6:56 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 5:09 PM, Joerg wrote:
rickman wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com>
wrote:

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm




I need to note this somewhere. Writing to the Windows directory
is a
*very* bad idea.

It was standard procedure in Windoze 3.1, where almost all
applications dropped pick_a_name.ini files in the C:\Windows\
directory.

Yes, and Windows 3.1 crashed on a regular basis for about any reason
whatsoever just like 95, 98 and ME.

MS has been telling developers since Win2000 and maybe since NT
to not
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
put data files in the Windows or Program Files directories. Many
chose
to ignore this which wasn't enforced until Vista and became one
of the
things everyone loves to hate about Vista.


Maybe. But for us users only one thing counts: That stuff works.

Do you build your stuff so that if the user connects a different
computer it craps out? No, you design your interfaces *correctly* so
that it works now and it keeps working when some peripheral piece that
should have no impact is changed out.


I design it so that it also works correctly with legacy gear. In
aerospace that can mean equipment from before you and I were born.

And if that gear was not designed to spec you are screwed. You will
have to sit down and reverse engineer the unit so you can design the
interface. Do you really expect MS to do that with all the crappy
software that was designed poorly?


I expect them to provide a way that programs can write into their
install directories. What is so difficult about that?

It is no longer considered good practice to permit this without asking
permission. ...


Still, a good OS must support legacy SW. This decision whether to allow
or not should be left to the customer.

You are aware that this is not a function of the OS innately, right?
There is nothing different about XP, Vista, Win 7 and Win 8 in this
regard. It is just an issue of permissions set by default. Even NT
could have the same restrictions if the permissions were set accordingly.

So just change the permissions on Program Files and you are done!
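
(On a stock install that is one elevated command with icacls - a sketch
only, the application folder name here is hypothetical:

  icacls "C:\Program Files\OldCadApp" /grant Users:(OI)(CI)M /T

which gives ordinary users modify rights on that one folder tree without
opening up the rest of Program Files.)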


... It could be writing things that modify executable code.


That's easily preventable if you give customers choices. They could, for
example, allow writing but exclude changes to executable files.

They do that with permissions. Just adjust the permissions to suit your
needs and damage control be damned.


Legacy code that needs to do this should live in some 8.3 filename
compatible hovel from the root directory. You will otherwise find
programs that don't work because fully qualified filenames overflow
buffers in ancient DOS programs. Peeky pokey ancient print IO port stuff
is at the mercy of the OS as to whether or not it will work.


Again, a user could easily make directories that comply with old styles.

Yes, that is what you are being told.


... OS
development is continuing. There are significant problems with older OS
and they are trying to fix those problems. If you want to run DOS
software, why not run DOS? I have read here that it is still available
and the hardware should still run it.

Supposedly that's a problem on PCs with Win-8. With XP, no problem, I've
done it. And I do not need to reboot for that. This is what I call
performance.

Comparatively few DOS programs are really tetchy about what version they
are on and some of them merely baulk at running on an OS that has
version numbers much higher than it expects to see.


Sure, but blanket-banning any 16-bit apps is a really bad idea. It
results in lost biz opportunity for an OS maker because people will be
leery of upgrades.

Who bans 16 bit apps? I did a search and every Windows through 8 runs
16 bit apps.

But really... I expect the number of customers that MS loses from 20
year old programs that won't run is in the single digits.

Maybe you should switch to Linux? I understand you can run nearly any
old version of Linux on any old hardware. Oh, but you won't have the
applications, will you?


I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze hide these
config files in either the registry, or bury them 5 directory
layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is
up to
the app to decide. Getting to these directories is easy if they
used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files and now with Win8 MS puts the files in the long
path
name you list, but I believe they can be reached transparently
through
the path C:\Program Files. So, the best of both worlds.

If the app puts them somewhere else, don't blame windows.


If it was allowed in old Windows, isn't in new Windows, and there
isn't
a user selector about this then I blame Windows.

"Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
developers were not designing according to best practices, no.

If it was not disallowed it was ok.

See, that is the BS that got you into the problem. Now you are trying
to justify the bad development practices. I surely hope you don't use
that philosophy in the stuff you design. If it works, it is ok, ship
it! Then someone changes a process a bit and the design stops working.


Nope. Software that writes into its program folder may not be ideal but
can be perfectly sound.

Yes. But if you want to do that for legacy software, keep a specific directory
where the permissions are right for this (ab)usage.

I wouldn't recommend installing it in user documents or the directory
name length starts getting a bit long for comfort.


That's all ok as long as the OS does not blanket-ban the old stuff.

Are you going to explain what you are talking about?


Even today it's still this way.
Personally I also think it was wrong but it is what it is. Many CAD
programs still store their libraries in the program folder and,
naturally, libraries are meant to be modified and added to.

Then put that on the CAD designers, not MS. They told them not to do it
with W2k and XP, and made it hard to do with Vista and 7. Now with
Win8 they have found a way to fake it out and put the files somewhere
else. They are just trying to make the computer harder to hack, but no
one wants to work with them.


If this causes older CAD and other stuff not to run I do not want that
OS. My computer is a tool which I expect to be able to do the jobs that
I've been doing for decades. If it can't do that, it isn't very useful
to me. Then I will strive to buy another one which can do that. It's
that simple.

Most windows programs now do allow you to put the libraries in user
documents or somewhere else that they can be modified safely.


Some older CAD doesn't and there's the problem. There is a lot of custom
software that is de facto irreplaceable.

I have a hard time believing that. I think this is a Joerg's world
issue. I guess no one does the sort of work you do because they can't
find the software.


My favourites are

C:\Program.dos

and

C:\DATA

For legacy DOS code that is only 8.3 filename aware. You are probably
out of luck if you have any of the software that insists on having a
dongle plugged into the non-existent Centronics printer port these days.


Whoever buys dongled software brought the wrath upon themselves. That is
one thing I never did and never will do.


Ever wondered why industrial users hung on to XP for so long and are now
(grudgingly) upgrading to Win-7 while shunning Win-8?

Main reason is inertia and Vista was such a dog.

Other problem is that scientific instrument makers (and engineering
toolmakers) can't be bothered to provide drivers for 10 year old kit on
newer OSs and the gear will typically last for 15-20 years.


Make that 30+ years :)


Several of my wife's older lab instruments are firewalled off from the
corporate network because they run now unsupported legacy XP with no
prospect of ever upgrading to any new OS. Device drivers simply do not
exist - of course the maker would love to sell them a brand new one.


I came across one piece of production equipment that would only run
under Windows 3.2.


Example from a few years ago: A nasty alarm system problem had to be
diagnosed. The software from that system was from the 80's. If I hadn't
been able to run that really old software here at my lab I would have had
to turn down that whole job. That would not be what I call smart.

I guess you will have to close down shop in a few more years then. Even
Win7 is going bye-bye before too long. You can learn to use computers
or be a victim, your choice.

Win 7 will be around for a long time. MS has learned from the Vista
debacle, giving XP a long lifetime. They know that Win-8 is in a lot of
aspects a dud.

Mainly because it forces desktop users to leave greasy fingerprints on
their screens and the new GUI looks like Picasso on a bad acid trip.


I understand one can get it to behave somewhat normally but a regular
non-techie user can't get that done. Others just don't have the time to
fix it. So they avoid it.

I don't know where you get your info. Nearly all of my problems have to
do with the computer rather than the OS. This Lenovo laptop has
function keys that aren't function keys. They are laptop control keys
and I have to press the function button to make them into function keys.
Then some of the combos don't work, like Ctrl-F3. There are many
other Lenovo issues, but Win 8 is doing well. In fact there are a
number of new features that let me get on with my work better than
before. In fact they are so transparent that I can't even recall what
they are.

I had to get used to some of the visual differences in Win 8, but that
is not a big deal. Every new version updates the look, same as a car.

--

Rick
 
On 11/4/2014 3:54 AM, Martin Brown wrote:
On 04/11/2014 07:20, upsidedown@downunder.com wrote:
On Mon, 03 Nov 2014 19:31:10 -0500, rickman <gnuarm@gmail.com> wrote:

On 11/3/2014 6:59 PM, Joerg wrote:

It _is_ separate from the main memory on all modern graphics cards. The
core circuitry of a PC has nothing to do with screen refresh. That was
even the case with an old Tseng Labs card I had in the early 90's.

Ok, now I understand why you couldn't get what I am saying. Computers
have come a long way since the 90's. Most computers, desktop as well as
laptop now integrate the video controller into the main chipset and use
main memory as video RAM, *NOT* as a separate function with its own
memory. If you don't believe me look at the specs on a few systems.
Anything that talks about Intel XYZ graphics has an integrated
controller and shares main memory for video. In fact, you said
something about this yourself in this thread where you mentioned video
on the motherboard I believe.

Video on the motherboard is usually integrated. If you get a graphics
card it will be separate. A very few motherboards have separate video
controller on board with separate video memory.

1920x1080x60x24bits is just 120 Mpixels/s or 360 MB/s.
DDR3 memories have peak transfer rates over 10 GB/s, so the video
refresh is less than 3 % of the memory bandwidth.
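
(Anyone who wants to sanity-check that arithmetic can run a few lines of
C - the peak figure below assumes one channel of DDR3-1600, which is an
assumption, not a measurement:

  #include <stdio.h>
  int main(void)
  {
      double pix  = 1920.0 * 1080.0 * 60.0;  /* ~124 Mpixel/s refresh    */
      double bw   = pix * 3.0;               /* 3 bytes/pixel, ~373 MB/s */
      double ddr3 = 12.8e9;                  /* DDR3-1600, one channel   */
      printf("%.0f MB/s = %.1f%% of peak\n", bw / 1e6, 100.0 * bw / ddr3);
      return 0;
  }

which prints roughly 373 MB/s and 2.9%, the same ballpark as quoted above.)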


And provided that the high performance code that you are running is
sensible and cache aware the hit from the video refresh overhead is
barely detectable. The box runs a *lot* cooler without a 3D GPU in.

That is a highly inaccurate assumption. Multicore CPUs are very much
memory bandwidth limited. Read about the memory wall. Once you reach 3
or 4, adding CPUs gives diminishing returns for performance. Boost the
memory speed and performance picks up again. Take away memory bandwidth
and the CPU speed falls off as well. The point is why pay hundreds of
dollars for extra CPU speed only to piss it away with an on chip
graphics controller sharing the memory bus?

--

Rick
 
On 11/4/2014 10:11 AM, Joerg wrote:
rickman wrote:
On 11/3/2014 6:59 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 4:57 PM, Lasse Langwadt Christensen wrote:
On Monday, November 3, 2014 at 10:42:14 PM UTC+1, rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card,
I miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no
movies.

If you are going for power, you need to have separate video
memory or
the video eats memory bandwidth which is often the limiting factor
on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with
multi-banked
RAM. Does this machine have two or more memory interfaces or just
one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable
bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.


with anything but a graphics-card integrated in the chipset that memory
will be on the card itself, 1920*1080*24bit is less than 7MB

It is not the amount of memory, it is the video bandwidth to keep the
monitor refreshed. Yes, it should be separate from the main memory or
you take a hit from the video accesses.


It _is_ separate from the main memory on all modern graphics cards. The
core circuitry of a PC has nothing to do with screen refresh. That was
even the case with an old Tseng Labs card I had in the early 90's.

Ok, now I understand why you couldn't get what I am saying. Computers
have come a long way since the 90's. Most computers, desktop as well as
laptop now integrate the video controller into the main chipset and use
main memory as video RAM, *NOT* as a separate function with its own
memory.


Serious machines don't.


If you don't believe me look at the specs on a few systems.


Look at the Dell XPS series. It has the video completely separated. As
it should be.

I don't know what a "serious" machine is. I think that would be hard to
spec. It is easy to spec a separate video controller and memory.


Anything that talks about Intel XYZ graphics has an integrated
controller and shares main memory for video. In fact, you said
something about this yourself in this thread where you mentioned video
on the motherboard I believe.


Simple computers have that but the bigger machines do not. Or sometimes
have it but it's not being used because there is a big fat Nvidia or
other graphics card in there.

That's what I'm saying. Make sure you have a video card, then you can
turn off the internal video.


Video on the motherboard is usually integrated. If you get a graphics
card it will be separate. A very few motherboards have separate video
controller on board with separate video memory.


A "few"? When have you last looked at business-class computers?

"Business class" is a marketing term and means nothing for the specs.

--

Rick
 
On 11/4/2014 2:40 AM, upsidedown@downunder.com wrote:
On Mon, 03 Nov 2014 19:37:02 -0500, rickman <gnuarm@gmail.com> wrote:

On 11/3/2014 6:58 PM, Lasse Langwadt Christensen wrote:
On Tuesday, November 4, 2014 at 12:42:36 AM UTC+1, rickman wrote:
On 11/3/2014 5:47 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I
miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

This has *nothing* to do with drawing the images into graphic memory.

The memory bank question will likely be more important than the number
of cores in the CPU. The guy who can run 16 threads has at least two
memory interfaces or it would be bogging down between 4 and 8 cores.


Well ... we did enter the 21st century. In this day and age graphics
cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of on
board RAM. Others have more but that sounds sufficient. Also, there is
no need to store 60 frames if the content is more or less static.

I don't know if you are playing with me or what. Yes, that is what I am
telling you, get a system with separate graphic memory which means a
separate graphics chip. Many mobos have built in video with *no* video
ram.


even so, unless you play 3D games that need massive amounts of texture memory,
I doubt it matters much

an i7 has something like 30+ GByte/sec memory BW depending on memory config

refreshing two full HD monitors at 60Hz is only a few percent of that

Yes, exactly! Just refreshing the monitors is some significant
percentage of the available memory bandwidth. Why spend a bunch of
money on an i7 with fast memory only to share that with the video
controller? Running multicore is typically memory bandwidth limited so
a 5 or 10% hit to the memory bandwidth will be a 5 to 10% hit to CPU
performance in the critical sections of code... where it matters.

After startup, I would expect that in simulation both code and
intermediate results are read from the cache, which these days seems to
be several megabytes. The main memory is mainly needed to store
results.

If a huge amount of data is to be generated, a sensible simulation
program on a 64 bit machine would allocate hundreds of gigabytes of
virtual memory and write the result into that memory. Associate a disk
file with that virtual memory range (memory mapped file) and let the
page fault mechanism write those virtual memory pages to disk in the
background.
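
(A minimal Win32 sketch of that memory-mapped-file idea - the file name
and size are invented, and error handling is mostly trimmed:

  #include <windows.h>

  int main(void)
  {
      HANDLE f = CreateFileA("results.bin", GENERIC_READ | GENERIC_WRITE,
                             0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL,
                             NULL);
      if (f == INVALID_HANDLE_VALUE) return 1;

      LARGE_INTEGER sz;
      sz.QuadPart = 1LL << 30;                  /* 1 GB; could be far more */
      HANDLE m = CreateFileMappingA(f, NULL, PAGE_READWRITE,
                                    sz.HighPart, sz.LowPart, NULL);
      double *res = m ? (double *)MapViewOfFile(m, FILE_MAP_WRITE, 0, 0, 0)
                      : NULL;
      if (!res) return 1;

      for (long long i = 0; i < (1LL << 27); i++)  /* 2^27 doubles = 1 GB */
          res[i] = 0.0;       /* dirty pages get flushed by the pager */

      UnmapViewOfFile(res);
      CloseHandle(m);
      CloseHandle(f);
      return 0;
  }

The simulator just writes through the pointer and the OS pages the results
out to disk in the background, exactly as described above.)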

So if I understand correctly, all simulations will either be small
enough to fit in cache or be so large as to require paging to disk? How
do you know this?

--

Rick
 
On 11/4/2014 10:13 AM, Joerg wrote:
rickman wrote:
On 11/3/2014 5:47 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I
miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor
on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just
one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

This has *nothing* to do with drawing the images into graphic memory.

The memory bank question will likely be more important than the number
of cores in the CPU. The guy who can run 16 threads has at least two
memory interfaces or it would be bogging down between 4 and 8 cores.


Well ... we did enter the 21st century. In this day and age graphics
cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of on
board RAM. Others have more but that sounds sufficient. Also, there is
no need to store 60 frames if the content is more or less static.

I don't know if you are playing with me or what. Yes, that is what I am
telling you, get a system with separate graphic memory which means a
separate graphics chip. Many mobos have built in video with *no* video
ram.


Yes, and that would suffice for my purposes. But in this thread we were
talking about a different class of computers, the Dell XPS series.

Sometimes I think you just make stuff up.

--

Rick
 
Joerg wrote:

miso wrote:
Joerg wrote:

There are so many variants of graphics cards that it would require tons
of work for Mike's team.


It isn't the graphics card as much as the standard of acceleration. ATI
and Nvidia use different standards.

NGspice has CUDA support, which means you need Nvidia.
http://ngspice.sourceforge.net/

You also need an OS that supports CUDA.


That's where it becomes esoteric to me. I just want to install LTSpice
and ... simulate. Not get into the business of IT and computer science
which is pretty foreign to us analog guys anyhow.

Uh, I am the analog guy.

NGspice is easy to compile, at least under Linux. I don't have a proper
video card to test the CUDA support, but will make it a point to get one when I
upgrade.
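
(For the curious, from a release tarball it is the usual autotools
sequence - configure flags vary by release, and the CUDA-enabled tree
(CUSPICE) has its own options, so treat this as a generic sketch:

  ./configure
  make
  sudo make install
)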
 
DecadentLinuxUserNumeroUno wrote:

On Mon, 03 Nov 2014 16:01:39 -0500, rickman <gnuarm@gmail.com> Gave us:

So DOS programs? What makes you think they won't work under Win8? The
usual FUD?


So, you really know nothing about actual attachment to real hardware
hooks then, eh, dingledorf?

You ain't real bright, boy.

Under DOS (not to be confused with the windows "cmd"), you "own" the
hardware. It is not a multi-user OS so there is no need to "abstract" the
peripherals.

The classic DOS hacker program was to bit bang the parallel port. Since you
could write directly to the port, it wasn't that hard. The second most
popular DOS hack was to read the level-sensitive pins on the serial port
(i.e., pins that don't go through the UART).

If you are just running code to do calculations, DOSBOX is the way to go.
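
(For flavor, the parallel-port bit-bang looked something like this in the
old Borland/Turbo C idiom - 0x378 is only the conventional LPT1 base
address, not guaranteed on every machine:

  #include <dos.h>

  #define LPT1_DATA   0x378
  #define LPT1_STATUS 0x379

  void toggle_pin0(void)
  {
      outportb(LPT1_DATA, 0x01);   /* drive data bit 0 high */
      outportb(LPT1_DATA, 0x00);   /* and back low          */
  }

  int printer_busy(void)
  {
      /* BUSY comes in inverted on status bit 7, straight off the pin */
      return !(inportb(LPT1_STATUS) & 0x80);
  }

No driver, no abstraction - which is exactly why NT-family Windows traps it.)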
 
Phil Hobbs wrote:


Which processors is in there?


It has a pair of AMD Opteron 6128s. I haven't been keeping up, but 3
years ago the Magny Cours Opterons ran rings around the Intel offerings
for floating point.

Cheers

Phil Hobbs

I was a big AMD fan, but Intel has trumped them. It isn't even a contest
today.

I will say that the AMD CPUs have better memory management, so they do
multitask a little better, but that can't save them on today's market.

I delayed building this Xeon PC hoping AMD would get their act together, but
I gave up.
 
Have I used Spice to analyze a resistor divider? Actually yes, but in
finite
element analysis to simulate a laser trim procedure. The basic networks
are
designed by hand.

John Larkin wrote:

> Do that if you enjoy it. I

You have no fucking clue what I am talking about. I might as well be talking
to the wall. Have you ever designed a chip where you laser trim thin film
resistors?
 
rickman wrote:
On 11/4/2014 10:13 AM, Joerg wrote:
rickman wrote:
On 11/3/2014 5:47 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video
card, I
miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no
movies.

If you are going for power, you need to have separate video
memory or
the video eats memory bandwidth which is often the limiting factor
on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with
multi-banked
RAM. Does this machine have two or more memory interfaces or just
one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable
bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

This has *nothing* to do with drawing the images into graphic memory.

The memory bank question will likely be more important than the number
of cores in the CPU. The guy who can run 16 threads has at least two
memory interfaces or it would be bogging down between 4 and 8 cores.


Well ... we did enter the 21st century. In this day and age graphics
cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of on
board RAM. Others have more but that sounds sufficient. Also, there is
no need to store 60 frames if the content is more or less static.

I don't know if you are playing with me or what. Yes, that is what I am
telling you, get a system with separate graphic memory which means a
separate graphics chip. Many mobos have built in video with *no* video
ram.


Yes, and that would suffice for my purposes. But in this thread we were
talking about a different class of computers, the Dell XPS series.

Sometimes I think you just make stuff up.

Did you read the subject line of this thread?

--
Regards, Joerg

http://www.analogconsultants.com/
 
On Tue, 04 Nov 2014 07:16:07 -0800, Joerg <news@analogconsultants.com>
wrote:

Martin Riddle wrote:
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com>
wrote:

Folks,

Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
that the Intel i7 is a really good processor for LTSpice. According to
this it looks like the 4790 is the fastest of the bunch:

http://www.intel.com/content/www/us/en/processors/core/core-i7-processor.html

So, what do thee say, is the computer in the Costco link below a good
deal for LTSpice purposes?

http://www.costco.com/Dell-XPS-8700-Desktop-%7c-Intel-Core-i7-%7c-1GB-Graphics-%7c-Windows-7-Professional.product.100131208.html

It's also available without MS-Office Home & Student 2013 for $100 less
but I found that OpenOffice isn't 100% compatible in the Excel area so
that sounds like an ok deal. My hope is that it can drive two 27"
monitors but I guess I can always add in another graphics card if not.

Reason I am looking at these is that I absolutely positively do not want
any computer with Windows 8 in here and unfortunately that's what many
others come with.


Should be fine for LTspice. The 1600 DRAM makes a huge difference.
If you want a real screamer, then it's another story. Xeon class, a
real number cruncher.


Do you have any suggestion from a mainstream manufacturer? I need
something with a screaming processor, lots of memory but do not need
much disk space and certainly no fancy 3D graphics (but the big machines
seem to always have that these days and it'll just waste electricity).

I would go with a Xeon and quad-channel memory if price is not an
issue. There is an Nvidia card that has screaming 2D performance; it is
not a top 3D performer.
We are happy with the Dell workstations at work, their Xeons and I
believe quad-channel. They have ATI graphics cards for SolidWorks.
I think the Extreme i7 series supports quad channel, but they are not
as fast as a Xeon. The X99 Intel chipset supports quad channel.

A good tool to pick a processor is the www.passmark.com site.
The Xeons get expensive.

Cheers
 
On Tue, 04 Nov 2014 07:16:07 -0800, Joerg <news@analogconsultants.com>
wrote:

>Do you have any suggestion from a mainstream manufacturer?

No. Your requirements sound more like a server motherboard than a
"workstation". However, don't go shopping for a rack mount server
package, which tends to have expensive features and options that you
don't need.

I need
something with a screaming processor,

Think about a dual processor Xeon server type motherboard:
<http://www.newegg.com/Product/ProductList.aspx?Description=dual%20processor%20motherboard&Submit=ENE>
16 DDR3 slots, with 32GB or more max RAM is common. The expense is
really in the processor. You can possibly populate the motherboard
with one CPU and add another (identical stepping) CPU later. I'm not
sure, but I think that for dual processor support, you'll need Win 7
Ultimate. I don't have any specific vendor or hardware
recommendations because the last dual processor server I built was
about 4 years ago.

>lots of memory

How's this MB with 16 memory slots and 512GB maximum RAM?
<http://www.newegg.com/Product/Product.aspx?Item=N82E16813131814>

but do not need
much disk space

Get a 250GB or 500GB SSD.

and certainly no fancy 3D graphics (but the big machines
seem to always have that these days and it'll just waste electricity).

Picking a video card is tricky. I decided that I didn't want to
listen to a fan, and ended dealing with a compromise between power
consumption (as limited by the power supply), and performance. As
Martin Riddle suggested, dig through the Passmark web pile for clues.
For video, start here:
<http://www.videocardbenchmark.net>
Nvidia has a handy selection guide:
<http://www.nvidia.com/content/HelpMeChoose/fx2/HelpMeChoose.asp?lang=en-us>
Newegg also has a huge selection, that can be narrowed down with the
"choices" on the left:
<http://www.newegg.com/Desktop-Graphics-Cards/SubCategory/ID-48>

There's also quite a bit on the various FutureMark web piles:
<http://www.3dmark.com>
<http://www.futuremark.com/benchmarks/pcmark>
You download one of their (free) benchmarking programs, run it on your
prospective hardware, and see where you stand compared to other
systems. It's really designed for gamers, but if raw CPU/IO
performance is what you want, methinks it would be useful if you just
ignore the video benchmark results. Benchmark results:
<http://www.futuremark.com/hardware/>




--
Jeff Liebermann jeffl@cruzio.com
150 Felker St #D http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann AE6KS 831-336-2558
 
On 11/4/2014 8:47 PM, Joerg wrote:
rickman wrote:
On 11/4/2014 10:07 AM, Joerg wrote:
Martin Brown wrote:
On 04/11/2014 01:02, Joerg wrote:
rickman wrote:
On 11/3/2014 6:56 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 5:09 PM, Joerg wrote:
rickman wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com>
wrote:

[...]

... OS
development is continuing. There are significant problems with
older OS
and they are trying to fix those problems. If you want to run DOS
software, why not run DOS? I have read here that it is still
available
and the hardware should still run it.

Supposedly that's a problem on PCs with Win-8. With XP, no problem,
I've
done it. And I do not need to reboot for that. This is what I call
performance.

Comparatively few DOS programs are really tetchy about what version they
are on and some of them merely baulk at running on an OS that has
version numbers much higher than it expects to see.


Sure, but blanket-banning any 16-bit apps is a really bad idea. It
results in lost biz opportunity for an OS maker because people will be
leery of upgrades.

Who bans 16 bit apps? I did a search and every Windows through 8 runs
16 bit apps.


New 64-bit Windows OS'es do.

When I searched I found specific info on running 16 bit apps under Win
8-64 bits. It's not a problem.


But really... I expect the number of customers that MS loses from 20
year old programs that won't run is in the single digits.


It's huge. MS is painfully aware that the industry is extremely sluggish
in upgrading and it doesn't take much to figure out why. Because on
production lines there are tons of stations that run 20+ year old
software. Do you honestly think a company will throw a well-working
half-million Dollar active laser trim system into the dumpster just
because some OS "requires" this?

You always bring up the tiny corner cases. How many "half-million
Dollar" systems rely on 20 year old software? Anyone managing such a
machine would have replaced the piece long ago. It is no different than
any other part of the machine which wears out. I replace wiper blades
on my car every 6 months or year, I get new tires every three or four
years. I'm not going to expect to repair 20 year old computer hardware
so I will plan to replace it with new stuff and that includes the
software if necessary. But fortunately it will still run under Windows
8. :)


They don't lose customers, they lose business volume. So do their OEM
partners.

That is absurd. Like I said, I expect they have lost single digit
customers due to problems running DOS apps.


Maybe you should switch to Linux? I understand you can run nearly any
old version of Linux on any old hardware. Oh, but you won't have the
applications, will you?


I can't use Linux here. Have tried it though.


I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze hide
these
config files in either the registry, or bury them 5 directory
layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is
up to
the app to decide. Getting to these directories is easy if they
used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files and now with Win8 MS puts the files in the long
path
name you list, but I believe they can be reached transparently
through
the path C:\Program Files. So, the best of both worlds.

If the app puts them somewhere else, don't blame windows.


If it was allowed in old Windows, isn't in new Windows, and there
isn't
a user selector about this then I blame Windows.

"Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
developers were not designing according to best practices, no.

If it was not disallowed it was ok.

See, that is the BS that got you into the problem. Now you are trying
to justify the bad development practices. I surely hope you don't use
that philosophy in the stuff you design. If it works, it is ok, ship
it! Then someone changes a process a bit and the design stops
working.


Nope. Software that writes into its program folder may not be ideal but
can be perfectly sound.

Yes. But if you want to do that for legacy software, keep a specific directory
where the permissions are right for this (ab)usage.

I wouldn't recommend installing it in user documents or the directory
name length starts getting a bit long for comfort.


That's all ok as long as the OS does not blanket-ban the old stuff.

Are you going to explain what you are talking about?


This:

http://answers.microsoft.com/en-us/windows/forum/windows_8-winapps/how-do-i-run-a-16-bit-application-with-a-64-bit/ce0b3186-c39d-4027-88ef-f802a3f74f8e

Do you realize this info is not from MS, but from someone posting in a
forum? In other words, you are planning your business around hearsay
you found on a web forum. Since you like social media...

https://www.youtube.com/watch?v=pCFkxzVs5cc


Even today it's still this way.
Personally I also think it was wrong but it is what it is. Many CAD
programs still store their libraries in the program folder and,
naturally, libraries are meant to be modified and added to.

Then put that on the CAD designers, not MS. They told them not to do it
with W2k and XP, and made it hard to do with Vista and 7. Now with
Win8 they have found a way to fake it out and put the files somewhere
else. They are just trying to make the computer harder to hack, but no
one wants to work with them.


If this causes older CAD and other stuff not to run I do not want that
OS. My computer is a tool which I expect to be able to do the jobs that
I've been doing for decades. If it can't do that, it isn't very useful
to me. Then I will strive to buy another one which can do that. It's
that simple.

Most windows programs now do allow you to put the libraries in user
documents or somewhere else that they can be modified safely.


Some older CAD doesn't and there's the problem. There is a lot of custom
software that is de facto irreplaceable.

I have a hard time believing that. I think this is a Joerg's world
issue. I guess no one does the sort of work you do because they can't
find the software.


No, the reason I am a rare species is probably that most folks in my
field of work are retired or no longer on earth.

If you've never dealt with custom beam field simulators or similar
specialty software you can't really understand this. They don't come out
with a new release every year or so. They come out with one, and that's it.

Well, I guess the industry will have to shut down then. Maybe there
will be another one out in the next 20 years. In the mean time, try
running it under Windows 8, it should work.


I had to get used to some of the visual differences in Win 8, but that
is not a big deal. Every new version updates the look, same as a car.


I am driving a 1997 SUV, stick-shift, with window cranks and manual door
lock. No new look here and probably not for many years to come. The
automotive industry's nightmare is an elderly woman out here driving an
Austin. She bought it used in 1961 and it runs just fine. She said it'll
probably survive her.

Sorry you don't like power windows and the other nice features of life.
If you want to figure out some other computer problems, let me know.
I'm a bit of a Luddite myself, but I am practical about it and learn
when I have to.

I'm actually looking forward to getting a new truck. The only thing I
don't like about this one is it has no doors for the back seat. My next
one will have suicide doors so I can get things in and out of the back
seat. I'm not ready to give up the bed... but maybe I'll compromise and
get some sort of an SUV and keep the truck too.

--

Rick
 
rickman wrote:
On 11/4/2014 10:07 AM, Joerg wrote:
Martin Brown wrote:
On 04/11/2014 01:02, Joerg wrote:
rickman wrote:
On 11/3/2014 6:56 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 5:09 PM, Joerg wrote:
rickman wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com>
wrote:

[...]

... OS
development is continuing. There are significant problems with
older OS
and they are trying to fix those problems. If you want to run DOS
software, why not run DOS? I have read here that it is still
available
and the hardware should still run it.

Supposedly that's a problem on PCs with Win-8. With XP, no problem,
I've
done it. And I do not need to reboot for that. This is what I call
performance.

Comparatively few DOS programs are really tetchy about what version they
are on and some of them merely baulk at running on an OS that has
version numbers much higher than it expects to see.


Sure, but blanket-banning any 16-bit apps is a really bad idea. It
results in lost biz opportunity for an OS maker because people will be
leery of upgrades.

Who bans 16 bit apps? I did a search and every Windows through 8 runs
16 bit apps.

New 64-bit Windows OS'es do.


But really... I expect the number of customers that MS loses from 20
year old programs that won't run is in the single digits.

It's huge. MS is painfully aware that the industry is extremely sluggish
in upgrading and it doesn't take much to figure out why. Because on
production lines there are tons of stations that run 20+ year old
software. Do you honestly think a company will throw a well-working
half-million Dollar active laser trim system into the dumpster just
because some OS "requires" this?

They don't lose customers, they lose business volume. So do their OEM
partners.


Maybe you should switch to Linux? I understand you can run nearly any
old version of Linux on any old hardware. Oh, but you won't have the
applications, will you?

I can't use Linux here. Have tried it though.

I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze hide
these
config files in either the registry, or bury them 5 directory
layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is
up to
the app to decide. Getting to these directories is easy if they
used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files and now with Win8 MS puts the files in the long
path
name you list, but I believe they can be reached transparently
through
the path C:\Program Files So the best of both worlds.

If the app puts them somewhere else, don't blame windows.


If it was allowed in old Windows, isn't in new Windows, and there
isn't
a user selector about this then I blame Windows.

"Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
developers were not designing according to best practices, no.

If it was not disallowed it was ok.

See, that is the BS that got you into the problem. Now you are trying
to justify the bad development practices. I surely hope you don't use
that philosophy in the stuff you design. If it works, it is ok, ship
it! Then someone changes a process a bit and the design stops
working.


Nope. Software that writes into its program folder may not be ideal but
can be perfectly sound.

Yes. But if you want to do that for legacy keep a specific directory
where the permissions are right for this (ab)usage.

I wouldn't recommend installing it in user documents or the directory
name length starts getting a bit long for comfort.


That's all ok as long as the OS does not blanket-ban the old stuff.

Are you going to explain what you are talking about?

This:

http://answers.microsoft.com/en-us/windows/forum/windows_8-winapps/how-do-i-run-a-16-bit-application-with-a-64-bit/ce0b3186-c39d-4027-88ef-f802a3f74f8e

Even today it's still this way.
Personally I also think it was wrong but it is what it is. Many CAD
programs still store their libraries in the program folder and,
naturally, libraries are meant to be modified and added to.

Then put that on the CAD designers, not MS. They told them not to do it
with W2k and XP, and made it hard to do with Vista and 7. Now with
Win8 they have found a way to fake it out and put the files somewhere
else. They are just trying to make the computer harder to hack, but no
one wants to work with them.


If this causes older CAD and other stuff not to run I do not want that
OS. My computer is a tool which I expect to be able to do the jobs that
I've been doing for decades. If it can't do that, it isn't very useful
to me. Then I will strive to buy another one which can do that. It's
that simple.

Most windows programs now do allow you to put the libraries in user
documents or somewhere else that they can be modified safely.


Some older CAD doesn't and there's the problem. There is a lot of custom
software that is de facto irreplaceable.

I have a hard time believing that. I think this is a Joerg's world
issue. I guess no one does the sort of work you do because they can't
find the software.

No, the reason I am a rare species is probably that most folks in my
field of work are retired or no longer on earth.

If you've never dealt with custom beam field simulators or similar
specialty software you can't really understand this. They don't come out
with a new release every year or so. They come out with one, and that's it.

[...]


I had to get used to some of the visual differences in Win 8, but that
is not a big deal. Every new version updates the look, same as a car.

I am driving a 1997 SUV, stick-shift, with window cranks and manual door
lock. No new look here and probably not for many years to come. The
automotive industry's nightmare is an elderly woman out here driving an
Austin. She bought it used in 1961 and it runs just fine. She said it'll
probably survive her.

--
Regards, Joerg

http://www.analogconsultants.com/
 
Joerg wrote:


Normally on the HD. But not in the DOS days, there I used (part of) an
extra 4MB that I installed for this. RAM-disk should also be possible
under Windows. Like here:

http://blog.laptopmag.com/faster-than-an-ssd-how-to-turn-extra-memory-into-a-ram-disk

I can't speak for stuff they sell at box stores, but all my drives have
cache. [64Mbytes in my case, newer drives use 128Mbytes.] There is zero
reason to do a RAM disk. I'm running a software RAID, so besides the cache,
much of the data is in RAM anyway prior to being written to disk. [The
software RAID is one reason to use error detecting RAM.]

The advantage to building it yourself is you know the capabilities of each
component. The disadvantage is it tends to cost more.
 
On Tue, 04 Nov 2014 14:19:03 -0500, rickman <gnuarm@gmail.com> wrote:

On 11/4/2014 3:54 AM, Martin Brown wrote:
On 04/11/2014 07:20, upsidedown@downunder.com wrote:
On Mon, 03 Nov 2014 19:31:10 -0500, rickman <gnuarm@gmail.com> wrote:

On 11/3/2014 6:59 PM, Joerg wrote:

It _is_ separate from the main memory on all modern graphics cards. The
core circuitry of a PC has nothing to do with screen refresh. That was
even the case with an old Tseng Labs card I had in the early 90's.

Ok, now I understand why you couldn't get what I am saying. Computers
have come a long way since the 90's. Most computers, desktop as well as
laptop now integrate the video controller into the main chipset and use
main memory as video RAM, *NOT* as a separate function with its own
memory. If you don't believe me look at the specs on a few systems.
Anything that talks about Intel XYZ graphics has an integrated
controller and shares main memory for video. In fact, you said
something about this yourself in this thread where you mentioned video
on the motherboard I believe.

Video on the motherboard is usually integrated. If you get a graphics
card it will be separate. A very few motherboards have separate video
controller on board with separate video memory.

1920x1080x60x24bits is just 120 Mpixels/s or 360 MB/s.
DDR3 memories have peak transfer rates over 10 GB/s, so the video
refresh is less than 3 % of the memory bandwidth.


And provided that the high performance code that you are running is
sensible and cache aware the hit from the video refresh overhead is
barely detectable. The box runs a *lot* cooler without a 3D GPU in.

That is a highly inaccurate assumption. Multicore CPUs are very much
memory bandwidth limited. Read about the memory wall. Once you reach 3
or 4, adding CPUs gives diminishing returns for performance. Boost the
memory speed and performance picks up again. Take away memory bandwidth
and the CPU speed falls off as well. The point is why pay hundreds of
dollars for extra CPU speed only to piss it away with an on chip
graphics controller sharing the memory bus?

Looked at the list of i7 processor variants and all seemed to have a
processor-specific 256 KiB L2 cache and 4-20 MiB L3 cache. Depending
on the program code and data access locality, most of the tight loop
code and small data sets would be in the L2 and sometimes taken from
the L3 cache with only a few accesses to the DDR4 memory (which would
actually be "L4 cache", while the disk storage would be the "main
memory" in any virtual memory system).

Programs written in traditional languages had quite good access
locality, but unfortunately many programs written in C++ constantly
make references all over the place both for code as well as data,
reducing the cache hit ratio.
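
(What access locality means in practice is easy to demonstrate with a toy
C sketch - same work, very different cache behavior; time the two loops
to see the gap:

  #include <stdio.h>
  #define N 2048
  static double a[N][N];        /* 32 MB, far bigger than any L2/L3 */

  int main(void)
  {
      double s = 0.0;
      int i, j;
      /* row-major walk: sequential addresses, cache lines fully used */
      for (i = 0; i < N; i++)
          for (j = 0; j < N; j++)
              s += a[i][j];
      /* column-major walk: 16 KB stride per access, mostly cache misses */
      for (j = 0; j < N; j++)
          for (i = 0; i < N; i++)
              s += a[i][j];
      printf("%f\n", s);
      return 0;
  }
)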

I haven't looked at the actual implementation of those on-chip graphics
controllers, but I would use a few-megabyte cache for the actual
refresh memory. That display cache controller needs the same kind of
functionality as each CPU cache controller on a multiprocessor system
that must detect write access to a specific area in the main memory by
some foreign actor (other CPU, DMA etc.) and invalidate the local
cache for that area.

So, when the application wants to change the display, it writes the
new information into main memory, the display cache controller detects
the write and invalidates that area in the display cache, which forces
the display cache controller to reload the modified areas. Of course,
if the application is so stupid that it writes to the main memory even
if the contents have not changed, this will cause a display cache
reload each time.

However, the display refresh loss is so small that it doesn't justify
the use of megabytes of dedicated display cache, while that chip area
would be better spent increasing the L3 cache size.
 
