Is this Intel i7 machine good for LTSpice?

On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com>
wrote:

Folks,

Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
that the Intel i7 is a really good processor for LTspice. According to
this it looks like the 4790 is the fastest of the bunch:

http://www.intel.com/content/www/us/en/processors/core/core-i7-processor.html

So, what do thee say, is the computer in the Costco link below a good
deal for LTSpice purposes?

http://www.costco.com/Dell-XPS-8700-Desktop-%7c-Intel-Core-i7-%7c-1GB-Graphics-%7c-Windows-7-Professional.product.100131208.html

It's also available without MS-Office Home & Student 2013 for $100 less
but I found that OpenOffice isn't 100% compatible in the Excel area so
that sounds like an ok deal. My hope is that it can drive two 27"
monitors but I guess I can always add in another graphics card if not.

Reason I am looking at these is that I absolutely positively do not want
any computer with Windows 8 in here and unfortunately that's what many
others come with.

Should be fine for LTspice. The 1600 DRAM makes a huge difference.
If you want a real screamer, then it's another story: Xeon class, a
real number cruncher.

Cheers
 
On 11/3/2014 6:56 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 5:09 PM, Joerg wrote:
rickman wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm
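By way of illustration, a minimal launcher sketch (Python; both paths
are assumptions, adjust for your installation):

    import subprocess

    # Assumed default LTspice IV install location; adjust as needed.
    ltspice = r"C:\Program Files\LTC\LTspiceIV\scad3.exe"
    # A writable, per-user copy of the preferences file (hypothetical path).
    ini = r"C:\Users\joerg\ltspice\scad3.ini"

    # Launch LTspice with -ini so it never touches %WINDIR%\scad3.ini.
    subprocess.run([ltspice, "-ini", ini])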


I need to note this somewhere. Writing to the Windows directory is a
*very* bad idea.

It was standard procedure in Windoze 3.1, where almost all
applications dropped pick_a_name.ini files in the C:\Windows\
directory.

Yes, and Windows 3.1 crashed on a regular basis for about any reason
whatsoever just like 95, 98 and ME.

MS has been telling developers since Win2000 and maybe since NT to not
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
put data files in the Windows or Program Files directories. Many chose
to ignore this which wasn't enforced until Vista and became one of the
things everyone loves to hate about Vista.


Maybe. But for us users only one thing counts: That stuff works.

Do you build your stuff so that if the user connects a different
computer it craps out? No, you design your interfaces *correctly* so
that it works now and it keeps working when some peripheral piece that
should have no impact is changed out.


I design it so that it also works correctly with legacy gear. In
aerospace that can mean equipment from before you and I were born.

And if that gear was not designed to spec you are screwed. You will
have to sit down and reverse engineer the unit so you can design the
interface. Do you really expect MS to do that with all the crappy
software that was designed poorly?


These developers are designing crappy software and blaming it on MS.


I've underlined the important part above. You might remember that there
were operating systems before Win2k and that there was software written
for those.

And software written for an OS like Win95 is not assured of working
with a newer and better made OS. Heck, software written for Win95
didn't work with Win95 half the time. That was what was wrong with
Win95: it didn't do a good enough job of protecting your computer from
the crappy software.

You seem to think everything is as simple as your bicycle. OS
development is continuing. There are significant problems with older OSes
and they are trying to fix those problems. If you want to run DOS
software, why not run DOS? I have read here that it is still available
and the hardware should still run it.


I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze hide these
config files in either the registry, or bury them 5 directory layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is up to
the app to decide. Getting to these directories is easy if they used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files and now with Win8 MS puts the files in the long path
name you list, but I believe they can be reached transparently through
the path C:\Program Files. So, the best of both worlds.

If the app puts them somewhere else, don't blame windows.
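As a sketch of what using the right location looks like from
application code (Python; the "MyCadTool" folder name is purely
hypothetical):

    import os

    # Machine-wide application data (shared across users):
    program_data = os.environ.get("ProgramData", r"C:\ProgramData")
    # Per-user roaming settings:
    app_data = os.environ.get("APPDATA")  # e.g. C:\Users\<name>\AppData\Roaming

    # A well-behaved app keeps its mutable files here, not under
    # C:\Program Files:
    cfg_dir = os.path.join(program_data, "MyCadTool")
    os.makedirs(cfg_dir, exist_ok=True)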


If it was allowed in old Windows, isn't in new Windows, and there isn't
a user selector about this then I blame Windows.

"Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
developers were not designing according to best practices, no.


If it was not disallowed it was ok.

See, that is the BS that got you into the problem. Now you are trying
to justify the bad development practices. I surely hope you don't use
that philosophy in the stuff you design. If it works, it is ok, ship
it! Then someone changes a process a bit and the design stops working.


Even today it's still this way.
Personally I also think it was wrong but it is what it is. Many CAD
programs still store their libraries in the program folder and,
naturally, libraries are meant to be modified and added to.

Then put that on the CAD designers, not MS. They told them not to do it
with W2k and XP, and they made it hard to do with Vista and 7. Now with
Win8 they have found a way to fake it out and put the files somewhere
else. They are just trying to make the computer harder to hack, but no
one wants to work with them.


I can't tell you how many developers do all sorts of things they aren't
supposed to under windows. That is the actual cause of many problems
people have running older software under Windows. They don't listen to
the people providing them with the OS!

LTspice (aka SwitcherCAD) is a rather old program, with many of the
traditions of Windoze 3.1 still present. If you don't like that, try
running some of the various NEC antenna modeling programs, that still
use the terms "card" and "deck" from the Hollerith punch card era. The
common mantra is the same everywhere... if it works, don't touch it.

These programs have been updated many, many times since Windows 3.1.
Windows NT, 2k, XP, Vista, 7, 8 and 8.1 aren't even the same OS as the
3.1 tree, which ended when XP was released. Stick with the old
habits and blame yourself or your program maintainer.

I use some open source Windows software that does the same crap and I am
very vocal about the cause and the fix for the problem. Few of the
developers are interested though. Now that 8 makes this (using Program
Files for data) work adequately they no longer have a need to change it.

If you are relying on programming habits from over 20 years ago, then
you will have to stew in your own soup.


Easy to say for someone who probably never has to deal with beamfield
sims and such. Bottom line: there are programs some of us have to use
where there is no alternative. Where the design teams dissolved
decades ago and some of the folks are not with us on earth anymore. My
record so far is a chunk of software that was stored on an 8" floppy.

Yeah, exactly. If you are that far back in time you may need to rethink
your approach.


Any suggestions there? Should I tell my clients that this, that and the
other project can't be done because only older design tools are available?

Yes, tell your clients that there has been no software written since 1975.

Imagine you having a leak in the house. The plumber comes, takes a look
and says "That wouldn't be up to code these days although it was back
then. I suggest you build a new house and have this one torn down".

If you think that is at all analogous then you deserve the problems you
are having.

I suppose you are still driving the car you had in the 70's too?


Software does not automatically lose its value because it is over 20
years old. Or would you pour a bottle of 1995 Domaine Leflaive
Montrachet Grand Cru [*] into the sink because it is old?

Actually software does degrade with time as you are finding out. If you
can't find a platform to run it on, it has worn out.


My software does not wear out. My hardware does. I get more and more into
pulse-echo stuff and esoteric switch mode designs where the amount of
data to be crunched overwhelms the machine.

BTW, when the in-circuit tester at one client conked out the culprit was
the PC. Had an ISA bus. It was no problem to buy a brand-new machine at
a very reasonable price that has an ISA bus. Except now they also have a
CD drive in it.


Talking about using legacy stuff, the aircraft guys are a bit more
extreme there. This aircraft is going to celebrate its 80th soon and is
used commercially:

https://www.youtube.com/watch?v=jx11k1r1Pm8

[*] It runs north of $5k. Per bottle.

Good, maybe your beamfield sim will run on it. :)


:)

Example from a few years ago: A nasty alarm system problem had to be
diagnosed. The software from that system was from the 80's. If I hadn't
been able to run really old software here at my lab I would have had
to turn down that whole job. That would not be what I call smart.

I guess you will have to close down shop in a few more years then. Even
Win7 is going bye-bye before too long. You can learn to use computers
or be a victim, your choice.

--

Rick
 
On 11/3/2014 6:59 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 4:57 PM, Lasse Langwadt Christensen wrote:
On Monday, 3 November 2014 22.42.14 UTC+1, rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I miss out
on a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth, which is often the limiting factor on
a multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.


with anything but a graphics card integrated in the chipset, that memory
will be on the card itself; 1920*1080*24bit is less than 7 MB

It is not the amount of memory, it is the video bandwidth to keep the
monitor refreshed. Yes, it should be separate from the main memory or
you take a hit from the video accesses.


It _is_ separate from the main memory on all modern graphics cards. The
core circuitry of a PC has nothing to do with screen refresh. That was
even the case with an old Tseng Labs card I had in the early 90's.

Ok, now I understand why you couldn't get what I am saying. Computers
have come a long way since the 90's. Most computers, desktop as well as
laptop now integrate the video controller into the main chipset and use
main memory as video RAM, *NOT* as a separate function with its own
memory. If you don't believe me look at the specs on a few systems.
Anything that talks about Intel XYZ graphics has an integrated
controller and shares main memory for video. In fact, you said
something about this yourself in this thread where you mentioned video
on the motherboard I believe.

Video on the motherboard is usually integrated. If you get a graphics
card it will be separate. A very few motherboards have separate video
controller on board with separate video memory.

--

Rick
 
On 11/3/2014 6:58 PM, Lasse Langwadt Christensen wrote:
On Tuesday, 4 November 2014 00.42.36 UTC+1, rickman wrote:
[...]

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

This has *nothing* to do with drawing the images into graphic memory.

The memory bank question will likely be more important than the number
of cores in the CPU. The guy who can run 16 threads has at least two
memory interfaces or it would be bogging down between 4 and 8 cores.


Well ... we did enter the 21st century. In this day and age graphics
cards come with their own memory. AFAIK the Nvidia GT720 has 1 GB of
on-board RAM. Others have more but that sounds sufficient. Also, there is
no need to store 60 frames if the content is more or less static.

I don't know if you are playing with me or what. Yes, that is what I am
telling you, get a system with separate graphic memory which means a
separate graphics chip. Many mobos have built-in video with *no* video
RAM.


even so, unless you play 3D games that need massive amounts of texture
memory, I doubt it matters much

an i7 has something like 30+ GB/s of memory bandwidth, depending on the
memory config

refreshing two full HD monitors at 60Hz is only a few percent of that

Yes, exactly! Just refreshing the monitors is some significant
percentage of the available memory bandwidth. Why spend a bunch of
money on an i7 with fast memory only to share that with the video
controller? Running multicore is typically memory bandwidth limited so
a 5 or 10% hit to the memory bandwidth will be a 5 to 10% hit to CPU
performance in the critical sections of code... where it matters.
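For a rough check of those percentages, a back-of-envelope sketch
(Python; the dual-channel DDR3-1600 peak figure is an assumption for
illustration, not a number from the thread):

    # Refresh traffic for two 1920x1080 monitors at 60 Hz, 4 bytes/pixel:
    per_monitor = 1920 * 1080 * 60 * 4           # ~0.5 GB/s
    two_monitors = 2 * per_monitor               # ~1.0 GB/s

    # Assumed peak, dual-channel DDR3-1600: 2 channels x 8 bytes x 1600 MT/s
    peak = 2 * 8 * 1600e6                        # 25.6 GB/s

    print(f"refresh share: {100 * two_monitors / peak:.1f}%")   # ~3.9%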

--

Rick
 
On 11/3/2014 7:37 PM, rickman wrote:
[...]

Yes, exactly! Just refreshing the monitors is some significant
percentage of the available memory bandwidth. Why spend a bunch of
money on an i7 with fast memory only to share that with the video
controller? Running multicore is typically memory bandwidth limited so
a 5 or 10% hit to the memory bandwidth will be a 5 to 10% hit to CPU
performance in the critical sections of code... where it matters.

That's also why I asked Joerg if this machine had dual memory channels.
That makes a big difference running multicore. I would expect that to
show up in the promotional material somewhere.

--

Rick
 
On 11/3/2014 6:49 PM, Phil Hobbs wrote:
On 11/3/2014 4:31 PM, Joerg wrote:
Phil Hobbs wrote:
On 11/02/2014 01:17 PM, Phil Hobbs wrote:
On 11/2/2014 12:45 PM, John Larkin wrote:
On Sun, 02 Nov 2014 11:06:30 -0500, Phil Hobbs
<hobbs@electrooptical.net> wrote:

On 11/2/2014 11:00 AM, John Larkin wrote:
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg
<news@analogconsultants.com> wrote:

[...]

I have spent too many hours this weekend tweaking the transient
response of a semi-hysteretic (we call it "hysterical") switchmode
constant-current source. There are about 8 interacting knobs to
turn.
At 30 seconds per run, understanding the interactions is impossible.

I want sliders on each of the part values, and I want to see the
waveforms change as I move the sliders, like they were trimpots on a
breadboard and I was looking at a scope. I need maybe 500 times the
compute power that I have now.

Mike should code LT Spice to execute on a high-end video card.



You can go quite a bit faster with a nice multicore machine--LTspice
lets you choose how many threads to run. My desktop machine (about 3
years old now) runs about 150 Gflops peak. Supermicro is an
excellent
vendor.

Cheers

Phil Hobbs

There's a setting for one or two threads. Is that all?


That's because you only have two cores. Mine goes up to 15.

16 actually. Here's a picture:
http://electrooptical.net/pictures/LTspice16threads.png


That sounds like a high-testosterone machine of a computer :)

Which processors are in there?


It has a pair of AMD Opteron 6128s. I haven't been keeping up, but 3
years ago the Magny Cours Opterons ran rings around the Intel offerings
for floating point.

Cheers

Phil Hobbs

The exact specs are:
1 pc ACC-3C0-3C13685 Supermicro Chassis 733TQ
1 pc ACC-3C0-3C13685 SUPERMICRO H8DGI-F
2 pcs CFN-OTH-AC170 2 OPTERON COOLING FAN
2 pcs AMD OPTERON 6128 8-CORE 16 TOTAL CORES
8 pcs MM3-KIN-4G133ER KINGSTON 4GB DDR3 ECC REGISTERED CL9
1.35-1.5V (32 GB of RAM installed)
4 pcs HDA-WDC-WD1002F WDC RE4 1TB CDW-LGE-22XSATA
1 pc GOLDSTAR DVDRW 22X GH22NS30

It runs CentOS 6 Linux, with four Windows VMs under Qemu/KVM: two XP and
two Win7.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
 
On 11/3/2014 7:44 PM, Phil Hobbs wrote:
On 11/3/2014 6:49 PM, Phil Hobbs wrote:
[...]

It runs CentOS 6 Linux, with four Windows VMs under Qemu/KVM: two XP and
two Win7.

What's your video card?

--

Rick
 
On Mon, 03 Nov 2014 19:37:02 -0500, rickman <gnuarm@gmail.com> wrote:

[...]

Yes, exactly! Just refreshing the monitors is some significant
percentage of the available memory bandwidth. Why spend a bunch of
money on an i7 with fast memory only to share that with the video
controller? Running multicore is typically memory bandwidth limited so
a 5 or 10% hit to the memory bandwidth will be a 5 to 10% hit to CPU
performance in the critical sections of code... where it matters.

After startup, I would expect that in simulation both code and
intermediate results are read from the cache, which these days seems to
be several megabytes. The main memory is mainly needed to store
results.

If a huge amount of data is to be generated, a sensible simulation
program on a 64 bit machine would allocate hundreds of gigabytes of
virtual memory and write the results into that memory. Associate a disk
file with that virtual memory range (memory mapped file) and let the
page fault mechanism write those virtual memory pages to disk in the
background.
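A minimal sketch of that memory-mapped results file (Python, standard
library only; the file name and sample count are made up for
illustration):

    import mmap
    import struct

    N = 100_000_000                      # hypothetical float64 sample count
    with open("results.bin", "wb") as f:
        f.truncate(8 * N)                # pre-size the backing file

    f = open("results.bin", "r+b")
    mm = mmap.mmap(f.fileno(), 8 * N)    # map the file into virtual memory

    # The "simulator" just stores samples into memory; the kernel's pager
    # flushes dirty pages back to results.bin in the background.
    for i in range(1000):
        struct.pack_into("d", mm, 8 * i, float(i))

    mm.flush()                           # force any remaining pages out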
 
On 03/11/2014 23:20, Joerg wrote:
Martin Brown wrote:
On 03/11/2014 20:37, Joerg wrote:

Ok, but for example a gamer machine like the XPS series is a pretty good
bet that it'll perform well with SPICE.

If you can persuade them to do it, a gamer machine with the graphics card
entirely deleted will be the cheapest low-power combo to do about what
you want. The 2D capability of the Intel graphics engine internal to the
i5 & i7 CPUs is as fast as anything on fancy 3D gaming cards.
(obviously they get totally thrashed in 3D realtime rendering tests)

Basically you can shave 100-200 W off the power consumption. My i7 PC
idles at about 60 W when it isn't doing anything beyond web browsing.


Problem is, unless you piece together a custom machine the ones that are
equipped with good processors and RAM up to the gills seem to always
come with these powerful graphics cards. The other issue is that
on-board graphics often will not drive two monitors. I found that out
the hard way after I bought the PC I am using now.

They will raise their eyebrows and try to persuade you to have a 3D
video card but you can usually persuade them to downgrade the card and
give you more memory if you explain that it isn't for any 3D gaming.

BTW I wouldn't waste your money on exotic faster RAM unless you intend
to overclock it. Stock RAM, and more of it, is better price/performance.

So you think 1600MHz RAM is fine?

The gains with faster RAM are really very marginal. If I had done my
homework properly I wouldn't have got the faster parts myself. More RAM
is a better bet: min 8 GB, preferably 16 GB, and matched chips. No matter
what the makers say about being able to fit any size, they generally
perform better in matched pairs and very probably in a matched set of four.

I'd be interested to see how a moderate sized LTspice simulation scales
with the number of threads on an i5 and i7 architecture. My guess is
that hyperthreading will not be all that useful to it.


I am hoping the four cores will speed things up significantly. Also the
huge amount of RAM. Right now I have 2GB and I regularly hit the limit.

I was hoping someone with an i5 and i7 might post some benchmarks with
different thread counts, from 1-4 and 1-8 respectively. My instinct is
still that the i5 will give better price/performance on up to 4 cores.

I have never been a great fan of hyperthreading. It tends to saturate
memory bandwidth and generate heat without additional performance.
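For anyone wanting to collect such numbers, a crude timing harness
(Python; it assumes LTspice's -b batch-mode switch and that you change
the thread count in the Control Panel, or via a per-run scad3.ini,
between passes, since there appears to be no command-line switch for it):

    import subprocess
    import time

    ltspice = r"C:\Program Files\LTC\LTspiceIV\scad3.exe"  # assumed path
    sim = r"C:\work\moderate_sim.asc"          # hypothetical test circuit

    t0 = time.perf_counter()
    subprocess.run([ltspice, "-b", sim])   # batch run; output goes to .raw
    print(f"wall time: {time.perf_counter() - t0:.1f} s")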

BTW given what you have said elsewhere in the thread you probably want
it with Win7 Pro installed so that you get the XP license thrown in and
don't have to faff about setting up your own VMs for legacy code.

--
Regards,
Martin Brown
 
On 04/11/2014 07:20, upsidedown@downunder.com wrote:
On Mon, 03 Nov 2014 19:31:10 -0500, rickman <gnuarm@gmail.com> wrote:

[...]

1920x1080x60x24bits is just 120 Mpixels/s or 360 MB/s.
DDR3 memories have peak transfer rates over 10 GB/s, so the video
refresh is less than 3% of the memory bandwidth.

And provided that the high-performance code that you are running is
sensible and cache-aware, the hit from the video refresh overhead is
barely detectable. The box runs a *lot* cooler without a 3D GPU in.

--
Regards,
Martin Brown
 
On 04/11/2014 01:02, Joerg wrote:
[...]

And if that gear was not designed to spec you are screwed. You will
have to sit down and reverse engineer the unit so you can design the
interface. Do you really expect MS to do that with all the crappy
software that was designed poorly?


I expect them to provide a way that programs can write into their
install directories. What is so difficult about that?

It is no longer considered good practice to permit this without asking
permission. It could be writing things that modify executable code.

Legacy code that needs to do this should live in some 8.3-filename
compatible hovel off the root directory. You will otherwise find
programs that don't work because fully qualified filenames overflow
buffers in ancient DOS programs. Peeky-pokey ancient printer IO port
stuff is at the mercy of the OS as to whether or not it will work.
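On the 8.3 point, Windows can hand back the short form of an existing
long path, which is sometimes enough to feed a DOS-era tool (a Python
sketch around the real Win32 call GetShortPathNameW; the example path
is made up, and short-name generation must be enabled on the volume):

    import ctypes

    def short_path(long_path: str) -> str:
        # Ask kernel32 for the 8.3 alias of an existing path (Windows only).
        buf = ctypes.create_unicode_buffer(260)
        n = ctypes.windll.kernel32.GetShortPathNameW(long_path, buf, len(buf))
        if n == 0:
            raise ctypes.WinError()
        return buf.value

    # e.g. short_path(r"C:\Program Files\OldTool") -> 'C:\\PROGRA~1\\OLDTOOL'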

... OS
development is continuing. There are significant problems with older OS
and they are trying to fix those problems. If you want to run DOS
software, why not run DOS? I have read here that it is still available
and the hardware should still run it.

Supposedly that's a problem on PCs with Win-8. With XP, no problem, I've
done it. And I do not need to reboot for that. This is what I call
performance.

Comparatively few DOS programs are really tetchy about what version they
are on, and some of them merely baulk at running on an OS that has
version numbers much higher than they expect to see.

[...]

If it was not disallowed it was ok.

See, that is the BS that got you into the problem. Now you are trying
to justify the bad development practices. I surely hope you don't use
that philosophy in the stuff you design. If it works, it is ok, ship
it! Then someone changes a process a bit and the design stops working.


Nope. Software that writes into its program folder may not be ideal but
can be perfectly sound.

Yes. But if you want to do that for legacy keep a specific directory
where the permissions are right for this (ab)usage.

I wouldn't recommend installing it in user documents or the directory
name length starts getting a bit long for comfort.

Even today it's still this way.
Personally I also think it was wrong but it is what it is. Many CAD
programs still store their libraries in the program folder and,
naturally, libraries are meant to be modified and added to.

Then put that on the CAD designers, not MS. They told them not to do it
with W2k and XP, they make it hard to do with Vista and 7. Now with
Win8 they have found a way to fake it out and put the files somewhere
else. They are just trying to make the computer harder to hack, but no
one want to work with them.


If this causes older CAD and other stuff not to run I do not want that
OS. My computer is a tool which I expect to be able to do the jobs that
I've been doing for decades. If it can't do that it isn't very useful
to me. Then I will strive to buy another one which can do that. It's
that simple.

Most windows programs now do allow you to put the libraries in user
documents or somewhere else that they can be modified safely.

My favourites are

C:\Program.dos

and

C:\DATA

For legacy DOS code that is only 8.3-filename aware. You are probably
out of luck if you have any of the software that insists on having a
dongle plugged into the non-existent Centronics printer port these days.

Ever wondered why industrial users hung on to XP for so long and are now
(grudgingly) upgrading to Win-7 while shunning Win-8?

Main reason is inertia and Vista was such a dog.

Other problem is that scientific instrument makers (and engineering
toolmakers) can't be bothered to provide drivers for 10 year old kit on
newer OSs and the gear will typically last for 15-20 years.

Several of my wife's older lab instruments are firewalled off from the
corporate network because they run now-unsupported legacy XP with no
prospect of ever upgrading to any new OS. Device drivers simply do not
exist - of course the maker would love to sell them a brand new one.

Example from a few years ago: A nasty alarm system problem had to be
diagnosed. The software from that system was from the 80's. If I hadn't
been able to run it really old software here at my lab I would have had
to turn down that whole job. That would not be what I call smart.

I guess you will have to close down shop in a few more years then. Even
Win7 is going bye-bye before too long. You can learn to use computers
or be a victim, your choice.

Win 7 will be around for a long time. MS has learned from the Vista
debacle, giving XP a long lifetime. They know that Win-8 is in a lot of
respects a dud.

Mainly because it forces desktop users to leave greasy fingerprints on
their screens and the new GUI looks like Picasso on a bad acid trip.

--
Regards,
Martin Brown
 
rickman wrote:
On 11/3/2014 8:02 PM, Joerg wrote:
[...]

And if that gear was not designed to spec you are screwed. You will
have to sit down and reverse engineer the unit so you can design the
interface. Do you really expect MS to do that with all the crappy
software that was designed poorly?


I expect them to provide a way that programs can write into their
install directories. What is so difficult about that?

You can do that. Don't install it in Program Files. Many programs do
just that if they don't want to play by the rules. The ones that can't
figure out that there *are* rules and just ignore the requirements of
the OS don't work so well.

I always install in another directory but some SW won't give you a choice.

[...]



See, that is the BS that got you into the problem. Now you are trying
to justify the bad development practices. I surely hope you don't use
that philosophy in the stuff you design. If it works, it is ok, ship
it! Then someone changes a process a bit and the design stops working.


Nope. Software that writes into its program folder may not be ideal but
can be perfectly sound.

Oh, I'm sorry, I didn't realize you were an OS expert. I guess you can
consult with MS and help them fix their problems.

Sometimes it would behove them to listen to customers some more. It
would most certainly have prevented the failure of their RTOS efforts.
After an Embedsyscon I told them they'll fail with RT and why. And then
they did.

Same with the housing bubble. I still remember a top notch realtor
laughing at me. Then they lost their own house ...

[...]

Ever wondered why industrial users hung on to XP for so long and are now
(grudgingly) upgrading to Win-7 while shunning Win-8?

Windows 7 has many of the same issues you don't like about Win 8.

I know :-(

But what can you do other than VMs?


[...]

Any suggestions there? Should I tell my clients that this, that and the
other project can't be done because only older design tools are
available?

Yes, tell your clients that there has been no software written since
1975.


ROFL! You clearly could not do my job.

No, I would just get more current software.

So how do you convince a research group that has disbanded in the 80's
or 90's and where the professor is retired to pull that off?

[...]


I suppose you are still driving the car you had in the 70's too?


I would still be driving my 1987 Audi station wagon if it had been
possible to register it in California. My former neighbor has it now and
it still works fine. I drive a 1997 SUV and the way it goes I might
still drive that 10 years from now.

A former coworker tools around in one of two cars. A 1950's Chevy truck
or a 1950's Bel Air, sometimes depending on whether he has to pick up
heavier stuff at the hardware store. Both cars in impeccable shape. Why
should he "upgrade"?

Yes, he "tools around". Not exactly the same as doing work.

He commutes to and from the workplace, gets all his materials, what
else can one ask? A farmer around here still uses one of those Chevys in
the field, every day. Of course, that one does not look showroom.

[...]

Win 7 will be around for a long time. MS has learned from the Vista
debacle, giving XP a long lifetime. They know that Win-8 is in a lot of
aspects a dud.

No, MS knows how to make money and Win 7 won't be around for a long time.

I just find it funny how every new Windows that comes out is spawn of
the devil.

Because some of them were.


... I remember when XP came out it was shunned as being far too
restrictive. lol I'm looking for bumper stickers, "you can have my Win
XP when you tear it from my cold, dead hands".

I liked it right from the start. And it's been good to me.


Win 7 was considered to be a poor choice, but became successful because
it was the only choice.

Right now it is but wasn't until recently.

--
Regards, Joerg

http://www.analogconsultants.com/
 
Martin Brown wrote:
On 04/11/2014 01:02, Joerg wrote:
[...]

I expect them to provide a way that programs can write into their
install directories. What is so difficult about that?

It is no longer considered good practice to permit this without asking
permission. ...

Still, a good OS must support legacy SW. This decision whether to allow
or not should be left to the customer.


... It could be writing things that modify executable code.

That's easily preventable if you give customers choices. They could, for
example, allow writing but exclude changes to executable files.


Legacy code that needs to do this should live in some 8.3 filename
compatible hovel from the root directory. You will otherwise find
programs that don't work because fully qualified filenames overflow
buffers in ancient DOS programs. Peeky pokey ancient print IO port stuff
is at the mercy of the OS as to whether or not it will work.

Again, a user could easily make directories that comply with old styles.


[...]

Comparatively few DOS programs are really tetchy about what version they
are on and some of them merely baulk at running on an OS that has
version numbers much higher than it expects to see.

Sure, but blanket-banning any 16-bit apps is a really bad idea. It
results in lost biz opportunity for an OS maker because people will be
leery of upgrades.


[...]


Nope. Software that writes into its program folder may not be ideal but
can be perfectly sound.

Yes. But if you want to do that for legacy keep a specific directory
where the permissions are right for this (ab)usage.

I wouldn't recommend installing it in user documents or the directory
name length starts getting a bit long for comfort.

That's all ok as long as the OS does not blanket-ban the old stuff.


[...]

Most windows programs now do allow you to put the libraries in user
documents or somewhere else that they can be modified safely.

Some older CAD doesn't and there's the problem. There is a lot of custom
software that is de facto irreplaceable.


[...]

You are probably out of luck if you have any of the software that
insists on having a dongle plugged into the non-existent Centronics
printer port these days.

Whoever buys dongled software brought the wrath upon themselves. That is
one thing I never did and never will do.

[...]

Other problem is that scientific instrument makers (and engineering
toolmakers) can't be bothered to provide drivers for 10 year old kit on
newer OSs and the gear will typically last for 15-20 years.

Make that 30+ years :)


Several of my wife's older lab instruments are firewalled off from the
corporate network because they run now unsupported legacy XP with no
prospect of ever upgrading to any new OS. Device drivers simply do not
exist - of course the maker would love to sell them a brand new one.

I came across one piece of production equipment that would only run
under Windows 3.2.


[...]

Mainly because it forces desktop users to leave greasy fingerprints on
their screens and the new GUI looks like Picasso on a bad acid trip.

I understand one can get it to behave somewhat normally but a regular
non-techie user can't get that done. Others just don't have the time to
fix it. So they avoid it.

--
Regards, Joerg

http://www.analogconsultants.com/
 
rickman wrote:
On 11/3/2014 6:59 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 4:57 PM, Lasse Langwadt Christensen wrote:
Den mandag den 3. november 2014 22.42.14 UTC+1 skrev rickman:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I miss
out on a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no
movies.

If you are going for power, you need to have separate video
memory or
the video eats memory bandwidth which is often the limiting factor
on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with
multi-banked
RAM. Does this machine have two or more memory interfaces or just
one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable
bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.
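
To put numbers on that (a throwaway Python sketch, assuming a single
1920x1080 monitor at 60 Hz):

# Refresh bandwidth: every displayed pixel is re-read every frame.
width, height, hz = 1920, 1080, 60
for bytes_per_pixel in (3, 4):
    bw = width * height * hz * bytes_per_pixel
    print(f"{bytes_per_pixel} B/pixel: {bw / 1e6:.0f} MB/s")
# Prints roughly 373 MB/s and 498 MB/s. A single frame is only
# ~6.2 MB; it's the continuous refresh traffic that costs bandwidth.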


With anything but a graphics card integrated in the chipset, that
memory will be on the card itself; 1920*1080*24 bit is less than 7 MB.

It is not the amount of memory, it is the video bandwidth to keep the
monitor refreshed. Yes, it should be separate from the main memory or
you take a hit from the video accesses.


It _is_ separate from the main memory on all modern graphics cards. The
core circuitry of a PC has nothing to do with screen refresh. That was
even the case with an old Tseng Labs card I had in the early 90's.

Ok, now I understand why you couldn't get what I am saying. Computers
have come a long way since the 90's. Most computers, desktop as well as
laptop now integrate the video controller into the main chipset and use
main memory as video RAM, *NOT* as a separate function with its own
memory.

Serious machines don't.


If you don't believe me look at the specs on a few systems.


Look at the Dell XPS series. It has the video completely separated. As
it should be.


Anything that talks about Intel XYZ graphics has an integrated
controller and shares main memory for video. In fact, you said
something about this yourself in this thread where you mentioned video
on the motherboard I believe.

Simple computers have that but the bigger machines do not. Or sometimes
have it but it's not being used because there is a big fat Nvidia or
other graphics card in there.


Video on the motherboard is usually integrated. If you get a graphics
card it will be separate. A very few motherboards have separate video
controller on board with separate video memory.

A "few"? When have you last looked at business-clas computers?

--
Regards, Joerg

http://www.analogconsultants.com/
 
rickman wrote:
On 11/3/2014 5:47 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I miss
out on a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor
on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just
one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

This has *nothing* to do with drawing the images into graphic memory.

The memory bank question will likely be more important than the number
of cores in the CPU. The guy who can run 16 threads has at least two
memory interfaces or it would be bogging down between 4 and 8 cores.
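
Rough numbers to illustrate (a Python sketch; the 12.8 GB/s figure is
the standard per-channel rate for DDR3-1600, the core counts are
assumed for the sake of the example):

# Per-core share of DRAM bandwidth vs. number of memory channels.
channel_gbps = 12.8  # DDR3-1600: 1600 MT/s * 8 bytes per channel
for channels in (1, 2):
    for cores in (4, 8, 16):
        share = channels * channel_gbps / cores
        print(f"{channels} channel(s), {cores} cores: {share:.1f} GB/s each")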


Well ... we did enter the 21st century. In this day and age graphics
cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of on
board RAM. Others have more but that sounds sufficient. Also, there is
no need to store 60 frames if the content is more or less static.

I don't know if you are playing with me or what. Yes, that is what I am
telling you, get a system with separate graphic memory which means a
separate graphics chip. Many mobos have built in video with *no* video
ram.

Yes, and that would suffice for my purposes. But in this thread we were
talking about a different class of computers, the Dell XPS series.

--
Regards, Joerg

http://www.analogconsultants.com/
 
Martin Riddle wrote:
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com
wrote:

Folks,

Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
that the Intel i7 is a really good processor for LTSPice. According to
this it looks like the 4790 is the fastest of the bunch:

http://www.intel.com/content/www/us/en/processors/core/core-i7-processor.html

So, what do thee say, is the computer in the Costco link below a good
deal for LTSpice purposes?

http://www.costco.com/Dell-XPS-8700-Desktop-%7c-Intel-Core-i7-%7c-1GB-Graphics-%7c-Windows-7-Professional.product.100131208.html

It's also available without MS-Office Home & Student 2013 for $100 less
but I found that OpenOffice isn't 100% compatible in the Excel area so
that sounds like an ok deal. My hope is that it can drive two 27"
monitors but I guess I can always add in another graphics card if not.

Reason I am looking at these is that I absolutely positively do not want
any computer with Windows 8 in here and unfortunately that's what many
others come with.


Should be fine for ltspice. The 1600 DRAM makes a huge difference.
If you want a real screamer, then it's another story. Xeon class, a
real number cruncher.

Do you have any suggestion from a mainstream manufacturer? I need
something with a screaming processor, lots of memory but do not need
much disk space and certainly no fancy 3D graphics (but the big machines
seem to always have that these days and it'll just waste electricity).

--
Regards, Joerg

http://www.analogconsultants.com/
 
On Tue, 04 Nov 2014 07:16:07 -0800, Joerg <news@analogconsultants.com>
Gave us:

Do you have any suggestion from a mainstream manufacturer? I need
something with a screaming processor, lots of memory but do not need
much disk space and certainly no fancy 3D graphics (but the big machines
seem to always have that these days and it'll just waste electricity).

Memory depends on a few things. This guy did a test, but fails to
note that his motherboard may also be a factor in his findings, and the
same tests may yield entirely different results with the same cpu and
RAM on a different mobo.

But it looks like 1888 is where the gain maxes out on his system.
Mine will do 2400, but I put 32GB of 2133 in it, which is still faster
than it was made for.

My benchmarks beat ALL of his though.

http://www.youtube.com/watch?v=dWgzA2C61z4
 
On Tue, 04 Nov 2014 07:16:07 -0800 Joerg <news@analogconsultants.com>
wrote in Message id: <cbs8tnFddeU1@mid.individual.net>:

Do you have any suggestion from a mainstream manufacturer? I need
something with a screaming processor, lots of memory but do not need
much disk space and certainly no fancy 3D graphics (but the big machines
seem to always have that these days and it'll just waste electricity).

I don't, but this might help you pick a processor.
http://www.cpubenchmark.net/high_end_cpus.html

Stick with a 4th gen (Haswell) processor.
http://ark.intel.com/products/family/75023/4th-Generation-Intel-Core-i7-Processors#@Desktop
http://ark.intel.com/products/family/75024/4th-Generation-Intel-Core-i5-Processors#@Desktop
 
Joerg wrote:

Folks,

Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
that the Intel i7 is a really good processor for LTSPice. According to
this it looks like the 4790 is the fastest of the bunch:
I think you want to run a system with several Xeon processors and a
version of SPICE that uses multiple processors. You might even be able
to find a used unit on eBay really cheap.

Jon
 
On Tue, 04 Nov 2014 11:00:36 -0600, Jon Elson <elson@pico-systems.com>
Gave us:

Joerg wrote:

Folks,

Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
that the Intel i7 is a really good processor for LTSPice. According to
this it looks like the 4790 is the fastest of the bunch:
I think you want to run a system with several Xeon processors and a
version of SPICE that uses multiple processors. You might even be able
to find a used unit on eBay really cheap.

Jon

Buy an old HP DL360. It is only 1U but it is jam packed with all
manner of things computing...

The G5s are cheap since the industry is at about G7 or G8 now.

http://www.ebay.com/sch/i.html?_from=R40&_trksid=p2050601.m570.l1313&_nkw=DL360&_sacat=0

They have two Xeon sockets in them, and gobs of RAM slots.

Companies change them out for the new stuff all the time.

Some even have to "scrap" the old.

Way too much waste in this nation and the world, for that matter.
 
