Is this Intel i7 machine good for LTspice?

Lasse Langwadt Christensen wrote:
On Monday, 3 November 2014 at 22:11:48 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Monday, 3 November 2014 at 21:58:14 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Monday, 3 November 2014 at 21:20:35 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Sunday, 2 November 2014 at 18:24:23 UTC+1, Joerg wrote:
Joerg wrote:
Carl Ijames wrote:
Don't know about computation speed, but this link says the
video card will drive 3 monitors:
http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/specifications.
Looking at Dell's site I don't see any mention of
expansion slots, and looking at the one picture with the
cover off I really can't see any sockets beyond the video
card, so if any further expansion is important you need to
ask Dell for clarification.

Looks like you are right:

http://www.dell.com/ed/business/p/xps-8700/pd
http://core0.staticworld.net/images/article/2013/07/1253541_sr-1160-100047019-orig.jpg

http://www.pcworld.com/article/2047487/dell-xps-8700-special-editions-review-a-little-less-performance-for-a-lot-less-cash.html


Quote "There's only one PCIe x16 slot, which means you won't
be able to add a second video card to take advantage of
Nvidia's SLI technology".

No slots. There's one more card in the bottom, not sure what
that is. But if the video can drive three monitors it should
be fine, I never added any cards to my current PC either.

Only question is, how can one connect two regular PC monitors
to this?

http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/product-images


I'd expect that you can connect a monitor to each of the three
outputs: VGA, DVI, HDMI. I have an old GeForce and that's how it
works

VGA is not much use, but unless you want to watch something from
Hollywood, DVI and HDMI are the same thing

I do a lot of video conferencing via web where content moves. Other
than that just CAD, no movie streaming and such.

Then DVI will work just fine; HDMI is just DVI with optional audio
and the encryption Hollywood insists on if you bought a Blu-ray movie


So just plug a monitor into both the HDMI and DVI output

Ok, but can one be sure that an ordinary cheap 27" 1920*1080 monitor
will plug into either of them? For example, the ViewSonic VA2702w I have
here only has the large DVI connector, not the narrow HDMI. It does have
VGA though which I am using right now (good enough for my purposes).

yes, for a regular computer monitor HDMI and DVI are the same thing, you just
need the right cable or an adapter to get the wires into the right holes ;)

So then here in the photo the center one is HDMI and the right one is
DVI and that's where the two monitors should go to? I could also hook
one up to VGA like I have now.

yes, you can hook up three monitors

http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/product-images

Says dual-link or DVI-D for the DVI connector in the specs, whatever
that means.


single-link is three differential pairs, dual-link has three extra pairs that are used for the higher bandwidth needed for very high resolutions

AFAIR single-link DVI is limited to 1920x1200 at 60 Hz
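
As a rough sanity check (Python; the 165 MHz single-link pixel clock
ceiling and the ~25% blanking overhead are from memory, so treat both
as assumptions rather than spec quotes):

  # Estimate the pixel clock a video mode needs and compare it
  # against an assumed 165 MHz single-link DVI ceiling.
  def pixel_clock_mhz(h, v, refresh_hz, blanking=1.25):
      # blanking pads the active pixels for horizontal/vertical retrace
      return h * v * refresh_hz * blanking / 1e6

  for mode in [(1920, 1080, 60), (1920, 1200, 60), (2560, 1600, 60)]:
      mhz = pixel_clock_mhz(*mode)
      verdict = "single link ok" if mhz <= 165 else "dual link (or reduced blanking)"
      print(mode, "%.0f MHz" % mhz, verdict)

1920x1080@60 comes out around 156 MHz (fits), 1920x1200@60 around
173 MHz (only fits with reduced blanking, hence the limit above), and
2560x1600@60 around 307 MHz (dual-link territory).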

Thanks. 1920*1080 at 60Hz would be all I need.

--
Regards, Joerg

http://www.analogconsultants.com/
 
Phil Hobbs wrote:
On 11/02/2014 01:17 PM, Phil Hobbs wrote:
On 11/2/2014 12:45 PM, John Larkin wrote:
On Sun, 02 Nov 2014 11:06:30 -0500, Phil Hobbs <hobbs@electrooptical.net> wrote:

On 11/2/2014 11:00 AM, John Larkin wrote:
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com> wrote:

Folks,

Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
that the Intel i7 is a really good processor for LTspice. According to
this it looks like the 4790 is the fastest of the bunch:

http://www.intel.com/content/www/us/en/processors/core/core-i7-processor.html



So, what do thee say, is the computer in the Costco link below a good
deal for LTspice purposes?

http://www.costco.com/Dell-XPS-8700-Desktop-%7c-Intel-Core-i7-%7c-1GB-Graphics-%7c-Windows-7-Professional.product.100131208.html



It's also available without MS-Office Home & Student 2013 for $100 less
but I found that OpenOffice isn't 100% compatible in the Excel area, so
that sounds like an ok deal. My hope is that it can drive two 27"
monitors but I guess I can always add in another graphics card if not.

Reason I am looking at these is that I absolutely positively do not want
any computer with Windows 8 in here and unfortunately that's what many
others come with.

I have spent too many hours this weekend tweaking the transient
response of a semi-hysteretic (we call it "hysterical") switchmode
constant-current source. There are about 8 interacting knobs to turn.
At 30 seconds per run, understanding the interactions is impossible.

I want sliders on each of the part values, and I want to see the
waveforms change as I move the sliders, like they were trimpots on a
breadboard and I was looking at a scope. I need maybe 500 times the
compute power that I have now.

Mike should code LT Spice to execute on a high-end video card.



You can go quite a bit faster with a nice multicore machine--LTspice
lets you choose how many threads to run. My desktop machine (about 3
years old now) runs about 150 Gflops peak. Supermicro is an excellent
vendor.

Cheers

Phil Hobbs

There's a setting for one or two threads. Is that all?


That's because you only have two cores. Mine goes up to 15.

16 actually. Here's a picture:
http://electrooptical.net/pictures/LTspice16threads.png

That sounds like a high-testosterone machine of a computer :)

Which processors are in there?

--
Regards, Joerg

http://www.analogconsultants.com/
 
rickman wrote:
On 11/3/2014 3:41 PM, Joerg wrote:
rickman wrote:
On 11/2/2014 12:53 PM, Joerg wrote:
Jeff Liebermann wrote:
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com> wrote:

LTspice benchmark on various machines:
http://fetting.se/images/PC%20Speed%20Benchmark%20running%20LTspice%20circuits.pdf



Reason I am looking at these is that I absolutely positively do not want
any computer with Windows 8 in here and unfortunately that's what many
others come with.

Windoze 8.1 can be made semi-tolerable by putting the start menu back
in and making it look like Windoze 7.
http://www.classicshell.net
I've been installing it on all my customers Windoze 8.1 machines and
have had no complaints or problems. If you like wiggly icons on the
Windoze 8.1 start screen, you can do <Shift><Start>.


Too much risk. I've heard that running legacy software is tough in
Win-8
but Win-7 can mostly do it. Not as good as XP.

What legacy software? I have Windows 8 and I'm not having problems
running anything I ran on my old Vista laptop.


Ahm, my SW goes back to the mid-80's. Unorthodox filter design,
beamfield simulators and such.

So DOS programs? What makes you think they won't work under Win8? The
usual FUD?

I've read that many DOS and also Windows 16-bit programs no longer run.


When it comes to PCs I am lazy :)

I just want to plug it in and go. Re-installing all my stuff takes
enough time already.

I hear you. The big problem I had with setting up my Win 8 laptop was
that a lot of the freeware has become burdened with ads, toolbars and
other malware to the point I'm not willing to use it.


Yes, nagware is a major problem. It already was 10 years ago, when it
took a lot of effort to rid the computer of that.


Before buying anything, I suggest you try LTspice on the new machine.
This is VERY easy with LTspice which doesn't use the registry or
require admin rights. Just copy the files to a flash drive and it
should work.

I am quite sure Costco will not let me do this :)

You can try finding the computer salesperson in the store. They are
limited by store policy of course, but I have met a few who were very
willing to help as best they could.


They only have them online.

I've never been a fan of Costco for computers, with one exception. They
let you return a computer, no questions asked for 90 days I believe. So
you can try it out at home.

Yep. And all I have to try is LTspice, the rest will work.

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm
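
For example, a shortcut target along these lines keeps the .ini next
to the program instead of in C:\windows (the path is only an
illustration, point it at wherever you installed it):

  scad3.exe -ini "C:\Programs\LTC\LTspiceIV\scad3.ini"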

I need to note this somewhere. Writing to the Windows directory is a
*very* bad idea. I can't tell you how many developers do all sorts of
things they aren't supposed to under windows. That is the actual cause
of many problems people have running older software under Windows. They
don't listen to the people providing them with the OS!


I install everything in my own directory called "Programs". That avoids
a lot of such issues. Makes it tough in a multi-user environment but I
work alone here.

That helps one aspect, the nags from the OS about writing data. But you
have lost the benefit of the Program Files directory being protected. It
makes your executables that much easier to infect, although that is not
typically a problem since good AVS stops malware long before it gets a
chance to infect the hard drive.

I never had that happen. Plus the SW I run is mostly not mainstream and
should be very low on the hit list of hackers.

--
Regards, Joerg

http://www.analogconsultants.com/
 
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.
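
Worked out (Python, plain arithmetic, taking the 4-byte case):

  # Continuous scan-out bandwidth for one 1920x1080 monitor at 60 Hz,
  # 4 bytes per pixel (32-bit color).
  bytes_per_second = 1920 * 1080 * 60 * 4
  print(bytes_per_second / 1e6)  # ~498 MB/s, read out continuously

That's roughly half a gigabyte per second per display just to keep the
picture up.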

This has *nothing* to do with drawing the images into graphic memory.

The memory bank question will likely be more important than the number
of cores in the CPU. The guy who can run 16 threads has at least two
memory interfaces or it would be bogging down between 4 and 8 cores.

--

Rick
 
Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:

On 11/2/2014 12:53 PM, Joerg wrote:
Too much risk. I've heard that running legacy software is tough in Win-8
but Win-7 can mostly do it. Not as good as XP.

What legacy software? I have Windows 8 and I'm not having problems
running anything I ran on my old Vista laptop.

I recently awarded myself a short vacation in honor my burning a huge
amount of time getting old software to run nicely on Windoze 8.1.
Specifically, the DOS versions of various fiduciary programs dating
1996 through 2002, which the customer insisted had to run even though
later versions worked just fine. The problem was that the tax rules
and tables all changed over the years and they wanted the original
versions. I ended up running them under DOSbox, which was originally
designed to run ancient games, but works equally well with ancient
business applications:
http://www.dosbox.com/status.php?show_status=1
I also tried them under VMware and VirtualBox, both of which worked
nicely, but DOSbox is easier and faster.
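
For reference, a typical DOSbox session is just this (the directory
and program name are made up for illustration):

  mount c c:\oldapps
  c:
  taxprog.exe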

Another horror was Office 2003 on Windoze 8.1. It installs, updates,
loads, and looks like it might work, but eventually crashes. All I
really needed was Outlook 2003, but that would hang after polling for
mail a few times. I probably could have figured out the problem, but
convinced the customer that Mozilla Thunderbird would be a suitable
option.

Then, there's WordPerfect 12 which I think was introduced in 2002.
Amazingly, it worked 99%. However, the 1% was fatal. Windoze file
association would not start WP12 if I double clicked on a WPD file (or
any of the other WP files). It took a while to figure out that WP12
was trying to use an ancient ODBC version, which required that WP12
beg permission of the Windoze security abomination before it would
condescend to even supply an error message. Fixed by running WP12 as
administrator, which bypasses most of the security mess.

I guess the moral here is to not try to run 12+ year old software on
Windoze 8.1. My mistake was assuming that since all the
aforementioned software ran just fine in Windoze 7, the new and
improved Windoze 8.1 couldn't possibly break something that already
worked so well.

That kind of stuff cinches it for me: It has to be Win-7.

[...]

--
Regards, Joerg

http://www.analogconsultants.com/
 
On Mon, 03 Nov 2014 15:53:13 -0500, rickman <gnuarm@gmail.com> Gave us:

If you would care to do a little digging for info on this I'm sure you
can find something that will help you learn.

What part of "can be turned off in the BIOS" did you not understand?

I know full well what it is.

I also know that YOU do not know what is going on, regardless of what
you read, and are happy to paste it back in here, appearing as if
to be a professor on the subject. You make me laugh.

We are talking about basic grasp here.

Nothing will help you learn, old man. You are hard wired stupid.
 
On Mon, 03 Nov 2014 16:01:39 -0500, rickman <gnuarm@gmail.com> Gave us:

So DOS programs? What makes you think they won't work under Win8? The
usual FUD?

So, you really know nothing about actual attachment to real hardware
hooks then, eh, dingledorf?

You ain't real bright, boy.
 
rickman wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm

I need to note this somewhere. Writing to the Windows directory is a
*very* bad idea.

It was standard procedure in Windoze 3.1, where almost all
applications dropped pick_a_name.ini files in the C:\Windows\
directory.

Yes, and Windows 3.1 crashed on a regular basis for about any reason
whatsoever just like 95, 98 and ME.

MS has been telling developers since Win2000 and maybe since NT to not
put data files in the Windows or Program Files directories. Many chose
to ignore this which wasn't enforced until Vista and became one of the
things everyone loves to hate about Vista.

Maybe. But for us users only one thing counts: That stuff works.

I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze hide these
config files in either the registry, or bury them 5 directory layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is up to
the app to decide. Getting to these directories is easy if they used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files and now with Win8 MS puts the files in the long path
name you list, but I believe they can be reached transparently through
the path C:\Program Files. So, the best of both worlds.

If the app puts them somewhere else, don't blame windows.
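
A minimal sketch of doing it right (Python; "MyApp" is a placeholder
name, %APPDATA% is the per-user spot and C:\ProgramData the all-users
one):

  # Write per-user settings under %APPDATA% instead of C:\Windows or
  # C:\Program Files. "MyApp" is a made-up application name.
  import os

  cfg_dir = os.path.join(os.environ["APPDATA"], "MyApp")
  os.makedirs(cfg_dir, exist_ok=True)
  with open(os.path.join(cfg_dir, "settings.ini"), "w") as f:
      f.write("[window]\nwidth=800\n")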

If it was allowed in old Windows, isn't in new Windows, and there isn't
a user selector about this then I blame Windows.

I can't tell you how many developers do all sorts of
things they aren't supposed to under windows. That is the actual cause
of many problems people have running older software under Windows. They
don't listen to the people providing them with the OS!

LTspice (aka SwitcherCAD) is a rather old program, with many of the
traditions of Windoze 3.1 still present. If you don't like that, try
running some of the various NEC antenna modeling programs, that still
use the terms "card" and "deck" from the Hollerith punch card era. The
common mantra is the same everywhere... if it works, don't touch it.

These programs have been updated many, many times since Windows 3.1.
Windows NT, 2k, XP, Vista, 7, 8 and 8.1 aren't even the same OS as the
3.1 tree which was ended when XP was released. Stick with the old
habits and blame yourself or your program maintainer.

I use some open source Windows software that does the same crap and I am
very vocal about the cause and the fix for the problem. Few of the
developers are interested though. Now that 8 makes this (using Program
Files for data) work adequately they no longer have a need to change it.

If you are relying on programming habits from over 20 years ago, then
you will have to stew in your own soup.

Easy to say for someone who probably never has to deal with beamfield
sims and such. Bottom line, there are programs some of us have to use
where there is no alternative. Where the design teams have dissolved
decades ago and some of the folks are not with us on earth anymore. My
record so far is a chunk of software that was stored on an 8" floppy.

Software does not automatically lose its value because it is over 20
years old. Or would you pour a bottle of 1995 Domaine Leflaive
Montrachet Grand Cru [*] into the sink because it is old?

Talking about using legacy stuff, the aircraft guys are a bit more
extreme there. This aircraft is going to celebrate its 80th soon and is
used commercially:

https://www.youtube.com/watch?v=jx11k1r1Pm8

[*] It runs north of $5k. Per bottle.

--
Regards, Joerg

http://www.analogconsultants.com/
 
rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I
miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

This has *nothing* to do with drawing the images into graphic memory.

The memory bank question will likely be more important than the number
of cores in the CPU. The guy who can run 16 threads has at least two
memory interfaces or it would be bogging down between 4 and 8 cores.

Well ... we did enter the 21st century. In this day and age graphics
cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of on
board RAM. Others have more but that sounds sufficient. Also, there is
no need to store 60 frames if the content is more or less static.

--
Regards, Joerg

http://www.analogconsultants.com/
 
On 03/11/2014 20:37, Joerg wrote:

Ok, but for example a gamer machine like the XPS series is a pretty good
bet that it'll perform well with SPICE.

If you can persuade them to do it, a gamer machine with the graphics card
entirely deleted will be the cheapest low-power combo to do about what
you want. The 2D capability of the Intel graphics engine internal to the
i5 & i7 CPUs is as fast as anything on fancy 3D gaming cards
(obviously they get totally thrashed in 3D realtime rendering tests).

Basically you can shave 100-200W off the power consumption. My i7 PC
idles at about 60W when it isn't doing anything beyond web browsing.

BTW I wouldn't waste your money on exotic faster RAM unless you intend
to overclock it. Stock RAM and more of it is better price/performance.

I'd be interested to see how a moderate-sized LTspice simulation scales
with the number of threads on an i5 and i7 architecture. My guess is
that hyperthreading will not be all that useful to it.
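
Something like this would measure it (Python; scad3.exe's -b batch
switch is documented, but the paths are placeholders and the thread
count has to be changed in LTspice's Control Panel between runs):

  # Time one batch-mode LTspice run; repeat after changing the
  # thread count in the Control Panel.
  import subprocess, time

  LTSPICE = r"C:\Programs\LTC\LTspiceIV\scad3.exe"  # assumed path
  DECK = r"C:\sims\benchmark.net"                   # your test netlist

  t0 = time.time()
  subprocess.call([LTSPICE, "-b", DECK])
  print("wall time: %.1f s" % (time.time() - t0))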

--
Regards,
Martin Brown
 
Martin Brown wrote:
On 03/11/2014 20:37, Joerg wrote:

Ok, but for example a gamer machine like the XPS series is a pretty good
bet that it'll perform well with SPICE.

If you can persuade them to do it, a gamer machine with the graphics card
entirely deleted will be the cheapest low-power combo to do about what
you want. The 2D capability of the Intel graphics engine internal to the
i5 & i7 CPUs is as fast as anything on fancy 3D gaming cards
(obviously they get totally thrashed in 3D realtime rendering tests).

Basically you can shave 100-200W off the power consumption. My i7 PC
idles at about 60W when it isn't doing anything beyond web browsing.

Problem is, unless you piece together a custom machine the ones that are
equipped with good processors and RAM up to the gills seem to always
come with these powerful graphics cards. The other issue is that
on-board graphics often will not drive two monitors. I found that out
the hard way after I bought the PC I am using now.


BTW I wouldn't waste your money on exotic faster RAM unless you intend
to overclock it. Stock RAM and more of it is better price/performance.

So you think 1600MHz RAM is fine?


I'd be interested to see how a moderate-sized LTspice simulation scales
with the number of threads on an i5 and i7 architecture. My guess is
that hyperthreading will not be all that useful to it.

I am hoping the four cores will speed things up significantly. Also the
huge amount of RAM. Right now I have 2GB and I regularly hit the limit.

--
Regards, Joerg

http://www.analogconsultants.com/
 
In article <01a8a0d7-a080-4ff4-ae7d-961154a235b8@googlegroups.com>,
langwadt@fonz.dk says...
Den mandag den 3. november 2014 22.42.14 UTC+1 skrev rickman:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.


with anything but a graphics-card integrated in the chipset that memory
will be on the card itself, 1920*1080*24bit is less than 7MB

-Lasse

how do you figure that?

As far as I know, most video cards map a whole 32 bits per pixel for
24-bit (true color) these days.

That would equate to somewhere around 67 megs I do think.

I don't know, maybe there is some compression magic I don't know
about..

Jamie
 
On 11/3/2014 5:09 PM, Joerg wrote:
rickman wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm

I need to note this somewhere. Writing to the Windows directory is a
*very* bad idea.

It was standard procedure in Windoze 3.1, where almost all
applications dropped pick_a_name.ini files in the C:\Windows\
directory.

Yes, and Windows 3.1 crashed on a regular basis for about any reason
whatsoever just like 95, 98 and ME.

MS has been telling developers since Win2000 and maybe since NT to not
put data files in the Windows or Program Files directories. Many chose
to ignore this which wasn't enforced until Vista and became one of the
things everyone loves to hate about Vista.


Maybe. But for us users only one thing counts: That stuff works.

Do you build your stuff so that if the user connects a different
computer it craps out? No, you design your interfaces *correctly* so
that it works now and it keeps working when some peripheral piece that
should have no impact is changed out.

These developers are designing crappy software and blaming it on MS.


I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze hide these
config files in either the registry, or bury them 5 directory layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is up to
the app to decide. Getting to these directories is easy if they used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files and now with Win8 MS puts the files in the long path
name you list, but I believe they can be reached transparently through
the path C:\Program Files. So, the best of both worlds.

If the app puts them somewhere else, don't blame windows.


If it was allowed in old Windows, isn't in new Windows, and there isn't
a user selector about this then I blame Windows.

"Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
developers were not designing according to best practices, no.


I can't tell you how many developers do all sorts of
things they aren't supposed to under windows. That is the actual cause
of many problems people have running older software under Windows. They
don't listen to the people providing them with the OS!

LTspice (aka SwitcherCAD) is a rather old program, with many of the
traditions of Windoze 3.1 still present. If you don't like that, try
running some of the various NEC antenna modeling programs, that still
use the terms "card" and "deck" from the Hollerith punch card era. The
common mantra is the same everywhere... if it works, don't touch it.

These programs have been updated many, many times since Windows 3.1.
Windows NT, 2k, XP, Vista, 7, 8 and 8.1 aren't even the same OS as the
3.1 tree which was ended when XP was released. Stick with the old
habits and blame yourself or your program maintainer.

I use some open source Windows software that does the same crap and I am
very vocal about the cause and the fix for the problem. Few of the
developers are interested though. Now that 8 makes this (using Program
Files for data) work adequately they no longer have a need to change it.

If you are relying on programming habits from over 20 years ago, then
you will have to stew in your own soup.


Easy to say for someone who probably never has to deal with beamfield
sims and such. Bottomline there are programs some of us have to use
where there is no alternative. Where the design teams have dissolved
decades ago and some of the folks are not with us on earth anymore. My
record so far is a chunk of software that was stored on an 8" floppy.

Yeah, exactly. If you are that far back in time you may need to rethink
your approach.


Software does not automatically lose its value because it is over 20
years old. Or would you pour a bottle of 1995 Domaine Leflaive
Montrachet Grand Cru [*] into the sink because it is old?

Actually software does degrade with time as you are finding out. If you
can't find a platform to run it on, it has worn out.


Talking about using legacy stuff, the aircraft guys are a bit more
extreme there. This aircraft is going to celebrate its 80th soon and is
used commercially:

https://www.youtube.com/watch?v=jx11k1r1Pm8

[*] It runs north of $5k. Per bottle.

Good, maybe your beamfield sim will run on it. :)

--

Rick
 
On 11/3/2014 4:57 PM, Lasse Langwadt Christensen wrote:
Den mandag den 3. november 2014 22.42.14 UTC+1 skrev rickman:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.


with anything but a graphics-card integrated in the chipset that memory
will be on the card itself, 1920*1080*24bit is less than 7MB

It is not the amount of memory, it is the video bandwidth to keep the
monitor refreshed. Yes, it should be separate from the main memory or
you take a hit from the video accesses.

--

Rick
 
Maynard A. Philbrook Jr. wrote:
In article <01a8a0d7-a080-4ff4-ae7d-961154a235b8@googlegroups.com>,
langwadt@fonz.dk says...
Den mandag den 3. november 2014 22.42.14 UTC+1 skrev rickman:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:
[...]

Since I cannot afford to put $1000 into a Titan video card, I miss on
a few benchmarks with my $250 GTX650.

I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.
If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?

No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.
You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

with anything but a graphics-card integrated in the chipset that memory
will be on the card itself, 1920*1080*24bit is less than 7MB

-Lasse

how do you figure that?

As far as I know, most video cards map a whole 32 bits per pixel for
24-bit (true color) these days.

That would equate to somewhere around 67 megs I do think.

That's megabits. In bytes this would be 8.3MB. Times two if there's two
displays. With the massive quantity of onboard memory on modern graphics
cards that is a mere drop in the bucket.
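
The arithmetic, for the record (Python):

  # Frame buffer for one 1920x1080 display at 32 bits per pixel.
  bits = 1920 * 1080 * 32
  print(bits / 1e6)      # ~66.4 megabits -- the "67 megs"
  print(bits / 8 / 1e6)  # ~8.3 MB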


I don't know, maybe there is some compression magic I don't know
about..

No need for compression.

--
Regards, Joerg

http://www.analogconsultants.com/
 
On 11/3/2014 5:47 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I
miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

This has *nothing* to do with drawing the images into graphic memory.

The memory bank question will likely be more important than the number
of cores in the CPU. The guy who can run 16 threads has at least two
memory interfaces or it would be bogging down between 4 and 8 cores.


Well ... we did enter the 21st century. In this day and age graphics
cards come with their own memory. AFAIK the Nvidia GT720 has 1GB of on
board RAM. Others have more but that sounds sufficient. Also, there is
no need to store 60 frames if the content is more or less static.

I don't know if you are playing with me or what. Yes, that is what I am
telling you, get a system with separate graphic memory which means a
separate graphics chip. Many mobos have built in video with *no* video
ram.

--

Rick
 
On 11/3/2014 4:31 PM, Joerg wrote:
Phil Hobbs wrote:
On 11/02/2014 01:17 PM, Phil Hobbs wrote:
On 11/2/2014 12:45 PM, John Larkin wrote:
On Sun, 02 Nov 2014 11:06:30 -0500, Phil Hobbs <hobbs@electrooptical.net> wrote:

On 11/2/2014 11:00 AM, John Larkin wrote:
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com> wrote:

Folks,

Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
that the Intel i7 is a really good processor for LTspice. According to
this it looks like the 4790 is the fastest of the bunch:

http://www.intel.com/content/www/us/en/processors/core/core-i7-processor.html



So, what do thee say, is the computer in the Costco link below a good
deal for LTspice purposes?

http://www.costco.com/Dell-XPS-8700-Desktop-%7c-Intel-Core-i7-%7c-1GB-Graphics-%7c-Windows-7-Professional.product.100131208.html



It's also available without MS-Office Home & Student 2013 for $100 less
but I found that OpenOffice isn't 100% compatible in the Excel area, so
that sounds like an ok deal. My hope is that it can drive two 27"
monitors but I guess I can always add in another graphics card if not.

Reason I am looking at these is that I absolutely positively do not want
any computer with Windows 8 in here and unfortunately that's what many
others come with.

I have spent too many hours this weekend tweaking the transient
response of a semi-hysteretic (we call it "hysterical") switchmode
constant-current source. There are about 8 interacting knobs to turn.
At 30 seconds per run, understanding the interactions is impossible.

I want sliders on each of the part values, and I want to see the
waveforms change as I move the sliders, like they were trimpots on a
breadboard and I was looking at a scope. I need maybe 500 times the
compute power that I have now.

Mike should code LT Spice to execute on a high-end video card.



You can go quite a bit faster with a nice multicore machine--LTspice
lets you choose how many threads to run. My desktop machine (about 3
years old now) runs about 150 Gflops peak. Supermicro is an excellent
vendor.

Cheers

Phil Hobbs

There's a setting for one or two threads. Is that all?


That's because you only have two cores. Mine goes up to 15.

16 actually. Here's a picture:
http://electrooptical.net/pictures/LTspice16threads.png


That sounds like a high-testosterone machine of a computer :)

Which processors are in there?

It has a pair of AMD Opteron 6128s. I haven't been keeping up, but 3
years ago the Magny Cours Opterons ran rings around the Intel offerings
for floating point.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
 
rickman wrote:
On 11/3/2014 5:09 PM, Joerg wrote:
rickman wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm


I need to note this somewhere. Writing to the Windows directory is a
*very* bad idea.

It was standard procedure in Windoze 3.1, where almost all
applications dropped pick_a_name.ini files in the C:\Windows\
directory.

Yes, and Windows 3.1 crashed on a regular basis for about any reason
whatsoever just like 95, 98 and ME.

MS has been telling developers since Win2000 and maybe since NT to not
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
put data files in the Windows or Program Files directories. Many chose
to ignore this which wasn't enforced until Vista and became one of the
things everyone loves to hate about Vista.


Maybe. But for us users only one thing counts: That stuff works.

Do you build your stuff so that if the user connects a different
computer it craps out? No, you design your interfaces *correctly* so
that it works now and it keeps working when some peripheral piece that
should have no impact is changed out.

I design it so that it also works correctly with legacy gear. In
aerospace that can mean equipment from before you and I were born.


These developers are designing crappy software and blaming it on MS.

I've underlined the important part above. You might remember that there
were operating systems before Win2k and that there was software written
for those.

I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze hide these
config files in either the registry, or bury them 5 directory layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is up to
the app to decide. Getting to these directories is easy if they used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files and now with Win8 MS puts the files in the long path
name you list, but I believe they can be reached transparently through
the path C:\Program Files. So, the best of both worlds.

If the app puts them somewhere else, don't blame windows.


If it was allowed in old Windows, isn't in new Windows, and there isn't
a user selector about this then I blame Windows.

"Allowed" meaning it didn't crap out, yes. "Allowed" meaning the
developers were not designing according to best practices, no.

If it was not disallowed it was ok. Even today it's still this way.
Personally I also think it was wrong but it is what it is. Many CAD
programs still store their libraries in the program folder and,
naturally, libraries are meant to be modified and added to.

I can't tell you how many developers do all sorts of
things they aren't supposed to under windows. That is the actual
cause
of many problems people have running older software under Windows.
They
don't listen to the people providing them with the OS!

LTspice (aka SwitcherCAD) is a rather old program, with many of the
traditions of Windoze 3.1 still present. If you don't like that, try
running some of the various NEC antenna modeling programs, that still
use the terms "card" and "deck" from the Hollerith punch card era. The
common mantra is the same everywhere... if it works, don't touch it.

These programs have been updated many, many times since Windows 3.1.
Windows NT, 2k, XP, Vista, 7, 8 and 8.1 aren't even the same OS as the
3.1 tree which was ended when XP was released. Stick with the old
habits and blame yourself or your program maintainer.

I use some open source Windows software that does the same crap and I am
very vocal about the cause and the fix for the problem. Few of the
developers are interested though. Now that 8 makes this (using Program
Files for data) work adequately they no longer have a need to change it.

If you are relying on programming habits from over 20 years ago, then
you will have to stew in your own soup.


Easy to say for someone who probably never has to deal with beamfield
sims and such. Bottom line, there are programs some of us have to use
where there is no alternative. Where the design teams have dissolved
decades ago and some of the folks are not with us on earth anymore. My
record so far is a chunk of software that was stored on an 8" floppy.

Yeah, exactly. If you are that far back in time you may need to rethink
your approach.

Any suggestions there? Should I tell my clients that this, that and the
other project can't be done because only older design tools are available?

Imagine you having a leak in the house. The plumber comes, takes a look
and says "That wouldn't be up to code these days although it was back
then. I suggest you build a new house and have this one torn down".

Software does not automatically lose its value because it is over 20
years old. Or would you pour a bottle of 1995 Domaine Leflaive
Montrachet Grand Cru [*] into the sink because it is old?

Actually software does degrade with time as you are finding out. If you
can't find a platform to run it on, it has worn out.

My software does not wear out. My hardware does. I get more and more into
pulse-echo stuff and esoteric switch mode designs where the amount of
data to be crunched overwhelms the machine.

BTW, when the in-circuit tester at one client conked out the culprit was
the PC. Had an ISA bus. It was no problem to buy a brand-new machine at
a very reasonable price that has an ISA bus. Except now they also have a
CD drive in it.

Talking about using legacy stuff, the aircraft guys are a bit more
extreme there. This aircraft is going to celebrate its 80th soon and is
used commercially:

https://www.youtube.com/watch?v=jx11k1r1Pm8

[*] It runs north of $5k. Per bottle.

Good, maybe your beamfield sim will run on it. :)

:)

Example from a few years ago: A nasty alarm system problem had to be
diagnosed. The software from that system was from the 80's. If I hadn't
been able to run really old software here at my lab I would have had
to turn down that whole job. That would not be what I call smart.

--
Regards, Joerg

http://www.analogconsultants.com/
 
rickman wrote:
On 11/3/2014 4:57 PM, Lasse Langwadt Christensen wrote:
Den mandag den 3. november 2014 22.42.14 UTC+1 skrev rickman:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card,
I miss on
a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor
on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just
one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.


with anything but a graphics-card integrated in the chipset that memory
will be on the card itself, 1920*1080*24bit is less than 7MB

It is not the amount of memory, it is the video bandwidth to keep the
monitor refreshed. Yes, it should be separate from the main memory or
you take a hit from the video accesses.

It _is_ separate from the main memory on all modern graphics cards. The
core circuitry of a PC has nothing to do with screen refresh. That was
even the case with an old Tseng Labs card I had in the early 90's.

--
Regards, Joerg

http://www.analogconsultants.com/
 
