Is this Intel i7 machine good for LTSpice?

In article <k4pc5ad6p89qps1blou358m1s2lof9q7ij@4ax.com>,
jeffl@cruzio.com says...
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com>
wrote:

LTspice benchmark on various machines:
http://fetting.se/images/PC%20Speed%20Benchmark%20running%20LTspice%20circuits.pdf

Reason I am looking at these is that I absolutely positively do not want
any computer with Windows 8 in here and unfortunately that's what many
others come with.

Windoze 8.1 can be made semi-tolerable by putting the start menu back
in and making it look like Windoze 7.
http://www.classicshell.net
I've been installing it on all my customers' Windoze 8.1 machines and
have had no complaints or problems. If you like wiggly icons on the
Windoze 8.1 start screen, you can do <Shift><Start>.

The damage control version of Windoze 10, that is possibly due some
time in the distant future, restores the start menu:
http://windows.microsoft.com/en-us/windows/preview
but otherwise currently looks like Windoze 8.1.

Incidentally, Halloween was the last day that Microsoft would ship
Windoze 7 licenses to OEMs.

The Dell XPS 8700 seems like a nice machine. However, if you want
performance, I suggest you look at an SSD drive for the OS.
http://www.newegg.com/Internal-SSDs/SubCategory/ID-636
I've had good luck with Samsung 840 EVO series drives (mostly 250GB).
The ritual is simple. I use Acronis True Image 2014 (not 2015) to
clone the hard disk to the SSD. I then replace the hard disk with the
SSD and test everything. When done, I wipe the hard disk, and install
it as a 2nd hard disk. If I need to return everything to stock, I
have the Acronis True Image 2014 backup image with which to recover
the initial installation. Elapsed time on a typical fast system is
about 1 hr.

Before buying anything, I suggest you try LTspice on the new machine.
This is VERY easy with LTspice, which doesn't use the registry or
require admin rights. Just copy the files to a flash drive and it
should work.

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm
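
For example, here is a minimal launcher sketch in Python (the
flash-drive paths are assumptions; point them at wherever you copied
the program):

import subprocess

# Hypothetical flash-drive copy of LTspice IV; adjust the paths.
LTSPICE = r"E:\LTspiceIV\scad3.exe"
INIFILE = r"E:\LTspiceIV\scad3.ini"  # writable ini next to the exe

# -ini makes LTspice use this file instead of %WINDIR%\scad3.ini
subprocess.Popen([LTSPICE, "-ini", INIFILE])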

I have an i7-xxx XPS 8700; LTspice works just dandy on it.

Jamie
 
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm

I need to note this somewhere. Writing to the Windows directory is a
*very* bad idea.

It was standard procedure in Windoze 3.1, where almost all
applications dropped pick_a_name.ini files in the C:\Windows\
directory.

Yes, and Windows 3.1 crashed on a regular basis for just about any
reason whatsoever, just like 95, 98 and ME.

MS has been telling developers since Win2000, and maybe since NT, not to
put data files in the Windows or Program Files directories. Many chose
to ignore this; it wasn't enforced until Vista, where it became one of
the things everyone loves to hate about Vista.


I do have to admit it was handy as the files were easy to
find and save. The new and improved versions of Windoze either hide
these config files in the registry or bury them 5 directory layers
deep, where few can find them without specialized tools or inside
information.

Windows doesn't put anything from an app in the registry. That is up to
the app to decide. Getting to these directories is easy if they used
the right location, C:\ProgramData. Instead they continue to use
C:\Program Files, and now with Win8 MS puts the files in the long path
name you list, but I believe they can be reached transparently through
the path C:\Program Files. So, the best of both worlds.

If the app puts them somewhere else, don't blame windows.
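
For what it's worth, a sketch of where a well-behaved app should put
things, in Python (the "MyApp" name is just a placeholder):

import os
from pathlib import Path

# Per-user settings belong under APPDATA, machine-wide data under
# ProgramData; both environment variables exist on Vista and later.
per_user = Path(os.environ["APPDATA"]) / "MyApp"      # C:\Users\<you>\AppData\Roaming\MyApp
machine  = Path(os.environ["ProgramData"]) / "MyApp"  # C:\ProgramData\MyApp

per_user.mkdir(parents=True, exist_ok=True)
(per_user / "settings.ini").write_text("[ui]\ntheme=dark\n")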


I can't tell you how many developers do all sorts of
things they aren't supposed to under windows. That is the actual cause
of many problems people have running older software under Windows. They
don't listen to the people providing them with the OS!

LTspice (aka SwitcherCAD) is a rather old program, with many of the
traditions of Windoze 3.1 still present. If you don't like that, try
running some of the various NEC antenna modeling programs, that still
use the terms "card" and "deck" from the Hollerith punch card era. The
common mantra is the same everywhere... if it works, don't touch it.

These programs have been updated many, many times since Windows 3.1.
Windows NT, 2k, XP, Vista, 7, 8 and 8.1 aren't even the same OS as the
3.1 tree, which ended when XP was released. Stick with the old
habits and blame yourself or your program maintainer.

I use some open source Windows software that does the same crap and I am
very vocal about the cause and the fix for the problem. Few of the
developers are interested though. Now that 8 makes this (using Program
Files for data) work adequately they no longer have a need to change it.

If you are relying on programming habits from over 20 years ago, then
you will have to stew in your own soup.


Looking at the benchmarks at:
http://fetting.se/images/PC%20Speed%20Benchmark%20running%20LTspice%20circuits.pdf
my Dell Optiplex 755 clunker runs the 3 benchmarks at:
14.5 7.6 3.6
If I upgrade to the fastest machine on the list:
4.0 2.9 1.0
or roughly 3 times faster. Might be worth $1200+.
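
Quick sanity check on that ratio (assuming, as I read the PDF, the
numbers are run times in seconds):

old = [14.5, 7.6, 3.6]  # Optiplex 755
new = [4.0, 2.9, 1.0]   # fastest machine on the list

ratios = [o / n for o, n in zip(old, new)]
print(ratios)                     # [3.625, 2.62..., 3.6]
print(sum(ratios) / len(ratios))  # about 3.3x average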

The database is at:
https://groups.yahoo.com/neo/groups/LTspice/database/2/edit
and shows no Windoze 8.1 benchmarks and no SSD, so those will remain
an unknown. The benchmark files and instructions are at:
https://groups.yahoo.com/neo/groups/LTspice/files/%20Examples/Benchmark/
If you run the benchmark, be sure to add it to the database.

SSD is great for anything that uses virtual memory (god forbid) or runs
for a short time after taking time to load. I would not expect spice to
have issues with disk speed, except that I guess the graph data may be
stored on disk? I seem to recall some of my simulations generating a lot
of data which would have easily overflowed the 3 GB of RAM in my machine
after the OS got done with it.

--

Rick
 
Jeff Liebermann wrote:

On Sun, 02 Nov 2014 12:27:52 -0800, DecadentLinuxUserNumeroUno
<DLU1@DecadentLinuxUser.org> wrote:

Even better than those are the mSATA drives and now, the best... the
M.2 drives.
Not much bigger than a couple of air mail stamps (I date myself).
Way faster than the 2.5" form factor SSD "laptop drive" replacement
family.

The SSD drive I recommended comes in both SATA3 and mSATA
configurations:

http://www.samsung.com/global/business/semiconductor/minisite/SSD/global/html/ssd840evo/overview_mSATA.html

http://www.samsung.com/global/business/semiconductor/minisite/SSD/global/html/ssd840evo/overview.html
The specs look fairly close:

http://www.samsung.com/global/business/semiconductor/minisite/SSD/global/html/ssd840evo/specifications.html

I have an older mSATA drive in my Acer C720 running Linux. Very very
very very fast, but I haven't compared it with a SATA3 drive.

The Samsung 840 series has a speed bug. The fix is in flux, so it is best to
do an internet search for this issue. Samsung released a windows program to
"fix" the problem.

I unfortunately have the 840 in this PC but I run linux, so their fix isn't
so handy.
 
On Sun, 02 Nov 2014 20:04:44 -0800, miso <miso@sushi.com> wrote:

I'm on a quad core Xeon, which means I can have 8 threads. Also a Supermicro
motherboard with 32G of error correcting RAM. Basically a low end
workstation.

LT Spice has been multithreaded for about 5 years now, but the nature of
spice simulation won't lead to a linear speed up with the number of cores.
In fact, your setting of three makes sense based on system monitor analysis.
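
A rough way to see why, in Python: if a fraction s of each timestep's
work is inherently serial, Amdahl's law caps the speedup. The s = 0.2
below is an illustrative guess, not a measurement of LTspice:

def amdahl(n_threads, s=0.2):
    """Ideal speedup when an s fraction of the work is serial."""
    return 1.0 / (s + (1.0 - s) / n_threads)

for n in (1, 2, 3, 4, 8):
    print(n, round(amdahl(n), 2))  # 1.0, 1.67, 2.14, 2.5, 3.33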

If you go with error detection and correction, you need a server grade mobo
like a Supermicro. Not all boards that can use such RAM have the ability to
report back on the amount of correction that occurs.
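
On Linux, if the EDAC driver is loaded, boards that do report
correction activity expose the counters in sysfs; a minimal sketch to
read them:

from pathlib import Path

# Standard EDAC sysfs location; absent if the board or driver
# doesn't report ECC events.
for mc in sorted(Path("/sys/devices/system/edac/mc").glob("mc*")):
    ce = (mc / "ce_count").read_text().strip()  # corrected errors
    ue = (mc / "ue_count").read_text().strip()  # uncorrected errors
    print(f"{mc.name}: corrected={ce} uncorrected={ue}")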

Regarding:
"I have spent too many hours this weekend tweaking the transient
response of a semi-hysteretic (we call it "hysterical") switchmode
constant-current source. There are about 8 interacting knobs to turn.
At 30 seconds per run, understanding the interactions is impossible."

Spice is not a design tool. It is a verification tool.

Of course it's a design tool. Why spend a half hour cranking out a
voltage divider network with a calculator, when you can Spice and
fiddle a solution in a few minutes?

Is a calculator a design tool? Is an equation a design tool?


--

John Larkin Highland Technology, Inc

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com
 
Joerg wrote:

There are so many variants of graphics cards that it would require tons
of work for Mike's team.

It isn't the graphics card as much as the standard of acceleration. ATI and
Nvidia use different standards.

NGspice has CUDA support, which means you need Nvidia.
http://ngspice.sourceforge.net/

You also need an OS that supports CUDA.
 
On 2014-11-03, rickman <gnuarm@gmail.com> wrote:
On 11/2/2014 5:28 PM, Jeff Liebermann wrote:
On Sun, 02 Nov 2014 14:56:04 -0500, rickman <gnuarm@gmail.com> wrote:

One catch. LTspice saves its preferences to:
C:\windows\scad3.ini
which has to be writeable. The fix is to use the
-ini <path>
command line switch, which will:
Specify an .ini file to use other than %WINDIR%\scad3.ini
http://ltwiki.org/LTspiceHelp/LTspiceHelp/Command_Line_Switches.htm

I need to note this somewhere. Writing to the Windows directory is a
*very* bad idea.

It was standard procedure in Windoze 3.1, where almost all
applications dropped pick_a_name.ini files in the C:\Windows\
directory.

Yes, and Windows 3.1 crashed on a regular basis for just about any
reason whatsoever, just like 95, 98 and ME.

MS has been telling developers since Win2000, and maybe since NT, not to
put data files in the Windows or Program Files directories. Many chose
to ignore this; it wasn't enforced until Vista, where it became one of
the things everyone loves to hate about Vista.

? AFAIK Microsoft software is still putting data files there.


--
umop apisdn
 
On 3 Nov 2014 06:20:45 GMT, Jasen Betts <jasen@xnet.co.nz> Gave us:

On 2014-11-02, Joerg <news@analogconsultants.com> wrote:

Only question is, how can one connect two regular PC monitors to this?

http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/product-images

Assuming you consider VGA to be irregular, use an HDMI-to-DVI cable for
the other.

Most video cards even come with an adapter these days.
 
Jeff Liebermann wrote:

On Sun, 02 Nov 2014 21:08:42 -0800, miso <miso@sushi.com> wrote:

The Samsung 840 series has a speed bug. The fix is in flux, so it is best
to do an internet search for this issue. Samsung released a windows
program to "fix" the problem.

I unfortunately have the 840 in this PC but I run linux, so their fix
isn't so handy.

The Linux fix is on the Samsung web pile in the form of a bootable CD:

http://www.samsung.com/global/business/semiconductor/minisite/SSD/global/html/support/downloads.html
Near bottom of page. I've only done the Windoze version once. When
it works, it's quite simple. However, if the "advanced mode" is
required, it's an ordeal. As always, make an image backup before
doing anything this radical.

Thanks. I downloaded the ISO and the instructions.

In the dark ages, you would get a file to put on a floppy to upgrade
firmware. Then the manufacturers decided that they would just distribute
a windows program. Nice, but if you don't run windows, well, not so nice.

My Blu-ray drive has new firmware:
http://www.lg.com/us/support-product/lg-WH14NS40
Only windows solutions. Now it isn't like I don't have a windows box handy,
but this shouldn't be necessary. If windows wasn't so damn expensive, I
could dual boot everything, but I don't like paying the Microsoft tax. I
have three copies of win7 pro. Enough is enough.

I'm going to research the firmware upgrade path prior to buying any more
peripherals.
 
John Larkin wrote:

Spice is not a design tool. It is a verification tool.



Of course it's a design tool. Why spend a half hour cranking out a
voltage divider network with a calculator, when you can Spice and
fiddle a solution in a few minutes?

Is a calculator a design tool? Is an equation a design tool?

A voltage divider. We're talking Ohm's law here. You would really resort to
Spice to design something that can be done simply with middle school
algebra?

Have I used Spice to analyze a resistor divider? Actually yes, but in finite
element analysis to simulate a laser trim procedure. The basic networks are
designed by hand.
 
On 11/02/2014 01:17 PM, Phil Hobbs wrote:
On 11/2/2014 12:45 PM, John Larkin wrote:
On Sun, 02 Nov 2014 11:06:30 -0500, Phil Hobbs
<hobbs@electrooptical.net> wrote:

On 11/2/2014 11:00 AM, John Larkin wrote:
On Sun, 02 Nov 2014 07:25:49 -0800, Joerg <news@analogconsultants.com>
wrote:

Folks,

Need to spiff up my simulation speeds here. IIRC Mike Engelhardt stated
that the Intel i7 is a really good processor for LTspice. According to
this it looks like the 4790 is the fastest of the bunch:

http://www.intel.com/content/www/us/en/processors/core/core-i7-processor.html


So, what do thee say, is the computer in the Costco link below a good
deal for LTSpice purposes?

http://www.costco.com/Dell-XPS-8700-Desktop-%7c-Intel-Core-i7-%7c-1GB-Graphics-%7c-Windows-7-Professional.product.100131208.html


It's also available without MS-Office Home & Student 2013 for $100 less,
but I found that OpenOffice isn't 100% compatible in the Excel area, so
that sounds like an ok deal. My hope is that it can drive two 27"
monitors but I guess I can always add in another graphics card if not.

Reason I am looking at these is that I absolutely positively do not want
any computer with Windows 8 in here and unfortunately that's what many
others come with.

I have spent too many hours this weekend tweaking the transient
response of a semi-hysteretic (we call it "hysterical") switchmode
constant-current source. There are about 8 interacting knobs to turn.
At 30 seconds per run, understanding the interactions is impossible.

I want sliders on each of the part values, and I want to see the
waveforms change as I move the sliders, like they were trimpots on a
breadboard and I was looking at a scope. I need maybe 500 times the
compute power that I have now.

Mike should code LT Spice to execute on a high-end video card.



You can go quite a bit faster with a nice multicore machine--LTspice
lets you choose how many threads to run. My desktop machine (about 3
years old now) runs about 150 Gflops peak. Supermicro is an excellent
vendor.

Cheers

Phil Hobbs

There's a setting for one or two threads. Is that all?


That's because you only have two cores. Mine goes up to 15.

16 actually. Here's a picture:
http://electrooptical.net/pictures/LTspice16threads.png

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
 
On 03/11/2014 04:04, miso wrote:

I'm on a quad core Xeon, which means I can have 8 threads. Also a Supermicro
motherboard with 32G of error correcting RAM. Basically a low end
workstation.

LT Spice has been multithreaded for about 5 years now, but the nature of
spice simulation won't lead to a linear speed up with the number of cores.
In fact, your setting of three makes sense based on system monitor analysis.

The more interesting question is whether the i7 with 4 cores and
hyperthreading to run 8 threads actually provides any better performance
with LTSpice than the corresponding i5 with 4 real cores and no
hyperthreading. Sometimes 5 or 6 threads is optimum but quite often in
search problems it consumes power without speeding it up!

In some chess problems the i5 can be faster and certainly cheaper!

I'd be interested to know if it holds with LTspice too.
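
A crude harness for timing the benchmark circuits in batch mode would
settle it (set the thread count in LTspice's Control Panel between
runs; the install path and file names below are assumptions):

import subprocess
import time

LTSPICE = r"C:\Program Files\LTC\LTspiceIV\scad3.exe"  # adjust

for asc in ("benchmark1.asc", "benchmark2.asc", "benchmark3.asc"):
    t0 = time.perf_counter()
    subprocess.run([LTSPICE, "-b", asc])  # -b = batch mode, no GUI
    print(asc, round(time.perf_counter() - t0, 1), "seconds")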

--
Regards,
Martin Brown
 
On Mon, 03 Nov 2014 01:43:29 -0800, miso <miso@sushi.com> wrote:

John Larkin wrote:

Spice is not a design tool. It is a verification tool.



Of course it's a design tool. Why spend a half hour cranking out a
voltage divider network with a calculator, when you can Spice and
fiddle a solution in a few minutes?

Is a calculator a design tool? Is an equation a design tool?



A voltage divider. We're talking Ohm's law here. You would really resort to
Spice to design something that can be done simply with middle school
algebra?

If it saves me time, absolutely.

Have I used Spice to analyze a resistor divider? Actually yes, but in finite
element analysis to simulate a laser trim procedure. The basic networks are
designed by hand.

Do that if you enjoy it. I'd rather get scut work like that over as
soon as possible, which is often Spice + twiddle.

Here's a divide-and-offset network designed by fiddling. The object
was to map +-10.5 volts or so into the unipolar ADC range with
available parts, already on the BOM. The final close-enough in:out
transfer function was determined by Spice, then plugged into the ARM
code... the opposite of the classic design direction. A bonus is that
the .asc file is part of the permanent design record, and easily
revisited if ever necessary.

This took minutes.

Version 4
SHEET 1 880 680
WIRE 224 -16 112 -16
WIRE 112 16 112 -16
WIRE 224 64 224 -16
WIRE 112 128 112 96
WIRE -192 192 -256 192
WIRE -48 192 -112 192
WIRE 0 192 -48 192
WIRE 48 192 0 192
WIRE 224 192 224 144
WIRE 224 192 128 192
WIRE 400 192 224 192
WIRE 448 192 400 192
WIRE -256 240 -256 192
WIRE 224 240 224 192
WIRE -48 256 -48 192
WIRE -256 368 -256 320
WIRE -48 368 -48 336
WIRE 224 368 224 320
FLAG -256 368 0
FLAG 224 368 0
FLAG 112 128 0
FLAG -48 368 0
FLAG 400 192 ADC
FLAG 0 192 IN
SYMBOL res 208 48 R0
WINDOW 0 60 44 Left 2
WINDOW 3 60 79 Left 2
SYMATTR InstName R1
SYMATTR Value 1K
SYMBOL res 208 224 R0
WINDOW 0 50 37 Left 2
WINDOW 3 51 71 Left 2
SYMATTR InstName R2
SYMATTR Value 1K
SYMBOL res 144 176 R90
WINDOW 0 67 56 VBottom 2
WINDOW 3 73 56 VTop 2
SYMATTR InstName R4
SYMATTR Value 2K
SYMBOL voltage -256 224 R0
WINDOW 0 55 68 Left 2
WINDOW 3 39 109 Left 2
WINDOW 123 0 0 Left 2
WINDOW 39 0 0 Left 2
SYMATTR InstName V1
SYMATTR Value 10.15
SYMBOL voltage 112 0 R0
WINDOW 0 -80 43 Left 2
WINDOW 3 -73 81 Left 2
SYMATTR InstName V2
SYMATTR Value 3
SYMBOL res -64 240 R0
WINDOW 0 51 44 Left 2
WINDOW 3 52 75 Left 2
SYMATTR InstName R5
SYMATTR Value 50
SYMBOL res -96 176 R90
WINDOW 0 -51 49 VBottom 2
WINDOW 3 -37 49 VTop 2
SYMATTR InstName R6
SYMATTR Value 50
TEXT 280 -8 Left 2 ;PIEZO DIVIDER FOR Z354 TEST BOARD
TEXT 376 40 Left 2 ;JL JULY 30 2014
TEXT 352 240 Left 2 ;ARM ADC RANGE IS 0 TO +3
TEXT 488 136 Left 2 !;tran 3
TEXT 344 280 Left 2 ;PIEZO DAC RANGE +-10.15V
TEXT 504 104 Left 2 !.op
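
And if you want to check it by hand afterwards, nodal analysis of the
same network is a few lines (a sketch; numpy solves the 2x2 system for
the IN and ADC nodes):

import numpy as np

R1, R2, R4, R5, R6 = 1e3, 1e3, 2e3, 50.0, 50.0
V2 = 3.0  # offset supply on the ADC side

for V1 in (+10.15, -10.15):
    # Conductance matrix for nodes [IN, ADC] and source currents
    G = np.array([[1/R5 + 1/R6 + 1/R4, -1/R4],
                  [-1/R4, 1/R4 + 1/R1 + 1/R2]])
    I = np.array([V1/R6, V2/R1])
    vin, vadc = np.linalg.solve(G, I)
    print(f"V1 = {V1:+.2f} V -> ADC node = {vadc:.3f} V")

# prints about 2.21 V and 0.20 V: inside the 0..3 V ARM ADC range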

--

John Larkin Highland Technology, Inc

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com
 
On Monday, November 3, 2014 at 21.20.35 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Sunday, November 2, 2014 at 18.24.23 UTC+1, Joerg wrote:
Joerg wrote:
Carl Ijames wrote:
Don't know about computation speed, but this link says the video card will
drive 3 monitors:
http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/specifications.
Looking at Dell's site I don't see any mention of expansion slots, and
looking at the one picture with the cover off I really can't see any sockets
beyond the video card, so if any further expansion is important you need to
ask Dell for clarification.

Looks like you are right:

http://www.dell.com/ed/business/p/xps-8700/pd
http://core0.staticworld.net/images/article/2013/07/1253541_sr-1160-100047019-orig.jpg
http://www.pcworld.com/article/2047487/dell-xps-8700-special-editions-review-a-little-less-performance-for-a-lot-less-cash.html

Quote: "There's only one PCIe x16 slot, which means you won't be able to
add a second video card to take advantage of Nvidia's SLI technology."

No slots. There's one more card in the bottom, not sure what that is.
But if the video can drive three monitors it should be fine, I never
added any cards to my current PC either.

Only question is, how can one connect two regular PC monitors to this?

http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/product-images


I'd expect that you can connect a monitor to each of the three outputs,
VGA, DVI, HDMI. I have an old GeForce and that's how that works.

VGA is not much use, but unless you want to watch something from
Hollywood, DVI and HDMI are the same thing.


I do a lot of video conferencing via web where content moves. Other than
that just CAD, no movie streaming and such.

Then DVI will work just fine. HDMI is just DVI with optional audio and the encryption Hollywood insists on if you bought a Blu-ray movie.

So just plug a monitor into each of the HDMI and DVI outputs.

-Lasse
 
On Monday, November 3, 2014 at 21.58.14 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Monday, November 3, 2014 at 21.20.35 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Sunday, November 2, 2014 at 18.24.23 UTC+1, Joerg wrote:
Joerg wrote:
Carl Ijames wrote:
Don't know about computation speed, but this link says the
video card will drive 3 monitors:
http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/specifications.
Looking at Dell's site I don't see any mention of
expansion slots, and looking at the one picture with the
cover off I really can't see any sockets beyond the video
card, so if any further expansion is important you need to
ask Dell for clarification.

Looks like you are right:

http://www.dell.com/ed/business/p/xps-8700/pd
http://core0.staticworld.net/images/article/2013/07/1253541_sr-1160-100047019-orig.jpg

http://www.pcworld.com/article/2047487/dell-xps-8700-special-editions-review-a-little-less-performance-for-a-lot-less-cash.html


Quote: "There's only one PCIe x16 slot, which means you won't
be able to add a second video card to take advantage of
Nvidia's SLI technology."

No slots. There's one more card in the bottom, not sure what
that is. But if the video can drive three monitors it should
be fine, I never added any cards to my current PC either.

Only question is, how can one connect two regular PC monitors
to this?

http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/product-images


I'd expect that you can connect a monitor to each of the three
outputs, VGA, DVI, HDMI. I have an old GeForce and that's how that
works.

VGA is not much use, but unless you want to watch something from
Hollywood, DVI and HDMI are the same thing.

I do a lot of video conferencing via web where content moves. Other
than that just CAD, no movie streaming and such.


Then DVI will work just fine. HDMI is just DVI with optional audio
and the encryption Hollywood insists on if you bought a Blu-ray movie.


So just plug a monitor into each of the HDMI and DVI outputs.


Ok, but can one be sure that an ordinary cheap 27" 1920*1080 monitor
will plug into either of them? For example, the ViewSonic VA2702w I have
here only has the large DVI connector, not the narrow HDMI. It does have
VGA though which I am using right now (good enough for my purposes).

Yes, for a regular computer monitor HDMI and DVI are the same thing; you
just need the right cable or an adapter to get the wires into the right
holes ;)

-Lasse
 
On Mon, 03 Nov 2014 13:14:47 +0000, Martin Brown
<|||newspam|||@nezumi.demon.co.uk> Gave us:

>In some chess problems the i5 can be faster and certainly cheaper!

Bullshit.
 
On Monday, November 3, 2014 at 22.11.48 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Monday, November 3, 2014 at 21.58.14 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Monday, November 3, 2014 at 21.20.35 UTC+1, Joerg wrote:
Lasse Langwadt Christensen wrote:
On Sunday, November 2, 2014 at 18.24.23 UTC+1, Joerg wrote:
Joerg wrote:
Carl Ijames wrote:
Don't know about computation speed, but this link says the
video card will drive 3 monitors:
http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/specifications.
Looking at Dell's site I don't see any mention of
expansion slots, and looking at the one picture with the
cover off I really can't see any sockets beyond the video
card, so if any further expansion is important you need to
ask Dell for clarification.

Looks like you are right:

http://www.dell.com/ed/business/p/xps-8700/pd
http://core0.staticworld.net/images/article/2013/07/1253541_sr-1160-100047019-orig.jpg

http://www.pcworld.com/article/2047487/dell-xps-8700-special-editions-review-a-little-less-performance-for-a-lot-less-cash.html


Quote: "There's only one PCIe x16 slot, which means you won't
be able to add a second video card to take advantage of
Nvidia's SLI technology."

No slots. There's one more card in the bottom, not sure what
that is. But if the video can drive three monitors it should
be fine, I never added any cards to my current PC either.

Only question is, how can one connect two regular PC monitors
to this?

http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/product-images


I'd expect that you can connect a monitor to each of the three
outputs, VGA, DVI, HDMI. I have an old GeForce and that's how that
works.

VGA is not much use, but unless you want to watch something from
Hollywood, DVI and HDMI are the same thing.

I do a lot of video conferencing via web where content moves. Other
than that just CAD, no movie streaming and such.

Then DVI will work just fine. HDMI is just DVI with optional audio
and the encryption Hollywood insists on if you bought a Blu-ray movie.


So just plug a monitor into each of the HDMI and DVI outputs.

Ok, but can one be sure that an ordinary cheap 27" 1920*1080 monitor
will plug into either of them? For example, the ViewSonic VA2702w I have
here only has the large DVI connector, not the narrow HDMI. It does have
VGA though which I am using right now (good enough for my purposes).


Yes, for a regular computer monitor HDMI and DVI are the same thing; you
just need the right cable or an adapter to get the wires into the right
holes ;)


So then here in the photo the center one is HDMI and the right one is
DVI, and that's where the two monitors should go? I could also hook
one up to VGA like I have now.

yes, you can hook up three monitors

http://www.geforce.com/hardware/desktop-gpus/geforce-gt-720/product-images

Says dual-link or DVI-D for the DVI connector in the specs, whatever
that means.

Single-link is three differential pairs; dual-link has three extra pairs
that are used for the higher bandwidth needed for very high resolutions.

AFAIR single-link DVI is limited to 1920x1200 at 60 Hz.
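
The arithmetic, for what it's worth (the blanking totals are the usual
CVT reduced-blanking and CEA figures; treat them as assumptions for
other modes):

def pixel_clock_MHz(h_total, v_total, fps):
    return h_total * v_total * fps / 1e6

# Single-link DVI tops out at a 165 MHz TMDS clock.
print(pixel_clock_MHz(2080, 1235, 60))  # 1920x1200@60 CVT-RB: ~154 MHz, fits
print(pixel_clock_MHz(2200, 1125, 60))  # 1920x1080@60 CEA: 148.5 MHz, fits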


-Lasse
 
On 03/11/2014 16:26, DecadentLinuxUserNumeroUno wrote:
On Mon, 03 Nov 2014 13:14:47 +0000, Martin Brown
<|||newspam|||@nezumi.demon.co.uk> Gave us:

In some chess problems the i5 can be faster and certainly cheaper!

Bullshit.

You demonstrate clearly that you are an inarticulate moron.
Why am I not surprised?

It is easy enough to do the tests and see for yourself. Hyperthreading
can get in the way of fast multithreading in some larger problems.

--
Regards,
Martin Brown
 
On Mon, 03 Nov 2014 16:48:29 +0000, Martin Brown
<|||newspam|||@nezumi.demon.co.uk> Gave us:

On 03/11/2014 16:26, DecadentLinuxUserNumeroUno wrote:
On Mon, 03 Nov 2014 13:14:47 +0000, Martin Brown
<|||newspam|||@nezumi.demon.co.uk> Gave us:

In some chess problems the i5 can be faster and certainly cheaper!

Bullshit.

You demonstrate clearly that you are an inarticulate moron.
Why am I not surprised?

It is easy enough to do the tests and see for yourself. Hyperthreading
can get in the way of fast multithreading in some larger problems.

Again.. you spout crap, but have no clue about actual operation or
function.

I can turn OFF HT in my Motherboard BIOS setup.

I will bet you that your chess app runs the same in both settings, if
the code never uses it.

So, no... it typically NEVER "gets in the way", and YOU need to cite
a condition where it does, not merely spout off horseshit as if you are
a fucking professor.

Your brain benchmark takes a big hit when you pull this stupid
guesswork bullshit.
 
On Monday, November 3, 2014 at 22.42.14 UTC+1, rickman wrote:
On 11/3/2014 4:28 PM, Joerg wrote:
rickman wrote:
On 11/3/2014 3:51 PM, Joerg wrote:
DecadentLinuxUserNumeroUno wrote:

[...]

Since I cannot afford to put $1000 into a Titan video card, I miss out
on a few benchmarks with my $250 GTX650.


I am not at all concerned about video because that's just used for
static display and sometimes video conferencing. No games, no movies.

If you are going for power, you need to have separate video memory or
the video eats memory bandwidth which is often the limiting factor on a
multicore machine.

I haven't kept up with the hotrod machines these days, but I'd be
willing to bet you will get a lot better performance with multi-banked
RAM. Does this machine have two or more memory interfaces or just one?


No clue. But with SPICE the graphics action is very slow, just a wee
progress of a few traces on an otherwise static screen. And you could
even turn that off.

You aren't grasping the concept. Video memory needs a sizable bandwidth
to *display* the image to the screen. All the data that goes out over
your HDMI cable is being read from memory *all the time*. You're a
bright boy. Do the math... 1920*1080*60 times 3 or 4 bytes per pixel.

With anything but a graphics card integrated in the chipset, that memory
will be on the card itself; 1920*1080*24-bit is less than 7 MB.
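
Back-of-envelope, for the numbers being argued about:

w, h, fps = 1920, 1080, 60

framebuffer_MB = w * h * 3 / 1e6        # 24-bit color: ~6.2 MB
scanout_MBps   = w * h * fps * 4 / 1e6  # 32 bits/pixel readout: ~500 MB/s

print(framebuffer_MB, scanout_MBps)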

-Lasse
 
"John Larkin" wrote in message
news:0nkc5aljhec5r36ptkoaqbt0a48ud2j5vo@4ax.com...


I have spent too many hours this weekend tweaking the transient
response of a semi-hysteretic (we call it "hysterical") switchmode
constant-current source. There are about 8 interacting knobs to turn.
At 30 seconds per run, understanding the interactions is impossible.

Only 30 seconds? Luxury. The bulk of my sims at work are in the 15-minute
to 1-hour range; some take overnight.

I want sliders on each of the part values, and I want to see the
waveforms change as I move the sliders, like they were trimpots on a
breadboard and I was looking at a scope. I need maybe 500 times the
compute power that I have now.

It doesn't look to me like you need raw compute power; you want a SPICE
with real-time control of components for this particular, er...
ahem... "pissing about" design approach.

You can actually do this in SS. If real-time mode is enabled, with marching
waveforms also enabled, changing components like resistors and caps will
update the simulation matrix immediately and the marching waveform will
reflect this. There are other SPICE programs out there that are better
designed for real-time technician twiddling, though.

The, er... hmmm... more professional way is to set up multi-way sweeps of
all the pots you want to twiddle, and examine the resulting set of graphs.
You can gain intuition by doing this, even after the fact. That's what
those who generate millions of ASIC chips do anyway.

:)

Kevin Aylward
www.kevinaylward.co.uk
www.anasoft.co.uk - SuperSpice
 
