AREF bypass capacitance on ATMega2560?

krw@attt.bizz wrote:
On Sat, 07 Sep 2013 15:23:39 -0700, Joerg <invalid@invalid.invalid>
wrote:

rickman wrote:
On 9/7/2013 4:46 PM, Joerg wrote:
Paul Rubin wrote:
Joerg<invalid@invalid.invalid> writes:
I don't see how the equivalent of a TMS320 or a big MSP430 could fit
into one of these small Lattice devices.
I had thought the parts of those processors that would bloat up badly
(instruction decode etc.) are pretty simple so the overall effect of the
bloat is ok in the scheme of things. The parts doing the most work
(memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
adders connected to the LUT's somehow) as efficiently as on the MCU's.

I do think softcores seem like a silly idea a lot of the time, and am
looking forward to more low end FPGA's with MCU blocks.

Much of it has to do with legacy code. Yes, some things could even be
done more efficiently in the FPGA because you can actually streamline
the HW to the task, something neither uC nor DSP allow. For example, why
have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
But legacy code won't run anymore and you need FPGA specialists to make
it all work.
No, you would need a DSP specialist. The FPGA designer only needs to
know how to code the FPGA.

So for this kind of solution in an FPGA you need a DSP specialist and an
FPGA specialist? That would be a problem.
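
To make the custom-width point above concrete, here is a minimal Verilog
sketch of a multiplier that is only as wide as the data requires; the module
name and widths are illustrative, not from any actual design:

module mul_n #(
    parameter WIDTH = 23   // e.g. 23 bits instead of a fixed 32-bit unit
) (
    input  wire               clk,
    input  wire [WIDTH-1:0]   a, b,
    output reg  [2*WIDTH-1:0] p
);
    // Synthesis maps this onto only as many DSP slices / LUTs as WIDTH
    // needs, rather than the fixed-width multiplier a uC or DSP carries.
    always @(posedge clk)
        p <= a * b;
endmodule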

Pick ones that are in the automotive market. Support for fifteen
years is required.

That's a good point. How does one find out which ones those are?

Mil stuff is even better, that needs to remain available for decades.
That is the reason why the LM331 is still around and why I used it in a
long-life design many years ago.


But that is exactly the point of the FPGA in DSP apps. You code to the
app, not to a processor.

How long do the usual FPGAs stay in the market? Meaning plop-in
replaceable, same footprint, same code, no changes.

The usual? About 30 minutes. ;-)

:)

--
Regards, Joerg

http://www.analogconsultants.com/
 
On 9/8/2013 4:04 AM, John Devereux wrote:
rickman<gnuarm@gmail.com> writes:

On 9/7/2013 4:24 AM, John Devereux wrote:
rickman<gnuarm@gmail.com> writes:

If your FPGA designs are expensive or power hungry, then you are doing
things you can't do in an MCU or you are not using FPGAs properly.
They don't need to use any more power than an MCU and in many cases
less. They certainly don't need to be significantly more expensive
unless you consider every dollar in your designs. At the very low end
MCUs can be under $1 and still have reasonable performance. For $1
you can't get much in the way of programmable logic. For $3 however,
you can get a chip large enough for a CPU (with math) and room for
your special logic.

I've never used an FPGA, microcontrollers have increased in speed faster
than my needs so far. So I can usually bitbang everything or use a
peripheral. I used PLDs for glue logic back in the day but that's it. Oh
and I bought a small Xilinx dev kit which I got to make an LED flash, then
put in a drawer for 15 years.

So your use of MCUs is based on inertia?

Partly I suppose.

Or I could say that my projects so far all require a microcontroller
anyway, and it seemed likely that a separate FPGA was always going to be
more expensive than, say, choosing a faster CPU.

An STM32F4 can bitbang a PIO at 84 MHz. (It can't do anything else then,
but still...)

I think what you are saying is that the MCU is a key part of your design
and you use a lot of code in it. Ok, if your emphasis is on using a
commercial MCU that will do the job. But unless your MCU needs are just
too large for something that fits in an FPGA, you have it backwards in
my opinion. Why have both when you can just use an FPGA?


But could you give an example of your $3 one? Or a favorite?

A startup company called Silicon Blue came out with a line of FPGAs
targeted to the high volume, low power market that exists for portable
devices. They were preparing their second device family and were
bought by Lattice Semi. The first family was dropped and the second
family is the iCE40 (for 40 nm). They are very low power although
smallish. The largest one has 8 kLUTs, the smallest 384 LUTs.

Last winter I was looking at designing a very low power radio
controlled clock to run in one of these. They were still playing a
shell game with the devices in the lineup and the 640 LUT part I
wanted to use was dropped... :( The only real problem I have with
these devices is the packaging. Because of the target market the
packages are mostly fine pitch BGAs. Great if you are making a cell
phone, not so great if you are designing other equipment.

You can get the 1 kLUT parts for under $3 and possibly the 4 kLUT
parts. It has been a while since I got a quote. The 1 kLUT part is
big enough for a soft core MCU plus some custom logic.

OK, thanks, will check them out.

I haven't gotten a quote on these parts since they were bought by
Lattice. I'd appreciate a pricing update if you get one. They should
be able to do a lot better than the Digikey price, I know Xilinx and
Altera always do. Heck, the Digikey pricing for most FPGAs doesn't go
above qty 1... if nothing else there should be some quantity price breaks.

--

Rick
 
On 9/8/2013 11:19 AM, krw@attt.bizz wrote:
On Sat, 07 Sep 2013 13:23:48 -0400, rickman<gnuarm@gmail.com> wrote:

On 9/7/2013 11:17 AM, krw@attt.bizz wrote:
On Fri, 06 Sep 2013 23:59:59 -0400, rickman<gnuarm@gmail.com> wrote:

On 9/6/2013 7:10 PM, Joerg wrote:
That is often the problem. Sometimes a buck fifty is the pain threshold.
Not in this ATMega case, the 2560 is very expensive but comes with lots
of ADC and analog muxes and all that. Things that will cost extra with a
FPGA solution and eat real estate.


For $3 however, you can get a
chip large enough for a CPU (with math) and room for your special logic.


For $3 I can get a big DSP.

What "big" DSP can you get for $3? It has been a while since I looked
hard at DSP chips, but I don't recall any I would call remotely "big"
for $3. The TI chips that would be "big" are the TMS6xxx line which
start somewhere around $20 the last time I looked and that requires all
memory and I/O to be separate. The smaller DSP chips that you can get
for the $3 range are not "big" in any sense and only a very few of them
include Flash memory. So you still need another chip.

We pay less than that for the largest of the ADI sigma DSPs. I just
received a quote for the smallest CPLD for around $.75. I have use
for CPLDs and FPGAs but they're simply too expensive for most of my
applications.

It seems the prices have come down in recent years, but still, the parts
I have seen have no Flash. So you need to add in that cost. But the
Sigma parts aren't really general purpose. They are good if you can
make your app fit the DSP design, otherwise they aren't much use. I
pursued them hard a few years ago until an FAE just threw in the towel
and said I couldn't do my app on their part.

Good grief. The issue wasn't to show YOU that YOUR application was
better in a DSP. Like many FPGA weenies, you're trying to sell a part
that has a niche market as the universal hammer.

Good grief is right. You don't need to be rude. It isn't just my
application, the Sigma parts are designed for a very limited set of DSP
apps and even the development software limits how you design with them.
They won't do the job of *most* DSP apps.


Even a "small" FPGA can run rings around a DSP when it comes to
performance. Usually "big" in DSPs means fast and when you want really
fast DSP you use an FPGA with all the parallelism you can handle. DSPs
can't touch FPGAs for speed, even with low power.

Comparing the two is silly. Each has its place.

That makes no sense.

Hammer, meet nail.

If you don't want to discuss engineering, then please spare me.


There will always be some designs that a given
part is a perfect fit for, but that doesn't mean different devices can't
be compared. The question is what is the best fit for a given job.

That is *NOT* what you're arguing. You're making the general case
that FPGA >> DSP >> uC, which is just silly.

I am hearing some say that FPGAs aren't the best fit and I find they often
are a better fit than an MCU.

Hammer, meet nail.

You are repeating yourself.


Much of it has to do with mis-information
about what FPGAs can and can't do and what is required to make them run.

Nonsense.

Just read Joerg's post.

I have.

Much of the stuff he objects to is specific
to the individual devices he has worked with.

Like DSPs. I agree with him. FPGAs aren't in his future. You keep
sugar-coating FPGAs and (erroneously) tear down DSPs. Note that I'm
more of an FPGA kind of guy than a DSP sort but in this case Joerg is
absolutely right. FPGAs only compete in small niche markets and those
where money is no object.

No one is tearing down DSPs. Can you just stick to the engineering and
skip the drama?

Your statement is exactly the sort of "mis-information" I am talking
about. At $3 I think you can use an FPGA in a low cost app. So your
"money is no object" claim is just BS.


What I often find is people only doing Altera or only Xilinx. With uC
it's a bit easier, a PIC guy can be cajoled into programming an AVR,
usually.

I'm totally device agnostic. I have worked with all brands other than
MicroSemi (formerly Actel). I even worked with Lucent which was bought
by Lattice and I believe is still sold and supported (but not the GD XP
line which I had designed into a cash cow product and will have to
redesign now). Ever hear of Concurrent? They were bought by Atmel.
Their devices were followed by the AT40K. I worked with the Concurrent
devices. lol So you can see I go way back.

The difference anymore is very small. The only reason I prefer one
over the other is software and that takes a back seat to most other
variables (in rough order of importance, 1. cost, 2. cost, 3. cost).

I have not found a big difference in software. The software is
different, but those differences are not important. It all compiles my
HDL fine (mostly because they often use the same third party tool
vendors) and simulation just works these days.

The software is different in how it works, not what it does. That
difference makes *NO* difference to the end result or the cost of the
product. IOW, it's completely irrelevant. At one time it may have
been important but only in so much as that much of it didn't work
(making the hardware useless).

That is what I am saying. I find little difference in how the tools
work. You write your HDL in your editor or their built in editor, you
simulate it using the free tool they provide and you compile to a bit
stream that gets downloaded into the FPGA, Flash or RAM. No, the tools
aren't going to look exactly the same, but they do the same job and work
the same way. Most of the tool is actually third party anyway, except
for the Xilinx in-house HDL compiler. But then most FPGA professionals
(read as working for a company that has a few bucks) pay for the third
party tools anyway.


The one feature that isn't universal is programming modes. This can
make a big difference in indirect costs (field upgrade, SKU
personalization, etc.) that may not show up directly on the raw BOM.

I don't know what devices you work with, but the ones I use are easy to
program.

Pile on more sugar. You clearly don't work where time is money.

Ok, very convincing argument. I have no idea what you are talking
about. Downloading an FPGA is no different than an MCU. You either
attach a cable for JTAG or you use the resources you designed into your
target.


I've used schematic based tools and both VHDL and Verilog. I've worked
with the vendor's tools and third party tools including the NeoCAD tools
which became Xilinx tools when Xilinx bought them.

If anyone tells you they only know one brand of FPGA you are talking to
an FPGA weenie. I find MCUs to vary a *great* deal more than FPGAs in
terms of usage. MCUs need all sorts of start up code and peripheral
drivers, clock control, etc, etc, etc. FPGAs not so much. They mostly
have the same features and most of that can be inferred from the HDL so
you never need to look too hard under the hood.
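
As a sketch of what "inferred from the HDL" means in practice: plain
behavioral code like the following maps to a block RAM in any of the major
vendors' tools, with no vendor-specific primitives. Names and sizes are
illustrative:

module ram_1k #(
    parameter AW = 10,   // 1K words
    parameter DW = 8
) (
    input  wire          clk,
    input  wire          we,
    input  wire [AW-1:0] addr,
    input  wire [DW-1:0] din,
    output reg  [DW-1:0] dout
);
    reg [DW-1:0] mem [0:(1<<AW)-1];

    always @(posedge clk) begin
        if (we)
            mem[addr] <= din;
        dout <= mem[addr];   // synchronous read is what lets the tools
                             // infer a block RAM instead of registers
    end
endmodule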

Sure, the feature set and peripherals of micros varies widely. We use
a variety of SoCs from just about everyone. Since most are settling
on ARM, switching from one to the other is pretty simple. Our last
port from one manufacturer to the other took a couple of weeks.

The CPU is the easy part to port, the compiler handles that for you. It
is the drivers for the I/O that are harder.

That's all included in the port. I'm talking from working hardware to
working hardware (the target system not qualified, of course). There
is only about 10% of the code that even has to be looked at.

With FPGAs *none* of the code has to be looked at.


Their libraries have to have
compatible interfaces and every port is a port.

Wrong. That's all included.

You have identical peripheral interface libraries for different brands
of MCUs? Every timer function works the same, every SPI port sets up
the same, every power controller is operated the same?


With FPGAs, all you
need to do to switch between brands is normally a new pin list and
timing constraints.

Bullshit! More sugar!

You are the consummate debater...


The HDL just compiles to suit the new device.

Oh, you never use libraries? Yet you (erroneously) add that cost into
the DSP/uC bucket.

The only libraries I use are the HDL libraries which are standardized,
like using stdio. I don't add the libraries into the MCU column, I'm
not the one using the MCU.


It has been a while since I ported between brands but it would make sense
if they provide tools to port the timing constraints. That is the only
part that might be any work at all.

In short, there is a lot of FUD about FPGAs. Talk to someone who
doesn't buy into the FUD.

The FUD is on both sides. The support costs aren't as low as you
pretend.

Care to elaborate?

You've TOTALLY forgotten about simulation, for instance. That's a
huge effort that you simply sweep under the rug.

What about it? I paid for a set of tools from Lattice. I had used the
Modelsim simulator at work and was used to it. The Lattice tools said
they came with the Modelsim simulator. But by the time I got the
package it had the Active HDL simulator. I complained about this
thinking I would have to learn a new simulator... but it was a no-op to
switch. Even my Modelsim scripts ran under the AHDL simulator.

So what is your concern?
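
For what it's worth, the source such simulator scripts drive is ordinary
HDL. A minimal self-checking testbench, in the spirit of the multiplier
sketch earlier in the thread (all names illustrative), would be:

`timescale 1ns/1ps
module mul_n_tb;
    reg         clk = 0;
    reg  [22:0] a, b;
    wire [45:0] p;

    mul_n #(.WIDTH(23)) dut (.clk(clk), .a(a), .b(b), .p(p));

    always #5 clk = ~clk;          // 100 MHz clock

    initial begin
        a = 23'd1000;
        b = 23'd3000;
        @(posedge clk);            // result is registered on this edge
        #1;
        if (p !== 46'd3000000) $display("FAIL: p = %0d", p);
        else                   $display("PASS");
        $finish;
    end
endmodule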


Things quickly unravel when you start relying on real hardware that is
on uC but not on FPGA. Comparators, ADCs, analog muxes, for example.

If you really need it all on a single chip, then yes, you won't find
that on so many FPGAs although Microsemi has their Fusion line with
analog. My cash cow uses a single FPGA and a stereo CODEC. That was
smaller than any MCU design because the MCU would still require the
CODEC (CD quality) and some of the control logic and interface could not
be done with any conventional MCU. I had to vary the speed of the CODEC
clock via an ADPLL to synchronize it with an incoming data stream. I
don't know how to do that with an MCU and no logic. But I can do it all
with FPGA logic and no MCU.
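
The ADPLL idea can be sketched in a few lines of Verilog: a numerically
controlled oscillator generates the CODEC clock, and a bang-bang phase
comparison against the incoming stream nudges its frequency word. This is
only an illustration of the technique under assumed signal names, not the
actual product design:

module adpll #(
    parameter ACC_W   = 24,
    parameter NOM_INC = 24'd1398101   // nominal frequency word, illustrative
) (
    input  wire clk,          // system clock
    input  wire stream_tick,  // strobe on incoming-stream frame edges
    output wire codec_clk
);
    reg [ACC_W-1:0] acc = 0;          // NCO phase accumulator
    reg [ACC_W-1:0] inc = NOM_INC;    // current frequency word

    assign codec_clk = acc[ACC_W-1];  // MSB of the accumulator is the clock

    always @(posedge clk) begin
        acc <= acc + inc;
        // Bang-bang phase detector: sample the local phase at each
        // incoming frame edge and nudge the NCO toward lock.
        if (stream_tick) begin
            if (acc[ACC_W-1]) inc <= inc - 1'b1;  // running early: slow down
            else              inc <= inc + 1'b1;  // running late: speed up
        end
    end
endmodule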

Nonsense. DSPs are also available with CODECs, as are UCs.

You can find a small number of DSPs with CD quality CODECs and the
same for MCUs. I know, I did this search recently. I didn't find much
and none that suited my other criteria. So the redo of my board will
likely have another FPGA on it.

Goal post shift added to the hammer.

I would appreciate a list of the MCUs/DSPs which have stereo CD quality
CODECs on chip. The Sigma parts from ADI don't count because their DSPs
can *only* be used for certain coding like filters, not general purpose
use.

Sigmas have them. I haven't looked for others.

But Sigmas aren't general purpose DSPs and can't do most DSP jobs. They
are designed to be filters, like in hearing aids.


Last week I reviewed a design with some larger FPGA on there. What I
found fairly disgusting was how much they had to be babied with the
power sequencing. uCs don't have that problem.

If you want to work with the wrong device, then you will find it hard to
work with. There are still single voltage devices on the market. If
this was an old design, most likely it was a Spartan 3 or similar era
device when they (for still unknown reasons) used three, yes, count
them, *three* voltages on the FPGA. The 2.5 volt aux supply was there
solely for the configuration interface which was normally to a 3.3 volt
device! Only from Xilinx...

If this was a new device, then I guess they picked one based on
something other than ease of use, eh? Don't assume all FPGAs are the same.

I thought you just said that there weren't many differences between
FPGA manufacturers?

You are mixing apples and oranges. One manufacturer has many different
families of FPGAs, no? Some are huge power hungry devices that burn a
hole in your board. Others are much lower power and don't burn a hole
in your pocketbook either.

The families all look the same and vary only in density and mix of
memory, speed, MCU, DSP(hmm), and other features.

Good grief, you're arguing both sides.

We've been down the road before. You are not enjoyable to discuss
things with. You get obnoxious and don't explain what you are talking
about. Do you really want to have this discussion? I don't think I do.

--

Rick
 
rickman <gnuarm@gmail.com> writes:

On 9/8/2013 4:04 AM, John Devereux wrote:
rickman<gnuarm@gmail.com> writes:

On 9/7/2013 4:24 AM, John Devereux wrote:
rickman<gnuarm@gmail.com> writes:

If your FPGA designs are expensive or power hungry, then you are doing
things you can't do in an MCU or you are not using FPGAs properly.
They don't need to use any more power than an MCU and in many cases
less. They certainly don't need to be significantly more expensive
unless you consider every dollar in your designs. At the very low end
MCUs can be under $1 and still have reasonable performance. For $1
you can't get much in the way of programmable logic. For $3 however,
you can get a chip large enough for a CPU (with math) and room for
your special logic.

I've never used an FPGA, microcontrollers have increased in speed faster
than my needs so far. So I can usually bitbang everything or use a
peripheral. I used PLDs for glue logic back in the day but that's it. Oh
and I bought a small Xilinx dev kit which I got to make an LED flash, then
put in a drawer for 15 years.

So your use of MCUs is based on inertia?

Partly I suppose.

Or I could say that my projects so far all require a microcontroller
anyway, and it seemed likely that a separate FPGA was always going to be
more expensive than, say, choosing a faster CPU.

An STM32F4 can bitbang a PIO at 84 MHz. (It can't do anything else then,
but still...)

I think what you are saying is that the MCU is a key part of your
design and you use a lot of code in it.

Yes, basically. "a lot" being only e.g. about 64k probably, not much for
an MCU but would push the price up for an FPGA, I think.

Ok, if your emphasis is on using a commercial MCU that will do the
job. But unless your MCU needs are just too large for something that
fits in an FPGA, you have it backwards in my opinion. Why have both
when you can just use an FPGA?

I'm pretty sure that a FPGA with enough RAM would be far too expensive
(compared to the $3 200 MIPS CPU).

A M3 or M4 with attached FPGA + memories would be interesting, if it was
at a reasonable price.

NXP have a M4 with attached M0 which sort of goes in that direction; the
M0 does the more deterministic simple stuff, the M4 does the number
crunching and runs the more complicated software.

But could you give an example of your $3 one? Or a favorite?

[...]

You can get the 1 kLUT parts for under $3 and possibly the 4 kLUT
parts. It has been a while since I got a quote. The 1 kLUT part is
big enough for a soft core MCU plus some custom logic.

OK, thanks, will check them out.

I haven't gotten a quote on these parts since they were bought by
Lattice. I'd appreciate a pricing update if you get one. They should
be able to do a lot better than the Digikey price, I know Xilinx and
Altera always do. Heck, the Digikey pricing for most FPGAs doesn't go
above qty 1... if nothing else there should be some quantity price
breaks.

Unfortunately I don't really have a live application, so would only be
able to buy them as "education" at this stage.

--

John Devereux
 
On 9/8/2013 11:05 AM, Joerg wrote:
rickman wrote:
On 9/7/2013 7:45 PM, Joerg wrote:
rickman wrote:
On 9/7/2013 6:23 PM, Joerg wrote:

How long do the usual FPGAs stay in the market? Meaning plop-in
replaceable, same footprint, same code, no changes.

Life span is typically *much* better than MCUs.

First, there are *no* second sources so whatever chip family you select
is the only one that will fit your layout. ...


That is one of my concerns. With 8051 uCs you have multiple sources as
long as you stick to customary packages such as a 44-pin flat-pack.

Yes, but 8051s aren't DSPs either, are they? You seem to be switching
gears again. I can't keep up. I know you do different designs, but can
the FPGA be wrong for *all* of them? You seem to have all requirements
for all designs.


I do various designs, sometimes simultaneously. For DSP we often just
plop down a TMS320 bare-bones edition and be done with it. It's like
buying a Ford F-150 for the ranch, it may be too big but it is not
expensive and you almost can't go wrong with it. I had designs where the
DSP workload ended up at 5% but at $3 a pop nobody was concerned.

And no, mostly I don't even have the requirements until after the
project already started. Sometimes weeks down the road the sensor guys
call in, "Houston, we have a problem". This is the kind of project
companies like to use consultants for, since it can be utterly
frustrating for engineers. Us guys are used to this stuff.


... There *may* be more than one
member of that family that will fit the same socket, that is common, but
not guaranteed. So you often will get a choice of two, three or four
sizes and you often get an upgrade path from your first selection. Just
in case you are familiar with the compilation process, in an FPGA you
*always* have to recompile for the target. Even if they are pin
compatible you can't load a design for a whatever-02 chip into a
whatever-03 part. Those are the limitations.


Yeah, that I was aware of. And changing to a whatever-03 would be a
major headache in many of my cases. Because it's medical, aerospace or
similar, where that can trigger a complete re-cert.

Why would you need to change to the whatever-03? Once it is qualified
you can stick with it. My point is that you have flexibility in the
device, no one is making you switch.


I was thinking about the case where whatever-02 becomes unobtanium.

In the FPGA world, they don't obsolete individual devices although they
do *very seldom* obsolete a package. They obsolete a family. So if
whatever-02 becomes obsolete, so does whatever-03. Fortunately this is
usually after a *very* long life and even then they usually make the
parts available through one of the extended life makers.


As to the market life, that is typically well over 10 years. Spartan 3
was introduced some 10 years ago and it is not yet the oldest chip
Xilinx has in current full production. I'm still considering using it
for new designs. Similar situation for Altera. ...


Well over 10 years is good. But only if that means no change to any new
versions that require a re-compile. Early on in my career that happened
and one guy promptly got busy with three months of regression testing.
Oh what fun.

Why not talk to the vendors?


We did that and all we got was a "Sorry about that". The designed-in
device was discontinued.


... I was just burned by
Lattice announcing EOL of their XP line. This was because they got a
new guy in at the top with a new broom I suppose.


Not so cool :-(

Yeah, I'm unhappy about it. I thought I could get more development
funds for a redo but the division reselling this board in their product
doesn't want to spend any cash on it. I've been asked to spend my dime
and I likely will. I make good money on this product.


Looks like a good business opportunity for you :)

It's a better opportunity if I don't have to redesign it. I am ready to
retire and this was bringing cash in with minimal effort. Very sporadic
cash, but cash nonetheless.


I'm sure you can find various MCUs which have been in production for 10
years, but I know Atmel likes to replace products from time to time with
similar "pin compatible" devices which are 99.9% compatible. I expect
for the 8 bit parts life span is not such an issue. For the larger
parts I expect life span is a bit more limited and for the top end
chips, I'm pretty sure their life span is measured in double digit
months. Can you still buy any of the Pentium 4s that were all over the
place seven or eight years ago?


Yup:

http://components.arrow.com/part/detail/41596500S6440784N2936?region=na


... I can't even find a Core 2 Duo.


No problem either:

http://components.arrow.com/part/detail/42952225S9497728N2936?region=na

How do you know these are the parts that were designed in the system of
interest? They made a huge number of variants and I know I have seen
EOL notices for Pentium 4s.


Well, you do have to look in the schematics. You only asked whether one
can still buy Pentium 4 and I said yes, and gave evidence. Are you
changing the game now? :)

Legacy stuff in the PC world does not go away fast. To this day you can
still easily buy brand-new ISA-bus PCs. Because scores of them are used
in production facilities. I helped replace one a few years ago and it
also had a processor from the days of Methuselah.

I wasn't aware that conventional PCs were used in industry that much.


What lifespan have you seen for MCUs?


The 89C51 I designed in in the mid-90's is still living. Not sure how
long it was in production when I designed it in. The nice thing is that
these are made by several companies, even Asian ones such as Winbond. So
it was no surprise when I took apart our pellet stove for maintenance
and found one of those in there as well.

2nd source is important to me, and my clients.

If you really need that level of consistency, then you will be using
nothing but 8051s all your career. I don't know of any digital
component that has lived as long as the 8051 other than perhaps LS-TTL.
I also don't know of any other MCU that is second sourced. If the
8051 does what you need, then go for it. But again you are mixing
conversations. That's why it is so frustrating to have a conversation
with you. ...


I merely said it matters in some cases. Not in all cases.


... You talk about not being able to use a part unless it has a
product life as long as the 8051 and then you talk about using various
DSP chips in the same context. I *know* you won't be able to buy those
DSP chips 10 years from now. TI just doesn't provide that level of
support unless they have a special program for long lived parts I'm not
aware of. I've seen lightly selling DSPs drop from the marketplace
after less than 5 years.


Well, let me show you a blast from the past ...

http://www.rocelec.com/search/finished/TMS320C10NL/0/1/contains/?utm_source=supplyFrame&utm_medium=buyNow

20,286 in stock, ready to ship.


The DSP market was just a tiny exploration by TI initially. Then they
saw cell phones as a way to utilize that capability. They actually
reorganized the entire company to take full advantage of it. As a
result they ended up with four segments for DSPs.


Analog Devices had the market first, they really ruled in the early
90's. We had boards with about a dozen 16-bit FP DSPs on there.

90's??? TI came out with their DSP in the early 80's by my memory.
They then captured the cell market. The FP DSPs were not the bread and
butter of DSP. Without the cell market DSPs would likely still be
rather a niche.


1) Cell phone devices - small, low power and cheap in large quantities.
Not much need for longevity at all... basically the C5xxx line.

2) Cell base stations - powerful devices that can handle multiple
channels, power consumption not important and cost is secondary. This
is the C6xxx line. Again, they focus on new, not longevity.

3) Scientific DSP - floating point. C67xx lines. Relatively low
volumes compared to the other two, but they seem to think it is an
important market. New designs are not as frequent. Longevity might be
better than the other two, but no promises.

4) Motor control, white goods, etc - fixed point with price the major
factor. These have appeared in a range of variations, some with flash,
some with ADCs, etc. These are almost MCUs with simlar performance,
slow compared to segment 1 and 2. Intended for high volume apps, but
again, longevity is not important.

So if you are going to consider DSPs for your apps, I expect you would
be looking at the last category. I'm pretty sure I wouldn't be
designing from this group if I wanted to be building this board 10 years
from now though. Have you talked to TI about longevity?


Not yet. That comes if I decide to have a DSP in a project. But mostly
my clients have those discussions because that's their turf, I am more
the analog guys. On large projects stuff gets put in writing about
guaranteed years of supply. On some chips it goes as far as putting the
mask data in escrow, especially with smaller companies where there is a
chance of them going belly-up down the road.

I only wish I could do that, but making chips from masks can be an
expensive proposition if the fab is being shut down. It would seem that
is what is behind the end of the XP series from Lattice.

--

Rick
 
On 9/8/2013 12:34 PM, Joerg wrote:
krw@attt.bizz wrote:

Pick ones that are in the automotive market. Support for fifteen
years is required.


That's a good point. How does one find out which ones those are?

All FPGA makers offer automotive lines. They advertise them as such.


Mil stuff is even better, that needs to remain available for decades.
That is the reason why the LM331 is still around and why I used it in a
long-life design many years ago.

You won't find many $3 parts in a MIL line...

--

Rick
 
rickman wrote:
On 9/8/2013 11:05 AM, Joerg wrote:
rickman wrote:
On 9/7/2013 7:45 PM, Joerg wrote:
rickman wrote:

[...]

... There *may* be more
than one
member of that family that will fit the same socket, that is
common, but
not guaranteed. So you often will get a choice of two, three or four
sizes and you often get an upgrade path from your first selection.
Just
in case you are familiar with the compilation process, in an FPGA you
*always* have to recompile for the target. Even if they are pin
compatible you can't load a design for a whatever-02 chip into a
whatever-03 part. Those are the limitations.


Yeah, that I was aware of. And changing to a whatever-03 would be a
major headache in many of my cases. Because it's medical, aerospace or
similar, where that can trigger a complete re-cert.

Why would you need to change to the whatever-03? Once it is qualified
you can stick with it. My point is that you have flexibility in the
device, no one is making you switch.


I was thinking about the case where whatever-02 becomes unobtanium.

In the FPGA world, they don't obsolete individual devices although they
do *very seldom* obsolete a package. They obsolete a family. So if
whatever-02 becomes obsolete, so does whatever-03. Fortunately this is
usually after a *very* long life and even then they usually make the
parts available through one of the extended life makers.

It would be good to know which manufacturers have the best reputation
regarding longevity. I've seen a few cases where programmable logic was
discontinued and that has caused a lot of grief. But I do not remember
the brands because it wasn't really my turf.

Often longevity is way more important than performance.

[...]

We did that and all we got was a "Sorry about that". The designed-in
device was discontinued.


... I was just burned by
Lattice announcing EOL of their XP line. This was because they got a
new guy in at the top with a new broom I suppose.


Not so cool :-(

Yeah, I'm unhappy about it. I thought I could get more development
funds for a redo but the division reselling this board in their product
doesn't want to spend any cash on it. I've been asked to spend my dime
and I likely will. I make good money on this product.


Looks like a good business opportunity for you :)

It's a better opportunity if I don't have to redesign it. I am ready to
retire ...

Lucky you. It's still some time away for me and I'll probably never
fully retire, electronics design is fun. But I already talked with a
neighbor about making beer together once I slow down the EE design work
a bit. He is retired and I used to brew when I was at the university.


... and this was bringing cash in with minimal effort. Very sporadic
cash, but cash nonetheless.

If the grand total per year is worth it, why not?

I'm sure you can find various MCUs which have been in production
for 10
years, but I know Atmel likes to replace products from time to time
with
similar "pin compatible" devices which are 99.9% compatible. I expect
for the 8 bit parts life span is not such an issue. For the larger
parts I expect life span is a bit more limited and for the top end
chips, I'm pretty sure their life span is measured in double digit
months. Can you still buy any of the Pentium 4s that were all over
the
place seven or eight years ago?


Yup:

http://components.arrow.com/part/detail/41596500S6440784N2936?region=na


... I can't even find a Core 2 Duo.


No problem either:

http://components.arrow.com/part/detail/42952225S9497728N2936?region=na

How do you know these are the parts that were designed in the system of
interest? They made a huge number of variants and I know I have seen
EOL notices for Pentium 4s.


Well, you do have to look in the schematics. You only asked whether one
can still buy Pentium 4 and I said yes, and gave evidence. Are you
changing the game now? :)

Legacy stuff in the PC world does not go away fast. To this day you can
still easily buy brand-new ISA-bus PCs. Because scores of them are used
in production facilities. I helped replace one a few years ago and it
also had a processor from the days of Methuselah.

I wasn't aware that conventional PCs were used in industry that much.

Oh yeah, it's all PCs there. Even the CNC stuff ultimately gets
controlled by someone on a PC. Lots of legacy devices because production
machines remain in service for decades.

At church they just switched to Apple. <sigh> So today when I looked at
the song projector software I couldn't make heads or tails of it.
That'll take a learning curve.

[...]

The DSP market was just a tiny exploration by TI initially. Then they
saw cell phones as a way to utilize that capability. They actually
reorganized the entire company to take full advantage of it. As a
result they ended up with four segments for DSPs.


Analog Devices had the market first, they really ruled in the early
90's. We had boards with about a dozen 16-bit FP DSPs on there.

90's??? TI came out with their DSP in the early 80's by my memory. They
then captured the cell market. The FP DSPs were not the bread and
butter of DSP. Without the cell market DSPs would likely still be
rather a niche.

It was actually 1990. Analog Devices had the medical devices market
pretty much covered AFAICT and our product was medical. IIRC it was the
ADSP-2105 and you can still buy them today. Except now they want over
$20 apiece. Highway robbery :)

[...]


4) Motor control, white goods, etc - fixed point with price the major
factor. These have appeared in a range of variations, some with flash,
some with ADCs, etc. These are almost MCUs with similar performance,
slow compared to segment 1 and 2. Intended for high volume apps, but
again, longevity is not important.

So if you are going to consider DSPs for your apps, I expect you would
be looking at the last category. I'm pretty sure I wouldn't be
designing from this group if I wanted to be building this board 10 years
from now though. Have you talked to TI about longevity?


Not yet. That comes if I decide to have a DSP in a project. But mostly
my clients have those discussions because that's their turf, I am more
the analog guys. On large projects stuff gets put in writing about
guaranteed years of supply. On some chips it goes as far as putting the
mask data in escrow, especially with smaller companies where there is a
chance of them going belly-up down the road.

I only wish I could do that, but making chips from masks can be an
expensive proposition if the fab is being shut down. It would seem that
is what is behind the end of the XP series from Lattice.

It is a hassle if the whole process is discontinued. But there are IC
houses that cater to this market. They'll take the old design and
migrate it onto a newer process. However, that only works if you have
one of those escrow deals. The important thing is to make the original
design not depend on the sluggishness of contemporary ICs because on a
new process you might get the same IC functionality but on steroids.

--
Regards, Joerg

http://www.analogconsultants.com/
 
rickman wrote:
On 9/8/2013 12:34 PM, Joerg wrote:
krw@attt.bizz wrote:

Pick ones that are in the automotive market. Support for fifteen
years is required.


That's a good point. How does one find out which ones those are?

All FPGA makers offer automotive lines. They advertise them as such.

Ok then, next time I'll ask them. When I selected 125C devices Atmel
dropped out of the list at Digikey. Could it be that they don't do
automotive?

Mil stuff is even better, that needs to remain available for decades.
That is the reason why the LM331 is still around and why I used it in a
long-life design many years ago.

You won't find many $3 parts in a MIL line...

The LM331 costs slightly above $1 in bulk. But it only comes in
through-hole packages.

--
Regards, Joerg

http://www.analogconsultants.com/
 
On Sun, 08 Sep 2013 09:34:38 -0700, Joerg <invalid@invalid.invalid>
wrote:

krw@attt.bizz wrote:
On Sat, 07 Sep 2013 15:23:39 -0700, Joerg <invalid@invalid.invalid>
wrote:

rickman wrote:
On 9/7/2013 4:46 PM, Joerg wrote:
Paul Rubin wrote:
Joerg<invalid@invalid.invalid> writes:
I don't see how the equivalent of a TMS320 or a big MSP430 could fit
into one of these small Lattice devices.
I had thought the parts of those processors that would bloat up badly
(instruction decode etc.) are pretty simple so the overall effect of the
bloat is ok in the scheme of things. The parts doing the most work
(memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
adders connected to the LUT's somehow) as efficiently as on the MCU's.

I do think softcores seem like a silly idea a lot of the time, and am
looking forward to more low end FPGA's with MCU blocks.

Much of it has to do with legacy code. Yes, some things could even be
done more efficiently in the FPGA because you can actually streamline
the HW to the task, something neither uC nor DSP allow. For example, why
have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
But legacy code won't run anymore and you need FPGA specialists to make
it all work.
No, you would need a DSP specialist. The FPGA designer only needs to
know how to code the FPGA.

So for this kind of solution in an FPGA you need a DSP specialist and an
FPGA specialist? That would be a problem.

Pick ones that are in the automotive market. Support for fifteen
years is required.


That's a good point. How does one find out which ones those are?

Click on the "Automotive" tab on the applications page in their web
site? Look for the AEC-Q compliance designations?

Mil stuff is even better, that needs to remain available for decades.
That is the reason why the LM331 is still around and why I used it in a
long-life design many years ago.

The difference is that while Mil stuff may be available until the end
of times, that doesn't mean that it'll be priced within the range of
mere mortals for that time.

But that is exactly the point of the FPGA in DSP apps. You code to the
app, not to a processor.

How long do the usual FPGAs stay in the market? Meaning plop-in
replaceable, same footprint, same code, no changes.

The usual? About 30 minutes. ;-)

:)
 
On Sun, 08 Sep 2013 13:01:02 -0700, Joerg <invalid@invalid.invalid>
wrote:

rickman wrote:
On 9/8/2013 12:34 PM, Joerg wrote:
krw@attt.bizz wrote:

Pick ones that are in the automotive market. Support for fifteen
years is required.


That's a good point. How does one find out which ones those are?

All FPGA makers offer automotive lines. They advertise them as such.


Ok then, next time I'll ask them. When I selected 125C devices Atmel
dropped out of the list at Digikey. Could it be that they don't do
automotive?

http://www.altera.com/end-markets/auto/aut-index.html
http://www.latticesemi.com/automotive
http://www.xilinx.com/applications/automotive/index.htm

Atmel makes a lot of automotive stuff but apparently they see no
reason to go there with FPGAs. Don't blame them.
 
On Sun, 08 Sep 2013 13:42:04 -0400, rickman <gnuarm@gmail.com> wrote:

On 9/8/2013 4:04 AM, John Devereux wrote:
rickman<gnuarm@gmail.com> writes:

On 9/7/2013 4:24 AM, John Devereux wrote:
rickman<gnuarm@gmail.com> writes:

If your FPGA designs are expensive or power hungry, then you are doing
things you can't do in an MCU or you are not using FPGAs properly.
They don't need to use any more power than an MCU and in many cases
less. They certainly don't need to be significantly more expensive
unless you consider every dollar in your designs. At the very low end
MCUs can be under $1 and still have reasonable performance. For $1
you can't get much in the way of programmable logic. For $3 however,
you can get a chip large enough for a CPU (with math) and room for
your special logic.

I've never used an FPGA, microcontrollers have increased in speed faster
than my needs so far. So I can usually bitbang everything or use a
peripheral. I used PLDs for glue logic back in the day but that's it. Oh
and I bought a small Xilinx dev kit which I got to make an LED flash, then
put in a drawer for 15 years.

So your use of MCUs is based on inertia?

Partly I suppose.

Or I could say that my projects so far all require a microcontroller
anyway, and it seemed likely that a separate FPGA was always going to be
more expensive than, say, choosing a faster CPU.

An STM32F4 can bitbang a PIO at 84 MHz. (It can't do anything else then,
but still...)

I think what you are saying is that the MCU is a key part of your design
and you use a lot of code in it. Ok, if your emphasis is on using a
commercial MCU that will do the job. But unless your MCU needs are just
too large for something that fits in an FPGA, you have it backwards in
my opinion. Why have both when you can just use an FPGA?

First, you're assuming an FPGA (hammer).
Second, you're going to need a bigger FPGA (even bigger hammers aren't
free).
 
krw@attt.bizz wrote:
On Sun, 08 Sep 2013 09:34:38 -0700, Joerg <invalid@invalid.invalid>
wrote:

krw@attt.bizz wrote:
On Sat, 07 Sep 2013 15:23:39 -0700, Joerg <invalid@invalid.invalid>
wrote:

rickman wrote:
On 9/7/2013 4:46 PM, Joerg wrote:
Paul Rubin wrote:
Joerg<invalid@invalid.invalid> writes:
I don't see how the equivalent of a TMS320 or a big MSP430 could fit
into one of these small Lattice devices.
I had thought the parts of those processors that would bloat up badly
(instruction decode etc.) are pretty simple so the overall effect of the
bloat is ok in the scheme of things. The parts doing the most work
(memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
adders connected to the LUT's somehow) as efficiently as on the MCU's.

I do think softcores seem like a silly idea a lot of the time, and am
looking forward to more low end FPGA's with MCU blocks.
Much of it has to do with legacy code. Yes, some things could even be
done more efficiently in the FPGA because you can actually streamline
the HW to the task, something neither uC nor DSP allow. For example, why
have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
But legacy code won't run anymore and you need FPGA specialists to make
it all work.
No, you would need a DSP specialist. The FPGA designer only needs to
know how to code the FPGA.

So for this kind of solution in an FPGA you need a DSP specialist and an
FPGA specialist? That would be a problem.
Pick ones that are in the automotive market. Support for fifteen
years is required.

That's a good point. How does one find out which ones those are?

Click on the "Automotive" tab on the applications page in their web
site? Look for the AEC-Q compliance designations?

I usually go in via Digikey. No automotive tab in that segment but the
temperature range gives it away.


Mil stuff is even better, that needs to remain available for decades.
That is the reason why the LM331 is still around and why I used it in a
long-life design many years ago.

The difference is that while Mil stuff may be available until the end
of times, that doesn't mean that it'll be priced within the range of
mere mortals for that time.

Not the certified parts or rad-hard. But with other semiconductors it
usually means that the consumer-grade stuff remains available. Like the
LM331 which they even brought out in lead-free which the military folks
wouldn't even touch. At $4 not exactly a bargain but in situations where
you need true analog V/F conversion that is a great help.

[...]

--
Regards, Joerg

http://www.analogconsultants.com/
 
On Sun, 08 Sep 2013 14:20:46 -0700, Joerg <invalid@invalid.invalid>
wrote:

krw@attt.bizz wrote:
On Sun, 08 Sep 2013 09:34:38 -0700, Joerg <invalid@invalid.invalid>
wrote:

krw@attt.bizz wrote:
On Sat, 07 Sep 2013 15:23:39 -0700, Joerg <invalid@invalid.invalid>
wrote:

rickman wrote:
On 9/7/2013 4:46 PM, Joerg wrote:
Paul Rubin wrote:
Joerg<invalid@invalid.invalid> writes:
I don't see how the equivalent of a TMS320 or a big MSP430 could fit
into one of these small Lattice devices.
I had thought the parts of those processors that would bloat up badly
(instruction decode etc.) are pretty simple so the overall effect of the
bloat is ok in the scheme of things. The parts doing the most work
(memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
adders connected to the LUT's somehow) as efficiently as on the MCU's.

I do think softcores seem like a silly idea a lot of the time, and am
looking forward to more low end FPGA's with MCU blocks.
Much of it has to do with legacy code. Yes, some things could even be
done more efficiently in the FPGA because you can actually streamline
the HW to the task, something neither uC nor DSP allow. For example, why
have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
But legacy code won't run anymore and you need FPGA specialists to make
it all work.
No, you would need a DSP specialist. The FPGA designer only needs to
know how to code the FPGA.

So for this kind of solution in an FPGA you need a DSP specialist and an
FPGA specialist? That would be a problem.
Pick ones that are in the automotive market. Support for fifteen
years is required.

That's a good point. How does one find out which ones those are?

Click on the "Automotive" tab on the applications page in their web
site? Look for the AEC-Q compliance designations?


I usually go in via Digikey. No automotive tab in that segment but the
temperature range gives it away.

Not at all. Automotive and industrial temperatures are quite alike
but the qualifications certainly are not. Automotive needs 85C
ambient. What that does to Tj varies by component.
Mil stuff is even better, that needs to remain available for decades.
That is the reason why the LM331 is still around and why I used it in a
long-life design many years ago.

The difference is that while Mil stuff may be available until the end
of times, that doesn't mean that it'll be priced within the range of
mere mortals for that time.


Not the certified parts or rad-hard. But with other semiconductors it
usually means that the consumer-grade stuff remains available. Like the
LM331 which they even brought out in lead-free which the military folks
wouldn't even touch. At $4 not exactly a bargain but in situations where
you need true analog V/F conversion that is a great help.

No, it sometimes means that they've squirreled away enough parts to
satisfy (infinite pockets) Uncle Sam. Yes, Mil stuff has that evil
lead in it, but again, the price bathtubs are quite different (and not
symmetrical).
 
krw@attt.bizz wrote:
On Sun, 08 Sep 2013 14:20:46 -0700, Joerg <invalid@invalid.invalid>
wrote:

krw@attt.bizz wrote:
On Sun, 08 Sep 2013 09:34:38 -0700, Joerg <invalid@invalid.invalid>
wrote:

krw@attt.bizz wrote:
On Sat, 07 Sep 2013 15:23:39 -0700, Joerg <invalid@invalid.invalid>
wrote:

rickman wrote:
On 9/7/2013 4:46 PM, Joerg wrote:
Paul Rubin wrote:
Joerg<invalid@invalid.invalid> writes:
I don't see how the equivalent of a TMS320 or a big MSP430 could fit
into one of these small Lattice devices.
I had thought the parts of those processors that would bloat up badly
(instruction decode etc.) are pretty simple so the overall effect of the
bloat is ok in the scheme of things. The parts doing the most work
(memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
adders connected to the LUT's somehow) as efficiently as on the MCU's.

I do think softcores seem like a silly idea a lot of the time, and am
looking forward to more low end FPGA's with MCU blocks.
Much of it has to do with legacy code. Yes, some things could even be
done more efficiently in the FPGA because you can actually streamline
the HW to the task, something neither uC nor DSP allow. For example, why
have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
But legacy code won't run anymore and you need FPGA specialists to make
it all work.
No, you would need a DSP specialist. The FPGA designer only needs to
know how to code the FPGA.

So for this kind of solution in an FPGA you need a DSP specialist and an
FPGA specialist? That would be a problem.
Pick ones that are in the automotive market. Support for fifteen
years is required.

That's a good point. How does one find out which ones those are?
Click on the "Automotive" tab on the applications page in their web
site? Look for the AEC-Q compliance designations?

I usually go in via Digikey. No automotive tab in that segment but the
temperature range gives it away.

Not at all. Automotive and industrial temperatures are quite alike
but the qualifications certainly are not. Automotive needs 85C
ambient. What that does to Tj varies by component.

85C in a vehicle? Yikes. I would not dare to design that in.

--
Regards, Joerg

http://www.analogconsultants.com/
 
On Sun, 08 Sep 2013 14:03:47 -0400, rickman <gnuarm@gmail.com> wrote:

On 9/8/2013 11:19 AM, krw@attt.bizz wrote:
On Sat, 07 Sep 2013 13:23:48 -0400, rickman<gnuarm@gmail.com> wrote:

On 9/7/2013 11:17 AM, krw@attt.bizz wrote:
On Fri, 06 Sep 2013 23:59:59 -0400, rickman<gnuarm@gmail.com> wrote:

On 9/6/2013 7:10 PM, Joerg wrote:
That is often the problem. Sometimes a buck fifty is the pain threshold.
Not in this ATMega case, the 2560 is very expensive but comes with lots
of ADC and analog muxes and all that. Things that will cost extra with a
FPGA solution and eat real estate.


For $3 however, you can get a
chip large enough for a CPU (with math) and room for your special logic.


For $3 I can get a big DSP.

What "big" DSP can you get for $3? It has been a while since I looked
hard at DSP chips, but I don't recall any I would call remotely "big"
for $3. The TI chips that would be "big" are the TMS6xxx line which
start somewhere around $20 the last time I looked and that requires all
memory and I/O to be separate. The smaller DSP chips that you can get
for the $3 range are not "big" in any sense and only a very few of them
include Flash memory. So you still need another chip.

We pay less than that for the largest of the ADI sigma DSPs. I just
received a quote for the smallest CPLD for around $.75. I have use
for CPLDs and FPGAs but they're simply too expensive for most of my
applications.

It seems the prices have come down in recent years, but still, the parts
I have seen have no Flash. So you need to add in that cost. But the
Sigma parts aren't really general purpose. They are good if you can
make your app fit the DSP design, otherwise they aren't much use. I
pursued them hard a few years ago until an FAE just threw in the towel
and said I couldn't do my app on their part.

Good grief. The issue wasn't to show YOU that YOUR application was
better in a DSP. Like many FPGA weenies, you're trying to sell a part
that has a niche market as the universal hammer.

Good grief is right. You don't need to be rude. It isn't just my
application, the Sigma parts are designed for a very limited set of DSP
apps and even the development software limits how you design with them.
They won't do the job of *most* DSP apps.

That doesn't alter the fact that you're constantly moving the goal
posts. You *are* defending the FPGA as the general solution when it
is quite decidedly only applicable in the niches. You wanted to know
what DSP had a CODEC. I told you but now you whine that it won't
solve YOUR problem. It's not my job to do your work.

Even a "small" FPGA can run rings around a DSP when it comes to
performance. Usually "big" in DSPs means fast and when you want really
fast DSP you use an FPGA with all the parallelism you can handle. DSPs
can't touch FPGAs for speed, even with low power.

Comparing the two is silly. Each has its place.

That makes no sense.

Hammer, meet nail.

If you don't want to discuss engineering, then please spare me.

If you refuse to understand, why do you bother coming here?

There will always be some designs that a given
part is a perfect fit for, but that doesn't mean different devices can't
be compared. The question is what is the best fit for a given job.

That is *NOT* what you're arguing. You're making the general case
that FPGA >> DSP >> uC, which is just silly.

I am hearing some say that FPGAs aren't the best fit and I find they often
are a better fit than an MCU.

Hammer, meet nail.

You are repeating yourself.

Only because you are repeating your silly FPGA uber alles nonsense.

Much of it has to do with mis-information
about what FPGAs can and can't do and what is required to make them run.

Nonsense.

Just read Joerg's post.

I have.

Much of the stuff he objects to is specific
to the individual devices he has worked with.

Like DSPs. I agree with him. FPGAs aren't in his future. You keep
sugar-coating FPGAs and (erroneously) tear down DSPs. Note that I'm
more of an FPGA kind of guy than a DSP sort but in this case Joerg is
absolutely right. FPGAs only compete in small niche markets and those
where money is no object.

No one is tearing down DSPs. Can you just stick to the engineering and
skip the drama?

WTF are you talking about? You *are* saying that FPGAs are the cat's
ass for all applications when nothing could be further from the truth.
They're a niche and the solution of last resort. Well, an ASIC is
the solution of last resort, but...

Your statement is exactly the sort of "mis-information" I am talking
about. At $3 I think you can use an FPGA in a low cost app. So your
"money is no object" claim is just BS.

Absolutely wrong. FPGAs are *only* useful when there is no other
choice. For anything else, they will always be the most expensive
solution. Your position is exactly a nail looking at every tool as a
hammer.

<snip>

I have not found a big difference in software. The software is
different, but those differences are not important. It all compiles my
HDL fine (mostly because they often use the same third party tool
vendors) and simulation just works these days.

The software is different in how it works, not what it does. That
difference makes *NO* difference to the end result or the cost of the
product. IOW, it's completely irrelevant. At one time it may have
been important but only in so much as that much of it didn't work
(making the hardware useless).

That is what I am saying. I find little difference in how the tools
work. You write your HDL in your editor or their built-in editor, you
simulate it using the free tool they provide and you compile to a bit
stream that gets downloaded into the FPGA, Flash or RAM. No, the tools
aren't going to look exactly the same, but they do the same job and work
the same way. Most of the tool is actually third party anyway, except
for Xilinx's in-house HDL compiler. But then most FPGA professionals
(read as working for a company that has a few bucks) pay for the third
party tools anyway.
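
To make "work the same way" concrete: the Xilinx command-line flow of
that era (ISE) was xst to synthesize, then ngdbuild, map and par to
translate and place-and-route, then bitgen to produce the bit stream
you download. Other vendors differ mainly in the spelling of the tool
names; drive it from a makefile and the GUI differences stop mattering.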


The one feature that isn't universal is programming modes. This can
make a big difference in indirect costs (field upgrade, SKU
personalization, etc.) that may not show up directly on the raw BOM.

I don't know what devices you work with, but the ones I use are easy to
program.

Pile on more sugar. You clearly don't work where time is money.

Ok, very convincing argument. I have no idea what you are talking
about. Downloading an FPGA is no different than an MCU. You either
attach a cable for JTAG or you use the resources you designed into your
target.
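
For the "resources you designed into your target" case, here is a
sketch of what an MCU-driven load can look like, assuming a Xilinx-style
slave-serial interface and hypothetical gpio_set()/gpio_get() board
helpers (handshake recalled from the Spartan-era configuration guides,
so check the real datasheet before building it):

    /* Sketch only: configure an FPGA over slave serial from an MCU.
       gpio_set()/gpio_get() and the pin names are hypothetical. */
    #include <stdint.h>
    #include <stddef.h>

    extern void gpio_set(int pin, int level);  /* board support, assumed */
    extern int  gpio_get(int pin);
    enum { PROG_B, INIT_B, CCLK, DIN, DONE };  /* board-specific pins */

    int fpga_configure(const uint8_t *bits, size_t len)
    {
        gpio_set(PROG_B, 0);              /* pulse PROG_B: clear old config */
        gpio_set(PROG_B, 1);
        while (!gpio_get(INIT_B))         /* wait out memory clearing */
            ;
        for (size_t i = 0; i < len; i++)  /* shift the bitstream, MSB first */
            for (int b = 7; b >= 0; b--) {
                gpio_set(DIN, (bits[i] >> b) & 1);
                gpio_set(CCLK, 1);        /* DIN is sampled on this edge */
                gpio_set(CCLK, 0);
            }
        return gpio_get(DONE);            /* DONE high = device configured */
    }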

You clearly don't simulate. You've completely ignored the issue here.
Like the elephant in the phone booth, FPGA zealots want to ignore it.
Simulation is much more work than design, yet it is always forgotten
in these discussions.

<snip>

The CPU is the easy part to port; the compiler handles that for you. It
is the drivers for the I/O that are harder.

That's all included in the port. I'm talking from working hardware to
working hardware (the target system not qualified, of course). There
is only about 10% of the code that even has to be looked at.

With FPGAs *none* of the code has to be looked at.

Oh, good grief! You never have to rework libraries? Roll your own
for what was free in the other vendor's toolchain? All of the
peripherals function the same across vendors? Come on, get real!

Their libraries have to have
compatible interfaces and every port is a port.

Wrong. That's all included.

You have identical peripheral interface libraries for different brands
of MCUs? Every timer function works the same, every SPI port sets up
the same, every power controller is operated the same?

That's all in the two weeks. From working code to working code. We
just did it from a TI to Freescale SoC.

...yet you claim that you can do that all between X and A without even
looking at the code. Unbelievable!

With FPGAs, all you
need to do to switch between brands is normally a new pin list and
timing constraints.
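
To put lines on that claim (constraint syntax quoted from memory, so
treat it as a sketch, not gospel): the Xilinx UCF side might read

  NET "clk" LOC = "P35" | IOSTANDARD = LVCMOS33 ;
  NET "clk" PERIOD = 20 ns ;

and the rough Lattice LPF equivalent is

  LOCATE COMP "clk" SITE "21" ;
  FREQUENCY PORT "clk" 50 MHz ;

The HDL itself doesn't change a character.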

Bullshit! More sugar!

You are the consummate debater...

I call bullshit when I see bullshit and that is *BULLSHIT*.

The HDL just compiles to suit the new device.

Oh, you never use libraries? Yet you (erroneously) add that cost into
the DSP/uC bucket.

The only libraries I use are the HDL libraries which are standardized,
like using stdio. I don't add the libraries into the MCU column, I'm
not the one using the MCU.

What bullshit. There are no PCIe libraries? USB? uC? GMAFB!

It has been a while since I ported between brands but it would make sense
if they provide tools to port the timing constraints. That is the only
part that might be any work at all.

In short, there is a lot of FUD about FPGAs. Talk to someone who
doesn't buy into the FUD.

The FUD is on both sides. The support costs aren't as low as you
pretend.

Care to elaborate?

You've TOTALLY forgotten about simulation, for instance. That's a
huge effort that you simply sweep under the rug.

What about it? I paid for a set of tools from Lattice.

Tools? TOOLS?! What do they do all by themselves?

I had used the
Modelsim simulator at work and was used to it. The Lattice tools said
they came with the Modelsim simulator. But by the time I got the
package it had the Active-HDL simulator. I complained about this,
thinking I would have to learn a new simulator... but it was a no-op to
switch. Even my Modelsim scripts ran under the Active-HDL simulator.
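
For anyone wondering what such a script amounts to, a minimal one is
on the order of (file and entity names illustrative):

  vcom top.vhd tb.vhd
  vsim work.tb
  run -all

Active-HDL evidently accepts the same vcom/vsim/run vocabulary, which
is why the switch cost me nothing.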

So what is your concern?

Simulation takes TIME and SKILL that most don't have. That's in
addition to the skills and time needed to do a uC or DSP solution. If
there is *any* way to do a project with either, the FPGA loses. That's
the whole point here; FPGAs are a solution to niches - always will be.

<snip>

I would appreciate a list of the MCUs/DSPs which have stereo CD-quality
CODECs on chip. The Sigma parts from ADI don't count because their DSPs
can *only* be used for certain coding like filters, not general-purpose
use.

Sigmas have them. I haven't looked for others.

But Sigmas aren't general purpose DSPs and can't do most DSP jobs. They
are designed to be filters, like in hearing aids.

They do a lot and some better than other DSPs. It was an example, to
counter your point. BTW, I'm not trying to say that DSPs are the
end-all solution, like you are trying to, with FPGAs. ...and like I
said, I've done way more FPGA design than DSP (I only do hardware).


The families all look the same and vary only in density and mix of
memory, speed, MCU, DSP (hmm), and other features.

Good grief, you're arguing both sides.

We've been down the road before. You are not enjoyable to discuss
things with. You get obnoxious and don't explain what you are talking
about. Do you really want to have this discussion? I don't think I do.

I can't help it if you don't like being called on your bullshit. If
you don't like being called on it, don't bullshit.
 
On Sun, 08 Sep 2013 15:57:29 -0700, Joerg <invalid@invalid.invalid>
wrote:

<snip>

Pick ones that are in the automotive market. Support for fifteen
years is required.

That's a good point. How does one find out which ones those are?
Click on the "Automotive" tab on the applications page in their web
site? Look for the ACQ- compliance designations?

I usually go in via Digikey. No automotive tab in that segment but the
temperature range gives it away.

Not at all. Automotive and industrial temperatures are quite alike
but the qualifications certainly are not. Automotive needs 85C
ambient. What that does to Tj varies by component.


85C in a vehicle? Yikes. I would not dare to design that in.

It's a fact of life. Think JimT's car interior, parked in the
driveway in August. Design to 85C isn't just a good idea, it's the
spec (-40C to 85C).
 
krw@attt.bizz wrote:
On Sun, 08 Sep 2013 15:57:29 -0700, Joerg <invalid@invalid.invalid
wrote:

<snip>

I usually go in via Digikey. No automotive tab in that segment but the
temperature range gives it away.
Not at all. Automotive and industrial temperatures are quite alike
but the qualifications certainly are not. Automotive needs 85C
ambient. What that does to Tj varies by component.

85C in a vehicle? Yikes. I would not dare to design that in.

It's a fact of life. Think JimT's car interior, parked in the
driveway in August.

The last measurement I remember for under the dash where electronics
often live was 190F on a 98F day, with sun exposure and windows up.
Meaning the classic situation in the airport economy lot where the car
bakes 8-10h per day. That is over 87C, and this assumes the electronics
do not run and thus do not generate heat in addition. Now imagine a 115F
day which is not unusual in places like AZ or NM.
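
For the arithmetic: (190 - 32) x 5/9 = 87.8C, so the dash reading is
past the 85C limit before the electronics dissipate their first
milliwatt.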


... Design to 85C isn't just a good idea, it's the
spec (-40C to 85C).

It is a bad spec. The result of such flawed design evidences itself over
and over. For example, the minivan of friends of ours would not start
when parked at a mall on a hot day after more than 15 mins of driving. It
would (sometimes) come back to life if you let it sit for half an hour
so the radiated engine heat became less. In the winter it was mostly ok.
That design is IMHO junk.

--
Regards, Joerg

http://www.analogconsultants.com/
 
krw@attt.bizz writes:
Like DSPs... You keep sugar-coating FPGAs and (erroneously) tearing
down DSPs. ... FPGAs only compete in small niche markets and those
where money is no object.

I liked this article and it's one of the things that has me interested
in FPGA's:

http://www.yosefk.com/blog/how-fpgas-work-and-why-youll-buy-one.html

The attractive thing there (compared to DSP's) is having multiple DSP
blocks right there on the FPGA, even in fairly cheap ones. Even at low
clock rates you can outcompute a pretty serious DSP by using those
multipliers in parallel. And more and more powerful function blocks
(hard macros) on the FPGA are on their way. But, for computing directly
in the LUTs, there is obviously a price to be paid.
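
A rough feel for that price, with typical assumed figures rather than
any vendor's numbers: a 32-bit adder maps onto about 32 LUTs thanks to
the dedicated carry chains, but an 18x18 multiplier built purely from
LUTs runs to several hundred of them and clocks slower than the hard
DSP block it replaces. Hence the pattern of hard macros doing the
arithmetic and LUTs doing the glue.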
 
On Sun, 08 Sep 2013 17:15:18 -0700, Joerg <invalid@invalid.invalid>
wrote:

<snip>

85C in a vehicle? Yikes. I would not dare to design that in.

It's a fact of life. Think JimT's car interior, parked in the
driveway in August.


The last measurement I remember for under the dash where electronics
often live was 190F on a 98F day, with sun exposure and windows up.
Meaning the classic situation in the airport economy lot where the car
bakes 8-10h per day. That is over 87C, and this assumes the electronics
do not run and thus do not generate heat in addition. Now imagine a 115F
day which is not unusual in places like AZ or NM.


... Design to 85C isn't just a good idea, it's the
spec (-40C to 85C).


It is a bad spec. The result of such flawed design evidences itself over
and over. For example, the minivan of friends of ours would not start
when parked at a mall on a hot day after more than 15 mins of driving. It
would (sometimes) come back to life if you let it sit for half an hour
so the radiated engine heat became less. In the winter it was mostly ok.
That design is IMHO junk.

Oh, the spec I mentioned above isn't for the engine compartment or any
of the ignition or safety gadgets. It's for the noise makers. ;-)
Yes, that's the unpowered temperature. Temp rise has to be added to
that.
 
rickman <gnuarm@gmail.com> writes:
You have a bias against soft cores because you want to analyze them in
a meaningless way. How about analyzing them in the terms that you
care about?

Nothing I have seen indicates any problem with my analysis, which is
grounded in basic knowledge of how these circuits work. FPGA's don't
run on magic pixie dust. They have the same gates, memory cells,
etc. that other chips do. If you've got some evidence saying otherwise,
I think it's on you to put up the numbers.

The GA144 isn't 0.5 Watts either; it is close to 1 Watt with all nodes
running.

Still not too bad.

I don't have much confidence GA will be around in 10 years. Do you
know of one major design win they have had?

I have doubts about GA too, but it's irrelevant. They did something
with a quite low tech ASIC design and fab process, which doesn't appear
to be possible with FPGA's without much more advanced fab tech.

No, you can't use an FPGA to implement some existing processor and
improve on cost, power or any other parameter. I never said you
could.

Why are you going on about softcores then? A big attraction of
softcores is to use code and compilers that you already have, instead of
building your application from scratch in Verilog. That generally means
implementing an existing processor. How else are you going to run that
code?

But if you have an application, it may well be easier to implement in
an FPGA than in a GA144...

The comparison was simply between hard and soft processors of similar
programmability to see how they did in terms of cost and speed.
 
