
Is FPGA code called firmware?



Guest

Wed Feb 22, 2006 12:54 pm   



JJ wrote:
Quote:
On the ASIC side, Synopsys BC didn't fare too well with hardcore ASIC
guys, better with system guys. Since FPGAs let software, system guys
into the party, BC style synthesis is going to be far more acceptable
and widespread, cost of failure so much lower than with ASICs. As for
is it HW or SW, I decline, but the ASIC guys would tend to call BC
design too software like given early results, I don't think their
opinion for C style behavioural design has changed any either.

As a half EE and CSc guy from the 1970s I spent more than a few years
doing hard-core assembly language systems programming and device driver
work. The argument about C for systems programming was much louder and
much more opinionated about what "real systems programmers" could do. As
an early Unix evangelist and systems programmer, it didn't take long to
discover that I could code C easily to produce exactly the asm I
needed. As the DEC systems guys and the UNIX systems guys warred over
what was best, it was more than fun to ask them for their rosetta
assembly language, and frequently knock out a faster C design in a few
hours, for a piece of code that took weeks to fine-tune in asm. It was
almost always because they got fixated on micro-optimization of a few
loops, and missed the big-picture optimizations. Rewriting asm
libraries in C as we ported to microprocessors and away from PDP-11s
was seldom a performance hit at all.

I see similar things happening with large ASIC and FPGA designs, as the
real performance gains are in highly optimized, but more complex
architectures, and less in the performance of any particular FSM and
data path. Doing the very best gate level designs, just like the very
best asm designs, at some point is just a distraction, when you start
looking at complex systems with high degrees of parallelism and
specialized functional units where the system design/architecture is
the win, not a few cycles at the bottom of some subsystem.

The advantage of transferring optimization knowledge into HLL tools is
that they do it right EVERY time after that, whereas the same energy
spent optimizing one low level design is seldom leveraged into other
designs. Because of this, HLL programming languages routinely deliver
three or four nines of the performance hand coding will achieve, and
frequently better, since all optimizations are automatically taken and
applied where a hand coder would not be able to.

We see the same evolution in bit-level boolean design for hardware
engineers. A little over a decade ago, all equations were hand
optimized .... today that is a lost art. As the tools do more of it,
probably in another decade it will no longer be taught as a core
subject to EEs, if not sooner. There are far more important things for
them to learn that they WILL actually use and need. That will not stop
the oldie moldies from lamenting how little the kids today know, and
claiming they don't even know their trade. The truth is that the kids
will have new skills that leave the Dinos just that.

Quote:
If the end result of BC style design produces results as good as
typically achieved by DC synthesis then it is every bit as good as
hardware, but does it produce such results? In hardcore hardware land we
expect to drive upwards of 300MHz cycle rates and plan a hardware
design onto a floorplan, but I wouldn't expect such performance or
efficiency from these BC tools. Do they routinely produce as good
results? I very much doubt it. Replacing RTL with behavioural design may
raise productivity but it is not the same thing as replacing assembler
coding with HLL coding IMHO, given the current nature of OoO cpus.

I believe, having seen this same technology war from the other side,
that it is not only the same, but will actually evolve better, because
the state of the art and knowledge about how to take optimizations and
exploit them is much better understood these days. The limits in
software technology and machine performance that slowed extensive
computational optimizations for software compilers are much less of a
problem with VERY fast cpus and large memory systems today. Probably
the hard part will be yanking the knowledge set to do a good job from
the minds of heavily protesting EEs worried about job security. I
suspect a few will see the handwriting on the wall, and will become
coders for tools, just to have a job.


Guest

Wed Feb 22, 2006 1:42 pm   



Hal Murray wrote:
Quote:
Back in the old days, it was common to build FSMs using ROMs. That
approach makes it natural to think of the problem as software - each word
in the ROM holds the instruction you execute at that PC plus the right
external conditions.

Any of us educated in engineering school in the 1970s have probably
built more than a few of those. On the other hand, I also built a DMA
engine out of an M68008 using address-space microcoding, which saved a
bunch of expensive PALs and board space, plus used the baby 68k to
implement a SCSI protocol engine to emulate a WD1000 chipset. The whole
design took a couple months to production.
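
For anyone who hasn't seen the technique, here is a minimal C model of
the ROM-as-FSM idea Hal describes: one ROM word per state, the next
state ("next PC") selected by the external inputs. The state count,
field packing and ROM contents below are all invented for illustration.

#include <stdint.h>
#include <stdio.h>

#define NUM_STATES 16

/* One ROM word: the next state for each combination of the two
 * external input bits, plus the outputs asserted in this state. */
struct rom_word {
    uint8_t next_state[4];
    uint8_t outputs;
};

/* Contents would be "assembled" offline, one word per state. */
static const struct rom_word rom[NUM_STATES] = {
    [0] = { .next_state = {0, 1, 0, 1}, .outputs = 0x0 },
    [1] = { .next_state = {2, 2, 2, 2}, .outputs = 0x4 },
    [2] = { .next_state = {0, 0, 0, 0}, .outputs = 0x1 },
};

/* One clock tick: look up the current word, drive the outputs,
 * jump to the next state selected by the inputs. */
static uint8_t fsm_step(uint8_t *state, uint8_t inputs)
{
    const struct rom_word *w = &rom[*state % NUM_STATES];
    uint8_t out = w->outputs;
    *state = w->next_state[inputs & 0x3];
    return out;
}

int main(void)
{
    uint8_t state = 0;
    uint8_t stimulus[] = {0, 1, 3, 0};
    for (unsigned i = 0; i < sizeof stimulus; i++)
        printf("state %u outputs 0x%X\n",
               state, fsm_step(&state, stimulus[i]));
    return 0;
}

In hardware the "function call" is just the ROM address register; the
software view and the register-plus-ROM view are the same machine.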

Having done it the hard way with bipolar bit slices just gives you the
tools to take a more powerful piece of silicon and refine it better.
That is the beauty of FPGAs as computational engines today: looking
past what the part is meant to do, and looking forward to what you can
do with it tomorrow, by exploiting the parallelism and avoiding the
sequential bottlenecks of cpu/memory designs. Designers that only know
how to use a cpu + memory, and lack the skills of designing with lower
level building blocks, miss the big picture - both at a software and
hardware level.

Quote:
I've done a reasonable amount of hack programming where I count
every cycle to get the timing right. I could probably have done
it in c, but I'm a bottom up rather than top down sort of person.

It's not about up or down, it's simply learning your tools. For 30
years I've written C thinking asm, coding C line for line with the asm
it produced. For the last couple years, after learning TMCC and taking
a one-day Celoxica intro seminar, I started writing C thinking gates,
just as a VHDL/Verilog engineer writes in that syntax thinking gates.
Hacking on, and extending, TMCC as FpgaC has only widened my
visualization of what we can do with the tool. The things that
TMCC/FpgaC does wrong are almost purely masked by the back end boolean
optimizer, which comes very close to getting it right. Where it
doesn't, it's because its synthesis rules are targeted at a generic
device, and it lacks device specific optimizations to target the
available CLB/Slice implementation. That will come with time, but
really doesn't have that big an impact today.
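
To make "writing C thinking gates" concrete, here is a small sketch in
plain ANSI C (not actual FpgaC source; the function names are mine):
each expression is a little boolean network that a C-to-gates tool
could flatten into LUTs, rather than a sequence of CPU instructions.

#include <stdio.h>

/* One bit of a ripple-carry adder, written as explicit gates. */
static unsigned full_add(unsigned a, unsigned b, unsigned cin,
                         unsigned *cout)
{
    unsigned sum = a ^ b ^ cin;        /* two XOR gates */
    *cout = (a & b) | (cin & (a ^ b)); /* majority: AND/OR gates */
    return sum;
}

int main(void)
{
    /* 4-bit adder built bit by bit; in hardware all four stages
     * exist at once, chained through the carry. */
    unsigned a = 0xB, b = 0x6, carry = 0, result = 0;
    for (int i = 0; i < 4; i++) {
        unsigned s = full_add((a >> i) & 1, (b >> i) & 1,
                              carry, &carry);
        result |= s << i;
    }
    printf("0x%X + 0x%X = 0x%X carry %u\n", a, b, result, carry);
    return 0;
}

Thinking gates means seeing the loop as four unrolled adder slices
wired in parallel, not as four trips through an ALU.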

Right now we are focused on implementing the rest of the C language
that was left out of TMCC, which is mostly parser work, and some
utility routine work inside the compiler. That will be completed
March/April; then we can move on to back end work, and target what I
call compile, load and go work, which will focus on targeting the
backend to current devices. With that will come distributed arithmetic
optimized to several platforms, as well as use of the carry chains and
muxes available in the slices these days. At that point, FpgaC will be
very close to fitting designs as current HDLs do .... if you can learn
to write C thinking gates. FpgaC will over time hide most of that from
less skilled programmers, requiring only modest retraining.
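
For readers unfamiliar with distributed arithmetic, here is a rough C
model of the idea (my own sketch, not FpgaC output): a dot product
computed bit-serially against a LUT of coefficient partial sums, which
is exactly the kind of table that maps onto a CLB.

#include <stdint.h>
#include <stdio.h>

#define TAPS  4
#define XBITS 8

int main(void)
{
    const int16_t c[TAPS] = {3, -1, 5, 2};
    const uint8_t x[TAPS] = {10, 20, 30, 40};

    /* Precompute the 2^TAPS partial-sum LUT: entry m holds the sum
     * of the coefficients whose bit is set in m. */
    int32_t lut[1 << TAPS];
    for (unsigned m = 0; m < (1u << TAPS); m++) {
        lut[m] = 0;
        for (int i = 0; i < TAPS; i++)
            if (m & (1u << i)) lut[m] += c[i];
    }

    /* Shift-accumulate over the input bits, LSB first: one LUT
     * lookup per bit position instead of TAPS multiplies. */
    int32_t acc = 0;
    for (int b = 0; b < XBITS; b++) {
        unsigned addr = 0;
        for (int i = 0; i < TAPS; i++)
            addr |= ((x[i] >> b) & 1u) << i;
        acc += lut[addr] * (1 << b);
    }

    /* Check against the direct dot product. */
    int32_t ref = 0;
    for (int i = 0; i < TAPS; i++) ref += c[i] * (int32_t)x[i];
    printf("DA result %d, direct %d\n", acc, ref);
    return 0;
}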

The focus will be computing with fpgas, not fpga designs to support
computing hardware development. RC.

For the last couple weeks we have been doing integration and cleanup
from a number of major internal changes, mostly from a symbol table
manager and scoping rules fix to bring TMCC in line with Std C scoping
and naming, so that we could support structures, typedef and enum.
We've also implemented the first crack at traditional typing in prep
for enum/typedef, allowing unsigned and floating point in the process.
Both are likely to be in beta-2 this month. The work I checked in last
night has the core code for FP using intrinsic functions, and probably
needs a few days to finish. It also now has do-while and for loops,
along with structures and small LUT based arrays or BRAM arrays. It
currently regression tests pretty well, with a couple minor problems
left, including one that causes some temp symbols to end up with the
same names. That should be gone by this weekend, as FP gets finished
and hopefully unsigned is done too.

svn co https://svn.sourceforge.net/svnroot/fpgac/trunk/fpgac fpgac

alpha/beta testers and other developers welcome :-)

Andy Peters
Guest

Wed Feb 22, 2006 8:47 pm   



Gabor wrote:
Quote:
At our company we call FPGA configuration code "software" if it
is stored on the hard drive and uploaded at run time by a user
application. When it is stored in a serial PROM or flash memory
on the board we call it firmware.

I don't think the terms "firmware" or "software" have as much to do
with the programming paradigm as with the delivery of the bits to
make the actual hardware run.

At my current and previous jobs, FPGA "loads" are/were considered
firmware, for the same reason that processor boot code and the
lowest-level debug monitor were considered firmware: the images are
stored on board in some kind of non-volatile memory.

-a

Nial Stewart
Guest

Thu Feb 23, 2006 1:02 pm   



Quote:
I think what is surprising to some is that low level software design
is long gone,

?????

No-one ever programs in assembly language any more then?


Quote:
and low level hardware design is soon to be long gone for all the
same reasons of labor cost vs. hardware cost.


Where price, performance and power consumption don't matter, a higher
level language might become more prevalent. I think we'll always
need to be able to get down to a lower level hardware description
to get the best out of the latest devices, stretch the performance
of devices, or squeeze what needs to be done into a smaller device.


I also wonder if price/performance/power consumption will become much
less important in the future, as it has with software. These days
you can assume application software will be run on a 'standard'
sufficiently powerful PC. It won't be the case that at the start of
every hardware project you can assume you have a multi-million
gate FPGA (or whatever) at your disposal.



Nial.


Guest

Thu Feb 23, 2006 9:39 pm   



Nial Stewart wrote:
Quote:
I think what is surprising to some, is that low level software design is long gone,
No-one ever programs in assembly language any more then?

When I started doing systems programming in the late 1960s, everything
core-system important was written in assembly language on big iron
IBMs - 1401s, 1410s, 360s - with applications in RPG, COBOL, and
Fortran. It was pretty much the same when I started programming
minicomputers - DGs, DECs, and Varians - except some flavor of Basic
was the applications language of choice.

99% of systems code - operating system, utilities, compilers, linkers,
assemblers, ... etc was assembly language.

I suspect that number is down to a very small fraction of a percent
today, at least for new designs.

Quote:
and low level hardware design is soon to be long gone for all the
same reasons of labor cost vs. hardware cost.
Where price, performance and power consumption don't matter a higher
level language might become more prevalent.

The power argument is moot, as the difference in power between a good
C coder and a good asm coder is probably less than a fraction of a
percent. Ditto for performance. The only case I can think of is using
the wrong compiler for the job, such as using an ANSI 32-bit compiler
on a micro that is native 8 or 16 bits, rather than downsizing to a
subset compiler which was designed to target that architecture. And
there are a lot of really good subset 8/16 bit compilers for micros,
and it only takes a man-week or two to adapt SmallC/TinyC derivatives
to a new architecture.

Cost, on the other hand, I believe is strongly in favor of using C or
some other low level HLL instead of assembly. When the burdened cost
of good software engineers is roughly $200-250K/yr and the
productivity factor between asm and HLLs is better than 3-10x over the
lifecycle of the product, it's nearly impossible to make up the
difference in volume.

Using junior labor that chooses the wrong tools and botches the
design is a management problem. There are plenty of good HLL coders
that can do low level software fit to hardware, and get it right. And
when experienced low level guys are in high demand, it only takes a
good person a few weeks to train an experienced coder how to do it.

Quote:
I think we'll always
need to be able to get down to a lower level hardware description
to get the best out of the latest devices, stretch the performance
of devices or squeeze what needs to be done into a smaller device.

Yep ... but coding C or another systems-class HLL line for line with
the asm, the fraction of a percent gained by coding more than a few
lines of asm is quickly lost in time to market, labor cost, and the
ability to maintain the product long term, including porting to a
different micro to chase the low cost components over time.

Quote:
I also wonder if price/performance/power consumption will become much
less important in the future, as it has with software. These days
you can assume application software will be run on a 'standard'
sufficiently powerful PC. It won't be the case that at the start of
every hardware project you can assume you have a multi-million
gate FPGA (or whatever) at your disposal.

Today, the price difference between low end FPGA boards and million
gate boards is getting pretty small, with megagate FPGAs in high
volume. Five, or even two, years ago it was pretty different.

The real issue is that FPGAs with CPUs and softcore CPUs allow you
to implement the bulk of the software design on a traditional
sequential architecture where performance is acceptable, and push the
most performance sensitive part of the design down to raw logic.

The Google description for this group is: Field Programmable Gate Array
based computing systems, under Computer Architecture FPGA. And, after
a few years, I think we are finally getting there .... FPGA based
computing instead of CPU based computing.

The days of FPGAs being only for hardware design are slipping away.
While this group has been dominated by hardware designers using FPGAs
for hardware designs, I suspect that we will see more and more
engineers of all kinds here doing computing on FPGAs, at all levels.

JJ
Guest

Fri Feb 24, 2006 12:30 am   



fpga_toys_at_yahoo.com wrote:

snipping; kind of tired of being told software is taking over hardware
design, it ain't

Quote:
The Google description for this group is: Field Programmable Gate Array
based computing systems, under Computer Architecture FPGA. And, after
a few years, I think we are finally getting there .... FPGA based
computing instead of CPU based computing.

The days of FPGAs being only for hardware design are slipping away.
While this group has been dominated by hardware designers using FPGAs
for hardware designs, I suspect that we will see more and more
engineers of all kinds here doing computing on FPGAs, at all levels.

While that's likely somewhat true, and I really welcome anyone with
interesting problems, I suspect it's already too late for newcomers
without strong EE backgrounds or associates.

15 years ago FPGAs were pretty darn simple and not much use for
anything but glue logic. Good ole days when any old CMOS logic slapped
together just worked. Synthesis just around the corner.

10 years ago they got big enough, but not fast enough, to start to
make predictions about RC and the possible replacement of general
purpose cpus with hardware computing, ie the 4000 days, and a couple
of new companies to boot. ASIC design started to get harder.

5 years ago, with Virtex, I'd say they started to get to ASIC
performance, with the embedded blocks and specialized IO resources
making performance almost even by covering for the much slower LUT
blocks, and we also got the hugely more complex data sheets.

Today most FPGAs seem to have the whole kitchen sink in there to make
complex systems more practical if the sink can be made small enough to
hide cost when not used.

Look at any data sheet for modern parts: maybe 5% or less could be
understood by your avg SW engineer (far less, I bet); the rest is all
electrical stuff - signal integrity, power supplies, packaging,
reliability, clocking - in no particular order.

Ask around here for books on FPGA computing: there aren't any, they're
all old shit, 10 yrs or more, from the easy days. I have one that
covers the 3000 series. Ray has one coming and it sure ain't targeted
at software guys; he's too busy with real work, as are most EEs with a
job, to write up their current knowledge. FPGAs are simply moving too
fast to be documented for the laissez-faire user.

SW engineers with the mathematical applications are used to dealing
with ready made PC boxen: given enough ventilation, hot math
shouldn't faze a P4. There really isn't anything available in the same
sense of off the shelf FPGA computing that can be sold as a std board
to all the math, idea guys without HW pain. Yeah, there are lots of
FPGA PCI cards, but they are mostly not useable to software guys
without some EE around, as well as hardware lab tools. So that means a
special application likely needs special boards to be built. Welcome
to the real world: power supplies, interfaces, SI, GHz IOs, lead free.
They have never taught that in CS school, and perhaps not in some EE
schools either. I feel pretty sure that EEs that don't know this
won't get much work. I suspect logic classes are going to be with us
forever, too.

When I interviewed candidates who didn't know basic boolean algebra
but would like to do million-gate designs, I'd say we'd let them know
later, or let them know what they needed to know. Is that job
protection? Sure it is; EEs don't want Joe90 liability around - bad
ASIC design kills companies. We are going the same place with FPGA
systems: bad designs will never work, but only the project is lost,
not million-$ mask sets. My last employer's FPGA project cost far more
than its predecessor, a full custom mixed signal ASIC; it had lots of
nice new math in it to figure out. Even really good EEs make logic
mistakes, so some further abstractions are likely, but that doesn't
help much with all the dirty backend EE stuff.

From time to time we have had a few math, bio guys come here with
questions about their interesting problems, but what I noticed is that
they seem to be pretty coy about what they are up to. I suspect the
days of SW engineers coming to the FPGA party are already over. FPGAs
are getting bigger and more interesting, but a darn sight harder to
use, and that won't ever get covered up by synthesis tools.

Also from what I have seen of some of the applications of FPGAs to
computing vs PC computing, the FPGA projects didn't even match the PC
solution on cost. Not because the FPGA doesn't have the grunt, but
because too much was left on the table, since the design was done by
math oriented people. Now as I said before, cpu designers have decided
to go the same way FPGAs are - packing density plus any incremental
clock speed - so it's a parallel race again. My gut tells me that PC
computing is still the best way to go if a plain C description works
using 32 bit int math and esp FP math. But when the math looks like
crypto with S boxes and shuffles, or has a dedicated IO interface,
FPGAs cream all over.

Multi disciplined teams are the future but EEs won't be in the back
seat.

I am done

John Jakson
transputer guy
Marlboro MA

BTW I don't know how to change brake pads or do oil changes (or even
spell), so none of the above means diddly squat.

Nial Stewart
Guest

Fri Feb 24, 2006 11:06 am   



<fpga_toys_at_yahoo.com> wrote in message news:1140727198.395726.97600_at_u72g2000cwu.googlegroups.com...


Quote:
and low level hardware design is soon to be long gone for all the
same reasons of labor cost vs. hardware cost.
Where price, performance and power consumption don't matter a higher
level language might become more prevalent.

The power argument is moot, as the difference between power for a good
C coder and a good asm coder, is probably less than a fraction of a
percent.

I was talking about the FPGA domain here, not SW.


Quote:
I also wonder if price/performance/power consumption will become much
less important in the future, as it has with software. These days
you can assume application software will be run on a 'standard'
sufficiently powerful PC. It won't be the case that at the start of
every hardware project that you can assume you have a multi million
gate FPGA (or whatever) at your disposal.

Today, the price difference between low end FPGA boards and million
gate boards is getting pretty small, with megagate FPGAs in high
volume. Five, or even two years ago, was pretty different.

Not every design needs million gate device functionality;
Altera's and Xilinx's low cost families seem to be selling in big
numbers. Sometimes it's important to push the performance of these
lower cost devices to keep costs down. Getting the same functionality
into a smaller device can also be important if power consumption is
critical (my original point).

How many power supplies do you need for your big devices?

Quote:
The Google description for this group is: Field Programmable Gate Array
based computing systems, under Computer Architecture FPGA. And, after
a few years, I think we are finally getting there .... FPGA based
computing instead of CPU based computing.

This newsgroup and FPGAs were around long before some numpty at Google
decided what their description should be. I don't think we should
be taking this as a guiding pointer for the future.


Quote:
The days of FPGAs being only for hardware design are slipping away.
While this group has been dominated by hardware designers using FPGAs
for hardware designs, I suspect that we will see more and more
engineers of all kinds here doing computing on FPGAs, at all levels.

That's probably true, and I expect to be using other tools as well as
VHDL in 5 years. However, as John posted above, there's a lot more to
implementing an FPGA design than the description used for the logic,
and I think we'll still be using HDLs to get the most out of them for
a long time to come (to a bigger extent than with C/asm).



Nial.


Guest

Tue Feb 28, 2006 11:42 am   



Nial Stewart wrote:
Quote:
That may be the case for large multi-designer designs; for smaller
devices, someone who understands the underlying architecture and
what they're actually trying to design to will be needed.

I think that has always been the case for embedded, and realtime, and
any other tightly integrated hardware/software design of any size.

Quote:
The problem is, people who talk about this stuff get into their niche
and see everything else from that perspective. Few people routinely
work with a broad spectrum of systems from 4-bit to 64-bit and code
volumes from a few hundred bytes to a few dozen megabytes.

Certainly true. As a consultant, I can only view the diverse sample of
my clients for a perspective ... and that is certainly harder for W-2
employees that have lived inside the same company for the last 10
years. It would be interesting to take a survey at an embedded
developers conference to get a better feel for the real numbers.

Quote:
You seem to have a deeply entrenched view of the FPGA development future.
Only time will tell if you are correct or not, I don't believe you are
and I'll leave it at that.

More like a recently converted evangelist, with a pragmatic view from
my prior 35 years of systems programming experience, casting a view on
this new field and watching what is happening around me too.

I did have a little fun this evening, writing a PCI target mode core in
FpgaC as an example for the beta-2 release that is nearly at hand. It's
not quite done, but checked in to subversion on sourceforge in the
FpgaC examples directory. For something that is a bus interface state
machine, it is expressed in C pretty nicely, and will get better as
unions/enums are added to FpgaC.

It brought out a couple problems with using I/O ports as structure
members that I need to fix in FpgaC tomorrow; then I'll finish the PCI
coding, along with a C test bench, before testing/installing on my
Dini DN2K card.
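
As a flavor of what a bus-interface state machine looks like in C,
here is a hypothetical sketch - not the actual example checked into
the FpgaC repository; the state names, pin record and entry point are
all invented for illustration.

#include <stdio.h>

enum pci_state { IDLE, ADDR_DECODE, DATA_XFER, TURNAROUND };

struct pci_pins {              /* bus inputs sampled each clock */
    unsigned frame_n : 1;
    unsigned irdy_n  : 1;
    unsigned hit     : 1;      /* address matched our BAR */
};

static enum pci_state state = IDLE;

/* Called once per clock; in a C-to-gates flow the switch becomes
 * next-state logic, not sequential code. */
static void pci_target_step(struct pci_pins p,
                            unsigned *devsel_n, unsigned *trdy_n)
{
    *devsel_n = 1;                       /* deasserted by default */
    *trdy_n = 1;
    switch (state) {
    case IDLE:
        if (!p.frame_n) state = ADDR_DECODE;
        break;
    case ADDR_DECODE:
        state = p.hit ? DATA_XFER : TURNAROUND;
        break;
    case DATA_XFER:
        *devsel_n = 0;                   /* claim the transaction */
        *trdy_n = 0;                     /* ready to transfer */
        if (p.frame_n && !p.irdy_n) state = TURNAROUND;
        break;
    case TURNAROUND:
        state = IDLE;
        break;
    }
}

int main(void)
{
    /* Drive a few "clock cycles" of stimulus as a C test bench. */
    struct pci_pins seq[] = { {0,1,1}, {0,1,1}, {0,0,1}, {1,0,1} };
    for (unsigned i = 0; i < 4; i++) {
        unsigned devsel_n, trdy_n;
        pci_target_step(seq[i], &devsel_n, &trdy_n);
        printf("cycle %u: state %d devsel_n %u trdy_n %u\n",
               i, (int)state, devsel_n, trdy_n);
    }
    return 0;
}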

Colin Paul Gloster
Guest

Tue Feb 28, 2006 5:32 pm   



In news:nn1kv1t60lgcsm0tn2v5tnhnbn836166ed_at_4ax.com timestamped 23 Feb 2006 12:39:58 -0800, it was posted

"[..]

The Google description for this group is: Field Programmable Gate Array
based computing systems, [..]

[..]"

In news:468449F9vf44U1_at_individual.net timestamped Fri, 24 Feb 2006
10:06:01 -0000, Nial Stewart replied:

"[..]

This newsgroup and FPGAs were around long before some numpty at Google
decided what their description should be. [..]

[..]"

This has nothing to do with Google. Check your newsgroups file; the
description of this group is "Field Programmable Gate Array
based computing systems." See
WWW.SLRN.org/manual/slrn-manual-6.html#ss6.117
or
HTTP://Quimby.Gnus.org/gnus/manual/gnus_358.html#SEC358
or something similar.


Guest

Fri Mar 03, 2006 9:09 pm   



Nial Stewart wrote:
Quote:
You've had to understand the target architecture and what's causing your
timing constraint to fail, then re-jig your HDL to reduce the number of
levels of logic to achieve timing closure.

Not at all. Programmers juggle instruction/statement flow all the time
to reach timing closure in C and asm for device drivers and embedded
applications in many fields.

First, your cut and paste of several different points and answers
isn't accurate. My answer above is to the part that it directly
followed:

||> I thought that one of the arguments for using a C based HDL was
||> you can avoid this level of design implementation detail?

My point was and is that low level programmers have to understand the
underlying hardware in significant detail to program it properly ...
that has never changed in 35 years. It really doesn't matter if we are
talking about clock latency and scheduling for multiple pipelines, or
the physics and dynamics of a motion control system. Nor does it
matter whether the system measurement units are hours, seconds,
microseconds or picoseconds, or whether the units are core access and
cycle times, pipeline latencies, clock cycles, gate delays or routing
segment latencies.

Quote:

But you're not just juggling lines of code about so the order of
execution is different (ie to make sure things are picked up
quickly enough in an ISR or whatever).

Certainly. So what's your point?

Quote:
Looking at the problem a little more this afternoon, the C based
66 MHz PCI core is looking much more viable. The combinatorial length
was actually from unnecessarily including the main references to the
pci bus signals in the else side of the reset conditional. Breaking
that dependency changed the base FSM speed from 63 MHz to better than
73 MHz, making it likely the other setup and hold times can be met as
well.
I still think this is an accurate observation...

"You've had to understand the target architecture and what's causing your
timing constraint to fail, then re-jig your HDL to reduce the number of
levels of logic to achieve timing closure."

Certainly. But the second part, and the real point in your question,
remains the part I addressed in detail and still have issues with:

||> I thought that one of the arguments for using a C based HDL was
||> you can avoid this level of design implementation detail?

and the answer remains, not at all.
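
To illustrate the fix described above, here is a hypothetical
before/after in C (all signal names and the toy next-state logic are
invented): folding every bus-signal reference into the else arm of the
reset conditional gates each output behind the reset term, while
hoisting that logic out removes the dependency and shortens the
critical path.

#include <stdio.h>

static unsigned state;

/* Before: every bus-signal reference lives in the else arm, so each
 * output is combinationally gated behind the reset term. */
static unsigned step_slow(unsigned reset, unsigned frame_n,
                          unsigned hit)
{
    unsigned devsel = 1;
    if (reset) {
        state = 0;
    } else {
        devsel = !(hit & !frame_n);      /* gated behind !reset */
        state = !frame_n ? (state + 1) & 3 : state;
    }
    return devsel;
}

/* After: the bus-signal logic is hoisted out of the conditional,
 * removing the reset dependency from that path; only the state
 * register stays under reset control. */
static unsigned step_fast(unsigned reset, unsigned frame_n,
                          unsigned hit)
{
    unsigned devsel = !(hit & !frame_n); /* no longer gated */
    state = reset ? 0 : (!frame_n ? (state + 1) & 3 : state);
    return devsel;
}

int main(void)
{
    state = 0;
    printf("devsel=%u\n", step_slow(0, 0, 1));
    state = 0;
    printf("devsel=%u\n", step_fast(0, 0, 1));
    return 0;
}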


Guest

Mon Jan 27, 2020 7:45 pm   



On Monday, February 20, 2006 at 1:50:15 PM UTC-8, James Morrison wrote:
Quote:
On Mon, 2006-02-20 at 10:18 -0800, Marko wrote:
Traditionally, firmware was defined as software that resided in ROM.
So, my question is, what do you call FPGA code? Is "firmware"
appropriate?

In a former position I pondered this very question.

What is firmer than firmware (the term they used to describe code
designs for micro-controllers) but softer than hardware (designs using
wires to connect together various components)?

The answer I came up with was "stiffware".

The problem is that there are elements of both in FPGA code, or at least
there can be. And depending on how you write your VHDL it may resemble
one more than the other.

James.


"Stiffware" I love it!!

Jon Elson
Guest

Wed Jan 29, 2020 9:45 pm   



On Mon, 27 Jan 2020 10:28:39 -0800, ritchiew wrote:

Quote:
On Monday, February 20, 2006 at 1:50:15 PM UTC-8, James Morrison wrote:
On Mon, 2006-02-20 at 10:18 -0800, Marko wrote:
Traditionally, firmware was defined as software that resided in ROM.
So, my question is, what do you call FPGA code? Is "firmware"
appropriate?

The FPGA manufacturers call it the "configuration", but I don't think
firmware is very wrong when talking to less-technical folks.

Jon

Rick C
Guest

Wed Jan 29, 2020 9:45 pm   



On Wednesday, January 29, 2020 at 3:07:11 PM UTC-5, Jon Elson wrote:
Quote:
On Mon, 27 Jan 2020 10:28:39 -0800, ritchiew wrote:

On Monday, February 20, 2006 at 1:50:15 PM UTC-8, James Morrison wrote:
On Mon, 2006-02-20 at 10:18 -0800, Marko wrote:
Traditionally, firmware was defined as software that resided in ROM.
So, my question is, what do you call FPGA code? Is "firmware"
appropriate?
The FPGA manufacturers call it the "configuration", but I don't think
firmware is very wrong when talking to less-technical folks.

Jon


I think "configuration" would be equivalent to the machine code that is produced by a standard compiler for CPUs. The OP is asking what to call the source code. In PCs and other typical computers it is "software". In embedded applications it is "firmware", mainly because it is loaded into a write only memory (or write mostly). So what name to use for programming that is for configuring hardware?

I think I'm in favor of "hardware"... oh, what, that's already used... lol

Firmware is as good as any term. I'm ok with calling it software.

--

Rick C.

- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209

Rick C
Guest

Thu Jan 30, 2020 1:45 am   



On Wednesday, January 29, 2020 at 7:28:18 PM UTC-5, gtwrek wrote:
Quote:
In article <7d2a4342-3444-435c-a73b-9e130834212f_at_googlegroups.com>,
Rick C <gnuarm.deletethisbit_at_gmail.com> wrote:

Firmware is as good as any term. I'm ok with calling it software.

"Firmware" is one of those terms that has a very specific meaning in a
very specific context. And means exactly what it is intended to mean.
But as soon as the context changes... So does the definition.

Which makes the term practically useless as a general use technical
description. Sure for the general public, (or anything above a level 2
manager), "Firmware" might as well mean "The magic smoke which makes my
device do things when I turn it on". Usually one thinks of it as
applying to "embedded" electronic devices - which is another fuzzy
definition in itself. Here, "Embedded" usually means any device that's
not a personal computer sitting on one's desk. But even that's up
for argument...


So what is that very specific meaning of "firmware"??? I've not come across it. Every source I check has a different wording and meaning.

Wikipedia - In computing, firmware is a specific class of computer software that provides the low-level control for the device's specific hardware.

Google - permanent software programmed into a read-only memory.

techterms.com - Firmware is a software program or set of instructions programmed on a hardware device. It provides the necessary instructions for how the device communicates with the other computer hardware.

lifewire.com - Firmware is software that's embedded in a piece of hardware

So each one is different enough that the included classes vary hugely!

What is yours?

--

Rick C.

+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209

gtwrek
Guest

Thu Jan 30, 2020 1:45 am   



In article <7d2a4342-3444-435c-a73b-9e130834212f_at_googlegroups.com>,
Rick C <gnuarm.deletethisbit_at_gmail.com> wrote:
Quote:

Firmware is as good as any term. I'm ok with calling it software.


"Firmware" is one of those terms that has a very specific meaning in a
very specific context. And means exactly what it is intended to mean.
But as soon as the context changes... So does the definition.

Which makes the term practically useless as a general use technical
description. Sure for the general public, (or anything above a level 2
manager), "Firmware" might as well mean "The magic smoke which makes my
device do things when I turn it on". Usually one thinks of it as
applying to "embedded" electronic devices - which is another fuzzy
definition in itself. Here, "Embedded" usually means any device that's
not a personal computer sitting on one's desk. But even that's up
for argument...

Regards,
Mark
