interesting

John Larkin wrote:
On Tue, 15 Mar 2005 20:01:10 +1100, Clifford Heath <no@spam.please
wrote:


It's just a pity they don't often have any software design skills.
I don't mean skills to make a chosen solution work or fit, but to
choose a solution in the first place. Too much emphasis on the
hardware, not enough on the user. The sheer percentage of badly
programmed embedded designs is staggering, worse even than the
software industry proper (if possible).



Disagree. Most embedded products - cars, appliances, calculators, home
entertainment stuff - just work. Most computer-level stuff is buggy
crap. I have scores of designs in the field, thousands of products in
total, that use an embedded 32-bit CPU, and we have zero known
firmware bugs, all of it coded by EEs. My brand-new Dell/XP computers
had stupid software problems right out of the box, like occasionally
insisting that floppies are write-protected or unformatted (fix?
reboot!), or messing up the Zip drive FATs, or Word crashing when
certain graphics images are imported.

I've seen some Windows source code, and I understand why it's such
tripe. It's written exactly like the academics teach programming these
days: jillions of files, convoluted logic, abstraction for its own
sake, zero comments or visible context, all based on a language that
was designed to substitute sequences of punctuation marks for keywords
(which is literally why they call it "code").

Modern CS education scoops up masses of youngsters like herring in a
net, and teaches them how to code and how not to think. A Fellow of
United Technologies made that same comment to me two days ago.

John
Hear, hear!

Have you read the various articles about the space shuttle software
team? Their error rate is about one bug per 400,000 lines of code, and
99% of their debugging is done *BEFORE* writing any code - the
"thinking" to which John is referring.

Most "programmers" are little more than typists. ISP (in-system
programming) has caused a whole host of problems, simply because it
enables lousy programmers to fix their fuckups relatively easily - this
tends to encourage sloppy (or more often non-existent) design.

Typing should be the *last* step in writing software, not the first. And
I am constantly horrified when I hear programmers saying things like
"try this, it might work" - to me, that indicates a distinct lack of
understanding.


The AC drive manufacturer I first started working with *REFUSED* to hire
CS grads at all - only EEs with an ME or PhD (until me, the start of a
slippery slope downhill :). The reasoning? CS people think of bugs.
Engineers think of things like "my crane controller is holding 200 tons
of steel above a roadway, I'd better not fuck up".

And even then, *ALL* hardware was/is fully interlocked to prevent
software screw-ups from causing wholesale destruction. Very handy when
someone debugging with an ICE pauses the micro, suddenly applying DC to
the little 10 kW test machine spinning at 200% speed (a *very* loud bang
preceded the motor rolling about the floor in that instance).

Cheers
Terry
 
Clifford Heath wrote:
Terry Given and John Larkin wrote:

... A bunch of back-slapping tripe that had almost nothing to do with
the content of the message to which they were replying. Try to keep to
the subject, guys.
LOL :)

See if you can refute my claim that EE is suffering worse from
outsourcing than CS.

See if you can refute my argument as to why that is, and why it's
likely to continue.
CS is a lot easier to outsource than EE, simply because its physical
requirements are so much easier (a PC and an internet connection), and
the output is entirely virtual. As opposed to truckloads of fancy test
gear, shielded rooms, blah blah blah (of course CAD output is virtual)

I no longer have the particular issue, but IIRC IEEE Spectrum ran an
issue a year or so ago that focused on the massive outsourcing of
software to countries like India, for the aforementioned reason
(and, of course, the ability to pay programmers with handfuls of gravel
instead of USD). Most issues contain letters about this very subject,
invariably weighted more towards software than hardware.

See if you can stop imputing value judgements to me that *I* didn't
make.
I haven't imputed anything, nor even implied it....

There is an alarming proportion of bozos in every industry,
but at least in EE, you don't get so stuck with bad decisions made
thirty years ago (unless you work for Intel :).
Bang on wrt bozos. The bell curve.....

When you have, say, 1 employee, it's pretty obvious if "it" (careful
choice of gender-neutral terminology; oddly it seems to upset people) is
a drongo. Now try 100 employees - not so hard for the drongo to hide.
Now try 100,000 employees, and it becomes "spot the skilled one" amongst
the large number of idiots and even larger number of uninspired
"plodders" who wouldn't recognise innovation if it bit them on the arse.

Many industries suffer from legacy issues; EE isn't too different from
many others (MODBUS protocol, anyone?). PC software is just the worst,
that's all. Hey, the PC motherboard industry has always suffered from
legacy issues (god, how long did it take for the dreaded 640k screw-up
to finally disappear?), and if that ain't EE, I don't know what is :)


See if you can rationalise the fact that, for example, people who
should be able to program a VCR can't - because the software is
built to suit the hardware, instead of being *designed* to suit
the human. I do mean "designed" here, not "developed". I've always
owned high-end VCRs, and could list at least a dozen design defects
in the software on each of them. Same for the microwave, the watering
system controller, and almost *every* other consumer-level embedded
device. These things should be the *most* user-focussed designs
out there.
Hear, hear {does that make you feel better?}.

My youth was spent adjusting the clocks on other people's VCRs, for
precisely the reasons you outline. As I was a programmer back then
(thankfully I saw the light and converted to analogue design) it was
fairly trivial for me to think like the machine (so to speak) and
thereby figure out how to operate it. Grandma, OTOH, never had a chance.

By far the most annoying appliance I have ever owned was a (Smeg, IIRC)
stove that wouldn't work after power-cycling until the *FUCKING* clock
had been set. Luckily a long thin stiff thing (thinner than the tines on
a fork) was required to set the clock, thereby ensuring it was nigh
impossible. Just to maximise frustration, we would have a power outage
every week or two. I would have happily murdered the prick who thought
that "feature" up. It's not that uncommon, either.

But at least those devices actually worked. It is no longer uncommon to
have household devices which freeze/crash/shit-their-pants and need
power-cycling to cure. I don't think it's a coincidence that these
behaviours all seem to occur as the amount of software increases (and
the amount of hardware decreases). If virtual memory is involved, kiss
reliability goodbye, IME.....

I sampled a $40,000 LeCroy scope a few years back - it crashed within a
couple of hours of use and had to be rebooted. Unsurprisingly, I didn't
buy it. The likelihood of me buying a WinCE instrument is slightly less
than that of my garden spontaneously transmuting into gold. When I
bought a new CD player for my van a few months back, the guy at
Supercheap Auto suggested I wait until the "WMA" players came out in a
few weeks. WMA? Windows Media Audio. Like hell, I just want something
that plays CDs.

A few years back the guy in charge of R&D at HP (a psychologist) wrote a
great article on complexity (IEEE Spectrum again? can't recall). His
thesis was that as the number of tasks an instrument performs increases,
the required complexity increases much, much faster. A machine that does
2 different tasks really must do three - task 1, task 2, and deciding
which task to perform.

[I read an article in the NZ Herald today about a Japanese firm
producing cellphones for the elderly - none of this fancy shit, just a
phone. The simplest model only has 3 buttons.]


Try to acknowledge that I freely admit that my industry has produced
volumes of crap that don't suit either the hardware or the human.
But that when it succeeds better, it's usually because it's focussed
on the human factors first.
I think there are 2 separate arguments going on here, related but near
orthogonal:

1) piss-poor implementation of software (riddled with bugs)

2) lousy user interface design - eg impossible-to-use VCRs, stoves that
require the clock to be set etc.

I am certainly arguing that CS/IT twats are responsible for a lot of
(1). And if (2) is not done well, it can be a real PITA. But poor (2) +
good (1) = highly reliable (if annoying/inconvenient) behaviour.


My brand-new Dell/XP computers
had stupid software problems right out the box, like occasionally
insisting that floppies are write protected or unformatted (fix?
reboot!)


BIOS problems, written by EEs.
Really?

or messing up the Zip drive fats or Word crashing when
certain graphics images are imported.
I've seen some Windows source code, and I understand why it's such
tripe.


Windows is the laughing stock of the industry, not the measure of it.
"Unknown error 0x80000000" indeed!
Absolutely. What makes matters worse is that the techniques for writing
good OSes have been known for DECADES.

days: jillions of files,


Perhaps you think the 1.8 million lines of code for the product for
which I'm a designer should be all in one file? I'm sure *that* would
make a difference... And the thought that using words instead of
symbols would make it any better is equally ignorant. There are many
things wrong with C (and in many respects it and C++ *are* responsible
for the state of Windows) but use of symbols isn't one.
I think the biggest problem with most programming languages (and the OS
gets a fair whack of the blame too) is that they rely on the programmer
not to fuck up. Spend a decade involved with production engineering and
you'll pretty soon come to the realisation that people screw up
regularly. Really skilled workers screw up infrequently, on a random
basis. Dipshit workers screw up almost everything they touch. We once
had a programmer whose total output for a year was negative - *every*
routine he wrote was thrown out and done from scratch by someone more
competent. Last I saw him he was a tutor at a minor NZ university....


Most "programmers" are little more than typists.


You need to get out more, if that's all you can see around you. There
are large slabs of the software industry where excellent thinkers
predominate.
Absolutely. I have been privileged to work with some brilliant
programmers. Alas they are few and far between.

I think a lot of sw problems are because managers often understand
little or nothing about development of decent software, but clearly see
that a 21-year-old CS grad with *NO* experience costs a lot less than a
highly skilled programmer with, say, 5+ years' experience. So large
amounts of software get written by people who have never even had a job
before (this is a complaint often levelled at M$).


Engineers think of things like "my crane controller is holding 200
Tons of steel above a roadway, I'd better not fuck up"


It's also necessary to think "I'd better not confuse my operator into
fucking up". But you knew that...

Clifford.
Hell yes. Any design ought to start with basic questions like who is it
for, what do they want to do with it, how will they use it, what are the
ramifications if something goes wrong, etc.

I forget now what the hell it was called, but Apple published a "bible"
on writing Mac software that was chock-full of great stuff (I had a
software engineering lecturer who used to quote great sections of it,
and it was invariably right). Things like not forcing the user to do
something (or not do something) unless there is a compelling reason to
do so - a test which the dreaded stove clock-setting requirement fails
immediately.

Much of this sort of user interface design is common sense, which is
lacking in many, many people. I think it's a trees/woods problem - most
programmers are so fixated on *WHAT* they are doing, they seldom if ever
stop to consider *WHY*.

This is where a background in software engineering is very useful - eg
data structures can (and should) make an early entrance into the design
of a piece of sw, with little or no consideration of how they might be
implemented.

It's pretty funny to watch programmers struggle along and "invent"
balanced trees, doubly linked lists etc. I saw one tragic example where
a programmer had spent 2 years developing a proportional controller with
astonishingly low gain: 6 months to develop the "algorithm," 18 months
trying desperately to tune the controller. 10 minutes in a library would
have shown him how to implement the code, and another hour or so would
have given him the design for a PI controller, and how to tune it. That
particular startup company *DIED* specifically because the software
never worked (I was involved in the autopsy).
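
For anyone who hasn't met one, here is a minimal sketch of what that
library hour buys you - a discrete PI controller with output clamping
and simple anti-windup. The gains, limits and sample time below are
made-up placeholders for illustration, not from any real design:

/* Minimal discrete PI controller (backward-Euler integrator).
 * All numbers are set by the caller; none of them are "right". */
typedef struct {
    float kp;        /* proportional gain               */
    float ki;        /* integral gain, per second       */
    float ts;        /* sample period, seconds          */
    float integ;     /* integrator state                */
    float out_min;   /* output clamp, low               */
    float out_max;   /* output clamp, high              */
} pi_ctrl;

float pi_update(pi_ctrl *c, float setpoint, float measurement)
{
    float err = setpoint - measurement;
    float out = c->kp * err + c->integ;

    /* Clamp the output; only integrate when not saturated,
       which is the crudest possible anti-windup. */
    if (out > c->out_max)
        out = c->out_max;
    else if (out < c->out_min)
        out = c->out_min;
    else
        c->integ += c->ki * c->ts * err;

    return out;
}

Tuning it is then the textbook exercise (pole placement, or
Ziegler-Nichols if you're in a hurry) - the "hour in the library"
mentioned above.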

I've actually seen quite a few programmers come horribly unstuck on
control systems, something that appears to be almost entirely lacking
from CS educations. I guess that's what God invented EEs for :)

Cheers
Terry
 
John Larkin wrote:
On Wed, 16 Mar 2005 14:54:35 +1100, Clifford Heath
<no@spam.please.net> wrote:


Terry Given and John Larkin wrote:

... A bunch of back-slapping tripe that had almost nothing to do with
the content of the message to which they were replying. Try to keep to
the subject, guys.

See if you can refute my claim that EE is suffering worse from
outsourcing than CS.

See if you can refute my argument as to why that is, and why it's
likely to continue.



Well, my company designs real electronics, mixed-signal stuff with
errors measured in PPM and picoseconds and millikelvins, with the FPGA
logic, pc boards, and firmware to support the analog stuff. And I see
very few people who can do this, and none of them are unemployed.

I do meet lots of EEs who can't do quantitative design or analysis,
and programmers who know nothing but programming (ie, couldn't be less
interested in what a control loop might have to do.) Engineering
school enrollment has exploded in recent decades, and there are lots
of so-called engineers and programmers who signed up because they
heard the money was good or because some high-school counsellor
suggested it. They can't all be good, and they aren't; they can't all
be employed, and they aren't.

At least in the things I do, precision instrumentation, we have
relatively little non-USian competition. And from what I read, the
stuff being heavily outsourced is mostly software and digital (asic)
design... the very things that the EE schools are churning out in
quantity.
These are also the lowest-cost options - PC + software + lecture
theatre. As opposed to, say, a power electronics lab. Highly profitable
for universities.

I wonder if complex ASIC design (I've never done any) has progressed
down the same path as software, with massive code bloat as people cobble
together large blocks out of little blocks (akin to 3GLs, then 4GLs,
then wizards).

Interesting aside: programming and digital design are both essentially
qualitative.
and require little more than the ability to count to 1.

Cheers
Terry
 
John Larkin wrote:
...from what I read, the
stuff being heavily outsourced is mostly software and digital (asic)
design... the very things that the EE schools are churning out in
quantity.

Interesting aside: programming and digital design are both essentially
qualitative.
Agree completely - that's what makes it "design" - and contrary to
your usage in the previous paragraph. Perhaps the problem is that
design isn't taught, either because it's too hard to teach or the
teachers don't understand it. It's about how to decide what you
want from a thing, not about how to build it once you've decided.

I've broken down the common aspects of all the software I've ever
worked on into over 200 independent aspects, all of them common to
most software, and identified the things I want and expect from
*all* software in regard to each aspect. In that respect, I've
*designed* the infrastructure we need. When you compare that list
to the infrastructure available in common programming languages,
libraries and tools re-invented by each programming team, there's
a huge shortfall. And I believe *that* is why software is mostly
such shite (Unknown error 0x80000000!). That and the fact that folk
aren't taught how to recognise and handle mission-critical issues.
Maybe I should write a book...
 
Clifford Heath wrote:
John Larkin wrote:

...from what I read, the
stuff being heavily outsourced is mostly software and digital (asic)
design... the very things that the EE schools are churning out in
quantity.


Interesting aside: programming and digital design are both essentially
qualitative.


Agree completely - that's what makes it "design" - and contrary to
your usage in the previous paragraph. Perhaps the problem is that
design isn't taught, either because it's too hard to teach or the
teachers don't understand it. It's about how to decide what you
want from a thing, not about how to build it once you've decided.

I've broken down the common aspects of all the software I've ever
worked on into over 200 independent aspects, all of them common to
most software, and identified the things I want and expect from
*all* software in regard to each aspect. In that respect, I've
*designed* the infrastructure we need. When you compare that list
to the infrastructure available in common programming languages,
libraries and tools re-invented by each programming team, there's
a huge shortfall. And I believe *that* is why software is mostly
such shite (Unknown error 0x80000000!). That and the fact that folk
aren't taught how to recognise and handle mission-critical issues.
Maybe I should write a book...
I'll buy it :)

Cheers
Terry
 
Terry Given wrote:
CS is a lot easier to outsource than EE,
Disagree that CS is easy. It's too hard to adequately specify
software compared to hardware. The only software which can be
safely outsourced should be produced by a compiler operating
directly on the specification. I should know - I've written
more than one such compiler and have more in mind :).

..as the number of tasks an instrument performs increases,
the required complexity increases much, much faster.
True. And as complexity increases, the "software entropy" curve
trends upwards. By software entropy, I mean the ratio of the
complexity of the software to the complexity of the problem.
The higher this ratio, the faster it grows.

Absolutely. What makes matters worse is that the techniques for writing
good OS' have been known for DECADES.
Some of them, but not enough.

Dipshit workers screw up almost everything they touch.
Seen a few of those, but I only remember in their backsides ;-).

I think a lot of sw problems are because managers often understand
little or nothing about development of decent software,
That's not mainly their fault, it's because we've been pretty bad at
describing and quantifying quality.

This is where a background in software engineering is very useful - eg
data structures can (and should) make an early entrance into the design
of a piece of sw, with little or no consideration of how they might be
implemented.
Data structures are about implementation. The discipline you
mean is called "data modelling", and is one of the main things
I harp on about and teach people. It's relevant that UML seems
to have been created to make it impossible to do properly. I
believe in object-oriented design, but I resile from the notion
that proper object modelling means you *aren't allowed* to do
data modelling. This ignorant notion comes from acknowledged
leaders in the field, who maintain that if you fully explore
the behaviour and relationships between objects you can keep
their internal states hidden, even from the designer. Nothing
could be further from the truth, and the only reason such
projects ever succeed is the designer has a good perception of
the unspoken states and their required representations.

Clifford Heath.
 
Clifford Heath wrote:
Terry Given wrote:

CS is a lot easier to outsource than EE,


Disagree that CS is easy. It's too hard to adequately specify
software compared to hardware. The only software which can be
safely outsourced should be produced by a compiler operating
directly on the specification. I should know - I've written
more than one such compiler and have more in mind :).
OK, re-phrased: CS *appears* a lot easier to outsource than EE. ISTR
reading an article a while back on some US companies discovering your point.

The devil's advocate in me suggests that since so much software performs
so badly anyway, outsourcing won't make it appreciably worse (how's that
for pessimism).

..as the number of tasks an instrument performs increases, the
required complexity increases much, much faster.


True. And as complexity increases, the "software entropy" curve
trends upwards. By software entropy, I mean the ratio of the
complexity of the software to the complexity of the problem.
The higher this ratio, the faster it grows.

Absolutely. What makes matters worse is that the techniques for
writing good OS' have been known for DECADES.


Some of them, but not enough.
One of the most important, IMO, is of course protected memory. Many a
VAX program ran amok without harming anything (other than itself); cf.
the propensity of Windoze to shit its pants permanently at the slightest
provocation. Still, we've only had dedicated memory management hardware
integrated in the micro since the 80386.

Dipshit workers screw up almost everything they touch.


Seen a few of those, but I only remember in their backsides ;-).

I think a lot of sw problems are because managers often understand
little or nothing about development of decent software,


That's not mainly their fault, it's because we've been pretty bad at
describing and quantifying quality.
Alas, it appears "can I click on a pretty button" is the measure used by
many PC software houses to determine operational status. Several times
now I have purchased CAD software, only to find that menus described in
the manual/help don't even exist.

I have seen an analogous problem with sw for motor control - once the GM
sees the shaft of a motor turning, he is convinced the software is
finished, when in reality it has only just begun. This is usually
followed by some dickhead salesman selling 50 units with huge penalty
clauses.

This is where a background in software engineering is very useful - eg
data structures can (and should) make an early entrance into the
design of a piece of sw, with little or no consideration of how they
might be implemented.


Data structures are about implementation. The discipline you
mean is called "data modelling", and is one of the main things
I harp on about and teach people.
OK, semantic error on my part. I of course meant looking at these things
from a big-picture perspective - what to do, rather than how.

It's relevant that UML seems
to have been created to make it impossible to do properly. I
believe in object-oriented design, but I resile from the notion
that proper object modelling means you *aren't allowed* to do
data modelling. This ignorant notion comes from acknowledged
leaders in the field, who maintain that if you fully explore
the behaviour and relationships between objects you can keep
their internal states hidden, even from the designer. Nothing
could be further from the truth, and the only reason such
projects ever succeed is the designer has a good perception of
the unspoken states and their required representations.

Clifford Heath.
Such silly adherence to blind formalism is invariably a bad idea. I
prefer "Bruce Lee theory" - absorb what is useful, reject what is
useless. "Thou shalt not goto" is another good example.

Taguchi reckoned a similar approach could work with experimental design
- as if. Ultimately designers have to know what the hell they are
working with.

The little software I do is relatively easy - real-time multi-rate
nonlinear controllers, crap like that. Because so much of it is
mathematical, it is fairly straightforward to *prove* it works - I
automate exhaustive tests on the actual hardware, comparing measured
results with MATLAB-calculated "exact" answers and generating error
plots, which had *better* be < 1 LSB. The hilarious part is that when I
first started doing this, I found an exciting array of bugs in algebraic
routines, in functioning code. Turns out that plenty of loop gain hides
all manner of sins; IOW, a slightly screwy calculation looks a lot like
an external disturbance to a controller :)
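
A rough sketch of what such a harness boils down to. The reference
table and the read-back call here are placeholder stubs - in practice
the table is exported from MATLAB and the read-back goes over whatever
link (serial, JTAG, ICE) the target actually provides:

/* Exhaustive compare-against-reference test, skeleton only. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N_POINTS 16        /* a handful here; thousands in practice */
#define LSB      1.0       /* one LSB of the fixed-point result     */

/* Placeholder reference vector - really exported from MATLAB. */
static double ref[N_POINTS];

/* Placeholder for whatever call fetches the target's computed
 * result for test vector i. */
static long dut_read(int i)
{
    return (long)ref[i];   /* stub: pretends the hardware is perfect */
}

int main(void)
{
    double worst = 0.0;
    int fails = 0;

    for (int i = 0; i < N_POINTS; i++) {
        double err = fabs((double)dut_read(i) - ref[i]);
        if (err > worst)
            worst = err;
        if (err >= LSB) {
            printf("FAIL at vector %d: error = %g LSB\n", i, err / LSB);
            fails++;
        }
    }
    printf("worst error %g LSB, %d failures in %d vectors\n",
           worst / LSB, fails, N_POINTS);
    return fails ? EXIT_FAILURE : EXIT_SUCCESS;
}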

Plus of course I don't implement a controller until my simulation
operates satisfactorily. Which I learned the hard way.

Mostly, I draw little boxes with pins in schematics, specify the
required behaviour and let others take the blame for software.

Cheers
Terry
 
Hello Terry,

When you have, say, 1 employee, its pretty obvious if "it" (careful
choice of gender-neutral terminology. oddly it seems to upset people) is
a drongo. Now try 100 employees - not so hard for the drongo to hide.
Not necessarily. When I ran a division with about 100 employees, the
co-workers would quickly start complaining if someone wasn't pulling
their weight, simply because repeated failures would reflect back on the
group. Not so much from me but from peers.

My youth was spent adjusting the clocks on other peoples VCRs, for
precisely the reasons you outline. As I was a programmer back then
(thankfully I saw the light and converted to analogue design) it was
fairly trivial for me to think like the machine (so to speak) and
thereby figure out how to operate it. Grandma, OTOH, never had a chance.
I wonder when the designers of VCRs will ever figure this out.

by far the most annoying appliance I have ever owned was a(smeg IIRC)
stove, that wouldnt work after power-cycling until the *FUCKING* clock
had been set. Luckily a long thin stiff thing (thinner than the tines on
a fork) was required to set the clock, thereby ensuring it was nigh
impossible. Just to maximise frustration, we would have a power outage
every week or two. I would have happily murdered the prick who though
that "feature" up. Its not that uncommon, either.
A clear case of "nerd design". Probably someone designed it who didn't
have the foggiest idea of how to think like a user.

I sampled a $40,000 LeCroy scope a few years back - it crashed within a
couple of hours of use, and had to be rebooted. Unsurprisingly, I didnt
buy it. The likelihood of me buying a WinCE instrument is slightly less
than that of my garden spontaneously transmuting into gold.
ROFL! Same here.

Windows is the laughing stock of the industry, not the measure of it.
"Unknown error 0x80000000" indeed!


Absolutely. What makes matters worse is that the techniques for writing
good OS' have been known for DECADES.
But the folks who were truly able to write a reliable OS such as DOS are
retired or have passed away. Same with radios and other stuff. Grandma's
tube set or my old Astor BPJ from 1959 run circles around our new
stereo. It seems that "modern" engineers don't even know how to design a
reasonable tuner or IF circuit anymore. So, even if I had to spend lots
of $$ to replace a tube in that Astor I'd do it in a heartbeat. However,
the original set from day one still worketh...

I've actually seen quite a few programmers come horribly unstuck on
control systems, something that appears to be almost entirely lacking
from CS educations. I guess thats what God invented EEs for :)
Well, you have to have a good dose of HW understanding to succeed in
embedded or control. At least on the chip level. Such as "timer A needs
at least x clock cycles to properly issue the interrupt and the next
compare register setting should be this much away". Violate stuff like
that and chances are that not even the experts would know what might happen.
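
A sketch of the kind of guard that implies, in C. The register names,
widths, addresses and the minimum margin below are invented for
illustration - they don't belong to any particular micro:

/* Hypothetical memory-mapped timer registers (placeholder addresses). */
#define TIMER_COUNT   (*(volatile unsigned short *)0x4000u)
#define TIMER_COMPARE (*(volatile unsigned short *)0x4002u)

#define MIN_MARGIN 8u   /* smallest safe distance, in timer ticks */

/* Schedule the next compare-match interrupt, guarding against asking
 * for a compare value the free-running counter has already passed -
 * that mistake means waiting a whole counter wrap for the interrupt. */
void schedule_next(unsigned short ticks_from_now)
{
    unsigned short now = TIMER_COUNT;

    if (ticks_from_now < MIN_MARGIN)
        ticks_from_now = MIN_MARGIN;

    /* unsigned arithmetic wraps cleanly along with the counter */
    TIMER_COMPARE = (unsigned short)(now + ticks_from_now);
}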

Regards, Joerg

http://www.analogconsultants.com
 
Hello John,

... Most programmers hate LabView already!
And many analog engineers hate Spice. I believe even Bob Pease let off
something to that effect.

I remember when someone simulated a pulser power stage and everything
simulated just fine. Then he built it, or rather had it built by the
techs, turned it on and "kapoof".

Part of the problem with engineering employment is that the tools keep
getting better and efficiency keeps going up, so it takes less
man-hours to design stuff. I can design a pc board of a given
functionality in, say, 1/3 the time it took 20 years ago.
Some of that may be because we have better and new parts. In the 70's
there weren't such things as a BSS123 or a 74HC14. And in the 80's we
didn't have LVDS chips. Only ECL but those were mostly off limits
because of power or cost concerns. uCs had to have an EPROM next to them
and just erasing one took a long coffee break.

Regards, Joerg

http://www.analogconsultants.com
 
On Wed, 16 Mar 2005 20:19:43 GMT, Joerg
<notthisjoergsch@removethispacbell.net> wrote:

Hello John,

... Most programmers hate LabView already!

And many analog engineers hate Spice. I believe even Bob Pease let off
something to that effect.

[snip]

And how many of Bob Pease's "tweaked" circuits have you seen that
would pass muster in a production environment?

About the same percentage as in AoE ?:)

...Jim Thompson
--
| James E.Thompson, P.E. | mens |
| Analog Innovations, Inc. | et |
| Analog/Mixed-Signal ASIC's and Discrete Systems | manus |
| Phoenix, Arizona Voice:(480)460-2350 | |
| E-mail Address at Website Fax:(480)460-2142 | Brass Rat |
| http://www.analog-innovations.com | 1962 |

I love to cook with wine. Sometimes I even put it in the food.
 
Hello Jim,

And how many of Bob Pease's "tweaked" circuits have you seen that
would pass muster in a production environment?
Didn't he design the LM331? That, for example, is a remarkable chip
especially when considering how long ago it was conceived.

Same for some other "paper and pencil" designers like the late Robert
Widlar. His ideas were quite clever. Maybe with the exception of that
newspaper he supposedly set on fire at a trade show...

Regards, Joerg

http://www.analogconsultants.com
 
On Wed, 16 Mar 2005 20:19:43 GMT, Joerg
<notthisjoergsch@removethispacbell.net> wrote:


Some of that may be because we have better and new parts. In the 70's
there weren't such things as a BSS123 or a 74HC14. And in the 80's we
didn't have LVDS chips. Only ECL but those were mostly off limits
because of power or cost concerns.

The macrocell delays inside a Xilinx Spartan3 are now in the 50 ps
range, and I can buy a TinyLogic flipflop with a 1 ns prop delay. But
ECL is up to 10 GHz, so it stays ahead. Things do move on.

What shocks me is that we'd do a system using 74xx gates and flops,
and it would take weeks to bring up. Now we do something with a uP, a
few FPGAs, a heap of fast ram, and maybe a dozen flash ADCs, and the
first rev works in a few days.

The combination of massive numbers of graduated EEs plus huge
productivity improvements must eventually hit the wall. Electronics
will be 100 years old in 2006.


John
 
I read in sci.electronics.design that John Larkin
<jjlarkin@highlandtechnology.com> wrote (in
<8n9h31l7m42ghgs7qc77fr0eh1ld4fhkq6@4ax.com>) about 'interesting', on
Wed, 16 Mar 2005:
Electronics will
be 100 years old in 2006.
What happened in 1906 that qualifies as a start date?
--
Regards, John Woodgate, OOO - Own Opinions Only.
The good news is that nothing is compulsory.
The bad news is that everything is prohibited.
http://www.jmwa.demon.co.uk Also see http://www.isce.org.uk
 
Gidday Joerg,

Joerg wrote:
Hello Terry,

When you have, say, 1 employee, its pretty obvious if "it" (careful
choice of gender-neutral terminology. oddly it seems to upset people)
is a drongo. Now try 100 employees - not so hard for the drongo to hide.


Not necessarily. When I ran a division with about 100 employees the
co-workers would quickly start complaining if someone wasn't pulling
their weight. Simply because repeated failures would reflect back on the
group. Not so much from me but from peers.
TEAM - Teams Encourage Absolute Mediocrity.

In some of the larger places I have worked, I have seen (or been on the
receiving end of) complaints about cow-orkers not pulling their weight
used to discipline or even sack those complaining. Regardless of the
veracity of the complaints.

Conversely, I have worked with a couple of teams where the dynamic was
incredible - work hard, play hard and produce fantastic results in
record time. Ah, the difference a good manager can make.


My youth was spent adjusting the clocks on other peoples VCRs, for
precisely the reasons you outline. As I was a programmer back then
(thankfully I saw the light and converted to analogue design) it was
fairly trivial for me to think like the machine (so to speak) and
thereby figure out how to operate it. Grandma, OTOH, never had a chance.


I wonder when the designers of VCRs ever figure this out.
<snort>

by far the most annoying appliance I have ever owned was a(smeg IIRC)
stove, that wouldnt work after power-cycling until the *FUCKING* clock
had been set. Luckily a long thin stiff thing (thinner than the tines
on a fork) was required to set the clock, thereby ensuring it was nigh
impossible. Just to maximise frustration, we would have a power outage
every week or two. I would have happily murdered the prick who though
that "feature" up. Its not that uncommon, either.


A clear case of "nerd design". Probably someone designed it who didn't
have the foggiest idea of how to think like a user.
I used to have a desk pad with "how is what you are doing adding to
company profitability" written on it, or words to that effect.
Designers should have a mantra like "how is this going to enhance the
usability of the product?", to which the answer for said clock-setting
requirement would be "in a negative manner, by infuriating the customer
and adding no functionality whatsoever." At which point the instigator
should be beaten senseless with a bag of spinach.

The *worst* example of this was the twit who built a suicide function
into an IP67/IP68 sealed module (16 screws to remove and open the
module; up to 2000 modules per product). Send this command (luckily a
global broadcast command) and each module shuts off its comms link and
writes that state to EEPROM, so power-cycling can't fix it. The only
cure was to remove the back and plug into the diagnostic serial port to
bring it back to life. Needless to say, a programmer accidentally sent
the suicide command to a customer's product, killing 1000 modules. When
asked WTF he was thinking, the implementer shrugged and said "I had to
make a choice, so I chose to write the state to EEPROM". He should have
been fired.....


I sampled a $40,000 LeCroy scope a few years back - it crashed within
a couple of hours of use, and had to be rebooted. Unsurprisingly, I
didnt buy it. The likelihood of me buying a WinCE instrument is
slightly less than that of my garden spontaneously transmuting into gold.


ROFL! Same here.

Windows is the laughing stock of the industry, not the measure of it.
"Unknown error 0x80000000" indeed!



Absolutely. What makes matters worse is that the techniques for
writing good OS' have been known for DECADES.


But the folks who were truly able to write a reliable OS such as DOS are
retired or have passed away. Same with radios and other stuff. Grandma's
tube set or my old Astor BPJ from 1959 run circles around our new
stereo. It seems that "modern" engineers don't even know how to design a
reasonable tuner or IF circuit anymore. So, even if I had to spend lots
of $$ to replace a tube in that Astor I'd do it in a heartbeat. However,
the original set from day one still worketh...

I've actually seen quite a few programmers come horribly unstuck on
control systems, something that appears to be almost entirely lacking
from CS educations. I guess thats what God invented EEs for :)


Well, you have to have a good dose of HW understanding to succeed in
embedded or control. At least on the chip level. Such as "timer A needs
at least x clock cycles to properly issue the interrupt and the next
compare register setting should be this much away". Violate stuff like
that and chances are that not even the experts would know what might
happen.

Regards, Joerg
Reminds me of the guy who wrote some 8051 code for me. He used C, and it
used 32-bit numbers for booleans! After a couple of weeks, he came and
complained about the lack of RAM.....
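
For comparison, a sketch of the two approaches in Keil C51-style C.
(SDCC spells the one-bit type __bit, and the exact byte counts depend
on the compiler and memory model - treat the numbers as illustrative.)

/* A flag declared "long" eats 4 bytes of the 8051's tiny internal RAM.
 * The compiler's bit type costs 1 bit each, packed into the
 * bit-addressable area at 20h-2Fh (16 bytes = 128 flags). */
long running_flag_bad;   /* 4 bytes of data RAM for a yes/no answer */
long fault_flag_bad;     /* 4 more bytes                            */

bit  running_flag;       /* 1 bit                                   */
bit  fault_flag;         /* 1 bit                                   */

void update_flags(unsigned char status)
{
    running_flag = (status & 0x01) != 0;
    fault_flag   = (status & 0x02) != 0;
}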

Cheers
Terry
 
"John Larkin" <xxxxxxxxxx> schreef in
bericht news:8n9h31l7m42ghgs7qc77fr0eh1ld4fhkq6@4ax.com...

No anti-spam in your email address.

--
Thanks, Frank.
(remove 'q' and 'invalid' when replying by email)
 
On Wed, 16 Mar 2005 21:16:32 GMT, Joerg
<notthisjoergsch@removethispacbell.net> wrote:

Hello Jim,

And how many of Bob Pease's "tweaked" circuits have you seen that
would pass muster in a production environment?

Didn't he design the LM331? That, for example, is a remarkable chip
especially when considering how long ago it was conceived.

Same for some other "paper and pencil" designers like the late Robert
Widlar. His ideas were quite clever. Maybe with the exception of that
newspaper he supposedly set on fire at a trade show...

Regards, Joerg

http://www.analogconsultants.com
I have nothing against pencil and paper. Most of the chips on my
website were designed that way. It's only been since ~1985 that I've
used Spice simulators extensively.

And I don't really DESIGN using Spice, just verify.

...Jim Thompson
--
| James E.Thompson, P.E. | mens |
| Analog Innovations, Inc. | et |
| Analog/Mixed-Signal ASIC's and Discrete Systems | manus |
| Phoenix, Arizona Voice:(480)460-2350 | |
| E-mail Address at Website Fax:(480)460-2142 | Brass Rat |
| http://www.analog-innovations.com | 1962 |

I love to cook with wine. Sometimes I even put it in the food.
 
Hello Terry,

In some of the larger places I have worked, I have seen (or been on the
receiving end of) complaints about cow-orkers not pulling their weight
used to discipline or even sack those complaining. Regardless of the
veracity of the complaints.
That wouldn't have been tolerated in our company.

reminds me of the guy who wrote some 8051 code for me. He used C, and it
used 32-bit numbers for booleans! After a couple of weeks, he came and
complained about the lack of RAM.....
Did he ever declare? Or did he just let it all default? I mean, that
should be part of the code...

Regards, Joerg

http://www.analogconsultants.com
 
Joerg wrote:
Hello Terry,

In some of the larger places I have worked, I have seen (or been on
the receiving end of) complaints about cow-orkers not pulling their
weight used to discipline or even sack those complaining. Regardless
of the veracity of the complaints.


That wouldn't have been tolerated in our company.
Nor should it be, in *any* company.

reminds me of the guy who wrote some 8051 code for me. He used C, and
it used 32-bit numbers for booleans! After a couple of weeks, he came
and complained about the lack of RAM.....


Did he ever declare? Or did he just let it all default? I mean, that
should be part of the code...

Regards, Joerg
All defaults. To be fair, his last embedded code job had 32Mb SRAM. He
used up almost all of the 64kb flash, too - most of which I wanted
reserved for data.

So we sat down and nutted out a more detailed spec, putting upper limits
on flash & RAM utilisation, bounding the required response times, etc. A
lot of the problem was my fault for making assumptions. His 2nd cut at
the code was a lot better.

Cheers
Terry
 
Tim Wescott wrote:
Jim Thompson wrote:

On Wed, 16 Mar 2005 21:16:32 GMT, Joerg
<notthisjoergsch@removethispacbell.net> wrote:


Hello Jim,


And how many of Bob Pease's "tweaked" circuits have you seen that
would pass muster in a production environment?


Didn't he design the LM331? That, for example, is a remarkable chip
especially when considering how long ago it was conceived.

Same for some other "paper and pencil" designers like the late Robert
Widlar. His ideas were quite clever. Maybe with the exception of that
newspaper he supposedly set on fire at a trade show...

Regards, Joerg

http://www.analogconsultants.com



I have nothing against pencil and paper. Most of the chips on my
website were designed that way. It's only been since ~1985 that I've
used Spice simulators extensively.

And I don't really DESIGN using Spice, just verify.

...Jim Thompson


Oh, haven't you seen the new .DESIGN card? I don't have the syntax
here, but an example would go something like:

.DESIGN "I want to have a circuit that will drive an actuator that
will spritz a measured amount of air freshener once every
ten minutes or so. I don't know what the specifications of
the actuator are, or whether that's 30 seconds to 1 hour or ten
minutes plus or minus 1 second, but I know I need it by next
Tuesday"

Then SPICE spits out a schematic.
Hasn't Kevin already implemented that in SuperSpice ?

Cheers
Terry
 
On Wed, 16 Mar 2005 23:39:45 +0100, "Frank Bemelman"
<f.bemelmanq@xs4all.invalid.nl> wrote:

"John Larkin" <xxxxxxxxxx> schreef in
bericht news:8n9h31l7m42ghgs7qc77fr0eh1ld4fhkq6@4ax.com...

No anti-spam in your email address.

Oops, I just installed Agent on my new Dell and forgot to mung the
address. I should do something new so everybody will have to plonk me
again.

Thanks.

John
 
