Writing to MCU flash



Guest

Mon Jan 28, 2019 6:45 pm   



David Brown stated
<david.brown_at_hesbynett.no> wrote in <q2n5en$965$1_at_dont-email.me>:

Quote:
I am not a beginner at programming - but I can see that you hadn't
learned much about reliable coding before writing that jumble. You
should learn a little about interfaces and implementations, and how to
write them in C.


Well, the code is from 1998 (really); some minor things changed in Usenet later,
so some minor things were changed in the code.
In all those years nothing bad happened. I have a database that goes back to the year 2000 or so
(the old one was lost when a hard disk was dropped), and maybe 18 crashes in all those 18 years.
I know why, and it does not bother me.
So, now compare that to, eeehhh, where was it, Redmond... stuff?

Quote:
I'd recommend you open your eyes and see what has happened in the
industry in the last 30 years or so. I agree that not all of it is
good, but many things have moved forward.


Nah, I've contributed some to that...


Quote:
I write embedded code for a living, not just toys for fun. I don't get
to publish my clients' code.


I know, everything you do is secret and perfect.

Quote:
But I'd be embarrassed to have something like that newsreader code
published in my name.


I would be embarrassed too if you published my code in your name :-)
Without mentioning the source, that is.

There was a website long ago that published a clone of my xste (subtitle editor);
every time I added some feature, they did too (for money, that is).
Then I announced a feature, it was for a working group, and they announced it too.
The requirements changed and that feature was no longer needed, so I never implemented it.
That website disappeared, like snow in the sun.
Yes, many derivatives of software I wrote were made by many people all over the world,
with my consent.
Open source is fun.
There are always the jealous types; I remember people writing 'that can never work' and stuff like that,
'it has memory leaks all over it' (about the subtitle software), and crap like that.
Some had a very specific business interest in attacking.
None of those are still around, AFAIK.
You should try open source too.
People will rip your code apart, and some people will have requests (I implemented several in NewsFleX).
Read open source code, share it; that is the right way. All your secrecy, to me, means it is unverifiable
and of no value.
Your attacks are not specific, and hold no ground.
You hide.
:-)

John Larkin
Guest

Mon Jan 28, 2019 7:45 pm   



On Mon, 28 Jan 2019 13:06:40 -0500, Phil Hobbs
<pcdhSpamMeSenseless_at_electrooptical.net> wrote:

Quote:
On 1/28/19 12:02 PM, Tom Gardner wrote:
On 28/01/19 16:38, John Larkin wrote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines
of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(

I've seen too many that only have a vague concept of
what a cache is. And even getting them to outline what
a processor does during an unoptimised subroutine call
can be like pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought
"life is too short".


We had a weird problem last week. An ARM talks to an FPGA which
implements a bunch of DDS sine wave generators, driving a mess of
serial DACs. The sinewaves had weird erratic spikey glitches, which
were suspected to be SPI transmission-line problems, but weren't.

Much experimenting and thinking led us to the real problem: a VOLATILE
declaration in c wasn't always working, so the sinewave amplitude
multiplier values would occasionally get zeroed. One clue was that the
glitches were erratic but quantized to 1 ms time ticks, and the ARM
runs a 1 KHz interrupt.

I solved the problem by applying the universal principle of "always
blame the software."

That's trivially valid if you include the VHDL/Verilog
code in the definition of "software" :)

Now there's a giveaway. You don't even believe in hardware!

Cheers

Phil Hobbs


We almost always get our PC boards right first try.

Code is almost always wrong as written, and needs many iterations to
get to mostly bug-free. VHDL is better than C, mainly because my VHDL
guys include a lot of test benching and don't consider the code done
until it simulates right. But the first as-typed code is usually
wrong.

The idea is that the easier it is to change something, the less effort
will be invested in getting it right.

Fun: google spreadsheet errors

Turns out that most spreadsheets are wrong. Billion dollar mistakes
have been made.


--

John Larkin Highland Technology, Inc
picosecond timing precision measurement

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com

John Larkin
Guest

Mon Jan 28, 2019 7:45 pm   



On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
<spamjunk_at_blueyonder.co.uk> wrote:

Quote:
On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(


If A is a float,

A=A+3

could be a big deal.
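
As a concrete illustration of why that innocent-looking line can be a big deal (a minimal sketch, assuming IEEE 754 single precision; not anyone's actual code): once A is large enough, the 3 falls below the float's resolution and the addition does nothing at all.

    #include <stdio.h>

    int main(void)
    {
        /* With a 24-bit significand, single-precision floats near 1e8 are
           spaced 8 apart, so adding 3 rounds away to nothing. */
        float a = 100000000.0f;          /* 1e8, exactly representable */
        float b = a + 3.0f;

        printf("a       = %.1f\n", a);
        printf("a + 3   = %.1f\n", b);
        printf("changed = %s\n", (b != a) ? "yes" : "no");   /* prints "no" */
        return 0;
    }

The same line with a double, or with a smaller A, behaves as expected, which is exactly why it trips people up.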


--

John Larkin Highland Technology, Inc
picosecond timing precision measurement

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com

Phil Hobbs
Guest

Mon Jan 28, 2019 7:45 pm   



On 1/28/19 12:02 PM, Tom Gardner wrote:
Quote:
On 28/01/19 16:38, John Larkin wrote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines
of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(

I've seen too many that only have a vague concept of
what a cache is. And even getting them to outline what
a processor does during an unoptimised subroutine call
can be like pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought
"life is too short".


We had a weird problem last week. An ARM talks to an FPGA which
implements a bunch of DDS sine wave generators, driving a mess of
serial DACs. The sinewaves had weird erratic spikey glitches, which
were suspected to be SPI transmission-line problems, but weren't.

Much experimenting and thinking led us to the real problem: a VOLATILE
declaration in c wasn't always working, so the sinewave amplitude
multiplier values would occasionally get zeroed. One clue was that the
glitches were erratic but quantized to 1 ms time ticks, and the ARM
runs a 1 KHz interrupt.

I solved the problem by applying the universal principle of "always
blame the software."

That's trivially valid if you include the VHDL/Verilog
code in the definition of "software" :)


Now there's a giveaway. You don't even believe in hardware!

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
https://hobbs-eo.com

Phil Hobbs
Guest

Mon Jan 28, 2019 8:45 pm   



On 1/28/19 1:34 PM, John Larkin wrote:
Quote:
On Mon, 28 Jan 2019 13:06:40 -0500, Phil Hobbs
pcdhSpamMeSenseless_at_electrooptical.net> wrote:

On 1/28/19 12:02 PM, Tom Gardner wrote:
On 28/01/19 16:38, John Larkin wrote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what
their lines of code do.

Older people in the embedded world, yes. Younger people in
other fields - sometimes :(

I've seen too many that only have a vague concept of what a
cache is. And even getting them to outline what a processor
does during an unoptimised subroutine call can be like
pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought "life is too
short".


We had a weird problem last week. An ARM talks to an FPGA
which implements a bunch of DDS sine wave generators, driving a
mess of serial DACs. The sinewaves had weird erratic spikey
glitches, which were suspected to be SPI transmission-line
problems, but weren't.

Much experimenting and thinking led us to the real problem: a
VOLATILE declaration in c wasn't always working, so the
sinewave amplitude multiplier values would occasionally get
zeroed. One clue was that the glitches were erratic but
quantized to 1 ms time ticks, and the ARM runs a 1 KHz
interrupt.

I solved the problem by applying the universal principle of
"always blame the software."

That's trivially valid if you include the VHDL/Verilog code in
the definition of "software" :)

Now there's a giveaway. You don't even believe in hardware!

Cheers

Phil Hobbs

We almost always get our PC boards right first try.

Code is almost always wrong as written, and needs many iterations to
get to mostly bug-free. VHDL is better than C, mainly because my VHDL
guys include a lot of test benching and don't consider the code done
until it simulates right. But the first as-typed code is usually
wrong.


Test-driven design is a software development technique that applies that
idea to code. You write a description of each function, write a unit
test for it, and then write the function.

My son and colleague Simon (formerly known as Dashing Firmware
Hunchback) built a build-and-testing framework that he uses for
that--the firmware gets mostly debugged on x86, with functions that log
and mimic the peripherals. It's way faster than doing it on the target,
and of course it can be going on in parallel with the target hardware
development. (He's a much more orderly fellow than I am.)
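
As a rough sketch of that style of host-side testing (the driver function, the mock and all names here are illustrative, not Simon's actual framework): the peripheral access is replaced by a logging stub, so the driver logic can be unit-tested on an x86 host long before target hardware exists.

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Mock of the peripheral: logs DAC writes instead of touching hardware. */
    static uint32_t dac_log[16];
    static int      dac_log_count;

    static void dac_write(uint32_t value)    /* stand-in for the real SPI write */
    {
        dac_log[dac_log_count++] = value;
    }

    /* Unit under test: written against the mock, later linked against the
       real dac_write() on the target. */
    static void dac_ramp(uint32_t start, uint32_t step, int n)
    {
        for (int i = 0; i < n; i++)
            dac_write(start + (uint32_t)i * step);
    }

    /* The unit test itself, runnable on the host. */
    int main(void)
    {
        dac_ramp(100, 10, 4);
        assert(dac_log_count == 4);
        assert(dac_log[0] == 100 && dac_log[3] == 130);
        printf("dac_ramp: all checks passed\n");
        return 0;
    }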

Quote:

The idea is that the easier it is to change something, the less
effort will be invested in getting it right.

Fun: google spreadsheet errors

Turns out that most spreadsheets are wrong. Billion dollar mistakes
have been made.


You betcha. The last spreadsheet I used for anything except reading
other people's CSVs was Visicalc, circa 1982. I took one look at it,
realized that spreadsheets were impossible to debug, and chucked the
whole notion.

I do plots using text files + Gnuplot, feasibility calculations with
an old version of Mathcad, and otherwise I mostly write something.

Cheers

Phil Hobbs



--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
https://hobbs-eo.com


Guest

Mon Jan 28, 2019 8:45 pm   



Phil Hobbs wrote
Quote:
I do plots using text files + Gnuplot, feasibility calculations with
an old version of Mathcad, and otherwise I mostly write something.


Yes, gnuplot is cool; I use it a lot.
Some Linux programs support output for it;
for example the 'sox' audio processor can show the frequency characteristics of its filters in gnuplot,
and GNU Octave (a Matlab equivalent) also supports it.

From "man sox":

    --plot gnuplot|octave|off
        If not set to off (the default if --plot is not given), run in a mode that can be
        used, in conjunction with the gnuplot program or the GNU Octave program, to assist
        with the selection and configuration of many of the transfer-function based effects.
        For the first given effect that supports the selected plotting program, SoX will
        output commands to plot the effect's transfer function, and then exit without
        actually processing any audio. E.g.

            sox --plot octave input-file -n highpass 1320 > highpass.plt
            octave highpass.plt

David Brown
Guest

Mon Jan 28, 2019 9:45 pm   



On 28/01/2019 18:44, 698839253X6D445TD_at_nospam.org wrote:
Quote:
David Brown stated
david.brown_at_hesbynett.no> wrote in <q2n5en$965$1_at_dont-email.me>:

I am not a beginner at programming - but I can see that you hadn't
learned much about reliable coding before writing that jumble. You
should learn a little about interfaces and implementations, and how to
write them in C.

Well, the code is from 1998 (really); some minor things changed in Usenet later,
so some minor things were changed in the code.


It would have been considered bad coding style in 1998, just as it
is now.

Quote:
In all those years nothing bad happened. I have a database that goes back to the year 2000 or so
(the old one was lost when a hard disk was dropped), and maybe 18 crashes in all those 18 years.
I know why, and it does not bother me.
So, now compare that to, eeehhh, where was it, Redmond... stuff?


<https://en.wikipedia.org/wiki/Whataboutism>

The fact that MS has produced worse code does not make yours good.

Quote:
I'd recommend you open your eyes and see what has happened in the
industry in the last 30 years or so. I agree that not all of it is
good, but many things have moved forward.

Nah, I've contributed some to that...


I write embedded code for a living, not just toys for fun. I don't get
to publish my clients' code.

I know, everything you do is secret and perfect.

But I'd be embarrassed to have something like that newsreader code
published in my name.

I would be embarrassed too if you published my code in your name :-)


Do not worry about that.

Quote:
Without mentioning the source, that is.

There was a website long ago that published a clone of my xste (subtitle editor);
every time I added some feature, they did too (for money, that is).
Then I announced a feature, it was for a working group, and they announced it too.
The requirements changed and that feature was no longer needed, so I never implemented it.
That website disappeared, like snow in the sun.
Yes, many derivatives of software I wrote were made by many people all over the world,
with my consent.
Open source is fun.
There are always the jealous types; I remember people writing 'that can never work' and stuff like that,
'it has memory leaks all over it' (about the subtitle software), and crap like that.
Some had a very specific business interest in attacking.
None of those are still around, AFAIK.
You should try open source too.
People will rip your code apart, and some people will have requests (I implemented several in NewsFleX).
Read open source code, share it; that is the right way. All your secrecy, to me, means it is unverifiable
and of no value.
Your attacks are not specific, and hold no ground.
You hide.
:-)



David Brown
Guest

Mon Jan 28, 2019 9:45 pm   



On 28/01/2019 17:25, John Larkin wrote:
Quote:
On Mon, 28 Jan 2019 09:13:14 +0100, David Brown
david.brown_at_hesbynett.no> wrote:

On 28/01/2019 08:23, 698839253X6D445TD_at_nospam.org wrote:
David Brown opiniated
On 27/01/2019 19:34, 698839253X6D445TD_at_nospam.org wrote: >>> David Brown wrote
snip

Perhaps that all made sense to you when you wrote it - it makes no sense
to me.


That is possibly where the problem is.

It is where your problem is, yes - you write things that have no
connection with the discussion and make no sense.

I see no 'tronics from you, you almost sound like a sales person for some bloatware that blinds
people from the inner working of things,

I am an embedded programmer, not an electronics designer. (I've done a
fair bit of digital electronics design in the past - I have very little
experience of anything analogue beyond ADC's and DAC's.)

How do you interact with the electronic designers?

Do you get pulled in after the architecture is frozen, and given a
spec to implement? Or are you involved in
hardware/software/analog/digital tradeoffs from the start?


I get involved from early on, and work together with them. I have made
full schematics and pcb designs (with relatively little analogue stuff -
one transistor is okay, but if I put two transistors together I'll get
them wrong). But other people are better at it than me, so they do the
electronics design. I'll often be the main voice behind choosing the
microcontroller or other digital architecture.

Quote:

Signals and Systems is fun, especially if you have multiple ways to
implement it. A filter can be various kinds of analog, FIR in an FPGA,
IIR in uP code, whatever works.


We don't find much need for advanced filters in most of our stuff. But
yes, they can be fun.
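
As a hedged sketch of the "IIR in uP code" option mentioned above (the Q format, the coefficient and the names are illustrative, not anyone's actual filter): a one-pole low-pass whose multiply collapses to a shift.

    #include <stdint.h>

    /* One-pole IIR low-pass, y += alpha * (x - y), in Q16.16 fixed point.
       With alpha = 1/16 the multiply becomes a shift, which is why this sort
       of filter is cheap in microcontroller code.  Relies on arithmetic right
       shift of negative values, as provided by practically all compilers. */
    static int32_t lp_state;                   /* filter state, Q16.16 */

    int32_t lowpass_update(int32_t x_q16)      /* input sample, Q16.16 */
    {
        lp_state += (x_q16 - lp_state) >> 4;   /* alpha = 1/16 */
        return lp_state;
    }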

Quote:




Freaking hell a 32 bit square root in PIC asm is only a few lines man.
Your freaking compiler of whatever kind can _never_ do better.


/Anything/ is better than doing PIC assembly.

That statement proves my point.


It proves that you are stuck in the 80's.

We still, rarely, drop down to a bit of asm. It's ugly on an ARM.



Agreed. Assembly is nice on mid-level CPUs. It is ugly on braindead
devices with too few registers, too few addressing modes, too few
instructions - like the PIC, 8051, etc. It is ugly on RISC devices with
too many registers, complicated scheduling, complicated instruction
formats. The Goldilocks area is in the middle - things like the 68k or
the MSP430.

Still, C code (or C++, or Forth, or other low-level high-level
programming languages) is a great deal more efficient, as long as you
understand how the language works and how your tools work. (You also
need to know how to design good software, but that applies to all
languages.) It is unfortunate that a fair number of people don't
properly understand the languages they use, the tools they use, or their
targets.
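
For reference, a hedged sketch of what the quoted "32-bit square root in a few lines of PIC asm" looks like in portable C; this is the standard bit-by-bit method, not anyone's actual PIC code, and a decent compiler turns it into a tight shift-and-add loop.

    #include <stdint.h>

    /* Bit-by-bit 32-bit integer square root: returns floor(sqrt(x)).
       Sixteen iterations of shifts, adds and compares, no multiply, no divide. */
    uint16_t isqrt32(uint32_t x)
    {
        uint32_t root = 0;
        uint32_t bit  = 1UL << 30;          /* highest power of four <= 2^31 */

        while (bit > x)
            bit >>= 2;

        while (bit != 0) {
            if (x >= root + bit) {
                x   -= root + bit;
                root = (root >> 1) + bit;
            } else {
                root >>= 1;
            }
            bit >>= 2;
        }
        return (uint16_t)root;
    }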

David Brown
Guest

Mon Jan 28, 2019 10:45 pm   



On 28/01/2019 17:38, John Larkin wrote:
Quote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(

I've seen too many that only have a vague concept of
what a cache is. And even getting them to outline what
a processor does during an unoptimised subroutine call
can be like pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought
"life is too short".


We had a weird problem last week. An ARM talks to an FPGA which
implements a bunch of DDS sine wave generators, driving a mess of
serial DACs. The sinewaves had weird erratic spikey glitches, which
were suspected to be SPI transmission-line problems, but weren't.

Much experimenting and thinking led us to the real problem: a VOLATILE
declaration in c wasn't always working, so the sinewave amplitude
multiplier values would occasionally get zeroed. One clue was that the
glitches were erratic but quantized to 1 ms time ticks, and the ARM
runs a 1 KHz interrupt.


Was it someone who thought "volatile" meant "atomic"?

Quote:
I solved the problem by applying the universal principle of "always
blame the software."
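
To illustrate the distinction (a minimal sketch, not the actual code from that project; the names, the ISR and the critical-section stubs are all assumed): volatile forces each access to really happen, but it does not make a read-modify-write on a shared value atomic with respect to a 1 kHz interrupt.

    #include <stdint.h>

    /* Hypothetical critical-section helpers; on a Cortex-M they would wrap
       __disable_irq()/__enable_irq().  Stubbed here so the sketch stands alone. */
    static inline void enter_critical(void) { /* __disable_irq(); */ }
    static inline void exit_critical(void)  { /* __enable_irq();  */ }

    /* Amplitude multiplier shared between background code and a 1 kHz ISR.
       "volatile" makes the compiler perform every access, but load-modify-store
       is still three separate steps that the interrupt can land between. */
    static volatile uint32_t amp_mult   = 0x8000;
    static volatile uint32_t amp_target = 0x8000;

    void timer_1khz_isr(void)              /* ramps the multiplier each tick */
    {
        if (amp_mult < amp_target)
            amp_mult++;                    /* read-modify-write #1 */
    }

    void rescale(uint32_t num, uint32_t den)
    {
        /* Without the critical section, the ISR's update can interleave with
           this read-modify-write and one of the two updates is silently lost;
           the resulting glitches arrive quantized to the 1 ms interrupt period. */
        enter_critical();
        amp_mult = amp_mult * num / den;   /* read-modify-write #2 */
        exit_critical();
    }

On a 32-bit ARM a single aligned load or store of such a variable is already atomic; it is the compound update that needs protection, which is why "volatile isn't working" usually turns out to be a sharing bug rather than a compiler bug.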


Lasse Langwadt Christense
Guest

Tue Jan 29, 2019 12:45 am   



On Tuesday, 29 January 2019 at 00.01.29 UTC+1, Tom Gardner wrote:
Quote:
On 28/01/19 18:06, Phil Hobbs wrote:
On 1/28/19 12:02 PM, Tom Gardner wrote:
On 28/01/19 16:38, John Larkin wrote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines
of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(

I've seen too many that only have a vague concept of
what a cache is. And even getting them to outline what
a processor does during an unoptimised subroutine call
can be like pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought
"life is too short".


We had a weird problem last week. An ARM talks to an FPGA which
implements a bunch of DDS sine wave generators, driving a mess of
serial DACs. The sinewaves had weird erratic spikey glitches, which
were suspected to be SPI transmission-line problems, but weren't.

Much experimenting and thinking led us to the real problem: a VOLATILE
declaration in c wasn't always working, so the sinewave amplitude
multiplier values would occasionally get zeroed. One clue was that the
glitches were erratic but quantized to 1 ms time ticks, and the ARM
runs a 1 KHz interrupt.

I solved the problem by applying the universal principle of "always
blame the software."

That's trivially valid if you include the VHDL/Verilog
code in the definition of "software" :)

Now there's a giveaway. You don't even believe in hardware!

More accurately, I don't see a solid divide
between /digital/ hardware and software. They
are both similar technologies for implementing
devices, and frequently either could be used.

I used to think RF was fundamentally different,
but even that is eroding with modern ADCs, DACs
and DSP!

But it is fun to get people with a more limited
background into a pub, and to get them to try
to unambiguously define the difference. Ditto
pornography/art and life/death.


https://en.wikipedia.org/wiki/I_know_it_when_I_see_it

Tom Gardner
Guest

Tue Jan 29, 2019 12:45 am   



On 28/01/19 18:26, John Larkin wrote:
Quote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(

If A is a float,

A=A+3

could be a big deal.


That's "easy" in that it is more or less a
language-independent issue.

C/C++ has far more traps, and far more subtle ones. That's
been known from the earliest days: the /second/ C book
(the "C Puzzle Book") was about some of them. With
language and processor changes over the decades, the
problem has only become worse.

And I'll skip over the philosophical aspects of
there even being an Obfuscated C contest. Some
of those entries are truly remarkable in many
ways!
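
One small example of the sort of trap meant here (an illustration, not one of the original Puzzle Book exercises): the usual arithmetic conversions make a signed/unsigned comparison do something most readers don't expect.

    #include <stdio.h>

    int main(void)
    {
        unsigned int len = 10;
        int i = -1;

        /* The usual arithmetic conversions convert i to unsigned, turning -1
           into UINT_MAX, so the comparison below is false. */
        if (i < len)
            printf("-1 < 10, as you'd expect\n");
        else
            printf("surprise: -1 is not less than 10u here\n");  /* this runs */

        return 0;
    }

Most compilers will warn about this nowadays, but only if the relevant warning is switched on.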

Tom Gardner
Guest

Tue Jan 29, 2019 12:45 am   



On 28/01/19 18:45, Phil Hobbs wrote:
Quote:
On 1/28/19 1:34 PM, John Larkin wrote:
On Mon, 28 Jan 2019 13:06:40 -0500, Phil Hobbs
pcdhSpamMeSenseless_at_electrooptical.net> wrote:

On 1/28/19 12:02 PM, Tom Gardner wrote:
On 28/01/19 16:38, John Larkin wrote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what
their lines of code do.

Older people in the embedded world, yes. Younger people in
other fields - sometimes :(

I've seen too many that only have a vague concept of what a
cache is. And even getting them to outline what a processor
does during an unoptimised subroutine call can be like
pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought "life is too
short".


We had a weird problem last week. An ARM talks to an FPGA
which implements a bunch of DDS sine wave generators, driving a
mess of serial DACs. The sinewaves had weird erratic spikey
glitches, which were suspected to be SPI transmission-line
problems, but weren't.

Much experimenting and thinking led us to the real problem: a
VOLATILE declaration in c wasn't always working, so the
sinewave amplitude multiplier values would occasionally get
zeroed. One clue was that the glitches were erratic but
quantized to 1 ms time ticks, and the ARM runs a 1 KHz
interrupt.

I solved the problem by applying the universal principle of
"always blame the software."

That's trivially valid if you include the VHDL/Verilog code in
the definition of "software" :)

Now there's a giveaway. You don't even believe in hardware!

Cheers

Phil Hobbs

We almost always get our PC boards right first try.

Code is almost always wrong as written, and needs many iterations to
get to mostly bug-free. VHDL is better than C, mainly because my VHDL
guys include a lot of test benching and don't consider the code done
until it simulates right. But the first as-typed code is usually
wrong.

Test-driven design is a software development technique that applies that
idea to code. You write a description of each function, write a unit
test for it, and then write the function.

My son and colleague Simon (formerly known as Dashing Firmware
Hunchback) built a build-and-testing framework that he uses for
that--the firmware gets mostly debugged on x86, with functions that log
and mimic the peripherals. It's way faster than doing it on the target,
and of course it can be going on in parallel with the target hardware
development. (He's a much more orderly fellow than I am.)


I first (triumphantly re)invented that process in 1982.

It remains a source of pleasurable bewilderment that
it is regarded as novel (and that people can make a
living proselytising it).


Quote:
The idea is that the easier it is to change something, the less
effort will be invested in getting it right.

Fun: google spreadsheet errors

Turns out that most spreadsheets are wrong. Billion dollar mistakes
have been made.

You betcha. The last spreadsheet I used for anything except reading
other people's CSVs was Visicalc, circa 1982. I took one look at it,
realized that spreadsheets were impossible to debug, and chucked the
whole notion.


They are like all the other domain-specific languages, the
ones touted as allowing non-programmers to avoid
having to interact with the IT department.

Programs start small and well within the relevant
domain. They gradually expand (and become cancerous
growths) and go outside the domain (and fall over
a cliff).


Quote:
I do plots using text files + Gnuplot, feasibility calculations with
an old version of Mathcad, and otherwise I mostly write something.


Yes, it is difficult to beat those tools and that process.

Tom Gardner
Guest

Tue Jan 29, 2019 12:45 am   



On 28/01/19 18:06, Phil Hobbs wrote:
Quote:
On 1/28/19 12:02 PM, Tom Gardner wrote:
On 28/01/19 16:38, John Larkin wrote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines
of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(

I've seen too many that only have a vague concept of
what a cache is. And even getting them to outline what
a processor does during an unoptimised subroutine call
can be like pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought
"life is too short".


We had a weird problem last week. An ARM talks to an FPGA which
implements a bunch of DDS sine wave generators, driving a mess of
serial DACs. The sinewaves had weird erratic spikey glitches, which
were suspected to be SPI transmission-line problems, but weren't.

Much experimenting and thinking led us to the real problem: a VOLATILE
declaration in c wasn't always working, so the sinewave amplitude
multiplier values would occasionally get zeroed. One clue was that the
glitches were erratic but quantized to 1 ms time ticks, and the ARM
runs a 1 KHz interrupt.

I solved the problem by applying the universal principle of "always
blame the software."

That's trivially valid if you include the VHDL/Verilog
code in the definition of "software" :)

Now there's a giveaway. You don't even believe in hardware!


More accurately, I don't see a solid divide
between /digital/ hardware and software. They
are both similar technologies for implementing
devices, and frequently either could be used.

I used to think RF was fundamentally different,
but even that is eroding with modern ADCs, DACs
and DSP!

But it is fun to get people with a more limited
background into a pub, and to get them to try
to unambiguously define the difference. Ditto
pornography/art and life/death.

John Larkin
Guest

Tue Jan 29, 2019 4:45 am   



On Mon, 28 Jan 2019 17:02:41 +0000, Tom Gardner
<spamjunk_at_blueyonder.co.uk> wrote:

Quote:
On 28/01/19 16:38, John Larkin wrote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(

I've seen too many that only have a vague concept of
what a cache is. And even getting them to outline what
a processor does during an unoptimised subroutine call
can be like pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought
"life is too short".


We had a weird problem last week. An ARM talks to an FPGA which
implements a bunch of DDS sine wave generators, driving a mess of
serial DACs. The sinewaves had weird erratic spikey glitches, which
were suspected to be SPI transmission-line problems, but weren't.

Much experimenting and thinking led us to the real problem: a VOLATILE
declaration in c wasn't always working, so the sinewave amplitude
multiplier values would occasionally get zeroed. One clue was that the
glitches were erratic but quantized to 1 ms time ticks, and the ARM
runs a 1 KHz interrupt.

I solved the problem by applying the universal principle of "always
blame the software."

That's trivially valid if you include the VHDL/Verilog
code in the definition of "software" :)

In cases such as yours, it can be "entertaining" to
determine whether to assign the root-cause to the
programmer, the compiler, or the language.


The programmer applied the universal principle of "always blame the
compiler."


--

John Larkin Highland Technology, Inc

lunatic fringe electronics

Tom Gardner
Guest

Tue Jan 29, 2019 10:45 am   



On 29/01/19 03:20, John Larkin wrote:
Quote:
On Mon, 28 Jan 2019 17:02:41 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 16:38, John Larkin wrote:
On Mon, 28 Jan 2019 08:27:19 +0000, Tom Gardner
spamjunk_at_blueyonder.co.uk> wrote:

On 28/01/19 06:30, David Brown wrote:
People who write C code professionally usually know what their lines of code do.

Older people in the embedded world, yes. Younger people
in other fields - sometimes :(

I've seen too many that only have a vague concept of
what a cache is. And even getting them to outline what
a processor does during an unoptimised subroutine call
can be like pulling teeth.

PEBCAK in a stark form.


/Anything/ is better than doing PIC assembly.

I once looked at a PIC's assembler, and thought
"life is too short".


We had a weird problem last week. An ARM talks to an FPGA which
implements a bunch of DDS sine wave generators, driving a mess of
serial DACs. The sinewaves had weird erratic spikey glitches, which
were suspected to be SPI transmission-line problems, but weren't.

Much experimenting and thinking led us to the real problem: a VOLATILE
declaration in c wasn't always working, so the sinewave amplitude
multiplier values would occasionally get zeroed. One clue was that the
glitches were erratic but quantized to 1 ms time ticks, and the ARM
runs a 1 KHz interrupt.

I solved the problem by applying the universal principle of "always
blame the software."

That's trivially valid if you include the VHDL/Verilog
code in the definition of "software" :)

In cases such as yours, it can be "entertaining" to
determine whether to assign the root-cause to the
programmer, the compiler, or the language.

The programmer applied the universal principle of "always blame the
compiler."


Regrettably, with complicated languages and
complicated ISAs, that isn't a completely
ridiculous attitude.

Trust, but verify.
