Conical inductors--still $10!...

On 15/08/20 03:51, Les Cargill wrote:
Tom Gardner wrote:
On 14/08/20 04:13, Les Cargill wrote:
Tom Gardner wrote:
Rust and Go are showing significant promise in the
marketplace,

Mozilla seems to have dumped at least some of the Rust team:

https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/


I doubt they will remain unemployed. Rust is gaining traction
in wider settings.


I dunno - I can't separate the messaging from the offering. I'm
fine with a C/C++ compiler so I have less than no incentive to
even become remotely literate about Rust.

The Rustaceans seem obsessed with stuff my cohort (read: old people)
learned six months into their first C project. But there may
well be benefits I don't know about.

Too many people /think/ they know C.

I first used C in ~81, and learned it from the two
available books, which I still have. The second book
was, of course, a book on traditional mistakes in C:
"The C Puzzle Book".

It is horrifying that Boehm thought it worth writing this
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf
and that it surprised many C practitioners.

Rust directly addresses some of the pain points.


It is not-not a thing; the CVE list shows that. I am just appalled
that these defects are released.

If you think C and C++ languages and implementations
are fault-free, I'd like to visit your planet sometime :)

You can start with the C++ FQA http://yosefk.com/c++fqa/

Watching (from a distance) the deliberations of the C/C++
committees in the early 90s was enlightening, in a bad way.
One simple debate (which lasted years) was whether it ought
to be possible or impossible to "cast away constness".
There are good reasons for both, and they cannot be
reconciled.
(Yes: to allow debuggers and similar tools to inspect memory.
No: to enable safe aggressive optimisations)
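
A hedged C++ sketch of those two positions (the names are invented for
illustration): the "debugger-style" cast that pokes at const data is
undefined behaviour precisely because the optimiser is allowed to
assume const objects never change.

#include <iostream>

void legacy_api(char* s) { (void)s; }     // old API: takes char*, never writes

int main() {
    const char* msg = "hello";
    legacy_api(const_cast<char*>(msg));   // the pragmatic use: const-unaware APIs

    const int answer = 42;
    int* p = const_cast<int*>(&answer);
    *p = 7;                               // UB: 'answer' was defined const
    std::cout << answer << "\n";          // an optimiser may legally still print 42
}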



Linus Torvalds is vociferously and famously opposed to having
C++ anywhere near the Linux kernel (good taste IMNSHO).

Don't take any cues from Linus Torvalds. He's why my deliverables
at one gig were patch files. I've no objection to that but geez...

And C++ is Just Fine. Now. It took what, 20 years?

Worse: 30 years!

I first used it in '88, and thought it a regression
over other available languages.


The reasons for "no C++ in the kernel" are quite serious, valid and worthy of
our approval.

He
has given a big hint he wouldn\'t oppose Rust, by stating that
if it is there it should be enabled by default.

https://www.phoronix.com/scan.php?page=news_item&px=Torvalds-Rust-Kernel-K-Build


I've seen this movie before. It's yet another This Time It's Different
approach.

Oh, we've all seen too many examples of that, in hardware
and software! The trick is recognising which bring worthwhile
practical and novel capabilities to the party. Most don't;
a very few do.

The jury is out w.r.t. Rust and Go, but they are worth watching.
 
On 14/08/20 04:35, Les Cargill wrote:
jlarkin@highlandsniptechnology.com wrote:
On Wed, 12 Aug 2020 08:33:20 +0100, Martin Brown
<'''newspam'''@nonad.co.uk> wrote:

snip

The real dollar cost of bad software is gigantic. There should be no
reason for a small or mid-size company to continuously pay IT security
consultants, or to run AV software.


It's not even accounted for, nor is it an actual cost in the usual sense
of the word - nobody's trying to make this actually less expensive.


C invites certain dangerous practices that attackers ruthlessly exploit,
like loops that copy until they hit a null byte.

Let bad programs malfunction or crash. But don't allow a stack or
buffer overflow to poke exploits into code space. The idea of
separating data, code, and stack isn't hard to understand, or even
hard to implement.
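
A hedged sketch of that hazard in C++ (both functions are invented for
illustration): an unbounded copy-until-NUL next to a bounded one.

#include <cstdio>
#include <cstring>

void risky(const char* input) {
    char buf[16];
    std::strcpy(buf, input);                      // overruns buf if input has >= 16 chars
    std::printf("%s\n", buf);
}

void safer(const char* input) {
    char buf[16];
    std::snprintf(buf, sizeof buf, "%s", input);  // truncates, always NUL-terminates
    std::printf("%s\n", buf);
}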

We probably need to go to pseudocode-only programs. The machine needs
to be protected from programmers and from bad architectures. Most
programmers never learn about machine-level processes.


That's what "managed languages" like Java or C# do. It's all bytecode
in a VM.

Java is, C# isn't.

During installation C# assemblies are compiled into code
optimised for the specific processor. Of course those
optimisations can only be based on what the compiler/installer
guesses the code will do at runtime.

I've wondered (without conclusion) whether that is why
it takes so long to install Windows updates, compared
with Linux updates.

Caveat: I haven't followed C# since the designer (Anders
Hejlsberg) gave us a lecture at HPLabs, just as C# was
being released.

None of us were impressed, correctly regarding it as a
proprietary me-too Java knockoff with slightly different
implementation choices.
 
Am 15.08.20 um 05:13 schrieb Les Cargill:

Never mind the bandwidth of an NVMe M.2 SSD. It's several orders
of magnitude faster than the interface to a nice graphics card.

\"Never underestimate the bandwidth of a station wagon full of mag tapes\"

I think that was Andy Tanenbaum in \"Structured Computer Organization\",
35 years ago.


Gerhard
 
Am 15.08.20 um 05:04 schrieb Les Cargill:
Tom Gardner wrote:

It also goes lower than that. The processor internally decomposes
x86 ISA instructions into sequences of simpler micro operations that
are invisible externally. Yup, microcode :)



I'm not 100% sure what you mean by that but it's way less than
important. We all know that x86 is microcoded. So was the HP3000.

Yes, early x86 was microcoded.

Calling the workings of current x86 microcoded is such an
oversimplification that it's completely wrong. Analyzing entire
sequences of x86 instructions for data dependencies in hardware,
feeding them to multiple ALUs and hundreds of internal renaming
registers has nothing in common with good old 360-style
microprogramming.


cheers, Gerhard
 
Reminds me of the NASA website in the late 1990s: there was an advisory that some library photos were not available for immediate download and were subject to a 20-minute retrieval time, presumably while the tape was loaded.

piglet
 
On Saturday, August 15, 2020 at 5:28:20 AM UTC-4, Gerhard Hoffmann wrote:
Am 15.08.20 um 05:13 schrieb Les Cargill:

Never mind the bandwidth of an NVMe M.2 SSD. It's several orders
of magnitude faster than the interface to a nice graphics card.


\"Never underestimate the bandwidth of a station wagon full of mag tapes\"

Or a shirt pocket full of SD cards...
 
On 15/8/20 1:04 pm, Les Cargill wrote:
Tom Gardner wrote:
On 14/08/20 04:35, Les Cargill wrote:
jlarkin@highlandsniptechnology.com wrote:
On Wed, 12 Aug 2020 08:33:20 +0100, Martin Brown
\'\'\'newspam\'\'\'@nonad.co.uk> wrote:

snip

The real dollar cost of bad software is gigantic. There should be no
reason for a small or mid-size company to continuously pay IT security
consultants, or to run AV software.


It\'s not even accounted for, nor is it an actual cost in the usual sense
of the word - nobody\'s trying to make this actually less expensive.


C invites certain dangerous practices that attackers ruthlessly
exploit
like loops copying until they hit a null byte.

Let bad programs malfunction or crash. But don\'t allow a stack or
buffer overflow to poke exploits into code space. The idea of
separating data, code, and stack isn\'t hard to understand, or even
hard to implement.

We probably need to go to pseudocode-only programs. The machine needs
to be protected from programmers and from bad architectures. Most
programmers never learn about machine-level processes.


That\'s what \"managed languages\" like Java or C# do. It\'s all bytecode
in a VM.

It also goes lower than that. The processor internally decomposes
x86 ISA instructions into sequences of simpler micro operations that
are invisible externally. Yup, microcode :)



I\'m not 100% sure what you mean by that but it\'s way less than
important. We all know that x86 is microcoded. So was the HP3000.

It's not even really microcoded. It's transpiled to a different
architecture in the pipeline. No wonder they need 40+ stage pipelines
for that.

CH
 
On a sunny day (Sat, 15 Aug 2020 21:36:27 +1000) it happened Clifford Heath
<no.spam@please.net> wrote in <%MPZG.268111$0W4.180141@fx42.iad>:

It\'s not even really microcoded. It\'s transpiled to a different
architecture in the pipeline. No wonder they need 40+ stage pipelines
for that.

The way I see those chips is as having their own OS,
complete with login and password for the NSA (forgot the password),
but who needs it...
https://arstechnica.com/information-technology/2020/06/new-exploits-plunder-crypto-keys-and-more-from-intels-ultrasecure-sgx/


https://arstechnica.com/information-technology/2020/08/intel-is-investigating-the-leak-of-20gb-of-its-source-code-and-private-data/

One would almost think they are drawing attention to themselves after that
https://arstechnica.com/gadgets/2020/05/intels-comet-lake-desktop-cpus-are-here/

Best and safest is to design your nuke on a piece of paper; we had to do it
in high school chem classes. Easy.
 
On 2020-08-15 04:21, Tom Gardner wrote:
On 15/08/20 03:51, Les Cargill wrote:
Tom Gardner wrote:
On 14/08/20 04:13, Les Cargill wrote:
Tom Gardner wrote:
Rust and Go are showing significant promise in the
marketplace,

Mozilla seems to have dumped at least some of the Rust team:

https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/


I doubt they will remain unemployed. Rust is gaining traction
in wider settings.


I dunno - I can\'t separate the messaging from the offering. I\'m
fine with a C/C++ compiler so I have less than no incentive to
even become remotely literate about Rust.

The Rustaceans seem obsessed with stuff my cohort (read: old people)
learned six months into their first C project. But there may
well be benefits I don\'t know about.

Too many people /think/ they know C.

I first used C in ~81, and learned it from the two
available books, which I still have. The second book
was, of course, a book on traditional mistakes in C
\"The C Puzzle Book\".

It is horrifying that Boehm thought it worth writing this
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf
and that it surprised many C practitioners.

Rust directly addresses some of the pain points.


It is not-not a thing; the CVE list shows that. I am just appalled
that these defects are released.

If you think C and C++ languages and implementations
are fault-free, I\'d like to visit your planet sometime :)

You can start with the C++ FQA http://yosefk.com/c++fqa/

Watching (from a distance) the deliberations of the C/C++
committees in the early 90s was enlightening, in a bad way.
One simple debate (which lasted years) was whether it ought
to be possible or impossible to "cast away constness".
There are good reasons for both, and they cannot be
reconciled.
(Yes: to allow debuggers and similar tools to inspect memory.
No: to enable safe aggressive optimisations)



Linus Torvalds is vociferously and famously opposed to having
C++ anywhere near the Linux kernel (good taste IMNSHO).

Don\'t take any cues from Linus Torvalds. He\'s why my deliverables
at one gig were patch files. I\'ve no objection to that but geez...

And C++ is Just Fine. Now. It took what, 20 years?

Worse: 30 years!

I first used it in \'88, and thought it a regression
over other available languages.

I've always liked C++. The OOP paradigm maps very naturally onto the
sorts of coding I do: embedded, instrument control, and simulations.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On 15/08/20 16:58, Phil Hobbs wrote:
On 2020-08-15 04:21, Tom Gardner wrote:
On 15/08/20 03:51, Les Cargill wrote:
Tom Gardner wrote:
On 14/08/20 04:13, Les Cargill wrote:
Tom Gardner wrote:
Rust and Go are showing significant promise in the
marketplace,

Mozilla seems to have dumped at least some of the Rust team:

https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/


I doubt they will remain unemployed. Rust is gaining traction
in wider settings.


I dunno - I can\'t separate the messaging from the offering. I\'m
fine with a C/C++ compiler so I have less than no incentive to
even become remotely literate about Rust.

The Rustaceans seem obsessed with stuff my cohort (read: old people)
learned six months into their first C project. But there may
well be benefits I don\'t know about.

Too many people /think/ they know C.

I first used C in ~81, and learned it from the two
available books, which I still have. The second book
was, of course, a book on traditional mistakes in C
\"The C Puzzle Book\".

It is horrifying that Boehm thought it worth writing this
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf
and that it surprised many C practitioners.

Rust directly addresses some of the pain points.


It is not-not a thing; the CVE list shows that. I am just appalled
that these defects are released.

If you think C and C++ languages and implementations
are fault-free, I\'d like to visit your planet sometime :)

You can start with the C++ FQA http://yosefk.com/c++fqa/

Watching (from a distance) the deliberations of the C/C++
committees in the early 90s was enlightening, in a bad way.
One simple debate (which lasted years) was whether it ought
to be possible or impossible to "cast away constness".
There are good reasons for both, and they cannot be
reconciled.
(Yes: to allow debuggers and similar tools to inspect memory.
No: to enable safe aggressive optimisations)



Linus Torvalds is vociferously and famously opposed to having
C++ anywhere near the Linux kernel (good taste IMNSHO).

Don\'t take any cues from Linus Torvalds. He\'s why my deliverables
at one gig were patch files. I\'ve no objection to that but geez...

And C++ is Just Fine. Now. It took what, 20 years?

Worse: 30 years!

I first used it in \'88, and thought it a regression
over other available languages.

I\'ve always liked C++.  The OOP paradigm maps very naturally onto the sorts of
coding I do: embedded, instrument control, and simulations.

Ditto in spades, except for C++.

I was doing a primitive version of OOP in C around '82, for
embedded machine control.

When I came across OOP in '85, I instantly recognised that
two customer statements mapped directly onto OOP:
- "I'd like three of those" => object creation
- "just like that example, except" => class hierarchy
And seeing what was possible in Smalltalk (container classes,
reflection) made me a convert!
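
A minimal sketch of that mapping in C++ (Widget and CustomWidget are
invented names, not anything from the thread):

#include <memory>
#include <vector>

struct Widget {                        // "that example"
    virtual void run() { /* default behaviour */ }
    virtual ~Widget() = default;
};

struct CustomWidget : Widget {         // "just like that example, except..." => subclass
    void run() override { /* the 'except' part goes here */ }
};

int main() {
    std::vector<std::unique_ptr<Widget>> order;
    for (int i = 0; i < 3; ++i)        // "I'd like three of those" => object creation
        order.push_back(std::make_unique<CustomWidget>());
}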

Unfortunately ParcPlace[1] Smalltalk was totally unsuited to
embedded systems, so I looked out for alternatives.

In '88 I evaluated C++ and Objective-C. The latter is really
Smalltalk without a GC, so I rapidly became productive, using
the available classes and adding my own. But C++ was dreadful;
there was no class hierarchy, and reversing any mistaken design
choice was unnecessarily painful.

Then in the early 90s, I watched the C and C++ committees
wrangling endlessly over pretty fundamental points /without/
there being /any possibility/ of adequately reconciling the
two viewpoints. (Simple example: should it be allowed or
forbidden to cast away a const declaration? There are good
arguments for both, but the necessary choice has far-reaching
implications.)

At that point I realised the language was building a bigger
castle on sand. Not a good position to be in.

Then, when I first used Java in '96, people were amazed at
how quickly I could create complex applications (3D graphs
of cellular system performance). That was because after
only a couple of years, Java came with a large highly functional
class library - something that C and C++ had conspicuously
failed to manage in a decade. And you could simply plug in
random libraries from random companies, intertwine them with
your data, and it simply /worked/ as expected.

At that point I completely gave up on C++, on the basis that
if C++ is the best answer, you've asked the wrong question!

And in the decades since it has been confirmed that typical C++
programmers are content with tools that usually work as they
expect, e.g. Boehm's garbage collector and compiler optimisation.

So, the sooner we can "de-emphasise" C++ the better.
Just like COBOL.

[1] Digitalk Smalltalk in ~90 was completely different,
and was embedded into some HP equipment (and I believe
Tek equipment).
 
On 2020-08-15 13:01, Tom Gardner wrote:
On 15/08/20 16:58, Phil Hobbs wrote:
On 2020-08-15 04:21, Tom Gardner wrote:
On 15/08/20 03:51, Les Cargill wrote:
Tom Gardner wrote:
On 14/08/20 04:13, Les Cargill wrote:
Tom Gardner wrote:
Rust and Go are showing significant promise in the
marketplace,

Mozilla seems to have dumped at least some of the Rust team:

https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/


I doubt they will remain unemployed. Rust is gaining traction
in wider settings.


I dunno - I can\'t separate the messaging from the offering. I\'m
fine with a C/C++ compiler so I have less than no incentive to
even become remotely literate about Rust.

The Rustaceans seem obsessed with stuff my cohort (read: old people)
learned six months into their first C project. But there may
well be benefits I don\'t know about.

Too many people /think/ they know C.

I first used C in ~81, and learned it from the two
available books, which I still have. The second book
was, of course, a book on traditional mistakes in C
\"The C Puzzle Book\".

It is horrifying that Boehm thought it worth writing this
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf
and that it surprised many C practitioners.

Rust directly addresses some of the pain points.


It is not-not a thing; the CVE list shows that. I am just appalled
that these defects are released.

If you think C and C++ languages and implementations
are fault-free, I\'d like to visit your planet sometime :)

You can start with the C++ FQA http://yosefk.com/c++fqa/

Watching (from a distance) the deliberations of the C/C++
committees in the early 90s was enlightening, in a bad way.
One simple debate (which lasted years) was whether it ought
to be possible or impossible to "cast away constness".
There are good reasons for both, and they cannot be
reconciled.
(Yes: to allow debuggers and similar tools to inspect memory.
No: to enable safe aggressive optimisations)



Linus Torvalds is vociferously and famously opposed to having
C++ anywhere near the Linux kernel (good taste IMNSHO).

Don\'t take any cues from Linus Torvalds. He\'s why my deliverables
at one gig were patch files. I\'ve no objection to that but geez...

And C++ is Just Fine. Now. It took what, 20 years?

Worse: 30 years!

I first used it in \'88, and thought it a regression
over other available languages.

I\'ve always liked C++.  The OOP paradigm maps very naturally onto the
sorts of coding I do: embedded, instrument control, and simulations.

Ditto in spades, except for C++.

I was doing primitive version of OOP in C around \'82, for
embedded machine control.

When I came across OOP in 85, I instantly recognised that
two customer statements mapped directly onto OOP:
 - \"I\'d like three of those\" => object creation
 - \"just like that example, except\" => class hierarchy
And seeing what was possible in Smalltalk (container classes,
reflection) made me a convert!

Unfortunately ParcPlace[1] Smalltalk was totally unsuited to
embedded systems, so I looked out for alternatives.

In \'88 I evaluated C++ and Objective-C. The latter is really
Smalltalk without a GC, so rapidly became productive, using
the available classes and adding my own. But C++ was dreadful;
there was no class hierarchy and reversing any mistaken design
choice was unnecessarily painful.

Then in the early 90s, I watched the C and C++ committees
wrangling endlessly over pretty fundamental points /without/
there being /any possibility/ of adequately reconciling the
two viewpoints. (Simple example: should it be allowed or
forbidden to cast away a const declaration. There are good
arguments for both, but the necessary choice has far-reaching
implications)

At that point I realised the language was building a bigger
castle on sand. Not a good position to be in.

Then, when I first used Java in \'96, people were amazed at
how quickly I could create complex applications (3D graphs
of cellular system performance). That was because after
only a couple of years, Java came with a large highly functional
class library - something that C and C++ had conspicuously
failed to manage in a decade. And you could simply plug in
random libraries from random companies, intertwine them with
your data, and it simply /worked/ as expected.

At that point I completely gave up on C++, on the basis that
if C++ is the best answer, you\'ve asked the wrong question!

IOW you've never used anything newer than C++89. I agree with you about
that language, but not about C++98 or newer. Most of my stuff is
C++03ish but I'm warming up to the standard library and the C++11-17
features.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On 15/08/20 18:15, Phil Hobbs wrote:
On 2020-08-15 13:01, Tom Gardner wrote:
On 15/08/20 16:58, Phil Hobbs wrote:
On 2020-08-15 04:21, Tom Gardner wrote:
On 15/08/20 03:51, Les Cargill wrote:
Tom Gardner wrote:
On 14/08/20 04:13, Les Cargill wrote:
Tom Gardner wrote:
Rust and Go are showing significant promise in the
marketplace,

Mozilla seems to have dumped at least some of the Rust team:

https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/


I doubt they will remain unemployed. Rust is gaining traction
in wider settings.


I dunno - I can\'t separate the messaging from the offering. I\'m
fine with a C/C++ compiler so I have less than no incentive to
even become remotely literate about Rust.

The Rustaceans seem obsessed with stuff my cohort (read: old people)
learned six months into their first C project. But there may
well be benefits I don\'t know about.

Too many people /think/ they know C.

I first used C in ~81, and learned it from the two
available books, which I still have. The second book
was, of course, a book on traditional mistakes in C
\"The C Puzzle Book\".

It is horrifying that Boehm thought it worth writing this
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf
and that it surprised many C practitioners.

Rust directly addresses some of the pain points.


It is not-not a thing; the CVE list shows that. I am just appalled
that these defects are released.

If you think C and C++ languages and implementations
are fault-free, I\'d like to visit your planet sometime :)

You can start with the C++ FQA http://yosefk.com/c++fqa/

Watching (from a distance) the deliberations of the C/C++
committees in the early 90s was enlightening, in a bad way.
One simple debate (which lasted years) was whether it ought
to be possible or impossible to "cast away constness".
There are good reasons for both, and they cannot be
reconciled.
(Yes: to allow debuggers and similar tools to inspect memory.
No: to enable safe aggressive optimisations)



Linus Torvalds is vociferously and famously opposed to having
C++ anywhere near the Linux kernel (good taste IMNSHO).

Don\'t take any cues from Linus Torvalds. He\'s why my deliverables
at one gig were patch files. I\'ve no objection to that but geez...

And C++ is Just Fine. Now. It took what, 20 years?

Worse: 30 years!

I first used it in \'88, and thought it a regression
over other available languages.

I\'ve always liked C++.  The OOP paradigm maps very naturally onto the sorts
of coding I do: embedded, instrument control, and simulations.

Ditto in spades, except for C++.

I was doing primitive version of OOP in C around \'82, for
embedded machine control.

When I came across OOP in 85, I instantly recognised that
two customer statements mapped directly onto OOP:
  - \"I\'d like three of those\" => object creation
  - \"just like that example, except\" => class hierarchy
And seeing what was possible in Smalltalk (container classes,
reflection) made me a convert!

Unfortunately ParcPlace[1] Smalltalk was totally unsuited to
embedded systems, so I looked out for alternatives.

In \'88 I evaluated C++ and Objective-C. The latter is really
Smalltalk without a GC, so rapidly became productive, using
the available classes and adding my own. But C++ was dreadful;
there was no class hierarchy and reversing any mistaken design
choice was unnecessarily painful.

Then in the early 90s, I watched the C and C++ committees
wrangling endlessly over pretty fundamental points /without/
there being /any possibility/ of adequately reconciling the
two viewpoints. (Simple example: should it be allowed or
forbidden to cast away a const declaration. There are good
arguments for both, but the necessary choice has far-reaching
implications)

At that point I realised the language was building a bigger
castle on sand. Not a good position to be in.

Then, when I first used Java in \'96, people were amazed at
how quickly I could create complex applications (3D graphs
of cellular system performance). That was because after
only a couple of years, Java came with a large highly functional
class library - something that C and C++ had conspicuously
failed to manage in a decade. And you could simply plug in
random libraries from random companies, intertwine them with
your data, and it simply /worked/ as expected.

At that point I completely gave up on C++, on the basis that
if C++ is the best answer, you\'ve asked the wrong question!

IOW you've never used anything newer than C++89.

Used, no. But as I mentioned I kept an open eye (and mind)
on what the committees were up to.

After 8 years of heated nattering, there was insufficient
progress, especially compared with what was being achieved
in other languages in a quarter of that time.

They couldn't get their act together, so I moved to languages
where they /had/ got their act together. For simple embedded
stuff I still used C.


I agree with you about that
language, but not about C++98 or newer.

Yebbut, even that late they hadn't got their act together
w.r.t. threading.

Yebbut even in 2005 they hadn't, and many people had
forgotten that they /couldn't/ - hence Boehm's paper
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf

Finally, almost a quarter of a century(!) later, in
2011, a memory model appeared. IMHO that will
take a long time to be proven sufficient and correctly
implemented.

Quarter of a century is rather a long time for a
language to be insufficient to implement a major
library (Pthreads)!
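
A minimal sketch of what that 2011 memory model makes expressible in the
language itself, rather than via OS- or compiler-specific guarantees: a
release/acquire handshake that publishes ordinary data between threads.

#include <atomic>
#include <cassert>
#include <thread>

int payload = 0;
std::atomic<bool> ready{false};

void producer() {
    payload = 42;                                   // plain write
    ready.store(true, std::memory_order_release);   // publish it
}

void consumer() {
    while (!ready.load(std::memory_order_acquire)) { /* spin */ }
    assert(payload == 42);                          // guaranteed to be visible
}

int main() {
    std::thread a(producer), b(consumer);
    a.join();
    b.join();
}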


Most of my stuff is C++03ish but I\'m
warming up to the standard library and the C++11-17 features.

If I'm being catty I'll ask if you use the same subset
of the language as those around you and the libraries
you use :)

How much of the C++ Frequently Questioned Answers has
become obsolete now? http://yosefk.com/c++fqa/
That is a laugh-and-weep diatribe with too much truth.

My embedded stuff is now multicore and hard realtime.
My preference is xC, since that has multicore parallelism
baked in from the beginning, not bolted on as an afterthought.
Unfortunately that is only on the (delightful) xCORE
processors, so I also keep an eye on the progress of others
such as Rust. Time will tell.
 
On 2020-08-15 19:00, Tom Gardner wrote:
On 15/08/20 18:15, Phil Hobbs wrote:
On 2020-08-15 13:01, Tom Gardner wrote:
On 15/08/20 16:58, Phil Hobbs wrote:
On 2020-08-15 04:21, Tom Gardner wrote:
On 15/08/20 03:51, Les Cargill wrote:
Tom Gardner wrote:
On 14/08/20 04:13, Les Cargill wrote:
Tom Gardner wrote:
Rust and Go are showing significant promise in the
marketplace,

Mozilla seems to have dumped at least some of the Rust team:

https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/


I doubt they will remain unemployed. Rust is gaining traction
in wider settings.


I dunno - I can\'t separate the messaging from the offering. I\'m
fine with a C/C++ compiler so I have less than no incentive to
even become remotely literate about Rust.

The Rustaceans seem obsessed with stuff my cohort (read: old people)
learned six months into their first C project. But there may
well be benefits I don\'t know about.

Too many people /think/ they know C.

I first used C in ~81, and learned it from the two
available books, which I still have. The second book
was, of course, a book on traditional mistakes in C
\"The C Puzzle Book\".

It is horrifying that Boehm thought it worth writing this
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf
and that it surprised many C practitioners.

Rust directly addresses some of the pain points.


It is not-not a thing; the CVE list shows that. I am just appalled
that these defects are released.

If you think C and C++ languages and implementations
are fault-free, I\'d like to visit your planet sometime :)

You can start with the C++ FQA http://yosefk.com/c++fqa/

Watching (from a distance) the deliberations of the C/C++
committees in the early 90s was enlightening, in a bad way.
One simple debate (which lasted years) was whether it ought
to be possible or impossible to "cast away constness".
There are good reasons for both, and they cannot be
reconciled.
(Yes: to allow debuggers and similar tools to inspect memory.
No: to enable safe aggressive optimisations)



Linus Torvalds is vociferously and famously opposed to having
C++ anywhere near the Linux kernel (good taste IMNSHO).

Don\'t take any cues from Linus Torvalds. He\'s why my deliverables
at one gig were patch files. I\'ve no objection to that but geez...

And C++ is Just Fine. Now. It took what, 20 years?

Worse: 30 years!

I first used it in \'88, and thought it a regression
over other available languages.

I\'ve always liked C++.  The OOP paradigm maps very naturally onto
the sorts of coding I do: embedded, instrument control, and
simulations.

Ditto in spades, except for C++.

I was doing primitive version of OOP in C around \'82, for
embedded machine control.

When I came across OOP in 85, I instantly recognised that
two customer statements mapped directly onto OOP:
  - \"I\'d like three of those\" => object creation
  - \"just like that example, except\" => class hierarchy
And seeing what was possible in Smalltalk (container classes,
reflection) made me a convert!

Unfortunately ParcPlace[1] Smalltalk was totally unsuited to
embedded systems, so I looked out for alternatives.

In \'88 I evaluated C++ and Objective-C. The latter is really
Smalltalk without a GC, so rapidly became productive, using
the available classes and adding my own. But C++ was dreadful;
there was no class hierarchy and reversing any mistaken design
choice was unnecessarily painful.

Then in the early 90s, I watched the C and C++ committees
wrangling endlessly over pretty fundamental points /without/
there being /any possibility/ of adequately reconciling the
two viewpoints. (Simple example: should it be allowed or
forbidden to cast away a const declaration. There are good
arguments for both, but the necessary choice has far-reaching
implications)

At that point I realised the language was building a bigger
castle on sand. Not a good position to be in.

Then, when I first used Java in \'96, people were amazed at
how quickly I could create complex applications (3D graphs
of cellular system performance). That was because after
only a couple of years, Java came with a large highly functional
class library - something that C and C++ had conspicuously
failed to manage in a decade. And you could simply plug in
random libraries from random companies, intertwine them with
your data, and it simply /worked/ as expected.

At that point I completely gave up on C++, on the basis that
if C++ is the best answer, you\'ve asked the wrong question!

IOW you\'ve never used anything newer than C++89.

Used, no. But as I mentioned I kept an open eye (and mind)
on what the committees were up to.

After 8 years of heated nattering, there was insufficient
progress, especially compared with what was being achieved
in other languages in a quarter of that time.

They couldn\'t get their act together, so I moved to languages
where they /had/ got their act together. For simple embedded
stuff I still used C.


I agree with you about that language, but not about C++98 or newer.

Yebbut, even that late they hadn\'t got their act together
w.r.t. threading.

Yebbut even in 2005 they hadn\'t, and many people had
forgotten that they /couldn\'t/ - hence Boehm\'s paper
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf

Finally, almost a quarter of a century(!) later in
2011, a memory model appeared. IMHO that that will
take a long time to be proven sufficient and correctly
implemented.

Quarter of a century is rather a long time for a
language to be insufficient to implement a major
library (Pthreads)!

I was using pthreads very successfully 2003ish. AFAIK the first
microcomputer OS that supported multithreaded programming was OS/2 2.0
in 1992, so that's not 20 years in my book. Between 1992 and 2003 I was
writing multithreaded C programs on the OS/2 and Windows OS thread APIs,
which worked fine, partly because the compiler vendors also owned the OS. ;)

The most beautiful debugger I have ever used is the one that came with
VisualAge C++ v. 3.08 for OS/2, circa 2001. Streets ahead of anything
I've seen on Linux to this day--hit 'pause' and all the threads paused
_right_now_, rather than the UI thread pausing now and the others not
till the end of their timeslice. That worked even on my SMP box, not
just on uniprocessors.

Most of my stuff is C++03ish but I\'m warming up to the standard
library and the C++11-17 features.

If I\'m being catty I\'ll ask if you use the same subset
of the language as those around you and the libraries
you use :)

Incompatibility of third-party libraries was a serious problem in the
beginning, for sure. Everybody and his dog had his own complex number
type, for instance. Back then, my solution (which worked fine for my
purposes) was to stick with <cstdio>, <cstdlib>, and stuff I wrote myself.

Nowadays with namespaces and a much more capable standard library,
there's much less reason for that. There are still warts, for sure, of
which the one I love most to hate is <iostreams>. It's okay for light
duty use, but the moment you try to do formatted output you disappear
into the long dim corridors of <iomanip>, perhaps never to emerge. ;)
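
A small illustration of that grumble (nothing here beyond the standard
library): the same fixed-width, three-decimal output written with printf
and with <iomanip>.

#include <cstdio>
#include <iomanip>
#include <iostream>

int main() {
    double v = 3.14159;

    std::printf("%10.3f\n", v);                      // one format string

    std::cout << std::setw(10)                       // setw resets after one item
              << std::fixed << std::setprecision(3)  // these two are sticky
              << v << "\n";
}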

How much of the C++ Frequently Questioned Answers has
become obsolete now? http://yosefk.com/c++fqa/
That is a laugh-and-weep diatribe with too much truth.

Dunno. I skimmed through it once IIRC but wasn't that impressed.

My embedded stuff is now multicore and hard realtime.
My preference is xC, since that has multicore parallelism
baked in from the beginning, not bolted on as an afterthought.
Unfortunately that is only on the (delightful) xCORE
processors, so I also keep an eye on the progress of others
such as Rust. Time will tell.

Well, horses for courses. But tarring C++ in 2020 with a brush from
1989 is unpersuasive.

Cheers

Phil Hobbs


--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On 2020-08-14 15:39, Joe Gwinn wrote:
On Fri, 14 Aug 2020 10:30:12 -0700, John Larkin
jlarkin@highland_atwork_technology.com> wrote:

On Fri, 14 Aug 2020 10:32:56 -0400, Joe Gwinn <joegwinn@comcast.net
wrote:

On Wed, 12 Aug 2020 07:30:18 -0700, jlarkin@highlandsniptechnology.com
wrote:

[snip]

Have you ever written any code past Hello, World! that compiled
error-free and ran correctly the very first time? That's unheard of.

For the record, I have not - it's too slow to do it that way. But I
do have a war story from the early 1970s:

My first job out of school was as an engineer at the Federal
Communications Commission in Washington, DC. We had a Univac 1106
computer. This was in the days before Burroughs, as I recall.

One fine day, I was asked to help a much older engineer whose Fortran
program was tying the 1106 up for hours. This engineer was a very
methodical man. He would extensively desk-check his code, and was
proud that his code always ran the first time, without error.

The program estimated signal strengths of all other radio stations as
measured at the transmitting antenna of each radio system. The
propagation code required the distance from the station under
consideration to all other stations, which he calculated anew for each
and every station, so the scaling law was n^3, for a few thousand
stations. No wonder it ground away all day.

It turns out that he was unaware that disk storage was easy in Univac
Fortran, or that he could compute the distance matrix once, and reload
it whenever needed. Now, his program took five or ten minutes, not 8
to 10 hours.
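
A hedged sketch of that fix in C++ (Station and the planar distance are
invented stand-ins for the real propagation inputs): build the pairwise
distance matrix once and reuse it, so roughly n^3 work becomes n^2 plus
cheap lookups.

#include <cmath>
#include <vector>

struct Station { double x, y; };   // stand-in for the real station record

std::vector<std::vector<double>>
distance_matrix(const std::vector<Station>& s) {
    const std::size_t n = s.size();
    std::vector<std::vector<double>> d(n, std::vector<double>(n, 0.0));
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = i + 1; j < n; ++j)
            d[i][j] = d[j][i] = std::hypot(s[i].x - s[j].x, s[i].y - s[j].y);
    return d;   // computed once, stored, reloaded whenever needed
}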

Joe Gwinn

While staying with a friend in Juneau, I wrote an RTOS on paper, with
a pencil, and mailed sheets back to the factory for them to enter and
assemble. Someone claimed that it had one bug.

For what machine, in what language, and how many lines of code?

I've done much the same, but the RTOS was usually purchased and then
modified. These RTOSs were all in assembly code. The largest was
35,000 lines of SEL 32/55 (an IBM 360 clone) assembly, including the
file system, if memory serves.

There was much integration and OS debugging involved. Usually the
applications programmers would find the problem, and be stuck.
Eventually, they would call me, and I'd go to the lab. In many cases,
I could tell them what they had done wrong. In a few cases, I'd say
that this smelled like an OS problem, and take over with kernel-level
tools, usually with immediate success because they had handed it over
to me right in the middle of the problem.


I just checked my work, like good engineers do.

And so do we all, with varying degrees of success. I know from
experience when further checking is not worthwhile, and it's time for
the lab.

My favorite non-OS bug was actually in a program written in SEL 32/55
Fortran. When a certain subroutine was called, the sky fell. I
debugged at the Fortran level, which did isolate the problem to the
call of this specific subroutine, and then hit a wall. Dropped into
the assembly-level debugger, single-stepping the assembly code
generated by the Fortran compiler. Still no joy. Went back and
forth, multiple times. Stuck. Then, a crazy thought... Dropped a
level deeper, into the CPU microcode debugger, on the machine console,
and micro-stepped through the indexed load machine instruction where
the problem occurred. Bingo!

In the IBM 360 instruction set, there is no "load word" or "load
double word" instruction per se. There is a general "load"
instruction, the load width being determined by a field just above the
operand address field in the instruction word. What was happening was
that the indexed operations added the entire operand field to the
index register contents, and an overflow overlaid the load-width
field, changing the load width to a double, overlaying both the
intended register and an adjacent register. Oops. I forget which
value was incorrect, the operand or the register, but one or the other
had been stomped on; it didn't take long to trace it back to the
original cause.

The compiler had no idea that an uninvolved register had been stomped,
and no amount of staring at the Fortran code was going to help.

In my whole career, I've had to resort to microcode-level debugging
only this one time.

Joe Gwinn

You had a beautiful tool set for the day though!

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On 16/08/20 01:26, Phil Hobbs wrote:
On 2020-08-15 19:00, Tom Gardner wrote:
On 15/08/20 18:15, Phil Hobbs wrote:
On 2020-08-15 13:01, Tom Gardner wrote:
On 15/08/20 16:58, Phil Hobbs wrote:
On 2020-08-15 04:21, Tom Gardner wrote:
On 15/08/20 03:51, Les Cargill wrote:
Tom Gardner wrote:
On 14/08/20 04:13, Les Cargill wrote:
Tom Gardner wrote:
Rust and Go are showing significant promise in the
marketplace,

Mozilla seems to have dumped at least some of the Rust team:

https://www.reddit.com/r/rust/comments/i7stjy/how_do_mozilla_layoffs_affect_rust/


I doubt they will remain unemployed. Rust is gaining traction
in wider settings.


I dunno - I can\'t separate the messaging from the offering. I\'m
fine with a C/C++ compiler so I have less than no incentive to
even become remotely literate about Rust.

The Rustaceans seem obsessed with stuff my cohort (read: old people)
learned six months into their first C project. But there may
well be benefits I don\'t know about.

Too many people /think/ they know C.

I first used C in ~81, and learned it from the two
available books, which I still have. The second book
was, of course, a book on traditional mistakes in C
\"The C Puzzle Book\".

It is horrifying that Boehm thought it worth writing this
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf
and that it surprised many C practitioners.

Rust directly addresses some of the pain points.


It is not-not a thing; the CVE list shows that. I am just appalled
that these defects are released.

If you think C and C++ languages and implementations
are fault-free, I\'d like to visit your planet sometime :)

You can start with the C++ FQA http://yosefk.com/c++fqa/

Watching (from a distance) the deliberations of the C/C++
committees in the early 90s was enlightening, in a bad way.
One simple debate (which lasted years) was whether it ought
to be possible or impossible to "cast away constness".
There are good reasons for both, and they cannot be
reconciled.
(Yes: to allow debuggers and similar tools to inspect memory.
No: to enable safe aggressive optimisations)



Linus Torvalds is vociferously and famously opposed to having
C++ anywhere near the Linux kernel (good taste IMNSHO).

Don\'t take any cues from Linus Torvalds. He\'s why my deliverables
at one gig were patch files. I\'ve no objection to that but geez...

And C++ is Just Fine. Now. It took what, 20 years?

Worse: 30 years!

I first used it in \'88, and thought it a regression
over other available languages.

I\'ve always liked C++.  The OOP paradigm maps very naturally onto the sorts
of coding I do: embedded, instrument control, and simulations.

Ditto in spades, except for C++.

I was doing primitive version of OOP in C around \'82, for
embedded machine control.

When I came across OOP in 85, I instantly recognised that
two customer statements mapped directly onto OOP:
  - \"I\'d like three of those\" => object creation
  - \"just like that example, except\" => class hierarchy
And seeing what was possible in Smalltalk (container classes,
reflection) made me a convert!

Unfortunately ParcPlace[1] Smalltalk was totally unsuited to
embedded systems, so I looked out for alternatives.

In \'88 I evaluated C++ and Objective-C. The latter is really
Smalltalk without a GC, so rapidly became productive, using
the available classes and adding my own. But C++ was dreadful;
there was no class hierarchy and reversing any mistaken design
choice was unnecessarily painful.

Then in the early 90s, I watched the C and C++ committees
wrangling endlessly over pretty fundamental points /without/
there being /any possibility/ of adequately reconciling the
two viewpoints. (Simple example: should it be allowed or
forbidden to cast away a const declaration. There are good
arguments for both, but the necessary choice has far-reaching
implications)

At that point I realised the language was building a bigger
castle on sand. Not a good position to be in.

Then, when I first used Java in \'96, people were amazed at
how quickly I could create complex applications (3D graphs
of cellular system performance). That was because after
only a couple of years, Java came with a large highly functional
class library - something that C and C++ had conspicuously
failed to manage in a decade. And you could simply plug in
random libraries from random companies, intertwine them with
your data, and it simply /worked/ as expected.

At that point I completely gave up on C++, on the basis that
if C++ is the best answer, you\'ve asked the wrong question!

IOW you\'ve never used anything newer than C++89.

Used, no. But as I mentioned I kept an open eye (and mind)
on what the committees were up to.

After 8 years of heated nattering, there was insufficient
progress, especially compared with what was being achieved
in other languages in a quarter of that time.

They couldn\'t get their act together, so I moved to languages
where they /had/ got their act together. For simple embedded
stuff I still used C.


I agree with you about that language, but not about C++98 or newer.

Yebbut, even that late they hadn\'t got their act together
w.r.t. threading.

Yebbut even in 2005 they hadn\'t, and many people had
forgotten that they /couldn\'t/ - hence Boehm\'s paper
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
http://hboehm.info/misc_slides/pldi05_threads.pdf

Finally, almost a quarter of a century(!) later in
2011, a memory model appeared. IMHO that that will
take a long time to be proven sufficient and correctly
implemented.

Quarter of a century is rather a long time for a
language to be insufficient to implement a major
library (Pthreads)!

I was using pthreads very successfully 2003ish.  AFAIK the first microcomputer
OS that supported multithreaded programming was OS/2 2.0 in 1992, so that\'s not
20 years in my book.  Between 1992 and 2003 I was writing multithreaded C
programs on the OS/2 and Windows OS thread APIs, which worked fine, partly
because the compiler vendors also owned the OS. ;)

Sure, but that will have depended on things that are (were?)
explicitly not defined in C. Pthreads operation would have
been based on implementation-dependent behaviour.

That is inelegant at best, and fragile at worst.
(Just like castles built on sand.)


The most beautiful debugger I have ever used is the one that came with VisualAge
C++ v. 3.08 for OS/2, circa 2001.  Streets ahead of anything I\'ve seen on Linux
to this day--hit \'pause\' and all the threads paused _right_now_, rather than the
UI thread pausing now and the others not till the end of their timeslice.  That
worked even on my SMP box, not just on uniprocessors.

Yes, multicore/thread debuggers aren't usually very good.

That's why I aim to get core functionality debugged in a
single thread, and then rely on simple, predictable,
and reliable "high level" multithread/core design patterns.

That excludes rolling my own mutex-based code, and implies
re-using well-conceived and well-tested libraries built
on a solid base. Often those are RTOS libraries or libraries
inspired by RTOS libraries.
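
A minimal sketch of that style, assuming nothing beyond the standard
library (MsgQueue is an invented name): a tiny blocking message queue so
application threads exchange messages rather than sharing state behind
hand-rolled mutex code.

#include <condition_variable>
#include <mutex>
#include <queue>
#include <utility>

template <typename T>
class MsgQueue {
    std::queue<T> q_;
    std::mutex m_;
    std::condition_variable cv_;
public:
    void send(T msg) {
        {
            std::lock_guard<std::mutex> lock(m_);
            q_.push(std::move(msg));
        }
        cv_.notify_one();                // wake one waiting receiver
    }
    T receive() {                        // blocks until a message is available
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [this] { return !q_.empty(); });
        T msg = std::move(q_.front());
        q_.pop();
        return msg;
    }
};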


Most of my stuff is C++03ish but I\'m warming up to the standard library and
the C++11-17 features.

If I\'m being catty I\'ll ask if you use the same subset
of the language as those around you and the libraries
you use :)

Incompatibility of third-party libraries was a serious problem in the beginning,
for sure.  Everybody and his dog had his own complex number type, for instance.
Back then, my solution (which worked fine for my purposes) was to stick with
<cstdio>, <cstdlib>, and stuff I wrote myself.

Or boolean, or String, or..., and especially generic container
classes.

Continually reinventing the wheel seemed a waste of my life,
especially when the wheels were slightly off centre. Doubly so
when it demonstrably wouldn't have been necessary if I had a
better starting point.

As the old joke goes...
"How do I get to the Blarney Stone?"
"Well sir, if I wanted to go there, I wouldn't start from here."


Nowadays with namespaces and a much more capable standard library, there\'s much
less reason for that.  There are still warts, for sure, of which the one I love
most to hate is <iostreams>.  It\'s okay for light duty use, but the moment you
try to do formatted output you disappear into the long dim corridors of
<iomanip>, perhaps never to emerge. ;)

I can believe that, and might be persuaded such complexity
was inevitable and tolerable if buried in a library.

OTOH, I've heard many stories about the interaction of core
language features such as exceptions and templates. That
just made me think "dragons; avoid".


How much of the C++ Frequently Questioned Answers has
become obsolete now? http://yosefk.com/c++fqa/
That is a laugh-and-weep diatribe with too much truth.

Dunno.  I skimmed through it once iirc but wasn\'t that impressed.

It is a "many a truth is hidden in jest" type of diatribe.


My embedded stuff is now multicore and hard realtime.
My preference is xC, since that has multicore parallelism
baked in from the beginning, not bolted on as an afterthought.
Unfortunately that is only on the (delightful) xCORE
processors, so I also keep an eye on the progress of others
such as Rust. Time will tell.

Well, horses for courses.  But tarring C++ in 2020 with a brush from 1989 is
unpersuasive.

There's validity to that, but...
- the 1989 stuff is still there and visible
- frequently you simply cannot use the latest stuff, for
various corporate and technical reasons
- it is still sand, albeit with a few piles driven into
the ground :)

Life moves on, hopefully to better things.
 
On 14/8/20 2:36 pm, Phil Hobbs wrote:
On 2020-08-13 23:35, Les Cargill wrote:
jlarkin@highlandsniptechnology.com wrote:
On Wed, 12 Aug 2020 08:33:20 +0100, Martin Brown
\'\'\'newspam\'\'\'@nonad.co.uk> wrote:

snip

The real dollar cost of bad software is gigantic. There should be no
reason for a small or mid-size company to continuously pay IT security
consultants, or to run AV software.


It\'s not even accounted for, nor is it an actual cost in the usual sense
of the word - nobody\'s trying to make this actually less expensive.


C invites certain dangerous practices that attackers ruthlessly exploit
like loops copying until they hit a null byte.

Let bad programs malfunction or crash. But don\'t allow a stack or
buffer overflow to poke exploits into code space. The idea of
separating data, code, and stack isn\'t hard to understand, or even
hard to implement.

We probably need to go to pseudocode-only programs. The machine needs
to be protected from programmers and from bad architectures. Most
programmers never learn about machine-level processes.


That\'s what \"managed languages\" like Java or C# do. It\'s all bytecode
in a VM.

Sort of like UCSD Pascal, circa 1975. ;)

In the same way that a bacterium is sort of like a mammal. Both have
DNA. The similarity ends there.
 
That\'s what \"managed languages\" like Java or C# do. It\'s all bytecode
in a VM.

Sort of like UCSD Pascal, circa 1975. ;)

In the same way that a bacterium is sort of like a mammal. Both have
DNA. The similarity ends there.

If you would care to compare and contrast Java and UCSD Pascal, I'd read with interest.

Cheers

Phil Hobbs
 
On 17/8/20 7:09 am, pcdhobbs@gmail.com wrote:
That\'s what \"managed languages\" like Java or C# do. It\'s all bytecode
in a VM.

Sort of like UCSD Pascal, circa 1975. ;)

In the same way that a bacterium is sort of like a mammal. Both have
DNA. The similarity ends there.

If you would care to compare and contrast Java and UCSD Pascal, I\'d read with interest.

I could, but don't care to say much. Any bytecode is like any other for
the most part, but JVM is designed to JIT, and the level of
sophistication in HotSpot is analogous to the mammalian superstructure.
It really is an immense tower of technology. Personally I prefer the AOT
idea to JIT, but I respect the achievement.

CH
 
I could, but don't care to say much. Any bytecode is like any other for
the most part

My point. Pretty far from your earlier response, which I reprise:

\">> Sort of like UCSD Pascal, circa 1975. ;)

In the same way that a bacterium is sort of like a mammal. Both have
DNA. The similarity ends there.

So any bacterium is like any mammal "for the most part." Good to know!
;)

Cheers

Phil Hobbs
 
On 17/08/20 02:46, Clifford Heath wrote:
On 17/8/20 7:09 am, pcdhobbs@gmail.com wrote:
That\'s what \"managed languages\" like Java or C# do. It\'s all bytecode
in a VM.

Sort of like UCSD Pascal, circa 1975. ;)

In the same way that a bacterium is sort of like a mammal. Both have
DNA. The similarity ends there.

If you would care to compare and contrast Java and UCSD Pascal, I\'d read with
interest.

I could, but don\'t care to say much. Any bytecode is like any other for the most
part, but JVM is designed to JIT, and the level of sophistication in HotSpot is
analogous to the mammalian superstructure. It really is an immense tower of
technology. Personally I prefer the AOT idea to JIT, but I respect the achievement.

Never confuse HotSpot with JIT.

JIT is a runtime peephole optimiser, and hence
uses only local information about the code emitted
by the compiler. That code is based on what the
compiler can guess/presume about how the code /might/
behave.

HotSpot looks at what the code is /actually/ doing,
and optimises the shit out of that.

AOT is HotSpot without knowing what the code will do.
It optimises for the instruction set in a particular
processor (and there are many variants between AMD/Intel!).
I don't know how it deals with processors being
changed after installation, e.g. for all the recent
cache-timing malware attacks.

HotSpot and JIT can take account of "removed" processor
functionality, by replacing the runtime.

AOT is the only game for embedded.

HotSpot has major advantages elsewhere.
 
