the hot new programming language

On 7/2/2015 6:42 PM, Phil Hobbs wrote:
On 7/2/2015 5:29 PM, bitrex wrote:
On 7/2/2015 5:20 PM, John Larkin wrote:
On Thu, 02 Jul 2015 17:12:36 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 4:21 PM, John Larkin wrote:
On Thu, 02 Jul 2015 16:08:15 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

On 07/02/2015 01:47 PM, John Larkin wrote:
On Thu, 02 Jul 2015 11:21:42 +1000, Sylvia Else
<sylvia@not.at.this.address> wrote:

On 2/07/2015 5:13 AM, John Larkin wrote:

http://www.itworld.com/article/2694378/college-students-learning-cobol-make-more-money.html


The revival of Basic is next.


Apparently people who can resist the urge to gnaw their own leg
off from
boredom command a premium.

Sylvia.

Yup. Accountants. Lawyers. Plastic surgeons.

Cobol was designed so that bankers could code. It was brilliant.

https://en.wikipedia.org/wiki/COBOL#History_and_specification

Two of the designers were women, who were apparently more interested
in solving a real problem than they were interested in playing mental
games. Compare Cobol to c or Pascal or APL.


And run screaming in the other direction. Cobol is verbose and
inflexible. Just the sheer amount of typing would slow me down a lot.

C was described as designed by geniuses to be used by geniuses. But
most programmers aren't geniuses. Most people need hard typing and
runtime bounds checking and proper memory management to keep out of
trouble; they need verbose. I cite basically all Microsoft products.


They weren't geniuses, they just knew that they couldn't do fucking
runtime bounds checking and "proper" memory management on a PDP-11 with
as much processing power as a modern clock radio

That was 40 years ago.

Actually, the 11 had great memory management hardware, but c isn't
designed to be able to use it. Everything gets mixed up.


Right, the main issue is what it always is, of course: nobody ever
expected the language to be as long-lived as it was, and then once it
becomes apparent that it actually is going to be around for a long time
they can't update it or add many new features for fear of breaking
backwards-compatibility for a bunch of legacy shit


Riiighhttt. Which is why C++11 is just the same as K&R 1.0.

Cheers

Phil Hobbs

Sure, you can compile some C code with a C++ compiler, but modern C++
and ordinary C are so distant as to be barely recognizable as the same
language.

And one could very easily argue that the mess that is C++ is _precisely_
the reason you shouldn't take a 40-year-old language and start trying to
tack all sorts of modern features onto it...
 
On 7/2/2015 8:12 PM, John Larkin wrote:
On Thu, 02 Jul 2015 18:38:27 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

On 7/2/2015 6:06 PM, John Larkin wrote:
On Thu, 02 Jul 2015 17:29:33 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 5:20 PM, John Larkin wrote:
On Thu, 02 Jul 2015 17:12:36 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 4:21 PM, John Larkin wrote:
On Thu, 02 Jul 2015 16:08:15 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

On 07/02/2015 01:47 PM, John Larkin wrote:
On Thu, 02 Jul 2015 11:21:42 +1000, Sylvia Else
<sylvia@not.at.this.address> wrote:

On 2/07/2015 5:13 AM, John Larkin wrote:

http://www.itworld.com/article/2694378/college-students-learning-cobol-make-more-money.html

The revival of Basic is next.


Apparently people who can resist the urge to gnaw their own leg off from
boredom command a premium.

Sylvia.

Yup. Accountants. Lawyers. Plastic surgeons.

Cobol was designed so that bankers could code. It was brilliant.

https://en.wikipedia.org/wiki/COBOL#History_and_specification

Two of the designers were women, who were apparently more interested
in solving a real problem than they were interested in playing mental
games. Compare Cobol to c or Pascal or APL.


And run screaming in the other direction. Cobol is verbose and
inflexible. Just the sheer amount of typing would slow me down a lot.

C was described as designed by geniuses to be used by geniuses. But
most programmers aren't geniuses. Most people need hard typing and
runtime bounds checking and proper memory management to keep out of
trouble; they need verbose. I cite basically all Microsoft products.


They weren't geniuses, they just knew that they couldn't do fucking
runtime bounds checking and "proper" memory management on a PDP-11 with
as much processing power as a modern clock radio

That was 40 years ago.

Actually, the 11 had great memory management hardware, but c isn't
designed to be able to use it. Everything gets mixed up.


Right, the main issue is what it always is, of course: nobody ever
expected the language to be as long-lived as it was, and then once it
becomes apparent that it actually is going to be around for a long time
they can't update it or add many new features for fear of breaking
backwards-compatibility for a bunch of legacy shit

I'd think that a good c compiler and a modern CPU could separate
i/d/stack spaces and prevent dumb buffer errors at least. Executing
data is unforgivable.



They can, and do. It's all in the compiler options. There are also all
kinds of bounds-checked arrays and other security features in the C++
standard library. But they're optional, which is as it should be.

Otherwise you break a lot of code.
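
A minimal sketch of that optional checking, assuming a current C++ standard
library (gcc and clang also offer hardening flags such as
-fstack-protector-strong and -D_FORTIFY_SOURCE=2): operator[] on
std::array/std::vector is unchecked, while at() is range-checked and throws.

    #include <array>
    #include <iostream>
    #include <stdexcept>

    int main() {
        std::array<int, 4> a{1, 2, 3, 4};

        // a[7] = 42;        // operator[]: no bounds check, silent corruption (UB)

        try {
            a.at(7) = 42;    // at(): bounds-checked, throws instead of corrupting memory
        } catch (const std::out_of_range& e) {
            std::cerr << "caught: " << e.what() << '\n';
        }
    }

Nothing here is mandatory, which is the point: you opt into the checked form
where you want it.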

Addressing out of bounds and executing data *should* be broken. By the
hardware.

I agree, but that's a separate issue. Outlawing C strings, for
instance, outlaws good code as well as bad. Those of us who know that,
for instance, strncpy() doesn't append a null if it runs out of space,
know enough to unconditionally put the null in there. (Yes, it's a
stupid design, but alternatives are available. The auto industry's
standards are actually pretty useful.)
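
A minimal illustration of the strncpy() gotcha, in plain ISO C: if the source
doesn't fit, strncpy() fills the whole buffer without a terminating null, so
the copy is followed by an unconditional terminator.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char buf[8];
        const char *src = "this string is too long";

        strncpy(buf, src, sizeof buf);   /* copies 8 chars, appends no null */
        buf[sizeof buf - 1] = '\0';      /* unconditionally terminate, as described above */

        printf("%s\n", buf);             /* prints "this st" */
        return 0;
    }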

My standard laptop is a Thinkpad 4x0 series. Their BIOS allows you to
turn on data execution prevention, and all modern MMUs have the same
capability.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
 
On 7/2/2015 8:26 PM, bitrex wrote:
On 7/2/2015 6:42 PM, Phil Hobbs wrote:
On 7/2/2015 5:29 PM, bitrex wrote:
On 7/2/2015 5:20 PM, John Larkin wrote:
On Thu, 02 Jul 2015 17:12:36 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 4:21 PM, John Larkin wrote:
On Thu, 02 Jul 2015 16:08:15 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

On 07/02/2015 01:47 PM, John Larkin wrote:
On Thu, 02 Jul 2015 11:21:42 +1000, Sylvia Else
<sylvia@not.at.this.address> wrote:

On 2/07/2015 5:13 AM, John Larkin wrote:

http://www.itworld.com/article/2694378/college-students-learning-cobol-make-more-money.html




The revival of Basic is next.


Apparently people who can resist the urge to gnaw
their own leg off from boredom command a premium.

Sylvia.

Yup. Accountants. Lawyers. Plastic surgeons.

Cobol was designed so that bankers could code. It was
brilliant.

https://en.wikipedia.org/wiki/COBOL#History_and_specification



Two of the designers were women, who were apparently more
interested in solving a real problem than they were
interested in playing mental games. Compare Cobol to c
or Pascal or APL.


And run screaming in the other direction. Cobol is
verbose and inflexible. Just the sheer amount of typing
would slow me down a lot.

C was described as designed by geniuses to be used by
geniuses. But most programmers aren't geniuses. Most people
need hard typing and runtime bounds checking and proper
memory management to keep out of trouble; they need
verbose. I cite basically all Microsoft products.


They weren't geniuses, they just knew that they couldn't do
fucking runtime bounds checking and "proper" memory
management on a PDP-11 with as much processing power as a
modern clock radio

That was 40 years ago.

Actually, the 11 had great memory management hardware, but c
isn't designed to be able to use it. Everything gets mixed up.


Right, the main issue is what it always is, of course: nobody
ever expected the language to be as long-lived as it was, and
then once it becomes apparent that it actually is going to be
around for a long time they can't update it or add many new
features for fear of breaking backwards-compatibility for a bunch
of legacy shit


Riiighhttt. Which is why C++11 is just the same as K&R 1.0.

Cheers

Phil Hobbs

Sure, you can compile some C code with a C++ compiler, but modern C++
and ordinary C are so distant as to be barely recognizable as the
same language.

Except that with very rare exceptions, all ISO C programs are also valid
C++ programs. Even old-time K&R code can be compiled by Visual C++,
gcc, Intel C++, and every other variant I know of.

Pretty far from "they can't update it or add many new features".
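
For the record, the "very rare exceptions" are mostly small things like the
two lines below, which are legal C but rejected by a C++ compiler (a sketch,
not an exhaustive list):

    #include <stdlib.h>

    int main(void) {
        int *p = malloc(10 * sizeof *p);  /* fine in C; C++ has no implicit void* conversion */
        int class = 3;                    /* fine in C; 'class' is a keyword in C++ */
        free(p);
        return class;
    }
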
And one could very easily argue that the mess that is C++ is
_precisely_ the reason you shouldn't take a 40-year-old language and
start trying to tack all sorts of modern features onto it...

I like C++ very much, because it's very general and very powerful. I'm
all ears for your arguments to the contrary.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
 
On Thu, 02 Jul 2015 20:20:56 -0400, Joe Gwinn <joegwinn@comcast.net>
wrote:

In article <3e2bpap8e70o7mnep2joboecjrdsggpp96@4ax.com>, John Larkin
<jlarkin@highlandtechnology.com> wrote:

On Thu, 2 Jul 2015 11:51:54 -0700 (PDT),
bloggs.fredbloggs.fred@gmail.com wrote:

On Thursday, July 2, 2015 at 2:47:36 PM UTC-4, rickman wrote:
On 7/2/2015 2:30 PM, bloggs.fredbloggs.fred@gmail.com wrote:
On Thursday, July 2, 2015 at 1:47:12 PM UTC-4, John Larkin wrote:

https://en.wikipedia.org/wiki/COBOL#History_and_specification

Two of the designers were women, who were apparently more interested
in solving a real problem than they were interested in playing mental
games. Compare Cobol to c or Pascal or APL.


A bunch of apocryphal bullshyte, knee deep, from the department which to
this day is un-auditable because of systemic incompetence and
criminality.

Which department is this? Wikipedia, NBS or any of the many books
written and cited?

" It was created as part of a US Department of Defense effort to create a
portable programming language for data processing. Intended as a temporary
stopgap, the Department of Defense promptly forced computer manufacturers to
provide it, resulting in its widespread adoption."

That part is right...

https://en.wikipedia.org/wiki/COBOL


--

Rick

The language

It's a shame that ADA wasn't as widely accepted. Programmers tend to
hate safe languages that make them be careful.

In the day, I was one of the many programmers who suffered through
Ada83. I made a little career of making Ada run fast enough to be
plausible in realtime applications like radars.

The method was simple but brutal - remove all of Ada that didn't look
exactly like Pascal, and if that wasn't enough, resort to assembly.
This was still Ada enough to qualify as Ada, and to meet the DoD
mandate.

Datapoint: My team implemented what would now be called middleware in
a severe subset of Ada83 plus some assembly code on a DEC VAX to
replace a pure-Ada message communications infrastructure. The subset
Ada plus assembly approach was literally ten times faster than the pure
Ada, and saved the project.

By the time Ada95 came out, it was too late - C/C++ had won.

Ada died because it was designed by academics who had no notion of
realtime, and thus made blunder after blunder, such that not even a DoD
Mandate could save Ada.

Pascal had the same problem. But they both had the idea that a
computer language should be safe. So now we have a nearly $100 billion
anti-virus industry, and lots of products that have had thousands of
security bugs.

It's not just the language c that is dangerous, it's the programming
culture that surrounds it. Rockets crash and planes fall out of the
sky because of bad code. Programming is the worst thing that
technology does, and nobody seems to care much.


--

John Larkin Highland Technology, Inc
picosecond timing precision measurement

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com
 
On 7/2/2015 8:06 PM, John Larkin wrote:
On Thu, 02 Jul 2015 18:07:30 -0400, rickman <gnuarm@gmail.com> wrote:

On 7/2/2015 6:03 PM, John Larkin wrote:

We can always buy faster CPUs.

That is literally the stupidest thing I've ever seen come from you.

You are being a jerk again. That seems to be your nature.

Really? Are you going to try to defend such a statement? There is no
limit to the speed of the CPU you can buy? No real limit? No practical limit?
No limit to what you can use in a given design?

I'm not being a jerk. I'm just trying to get you to see what you just
said. Care to comment on *your* statement?

--

Rick
 
On 7/2/2015 8:20 PM, Joe Gwinn wrote:
In article <3e2bpap8e70o7mnep2joboecjrdsggpp96@4ax.com>, John Larkin
<jlarkin@highlandtechnology.com> wrote:

On Thu, 2 Jul 2015 11:51:54 -0700 (PDT),
bloggs.fredbloggs.fred@gmail.com wrote:

On Thursday, July 2, 2015 at 2:47:36 PM UTC-4, rickman wrote:
On 7/2/2015 2:30 PM, bloggs.fredbloggs.fred@gmail.com wrote:
On Thursday, July 2, 2015 at 1:47:12 PM UTC-4, John Larkin wrote:

https://en.wikipedia.org/wiki/COBOL#History_and_specification

Two of the designers were women, who were apparently more interested
in solving a real problem than they were interested in playing mental
games. Compare Cobol to c or Pascal or APL.


A bunch of apocryphal bullshyte, knee deep, from the department which to
this day is un-auditable because of systemic incompetence and
criminality.

Which department is this? Wikipedia, NBS or any of the many books
written and cited?

" It was created as part of a US Department of Defense effort to create a
portable programming language for data processing. Intended as a temporary
stopgap, the Department of Defense promptly forced computer manufacturers to
provide it, resulting in its widespread adoption."

That part is right...

https://en.wikipedia.org/wiki/COBOL


--

Rick

The language

It's a shame that ADA wasn't as widely accepted. Programmers tend to
hate safe languages that make them be careful.

In the day, I was one of the many programmers who suffered through
Ada83. I made a little career of making Ada run fast enough to be
plausible in realtime applications like radars.

The method was simple but brutal - remove all of Ada that didn't look
exactly like Pascal, and if that wasn't enough, resort to assembly.
This was still Ada enough to qualify as Ada, and to meet the DoD
mandate.

Datapoint: My team implemented what would now be called middleware in
a severe subset of Ada83 plus some assembly code on a DEC VAX to
replace a pure-Ada message communications infrastructure. The subset
Ada plus assembly approach was literally ten times faster than the pure
Ada, and saved the project.

By the time Ada95 came out, it was too late - C/C++ had won.

Ada died because it was designed by academics who had no notion of
realtime, and thus made blunder after blunder, such that not even a DoD
Mandate could save Ada.

If you are an Ada expert, I will defer to your judgement. But was the
speed problem with Ada inherent in the language (therefore the fault of
the language designers) or was it just the shortcomings of the
implementations? What could have changed in Ada95 that would speed up
implementations? I know C has gotten faster over the years to the point
where it is hard to write hand optimized assembly that is faster. At
least that is what I read. I don't use C much these days and I can't
remember the last time I wrote assembly other than for stack machines
with a very simple assembly language.

I use mostly VHDL for hardware design, which is very Ada-like, and Forth
for apps, which is like a stack machine assembly language. A bit of a
strange dichotomy, but there is no Forth-like HDL.

--

Rick
 
On Thu, 02 Jul 2015 20:46:47 -0400, rickman <gnuarm@gmail.com> wrote:

On 7/2/2015 8:06 PM, John Larkin wrote:
On Thu, 02 Jul 2015 18:07:30 -0400, rickman <gnuarm@gmail.com> wrote:

On 7/2/2015 6:03 PM, John Larkin wrote:

We can always buy faster CPUs.

That is literally the stupidest thing I've ever seen come from you.

You are being a jerk again. That seems to be your nature.

Really? Are you going to try to defend such a statement? There is no
limit to the speed of the CPU you can buy? No real limit? No practical limit?
No limit to what you can use in a given design?

I'm not being a jerk. I'm just trying to get you to see what you just
said. Care to comment on *your* statement?

I haven't called you stupid before, but I do now.




--

John Larkin Highland Technology, Inc
picosecond timing precision measurement

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com
 
On 7/2/2015 8:54 PM, John Larkin wrote:
On Thu, 02 Jul 2015 20:46:47 -0400, rickman <gnuarm@gmail.com> wrote:

On 7/2/2015 8:06 PM, John Larkin wrote:
On Thu, 02 Jul 2015 18:07:30 -0400, rickman <gnuarm@gmail.com> wrote:

On 7/2/2015 6:03 PM, John Larkin wrote:

We can always buy faster CPUs.

That is literally the stupidest thing I've ever seen come from you.

You are being a jerk again. That seems to be your nature.

Really? Are you going to try to defend such a statement? There is no
limit to the speed of the CPU you can buy? No real limit? No practical limit?
No limit to what you can use in a given design?

I'm not being a jerk. I'm just trying to get you to see what you just
said. Care to comment on *your* statement?

I haven't called you stupid before, but I do now.

Lol. Whatever. Have a nice evening. :)

--

Rick
 
On 7/2/2015 10:17 PM, John Larkin wrote:
On Thu, 02 Jul 2015 17:04:49 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 3:24 PM, John Larkin wrote:
On Thu, 02 Jul 2015 15:18:29 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 3:00 PM, John Larkin wrote:
It's a shame that ADA wasn't as widely accepted. Programmers tend to
hate safe languages that make them be careful.

As someone who knows many folks who work in the software industry/game
design, it's kind of cute when EEs talk about things that they're so
very sure about...;-)

How many buffer overrun vulnerabilities has Windows had so far? Round
your answer to the nearest thousand.



I would feel fairly hard-pressed to use a Microsoft product as some kind
of archetypical example to give insight into what "Programmers" think, haha

Most big software projects are buggy messes or outright, often
gigabuck, failures. I don't know why CS departments don't seem to
care. Somebody should figure out how to do good software, and teach
it.

Maybe we should just stop using software until someone figures it out?
How about if we make all computers analog? Yeah, that has got to work
better, right? The digital stuff is boring and easy, even if no one
seems to be able to do it right. I guess the whole world is just stupid...

--

Rick
 
On Thu, 02 Jul 2015 17:04:49 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 3:24 PM, John Larkin wrote:
On Thu, 02 Jul 2015 15:18:29 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 3:00 PM, John Larkin wrote:
It's a shame that ADA wasn't as widely accepted. Programmers tend to
hate safe languages that make them be careful.

As someone who knows many folks who work in the software industry/game
design, it's kind of cute when EEs talk about things that they're so
very sure about...;-)

How many buffer overrun vulnerabilities has Windows had so far? Round
your answer to the nearest thousand.



I would feel fairly hard-pressed to use a Microsoft product as some kind
of archetypical example to give insight into what "Programmers" think, haha

Most big software projects are buggy messes or outright, often
gigabuck, failures. I don't know why CS departments don't seem to
care. Somebody should figure out how to do good software, and teach
it.




--

John Larkin Highland Technology, Inc
picosecond timing laser drivers and controllers

jlarkin att highlandtechnology dott com
http://www.highlandtechnology.com
 
On 7/2/2015 9:43 PM, Joe Gwinn wrote:
In article <mn4mc1$o6m$1@dont-email.me>, rickman <gnuarm@gmail.com>
wrote:

What could have changed in Ada95 that would speed up
implementations?

Fixing the many blunders of Ada83 was a large part of it. (There was
lots of field experience by then.) Adding true multiprocessor support
and the ability to control hardware was another. Adding support for
object-oriented programming (borrowed from C++). Etc.

All successful programming languages so far have had the following
things in common: Developed by a handful of people who wrote the
compiler at the same time. Not a committee writing a 300-page language
description in a vacuum. The early language and compilers co-evolved
as real users (typically the colleagues of the handful) used the
language for real work, and brought the problems back to the handful,
who fixed the problem by changing the language if needed, for a few
full versions. Full darwinist selection in the marketplace beyond the
handful and their colleagues - hundreds of languages qualify for the
common characteristics named before, but only a few languages achieve
more than a few hundred users. After such a language has succeeded in
the marketplace for some time, and has matured, it is formally
standardized (and thus frozen).

Ada83 violated all of the above.

As I said, I don't really know Ada, so I defer to your knowledge.


I know C has gotten faster over the years to the point
where it is hard to write hand optimized assembly that is faster. At
least that is what I read. I don't use C much these days and I can't
remember the last time I wrote assembly other than for stack machines
with a very simple assembly language.

Assembly language is always faster than compiled language, if competent
assembly programmers are used. And assembly can do things that are
very difficult in any compiled language, with direct hardware control
being a traditional area. But the cost is about four times that of
compiled C, so assembly is used sparingly. The UNIX kernel was about
4% assembly.

Again, I don't have first hand knowledge of compiler efficiency, but I
have discussed this with others who do and they tell me compilers are
every bit as good as hand coded assembly most of the time. That is no
surprise to me as this is a problem that has been worked on for a long
time. So forgive me if I don't defer to your opinion on this one.


There is also a nasty trick: write the code in C, and look at the
resulting generated assembly code. If it's worse than what one can
write manually, paraphrase the C code and try again. Eventually, the
optimum paraphrase will be found. In practice, C compilers are
sufficiently alike that a good paraphrase works on most compilers.

Do you have any examples?


I use mostly VHDL for hardware design which is very Ada like and Forth
for apps which is like a stack machine assembly language. A bit of a
strange dichotomy, but there is no Forth-like HDL.

VHDL was expressly based on Ada, and Verilog on C.

FORTH is a pure stack language, most resembling an HP calculator (with
RPN), and was developed for machine control by an astronomer. The
machines being controlled were things like large telescopes.

I was very interested in FORTH when it went public and they were being
very close-mouthed. I ended up developing my own prototype language,
which I called Fifth. The interpreter core was written in assembly, as
that was how to get the necessary speed. But I found Fifth to be too
limiting for what I was then doing, and never pursued it.

Close-mouthed? When it went public? Was this in the early 70's? Fig-Forth
was freely available in the 70's. Who was close-mouthed about it?
I remember attending a conference on Forth early in my career, about
1982 or '83. Hardly close-mouthed then. Which Forth did you find in which
the "inner interpreter" wasn't done in assembly? These days commercial
Forths are compilers with no interpretation of compiled code. Speed is
great, with good support and documentation if not actual source. You've
missed a lot.

I'm currently using a Forth that runs on the target processor of a TI
Stellaris Cortex M3 MCU, not even a high end one.

--

Rick
 
In article <cbmbpat3j9lmlaqa8v2p41b33tn079vaur@4ax.com>,
jlarkin@highlandtechnology.com says...
Ada plus assembly approach was literally ten times faster than the pure
Ada, and saved the project.

By the time Ada95 came out, it was too late - C/C++ had won.

Ada died because it was designed by academics who had no notion of
realtime, and thus made blunder after blunder, such that not even a DoD
Mandate could save Ada.

Pascal had the same problem. But they both had the idea that a
computer language should be safe. So now we have a nearly $100 billion
anti-virus industry, and lots of products that have had thousands of
security bugs.

It's not just the language c that is dangerous, it's the programming
culture that surrounds it. Rockets crash and planes fall out of the
sky because of bad code. Programming is the worst thing that
technology does, and nobody seems to care much.
And you'll find that many gear-head coders will jump fast to
point the finger at some user or user's PC having issues,
among other things, instead of actually looking at their code
and admitting to error. And when they find bad code, they still
won't admit to it. They even go as far as getting a few people
to collaborate with them, leading them down that dark alley.

I find it ironic how easily the public has been blinded by those
who write software and then load you day after day with security
fixes, when really it's their own buggy code they are fixing, before too
many figure it out.

Yes, the viruses and security threats, pumped up as much as possible,
make a nice cover story for bad code written by lazy coders, stealing
it from others while not being too concerned about it having bugs or some
back door injected into it.

You can say it's the blind leading the blind!


Jamie
 
In article <mn4mc1$o6m$1@dont-email.me>, rickman <gnuarm@gmail.com>
wrote:

On 7/2/2015 8:20 PM, Joe Gwinn wrote:
In article <3e2bpap8e70o7mnep2joboecjrdsggpp96@4ax.com>, John Larkin
<jlarkin@highlandtechnology.com> wrote:

On Thu, 2 Jul 2015 11:51:54 -0700 (PDT),
bloggs.fredbloggs.fred@gmail.com wrote:

On Thursday, July 2, 2015 at 2:47:36 PM UTC-4, rickman wrote:
On 7/2/2015 2:30 PM, bloggs.fredbloggs.fred@gmail.com wrote:
On Thursday, July 2, 2015 at 1:47:12 PM UTC-4, John Larkin wrote:

https://en.wikipedia.org/wiki/COBOL#History_and_specification

Two of the designers were women, who were apparently more interested
in solving a real problem than they were interested in playing mental
games. Compare Cobol to c or Pascal or APL.


A bunch of apocryphal bullshyte, knee deep, from the department which to
this day is un-auditable because of systemic incompetence and
criminality.

Which department is this? Wikipedia, NBS or any of the many books
written and cited?

" It was created as part of a US Department of Defense effort to create a
portable programming language for data processing. Intended as a temporary
stopgap, the Department of Defense promptly forced computer manufacturers
to
provide it, resulting in its widespread adoption."

That part is right...

https://en.wikipedia.org/wiki/COBOL


--

Rick

The language

It's a shame that ADA wasn't as widely accepted. Programmers tend to
hate safe languages that make them be careful.

In the day, I was one of the many programmers who suffered through
Ada83. I made a little career of making Ada run fast enough to be
plausible in realtime applications like radars.

The method was simple but brutal - remove all of Ada that didn't look
exactly like Pascal, and if that wasn't enough, resort to assembly.
This was still Ada enough to qualify as Ada, and to meet the DoD
mandate.

Datapoint: My team implemented what would now be called middleware in
a severe subset of Ada83 plus some assembly code on a DEC VAX to
replace a pure-Ada message communications infrastructure. The subset
Ada plus assembly approach was literally ten times faster than the pure
Ada, and saved the project.

By the time Ada95 came out, it was too late - C/C++ had won.

Ada died because it was designed by academics who had no notion of
realtime, and thus made blunder after blunder, such that not even a DoD
Mandate could save Ada.

If you are an Ada expert, I will defer to your judgement. But was the
speed problem with Ada inherent in the language (therefore the fault of
the language designers) or was it just the shortcomings of the
implementations?

It was the design of Ada83 the language, which forced all Ada83
compilers to be clumsy, and to emit slow code. There are many details.


What could have changed in Ada95 that would speed up
implementations?

Fixing the many blunders of Ada83 was a large part of it. (There was
lots of field experience by then.) Adding true multiprocessor support
and the ability to control hardware was another. Adding support for
object-oriented programming (borrowed from C++). Etc.

All successful programming languages so far have had the following
things in common: Developed by a handful of people who wrote the
compiler at the same time. Not a committee writing a 300-page language
description in a vacuum. The early language and compilers co-evolved
as real users (typically the colleagues of the handful) used the
language for real work, and brought the problems back to the handful,
who fixed the problem by changing the language if needed, for a few
full versions. Full darwinist selection in the marketplace beyond the
handful and their colleagues - hundreds of languages qualify for the
common characteristics named before, but only a few languages achieve
more than a few hundred users. After such a language has succeeded in
the marketplace for some time, and has matured, it is formally
standardized (and thus frozen).

Ada83 violated all of the above.


I know C has gotten faster over the years to the point
where it is hard to write hand optimized assembly that is faster. At
least that is what I read. I don't use C much these days and I can't
remember the last time I wrote assembly other than for stack machines
with a very simple assembly language.

Assembly language is always faster than compiled language, if competent
assembly programmers are used. And assembly can do things that are
very difficult in any compiled language, with direct hardware control
being a traditional area. But the cost is about four times that of
compiled C, so assembly is used sparingly. The UNIX kernel was about
4% assembly.

There is also a nasty trick: write the code in C, and look at the
resulting generated assembly code. If it's worse than what one can
write manually, paraphrase the C code and try again. Eventually, the
optimum paraphrase will be found. In practice, C compilers are
sufficiently alike that a good paraphrase works on most compilers.
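
A concrete way to play that game, assuming gcc or clang is installed: compile
a small candidate function straight to assembly, inspect it, then paraphrase
the C and repeat.

    /* sum.c -- candidate for paraphrasing until the generated code looks right */
    int sum(const int *a, int n) {
        int s = 0;
        for (int i = 0; i < n; ++i)
            s += a[i];
        return s;
    }

    /* To see the generated assembly:
     *   gcc -O2 -S sum.c -o sum.s
     * or disassemble the object file:
     *   gcc -O2 -c sum.c && objdump -d sum.o
     */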


I use mostly VHDL for hardware design which is very Ada like and Forth
for apps which is like a stack machine assembly language. A bit of a
strange dichotomy, but there is no Forth-like HDL.

VHDL was expressly based on Ada, and Verilog on C.

FORTH is a pure stack language, most resembling an HP calculator (with
RPN), and was developed for machine control by an astronomer. The
machines being controlled were things like large telescopes.

I was very interested in FORTH when it went public and they were being
very close-mouthed. I ended up developing my own prototype language,
which I called Fifth. The interpreter core was written in assembly, as
that was how to get the necessary speed. But I found Fifth to be too
limiting for what I was then doing, and never pursued it.

Joe Gwinn
 
On Thu, 2 Jul 2015 15:22:32 -0700 (PDT), Lasse Langwadt Christensen
<langwadt@fonz.dk> wrote:

On Friday, July 3, 2015 at 00:06:49 UTC+2, John Larkin wrote:
On Thu, 02 Jul 2015 17:29:33 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 5:20 PM, John Larkin wrote:
On Thu, 02 Jul 2015 17:12:36 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 4:21 PM, John Larkin wrote:
On Thu, 02 Jul 2015 16:08:15 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

On 07/02/2015 01:47 PM, John Larkin wrote:
On Thu, 02 Jul 2015 11:21:42 +1000, Sylvia Else
<sylvia@not.at.this.address> wrote:

On 2/07/2015 5:13 AM, John Larkin wrote:

http://www.itworld.com/article/2694378/college-students-learning-cobol-make-more-money.html

The revival of Basic is next.


Apparently people who can resist the urge to gnaw their own leg off from
boredom command a premium.

Sylvia.

Yup. Accountants. Lawyers. Plastic surgeons.

Cobol was designed so that bankers could code. It was brilliant.

https://en.wikipedia.org/wiki/COBOL#History_and_specification

Two of the designers were women, who were apparently more interested
in solving a real problem than they were interested in playing mental
games. Compare Cobol to c or Pascal or APL.


And run screaming in the other direction. Cobol is verbose and
inflexible. Just the sheer amount of typing would slow me down a lot.

C was described as designed by geniuses to be used by geniuses. But
most programmers aren't geniuses. Most people need hard typing and
runtime bounds checking and proper memory management to keep out of
trouble; they need verbose. I cite basically all Microsoft products.


They weren't geniuses, they just knew that they couldn't do fucking
runtime bounds checking and "proper" memory management on a PDP-11 with
as much processing power as a modern clock radio

That was 40 years ago.

Actually, the 11 had great memory management hardware, but c isn't
designed to be able to use it. Everything gets mixed up.


Right, the main issue is what it always is, of course: nobody ever
expected the language to be as long-lived as it was, and then once it
becomes apparent that it actually is going to be around for a long time
they can't update it or add many new features for fear of breaking
backwards-compatibility for a bunch of legacy shit

I'd think that a good c compiler and a modern CPU could separate
i/d/stack spaces and prevent dumb buffer errors at least. Executing
data is unforgivable.


it isn't as simple as in some small MCU with flash for program and RAM for data and stack

stuff on an HDD is data; at some point some of it might become code

modern OSs try their best, with stuff like the NX bit

-Lasse
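
As a concrete illustration of the quoted "data may become code" point (a
sketch for Linux/x86-64 only, with hand-assembled bytes, so treat it as
illustrative): a loader or JIT maps a page writable, copies machine code into
it, and only then flips it executable with mprotect(), which is exactly the
transition the NX bit is there to police.

    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* x86-64 machine code for: mov eax, 42 ; ret */
        unsigned char code[] = {0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3};

        /* start with a writable, non-executable page (W^X) */
        void *p = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (p == MAP_FAILED) return 1;
        memcpy(p, code, sizeof code);

        /* data becomes code: drop write permission, add execute */
        if (mprotect(p, 4096, PROT_READ | PROT_EXEC) != 0) return 1;

        int (*fn)(void) = (int (*)(void))p;
        printf("%d\n", fn());   /* prints 42 */
        return 0;
    }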

Even the older x86 had some protection bits at the segment register
level (yes, the 32-bit 386 uses segment registers, but in a different way
than the 8086). A good compiler would have allocated code and data spaces
separately, with proper protection for each segment. Keeping the
segment areas well separated from each other in the virtual address
space also helps in detecting segment overruns.

For the same reason, the first page is often not mapped, so it is easy
to trap NULL pointer references.
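
The unmapped first page is why the classic null-pointer bug faults
immediately instead of silently scribbling on low memory; a deliberately
fatal sketch:

    #include <stdio.h>

    int main(void) {
        int *p = NULL;
        puts("about to dereference NULL...");
        *p = 1;   /* page 0 is not mapped, so this traps (SIGSEGV) instead of writing */
        return 0;
    }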
 
On Thu, 02 Jul 2015 17:54:35 -0700, John Larkin
<jlarkin@highlandtechnology.com> Gave us:

>I haven't called you stupid before, but I do now.

John Larkin in rare form tonight... NOPE, this *stupid* shit behavior
from him is his *norm*.

He blathers on about civil behavior ALL THE TIME.

How much you wanna bet that he completely ignored my tale about when I
saved a part at a company I once worked for?

It was in response to his "I caught a hot iron once." remark, and I
didn't cuss even once.

From John... crickets.
 
On 03/07/15 05:24, John Larkin wrote:
On Thu, 02 Jul 2015 15:18:29 -0400, bitrex
As someone who knows many folks who work in the software industry/game
design, it's kind of cute when EEs talk about things that they're so
very sure about...;-)
How many buffer overrun vulnerabilities has Windows had so far? Round
your answer to the nearest thousand.

None, because some more careless company would have put Microsoft out of
business. We have the "quality" we were prepared to pay for, more or less.

Clifford Heath.
 
On Friday, July 3, 2015 at 00:03:35 UTC+2, John Larkin wrote:
On Thu, 2 Jul 2015 14:26:20 -0700 (PDT), Lasse Langwadt Christensen
<langwadt@fonz.dk> wrote:

On Thursday, July 2, 2015 at 22:12:14 UTC+2, John Larkin wrote:
On Thu, 2 Jul 2015 12:33:39 -0700 (PDT), Lasse Langwadt Christensen
<langwadt@fonz.dk> wrote:

On Thursday, July 2, 2015 at 21:24:41 UTC+2, John Larkin wrote:
On Thu, 02 Jul 2015 15:18:29 -0400, bitrex
<bitrex@de.lete.earthlink.net> wrote:

On 7/2/2015 3:00 PM, John Larkin wrote:
It's a shame that ADA wasn't as widely accepted. Programmers tend to
hate safe languages that make them be careful.

As someone who knows many folks who work in the software industry/game
design, it's kind of cute when EEs talk about things that they're so
very sure about...;-)

How many buffer overrun vulnerabilities has Windows had so far? Round
your answer to the nearest thousand.


and you know for certain that if only they had used ADA everything would
be perfect?

Not perfect, but runtime bounds checking and proper memory and stack
management keep strangers from executing data on your PC. And you
can't get wild pointers if you don't use pointers.
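
A sketch of the wild-pointer half of that claim, in the C++ the rest of the
thread is arguing about (the names are made up for illustration): a raw
owning pointer invites use-after-free, while std::unique_ptr ties the
object's lifetime to a scope.

    #include <iostream>
    #include <memory>

    struct Sensor { int value = 7; };

    int main() {
        Sensor *raw = new Sensor;
        delete raw;
        // raw is now a dangling ("wild") pointer; dereferencing it is undefined behaviour
        // std::cout << raw->value << '\n';

        auto owned = std::make_unique<Sensor>();   // freed automatically at end of scope
        std::cout << owned->value << '\n';
        return 0;
    }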


if you make sure your soldering iron never gets over 50°C you won't burn your fingers if you grab the wrong end

-Lasse


I once dropped an iron and caught it in mid-air. Only once.

Some people can write bug-free, really solid c. Most programmers
can't. It's easier in embedded systems than in OSs and OS-level apps.

My people really like Python lately, for Windows and Linux engineering
and test software and such, not embedded. As an interpreter, it trades
performance for safety. We can always buy faster CPUs.

up to a point, haven't you been wishing for a faster LTspice?
if it was as easy as buying a faster CPU, why don't you?

-Lasse
 
On Friday, July 3, 2015 at 12:52:52 UTC+2, Martin Brown wrote:
If you want to sell your soul for maximum financial gain then
destabilising the global stock trading systems with sophisticated high
frequency trading algorithms is definitely the way to go.

One guy in the UK in his parents' bedroom can allegedly do this:

http://www.bbc.co.uk/news/business-32415664

People seem to get upset if you are too good at it!

yeah, you have to be a member of "the money-sucking parasite club" to
manipulate prices and steal money like that


-Lasse
 
On 01/07/2015 20:35, Lasse Langwadt Christensen wrote:
On Wednesday, July 1, 2015 at 21:13:49 UTC+2, John Larkin wrote:

http://www.itworld.com/article/2694378/college-students-learning-cobol-make-more-money.html

Only so long as Cobol programmers are in short supply.
These things run in cycles.

The revival of Basic is next.


I doubt it, the reason to teach people cobol is that there is still a
ton of cobol code in use and I assume those who originally learned it
are getting a bit grey

The problem is in maintaining code which is critical to the banks' core
clearing operations (as opposed to the snazzy trader floor stuff).

They don't spend much on the legacy systems and they do tend to have
spectacular MFUs from time to time. Some banks more than others.

I saw an article a couple of years ago: something like 75% of all business transactions
and 90% of financial transactions are still done in cobol.
200 billion lines of code running, 5 billion lines added every year


-Lasse

If you want to sell your soul for maximum financial gain then
destabilising the global stock trading systems with sophisticated high
frequency trading algorithms is definitely the way to go.

One guy in the UK in his parents' bedroom can allegedly do this:

http://www.bbc.co.uk/news/business-32415664

People seem to get upset if you are too good at it!

--
Regards,
Martin Brown
 
On 02/07/2015 23:07, rickman wrote:
On 7/2/2015 6:03 PM, John Larkin wrote:

We can always buy faster CPUs.

That is literally the stupidest thing I've ever seen come from you.

I don't often defend John's comments, but on this he does have a point.
For most home and office kit these days the CPU power available is so
huge that efficiency literally does not matter in end-user code.

The notable exceptions are video editing and 3D gaming, which really do
push the performance envelope. For word processing and general stuff you
can trade a few percent of speed for safety without any problems.

Fast CPUs are cheap and getting cheaper in line with Moore's law whereas
good software engineers are rare, expensive and getting more so.

TBH I am amazed that Moore's law has held good for as long as it has.

--
Regards,
Martin Brown
 
