Driver to drive?

On Dec 30, 12:58 am, k...@attt.bizz wrote:
On Sat, 29 Dec 2012 17:44:45 -0800 (PST), Mahipal
mahipal7...@gmail.com> wrote:
On Dec 29, 8:33 pm, k...@attt.bizz wrote:
On Sat, 29 Dec 2012 17:11:48 -0800 (PST), Mahipal

[trim]

Interesting reference this Bhopal incident. Por que?

As long as you're trolling, might just as well have fun.

Look, it's snowing outside my window. I've got internet to serve
mankind all night long. As do you too. O well. So Bhopal rhymed with
Mahipal. Not my problem. Nice you got rhyme in your mind. Must make
you right?

Sucks to be you.

Can't and will not argue that. Duh.

No, I can't argue how much it sucks to be you.
Speaking of trolls...

The high unemployment rates prove altruism is dead.

Repeating your idiocy just proves the point; you're an idiot.

Re-demonstrating your sensitivity proves what I said to be true.

No, just continuing your idiocy.

You are beneath the levels my idiocy can reach.

The high unemployment rates prove altruism is dead.

Repeating a lie doesn't make it the truth. Lefties like to think so,
though.

As if the righties never repeat a word. So true, and I agree, that
they the righties do not think. Keep the words coming you nameless
twit.

Look, loser, you're the one who's repeating a lie, hoping that someone
will think it's true.

You are the one hoping that I lie. Fyi, I am not repeating anything. I
authored the line that bothers you to the core.

You do lie. Live with it.
Speaking of trolls...

The high unemployment rates prove altruism is dead. Deal with it.

Proof.

I as an individual am no proof of the statistics in the News and its
nearest sister Social Media. Deal with it you nameless Mother Idiot
Twit (MIT). Or do you not know how to use punctuation?

IOW, you're another idiot troll. Now run along and troll elsewhere.
Pot Kettle Speaking of Black trolls... Good riddance.
 
<visualforth@rocketmail.com> wrote in message
news:6e281116-c867-4626-ad67-945faae19c4a@googlegroups.com...

Several times there have been requests and attempts
for building a computer having Forth as Operating System.
Yes, mostly by "gavino" ...


All of the electronics discussion related to computer motherboards
on comp.lang.forth and comp.lang.asm.x86 of late are beginning to
annoy me. No one here, except me, seems to know anything about
electronic manufacturing. However, it's been a while since I
worked in electronics industry. So, I added a couple of other
newsgroups ...

The OP's complete post here (new and old style Google Groups):
https://groups.google.com/d/topic/comp.lang.forth/WMwRd49iTj0/discussion
http://groups.google.com/group/comp.lang.forth/msg/d805cba622cdfa10

There are Open Source projects in the making.
Today I read about this one:

"Building my Own Laptop"

Quotation:

We are building an open laptop, with some wacky
features in it for hackers like me.
Isn't "wacky" just a euphemism for "hack" ... ?

This is a lengthy project. Fortunately, ARM CPUs are getting
fast enough, [...]
Yes, ARM microprocessors are becoming faster, but they're also
requiring more power to do so. x86 designs are reducing their
power consumption at the same time. Once the ARMs reach x86
performance levels, they won't have any advantage.

Myth? Even Gordon Moore predicted his law would slow down.
He was wrong. Did something change?

so that even if it took a year or so to complete, I won't
be left with a woefully useless design.
Wrong. By the time you "complete" your project, i.e., after
many years, all the hardware you just coded for will be obsolete.
All hardware requires a huge amount of custom software to make it
work correctly. I.e., if you buy a new PC or laptop, you'll have
to start all over again in five years, or you'll need a team of
people to continually keep the code current.

Today's state of the art ARM CPUs - quad-core with GHz+
performance levels - is good enough for most day-to-day
code development, email checking, browsing etc...
GHz+ performance level is overkill for those tasks.

A DX4 at 133 MHz was good enough, except for video.
An AMD K6-2 at 500 MHz was good enough with video.

Of course, a feature of a build-it-yourself laptop is that all
the design documentation is open, [...]
Good luck with that. I.e., highly unlikely.

Such a design will likely use COTS ("commercial off-the-shelf")
components, such as standard PC chipsets, standard PC GPUs, etc.
*ALL* of that stuff is proprietary and closed. If the project is
serious about that, you'll need to find, and likely pay
handsomely, an experienced FPGA circuit designer to design custom
logic.

so others of sufficient skill and resources can also build
it.
Good luck with that. I.e., highly unlikely.

If the project uses custom FPGAs, everyone attempting to build it
will need either 1) an FPGA programmer or 2) to buy programmed FPGAs
from the project. If the project uses COTS, e.g., AMD or Intel
motherboard chipset, then you'll have to find a way to purchase
the components in small quantities.

The hardware and its sub-components are picked so as to make
this the most practically open hardware laptop I could create
using state of the art technology.
Delirious?

You can download, without NDA, the datasheets for all the
components, and key peripheral options are available so it's
possible to build a complete firmware from source with no
opaque blobs.

Source: http://www.bunniestudios.com/blog/?p=2686
Well, after seeing his webpage, he's clearly not delirious in
terms of "using state of the art technology". But, he's *solidly*
delirious in terms of "others of sufficient skill and resources
can also build it." He clearly has access to a motherboard
manufacturer - needed to get a motherboard manufactured - and is
part of an electronics manufacturing firm somewhere - to get
high-tech parts in small quantities for prototypes. I.e., he's
probably an EE working for a major electronics firm. His
motherboard is an **UNREALISTIC** achievement for a hobbyist.
That's a modern multi-layer motherboard with current surface mount
components. Firstly, "you" - any given hobbyist - don't know
enough about electronics to design one yourself. Circuit board
design is a highly-specialized trade. Secondly, "you" don't have
access to the professional-grade design software required to
design one. No, a professional motherboard manufacturer *will
not* accept designs from open-source software even if in the
correct file formats. Thirdly, "you" can't get the modern
components. So, you wouldn't be able to get one of those
manufactured. Given you can't do those things, you're going to
have to buy a motherboard he's had manufactured and a part set
from him. Even if you could buy the motherboard and parts from
him, you couldn't assemble the board yourself. Doing so requires
access to wave-soldering machines for through-hole components and
SMT oven-soldering machines for the SMT components. Also, small
electronic manufacturing firms have to submit minimum orders well
into the thousands before a board manufacturer will even consider
a run of boards.

My favorite would be a laptop with an RTX2000-like
microprocessor instead of the ARM. A friend of mine had
developed such a computer with the RTX2000, twenty years ago,
everything wirewrapped. It's a pity that I didn't take the
chance to make a PCB out of it. This time is gone, and today
the only microprocessor which comes near to an RTX2000
style microprocessor is the ARM.
For small quantities, wirewrapped or point-to-point or grid-style
board or perfboard is the way to go. Unfortunately, a modern
multi-layer board has way too many connections for any hobbyist
method to work correctly. The lengths of the wire traces affect
capacitance, resistance, and inductance. Engineers generally like
to keep the traces as short as possible. The multiple layers
help with that.

In my opinion an ARM based laptop with Forth as
Operating System is feasible.
Everything is feasible. Is it worthy? That's the question you
should be asking.

I.e., if the OS is slower, the same speed, or even slightly faster
than a hobbyist x86 OS in assembly or Linux, is there a point to
using Forth? (No.) It must be _significantly_ better in some
way. Cost can be that factor, but that's not the case here. He's
used some very recent components.

There may be parts of the I/O software written in C needed
to be used, but it will be manageable to access this I/O
from a Forth OS via entry points.
If the OS must be partially coded in C, is there any point to
having any of it coded in Forth? (Not really.)


Rod Pemberton
 
Den 31-12-2012 10:49, Rod Pemberton skrev:
visualforth@rocketmail.com> wrote in message

design one. No, a professional motherboard manufacturer *will
not* accept designs from open-source software even if in the
correct file formats.
Of course they will. It's a business. You pay, they play:

http://www.pcb-pool.com/ppuk/info_pcb_assembling.html

Same story with CAM and 3D-printing - today there are many manufacturers
that will run your designs from CAD files.
 
Rod Pemberton wrote:
visualforth@rocketmail.com> wrote in message

[...] Moore's Law is slowing down, [...]

Myth? Even Gordon Moore predicted his law would slow down.
He was wrong. Did something change?
Was everything I read in the 70s and 80s wrong?

He predicted the number of elements per IC would double every year, and it
did from 1959 to the early 80s.

In the 70s, everyone said we would hit a wall, probably in the 80s. It was
pointed out that if progress did not slow, we would have billion-transistor
chips by 2000. It did take several years longer.
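That 70s extrapolation is easy to check with simple arithmetic. As a sketch, taking the often-quoted 2300-transistor figure for the 1971 Intel 4004 as a starting point (an illustrative choice, not a figure from this post):

```python
# If transistor counts had kept doubling every year from the 4004's
# 2300 transistors in 1971, when would a chip pass a billion?
count, year = 2300, 1971
while count < 1_000_000_000:
    count *= 2
    year += 1
print(year)  # -> 1990, i.e. well before 2000 at the yearly rate
```

So at the original "every year" rate, billion-transistor chips were indeed due before 2000; the slip to the 2000s reflects the slower doubling period.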

When we hit the wall in the 80s it was said that Carver Mead rewrote the
book on processes, and enabled continued progress.

But that progress was reduced to doubling every 18 months, and recently
every 2 years.

Every citation of "Moore's Law" in the 70s said "every year", but people
keep redefining it. Sure, it will last forever if you redefine it.


--

Reply in group, but if emailing add one more
zero, and remove the last word.
 
On Mon, 31 Dec 2012 15:06:42 -0500, "Tom Del Rosso"
<tomd_u1@verizon.net.invalid> wrote:

Rod Pemberton wrote:
visualforth@rocketmail.com> wrote in message

[...] Moore's Law is slowing down, [...]

Myth? Even Gordon Moore predicted his law would slow down.
He was wrong. Did something change?

Was everything I read in the 70s and 80s wrong?
[trim]

http://en.wikipedia.org/wiki/Immersion_lithography

We are now nearing these newer limitations.

We will end up on optical computers before long. Then the march starts
all over again.
 
On 12/31/2012 3:06 PM, Tom Del Rosso wrote:
Rod Pemberton wrote:
visualforth@rocketmail.com> wrote in message

[...] Moore's Law is slowing down, [...]

Myth? Even Gordon Moore predicted his law would slow down.
He was wrong. Did something change?

Was everything I read in the 70s and 80s wrong?

He predicted the number of elements per IC would double every year, and it
did from 1959 to the early 80s.

In the 70s, everyone said we would hit a wall, probably in the 80s. It was
pointed out that if progress did not slow, we would have billion-transistor
chips by 2000. It did take several years longer.

When we hit the wall in the 80s it was said that Carver Meade rewrote the
book on processes, and enabled continued progress.

But that progress was reduced to doubling every 18 months, and recently
every 2 years.

Every citation of "Moore's Law" in the 70s said "every year", but people
keep redefining it. Sure, it will last forever if you redefine it.
In 1965 Moore observed that growth was doubling every year and he saw no
reason for it not to continue for 10 years. So extrapolating beyond
1975 is outside of Moore's law. That said, I found a graph on the
wikipedia site that seems to be showing a measurement of microprocessors
from 1971 to 2011, 40 years. In this time the transistor count has
grown from 2300 to 2.6 billion. If I did the math right transistor
density is doubling every two years.
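Rick's "if I did the math right" can be checked directly from the figures he cites (2300 transistors in 1971, 2.6 billion in 2011):

```python
import math

# Implied doubling time for growth from 2300 transistors (1971)
# to 2.6 billion (2011), the figures cited above.
doublings = math.log2(2.6e9 / 2300)   # about 20 doublings in 40 years
years_per_doubling = 40 / doublings   # comes out very close to 2 years
```
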

This graph shows little variation from this slope over the last 40
years. So I would say that Moore's law likely held during the 10 years
he expected it to and has been slightly declining since, but has been
pretty consistent over the last 40 years at this lower rate. I don't
see any reduction in the rate in more recent years, in fact it looks
like there was some recovery of the rate of growth over the last 12 or
15 years from a slump in the 90's.

Is there some distortion to the curve because of using only
microprocessor data and not memory, etc?

Rick
 
On 12/31/2012 3:06 PM, Tom Del Rosso wrote:
Rod Pemberton wrote:
visualforth@rocketmail.com> wrote in message
[trim]
Sorry, forgot to post the link to the graph. Here is the Moore's Law
page with the image.

http://en.wikipedia.org/wiki/Moore%27s_law

Rick
 
totally disagree, at higher temperatures these will smell bad.
and vibration will get you too.
there is a paper describing charge carrier mutation in flexible substrates
for a 100 to 101 percent increase of body temperature under argon atmosphere in space environments
that points out the effect of climate change on these constructs.
so no reason to worry about it in more mundane amplifier stages unless at stage performances where the base drum resonates with the substrates.
 
rickman <gnuarm@gmail.com> writes:
Is there some distortion to the curve because of using only
microprocessor data and not memory, etc?
Possibly. In <2002Dec22.163332@a0.complang.tuwien.ac.at> I computed
the price/bit of DRAM (from 1983 to 2002), and got an average halving
in 17 months.
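For scale, a halving every 17 months compounds dramatically over that window; a quick sketch using Anton's figure:

```python
# Cumulative effect of DRAM price/bit halving every 17 months over
# the 1983-2002 window measured above (19 years = 228 months).
months = (2002 - 1983) * 12
halvings = months / 17        # about 13.4 halvings
price_ratio = 2 ** halvings   # price/bit fell by roughly four
                              # orders of magnitude
```
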

- anton
--
M. Anton Ertl http://www.complang.tuwien.ac.at/anton/home.html
comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
New standard: http://www.forth200x.org/forth200x.html
EuroForth 2012: http://www.euroforth.org/ef12/
 
On 1/2/2013 1:16 PM, Anton Ertl wrote:
rickman<gnuarm@gmail.com> writes:
Is there some distortion to the curve because of using only
microprocessor data and not memory, etc?

Possibly. In<2002Dec22.163332@a0.complang.tuwien.ac.at> I computed
the price/bit of DRAM (from 1983 to 2002), and got an average halving
in 17 months.
Regardless of whether the curve is softening a little or not, I am
amazed that it has lasted as long as it has. Lately I have been hearing
that we may really be reaching the end of what can be done "easily",
meaning it is getting a lot harder to keep extending that curve. I
understand we will be seeing some fundamental limits becoming real
problems around 10 nm... but I have heard that song play on the radio
before.

From what I've heard, the real issues are not advances in process
technology, but that we don't have any good use for more transistors in
CPUs or more CPUs on a chip. Plus, the market has been changing so that
more transistors aren't what is needed, but less power consumption for
the highly portable devices.

I was a bit surprised by the huge adoption of mobile computing over the
last 10 years. Having gotten over my surprise, I expect the PC to
become a much smaller player to the handheld and tablet form factors with
the resulting emphasis on very low power and the resulting change in
emphasis in processing technology from density to power consumption.

Rick
 
On Wed, 02 Jan 2013 18:22:41 -0500, rickman <gnuarm@gmail.com> wrote:

On 1/2/2013 1:16 PM, Anton Ertl wrote:
rickman<gnuarm@gmail.com> writes:
Is there some distortion to the curve because of using only
microprocessor data and not memory, etc?

Possibly. In<2002Dec22.163332@a0.complang.tuwien.ac.at> I computed
the price/bit of DRAM (from 1983 to 2002), and got an average halving
in 17 months.

Regardless of whether the curve is softening a little or not, I am
amazed that it has lasted as long as it has. Lately I have been hearing
that we may really be reaching the end of what can be done "easily",
meaning it is getting a lot harder to keep extending that curve. I
understand we will be seeing some fundamental limits becoming real
problems around 10 nm... but I have heard that song play on the radio
before.
Indeed.

From what I've heard, the real issues are not advances in process
technology, but that we don't have any good use for more transistors in
CPUs or more CPUs on a chip. Plus, the market has been changing so that
more transistors aren't what is needed, but less power consumption for
the highly portable devices.
That's an old song, too.

I was a bit surprised by the huge adoption of mobile computing over the
last 10 years. Having gotten over my surprise, I expect the PC to
become a much smaller player to the handheld and table form factors with
the resulting emphasis on very low power and the resulting change in
emphasis in processing technology from density to power consumption.
I've been hearing that song for a couple of decades, too.

The reality is that we can't know what the next "big thing" is. We do
know it's not the current "big thing".
 
On 1/2/2013 6:30 PM, krw@attt.bizz wrote:
On Wed, 02 Jan 2013 18:22:41 -0500, rickman<gnuarm@gmail.com> wrote:

From what I've heard, the real issues are not advances in process
technology, but that we don't have any good use for more transistors in
CPUs or more CPUs on a chip. Plus, the market has been changing so that
more transistors aren't what is needed, but less power consumption for
the highly portable devices.

That's an old song, too.

I was a bit surprised by the huge adoption of mobile computing over the
last 10 years. Having gotten over my surprise, I expect the PC to
become a much smaller player to the handheld and table form factors with
the resulting emphasis on very low power and the resulting change in
emphasis in processing technology from density to power consumption.

I've been hearing that song for a couple of decades, too.

The reality is that we can't know what the next "big thing" is. We do
know it's not the current "big thing".
Not sure why you say that is a "couple of decades" old. In '93 they
were still pushing for "longer, lower, wider" to quote the auto
industry's motto during the years when they blithely promoted fancier
cars with shiny doodads instead of safety or lower pollution. PCs
didn't reach their peak power dissipation until they were over 100 Watts
in what, 2000 something? Now they hardly have a PC CPU that uses 100
Watts. I think even the ultra powerful server CPUs try to keep the
power consumption down as it costs more to cool the equipment than it
does to power it.

Rick
 
On Wed, 02 Jan 2013 18:42:03 -0500, rickman <gnuarm@gmail.com> wrote:

On 1/2/2013 6:30 PM, krw@attt.bizz wrote:
On Wed, 02 Jan 2013 18:22:41 -0500, rickman<gnuarm@gmail.com> wrote:

From what I've heard, the real issues are not advances in process
technology, but that we don't have any good use for more transistors in
CPUs or more CPUs on a chip. Plus, the market has been changing so that
more transistors aren't what is needed, but less power consumption for
the highly portable devices.

That's an old song, too.

I was a bit surprised by the huge adoption of mobile computing over the
last 10 years. Having gotten over my surprise, I expect the PC to
become a much smaller player to the handheld and table form factors with
the resulting emphasis on very low power and the resulting change in
emphasis in processing technology from density to power consumption.

I've been hearing that song for a couple of decades, too.

The reality is that we can't know what the next "big thing" is. We do
know it's not the current "big thing".


Not sure why you say that is a "couple of decades" old. In '93 they
were still pushing for "longer, lower, wider" to quote the auto
industry's motto during the years when they blithely promoted fancier
cars with shiny doodads instead of safety or lower pollution.
Perhaps not '93, but by '95 the main issue of PCs had turned to power.
It didn't get fixed, mainly because the packaging people were better
than anyone gave them credit for.

PCs
didn't reach their peak power dissipation until they were over 100 Watts
in what, 2000 something? Now they hardly have a PC CPU that uses 100
Watts. I think even the ultra powerful server CPUs try to keep the
power consumption down as it costs more to cool the equipment than it
does to power it.
Just because the problem wasn't "solved" (it really never was - people
just got bored with balls-to-the-wall performance), doesn't mean it
wasn't a primary concern.
 
Actually, there is no definitive line between weather & climate
that I have ever seen. Anyway, when I started reading *Science*
in the mid-'80s, it just so happened that the terminology was
changing at that moment.

What used to be known as the quasibiennial southern oscillation (QSO),
with a periodicity of about 26 months, came to be called the El Nino-
southern oscillation, or ENSO. Apparently, the QSO became chaotic.

I say this as not being a Denierist nor a Confirmerist, but
as a student of the Quaternary Period -- you be when -- although
I do aver that there is no proper idiom of "global" warming.
 
On 2012-12-31, Rod Pemberton <do_not_have@notemailnotz.cnm> wrote:



Good luck with that. I.e., highly unlikely.

If the project uses custom FPGA's, everyone attempting to build it
will need either 1) an FPGA programmer or 2) buy programmed FPGA's
from the project. If the project uses COTS, e.g., AMD or Intel
motherboard chipset, then you'll have to find a way to purchase
the components in small quantities.
FPGAs are programmed in the field, not in the factory; every time you
reset the device you need to reprogram the FPGA.

No, a professional motherboard manufacturer *will
not* accept designs from open-source software even if in the
correct file formats.
I'm not sure what a "professional motherboard manufacturer" is,
but the likes of PCBCart will manufacture the boards I doodle up in
(gEDA) PCB.

Thirdly, "you" can't get the modern components.
It's not hard to open a "cash" account with the three big online
suppliers (Digi-Key, Mouser, Farnell). You may need to register
a one-man business first.

access to wave-soldering machines for through-hole components and
SMT oven-soldering machines for the SMT components.
Through-hole can be soldered with a soldering iron, as can leaded SMD
(like SOIC). (Lead as in wire, not as in Pb.)
Non-leaded packages like QFN and BGA require an oven with a controlled
temperature profile. An ordinary kitchen appliance with a conscientious
operator watching the temperature and the clock can probably fit the
bill if he's only making one.
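"Watching the temperature and the clock" amounts to following a reflow profile by hand. As a sketch only, with illustrative stage temperatures and durations (not a solder-paste vendor's spec), the bookkeeping looks like:

```python
# An illustrative hand-followed reflow profile: (stage, target deg C,
# seconds). Numbers are for illustration, not from any datasheet.
profile = [
    ("preheat",  150,  90),
    ("soak",     180,  60),
    ("reflow",   245,  30),  # peak must exceed the solder melting point
    ("cooldown",  50, 120),
]
SAC_MELT_C = 217  # typical lead-free (SAC) alloy melting point

peak_c = max(temp for _, temp, _ in profile)
assert peak_c > SAC_MELT_C  # otherwise the joints never reflow
total_s = sum(secs for _, _, secs in profile)
```
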

Also, small
electronic manufacturing firms have to submit minimum orders well
into the thousands before a board manufacturer will even consider
a run of boards.
The more you order, the cheaper each one gets;
for the price to be competitive it will need a large order.

My favorite would be a laptop with an RTX2000-like
microprocessor instead of the ARM.
Only two stacks, and tiny ones at that. If you like stacks, why not
use an M68000 instead?

I.e., if the OS is slower, the same speed, or even slightly faster
than an a hobbyist x86 OS in assembly or Linux, is there point to
using Forth? (No.) It must be _significantly_ better in some
way. Cost can be that factor, but that's not the case here. He's
used some very recent components.

There may be parts of the I/O software written in C needed
to be used, but it will be manageable to access this I/O
from a Forth OS via entry points.

If the OS must be partially coded in C, is there any point to
having any of it coded in Forth? (Not really.)
Forth is an incremental compiler; hardware drivers can probably be
done in assembler and Forth.

--
⚂⚃ 100% natural

--- news://freenews.netfront.net/ - complaints: news@netfront.net ---
 
On 3 Jan 2013 11:11:24 GMT, Jasen Betts <jasen@xnet.co.nz> wrote:

On 2012-12-31, Rod Pemberton <do_not_have@notemailnotz.cnm> wrote:



Good luck with that. I.e., highly unlikely.

If the project uses custom FPGA's, everyone attempting to build it
will need either 1) an FPGA programmer or 2) buy programmed FPGA's
from the project. If the project uses COTS, e.g., AMD or Intel
motherboard chipset, then you'll have to find a way to purchase
the components in small quantities.

FPGAs are programmed in the field not in the factory, every time you
reset the device you need to reprogram the FPGA.
That's not universal. There are flash based FPGAs and there are FPGAs
with included flash (two chip module). The former require no
configuration at power-on and the latter does it automagically (you
don't have to).

No, a professional motherboard manufacturer *will
not* accept designs from open-source software even if in the
correct file formats.

I'm not not sure what a "professional motherboard manufacturer' is
but the likes of PCBCart will manufacture the boards I doodle up in
(gEDA) PCB

Thirdly, "you" can't get the modern components.

Its not hard to open a "cash" account with the three big online
suppliers (digikey, mouser, farnell) You may need to register
a one-man business first.
I'm pretty sure at least DigiKey sells to individuals.

access to wave-soldering machines for through-hole components and
SMT oven-soldering machines for the SMT components.

throug-hole can be soldered wih a soldering iron, as can leaded SMD
(like SOIC). (lead as in wire, not as in Pb)
Non-leaded like QFN and BGA require an oven, with a controlled
temperature profile. an ordinary kitchen appliance with a conscious
operator watching the temperature and the clock can probably fit the
bill if he's only making one.
QFNs can be done with an iron, too. Some are a little tricky but most
aren't any more difficult than 0402s. Easier, actually, since they're
easier to hold.

Also, small
electronic manufacturing firms have to submit minimum orders well
into the thousands before a board manufacturer will even consider
a run of boards.

the more more you order the cheaper for each.
for the price to be competitive it will need a large order.
My "production" run is often ten boards. There are assembly houses
that cater to engineering prototypes rather than production. It's
really not all that expensive, either.
 
On 1/2/2013 11:04 PM, krw@attt.bizz wrote:
On Wed, 02 Jan 2013 18:42:03 -0500, rickman<gnuarm@gmail.com> wrote:

On 1/2/2013 6:30 PM, krw@attt.bizz wrote:
On Wed, 02 Jan 2013 18:22:41 -0500, rickman<gnuarm@gmail.com> wrote:

I was a bit surprised by the huge adoption of mobile computing over the
last 10 years. Having gotten over my surprise, I expect the PC to
become a much smaller player to the handheld and table form factors with
the resulting emphasis on very low power and the resulting change in
emphasis in processing technology from density to power consumption.

I've been hearing that song for a couple of decades, too.

The reality is that we can't know what the next "big thing" is. We do
know it's not the current "big thing".


Not sure why you say that is a "couple of decades" old. In '93 they
were still pushing for "longer, lower, wider" to quote the auto
industry's motto during the years when they blithely promoted fancier
cars with shiny doodads instead of safety or lower pollution.

Perhaps not '93, but by '95 the main issue of PCs had turned to power.
It didn't get fixed, mainly because the packaging people were better
than anyone gave the credit for.

PCs
didn't reach their peak power dissipation until they were over 100 Watts
in what, 2000 something? Now they hardly have a PC CPU that uses 100
Watts. I think even the ultra powerful server CPUs try to keep the
power consumption down as it costs more to cool the equipment than it
does to power it.

Just because the problem wasn't "solved" (it really never was - people
just got bored with balls-to-the-wall performance), doesn't mean it
wasn't a primary concern.
I'm not sure what your point is. There was a tradeoff between building
chips which were larger and ran faster and chips that used less power.
The fact that they continued to build the faster chips and didn't build
the lower power ones says to me they are only now being forced by the
market to go for low power at the expense of performance.

My point is that power will be the primary issue with CPUs in the coming
years. Even today many are not so happy with cell phones that have to
be charged more than once a day. Processing speed will be taking a
secondary seat to power consumption and the product mix in the market
will reflect that; fewer desktops and laptops with more tablets, PDAs
and cell phones.

Rick
 
Joe AutoDrill wrote:
I am looking for an official drawing or spec sheet on a taper that isn't
listed in the Machinery Handbook and doesn't seem to exist on the
Google-search-braintrust called the Internet...

It's apparently known as the A-O (aye-ooh) taper and is short for
American Optical.

Apparently, it's a taper that has been around for 50+ years and used by
folks in the optical industry to hold specialized tooling. But because
the tooling is hand loaded in many cases (not for long if I have
something to say about it), the taper isn't all that precise.

Well... In my industry, precision is key so if a drawing(s) exist
anywhere, I'd love to compare them.

Problem is, I can't find a single one.

Any A-O taper experts out there with links to drawings?

There is an Electro-Optical Engineer on news:sci.electronics.design
named Phil Hobbs that might be able to help you. I added the group,
since there are a lot of knowledgeable people on that group.
 
On 1/3/2013 3:36 PM, Michael A. Terrell wrote:
There is an Electro-Optical Engineer on news:sci.electronics.design
named Phil Hobbs that might be able to help you. I added the group,
since there are a lot of knowledgeable people on that group.
Thank you. I didn't even think of changing to another group... Old
habits die hard!



--
http://tinyurl.com/My-Official-Response

Regards,
Joe Agro, Jr.
(800) 871-5022 x113
01.908.542.0244
Flagship Site: http://www.Drill-HQ.com
Automatic / Pneumatic Drills: http://www.AutoDrill.com
Multiple Spindle Drills: http://www.Multi-Drill.com
Production Tapping: http://www.Drill-HQ.com/?page_id=226
VIDEOS: http://www.youtube.com/user/AutoDrill
FACEBOOK: http://www.facebook.com/AutoDrill
TWITTER: http://twitter.com/AutoDrill

V8013-R
 
On 01/03/2013 03:36 PM, Michael A. Terrell wrote:
Joe AutoDrill wrote:

I am looking for an official drawing or spec sheet on a taper that isn't
listed in the Machinery Handbook and doesn't seem to exist on the
Google-search-braintrust called the Internet...

It's apparently known as the A-O (aye-ooh) taper and is short for
American Optics.

[trim]

Any A-O taper experts out there with links to drawings?


There is an Electro-Optical Engineer on news:sci.electronics.design
named Phil Hobbs that might be able to help you. I added the group,
since there are a lot of knowledgeable people on that group.
The American Optical taper is used for mounting tools on optical
polishing machines.

None of my optical finishing or optomechanics books has a specification,
but I did come across these folks:

http://www.lensmastertooling.com/grindingandpolishing.html

who supply laps and grinding tools with an AO taper. You might ask them.

Cheers

Phil Hobbs
--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
 