Conical inductors--still $10!...

On Wed, 22 Jul 2020 07:57:58 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 22/07/20 04:01, Phil Hobbs wrote:
On 2020-07-20 14:34, Tom Gardner wrote:
On 20/07/20 19:19, bitrex wrote:
You don't need to be able to design circuits to get an EE job at most large
companies as a newly-minted EE; seems like it's expected that's one of the
things you learn on the job.

Not back in my day. I was thrown in at the deep
end on my first day.

I've always had jobs like that, and wouldn't have
had it any other way.

Me too.  With my new astronomy and physics bachelor's degree and a hobby
background in electronics, I got hired to do 2/3 of the timing and frequency
control electronics for the first civilian direct-broadcast satellite system.
I'd heard of PLLs but had never actually come across one to know what it was.
Talk about drinking from a fire hose.

My first job involved that newfangled digital logic; easy.

Beg to differ. Early integrated logic chips (RTL, DTL, weird stuff,
TTL) were horrible.

It also involved creating a test set for the newfangled
multimode optical fibres that were just being installed
between exchanges. My knowledge: zero.

I ended up with a receiver with 180dB electrical (90dB
optical) dynamic range, using a large photodiode (BPW34)
and an LF351-based transimpedance amp. To recover the
signal I decided I couldn't predict how a PLL would work in
a range switched design, so I made a filter with a Q of 4000
using 10% components. The noise equivalent power was 1pW.

I've always wanted to revisit that N-path filter for RF work,
since it has interesting properties. When I finally looked,
I found the Tayloe mixer had been patented, dammit.

It is annoying to invent things, and have someone else patent them and
get rich.
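As an aside on the numbers above: a photodiode's current is proportional to optical power, so electrical power goes as its square and every optical dB counts double electrically, which is how 90 dB optical becomes 180 dB electrical. The sketch below also shows the usual first-order estimate for a commutating (N-path) filter's Q; the component values in it are illustrative assumptions, not the actual design, and the point is only that the centre frequency comes from the switching clock while the RC product sets just the bandwidth, so 10% parts still give a Q in the thousands.

  # Back-of-envelope numbers for a receiver like the one described above.
  # All values marked "assumed" are illustrative, not from the real design.
  import math

  # 1) Optical vs electrical dynamic range at a photodiode:
  #    I_pd ~ P_opt, so electrical power ~ I_pd**2 ~ P_opt**2,
  #    i.e. each optical dB is two electrical dB.
  optical_range_db = 90.0
  print(f"{optical_range_db:.0f} dB optical -> {2 * optical_range_db:.0f} dB electrical")

  # 2) First-order Q of a commutating (N-path) bandpass filter:
  #    each of the N capacitors is connected for 1/N of the clock period,
  #    so the effective lowpass time constant is N*R*C and the -3 dB
  #    bandwidth around the clock frequency f0 is roughly 1/(pi*N*R*C).
  f0 = 10e3       # Hz, switching clock = centre frequency (assumed)
  N = 4           # number of paths (assumed)
  R = 100e3       # ohms (assumed)
  C = 0.33e-6     # farads (assumed)
  bw = 1.0 / (math.pi * N * R * C)
  print(f"centre {f0/1e3:.0f} kHz, bandwidth {bw:.2f} Hz, Q ~ {f0/bw:.0f}")

  # A 10% error in R or C moves the bandwidth (and hence Q) by ~10% but
  # leaves the centre frequency, set by the clock, untouched.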



--

John Larkin Highland Technology, Inc

Science teaches us to doubt.

Claude Bernard
 
On 22/07/20 19:19, John Larkin wrote:
On Wed, 22 Jul 2020 18:52:46 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 22/07/20 15:20, jlarkin@highlandsniptechnology.com wrote:
On Wed, 22 Jul 2020 07:57:58 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:
My first job involved that newfangled digital logic; easy.

Beg to differ. Early integrated logic chips (RTL, DTL, weird stuff,
TTL) were horrible.

TTL was blissfully simple compared with the others!

Agree, but the early plastic DIPs failed a lot.


No, I haven't dealt with magnetic logic (although
the first computer I used in anger did).

Yes, I have looked at hydraulic logic running at
2000psi. It was a design study for replacing it
with the newfangled micros.

I concluded it wasn't worth the aggro, since it
was on an offshore oil rig with zero electricity
due to the inflammable atmosphere :)


It also involved creating a test set for the newfangled
multimode optical fibres that were just being installed
between exchanges. My knowledge: zero.

I ended up with a receiver with 180dB electrical (90dB
optical) dynamic range, using a large photodiode (BPW34)
and an LF351-based transimpedance amp. To recover the
signal I decided I couldn't predict how a PLL would work in
a range switched design, so I made a filter with a Q of 4000
using 10% components. The noise equivalent power was 1pW.

I've always wanted to revisit that N-path filter for RF work,
since it has interesting properties. When I finally looked,
I found the Tayloe mixer had been patented, dammit.

It is annoying to invent things, and have someone else patent them and
get rich.

I can't complain. I didn't follow up the concept for
20+ years!

I invented the dual-slope ADC when I was a kid, but I figured that
relay bounce would make it inaccurate. Relays were the only analog
switches I could imagine then.
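For anyone who hasn't met it: a dual-slope converter integrates the unknown input for a fixed time, then integrates a reference of opposite polarity back to zero while counting clock cycles, and the count comes out proportional to Vin with the integrator's RC cancelling. A minimal behavioural model in Python, with made-up values, just to show why switch quality mattered more than component accuracy:

  # Minimal behavioural model of a dual-slope ADC (values are illustrative).
  def dual_slope_counts(v_in, v_ref=2.5, t_int=0.1, f_clk=100e3, rc=0.05):
      """Run up for a fixed time, then count the run-down back to zero."""
      v_peak = (v_in / rc) * t_int          # integrator voltage after run-up
      t_rundown = v_peak * rc / v_ref       # time to ramp back down to zero
      return int(round(t_rundown * f_clk))  # counts = (v_in/v_ref)*t_int*f_clk

  # The rc term cancels, so accuracy rests on v_ref, the clock and the
  # switches -- hence the worry above about relay bounce.
  for v in (0.0, 1.0, 2.5):
      print(v, dual_slope_counts(v))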

I also invented the successive log RF detector, but my boss thought it
was useless.
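The successive-detection log detector deserves a sketch too: cascade identical limiting gain stages, detect each stage's output and sum the detected outputs, and the sum approximates the log of the input amplitude piecewise-linearly (the idea behind later parts such as the AD8307). A rough Python illustration with assumed stage gain and limit level:

  # Rough model of a successive-detection (piecewise-linear) log detector.
  # Stage gain, limiting level and stage count are assumptions.
  GAIN = 10.0      # 20 dB per limiting stage
  V_LIMIT = 1.0    # each stage clips at this amplitude
  N_STAGES = 6

  def log_detector(v_in):
      """Sum of (idealised) detected outputs along the limiter cascade."""
      total, v = 0.0, v_in
      for _ in range(N_STAGES):
          v = min(v * GAIN, V_LIMIT)   # limiting amplifier
          total += v                   # detector + summer
      return total

  # The output climbs by about one V_LIMIT per 20 dB of input, i.e. it is
  # nearly a straight line against log(v_in) over ~120 dB.
  for dbv in range(-120, 1, 20):
      print(f"{dbv:4d} dBV -> {log_detector(10 ** (dbv / 20)):.2f}")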

In a similar vein I unwittingly invented FSMs plus a
neat implementation technique in the first machine
code program I wrote (for that 39-bit serial computer
with magnetic logic).
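The implementation technique is simple enough to show in a few lines: hold the current state in a variable and look up (next state, action) in a table keyed by (state, input), which is essentially what a hand-coded FSM on a tiny machine boils down to. A toy Python version with a hypothetical two-state machine, since the original program isn't described:

  # Toy table-driven finite state machine; states and inputs are hypothetical.
  # (state, input) -> (next_state, output)
  TABLE = {
      ("idle",    "start"): ("running", "motor on"),
      ("idle",    "stop"):  ("idle",    None),
      ("running", "stop"):  ("idle",    "motor off"),
      ("running", "start"): ("running", None),
  }

  def run(events, state="idle"):
      for ev in events:
          state, output = TABLE[(state, ev)]
          if output:
              print(output)
      return state

  print("final state:", run(["start", "start", "stop"]))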

I also conceived of implementing a CPU using microprogramming
(as in the AMD Am2900 bit-slice processors).

But neither of those were difficult; they were quite
simple, even if most kids don't think of them.
 
On Wed, 22 Jul 2020 23:54:37 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 22/07/20 19:19, John Larkin wrote:
On Wed, 22 Jul 2020 18:52:46 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 22/07/20 15:20, jlarkin@highlandsniptechnology.com wrote:
On Wed, 22 Jul 2020 07:57:58 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:
My first job involved that newfangled digital logic; easy.

Beg to differ. Early integrated logic chips (RTL, DTL, weird stuff,
TTL) were horrible.

TTL was blissfully simple compared with the others!

Agree, but the early plastic DIPs failed a lot.


No, I haven't dealt with magnetic logic (although
the first computer I used in anger did).

Yes, I have looked at hydraulic logic running at
2000psi. It was a design study for replacing it
with the newfangled micros.

I concluded it wasn't worth the aggro, since it
was on an offshore oil rig with zero electricity
due to the inflammable atmosphere :)


It also involved creating a test set for the newfangled
multimode optical fibres that were just being installed
between exchanges. My knowledge: zero.

I ended up with a receiver with 180dB electrical (90dB
optical) dynamic range, using a large photodiode (BPW34)
and an LF351-based transimpedance amp. To recover the
signal I decided I couldn't predict how a PLL would work in
a range switched design, so I made a filter with a Q of 4000
using 10% components. The noise equivalent power was 1pW.

I've always wanted to revisit that N-path filter for RF work,
since it has interesting properties. When I finally looked,
I found the Tayloe mixer had been patented, dammit.

It is annoying to invent things, and have someone else patent them and
get rich.

I can't complain. I didn't follow up the concept for
20+ years!

I invented the dual-slope ADC when I was a kid, but I figured that
relay bounce would make it inaccurate. Relays were the only analog
switches I could imagine then.

I also invented the successive log RF detector, but my boss thought it
was useless.

In a similar vein I unwittingly invented FSMs plus a
neat implementation technique in the first machine
code program I wrote (for that 39-bit serial computer
with magnetic logic).

I also conceived of implementing a CPU using microprogramming
(as in the AMD Am2900 bit-slice processors).

But neither of those were difficult; they were quite
simple, even if most kids don't think of them.

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 kHz 4-phase clock. It was actually produced for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.
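The macro trick, for those who haven't used it: define one host-assembler macro per target opcode, each expanding to data directives that lay down the target bit pattern, and the host's labels, expressions and listings come along for free. MACRO-11 source won't assemble here, so this is only the flavour of it recast in Python, with an invented three-instruction encoding (the real data logger's instruction set isn't given above):

  # The cross-assembler idea recast in Python: one tiny "macro" per opcode,
  # each emitting a 16-bit word. The encoding below is invented.
  OPCODES = {"LOAD": 0x1, "STORE": 0x2, "JUMP": 0x3}   # hypothetical
  program = []

  def emit(mnemonic, operand):
      """Plays the role a MACRO-11 macro played: expand to the target word."""
      program.append((OPCODES[mnemonic] << 12) | (operand & 0x0FFF))

  emit("LOAD", 0o010)
  emit("STORE", 0o020)
  emit("JUMP", 0o000)

  for addr, word in enumerate(program):
      print(f"{addr:03o}: {word:06o}")     # octal listing, PDP-11 style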

When I was at Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.
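The placement problem is a neat one: by the time an instruction has executed, the drum has rotated past some number of word slots, so the next instruction should sit in the first free slot arriving under the head just then, otherwise you wait most of a revolution (this is the "optimum coding" that the IBM 650's SOAP assembler later automated). A small Python sketch of the rule, with assumed slot counts and timings:

  # "Optimum coding" for a drum store: put each next instruction in the
  # first free slot reaching the head as execution finishes.
  # Slot count and execution times are assumed for illustration.
  N_SLOTS = 50
  free = [True] * N_SLOTS

  def place_next(current_slot, exec_slots):
      """Choose the slot for the instruction following one at current_slot
      that takes exec_slots slot-times to execute."""
      target = (current_slot + 1 + exec_slots) % N_SLOTS
      for offset in range(N_SLOTS):          # first free slot at/after target
          slot = (target + offset) % N_SLOTS
          if free[slot]:
              free[slot] = False
              return slot
      raise RuntimeError("drum full")

  slot = 0
  free[slot] = False
  for exec_time in (3, 7, 2, 10):            # made-up instruction timings
      slot = place_next(slot, exec_time)
      print("next instruction at slot", slot)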

I have a book, IBM's Early Computers. In the early days, nobody was
entirely sure what a computer was.
 
On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 kHz 4-phase clock. It was actually produced for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

When I was at Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.

I have a book, IBM's Early Computers. In the early days, nobody was
entirely sure what a computer was.

It's a fun book, and does a lot to deflate the Harvard spin, which is
always good.

The sequel on the 360 and early 370s is a good read too, as is "The
Mythical Man Month" by Fred Brooks, who was in charge of OS/360, at the
time by far the largest programming project in the world. As he says,
"How does a software project go a year late? One day at a time."

Obligatory Real Programmer reference:

<http://www.cs.utah.edu/~elb/folklore/mel.html>

Cheers

Phil Hobbs


--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On Thu, 23 Jul 2020 10:36:08 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 kHz 4-phase clock. It was actually produced for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

When I was at Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.

I have a book, IBM's Early Computers. In the early days, nobody was
entirely sure what a computer was.


It's a fun book, and does a lot to deflate the Harvard spin, which is
always good.

The sequel on the 360 and early 370s is a good read too, as is "The
Mythical Man Month" by Fred Brooks, who was in charge of OS/360, at the
time by far the largest programming project in the world. As he says,
"How does a software project go a year late? One day at a time."

Obligatory Real Programmer reference:

http://www.cs.utah.edu/~elb/folklore/mel.html

Cheers

Phil Hobbs

Burroughs programmed their computers in Algol. There was never any
other assembler or compiler. I was told that, after the Algol compiler
was written in Algol, two guys hand-compiled it to machine code,
working side-by-side and checking every opcode. That was the bootstrap
compiler.

Isn't our ancient and settled idea of what a computer is, and what an
OS and languages are, overdue for the next revolution?



--

John Larkin Highland Technology, Inc

Science teaches us to doubt.

Claude Bernard
 
Isn't our ancient and settled idea of what a computer is, and what an
OS and languages are, overdue for the next revolution?

In his other famous essay, "No Silver Bullet", Brooks points out that the factors-of-10 productivity improvements of the early days were gained by getting rid of extrinsic complexity--crude tools, limited hardware, and so forth.

Now the issues are mostly intrinsic to an artifact built of thought. So apart from more and more Python libraries, I doubt that there are a lot more orders of magnitude available.

Cheers

Phil Hobbs
 
On 23/07/20 19:10, Phil Hobbs wrote:
On 2020-07-23 12:43, Tom Gardner wrote:
On 23/07/20 16:30, pcdhobbs@gmail.com wrote:
Isn't our ancient and settled idea of what a computer is, and what an OS
and languages are, overdue for the next revolution?

In his other famous essay, "No Silver Bullet", Brooks points out that the
factors-of-10 productivity improvements of the early days were gained by
getting rid of extrinsic complexity--crude tools, limited hardware, and so
forth.

Now the issues are mostly intrinsic to an artifact built of thought. So apart
from more and more Python libraries, I doubt that there are a lot more orders
of magnitude available.
Not in a single processor (except perhaps the Mill).

But with multiple processors there can be significant
improvement - provided we are prepared to think in
different ways, and the tools support it.

Examples: mapreduce, or xC on xCORE processors.
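To unpack the mapreduce example: the programmer supplies only a per-record map function and an associative reduce, and the framework is then free to run the maps in parallel across cores or machines and combine partial results in any order. A minimal single-machine sketch of that shape in Python (word counting, the stock example; a real framework adds the distribution and fault tolerance):

  # Minimal map/reduce word count. A real framework (Hadoop, Spark, ...)
  # adds distribution and fault handling; the programming model is just
  # these two functions.
  from collections import Counter
  from functools import reduce
  from multiprocessing import Pool

  def map_one(line):
      """Map: one record in, a partial result out."""
      return Counter(line.split())

  def reduce_two(a, b):
      """Reduce: associative, so partials can be combined in any order."""
      return a + b

  if __name__ == "__main__":
      lines = ["the quick brown fox", "the lazy dog", "the fox again"]
      with Pool() as pool:                   # the maps run in parallel
          partials = pool.map(map_one, lines)
      print(reduce(reduce_two, partials, Counter()).most_common(3))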

I'm talking about programmer productivity, not MIPS.

Oh, there we do come to an immovable object: human
stupidity.

Having said that, the right fundamental tools do help
and the wrong ones do hinder. In that respect the current
tools are deficient w.r.t.
- parallel programming in general
- multithreaded architectures
- multicore architectures
- NUMA, from registers through L* caches to local core
and beyond
- distributed architectures, especially w.r.t.
- partial system failure
- no universal time/state
- plus the eight+ fallacies

There are signs some of those are being tackled decently,
but most practical stuff is based on the illusion of a
sequential single-threaded von Neumann machine.

That will have to change, but I'll probably be long dead
before it has sunk into the general consciousness.
 
On 2020-07-23 20:10, Phil Hobbs wrote:
On 2020-07-23 12:43, Tom Gardner wrote:
On 23/07/20 16:30, pcdhobbs@gmail.com wrote:
Isn't our ancient and settled idea of what a computer is, and what an OS
and languages are, overdue for the next revolution?

In his other famous essay, "No Silver Bullet", Brooks points out that the
factors-of-10 productivity improvements of the early days were gained by
getting rid of extrinsic complexity--crude tools, limited hardware, and so
forth.

Now the issues are mostly intrinsic to an artifact built of thought. So apart
from more and more Python libraries, I doubt that there are a lot more orders
of magnitude available.
Not in a single processor (except perhaps the Mill).

But with multiple processors there can be significant
improvement - provided we are prepared to think in
different ways, and the tools support it.

Examples: mapreduce, or xC on xCORE processors.

I'm talking about programmer productivity, not MIPS.

We don't want productivity, as in more new versions. We
want quality, robustness and durability.

Jeroen Belleman
 
On 23/07/20 18:06, John Larkin wrote:
On Thu, 23 Jul 2020 17:39:57 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 23/07/20 16:13, jlarkin@highlandsniptechnology.com wrote:
On Thu, 23 Jul 2020 10:36:08 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 kHz 4-phase clock. It was actually produced for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

When I was at Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.

I have a book, IBM's Early Computers. In the early days, nobody was
entirely sure what a computer was.


It's a fun book, and does a lot to deflate the Harvard spin, which is
always good.

The sequel on the 360 and early 370s is a good read too, as is "The
Mythical Man Month" by Fred Brooks, who was in charge of OS/360, at the
time by far the largest programming project in the world. As he says,
"How does a software project go a year late? One day at a time."

Obligatory Real Programmer reference:

http://www.cs.utah.edu/~elb/folklore/mel.html

Cheers

Phil Hobbs

Burroughs programmed their computers in Algol. There was never any
other assembler or compiler. I was told that, after the Algol compiler
was written in Algol, two guys hand-compiled it to machine code,
working side-by-side and checking every opcode. That was the bootstrap
compiler.

Isn't our ancient and settled idea of what a computer is, and what an
OS and languages are, overdue for the next revolution?

The trick will be to get a revolution which starts from
where we are. There is no chance of completely throwing
out all that has been achieved until now, however appealing
that might be.

I know of two plausible starting points...

1) The Mill Processor, as described by Ivan Godard over
on comp.arch. This has many innovative techniques that,
in effect, bring DSP processor parallelism when executing
standard languages such as C. It appears that there's an
order of magnitude to be gained.

Incidentally, Godard's background is the Burroughs/Unisys
Algol machines, plus /much/ more.


2) xCORE processors are commercially available (unlike the
Mill). They start from presuming that embedded programs can
be highly parallel /iff/ the hardware and software allow
programmers to express it cleanly. They merge Hoare's CSP
with innovative hardware to /guarantee/ *hard* realtime
performance. In effect they have occupied a niche that is
halfway between conventional processors and FPGA.

I've used them, and they are *easy* and fun to use.
(Cf. C on a conventional processor!)

We don't need more compute power. We need reliability and user
friendliness.

Executing buggy C faster won't help. Historically, adding resources
(virtual memory, big DRAM, threads, more MIPS) makes things worse.

For Pete's sake, we still have buffer overrun exploits. We still have
image files with trojans. We still have malicious web pages.

Yes indeed. C and C++ are an *appalling*[1] starting point!

But better alternatives are appearing...

xC has some C syntax but removes the dangerous bits and
adds parallel constructs based on CSP; effectively the
hard real time RTOS is in the language and the xCORE
processor hardware.

Rust is gaining ground; although Torvalds hates and
prohibits C++ in the Linux kernel, he has hinted he won't
oppose seeing Rust in the Linux kernel.

Go is gaining ground at the application and server level;
it too has CSP constructs to enable parallelism.

Python, on the other hand, cannot make use of multicore
parallelism due to its global interpreter lock :)
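True for CPU-bound threads in CPython: the global interpreter lock lets only one thread execute bytecode at a time, so the usual escape hatch is processes rather than threads. A small sketch contrasting the two; the busy-work function is arbitrary filler and the timings will vary by machine:

  # CPU-bound work run with threads (serialised by CPython's GIL) and with
  # processes (true multicore).
  import time
  from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

  def burn(n):
      total = 0
      for i in range(n):
          total += i * i
      return total

  def timed(executor_cls, label, jobs=4, n=2_000_000):
      start = time.perf_counter()
      with executor_cls(max_workers=jobs) as ex:
          list(ex.map(burn, [n] * jobs))
      print(f"{label}: {time.perf_counter() - start:.2f} s")

  if __name__ == "__main__":
      timed(ThreadPoolExecutor, "threads   (GIL-bound)")
      timed(ProcessPoolExecutor, "processes (multicore)")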

[1] cue comments from David Brown ;}
 