OT: redhat

Bill Vermillion fp at wjv.com
Sun Nov 14 15:31:53 PST 2004


if:
then: nm(30,*)="Bill Campbell"; dt(20,*)="Sun, Nov 14 10:55 "; 
if:
then: show ("1","1") "On" < dt < nm < "said:" 

On Sun, Nov 14 10:55  Bill Campbell said: 

> On Sun, Nov 14, 2004, Bill Vermillion wrote:

...

> >In actuality IBM was the only one who did it correctly.  As much as
> >anyone decries the IBM ways they do adhere to design specs and
> >implementations to the letter.  And most of the HD improvements
> >that have been made - going back for 50 years - have been from IBM.

> You mean like the IBM ``improvements'' to the Centronics
> printer specs on the PC where they changed the pin-outs, and
> then used the DB-25 female connectors for the printer which had
> been used for years for RS-232 serial connections? Many decried
> Tandy printers because they were ``non-standard'' when, in
> fact, they adhered strictly to the original Centronics specs.

The PC came from a group in Boca that was outside the normal
research area of IBM. It doesn't take much to see that the PC
was not 'engineered' but assembled by people who were more like
'hardware hackers'. What is now the PC was the 3rd attempt by
IBM to build a small computer, and a great many people in the IBM
organization - those dedicated to big computers - wished this group
to fail just as the other attempts had.

The bus was patterned a bit after the S-100 architecture.  The
power supply goes against all proper engineering standards.

I think their changing of the standard Centronics mode shows
the rogue nature of the group.  However, the Centronics standard
had only been a standard for about 5 years at that time.

The common feeling in the amateur community was to use serial
connections for printers, as there are standards for RS-232, and
the parallel printer connections used prior to the Centronics
interface were NOT standard. You never knew how each would be
wired. At least in that area the 25 pin spec - with pin 7 for
ground, pin 2 for transmit and pin 3 for receive on the DTE
interface - was always followed.  Centronics became a de-facto
standard.  I also wonder if part of that change by IBM may have
come from Epson, who built the printer that was designed to work
with the IBM.   I have no idea on that - and now 25 years later
it would probably be hard to determine just who was responsible.
That Epson was SO SLOW compared to others on the market at that
time.  And when you installed an Epson emulation chip set in
an Okidata, it only ran at the Epson speed - about 1/2 that of the
native speed.

Those who complained about not getting serial devices to work
because they were not standard usually didn't understand
the interface, particularly when you had one with all signals
implemented. Somewhere in the piles of books I have are the specs
for RS-232 serial. And there are seven configurations in the
standards, the minimum being one way - eg using pin 2 for
transmit and pin 7 for signal ground - through the ones with full
handshaking and all signals implemented.  And the 7th of those
configurations was 'pin 7 signal ground - all others user defined'.
I always thought that was some concession somewhere, as it meant
that every serial interface met at least one standard. :-)
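
To put the two extremes in concrete terms, a rough sketch (Python,
the names are mine, and I'm only listing the commonly used DB-25
pins - not quoting the standard verbatim):

    # DB-25 RS-232 pin assignments, as seen from the DTE side
    MINIMAL_ONE_WAY = {2: "TXD", 7: "signal ground"}            # send-only
    MINIMAL_TWO_WAY = {2: "TXD", 3: "RXD", 7: "signal ground"}  # 3-wire hookup
    FULL_HANDSHAKE = {
        1: "protective ground",
        2: "TXD", 3: "RXD",
        4: "RTS", 5: "CTS",
        6: "DSR", 20: "DTR",
        7: "signal ground",
        8: "DCD", 22: "RI",
    }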

That, of course - pin 2 for transmit - is for a DTE - data terminal
equipment. For the DCE - data communications equipment [aka modem] -
pin 3 is transmit, and it was designed for a DCE to interface to a
DTE with a pure straight-through connection - eg pin to pin.
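
Put another way - a sketch with just the three data/ground pins (my
labels), showing why straight-through works between a DTE and a DCE
but not between two DTEs:

    # Straight-through cable (DTE to DCE): pin N on one end goes to
    # pin N on the other.  The DTE drives pin 2 and the DCE listens
    # on pin 2, so everything lines up.
    STRAIGHT_THROUGH = {2: 2, 3: 3, 7: 7}

    # Wire two DTEs together and both try to drive pin 2, so you
    # cross transmit and receive - the "null modem" cable.
    NULL_MODEM = {2: 3, 3: 2, 7: 7}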

And if you look at the specs carefully, they really weren't designed
for things such as the PC - eg for use outside of a commercial-style
environment.

The DCE 25-pin connector was to be female and the DTE was to be male.

That's OK if you have technically oriented people connecting
things, but in something that goes to a consumer market it
is double-plus-ungood.  Having a connector with male pins
on a piece of hardware means that pins get broken or bent, and
fixing that means disassembling the hardware.

Having female connectors on the pieces of hardware, with male
connectors only on the cables, is a much more robust design.  The
engineering specs were followed on the physical portion - but the
wiring was surely not standard.

I spent at least a lifetime in audio [broadcast and recording],
and you see a standards perversion there too when you compare
broadcast/recording practice with the audio reinforcement field.

The small Cannon connectors - the three pin/socket devices that
replaced the original ones, which were about 1.5" in diameter - were
wired opposite ways in the two fields.

In broadcast/recording all inputs used female sockets, but in
reinforcement all inputs were male.  That was not too good for
such things as microphone inputs on stages, where the pins - while
recessed - were still 'naked' and susceptible to anything that could
fall in and short them out, and if broken required a physical
replacement.   Saner heads prevailed in broadcast, where every
second lost costs money, so the portion of a connection that
could fail was kept on the cable side, which was quickly and easily
replaced.

So those two approaches show that when you follow standards exactly
you can sometimes end up with failures when the devices are used
in ways not foreseen in development.  The female socket on hardware
became a de-facto standard while the male socket was de jure.

And if you work in audio you never let an electrician do your
wiring.  In audio the black wire is ground/neutral with white
being hot/positive.  In AC it's just the opposite.
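
Spelled out the way I read that convention, with US AC practice
(my summary, just to keep "the opposite" straight):

    WIRE_COLOR_CONVENTIONS = {
        "audio":    {"black": "ground/neutral", "white": "hot/positive"},
        "AC mains": {"black": "hot",            "white": "neutral"},
    }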

In this world of small computers we are saddled with mistakes made
by people who just made things work, many of whom probably never
saw a standards specification in their lives.  That can apply to
both the hardware and software sides.

It's the approach of doing things the right way or the expedient
way.  And that goes along with mechanical devices meeting
specifications exactly - or having specifications with tolerances
[aka 'slop'].

When Ford brought back the Continental in the 1950s it was the
first time in the US auto field that every single piece that went
into the finished product had 100% quality control. Instead of
inspecting 1 of every 100 washers, for example, each one was inspected.

Losing sight of perfection caused Ford to have problems years ago
with the Taurus transmissions.  They were getting unexpected failures
and could not find out why.  Some of the transmissions were built in
the US and some were built by Mitsubishi [I believe it was Mitsubishi -
but if not, it was another Japanese source].

They found that all of the failed transmissions were US built while
none of those from Japan were failing.  They went through them piece
by piece, and every part in the US made trannys met specs for tolerance.

Then they started checking the Japanese transmissions, and when they
found that all the measurements were identical they figured their
instruments had failed and sent them out for re-calibration.
When they got them back they still found that the Japanese parts
met the specifications exactly - with essentially no variation - while
the US parts did vary but were within tolerance.
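
My own gloss on why parts that are each "within tolerance" can still
add up to a failure - tolerance stack-up, which isn't spelled out
above but fits the story.  A quick sketch with made-up numbers:

    import random

    NOMINAL = 10.0    # nominal dimension of one part
    TOL = 0.05        # each part may be off by +/- 0.05 and still pass
    N_PARTS = 20      # parts stacked in the assembly

    def assembly_length(variation):
        """Total length of N_PARTS parts, each off nominal by up to +/- variation."""
        return sum(NOMINAL + random.uniform(-variation, variation)
                   for _ in range(N_PARTS))

    exact_build = assembly_length(0.0)       # parts dead on the number
    in_tolerance_build = assembly_length(TOL)

    # The exact build always comes out to 200.0.  The in-tolerance
    # build can drift by as much as N_PARTS * TOL = 1.0, even though
    # no single part ever failed inspection.
    print(exact_build, in_tolerance_build)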

That's the difference between doing something the absolute right
way and a way that works.  Sometimes being exact is the best way,
at other times a working way [not following specs] is better.

And it takes a good engineering team to make those decisions as
to what will work best when the product is built and in the hands
of the end users, taking into account just who the end users will
be.

Bill
-- 
Bill Vermillion - bv @ wjv . com

