WEDNESDAY, OCTOBER 26, 2022 - COMPUTING I




Last week and this week (and I have more material - so maybe even next week), I’ve been writing about technologies - drones, autonomous cars, 3D printing, facial recognition, and telephones.  It struck me the other day that I haven’t written about the basic technologies underlying all of today’s technology - computing and the internet.  So, today, a brief look at computing.


COMPUTING


No, there haven’t always been computers and there hasn’t always been computing.  I like the concept that some of us older folk are “computer immigrants”, and the younger generations are “computer natives”.  


I was born in 1947 - about the same time as the first really viable computer was made.  A court case (Honeywell v. Sperry Rand, decided in 1973) determined that the first electronic digital computer was actually built around 1942 in Ames, Iowa - the Atanasoff-Berry Computer, or ABC.


The ENIAC - the Electronic Numerical Integrator And Computer - was born (according to the internet) in 1946 (although one source says 1945; it was completed in late 1945 and publicly unveiled in February 1946).


***

Some historical factors.


World War II was a global war - many countries took part, the two major oceans (Atlantic and Pacific) were theaters of operations, and men, ships, supplies, and airplanes were all going one way or another.


First - the business side of war.  There was a movie, “Saving Private Ryan” - but how do you know which Private Ryan needs saving?  There were no databases where an SQL search on “last_name = Ryan” could find him; there were paper records but no digital records.  How many pounds of potatoes would be needed in England?  How many extra uniforms, bars of soap, and pieces of equipment?
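
(A modern aside: today that search would take a fraction of a second.  A minimal sketch in Python - the “soldiers” table and everything in it are made up purely for illustration:)

    import sqlite3

    # A hypothetical personnel database - nothing like this existed in 1944.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE soldiers (last_name TEXT, first_name TEXT, unit TEXT)")
    conn.execute("INSERT INTO soldiers VALUES ('Ryan', 'James', '101st Airborne')")

    # The search that wartime clerks could only do by hand, through paper files:
    for row in conn.execute("SELECT * FROM soldiers WHERE last_name = 'Ryan'"):
        print(row)  # ('Ryan', 'James', '101st Airborne')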


Second - the mechanics/mathematics of war.  How do you aim and fire your artillery shells when the wind is from the north at 20 miles per hour, the humidity is 55%, and there is cloud cover?  At what second do you need to release your bombs so they fall on the right target (not a church, school, or hospital)?
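
(Those aiming questions are physics, but the arithmetic was brutal to do by hand - computing artillery firing tables was in fact the job ENIAC was built for.  Here is a toy sketch in Python of the simplest version of that calculation, deliberately ignoring drag, wind, and humidity - the very things real firing tables had to include - and using a made-up muzzle velocity:)

    import math

    # Ideal (no air resistance) projectile range: R = v^2 * sin(2*theta) / g
    v = 450.0                   # muzzle velocity in meters/second (made-up value)
    theta = math.radians(30.0)  # elevation angle of the gun barrel
    g = 9.81                    # gravity, m/s^2

    range_m = v ** 2 * math.sin(2 * theta) / g
    print(f"Ideal range: {range_m:.0f} m")  # about 18 km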


*****

The government, recognizing the need for keeping track of things (aka “data processing” in the old language) and for high-speed calculations (“number crunching”), was encouraging universities and technology businesses to pursue some sort of analytical device.


*****

Thus, the first generation of computing - ENIAC was followed by similar devices.  These were digital (not analog) and used on-and-off switches or devices - if you will, 1 (on) and 0 (off).  Binary coding schemes had been developed - so 1011010 would be the number 90 in base 10 (or decimal).  On and off - pretty simple - but also pretty complicated when you have to represent letters, punctuation, and decimal values in every language known.
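
You can check that binary claim yourself - a quick sanity check in Python:

    # 1011010 in base 2, reading right to left, each digit a power of two:
    # 0*1 + 1*2 + 0*4 + 1*8 + 1*16 + 0*32 + 1*64 = 90
    print(int("1011010", 2))        # 90
    print(bin(90))                  # 0b1011010
    print(ord("Z"), bin(ord("Z")))  # 90 0b1011010 - ASCII happens to map "Z" to 90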


There was a quip: “First was the bit (binary digit - on or off), then the byte (basically eight bits), then the nibble (half a byte), the word, and the double word.  Confusion covered the face of computing, and IBM said ‘it is profitable’” - a play on words from Genesis chapter 1.


IBM - International Business Machines - had worked with coding schemes and, working with the US Census Bureau, had found ways to count Americans using punched cards (or Hollerith cards).  They also made typewriters.  The early competition was between IBM, Sperry Rand (maker of the UNIVAC), and others.  The comment was that some of the early competing computers worked faster and better than the IBM machines, but IBM’s sales force won the day by selling customers on their machines.  By the 1960s, IBM was the dominant force in computing.


In the early days, a computer might run one hour out of the week; during that hour, the scientists would get results that might have taken years to produce by hand.  One of the true stories is that a moth flew into one of the early computers (the Harvard Mark II, in 1947) and got stuck in a relay.  The technicians used tweezers to pull the moth’s remains out of the device and taped it into the logbook.  That became known as “debugging”, a word still used today, but in a different sense - finding and removing “bugs” (errors) in software.


As the world got into the 1960s, the American President (John F. Kennedy) pledged that we would land men on the moon and bring them home safely before the decade was out.  That took a lot of computing.  It also has been said that the computing power used to land men on the moon and bring them back is far less than the computing power in your phone!!!


The devices were mammoth - but grew smaller as transistors replaced vacuum tubes (which looked like light bulbs), then as integrated circuits replaced individual transistors, then as very-large-scale integration replaced the first integrated circuits - with more and more memory and faster and faster speeds all the while.  I haven’t heard any significant discussions about Moore’s Law for a few years.  Moore’s Law predicted that the number of transistors on a chip - and so, roughly, the processing power you could buy for the same money - would double about every two years.
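
The compounding is what made Moore’s Law so startling.  A quick back-of-the-envelope calculation in Python, assuming a clean doubling every two years:

    # Doubling every 2 years means growth of 2 ** (years / 2)
    for years in (2, 10, 20, 40):
        print(f"After {years} years: {2 ** (years / 2):,.0f}x")
    # After 2 years: 2x
    # After 10 years: 32x
    # After 20 years: 1,024x
    # After 40 years: 1,048,576x

Forty years of doubling works out to about a million-fold - roughly the road from a room-sized machine to the phone in your pocket.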


Computing has many facets - hardware, software, communications, and applications.  


The hardware is millions of times faster than the early computers, the software is vastly more robust, and the networking is vastly faster as well.  That combination allows for artificial intelligence - searching multiple databases in only a few seconds - and computer vision - facial identification, and autonomous vehicles whose cameras process what is ahead, alongside, behind, above, and below the car.


There was an old analogy - computers in 1947 would be like driving from New York to Los Angeles in a 1932 Ford Roadster; computers in 2022 would be like flying from New York to Los Angeles in three minutes.  


More tomorrow on technology!


LOVE WINS

© Karen White, October 26, 2022



