Not about photography, but computers changed around 1980

"Probably thinking of future employability."

Yes, that was the problem. Only a few choices for employers and a few places to live.

There is a bit of that with optics too.

With a general EE or ME degree you can live and work practically anywhere.
 
When I was in school, there was a large corporation doing high-power work that required tubes, and it was offering a full four-year scholarship for EEs, with summer internships, to anyone who would come work for them after graduating. They could get no takers. Everyone young wanted to do transistors.
We were expected to understand mag amps, about which I can remember very little. I might add that we were told about various other obsolete or obsolescent technologies too: carbon pile voltage regulators, for example. The first aircraft to have an autoland capability (the DH Trident) used analogue computers. In 1979 I did a Trident Instrument/Autopilot type course (analogue) between February and March, and a 737-200 Advanced type course (digital) between September and October. With each successive type we knew less and less about what went on in the boxes. By 2000 we rarely changed them, so not knowing the detail didn't matter much.
 
Being 87, I went through that era of computers. Today, with a video card in a Windows PC that has (I think) 43 billion transistors, I try to imagine what computers will look like in the year 2050.

Bert
I'm the opposite. I tend to think about what a computer with 20 billion transistors would look like converted back to vacuum tubes in the 1950s. Even with small Nuvistor tubes, 20 billion of them is just mind-blowingly insane. HAHAHAHAHA

And then we look at those AI chips, the NVIDIA B200 with 208 billion transistors!!! Think vacuum tubes!!! HAHAHAHAHA
If you could manage to construct a pc capable of running Windows 11 with 12AX7s, think about how long it would take to boot up.
I think a cavity magnetron would qualify as a vacuum tube (valve this side of the pond).
I want to see the design of an ALU using cavity magnetrons!
Simply pointing out that not everything has been replaced by transistors, yet.
Indeed. Analog iterative optical machines are another route, or there's some chap who likes to build them out of big cogs and shafts 😄

The idea of a TWT or a magnetron based calculating machine is pretty novel.
I'm trying to get my head around that idea but all I come up with is a baked potato
We could use potatoes for memory. Raw is zero. Baked is one. Write once, though.
Not archival either, very volatile in fact.
 
In my undergrad education at Stanford, I learned about power engineering, Klystron tube design, Class C tube amplifier design, and a bunch of other stuff I never used.
 
I'm trying to get my head around that idea but all I come up with is a baked potato
Haha - well they are nice. Perhaps that can be the OS name 🤣
 
If you could manage to construct a pc capable of running Windows 11 with 12AX7s, think about how long it would take to boot up.
It would boot up very quickly, I presume. It would have a magnetron tube pumping out 3.33 GHz as the clock for this CPU. HAHAHAHAHAHA
 
With tubes you primarily designed with current, but with transistors you designed to voltage. So they all worked really hard to understand the theory of electrons and holes and NPN versus PNP. With vacuum tubes, you really needed to understand the physics inside to design with them effectively, so they thought transistors would be the same.

A decade later my Dad said he didn't understand why they wasted all that time on electrons and holes. It wasn't helpful in designing circuits at all.
You can design to current using transistors too. And those electrons and holes are the electric current flow. Transistors and tubes are very similar in operation: similar load-line plotting, base/grid drive versus output at the collector/anode, and so on. Minus the high operating voltages of tubes, of course.
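For anyone who hasn't met load lines, here is a minimal numerical sketch of the idea both posts are gesturing at. The supply voltage, load resistor, and transconductance figures below are invented for illustration and are not from either post.

```python
# A minimal sketch of the load-line idea: the output device (tube anode or
# transistor collector) sits in series with a load resistor across the
# supply, so the available operating points lie on
#   V_out = V_supply - I_out * R_load.
# The device values below are made up purely for illustration.

V_SUPPLY = 250.0   # supply voltage (V), a tube-ish value, arbitrary
R_LOAD = 100e3     # anode/collector load resistor in ohms

def load_line_current(v_out):
    """Current the load resistor allows when the device drops v_out volts."""
    return (V_SUPPLY - v_out) / R_LOAD

def device_current(v_in, gm=2e-3):
    """Crude device model: output current proportional to input drive.
    For a tube, v_in would be grid voltage; for a BJT you would start from
    base current and a current gain instead."""
    return gm * v_in

def operating_point(v_in):
    """Intersect the crude device model with the load line. Both are linear
    here, so the quiescent current is just the device current, clipped to
    the maximum the load line can supply."""
    i = min(device_current(v_in), load_line_current(0.0))
    v_out = V_SUPPLY - i * R_LOAD
    return i, v_out

if __name__ == "__main__":
    for drive in (0.5, 1.0, 1.5):   # input drive in volts, arbitrary
        i, v = operating_point(drive)
        print(f"drive={drive:4.1f} V  ->  I={i*1e3:5.2f} mA, V_out={v:6.1f} V")
```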
 
If you could manage to construct a pc capable of running Windows 11 with 12AX7s, think about how long it would take to boot up.
I would think that both programmers and operating systems back then were a few orders of magnitude more efficient than today's.

Best regards

Erik
 
"....an image that looks like a real photo"

I remember that too, but I can't place exactly when and what technology. Would that have been with VGA?

I recall Steve Jobs being strongly opposed to color monitors. It was completely against his paradigm of emulating paper. But then IBM came out with color CGA with horrible resolution and our secretary absolutely loved having that color monitor.
 
"....an image that looks like a real photo"

I remember that too, but I can't place exactly when and what technology. Would that have been with VGA?

I recall Steve Jobs being strongly opposed to color monitors. It was completely against his paradigm of emulating paper. But then IBM came out with color CGA with horrible resolution and our secretary absolutely loved having that color monitor.
I don't remember what tech it was, but I don't think it was VGA in '85. The breakthrough of that era was EGA.

If I remember correctly, the computer brand that displayed the image was Amiga, not IBM or NEC or Fujitsu, etc.

Those were some mind-blowing days... Apple had the Mockingboard (sound card), and for the first time games had fancy sound effects and music scores!!! As the years progressed, we got "high resolution" 24-pin dot matrix printers, and then games in 3D, which brought forth fancy 3D graphics cards. Amazing times!!!

And who can forget the turbo switch, changing the CPU clock from 8 MHz to 12 MHz. HAHAHAHAHA
 
"....an image that looks like a real photo"

I remember that too, but I can't place exactly when and what technology. Would that have been with VGA?

I recall Steve Jobs being strongly opposed to color monitors. It was completely against his paradigm of emulating paper. But then IBM came out with color CGA with horrible resolution and our secretary absolutely loved having that color monitor.
I don't remember what tech it was, but I don't think it was VGA in '85. That era breakthrough was EGA.
The VGA first shipped in 1987.
 
The Amiga with HAM mode, I think, is the right answer, just from looking things up on Wikipedia. That fits 1985 just fine.
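For anyone wondering why HAM mode could show photo-like images in 1985: each 6-bit pixel either picks one of 16 base palette colours or keeps the previous pixel's colour with one RGB component replaced, which gets you up to 4096 colours on screen. Here is a rough sketch of the decoding rule, written from memory of the published HAM6 description, so treat the exact bit assignments as approximate.

```python
# Rough sketch of Amiga HAM6 ("hold-and-modify") pixel decoding. Each pixel
# is 6 bits: the top 2 bits select the operation, the low 4 bits are either
# a palette index or a replacement RGB component value.

PALETTE = [(i, i, i) for i in range(16)]  # placeholder 16-colour base palette

def decode_ham6(pixels, palette=PALETTE):
    """Decode a row of 6-bit HAM6 pixel values into (r, g, b) tuples,
    each component in the range 0-15."""
    out = []
    prev = palette[0]           # leftmost pixel starts from colour 0
    for p in pixels:
        ctrl, val = (p >> 4) & 0b11, p & 0b1111
        r, g, b = prev
        if ctrl == 0b00:        # take a colour straight from the palette
            r, g, b = palette[val]
        elif ctrl == 0b01:      # hold red and green, modify blue
            b = val
        elif ctrl == 0b10:      # hold green and blue, modify red
            r = val
        else:                   # 0b11: hold red and blue, modify green
            g = val
        prev = (r, g, b)
        out.append(prev)
    return out

if __name__ == "__main__":
    # Start from palette colour 5, then bump blue, red, green in turn.
    row = [0b000101, 0b011111, 0b101100, 0b110011]
    print(decode_ham6(row))
```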
 
"....an image that looks like a real photo"

I remember that too, but I can't place exactly when and what technology. Would that have been with VGA?

I recall Steve Jobs being strongly opposed to color monitors. It was completely against his paradigm of emulating paper. But then IBM came out with color CGA with horrible resolution and our secretary absolutely loved having that color monitor.
I don't remember what tech it was, but I don't think it was VGA in '85. That era breakthrough was EGA.
The VGA first shipped in 1987.
If I remember correctly, the computer brand that displayed the image was Amiga, not IBM or NEC or Fujitsu, etc.

Those were some mind blowing days... Apple had their Mocking board (sound card) and for the first time, games have fancy sound effects and music scores!!! As the years progress, we have "high resolution" 24 pins dot matrix printers, and then games in 3D, that brought forth fancy 3D graphics cards. Amazing times!!!

And who can forget that turbo switch, changes from 8MHz to 12MHz cpu clock speed. HAHAHAHAHA
Sometime around 1982 or so, when I was a postgrad student, two buddies at the lab and I built a Motorola 68000-based computer running an early UNIX derivative.

That was a text-based computer, but we developed a graphics card based on a graphics processor called the NEC 7220, as I recall with a Motorola 6809 driving it using an HPGL-inspired instruction set. The graphics card used a separate monitor.
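The post doesn't give the actual command set, so purely as an illustration of what an HPGL-inspired instruction stream looks like, here is a toy interpreter for a few classic HPGL-style commands (PU, PD, PA). The firmware on that 6809 would of course have looked different.

```python
# Toy interpreter for a tiny HPGL-style command stream: PU = pen up,
# PD = pen down, PA = plot absolute. It returns the line segments that
# a pen-down move would draw.

def run_hpgl(program: str):
    """Execute a semicolon-separated command string and return the list of
    line segments drawn while the pen was down."""
    x, y = 0, 0
    pen_down = False
    segments = []
    for cmd in filter(None, (c.strip() for c in program.split(";"))):
        op, args = cmd[:2].upper(), cmd[2:]
        coords = [int(v) for v in args.split(",") if v.strip()] if args else []
        if op == "PU":
            pen_down = False
        elif op == "PD":
            pen_down = True
        elif op == "PA":
            # PA takes x,y pairs; draw (or just move) to each in turn
            for nx, ny in zip(coords[0::2], coords[1::2]):
                if pen_down:
                    segments.append(((x, y), (nx, ny)))
                x, y = nx, ny
        else:
            raise ValueError(f"unknown command: {cmd}")
    return segments

if __name__ == "__main__":
    # Move to (100,100), then draw a small square with the pen down.
    prog = "PU;PA100,100;PD;PA200,100,200,200,100,200,100,100;PU"
    for seg in run_hpgl(prog):
        print(seg)
```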

We used it to design printed circuit boards. The institution where we were working had an HP plotter. We developed a plotter pen with a lamp, so we could plot on photographic film. It was a bit of fun.

Best regards

Erik
 
That's pretty cool. Did you then use the film to mask the photoresist, then an acid bath, to make the PCBs?
 
Yes, it worked pretty well. Controlling the lamp was the hard part.

Later, when I worked at an NPP simulation facility, I found a similar plotter that was unused. So I wrote a plotting program for it.

I had one of those 'shoe box' sized portable computers from Compaq. The data files were quite big, so plotting them took some time. My program mostly crashed during the plotting, and I could not find the error.

Until I realised the hot air exhaust of the plotter was just across the cool air intake of the Compaq. :-)
Soon enough, I had a Sun SPARCstation 1 attached to a laser printer, also from Sun. Life got much simpler.

Best regards

Erik

--
Erik Kaffehr
Website: http://echophoto.dnsalias.net
Magic tends to disappear in controlled experiments…
Gallery: http://echophoto.smugmug.com
Articles: http://echophoto.dnsalias.net/ekr/index.php/photoarticles
 
"Controlling the lamp was the hard part..."

It always is. One of those things that sounds easy until you try it.
 
The 512K Mac with the 20 MB HD20, plus Excel and the Omnis 3 database, was the revolution in computing in the mid-80s. $10k systems from Apple were way more advanced than $2 million IBM-based systems.
For those of us involved in art and design (including photography) the first colour Mac and the Commodore Amiga were the big jump forward. Both with mice, which was a key part of making computers usable by people without electronics or coding skills.

Before that, we did have graphics tablets with colour software, but you had to work at it to get decent pictures.

I did build a Z80 based computer before that, but it was text only.

I think I drew this on an Amiga, using a mouse.

[attached image]
 
That's still nice today with an Amiga nostalgic feel to it. In 1985 that was like, wow!
 
[attached image]

This is an analog computer that I visited around 1980. It was designed for pattern recognition, with the ultimate aim of being able to read text. The man in the photo built it, from ordinary electronic components (transistors, resistors, etc). It was designed by a researcher at one of the many colleges in London University. Unfortunately I don't know his name.

Motorized resistors were used to record the luminance of pixels in a grid of about 8x8 resolution, which is enough to identify a single character (letter or numeral).
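As a hedged digital sketch of the recognition scheme described above (the analog machine did its "storage" with those motorized resistors), the snippet below compares an 8x8 grid of luminance values against stored character templates and picks the closest one. The templates and the noisy sample are invented purely for illustration, not taken from the machine in the photo.

```python
# Digital sketch of the recognition idea: sample a character as an 8x8 grid
# of luminance values and pick whichever stored template it is closest to.
# The crude glyphs below are invented just to make the example runnable.

TEMPLATES = {
    "I": [
        "00011000", "00011000", "00011000", "00011000",
        "00011000", "00011000", "00011000", "00011000",
    ],
    "L": [
        "01100000", "01100000", "01100000", "01100000",
        "01100000", "01100000", "01111110", "01111110",
    ],
}

def to_grid(rows):
    """Turn a list of 8-character strings into an 8x8 grid of ints."""
    return [[int(ch) for ch in row] for row in rows]

def distance(a, b):
    """Sum of squared differences between two 8x8 luminance grids."""
    return sum((a[r][c] - b[r][c]) ** 2 for r in range(8) for c in range(8))

def classify(sample_rows):
    """Return the template letter whose grid is closest to the sample."""
    sample = to_grid(sample_rows)
    return min(TEMPLATES, key=lambda ch: distance(sample, to_grid(TEMPLATES[ch])))

if __name__ == "__main__":
    # A noisy "I": one pixel flipped compared with the stored template.
    noisy_i = [
        "00011000", "00011000", "00011000", "00010000",
        "00011000", "00011000", "00011000", "00011000",
    ]
    print(classify(noisy_i))   # -> "I"
```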



[attached image]

The designer and his son.
 
Amiga with HAM mode I think is the right answer just looking things up on wikipedia. That fits 1985 just fine.
The Amiga was a big step forward. Unfortunately the money men at Commodore didn't know what they had. They thought computers were just for doing the accounts. So the Amiga was allowed to die.



[attached image]

Amiga 3D program.
 
