
Milestones

So I was working as a programmer, using FORTRAN mainly. The computer was a CDC6600, a "RISC" (reduced instruction set) machine, designed to be very fast. We punched in our programs on cards, and they were read in. (Recall that in the very early days there were no computer languages, and one had to enter programs by toggling switches, so this was a sophisticated improvement.) One day, over in the corner of the keypunch room, a little CRT monitor with a keyboard appeared, and everyone wondered what it was.
The whole thing was in a basement under a gym (reminds me of Fermi's first atomic "pile" under the football stadium at the University of Chicago, which had the potential to start a runaway chain reaction and blow up the city), and the air conditioning would often go out and force them to shut the machine--which was huge--down to avoid overheating.
Gremlins are real. Recall the "Great Galactic Ghoul", which scuttled many of the early Mars missions, protecting the planet from spacecraft.
When you are a poor student who has to scrape a few pennies to buy either paper or beer, how do you get a ream of paper for free?

Do some sums in FORTRAN using said punch cards, but 'accidentally' end up dividing by zero. Plenty of paper in the output, as long as you don't mind the word ERROR! written neatly down the side of each one.
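Something along these lines would do it - a purely hypothetical fixed-form sketch, not any real deck. On a modern compiler you would just get a page of infinities unless floating-point trapping is enabled; on the installation described above, each bad result earned its ERROR! flag on the printout.

C     HYPOTHETICAL SKETCH OF THE PAPER TRICK, FIXED-FORM FORTRAN.
C     EACH PASS DIVIDES BY A ZERO VARIABLE AND PRINTS THE RESULT,
C     SO THE LINE PRINTER EMITS ONE LINE PER ITERATION.
      PROGRAM PAPER
      REAL X, ZERO
      ZERO = 0.0
      DO 10 I = 1, 60000
         X = 1.0 / ZERO
         WRITE (6, 20) I, X
   10 CONTINUE
   20 FORMAT (1X, I6, E12.4)
      END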
 
So I was working as a programmer, using FORTRAN mainly. The computer was a CDC6600, a "RISC" (reduced instruction set) machine, designed to be very fast. We punched in our programs on cards, and they were read in. (Recall that in the very early days there were no computer languages, and one had to enter programs by toggling switches, so this was a sophisticated improvement.) One day, over in the corner of the keypunch room, a little CRT monitor with a keyboard appeared, and everyone wondered what it was.
The whole thing was in a basement under a gym (reminds me of Fermi's first atomic "pile" under the football stadium at the University of Chicago, which had the potential to start a runaway chain reaction and blow up the city), and the air conditioning would often go out and force them to shut the machine--which was huge--down to avoid overheating.
Gremlins are real. Recall the "Great Galactic Ghoul", which scuttled many of the early Mars missions, protecting the planet from spacecraft.
I'm surprised. I thought RISC architecture only came in during the 80s, long after punch cards had stopped being used.
 
I'm surprised. I thought RISC architecture only came in during the 80s, long after punch cards had stopped being used.
Agreed. I thought RISC was something invented by Acorn Computers in the UK back in the 80s, before they split that side of the business off into a little outfit called Acorn RISC Machines (ARM), whose processor architecture is now found inside pretty much every mobile device on the planet.
 
When you are a poor student who has to scrape a few pennies to buy either paper or beer, how do you get a ream of paper for free?

Do some sums in FORTRAN using said punch cards, but 'accidentally' end up dividing by zero. Plenty of paper in the output, as long as you don't mind the word ERROR! written neatly down the side of each one.
I might be wrong, but I think one of the earliest precursors to viruses is proposed to have been a programme which output just zeros on a punchcard, essentially causing the card to deteriorate within the machine.
 
Agreed. I thought RISC was something invented by Acorn Computers in the UK back in the 80s, before they split that side of the business off into a little outfit called Acorn RISC Machines (ARM), whose processor architecture is now found inside pretty much every mobile device on the planet.
I understood there were multiple RISC architectures developed during the 80s, including Acorn's ARM, but I didn't think it was the first one.
As well as being used in mobile devices today, RISC CPUs were also the design of choice for game consoles during the 90s, such as the Sega Saturn and the PlayStation.
Looking up the CDC6600 on Wikipedia, I read that the duties that fell to the main CPU in most computers of the time were split off to multiple 'Peripheral Processors' in the CDC6600, resulting in a reduced instruction set for the main CPU and making it the first RISC computer. But according to Wikipedia, the term RISC only came later.
 
So I was working as a programmer, using FORTRAN mainly. The computer was a CDC6600, a "RISC" (reduced instruction set) machine, designed to be very fast. We punched in our programs on cards, and they were read in. (Recall that in the very early days there were no computer languages, and one had to enter programs by toggling switches, so this was a sophisticated improvement.) One day, over in the corner of the keypunch room, a little CRT monitor with a keyboard appeared, and everyone wondered what it was.
The whole thing was in a basement under a gym (reminds me of Fermi's first atomic "pile" under the football stadium at the University of Chicago, which had the potential to start a runaway chain reaction and blow up the city), and the air conditioning would often go out and force them to shut the machine--which was huge--down to avoid overheating.
Gremlins are real. Recall the "Great Galactic Ghoul", which scuttled many of the early Mars missions, protecting the planet from spacecraft.
The very first time I saw a 'real' computer was in early 1980. Actually, we only saw the keyboard, and I seriously did wonder what that 'typewriter' was intended for, since I had expected to see something with lots of switches to run it.
 
I understood there were multiple RISC architectures developed during the 80s, including Acorn's ARM, but I didn't think it was the first one.
As well as being used in mobile devices today, RISC CPUs were also the design of choice for game consoles during the 90s, such as the Sega Saturn and the PlayStation.
Looking up the CDC6600 on Wikipedia, I read that the duties that fell to the main CPU in most computers of the time were split off to multiple 'Peripheral Processors' in the CDC6600, resulting in a reduced instruction set for the main CPU and making it the first RISC computer. But according to Wikipedia, the term RISC only came later.
According to Wikipedia, MIPS was one of the first popularized RISC architectures, developed at Stanford University between 1981 and 1984, while a project conducted at the University of California, Berkeley between 1980 and 1984 popularized the term RISC. I think the development of RISC is tied between MIPS and Berkeley.
Acorn's ARM started development in 1983, drawing on the work done at Berkeley, in an effort to compete with the higher-performing IBM PC, which they thought was disproportionately expensive for the performance advantage it offered over the BBC Micro.

EDIT:
IBM's 801 is the main contender for the earliest RISC architecture. It was developed between 1975 and 1980, after it was found, while working on a telephone switching system designed to handle 1 million calls per hour, that compilers made good use of the registers but ignored a lot of the instruction set. It later led to the development of IBM's RISC architecture, POWER.

I also only mentioned RISC being used in two consoles from the 90s, but really it was used in most consoles from the mid-90s onwards, including the Wii U and Nintendo Switch.
 
Agreed. I thought RISC was something invented by Acorn Computers in the UK back in the 80s, before they split that side of the business off into a little outfit called Acorn RISC Machines (ARM), whose processor architecture is now found inside pretty much every mobile device on the planet.
I read that in the introduction to the book I bought on CDC6600 assembler (Seymour Cray, later of Cray computers, was the architect). I programmed rather extensively in that assembly language. As I recall (which I am not really eager to do now), there were A (address) registers and longer X (numerical) registers, 7 of each if I recall. There were load instructions for each set, numerical instructions for the X's, and logical and/or/nor instructions. There wasn't much else.
By contrast, when I learned IBM 370 assembler, there were many more instructions, lots of "move" instructions, and even a MVCL (move long--move a large string of bytes from one starting memory location to another) instruction that could be interrupted, and I think 13 total registers, all the same. You could use the registers in the various move instructions to shift contents of addresses between memory locations. This was of course geared toward COBOL, which dealt with text much more than really accurate scientific calculation. I don't recall this kind of thing with the CDC (I may be misremembering). One had to store a memory byte address in an A register, a target address in another, then do a store. You could also store the actual contents of a register to memory. So there were many fewer instructions in the 6600.
When I was getting a master's degree, we had to use a Motorola processor (the 8600 if I recall), and it too had lots more instructions than the CDC had. You could do all this stuff with the CDC, but you did need to string a lot of stuff together and roll your own, basically. The CDC was designed for fast numerical computation for scientific applications--it was a supercomputer of its day.
At the time, "computer science" was new and almost all the faculty were arrogant, and because there were jobs out there, their department had more potential majors than it could serve. In the operating systems course, the instructor was so full of himself that he would often spend the whole class yapping about Robin Hood or English grammar (when can you use "its" and when can you use "it's") to demonstrate his vast expertise (beats actually preparing a lecture, I guess), although half the class was Chinese or South Asian and didn't really know or care. Once someone raised her hand and asked, "Can we PLEASE go on?" When he did talk about interrupts, he said that there was an interruptible assembly instruction, and he leeringly asked if anyone knew what it was. The poor guy was crestfallen when I raised my hand and said it was IBM's MVCL. "That's what I was thinking of." In the end, he didn't get tenure.
 
I read that in the introduction to the book I bought on CDC6600 assembler (Seymour Cray, later of Cray computers, was the architect). I programmed rather extensively in that assembly language. As I recall (which I am not really eager to do now), there were A (address) registers and longer X (numerical) registers, 7 of each if I recall. There were load instructions for each set, numerical instructions for the X's, and logical and/or/nor instructions. There wasn't much else.
By contrast, when I learned IBM 370 assembler, there were many more instructions, lots of "move" instructions, and even a MVCL (move long--move a large string of bytes from one starting memory location to another) instruction that could be interrupted, and I think 13 total registers, all the same. You could use the registers in the various move instructions to shift contents of addresses between memory locations. This was of course geared toward COBOL, which dealt with text much more than really accurate scientific calculation. I don't recall this kind of thing with the CDC (I may be misremembering). One had to store a memory byte address in an A register, a target address in another, then do a store. You could also store the actual contents of a register to memory. So there were many fewer instructions in the 6600.
When I was getting a master's degree, we had to use a Motorola processor (the 8600 if I recall), and it too had lots more instructions than the CDC had. You could do all this stuff with the CDC, but you did need to string a lot of stuff together and roll your own, basically. The CDC was designed for fast numerical computation for scientific applications--it was a supercomputer of its day.
At the time, "computer science" was new and almost all the faculty were arrogant, and because there were jobs out there, their department had more potential majors than it could serve. In the operating systems course, the instructor was so full of himself that he would often spend the whole class yapping about Robin Hood or English grammar (when can you use "its" and when can you use "it's") to demonstrate his vast expertise (beats actually preparing a lecture, I guess), although half the class was Chinese or South Asian and didn't really know or care. Once someone raised her hand and asked, "Can we PLEASE go on?" When he did talk about interrupts, he said that there was an interruptible assembly instruction, and he leeringly asked if anyone knew what it was. The poor guy was crestfallen when I raised my hand and said it was IBM's MVCL. "That's what I was thinking of." In the end, he didn't get tenure.
I find this very interesting, thank you for sharing. So more complex forms of addressing, like arrays and offsetting from a pointer, are done by performing arithmetic in the X registers and moving the results to the A registers. Would the X registers also be used for storing logical conditions generated by subtraction (greater than, less than, or equal, represented as plus, minus, and zero), with the logical operations between them used as comparisons for branching?
 
I find this very interesting, thank you for sharing. So more complex forms of addressing, like arrays and offsetting from a pointer, are done by performing arithmetic in the X registers and moving the results to the A registers. Would the X registers also be used for storing logical conditions generated by subtraction (greater than, less than, or equal, represented as plus, minus, and zero), with the logical operations between them used as comparisons for branching?
Sorry, I don't remember. I looked for the book, but couldn't find it in my piles. I know I still have it. Certainly the architecture supported arrays, because FORTRAN did.
I believe the introduction said that the machine was RISC de facto, but it wasn't called that originally. It just turned out that their architecture tried to minimize instructions for speed and followed what later became the "RISC way". This was an expensive computer, designed with fast computation in mind. Cray is a big name in the history of computers.
 
Sorry, I don't remember. I looked for the book, but couldn't find it in my piles. I know I still have it. Certainly the architecture supported arrays, because FORTRAN did.
I believe the introduction said that the machine was RISC de facto, but it wasn't called that originally. It just turned out that their architecture tried to minimize instructions for speed and followed what later became the "RISC way". This was an expensive computer, designed with fast computation in mind. Cray is a big name in the history of computers.
I know that it is not a reliable source, but the Wikipedia page for the CDC 6600 claims it to be the first RISC computer, albeit before the term was invented, while the page on RISC architecture cites Jack Dongarra's claim that the CDC 6600 is a forerunner of RISC but didn't overcome all of the technical barriers of a modern RISC system.
Cray must have had a very innovative mind, to have imagined something over ten years before it would become widespread and popular.
 
When you are a poor student who has to scrape a few pennies to buy either paper or beer, how do you get a ream of paper for free?

Do some sums in FORTRAN using said punch cards, but 'accidentally' end up dividing by zero. Plenty of paper in the output, as long as you don't mind the word ERROR! written neatly down the side of each one.
One of my many problems was that I didn't drink beer as a student.
I did once need to do an integral on a computer. My FORTRAN instructor gave me a paper on some "numerical analysis" methods of doing so--people had already figured it out--but I ignored it and tried to program it on my own and ended up dividing by zero often. Fortunately, the printer was owned by the University, and they cleaned up the excess paper. It was in a room distinct from the computer itself, to keep underclassmen out of the Holy of Holies.
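For what it's worth, the sort of method such a paper would typically have covered is something like the trapezoidal rule. Below is a minimal, purely illustrative fixed-form FORTRAN sketch; the integrand F, the limits, and the panel count are placeholders rather than my original problem, and nothing in it divides by zero as long as N is positive.

C     ILLUSTRATIVE TRAPEZOIDAL-RULE SKETCH (PLACEHOLDER INTEGRAND).
C     APPROXIMATES THE INTEGRAL OF F FROM A TO B USING N PANELS.
      PROGRAM TRAP
      REAL A, B, H, SUM, F
      INTEGER N, I
      A = 0.0
      B = 1.0
      N = 100
      H = (B - A) / REAL(N)
      SUM = 0.5 * (F(A) + F(B))
      DO 10 I = 1, N - 1
         SUM = SUM + F(A + REAL(I) * H)
   10 CONTINUE
      WRITE (6, 20) H * SUM
   20 FORMAT (1X, 'APPROXIMATE INTEGRAL = ', E12.4)
      END
C     PLACEHOLDER INTEGRAND, HERE JUST X SQUARED.
      REAL FUNCTION F(X)
      REAL X
      F = X * X
      RETURN
      END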
 
Fifty years ago today, Richard Nixon closed the "gold window" at the US Federal Reserve under which foreign governments could exchange their dollars for gold at the fixed rate of $35/ounce. This effectively brought to an end the system of fixed exchange rates established under the Bretton Woods Agreement in the waning days of World War II, ushering in the current system of floating exchange rates. Was this a good or a bad thing? Damned if I know...
 
Fifty years ago today, Richard Nixon closed the "gold window" at the US Federal Reserve under which foreign governments could exchange their dollars for gold at the fixed rate of $35/ounce. This effectively brought to an end the system of fixed exchange rates established under the Bretton Woods Agreement in the waning days of World War II, ushering in the current system of floating exchange rates. Was this a good or a bad thing? Damned if I know...
I guess Tricky Dickie had no choice once all the gold in Fort Knox had been sold (After all, everybody knows that the vaults there are empty, right? :) )

That's probably why the producers of "Goldfinger" were refused permission to film scenes inside of Fort Knox, forcing production designer Ken Adam to create this impressive masterpiece:
[Image: Ken Adam's Fort Knox vault set from Goldfinger]

Probably just as well, as I seriously doubt that the real thing is anything like as cool-looking as this film set. Everything about this set is classic Ken Adam - the vaulted roof, acres of polished stone and stainless steel surfaces - definitely a distinctive visual style that is instantly identifiable in all the films that he worked on.
 
I guess Tricky Dickie had no choice once all the gold in Fort Knox had been sold (After all, everybody knows that the vaults there are empty, right? :) )

That's probably why the producers of "Goldfinger" were refused permission to film scenes inside of Fort Knox, forcing production designer Ken Adam to create this impressive masterpiece:
[Image: Ken Adam's Fort Knox vault set from Goldfinger]

Probably just as well, as I seriously doubt that the real thing is anything like as cool-looking as this film set. Everything about this set is classic Ken Adam - the vaulted roof, acres of polished stone and stainless steel surfaces - definitely a distinctive visual style that is instantly identifiable in all the films that he worked on.
It just needs a few crosses with naked women writhing on them - manip artists?
 
It just needs a few crosses with naked women writhing on them - manip artists?
Well if Oddjob had been a naked woman, her demise would have been a bit more interesting than watching a fat Korean guy getting fried, that's for sure :D

(On the subject of which, surely the felt covering on Oddjob's hat would have insulated the steel brim, unless it was wet of course, which it wasn't in the film. But yeah, whatever - it's still a great film even though it doesn't get a lot of the stuff right: skin suffocation with gold paint isn't really a thing, though heat exhaustion may be, and Goldfinger getting blown (not sucked - a common misconception) out of a tiny aircraft window isn't either, as real decompression doesn't work like that. Both of these topics were thoroughly put to bed on Mythbusters years ago, none of which spoils the film of course.)
 
Was this a good or a bad thing? Damned if I know...
Nor do I. Though I think we would not have cyber currency devouring our electricity otherwise.
It just needs a few crosses with naked women writhing on them
Crosses of Gold? An early opponent of the gold standard: William Jennings Bryan.

The most famous political speech in the history of the United States.
"...we shall answer their demands for a gold standard by saying to them, you shall not press down upon the brow of labor this crown of thorns. You shall not crucify mankind upon a cross of gold!"
 
August 12, 1991 - exactly 30 years ago, the fifth studio album of Metallica, also known as The Black Album, was released.
It was the first album that introduced me to foreign modern music after the fall of the Iron Curtain. And perhaps it determined my tastes for decades to come. I learned about Led Zeppelin, Deep Purple, and Pink Floyd much later.
[Image: cover of Metallica's self-titled "Black Album"]
 
Twenty-five years ago today, on 16 August 1996, an unscheduled flight took to the skies from the Kandahar Airport and sped towards the airspace of Iran.

That wasn't a run-of-the-mill charter thing, though. A crew of seven Russian flyers who had been captured the year before by the nice Taliban people seized control of their own Il-76 and took off, Hollywood-like -- kidnapping three of their jailers into the bargain. :goofy:
 