simulate 12 days ago [-]
We had a terminal to a PDP-11/70 in our middle school library in 1977, when I was 12. The terminal printed on green-and-white striped paper, and no one at our school knew how to operate it until a friend and I began to play with it. I hand-typed games from this book:

which eventually led to me writing my own programs.

The actual PDP-11 was located at a local college. After learning a bit more about the architecture, we two twelve-year-olds called the administrator and asked for root privileges. Sadly, he said, "No." :-)

flyinghamster 11 days ago [-]
My high school got a PDP-11/34 when I was a sophomore (class of '82), and offered a one-week intro that summer to all interested students. I was hooked, and wound up taking programming courses in junior and senior years (BASIC as a junior, COBOL as a senior). We were running RSTS/E, with BASIC-PLUS and WATBOL. WATFOR was also available, but none of my courses used it.

TECO scared me away; all I ever learned was how to load and run VTEDIT, if I were lucky enough to get one of the few VT100s. Unfortunately, most of the terminals were Visual 200s which frequently broke down and only emulated VT52.

Noting my interest, my dad went out and bought an Apple II+, and mostly by keying in program listings I taught myself 6502 assembler. A Heathkit H-11 (an LSI-11 system in kit form) would have really rocked, but I had a lot of fun with the Apple just the same.

ralphc 10 days ago [-]
It was the 70's. I was in a similar situation, I just guessed bad words as passwords until I got in with "login 1 hell". My first hack :)
panic 12 days ago [-]
One can write a string copy routine using two instructions, assuming that the source and destination are already in registers.

    loop:   MOVB (src)+, (dst)+
            BNE loop
Suddenly the C idiom

    while (*dst++ = *src++);
makes a lot more sense!
Crashprofessor 11 days ago [-]
Though Ritchie has portrayed the similarities as coincidence, the influence of the 11's indexing modes (versus the 8's) on C and Unix abounds [Bell]. Whatever your perspective on 'macro' move instructions, they were certainly the obvious follow-on rage; witness MOVC3/MOVC5 on the VAX.

I've always enjoyed the notes K&R made following a non-disclosure presentation of the VAX; the most prescient I recall was a comment on the 512-byte page size.

I revered the 11. As a student I had Q-bus machines, assembled in places with superior third-party components, scattered through my apartment and basement, from the 11/23 through the J-11 all the way up to a MicroVAX II.

Everything since has been anticlimactic :)

dbcurtis 12 days ago [-]
For sure. If you look at Old Testament K&R C and PDP-11 assembly language side by side, you can see the mapping very clearly. It's been a looooong time since I did PDP-11 at that level, but if memory serves, I'm pretty sure that a switch(e) {...} can be done as a single jump instruction with indexed-indirect addressing mode.
hazeii 11 days ago [-]
Adding to the fun with PDP-11 autoincrement:

    MOV (PC)+,(PC)+
What does that do? It's a single-instruction program that replicates itself throughout memory.
cesarb 11 days ago [-]
Suddenly Core War makes a lot more sense.
agumonkey 12 days ago [-]
My teacher regularly called C a macro assembler.
davidw 11 days ago [-]
"C: all the power of assembly language with all the ease-of-use of assembly language"
beefhash 11 days ago [-]
Not quite, unfortunately. In assembly language, you don't have to fear that your compiler will optimize away the clearing of cryptographic keys from memory at the end of a function.
syrrim 11 days ago [-]
In assembly, you don't have to worry about pesky optimizations speeding up your code at all. A similar effect can be achieved in C by not manually enabling optimizations.
dangerface 11 days ago [-]
"A similar effect can be achieved in C by not manually enabling optimizations."

Good luck with that.

mpweiher 11 days ago [-]
Nor in K&R C, or even early ANSI C. This whole "we can use undefined behavior to 'optimize' away half your program" idea is fairly new.
OliverJones 11 days ago [-]
Yah, and guess what machine instructions bk and dmr were targeting with that compiler!
oldgradstudent 11 days ago [-]
The Technion in Israel still teaches PDP-11 assembly language in its computer organization course -- using a simulator, of course.

It's a nice, simple, orthogonal instruction set. The textbooks are available. Why change? Here's the syllabus of the latest incarnation:

dboreham 12 days ago [-]
His book "Computer Engineering" is an interesting read. I noticed someone posted a scanned PDF:

I have a copy of the book. And a PDP-11. Well, most of one, in a closet.

gravypod 12 days ago [-]
I'm an early career software engineer and I hope that one day I can retire with one of those machines in my basement.
dboreham 11 days ago [-]
A big issue is the hard drives, which used materials that don't stand up long-term (urethane foam, for example).

You could use an FPGA to build an emulated RL-05, I suppose, but where's the fun in that?

hazeii 11 days ago [-]
The bus is sufficiently slow that you can emulate the hardware with an MCU.

I'm using a slightly different approach with my PDP-11s: a custom device driver using DRV-11 parallel interfaces connected to a PC at the other end.

Animats 11 days ago [-]
> 4096 word memory contained 16 million cores

No, no. 4096*16 = 65536 bits.

16 million cores would be 2 megabytes. An IBM mainframe of the early 1970s might have that much storage, and the cost would be about $1 million. IBM figured out how to weave core memory on a power loom, which gave them a big cost advantage for a while.

Taniwha 11 days ago [-]
Yes - we bought 1.5 megabyte of core for our Burroughs 6700 in the late 70s, we paid NZ$1M for ours (in those days ~US$1.25M, before the Muldoon NZ$ crash of the mid 80s)
digi_owl 12 days ago [-]
It seems that all successful computer designs, large or small, have one thing in common: the presence of an expansion system that allows the computer to take on different tasks for different users.

UNIBUS, S100, ISA, AppleII expansion slots, etc etc etc.

dbcurtis 12 days ago [-]
The Apple II was particularly elegant, in that it allowed the driver code to be in a ROM on the I/O board. Plug-n-play???? pfffft, sorry MSFT. When you plugged in an Apple II peripheral card, the driver was installed. Full stop.

Since the very beginning, I've considered that omission to be USB's great failing. There is no reason, given the technology of the time, that a driver standard built around a platform-neutral byte code could not have been done. The byte code could have been transpiled to any arch/OS at driver init time, or on first USB device mount event, or some such. Glaringly missed opportunity, but nobody asked me.

tzs 11 days ago [-]
Something kind of like that was done for x86 with NCR's SCSI host adapters. NCR provided a traditional driver for each supported OS (DOS, Windows, OS/2, Novell Netware, SCO Unix) but that driver didn't know anything about the actual SCSI host adapter hardware. I'll call this the "generic driver".

The host adapter ROM contained a driver for the specific hardware on that card, written in a way that did not make any assumptions about the operating system. (Actually, there were two drivers in the ROM. A 16-bit driver for DOS and Windows 3.x and a 32-bit driver for the others). I'll call the drivers from ROM the "hardware drivers".

The generic driver would find the ROM. The ROM had a header that contained information about the hardware driver. That included pointers to various entry points in the hardware driver, including an init routine. The generic driver would call the init routine, and one of the things it gave the init routine was a table of entry points in the generic driver that the hardware driver could use to do things like allocate and free memory, register interrupt handlers, set up DMA operations, and things like that.

My recollection is that the 16-bit hardware drivers were position-independent code that could be run out of the ROM directly, or copied to RAM where they might run faster. The 32-bit code was not position-independent, so the generic driver had to copy it to memory and then fix it up, which it could do because the 32-bit driver in ROM was essentially the .o output of the C compiler used to build it, so it had everything needed to move it around.

Once the hardware driver was initialized and running, the interface between it and the generic driver for actually doing SCSI commands was based on a draft version of the SCSI CAM specification. We [1] were on the CAM committee, and proposed including our ROM-based hardware driver approach as part of the standard, but most other committee members didn't think being able to swap host adapters without having to change OS drivers was useful enough.

[1] "We" == the consulting company that designed and implemented the aforementioned stuff for NCR. I was the lead architect and lead programmer for the project.

DonHopkins 11 days ago [-]
OpenFirmware uses architecture independent FORTH byte code, so peripheral cards can include machine independent drivers and diagnostics!

Open Firmware Forth Code may be compiled into FCode, a bytecode which is independent of computer architecture details such as the instruction set and memory hierarchy. A PCI card may include a program, compiled to FCode, which runs on any Open Firmware system. In this way, it can provide platform-independent boot-time diagnostics, configuration code, and device drivers. FCode is also very compact, so that a disk driver may require only one or two kilobytes. Therefore, many of the same I/O cards can be used on Sun systems and Macintoshes that used Open Firmware. FCode implements ANS Forth and a subset of the Open Firmware library.

userbinator 11 days ago [-]
The Apple II was particularly elegant, in that it allowed the driver code to be in a ROM on the I/O board.

The PC had a similar feature:

fooker 12 days ago [-]
Too easy to reverse engineer; I can't see hardware vendors getting on board.
ChrisSD 12 days ago [-]
Would that not have worse security implications?
flukus 11 days ago [-]
Apparently Steve Wozniak had to fight pretty hard for the expansion slots. From Wikipedia:

> During the design stage, Steve Jobs argued that the Apple II should have two expansion slots, while Wozniak wanted six. After a heated argument, during which Wozniak suggested Jobs 'go get himself another computer', they decided to go with eight slots. The Apple II became one of the first highly successful mass-produced personal computers.

apricot 11 days ago [-]
> During the design stage, Steve Jobs argued that the Apple II should have two expansion slots

"Sorry, can't use a modem right now, I have both a floppy drive and a printer plugged in."

Finnucane 11 days ago [-]
It would have been worse than that. I had a II+, and my memory is that the ability to use 80-column text on screen required a video card that took one of the slots (the main difference between the II and the II+ was that this card was included in the package). If you only had two slots, you'd be done as soon as you plugged in the drive.
willtim 11 days ago [-]
Maybe Jobs was hoping to sell us an Apple III with 3 expansion slots the following year.
digi_owl 11 days ago [-]

Given that this was the result when Jobs didn't have Woz vetoing him, I doubt it.

The guy was obsessed with looks and "experience". To him, a computer was to be a magical black box that people powered on and powered off. To open up the case and poke around inside was "dirty".

zaarn 11 days ago [-]
I think they were more aiming at gradually removing expansion slots but selling serial-port-to-expansion-port dongles...
nobleach 11 days ago [-]
Yeah, I had my Disk ][ card in slot 6 or 7, the 16K of RAM and the AppleSoft card in slot 1, and the Epson MX-80 plugged in somewhere there too. I had an 80-column card in one of my other slots. Other than that, the Koala Pad plugged into the joystick/paddle DIP plug. Yeah, 2 wouldn't have done it for me.
DonHopkins 11 days ago [-]
And if that's not enough slots for you, get a Mountain Computer Expansion Box!

timonoko 11 days ago [-]
I had a PDP-11 for my personal use in the Finnish Army in 1978. It was quite useless and I learned nothing.
larsbrinkhoff 11 days ago [-]
I don't see it.
timonoko 11 days ago [-]
It was just like the pictures, with pretty colors and buttons. My personal Nova was much uglier.
peterburkimsher 11 days ago [-]
My dad started on a PDP-8 at my granddad's workplace (a newspaper).

He later wrote his PhD thesis on an Epson HX-20 and backed it up to PDP-11 magnetic tape.

When I was 16, we went to a computer museum to try to get his old backup off, but their PDP-11's Winchester hard drive was broken, so we couldn't boot it to load up something to read the tape.

Lesson to learn: copy your old backups forward when storage formats change.

cyberferret 12 days ago [-]
Wow - one of my first jobs in computing was changing the backup disks (huge things, bigger than an LP record) on an old DEC PDP-8 and PDP-11 system at my boss's parents' business.

Mounting and dismounting those things was one of the factors that made me swear I would get more into the software side of these newfangled 'computer' thingies rather than the hardware... :)

tasty_freeze 11 days ago [-]
You can easily see the influence of the PDP-11 on the 68000 architecture and instruction set.
mark-r 11 days ago [-]
I programmed assembly for both the PDP-11 and the 68000, and a couple of others. The 68000 was by far my favorite.
larsbrinkhoff 11 days ago [-]
Also MSP430.
flyinghamster 11 days ago [-]
Overall, a very good article, though I'd pick this nit: separate I/O instructions survive in 8086 and AMD64 as well, at the very least.

Never having done ARM assembly language programming, I have to ask: does ARM have I/O instructions, or is it strictly memory-mapped?

firethief 11 days ago [-]
Those are descendants of the 8080 he mentions as one of the two exceptions.
upofadown 11 days ago [-]
The comment about I/O instructions really comes from the microprocessor wars of the 80's. Intel and Intel influenced processors had I/O instructions and Motorola and Motorola influenced processors did not. So it was a thing to argue about. The difference really amounted to nothing in the end.

RISC processors tend not to have dedicated instructions for things (it's in the name). They also tend to restrict memory access to explicit load and store instructions, let alone have an entirely separate memory bus dedicated to I/O. So processors like the ARM can't practically have I/O instructions. If you belong to the faction that believes I/O instructions are the way to go, then you would consider that a weakness of RISC processors… :)

comex 11 days ago [-]
ARM has no I/O instructions per se. It does have a whole range of instructions to deal with “coprocessors”, which was mainly used for floating point (originally a separate chip, nowadays integrated into the processor but still using that part of the instruction set).
thechao 11 days ago [-]
For those of you recalling fond memories of PDP days, just remember the poor wretches still using these devices. Thankfully, this is no longer me.
fuzzfactor 9 days ago [-]
>through the lens of our own 20/20 hindsight

Here's another quote from the article that seems to be a constant feature over the decades:

>PDP is an acronym for “Programmed Data Processor”, as at the time, computers had a reputation of being large, complicated, and expensive machines, and DEC’s venture capitalists would not support them if they built a “computer”

Looks like some VC's have always been more impressed by the slide deck and presentation than the actual potential of the business concept or individuals developing the technology.

That's something worth learning as well.

Anyway, anybody want a used VAX 4000-200? Available for pick up in Houston this week.

If so, post PM info here along with what you would like to do with it.

I'll check this thread in a few days to see if there is any interest.

Also, an HP1000 in a full rolling rack the size of a refrigerator.

OliverJones 11 days ago [-]
Blast from the past! Too much time spent on that 11/70's front-panel switches.

One thing I believe Bell missed in "what we learned." Of course, maybe it's hindsight. The regular instruction set in the '11 and the VAX was fertile ground for all sorts of innovation in compilers and optimization technology. Without those innovations it would have been harder for the gnarly-instruction processors (386 line, I'm looking at you) to gain users.

baking 11 days ago [-]
Regarding the UNIBUS, this needs a picture of the wire-wrapped backplane of a PDP-8 (especially the denser later versions with the flip-chips). The front looks neat and tidy, but opening the back is still a part of my nightmares.
hsnewman 11 days ago [-]
I got a PDP-8 emulator with a really cool blinking-lights display; my friends think I'm a geek.
leed25d 11 days ago [-]
I began programming as a career in 1974 after graduating from college. At the time, and well into the 90s, the PDP-11/VAX was my favorite minicomputer.