which eventually led to me writing my own programs.
The actual PDP 11 was located at a local college. After learning a bit more about the architecture, we two twelve-year-olds called the administrator and asked for root privileges. Sadly, he said, "No." :-)
TECO scared me away; all I ever learned was how to load and run VTEDIT, if I were lucky enough to get one of the few VT100s. Unfortunately, most of the terminals were Visual 200s which frequently broke down and only emulated VT52.
Noting my interest, my dad went out and bought an Apple II+, and mostly by keying in program listings I taught myself 6502 assembler. A Heathkit H-11 (an LSI-11 system in kit form) would have really rocked, but I had a lot of fun with the Apple just the same.
loop:   MOVB    (src)+, (dst)+    ; copy a byte, post-incrementing both pointers
        BNE     loop              ; MOVB sets Z; loop until the terminating zero byte is copied
while (*dst++ = *src++);
The influence of the 11's indexing [Bell], as opposed to the 8's, on C/Unix abounds. Whatever your perspective on 'macro' move instructions, they were certainly the obvious follow-on rage; witness the MOVC3/MOVC5 in the VAX.
I've always enjoyed the notes K&R made following a non-disclosure presentation of the VAX; the most prescient I recall was a comment on the 512-byte page size.
I revered the 11. As a student I had Qbus machines, assembled in places with superior third-party components, scattered through my apartment and basement, from the 23 through the J11 all the way up to an MV2.
Everything since has been anticlimactic :)
Good luck with that.
It's a nice, simple, orthogonal instruction set. The textbooks are available. Why change? Here's the syllabus of the latest incarnation: https://webcourse.cs.technion.ac.il/234118/Spring2017/syllab...
I have a copy of the book. And a PDP-11. Well, most of one, in a closet.
You could use an FPGA to build an emulated RL-05, I suppose, but where's the fun in that?
I'm using a slightly different approach with my PDP-11s: a custom device driver using DRV-11 parallel interfaces, connected to a PC at the other end.
No, no. 4096*16 = 65536 bits.
16 million cores, at one bit per core, would be 2 megabytes. An IBM mainframe of the early 1970s might have that much storage, and the cost would be about $1 million. IBM figured out how to weave core memory on a power loom, which gave them a big cost advantage for a while.
UNIBUS, S100, ISA, AppleII expansion slots, etc etc etc.
Since the very beginning, I've considered that omission to be USB's great failing. There is no reason, given the technology of the time, that a driver standard built around a platform-neutral byte code could not have been done. The byte code could have been transpiled to any arch/OS at driver init time, or on the first USB device mount event, or some such. Glaringly missed opportunity, but nobody asked me.
The host adapter ROM contained a driver for the specific hardware on that card, written in a way that did not make any assumptions about the operating system. (Actually, there were two drivers in the ROM. A 16-bit driver for DOS and Windows 3.x and a 32-bit driver for the others). I'll call the drivers from ROM the "hardware drivers".
The generic driver would find the ROM. The ROM had a header that contained information about the hardware driver. That included pointers to various entry points in the hardware driver, including an init routine. The generic driver would call the init routine, and one of the things it gave the init routine was a table of entry points in the generic driver that the hardware driver could use to do things like allocate and free memory, register interrupt handlers, set up DMA operations, and things like that.
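To make the shape of that interface concrete, here is a minimal C sketch of how such a ROM header and callback table might be laid out. The names, fields, and offsets are invented for illustration; this is not the actual NCR format described above:

    /* Hypothetical sketch: the generic driver finds a header in the option
       ROM, then calls the hardware driver's init routine, handing it a
       table of services it may call back into. */
    #include <stdint.h>

    struct os_services {                  /* callbacks provided by the generic driver */
        void *(*alloc)(uint32_t nbytes);
        void  (*free)(void *p);
        int   (*register_irq)(int irq, void (*handler)(void));
        int   (*setup_dma)(void *buf, uint32_t len, int to_device);
    };

    struct rom_header {                   /* found at a known offset in the ROM */
        uint16_t signature;               /* marks a conforming hardware driver */
        uint16_t version;
        uint32_t init_offset;             /* entry point: init(const struct os_services *) */
        uint32_t start_io_offset;         /* entry point: submit a SCSI request */
        uint32_t code_size;               /* bytes to copy when relocating the driver to RAM */
    };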
My recollection is that the 16-bit hardware drivers were position-independent code that could be run out of the ROM directly, or copied to RAM where it might run faster. The 32-bit code was not position independent, so the generic driver had to copy it to memory and then fix it up, which it could do because the 32-bit driver in ROM was essentially the .o output of the C compiler used to build it, so it had everything needed to move it around.
Once the hardware driver was initialized and running, the interface between it and the generic driver for actually doing SCSI commands was based on a draft version of the SCSI CAM specification. We were on the CAM committee, and proposed including our ROM-based hardware driver approach as part of the standard, but most other committee members didn't think being able to swap host adapters without having to change OS drivers was useful enough.
 "We" == the consulting company that designed and implemented the aforementioned stuff for NCR. I was the lead architect and lead programmer for the project.
Open Firmware Forth Code may be compiled into FCode, a bytecode which is independent of computer architecture details such as the instruction set and memory hierarchy. A PCI card may include a program, compiled to FCode, which runs on any Open Firmware system. In this way, it can provide platform-independent boot-time diagnostics, configuration code, and device drivers. FCode is also very compact, so that a disk driver may require only one or two kilobytes. Therefore, many of the same I/O cards can be used on Sun systems and Macintoshes that used Open Firmware. FCode implements ANS Forth and a subset of the Open Firmware library.
The PC had a similar feature: https://en.wikipedia.org/wiki/Option_ROM
> During the design stage, Steve Jobs argued that the Apple II should have two expansion slots, while Wozniak wanted six. After a heated argument, during which Wozniak had threatened for Jobs to 'go get himself another computer', they decided to go with eight slots. The Apple II became one of the first highly successful mass-produced personal computers.
"Sorry, can't use a modem right now, I have both a floppy drive and a printer plugged in."
Given that this was the result when Jobs didn't have Woz vetoing him, I doubt it.
The guy was obsessed with looks and "experience". To him, a computer was to be a magical black box that people powered on and powered off. To open up the case and poke around inside was "dirty".
He later wrote his PhD thesis on an Epson HX-20 and backed it up to PDP-11 magnetic tape.
When I was 16, we went to a computer museum to try to get his old backup off, but their PDP-11's Winchester hard drive was broken, so we couldn't boot it to load up something to read the tape.
Lesson to learn: copy your old backups forward when storage formats change.
Mounting and dismounting those things was one of the factors that made me swear I would get more into the software side of these newfangled 'computer' thingies rather than the hardware... :)
Never having done ARM assembly language programming, I have to ask: does ARM have I/O instructions, or is it strictly memory-mapped?
RISC processors tend not to have dedicated instructions for things (it's in the name). They also tend to restrict memory access to a couple of instructions (load and store), let alone support an entirely separate address space dedicated to I/O. So processors like ARM can't practically have I/O instructions; everything is memory-mapped. If you belong to the faction that believes I/O instructions are the way to go, then you would consider that a weakness of RISC processors… :)
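To make that concrete, here is a minimal C sketch of memory-mapped I/O as it looks on ARM (and most RISC machines). The UART address and register layout are made up for illustration:

    #include <stdint.h>

    /* Device registers are just addresses; ordinary loads and stores
       (LDR/STR on ARM) poke the hardware. Addresses below are hypothetical. */
    #define UART_BASE  0x10000000u
    #define UART_TX    (*(volatile uint32_t *)(UART_BASE + 0x0))
    #define UART_STAT  (*(volatile uint32_t *)(UART_BASE + 0x4))
    #define TX_READY   (1u << 0)

    static void uart_putc(char c)
    {
        while (!(UART_STAT & TX_READY))   /* spin until the transmitter is free */
            ;
        UART_TX = (uint32_t)c;            /* a plain store, no special I/O instruction */
    }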
Here's another quote from the article that seems to be a constant feature over the decades:
>PDP is an acronym for “Programmed Data Processor”, as at the time, computers had a reputation of being large, complicated, and expensive machines, and DEC’s venture capitalists would not support them if they built a “computer”
Looks like some VCs have always been more impressed by the slide deck and presentation than by the actual potential of the business concept or the individuals developing the technology.
That's something worth learning as well.
Anyway, anybody want a used VAX 4000-200? Available for pick up in Houston this week.
If so, post PM info here along with what you would like to do with it.
I'll check this thread in a few days to see if there is any interest.
Also, an HP1000 in a full rolling rack the size of a refrigerator.
One thing I believe Bell missed in "what we learned." Of course, maybe it's hindsight. The regular instruction set in the '11 and the VAX was fertile ground for all sorts of innovation in compilers and optimization technology. Without those innovations it would have been harder for the gnarly-instruction processors (386 line, I'm looking at you) to gain users.