What Have We Learned from the PDP-11? (cheney.net)
149 points by davecheney on Dec 6, 2017 | 55 comments



We had a terminal to a PDP-11/70 in our middle school library in 1977 when I was 12. The terminal printed on green-and-white striped paper, and no one at our school knew how to operate it until a friend and I began to play with it. I hand-typed games from this book: https://www.atariarchives.org/basicgames/showpage.php?page=c...

which eventually led to me writing my own programs.

The actual PDP-11 was located at a local college. After learning a bit more about the architecture, we two twelve-year-olds called the administrator and asked for root privileges. Sadly, he said, "No." :-)


My high school got a PDP-11/34 when I was a sophomore (class of '82), and offered a one-week intro that summer to all interested students. I was hooked, and wound up taking programming courses in junior and senior years (BASIC as a junior, COBOL as a senior). We were running RSTS/E, with BASIC-PLUS and WATBOL. WATFOR was also available, but none of my courses used it.

TECO scared me away; all I ever learned was how to load and run VTEDIT, if I were lucky enough to get one of the few VT100s. Unfortunately, most of the terminals were Visual 200s which frequently broke down and only emulated VT52.

Noting my interest, my dad went out and bought an Apple II+, and mostly by keying in program listings I taught myself 6502 assembler. A Heathkit H-11 (an LSI-11 system in kit form) would have really rocked, but I had a lot of fun with the Apple just the same.


It was the 70's. I was in a similar situation, I just guessed bad words as passwords until I got in with "login 1 hell". My first hack :)


One can write a string copy routine using two instructions, assuming that the source and destination are already in registers.

    loop:   MOVB (src)+, (dst)+     ; copy one byte, post-incrementing both pointers
            BNE  loop               ; condition codes reflect the byte just moved, so loop until the NUL is copied
Suddenly the C idiom

    while (*dst++ = *src++);
makes a lot more sense!
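
For reference, here is the same loop as a complete C function (a minimal sketch; the name is illustrative). Each iteration copies one byte and tests the byte just copied, exactly like the MOVB/BNE pair above:

    void copystr(char *dst, const char *src)
    {
        /* copy bytes, terminator included; stop after copying the NUL */
        while ((*dst++ = *src++) != '\0')
            ;
    }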


Though Ritchie has portrayed the similarities as coincidence,

http://csapp.cs.cmu.edu/3e/docs/chistory.html

The influence of the 11's indexing [Bell], as opposed to the 8's, on C and Unix abounds. Whatever your perspective on 'macro' move instructions, they were certainly the obvious follow-on rage; witness MOVC3/MOVC5 in the VAX.

I've always enjoyed the notes K&R made following a non-disclosure presentation of the VAX, the most prescient I recall being a comment on the 512-byte page size.

https://www.bell-labs.com/usr/dmr/www/vax1.html

I revered the 11. As a student I had Q-bus machines, assembled with superior third-party components in places, scattered through my apartment and basement, from the 11/23 through the J11 all the way up to an MV2.

Everything since has been anticlimactic :)


For sure. If you look at Old Testament K&R C and PDP-11 assembly language side by side, you can see the mapping very clearly. It's been a looooong time since I did PDP-11 at that level, but if memory serves, I'm pretty sure that a switch(e) {...} can be done as a single jump instruction with indexed-indirect addressing mode.
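
A minimal C sketch of the kind of dense switch that maps onto that pattern (the function and values are made up). A PDP-11 compiler can bounds-check the value, scale it to a word offset, and then dispatch with a single indexed-deferred jump (roughly JMP @TABLE(R0)) through a table of case-label addresses:

    int dispatch(int e)
    {
        switch (e) {            /* dense, small case values */
        case 0: return 10;
        case 1: return 20;
        case 2: return 30;
        case 3: return 40;
        default: return -1;     /* bounds check falls through to here */
        }
    }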


Adding fun with PDP-11 autodecrement:

   MOV -(PC),-(PC)
What does that do? It's a single-instruction program that replicates itself through memory: the autodecrements point PC back at the instruction, so each execution writes a fresh copy one word lower and then falls into it.


Suddenly Core War (https://en.wikipedia.org/wiki/Core_War) makes a lot more sense.


Teacher regularly called C a macroassembler


"C: all the power of assembly language with all the ease-of-use of assembly language"


Not quite, unfortunately. In assembly language, you don't have to fear that your compiler optimizes out clearing cryptographic keys out of memory at the end of a function.
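
A minimal C sketch of one common workaround (not a drop-in security fix): because the buffer is dead after the function returns, an optimizing compiler may delete a plain memset() of it, but it cannot delete a call made through a volatile function pointer:

    #include <string.h>

    /* The compiler must assume the pointer could point anywhere at run time,
       so the call is not subject to dead-store elimination. */
    static void *(*volatile secure_memset)(void *, int, size_t) = memset;

    void use_key(void)
    {
        unsigned char key[32];
        /* ... derive and use the key ... */
        secure_memset(key, 0, sizeof key);  /* survives optimization */
    }

Where available, explicit_bzero() or C11's optional memset_s() exist for the same purpose.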


In assembly, you don't have to worry about pesky optimizations speeding up your code at all. A similar effect can be achieved in C by not manually enabling optimizations.


"A similar effect can be achieved in C by not manually enabling optimizations."

Good luck with that.


Nor in K&R C. Or even early ANSI-C. This whole "we can use undefined behavior to "optimize" away half your program" idea is fairly new.


Yah, and guess what machine instructions bk and dmr were targeting with that compiler!


The Technion in Israel still teaches PDP-11 assembly language in its computer organization course -- using a simulator, of course.

It's a nice, simple, orthogonal instruction set. The textbooks are available. Why change? Here's the syllabus of the latest incarnation: https://webcourse.cs.technion.ac.il/234118/Spring2017/syllab...


His book "Computer Engineering" is an interesting read. I noticed someone posted a scan PDF : http://bitsavers.trailing-edge.com/pdf/dec/_Books/Bell-Compu...

I have a copy of the book. And a PDP-11. Well, most of one, in a closet.


I'm an early career software engineer and I hope that one day I can retire with one of those machines in my basement.


A big issue is the hard drives, which used materials that don't stand up long-term (urethane foam, for example).

You could use an FPGA to build an emulated RL-05, I suppose, but where's the fun in that?


The bus is sufficiently slow that you can emulate the hardware with an MCU.

I'm using a slightly different approach with my PDP-11s, which is to use a custom device driver that talks to DRV-11 parallel interfaces connected to a PC at the other end.


> 4096 word memory contained 16 million cores

No, no. 4096*16 = 65536 bits.

16 million cores would be 2 megabytes. An IBM mainframe of the early 1970s might have that much storage, and the cost would be about $1 million. IBM figured out how to weave core memory on a power loom, which gave them a big cost advantage for a while.


Yes - we bought 1.5 megabytes of core for our Burroughs 6700 in the late 70s. We paid NZ$1M for it (in those days ~US$1.25M, before the Muldoon NZ$ crash of the mid-80s).


It seems that all successful computer designs, large or small, have one thing in common: the presence of an expansion system that allows the computer to take on different tasks for different users.

UNIBUS, S100, ISA, AppleII expansion slots, etc etc etc.


The Apple II was particularly elegant, in that it allowed the driver code to be in a ROM on the I/O board. Plug-n-play???? pfffft, sorry MSFT. When you plugged in an Apple II peripheral card, the driver was installed. Full stop.

Since the very beginning, I've considered that omission to be USB's great failing. There is no reason, given the technology of the time, that a driver standard built around a platform-neutral byte code could not have been done. The byte code could have been transpiled to any arch/OS at driver init time, or on first USB device mount event, or some such. Glaringly missed opportunity, but nobody asked me.


Something kind of like that was done for x86 with NCR's SCSI host adapters. NCR provided a traditional driver for each supported OS (DOS, Windows, OS/2, Novell Netware, SCO Unix) but that driver didn't know anything about the actual SCSI host adapter hardware. I'll call this the "generic driver".

The host adapter ROM contained a driver for the specific hardware on that card, written in a way that did not make any assumptions about the operating system. (Actually, there were two drivers in the ROM. A 16-bit driver for DOS and Windows 3.x and a 32-bit driver for the others). I'll call the drivers from ROM the "hardware drivers".

The generic driver would find the ROM. The ROM had a header that contained information about the hardware driver, including pointers to various entry points in the hardware driver, one of which was an init routine. The generic driver would call the init routine, and one of the things it gave the init routine was a table of entry points in the generic driver that the hardware driver could use to allocate and free memory, register interrupt handlers, set up DMA operations, and so on.
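
Roughly the shape of that interface in C (a sketch only; every name and field here is invented for illustration, not the actual NCR layout):

    #include <stddef.h>

    /* Services the generic (OS-specific) driver hands to the ROM-resident
       hardware driver at init time. */
    struct os_services {
        void *(*alloc)(size_t n);
        void  (*free)(void *p);
        int   (*register_irq)(int irq, void (*handler)(void *), void *ctx);
        int   (*setup_dma)(void *buf, size_t len);
    };

    /* Header the generic driver locates in the option ROM, pointing at the
       hardware driver's entry points. */
    struct rom_header {
        unsigned int signature;
        int  (*init)(const struct os_services *os);
        int  (*start_io)(void *request);    /* CAM-style request block */
        void (*shutdown)(void);
    };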

My recollection is that the 16-bit hardware drivers were position-independent code that could be run out of the ROM directly, or copied to RAM where it might run faster. The 32-bit code was not position independent, so the generic driver had to copy it to memory and then fix it up, which it could do because the 32-bit driver in ROM was essentially the .o output of the C compiler used to build it, so it had everything needed to move it around.

Once the hardware driver was initialized and running, the interface between it and the generic driver for actually doing SCSI commands was based on a draft version of the SCSI CAM specification. We [1] were on the CAM committee, and proposed including our ROM-based hardware driver approach as part of the standard, but most other committee members didn't think being able to swap host adapters without having to change OS drivers was useful enough.

[1] "We" == the consulting company that designed and implemented the aforementioned stuff for NCR. I was the lead architect and lead programmer for the project.


OpenFirmware uses architecture independent FORTH byte code, so peripheral cards can include machine independent drivers and diagnostics!

https://en.wikipedia.org/wiki/Open_Firmware

Open Firmware Forth Code may be compiled into FCode, a bytecode which is independent of computer architecture details such as the instruction set and memory hierarchy. A PCI card may include a program, compiled to FCode, which runs on any Open Firmware system. In this way, it can provide platform-independent boot-time diagnostics, configuration code, and device drivers. FCode is also very compact, so that a disk driver may require only one or two kilobytes. Therefore, many of the same I/O cards can be used on Sun systems and Macintoshes that used Open Firmware. FCode implements ANS Forth and a subset of the Open Firmware library.


The Apple II was particularly elegant, in that it allowed the driver code to be in a ROM on the I/O board.

The PC had a similar feature: https://en.wikipedia.org/wiki/Option_ROM


Too easy to reverse engineer; I cannot see hardware vendors getting on board.


Would that not have worse security implications?


Apparently Steve Wozniak had to fight pretty hard for the expansion slots. From Wikipedia (https://en.wikipedia.org/wiki/Steve_Wozniak):

> During the design stage, Steve Jobs argued that the Apple II should have two expansion slots, while Wozniak wanted six. After a heated argument, during which Wozniak had threatened for Jobs to 'go get himself another computer', they decided to go with eight slots. The Apple II became one of the first highly successful mass-produced personal computers.


> During the design stage, Steve Jobs argued that the Apple II should have two expansion slots

"Sorry, can't use a modem right now, I have both a floppy drive and a printer plugged in."


It would have been worse than that. I had a II+, and my memory is that the ability to use 80-column text on screen required a video card that took one of the slots (the main difference between the II and the II+ was that this card was included in the package). If you only had two slots, you'd be done as soon as you plugged in the drive.


Maybe Jobs was hoping to sell us an Apple III with 3 expansion slots the following year.


https://en.wikipedia.org/wiki/Macintosh_128K#Expansion

Given that this was the result when Jobs didn't have Woz vetoing him, I doubt it.

The guy was obsessed with looks and "experience". To him, a computer was to be a magical black box that people powered on and powered off. To open up the case and poke around inside was "dirty".


I think they were more aiming at gradually removing expansion slots but selling serial-port-to-expansion-port dongles...


Yeah, I had my Disk ][ card in slot 6 or 7, 16K of RAM and the Applesoft card in slot 1. The Epson MX-80 plugged in somewhere there too. I had an 80-column card in one of my other slots.... Other than that, the Koala Pad plugged into the joystick/paddle DIP plug. Yeah, 2 wouldn't have done it for me.


And if that's not enough slots for you, get a Mountain Computer Expansion Box!

http://www.appleii-box.de/H054_1_MCEB01.htm


I had a PDP-11 for my personal use in the Finnish Army in 1978. It was quite useless and I learned nothing: https://www.flickr.com/photos/timonoko/27931368650/in/album-...


I don't see it.


It was just like the pictures, with pretty colors and buttons. My personal Nova was much uglier: https://www.flickr.com/photos/timonoko/102552851/in/album-72...


My dad started on a PDP-8 at my granddad's workplace (a newspaper).

He later wrote his PhD thesis on an Epson HX-20 and backed it up to PDP-11 magnetic tape.

When I was 16, we went to a computer museum to try to get his old backup off, but their PDP-11's Winchester hard drive was broken, so we couldn't boot it to load up something to read the tape.

Lesson to learn: copy your old backups forward when storage formats change.


Wow - one of my first jobs in computing was changing the backup disks (huge things - bigger than an LP record) on an old Vax PDP-8 and PDP-11 system at my boss's parents' business.

Mounting and dismounting those things was one of the factors that made me swear I would get more into the software side of these newfangled 'computer' thingies rather than hardware... :)


You can easily see the influence of the PDP-11 on the 68000 architecture and instruction set.


I programmed assembly for both the PDP-11 and the 68000, and a couple of others. The 68000 was by far my favorite.


Also MSP430.


Overall, a very good article, though I'd pick this nit: separate I/O instructions survive in 8086 and AMD64 as well, at the very least.

Never having done ARM assembly language programming, does ARM have I/O instructions, or is it strictly memory-mapped?


Those are descendants of the 8080 he mentions as one of the two exceptions.


The comment about I/O instructions really comes from the microprocessor wars of the 80's. Intel and Intel-influenced processors had I/O instructions; Motorola and Motorola-influenced processors did not. So it was a thing to argue about. The difference really amounted to nothing in the end.

RISC processors tend not to have dedicated instructions for things (it's in the name). They also tend to restrict access to regular memory to explicit loads and stores, let alone offer an entirely separate address space dedicated to I/O. So processors like the ARM can't practically have I/O instructions. If you belong to the faction that believes I/O instructions are the way to go, then you would consider that a weakness of RISC processors… :)
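
For concreteness, the two styles side by side in C (a sketch; x86 with GCC-style inline assembly, and the MMIO address would come from a device's documentation):

    #include <stdint.h>

    /* Port-mapped I/O: x86 keeps a separate I/O address space that is
       reachable only through IN/OUT instructions. */
    static inline void outb(uint16_t port, uint8_t val)
    {
        __asm__ volatile ("outb %b0, %w1" : : "a"(val), "Nd"(port));
    }

    /* Memory-mapped I/O: ARM and most RISC designs put device registers in
       the ordinary address space; a volatile store is all it takes. */
    static inline void mmio_write32(uintptr_t addr, uint32_t val)
    {
        *(volatile uint32_t *)addr = val;
    }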


ARM has no I/O instructions per se. It does have a whole range of instructions for dealing with “coprocessors”, which were mainly used for floating point (originally a separate chip, nowadays integrated into the processor but still using that part of the instruction set).


For those of you recalling fond memories of PDP days, just remember the poor wretches still using these devices. Thankfully, this is no longer me.


>through the lens of our own 20/20 hindsight

Here's another quote from the article that seems to be a constant feature over the decades:

>PDP is an acronym for “Programmed Data Processor”, as at the time, computers had a reputation of being large, complicated, and expensive machines, and DEC’s venture capitalists would not support them if they built a “computer”

Looks like some VC's have always been more impressed by the slide deck and presentation than the actual potential of the business concept or individuals developing the technology.

That's something worth learning as well.

Anyway, anybody want a used VAX 4000-200? Available for pick up in Houston this week.

If so, post PM info here along with what you would like to do with it.

I'll check this thread in a few days to see if there is any interest.

Also, an HP1000 in a full rolling rack the size of a refrigerator.


Blast from the past! Too much time spent on that 11/70's front panel switches.

One thing I believe Bell missed in "what we learned." Of course, maybe it's hindsight. The regular instruction set in the '11 and the VAX was fertile ground for all sorts of innovation in compilers and optimization technology. Without those innovations it would have been harder for the gnarly-instruction processors (386 line, I'm looking at you) to gain users.


Regarding the UNIBUS, this needs a picture of the wire-wrapped backplane of a PDP-8 (especially the denser later versions with the flip-chips.) The front looks neat and tidy, but opening the back is still a part of my nightmares.


I got a PDP-8 emulator with a really cool blinking-lights display here: http://obsolescence.wixsite.com/obsolescence/pidp-8 (my friends think I'm a geek).


I began programming as a career in 1974 after graduating from college. At the time, and well into the 90s, the PDP-11/VAX was my favorite minicomputer.



