Jeffrey Snover: From Windows NT to Nano Server

Microsoft Technical Fellow Jeffrey Snover traces the roots of Windows Server and their persistence in the company's cloud

It’s been 20 years since the Windows NT 4.0 server operating system was released to manufacturing. No one could be sure at the time, but this breakout version turned out to be the one that would finally establish Microsoft in the datacenter.

I recently visited Microsoft’s Redmond headquarters to talk to Jeffrey Snover, Technical Fellow and creator of PowerShell -- and currently the architect of Microsoft's server operating systems. It was time to reminisce about Windows NT’s past as well as muse about its present and its future.

“Engineers overestimate what can be done in two or three years, and underestimate what can be done in 10,” Snover observes. “With 20 years we’ve evolved incredibly. What started off running on a 44MHz 486 now runs our cloud.” That ascendance has closely followed the evolution of the server in the enterprise, from departmental file server to client-server, to n-tier, to the web ... and now to the cloud, both on premises and in massive datacenters.

An interesting career thread ties Windows Server's early architects together: they all held senior engineering roles at DEC. As Snover explains it:

You could think about the evolution of Windows Server as the journey of three Digital consulting engineers. Dave Cutler was the first; he came in and gave us the great kernel that led us through the server for the masses era. Then Bill Laing took over as chief architect. He was a big enterprise guy, and he really took that enterprise approach to the server. I took over as chief architect and focused on the management aspects of it and the cloud aspects of it. Digital really was a fantastic engineering environment, but they didn’t have that connection between how to take technology and really turn it into mass appeal and mass monetization that Microsoft has.

That was the key to Microsoft’s server business: making it the server for the masses.

The heart of it all, says Snover, was Cutler's “great kernel.” Snover calls him “just one of the great minds of the industry, and he produced a way of bringing the systems together with the object-based kernel.” This extends from the foundations of Windows Server to, today, the entire Windows family of operating systems:

That's been the enduring thing: the heart and soul of Windows. So the thing that was so successful 20 years ago, that made Windows Server so successful, wasn't Dave's kernel -- after all, Dave had done a variation of that kernel at Digital. The thing that made Windows so successful was matching that kernel with a great desktop experience and then running it on PC-class hardware. That combo meant that servers -- which used to be run by the high priests and princes of the industry -- were now something anybody could buy, deploy, and run. That really was the magic.

It certainly was. My own career in the industry followed that trend: using Windows NT first to run an office, then to link to minicomputers, before building large-scale web servers and services on that same OS.

From boxes to the cloud

That approach continues today. Snover sums up his philosophy of Windows Server simply: “Architecture is the art of deciding when one thing should be two and when two should be one.” In the early days of Windows NT, it was essential to have one thing: a combined kernel and desktop OS.

But as time progressed, permutations evolved. Snover likes to think in terms of four eras of servers: “the server for the masses, the enterprise era, the datacenter era, and now the cloud era.” That has required some changes. The server for the masses is now at heart the familiar Windows desktop client, with server features added -- what Snover calls “fidelity to the client.” But in the datacenter there’s no need to go from server to desktop, hence a focus on the UI-less Server Core:

Now we're confident that Server Core has everything people need so they can be successful. Our focus on Nano Server has driven the clean-up of the long tail of manageability, and that means if you can't do it remotely you can't do it at all. It’s a little bit like Cortez burning his ships.
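
To make that remote-only discipline concrete, here is a minimal sketch -- my illustration, not Microsoft's sample code -- of managing a headless server over WinRM from Python with the third-party pywinrm library; the host name and credentials are placeholders.

```python
# A minimal sketch of remote-only administration: a headless Server Core or
# Nano Server box has no local console, so everything goes over the wire.
# The host name and credentials below are placeholders.
import winrm

# Open a WinRM session to the remote, UI-less server.
session = winrm.Session("servercore01.example.com", auth=("admin", "s3cret"))

# Run a PowerShell command remotely and inspect the result.
result = session.run_ps(
    "Get-Service | Where-Object Status -eq 'Running' | Select-Object -First 5 Name"
)

if result.status_code == 0:
    print(result.std_out.decode())
else:
    print("Remote command failed:", result.std_err.decode())
```

If a task can't be driven this way -- remotely, over a protocol -- it effectively can't be done on a headless server, which is exactly the clean-up Snover describes.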

Not every business is ready for that. Snover points out that his four eras persist today, with different companies living in different eras. That’s going to affect what you do with Windows Server and how you do it. As Snover notes, “Each of the eras has its own set of tools, its own set of methodologies, and its own ecosystem. We've seen partners, tools, and so on that are great in one era and then you never hear of them again.” Snover’s advice is simple:

You have to decide where it is you want to go and then make sure you have the right people, tools, and partners that want to go there. One sure way you can fail is wanting to go somewhere and coupling that with tools, people, and partners that don’t want to go there as well.

Working with the cloud has meant thinking about servers in a different way. “One way to get that it's not just 'somebody else's server' is serverless computing. Of course there's a server there,” he says, “but you give us your code and we’ll run your code: You don’t have to worry about the server or setting it up. When your code runs, we fire up a server, put your code on it, and it runs -- and when your code's done, we throw that server away.” It’s one of the reasons Microsoft has developed Nano Server as part of Windows. “In that environment having a very small, very lightweight, very fast server is very important.”
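
The mechanics are easy to sketch. The toy Python below is an illustration of the idea rather than any particular cloud's API -- the my_code, EphemeralServer, and invoke names are invented -- but it mimics the lifecycle Snover describes: the developer writes only the code, and the platform fires up an environment, runs the code, and throws the environment away.

```python
# A toy model of the serverless lifecycle described above (illustrative only;
# the names here are invented, not a real cloud API).

def my_code(event: dict) -> dict:
    """The only thing the developer writes: business logic, no server setup."""
    return {"greeting": f"Hello, {event.get('name', 'world')}"}

class EphemeralServer:
    """Stand-in for the short-lived environment the platform creates per invocation."""
    def __init__(self) -> None:
        print("platform: firing up a server")

    def run(self, handler, event):
        print("platform: putting your code on the server and running it")
        return handler(event)

    def dispose(self) -> None:
        print("platform: code finished, throwing the server away")

def invoke(handler, event):
    """What the platform does each time your code is triggered."""
    server = EphemeralServer()             # fire up a server
    try:
        return server.run(handler, event)  # run the developer's code
    finally:
        server.dispose()                   # throw the server away

print(invoke(my_code, {"name": "reader"}))
```

The smaller and faster that throwaway environment is, the better the model works -- which is the role Nano Server is designed to play.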

There’s a parallel with Windows NT 4.0, Snover suggests, noting the release of the Windows NT 4.0 Option Pack, which added new features, including Microsoft Transaction Server. “Before transaction monitors you had to write all this horrible code, and you needed big systems. Transaction monitors came in, and you just write this little code and don’t need to worry about all the rest of it.”

The heart of Microsoft

This is what Snover calls “the heart of Microsoft.” How it works is easy to understand, he says. The company takes what was once available only to the elite few and makes it available to everyone by simplifying it and making it affordable. In a modern cloud context, this has also led to Windows Server’s support for containers and its adoption of new hardware approaches.

“Another big change and huge benefit is the increased networking bandwidth, speed, and lower latency,” says Snover. “It means now I can do with protocols things I could only ever do in the past with DLL calls, and because of that I can now separate things into their own environment where they have their own versioning and their own lifecycle. And that's the big thing of this era.”
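
A small, self-contained Python sketch -- my example, built only on the standard library, with a hypothetical price_quote capability and an arbitrary port -- shows the contrast: the same component consumed first as an in-process call, then over a protocol, where it can live in its own environment with its own versioning and lifecycle.

```python
# Illustration of the shift from in-process calls to protocol calls.
# price_quote is a made-up capability; the port number is arbitrary.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

def price_quote(sku: str) -> dict:
    """The shared capability -- historically linked in-process, 'DLL call' style."""
    return {"sku": sku, "price": 42.0}

# In-process: caller and component share one binary, one version, one lifecycle.
print("in-process:", price_quote("ABC-123"))

# Over a protocol: the same capability sits behind HTTP and can be versioned
# and deployed independently of its callers.
class QuoteHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(price_quote(self.path.strip("/"))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 8321), QuoteHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

print("over HTTP:", urlopen("http://127.0.0.1:8321/ABC-123").read().decode())
server.shutdown()
```

The protocol call costs more than a DLL call, but -- as Snover argues -- modern bandwidth and latency make that trade acceptable, and the decoupling is what it buys.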

Architects like Snover need to think about software as a science, and about the history of how we build code. That’s where he goes back to big-picture concepts like software factories: reusable components with well-defined interfaces. These approaches are key to delivering decoupled environments.

Rebooting the software factory

The software factory idea is one that’s coming back with containers, services, and serverless computing. “You get these decoupled systems that have their own lifecycle management of their environment and their own versioning, and use protocols as the interface. The shift from DLLs to protocols allows the idea of software factories to work.”

Snover thinks that, ultimately, we can shift from writing code to solving business problems. “Using this interface, if I use it a lot, I don’t really care -- it is responsible for scaling itself up and down. I just use it, and I'm freed up from that hard problem to go focus on my own business’ hard problem: How do I get you to give me money?”

This follows the structure of scientific revolutions: “As we go from one model to another, there's a period of chaos and confusion, while people try to adhere to the old model, even when the old model ceases to solve the problems people want to solve.”

We’re somewhere in the middle of one of those transitions between eras, Snover suggests. “Before the new model arrives is an era of great churn and creativity, and I think that’s what we're seeing now.”

What is that new model? “At the heart of it will be software factories with loosely coupled interfaces exposed as microservices. So much of what we’ll do will be stitching together components that software development becomes software integration.”

Copyright © 2016 IDG Communications, Inc.