Microsoft’s original source code

https://www.gatesnotes.com/home/home-page-topic/reader/microsoft-original-source-code

stkai
The source code is such a fun read (for the comments). I found some source code for GW-BASIC, and here are two of my favorites:

  ;WE COULD NOT FIT THE NUMBER INTO THE BUFFER DESPITE OUR VALIENT
  ;EFFORTS WE MUST POP ALL THE CHARACTERS BACK OFF THE STACK AND
  ;POP OFF THE BEGINNING BUFFER PRINT LOCATION AND INPUT A "%" SIGN THERE

  ;CONSTANTS FOR THE RANDOM NUMBER GENERATOR FOLLOW
  ;DO NOT CHANGE THESE WITHOUT CONSULTING KNUTH VOL 2
  ;CHAPTER 3 FIRST
Edit: GW-BASIC, not QBASIC (https://github.com/microsoft/GW-BASIC)
ndiddy
Fun fact: GW-BASIC was a descendant of the original Altair BASIC. The "Translation created 10-Feb-83" headers on each source file refer to tooling Microsoft had that automatically translated the 8080 assembly to 8086 (the date shouldn't be taken as a build date, since the files were manually modified after that point). Besides GW-BASIC, source code for the 6502 and 6809 rewrites of Microsoft BASIC was already available (see https://www.pagetable.com/?p=774 and https://github.com/davidlinsley/DragonBasic), but I believe this is the first public release of the original 8080 BASIC code.
deathtrader666
Shouldn't it be "valiant" ?
roryirvine
Sure, but in those days spellcheckers were separate apps - the most popular at the time being CorrectStar from MicroPro.

They weren't integrated into programming-oriented editors, and it would have been unusual to run them against code.

3836293648
I still haven't seen anyone using a spellchecker in code outside of IntelliJ
jimbob45
The best programmers I’ve known have all been deficient at spelling. I don’t know why it appears so uniformly among them.
themadturk
Humans in general, even writers, are deficient at spelling. This is the reason we need spellcheckers.
nilsbunger
Steve Jobs used to say the problem with Microsoft is they don’t have taste.

The font-shimmering effect on scroll immediately reminded me of that; it is really distracting. And you can’t use reader mode to disable it.

(FWIW, I’m a fan of Bill Gates and all he’s done for the world)

toddmorey
The design is fun and gave me a lot of nostalgia, but I admit they overdid it. They could have made that piece feel the same without so much distraction. And please people, support reader mode. It's not hard and it shouldn't be optional.

EDIT: Good god they animated EVERYTHING. It's not even readable... also... not one inline code sample? This is the designer trying to get an awwwards site of the day without any interest in the actual content. It's like a guitar player that solos over everyone else's solos.

nerevarthelame
On top of the poor readability, my 2-year-old laptop can't even navigate through the page without CPU and GPU going insane, and my fans blasting at max speed. It's the poorest, choppiest web performance I can recall, all for what should be a simple blog post.
SpaceNoodled
That's the fault of modern websites being massive JavaScript ad-playing behemoths instead of sub-1kB served HTML as god intended.
hbn
Funny cause just today this made it to the front page of HN

https://animejs.com/

It has way fancier animations and scrolls like butter

bostik
Tim Berners-Lee has been elevated to many things, but an ascension to deity must be a new reach.
kevincox
FWIW the spinning scrolling effects of Apple release announcements are nearly as bad.
graton
Personally I like it :) Tastes differ.
fsckboy
get your hands on DONKEY.BAS - you will love it!
dr_dshiv
This is very LSD. I didn’t realize Bill was into it, until recently

https://www.youtube.com/watch?v=uHY5i9-0tJM

zabzonk
I've written an Intel 8080 emulator that was portable between Dec10/VAX/IBM VM CMS. That was easy - the 8080 can be done quite simply with a 256 value switch - I did mine in FORTRAN77.

Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen and other collaborators' BASIC was pretty damned good.
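
For anyone curious what that "256 value switch" amounts to, here is a minimal sketch in Python rather than FORTRAN77; the register set is trimmed, flags are ignored, and only a handful of opcodes are shown, so it is the shape of the dispatch rather than a usable emulator.

  # Toy 8080-style fetch/decode loop: one branch per opcode byte.
  class I8080:
      def __init__(self, memory: bytearray):
          self.mem = memory          # 64 KiB address space
          self.pc = 0                # program counter
          self.a = 0                 # accumulator
          self.b = 0                 # B register
          self.halted = False

      def fetch(self) -> int:
          op = self.mem[self.pc]
          self.pc = (self.pc + 1) & 0xFFFF
          return op

      def step(self):
          op = self.fetch()
          if op == 0x00:                     # NOP
              pass
          elif op == 0x3E:                   # MVI A, d8
              self.a = self.fetch()
          elif op == 0x06:                   # MVI B, d8
              self.b = self.fetch()
          elif op == 0x80:                   # ADD B (flags not modelled)
              self.a = (self.a + self.b) & 0xFF
          elif op == 0x76:                   # HLT
              self.halted = True
          else:
              raise NotImplementedError(f"opcode {op:#04x}")

      def run(self):
          while not self.halted:
              self.step()

  mem = bytearray(0x10000)
  mem[0:6] = bytes([0x3E, 0x02, 0x06, 0x02, 0x80, 0x76])   # A=2; B=2; A+=B; HLT
  cpu = I8080(mem)
  cpu.run()
  print(cpu.a)   # 4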

teleforce
Fun fact: according to Jobs, for some unknown reason Wozniak refused to add floating-point support to Apple BASIC, so they had to license a BASIC with floating-point numbers from Microsoft [1].

[1] Bill & Steve (Jobs!) reminisce about floating point BASIC:

https://devblogs.microsoft.com/vbteam/bill-steve-jobs-remini...

WalterBright
Writing a floating point emulator (I've done it) is not too hard. First, write it in a high level language, and debug the algorithm. Then hand-assembling it is not hard.

What is hard is skipping the high level language step, and trying to do it in assembler in one step.
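
To make the "high level language first" step concrete, here is a toy sketch in Python of the core of a software floating-point add: align exponents, add mantissas, renormalize. The representation (value = m * 2**e, with the integer mantissa m normalized into [2**23, 2**24)) is made up for the example and is not the format any shipped BASIC used; sign, rounding and zero handling are omitted.

  # Toy software floating-point addition: the algorithm you would debug in a
  # high-level language before hand-assembling it.
  MANT_BITS = 24
  MANT_LO = 1 << (MANT_BITS - 1)      # smallest normalized mantissa
  MANT_HI = 1 << MANT_BITS            # one past the largest

  def fp_add(e1, m1, e2, m2):
      if e1 < e2:                            # keep the larger exponent in operand 1
          e1, m1, e2, m2 = e2, m2, e1, m1
      m2 >>= min(e1 - e2, MANT_BITS)         # align the smaller operand
      e, m = e1, m1 + m2
      if m >= MANT_HI:                       # renormalize after mantissa overflow
          m >>= 1
          e += 1
      return e, m

  def from_float(x):                         # test scaffolding only
      e, m = 0, x
      while m < MANT_LO:
          m, e = m * 2, e - 1
      while m >= MANT_HI:
          m, e = m / 2, e + 1
      return e, int(m)

  def to_float(e, m):
      return m * 2.0 ** e

  a, b = from_float(3.25), from_float(0.75)
  print(to_float(*fp_add(*a, *b)))           # 4.0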

kragen
Also, though, how big was Apple Integer BASIC? As I understand it, you had an entire PDP-10 at your disposal when you wrote the Fortran version of Empire.
zabzonk
I've never understood floating point :-)
zozbot234
Floating point math was a key feature on these early machines, since it opened up the "glorified desk calculator" use case. This was one use for them (along with gaming and use as a remote terminal) that did not require convenient data storage, which would've been a real challenge before disk drives became a standard. And the float implementation included in BASIC was the most common back in the day. (There are even some subtle differences between it and the modern IEEE variety that we'd be familiar with today.)
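One of those subtle differences, for the curious: the 32-bit Microsoft Binary Format used by later Microsoft BASICs keeps the exponent in its own byte with a bias of 128 and treats the mantissa as a fraction 0.1mmm..., where IEEE 754 single precision uses a bias of 127 and 1.mmm..., so the usual conversion just subtracts 2 from the exponent byte. A rough Python sketch, assuming the GW-BASIC-era byte layout and skipping the underflow edge cases:

  import struct

  def mbf4_to_ieee(b: bytes) -> float:
      # Assumed MBF single layout: b[0..2] mantissa (low byte first, sign in
      # bit 7 of b[2]), b[3] exponent biased by 128; exponent 0 means zero.
      # MBF has no infinities, NaNs or denormals.
      if b[3] == 0:
          return 0.0
      if b[3] <= 2:
          return 0.0   # would underflow IEEE normals; a real converter would denormalize
      sign = (b[2] & 0x80) << 24
      exponent = b[3] - 2              # rebias: 0.1m * 2**(e-128) == 1.m * 2**((e-2)-127)
      mantissa = ((b[2] & 0x7F) << 16) | (b[1] << 8) | b[0]
      bits = sign | (exponent << 23) | mantissa
      return struct.unpack("<f", struct.pack("<I", bits))[0]

  # 3.0 = binary 0.11 * 2**2: exponent byte 0x82, stored mantissa 0x40 00 00
  print(mbf4_to_ieee(bytes([0x00, 0x00, 0x40, 0x82])))   # 3.0
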
musicale
I agree - it's a useful BASIC that can do math and fits in 4 or 8 kilobytes of memory.

And Bill Gates complaining about people pirating $150 Altair BASIC inspired the creation of Tiny BASIC, as well as the coining of "copyleft".

phkahler
I still have a cassette tape with Microsoft Basic for the Interact computer. It's got an 8080.
thijson
I remember my old Tandy Color Computer booting up and referencing Microsoft BASIC:

https://tinyurl.com/2jttvjzk

The computer came with some pretty good books with example BASIC programs to type in.

vile_wretch
I have a MS Extended Basic cassette for the Sol-20, also 8080 based.
thesuitonym
You should upload the audio to the Internet Archive!
TMWNN
> Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen and other collaborators' BASIC was pretty damned good.

The floating point routines are Monte Davidoff's work. But yes, Gates and Allen writing Altair BASIC on the Harvard PDP-10 without ever actually seeing a real Altair, then having it work on the first try after laboriously entering it with toggle switches at MITS in Albuquerque, was a remarkable achievement.

WalterBright
What Allen did was write an 8080 emulator that ran on the -10. The 8080 is a simple CPU, so writing an emulator for it isn't hard.

https://pastraiser.com/cpu/i8080/i8080_opcodes.html

Then, their BASIC was debugged by running it on the emulator.

The genius was not in the difficulty of doing that; it wasn't hard. The genius was the idea of writing an 8080 emulator. Wozniak, in comparison, wrote Apple code all by hand in assembler and then hand-assembled it to binary, a very tedious and error-prone method.

In the same time period, I worked at Aph, and we were developing code that ran on the 6800 and other microprocessors. We used full-fledged macro assemblers running on the PDP-11 to assemble the code into binary, and then downloaded the binary into an EPROM, which was then inserted into the computer and run. Having a professional macro assembler and text editors on the -11 was an enormous productivity boost, with far fewer errors. (Dan O'Dowd wrote those assemblers.)

(I'm doing something similar with my efforts to write an AArch64 code generator. First I wrote a disassembler for it, testing it by generating AArch64 code via gcc, disassembling that with objdump and then comparing the results with my disassembler. This helps enormously in verifying that the correct binary is being generated. Since there are thousands of instructions in the AArch64, this is a much scaled-up version of the 8080 exercise.)
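
A sketch of what that objdump cross-check can look like, in Python for brevity: my_disassemble below is a hypothetical stand-in for the decoder under test, and the parsing assumes objdump's usual "address: encoding mnemonic" layout, so it may need adjusting for a particular binutils build or cross-toolchain prefix.

  import re
  import subprocess

  def objdump_words(obj_path):
      # Yields (instruction word, normalized objdump text) pairs from
      # lines like "   4:  91000421    add x1, x1, #0x1".
      out = subprocess.run(["objdump", "-d", obj_path],
                           capture_output=True, text=True, check=True).stdout
      pattern = re.compile(r"^\s*[0-9a-f]+:\s+([0-9a-f]{8})\s+(.+)$")
      for line in out.splitlines():
          m = pattern.match(line)
          if m:
              yield int(m.group(1), 16), " ".join(m.group(2).split())

  def my_disassemble(word: int) -> str:
      raise NotImplementedError    # hypothetical: the disassembler being tested

  def check(obj_path):
      mismatches = 0
      for word, expected in objdump_words(obj_path):
          got = my_disassemble(word)
          if got != expected:
              mismatches += 1
              print(f"{word:08x}: mine={got!r} objdump={expected!r}")
      return mismatches

  if __name__ == "__main__":
      # On a non-AArch64 host these would be aarch64-linux-gnu-gcc / -objdump.
      subprocess.run(["gcc", "-O2", "-c", "test.c", "-o", "test.o"], check=True)
      print(check("test.o"), "mismatches")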

dhosek
The Wozniak method was how I used to write 6502 assembler programs in high school since I didn’t have the money to buy a proper assembler. I wrote everything out longhand on graph paper in three columns. Addresses on the left, a space for the code in the middle and the assembler opcodes on the right, then I’d go through and fill in all the hex codes for what I’d written. When you work like that, it really focuses the mind because there’s not much margin for error and making a big change in logic requires a lot of manual effort.
zabzonk
Allen had to write the loader in machine code, which was toggled in on the Altair console. The BASIC interpreter itself was loaded from paper tape via the loader and a tape reader. The first BASIC program Allen ran on the Altair was apparently "2 + 2", which worked - i.e. it printed "4". I'd like to have such confidence in my own code, particularly the I/O, which must have been tricky to emulate on the Dec10.
WalterBright
> which must have been tricky to emulate on the Dec10

I don't see why it would be tricky. I don't know how Allen's 8080 emulator on the PDP-10 worked, but it seems straightforward to emulate 8080 I/O.
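
On the instruction side, 8080 port I/O is just two opcodes, IN (0xDB) and OUT (0xD3), each followed by a one-byte port number, so an emulator only has to route those two cases somewhere sensible on the host. A rough sketch in the same switch-per-opcode style; the port numbers and status convention here are made up, not the Altair's actual serial-board assignments:

  import sys

  CONSOLE_STATUS = 0x00   # illustrative port: bit 0 set when input is waiting
  CONSOLE_DATA   = 0x01   # illustrative port: read or write one character

  def io_in(port: int, pending: list) -> int:
      if port == CONSOLE_STATUS:
          return 0x01 if pending else 0x00
      if port == CONSOLE_DATA:
          return pending.pop(0) if pending else 0x00
      return 0xFF                      # unmapped ports read as all ones

  def io_out(port: int, value: int) -> None:
      if port == CONSOLE_DATA:
          sys.stdout.write(chr(value & 0x7F))
          sys.stdout.flush()

  # Inside the opcode dispatch these become two more branches:
  #   elif op == 0xDB: self.a = io_in(self.fetch(), self.pending_input)   # IN  port
  #   elif op == 0xD3: io_out(self.fetch(), self.a)                       # OUT port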

n0rdy
Flipping through the source code is like a time machine tour of tech's evolution over the past 50 years. It made me wonder: will our 2025 code look as ancient by 2075?

And, btw, great infographics within the post.

freedomben
That's interesting to consider. Some of the GNU code is getting quite old and looking through it is a blast from the past. I'm frankly amazed that it continues to work so well. I suspect there is a relatively small group of GNU hackers out there rocking gray and white beards that are silently powering the foundations of our modern world, and I worry what's going to happen when they start retiring. Maybe we'll get Rust rewrites of everything and a new generation will take over, but frankly I'm pretty worried about it.
Towaway69
Has there ever been a moment in human history where we’ve (as a society, not as individuals) looked back and been envious?

So my money says the code I wrote today will be the joke of tomorrow - for all involved.

Also, I for one don’t want to go back to punch cards ;)

bojan
> Has there ever been a moment in human history where we've (as a society, not as individuals) looked back and been envious?

I am guessing the generation that transitioned from the Pax Romana to the early Middle Ages in Europe.

Towaway69
I doubt that, since knowledge and education weren’t widespread - beyond the cloisters, people didn’t generally know how well the Romans had it.

Remember, it took until the Renaissance for ancient texts (Greek and Roman) to be “rediscovered” by European scholars.

deanCommie
I think that to most (90+%?) software developers out there in the world, Assembler might as well be hieroglyphics. They/we can guess at the concepts involved, of course, but actually reading the code end to end and holding a mental model of what is happening is not really going to happen. Not without some sort of Rosetta Stone. (Comments :) )

I think 2075 developers will feel the same way about modern Java, C#, TypeScript, etc.

They will think of themselves as software developers but they won't be writing code the same way, they'll be giving guided instructions to much higher level tools (perhaps AIs that themselves have a provenance back to modern LLMs)

Just as today, there will still be those who need to write low-level critical code. There are still lots of people today who have to write Assembler, though they end up expressing it via C or Rust. And there will be people still working on AI technology. But even those will be built off other AIs.

-__---____-ZXyw
Tried to open this page and the music I was streaming started to stutter so hard I just exed out. Is this a preposterously heavy page, or just very heavy?
jwnin
Some luck and a willingness to take risks paid off in ways that could never have been anticipated. Not sure I'll see something like the PC era in my lifetime. Perhaps mobile phones, or the Internet.
vessenes
Having lived through PCs, the internet, mobile, social, crypto and AI, I’d say mobile or social has been the biggest so far, and AI is likely to have a vastly larger impact. Of course they build on each other. But the global impact of mobile and social vastly exceeds that of the PC era.
LeFantome
The Internet?
wrobelda
I mean… The AI?
thesuitonym
Consider that nobody ever sat in countless meetings asking "How can we use the PC?" They either saw the vision and went for it, or eventually ran up against the limitations of working without a PC and bought in.
hnuser123456
Well, apparently, the guys at Xerox did sit in meetings not knowing what to do, until Steve Jobs visited PARC and saw what was possible.
kragen
Actually, there was about a 15-year period where many people didn't think PCs were good for anything, because they had access to much better (shared) computers. That's the context where http://catb.org/jargon/html/B/bitty-box.html comes from. See also http://canonical.org/~kragen/tao-of-programming.html#book8. Throughout the 01980s PC Magazine worked hard to convince business decisionmakers that IBM PCs weren't merely game machines; if you look at old issues you'll see that computer games were completely missing from the abundant advertisements in the magazine, presumably due to an explicit policy decision.

I personally encountered people arguing that using PCs (as opposed to VAXen or mainframes) was a waste of time as late as 01992. And I actually even sort of joined them; although I'd been using PCs since before the IBM PC, once I got access to the internet in 01992, I pretty much stopped using PCs as anything but a terminal or a game machine for years, spending virtually 100% of my computer time on VMS or Ultrix. When I was using PCs again, it was because I could run BSD/386 and Linux on them, in 01994.

(Maybe you'd assume from my own story of enthusiastic adoption that "nobody ever sat in countless meetings asking[,] "How can we use the internet?"', but if so, you'd be extremely wrong. In 01992 and even in 01994 there were lots of people who thought the internet was useless or a fad. Bill Gates's The Road Ahead, published November 01995, barely mentioned the internet, instead treating it as a sort of failed experiment that would be supplanted by the Information Superhighway. Metcalfe predicted in 01996 that it would collapse. David Isenberg was still arguing against "Bellheads" and their "Advanced Intelligent Network" in 01997: https://isen.com/stupid.html)

It can be easy looking back in retrospect to oversimplify events like these with the benefit of hindsight, imagining that the things that seem obvious now were obvious then. But not only weren't they obvious—in many cases, they could have turned out differently. I think it was Alan Kay that argued that, without the invention of the sort of graphical user interface used by most non-cellphone personal computers today, the personal computer as we know it never would have become a mass-market phenomenon (though video game consoles were) and therefore Moore's Law would have stalled out decades ago. I'm not sure he was right, but it seems like a plausible alternate history to me.

Of course, there were "killer apps" as early as VisiCalc for the Apple ][. Accountants and corporate executives were willing to read through the manual and take the time to learn how to use it, because it was such a powerful tool for what they were doing. But it was designed for specialists; it's not a UI that rewards casual use the way Excel or MacPaint or NCSA Mosaic is. Without the GUI, or if the GUI had come much later, plausibly personal computers would have remained a niche hobbyist thing for much longer, while somebody like Nintendo would have locked down the unwashed-masses platform—as we now see happening with Android. And (maybe this is obvious) that would have made it immensely less useful.

Izikiel43
That came out of millions of dollars and man-hours of investment by Google and OpenAI.

VS

Some college students selling software they didn't have and getting it ready from 0 to sellable in 2 months, which led to a behemoth that still innovates to this day.

jonas21
It doesn't sound that different from Alex Krizhevsky training AlexNet on a pair of gaming GPUs in his bedroom, winning ImageNet, and launching the current wave of deep learning / AI.
PythonicIT
I'm not as smart as you guys, but I figured I'd try to transcribe every single thing, line for line, onto GitHub (unless someone has done it already) so that we could try to compile and build this thing directly on our own computers.
jlmcgraw
I wonder who the handwritten notes on page 98 are by?

Starts with "confirm plane reservation on Tue. Sept 2 or Wed. Sept 3" which is correct for 1975

jer0me
The source code is linked at the end (warning: it's a 100 MB PDF).

https://images.gatesnotes.com/12514eb8-7b51-008e-41a9-512542...

pdw
The printout is dated 10-SEP-75 and is labeled "VERSION 3.0 -- MORE FEATURES TO GO".

Curiously this isn't the oldest extant version of the source code. The Harvard archives have a copy of version 1.1, printed on 30 April 75. http://altairbasic.org/other%20versions/ian.htm

Aardwolf
The printout also contains dates of 6-SEP-64 below it - any idea what those are?
seabass-labrax
Thank you for the warning. I once used up my Internet package's entire monthly quota by following a similar link on Hacker News.
mysterydip
Ironic for something designed to take up only 4KB on its target machine :)
paulddraper
(It's a high-res image of the printed code.)
masfuerte
Nice one. Has anyone OCRed this back into text?
pronoiac
I attempted OCR with OCRmyPDF / Tesseract. It's not great, but it's under 1% of the size, at least. https://github.com/pronoiac/altair-basic-source-code
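
If anyone wants another pass at it, OCRmyPDF also has a Python entry point (same Tesseract engine underneath, so results may not improve much). A minimal sketch; the file names are placeholders and the options are just guesses at what might help with a scanned teletype listing:

  import ocrmypdf

  ocrmypdf.ocr(
      "altair-basic-scan.pdf",      # placeholder name for the 100 MB scan
      "altair-basic-ocr.pdf",       # searchable output PDF
      language="eng",
      deskew=True,                  # straighten tilted pages before OCR
      optimize=1,                   # recompress images to shrink the output
  )
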
noname120
Maybe you should try something like EasyOCR instead: https://github.com/JaidedAI/EasyOCR