;WE COULD NOT FIT THE NUMBER INTO THE BUFFER DESPITE OUR VALIENT
;EFFORTS WE MUST POP ALL THE CHARACTERS BACK OFF THE STACK AND
;POP OFF THE BEGINNING BUFFER PRINT LOCATION AND INPUT A "%" SIGN THERE
;CONSTANTS FOR THE RANDOM NUMBER GENERATOR FOLLOW
;DO NOT CHANGE THESE WITHOUT CONSULTING KNUTH VOL 2
;CHAPTER 3 FIRST
Edit: GW-BASIC, not QBASIC (https://github.com/microsoft/GW-BASIC)

They weren't integrated into programming-oriented editors, and it would have been unusual to run them against code.
https://www.teepublic.com/t-shirt/637761-i-write-code-progra...
The font-shimmering effect on scroll immediately reminded me of that; it is really distracting. And you can’t use reader mode to disable it.
(FWIW, I’m a fan of Bill Gates and all he’s done for the world)
EDIT: Good god they animated EVERYTHING. It's not even readable... also... not one inline code sample? This is the designer trying to get an awwwards site of the day without any interest in the actual content. It's like a guitar player that solos over everyone else's solos.
It has way fancier animations and scrolls like butter
Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen, and their collaborators' BASIC was pretty damned good.
[1] Bill & Steve (Jobs!) reminisce about floating point BASIC:
https://devblogs.microsoft.com/vbteam/bill-steve-jobs-remini...
What is hard is skipping the high level language step, and trying to do it in assembler in one step.
And Bill Gates's complaining about people pirating the $150 Altair BASIC inspired the creation of Tiny BASIC, as well as the coining of "copyleft".
The computer came with some pretty good books with example BASIC programs to type in.
The floating point routines are Monte Davidoff's work. But yes, Gates and Allen writing Altair BASIC on the Harvard PDP-10 without ever actually seeing a real Altair, then having it work on the first try after laboriously entering it with toggle switches at MITS in Albuquerque, was a remarkable achievement.
https://pastraiser.com/cpu/i8080/i8080_opcodes.html
Then, their BASIC was debugged by running it on the emulator.
The genius was not the difficulty of doing that; it wasn't hard. The genius was the idea of writing an 8080 emulator. Wozniak, in comparison, wrote Apple code all by hand in assembler and then hand-assembled it to binary, a very tedious and error-prone method.
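For anyone who hasn't written one: the core of such an emulator is just a fetch/decode/execute loop over a 64 KB byte array. A toy sketch in Python follows - a few representative opcodes only, purely illustrative, and of course nothing to do with how Allen's actual PDP-10 implementation was structured:

    # Toy 8080-style fetch/decode/execute loop (illustrative sketch only).
    class I8080:
        def __init__(self, memory: bytearray):
            self.mem = memory          # 64 KB address space
            self.pc = 0                # program counter
            self.a = 0                 # accumulator
            self.halted = False

        def step(self):
            op = self.mem[self.pc]
            self.pc = (self.pc + 1) & 0xFFFF
            if op == 0x00:             # NOP
                pass
            elif op == 0x3E:           # MVI A, d8 - load immediate into A
                self.a = self.mem[self.pc]
                self.pc = (self.pc + 1) & 0xFFFF
            elif op == 0xC3:           # JMP a16 - absolute jump, little-endian operand
                lo, hi = self.mem[self.pc], self.mem[self.pc + 1]
                self.pc = (hi << 8) | lo
            elif op == 0x76:           # HLT
                self.halted = True
            else:
                raise NotImplementedError(f"opcode {op:#04x}")

        def run(self):
            while not self.halted:
                self.step()

    mem = bytearray(0x10000)
    mem[0:5] = bytes([0x3E, 0x2A, 0xC3, 0x05, 0x00])   # MVI A,42 ; JMP 0x0005
    mem[5] = 0x76                                       # HLT
    cpu = I8080(mem)
    cpu.run()
    print(cpu.a)                                        # 42

Fill in the remaining opcodes plus the flags register and you can run and debug real 8080 code on the host machine - which is the leverage the thread is describing: debugging BASIC without ever touching an Altair.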
In the same time period, I worked at Aph, and we were developing code that ran on the 6800 and other microprocessors. We used full-fledged macro assemblers running on the PDP-11 to assemble the code into binary, then downloaded the binary into an EPROM, which was inserted into the computer and run. Having a professional macro assembler and text editors on the -11 was an enormous productivity boost, with far fewer errors. (Dan O'Dowd wrote those assemblers.)
(I'm doing something similar with my efforts to write an AArch64 code generator. First I wrote a disassembler for it, testing it by generating AArch64 code via gcc, disassembling that with objdump, and then comparing the results with my disassembler. This helps enormously in verifying that the correct binary is being generated. Since there are thousands of instructions in the AArch64 ISA, this is a much scaled-up version of the 8080.)
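The comparison step is easy to script. A rough sketch (my_disassemble is just a placeholder for whatever entry point your own disassembler exposes, and the objdump parsing here is simplified):

    # Differential test: compare a homegrown AArch64 disassembler against objdump.
    import re
    import subprocess

    def objdump_reference(obj_file):
        """Yield (address, instruction word, objdump's text) from `objdump -d`."""
        out = subprocess.run(["objdump", "-d", obj_file],
                             capture_output=True, text=True, check=True).stdout
        # Disassembly lines look like: "   4:  d65f03c0   ret"
        pat = re.compile(r"^\s*([0-9a-f]+):\s+([0-9a-f]{8})\s+(.*)$")
        for line in out.splitlines():
            m = pat.match(line)
            if m:
                addr, word, text = m.groups()
                yield int(addr, 16), int(word, 16), text.strip()

    def my_disassemble(word):
        raise NotImplementedError("plug in your own AArch64 disassembler here")

    def compare(obj_file):
        mismatches = 0
        for addr, word, reference in objdump_reference(obj_file):
            mine = my_disassemble(word)
            if mine.split() != reference.split():    # ignore whitespace differences
                mismatches += 1
                print(f"{addr:#x}: mine={mine!r} objdump={reference!r}")
        print("mismatches:", mismatches)

    # Typical use: compile something with an AArch64 gcc, then diff the decodings.
    # (Cross-compiler name varies by distro; a native gcc works on an AArch64 box.)
    # subprocess.run(["aarch64-linux-gnu-gcc", "-c", "test.c", "-o", "test.o"], check=True)
    # compare("test.o")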
I don't see why it would be tricky. I don't know how Allen's 8080 emulator on the PDP-10 worked, but it seems straightforward to emulate 8080 I/O.
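Right - on the 8080, all I/O goes through two instructions, IN (opcode 0xDB) and OUT (0xD3), each followed by a one-byte port number, so an emulator only has to trap those two opcodes and route them to the host's terminal. A sketch of the idea (the port numbers and status bits below are illustrative, not necessarily what the MITS serial hardware or Allen's emulator actually used):

    # I/O trapping for an 8080 emulator: IN (0xDB) and OUT (0xD3) name a one-byte port.
    # Port numbers and status bits are illustrative only.
    import sys

    CONSOLE_STATUS = 0x10    # status port: is input waiting? is the transmitter ready?
    CONSOLE_DATA   = 0x11    # data port: read a keystroke, write a character

    def io_in(port, pending_input):
        if port == CONSOLE_STATUS:
            # bit 0 = a character is waiting, bit 1 = ready to transmit (always, here)
            return (0x01 if pending_input else 0x00) | 0x02
        if port == CONSOLE_DATA:
            return pending_input.pop(0) if pending_input else 0
        return 0xFF                               # unconnected port

    def io_out(port, value):
        if port == CONSOLE_DATA:
            sys.stdout.write(chr(value & 0x7F))   # mask bit 7, common in 8080-era code
            sys.stdout.flush()

    # Hooked into the dispatch loop as two more opcode cases:
    #   elif op == 0xDB:   # IN  port
    #       self.a = io_in(self.mem[self.pc], pending_input); self.pc += 1
    #   elif op == 0xD3:   # OUT port
    #       io_out(self.mem[self.pc], self.a); self.pc += 1

On the PDP-10 side you'd feed pending_input from the timesharing terminal, which is presumably roughly what the I/O emulation amounted to.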
And, btw, great infographics within the post.
So my money is on the code I wrote today being the joke of tomorrow - for all involved.
Also, I for one don’t want to go back to punch cards ;)
I am guessing the generation that transitioned from the Pax Romana to the early Middle Ages in Europe.
Remember, it took until the Renaissance for ancient Greek and Roman texts to be “rediscovered” by European scholars.
I think 2075 developers will feel the same way about modern Java, C#, TypeScript, etc.
They will think of themselves as software developers, but they won't be writing code the same way; they'll be giving guided instructions to much higher-level tools (perhaps AIs that themselves have a provenance back to modern LLMs).
Just as today, there will still be those who need to write low-level critical code. There are still lots of people today who have to write assembler, though they end up expressing it via C or Rust. And there will be people still working on AI technology. But even those will be built off other AIs.
I personally encountered people arguing that using PCs (as opposed to VAXen or mainframes) was a waste of time as late as 01992. And I actually even sort of joined them; although I'd been using PCs since before the IBM PC, once I got access to the internet in 01992, I pretty much stopped using PCs as anything but a terminal or a game machine for years, spending virtually 100% of my computer time on VMS or Ultrix. When I was using PCs again, it was because I could run BSD/386 and Linux on them, in 01994.
(Maybe you'd assume from my own story of enthusiastic adoption that "nobody ever sat in countless meetings asking[,] 'How can we use the internet?'", but if so, you'd be extremely wrong. In 01992 and even in 01994 there were lots of people who thought the internet was useless or a fad. Bill Gates's The Road Ahead, published November 01995, barely mentioned the internet, instead treating it as a sort of failed experiment that would be supplanted by the Information Superhighway. Metcalfe predicted in 01996 that it would collapse. David Isenberg was still arguing against "Bellheads" and their "Advanced Intelligent Network" in 01997: https://isen.com/stupid.html)
It can be easy looking back in retrospect to oversimplify events like these with the benefit of hindsight, imagining that the things that seem obvious now were obvious then. But not only weren't they obvious—in many cases, they could have turned out differently. I think it was Alan Kay that argued that, without the invention of the sort of graphical user interface used by most non-cellphone personal computers today, the personal computer as we know it never would have become a mass-market phenomenon (though video game consoles were) and therefore Moore's Law would have stalled out decades ago. I'm not sure he was right, but it seems like a plausible alternate history to me.
Of course, there were "killer apps" as early as VisiCalc for the Apple ][. Accountants and corporate executives were willing to read through the manual and take the time to learn how to use it, because it was such a powerful tool for what they were doing. But it was designed for specialists; it's not a UI that rewards casual use the way Excel or MacPaint or NCSA Mosaic is. Without the GUI, or if the GUI had come much later, plausibly personal computers would have remained a niche hobbyist thing for much longer, while somebody like Nintendo would have locked down the unwashed-masses platform—as we now see happening with Android. And (maybe this is obvious) that would have made it immensely less useful.
VS
Some college students selling software they didn't have, then getting it from zero to sellable in two months, which led to a behemoth that still innovates to this day.
Starts with "confirm plane reservation on Tue. Sept 2 or Wed. Sept 3", which is correct for 1975.
https://images.gatesnotes.com/12514eb8-7b51-008e-41a9-512542...
Curiously, this isn't the oldest extant version of the source code. The Harvard archives have a copy of version 1.1, printed on 30 April 75. http://altairbasic.org/other%20versions/ian.htm