I mean… The AI?
That came out of millions of dollars and countless man-hours of investment by Google and OpenAI.
VS
Some college students selling software they didn't yet have, taking it from zero to sellable in two months, which led to a behemoth that still innovates to this day.
It doesn't sound that different from Alex Krizhevsky training AlexNet on a pair of gaming GPUs in his bedroom, winning ImageNet, and launching the current wave of deep learning / AI.
The big difference is that Bill's dad was one of the best corporate lawyers in America. Microsoft might not have amounted to much if they hadn't struck some extraordinarily prescient licensing deals at the right time and place.
No difference really. Just google who Bill Gates' mom was and how he got the IBM DOS deal. It wasn't BASIC that made MS big, it was DOS.
from "Idea Man" - memoir by Paul Allen:
That August, a three-piece-suited contingent led by Jack Sams
approached us about Project Chess, the code name for what would
become IBM’s PC. After we talked them out of an 8-bit machine
and won them over to the Intel 8086 (or as it turned out, the
cheaper but virtually identical 8088), they wanted everything in
our 16-bit cupboard, including FORTRAN, COBOL, and Pascal.
Aside from BASIC, none of these products were even close to being
ready for the 8086 platform. It would take a wild scramble to get
them all done on IBM’s tight timetable.
Then, in late September, Sams asked us if we could provide
a 16-bit operating system. We referred him to Digital Research,
which we’d heard was far along in building one. Bill called Gary
Kildall and said, “I’m sending some people over to you, and I want
you to be good to them, because you and I are both going to make
a lot of money on this deal.” He didn’t mention IBM by name because
the company insisted on maximum discretion and secrecy.
We’d had to sign a nondisclosure agreement before they’d even sit
down with us.
As Kildall himself later acknowledged, he was off flying on
business when the Project Chess group arrived. His wife, who was also
his business partner, refused to sign the nondisclosure and offered
a Digital Research document instead. That was something you did
not do with IBM. Sams came back to us and said, “I don’t think
we can work with those guys—it would take our legal department
six months to clear the paperwork. Do you have any other ideas?
Could you handle this on your own?”
After the fact, there would be endless rumors about Microsoft’s
dealings with Digital Research. Kildall theorized that IBM chose to
work with us because we were willing to license an operating system
for a flat fee, while Kildall insisted on a per-copy royalty. But
I had a front-row seat, and this is what happened: We tried to do
Digital Research a favor, and they blew it. They dropped the ball.
I vividly remember how furious Bill was at what had transpired.
He couldn’t believe that Kildall had blown this golden chance and
placed the whole project in jeopardy.
Bill called an emergency meeting with me and Kay Nishi. What
could we do to resuscitate the deal? There was silence for a
moment, and then I said, “There’s another operating system that
might work. I don’t know how good it is, but I think I can get it
for a reasonable price.” I told them the story of Tim Patterson and
Seattle Computer Products, which began shipping its 8086 machine
earlier that year but had found sparse commercial interest.
The missing link was an operating system. Kildall had promised a
CP/M-86 by the first of the year, but he hadn’t delivered; his
company lacked the typical start-up’s urgency. No one knew when his
16-bit software would make it to market.
Tim Patterson had gotten frustrated waiting. Our BASIC-86
was fine for writing programs, but his customers couldn’t run a
word processor or other applications on top of it. So Tim had
cobbled together a provisional 16-bit operating system to help his
company sell a few computers until Kildall came through. (As Tim
later said, “We would have been perfectly happy having somebody
else do the operating system. If [Digital Research] had delivered
in December of ’79, there wouldn’t be anything but CP/M in this
world today.”) He called the program QDOS, for Quick and Dirty
Operating System, which he’d managed to cram into 6K of code.
Once it was mostly done, he changed the name to 86-DOS.
[text deleted]
Bill was less enthusiastic. He didn’t know Tim Patterson, and
we’d be betting our deal with IBM—the most critical one we’d
ever have—on an unknown quantity once called Quick and Dirty.
But Bill realized that we might lose the whole contract unless we
came up with something, and he went along.
I called Rod Brock, who owned Seattle Computer Products, to
work out a licensing agreement. We settled on $10,000, plus a royalty
of $15,000 for every company that licensed the software—a
total of $25,000 for now, as we had only one customer. The next
day, a Microsoft delegation (Bill, Steve, and Bob O’Rear) met with
IBM in Boca Raton and proposed that Microsoft coordinate the
overall software development process for the PC. Five weeks later,
the contract was signed. IBM would pay us a total of $430,000:
$75,000 for “adaptations, testing, and consultation”; $45,000 for
the disk operating system (DOS); and $310,000 for an array of 16-bit
language interpreters and compilers.
Bill and I were willing to forgo per-copy royalties if we could
freely license the DOS software to other manufacturers, our old
strategy for Altair BASIC. Already enmeshed in antitrust litigation,
IBM readily bought this nonexclusive arrangement. They’d later
be slammed for giving away the store, but few people at the time
discerned how quickly the industry was changing. And no one, including
us, foresaw that the IBM deal would ultimately make
Microsoft the largest tech company of its day, or that Bill and I would
become wealthy beyond our imagining.
Great point, I was thinking more on the Transformer architecture, but I stand corrected.
Google started similarly with PageRank as far as I remember.
Grad students, but yeah. CUDA was also basically invented by a grad student.
Many undergrad examples as well in the web era, from Excite to Facebook to Snapchat.
(Note the unanticipated consequences aren't always good.)
Consider that nobody ever sat in countless meetings asking "How can we use the PC?" They either saw the vision and went for it, or eventually ran up against the limitations of working without a PC and bought in.
Well, apparently, the guys at Xerox did sit in meetings not knowing what to do, until Steve Jobs visited PARC and saw what was possible.
Actually, there was about a 15-year period where many people didn't think PCs were good for anything, because they had access to much better (shared) computers. That's the context where http://catb.org/jargon/html/B/bitty-box.html comes from. See also http://canonical.org/~kragen/tao-of-programming.html#book8. Throughout the 01980s PC Magazine worked hard to convince business decisionmakers that IBM PCs weren't merely game machines; if you look at old issues you'll see that computer games were completely missing from the abundant advertisements in the magazine, presumably due to an explicit policy decision.
I personally encountered people arguing that using PCs (as opposed to VAXen or mainframes) was a waste of time as late as 01992. And I actually even sort of joined them; although I'd been using PCs since before the IBM PC, once I got access to the internet in 01992, I pretty much stopped using PCs as anything but a terminal or a game machine for years, spending virtually 100% of my computer time on VMS or Ultrix. When I was using PCs again, it was because I could run BSD/386 and Linux on them, in 01994.
(Maybe you'd assume from my own story of enthusiastic adoption that "nobody ever sat in countless meetings asking, 'How can we use the internet?'", but if so, you'd be extremely wrong. In 01992 and even in 01994 there were lots of people who thought the internet was useless or a fad. Bill Gates's The Road Ahead, published November 01995, barely mentioned the internet, instead treating it as a sort of failed experiment that would be supplanted by the Information Superhighway. Metcalfe predicted in 01996 that it would collapse. David Isenberg was still arguing against "Bellheads" and their "Advanced Intelligent Network" in 01997: https://isen.com/stupid.html)
It's easy, looking back with the benefit of hindsight, to oversimplify events like these, imagining that the things that seem obvious now were obvious then. But not only weren't they obvious—in many cases, they could have turned out differently. I think it was Alan Kay who argued that, without the invention of the sort of graphical user interface used by most non-cellphone personal computers today, the personal computer as we know it never would have become a mass-market phenomenon (though video game consoles were) and therefore Moore's Law would have stalled out decades ago. I'm not sure he was right, but it seems like a plausible alternate history to me.
Of course, there were "killer apps" as early as VisiCalc for the Apple ][. Accountants and corporate executives were willing to read through the manual and take the time to learn how to use it, because it was such a powerful tool for what they were doing. But it was designed for specialists; it's not a UI that rewards casual use the way Excel or MacPaint or NCSA Mosaic is. Without the GUI, or if the GUI had come much later, plausibly personal computers would have remained a niche hobbyist thing for much longer, while somebody like Nintendo would have locked down the unwashed-masses platform—as we now see happening with Android. And (maybe this is obvious) that would have made it immensely less useful.