With their drab gray suits and their Buddy Holly glasses, the so-called traitorous eight don’t look like revolutionaries. Given no context, you could imagine them occupying middle-management roles at a small regional bank. And yet these are the people you can thank for the digital world.
The eight—who included Intel cofounder Gordon Moore—had departed Shockley Semiconductor Laboratory to found Fairchild Semiconductor, which soon became the world’s biggest producer of electronic components for computers. Many of its founders would, in turn, leave again to launch their own ventures.
Many of these companies coalesced in the same area—the place we now call Silicon Valley—creating an ecosystem for innovation and technological development that endures to this day.
Look again at that photo. Even with the suits and the glasses, these are arguably some of the most interesting and influential people the technology industry has ever known. You may not know their names, and they never appeared on the Joe Rogan podcast, but their legacy is everywhere.
The Power of Hardware
On a personal level, I have always found hardware more interesting than software.
There’s a joke that CPUs are just “rocks we tricked into thinking,” which has some truth to it. You can’t help but be amazed at the process that turns iron, copper, gold, and silicon dioxide into something that can run unfathomably complex mathematical calculations, play chess, and stream Netflix. And that’s before you take into account that even the most basic consumer CPU has billions of transistors, each measuring a fraction of the width of a hair.
Perhaps the main reason I’m drawn to hardware is that it’s often easy to measure whether something is better than the thing that preceded it. With a tape measure, you can see whether one computer is smaller than another. You can calculate how many mathematical operations a CPU can perform in a second, or count the number of pixels on a display. You can measure its weight, or the heat it gives off, or whether one battery has a larger capacity than another.
Hardware is clean-cut. Straightforward. Unambiguous.
And these improvements aren’t theoretical, but are felt directly by the end user. When a physical object is meaningfully better, you can tell. If you have upgraded from an Intel to an Apple Silicon Mac, you know this. You probably remember what it was like when you ditched your bulky CRT monitor for an LCD flat panel. You know the difference between a computer with a mechanical hard drive and one with flash storage.
Hardware is typically built with utility in mind. The old adage “hardware is hard” is true, but it neglects the fact that it’s also pretty expensive. You only really build something if you believe it’s better than the existing thing, and that somebody will find it useful enough to pay for it.
Silicon Valley Needs to Rediscover Its Roots
The modern tech industry—especially the part that now occupies the same hallowed ground once trod by the traitorous eight—has become a shell of its former self. Tech’s “innovations” feel incremental at best.
It is this that makes me nostalgic for the era when Silicon Valley was about silicon—or, more specifically, physical, tangible objects that changed the world. And I believe it is an era that we can, and must, return to.
The Silicon Valley of the late 1960s, 1970s, and 1980s was a glorious time of American innovation and engineering, where verifiable geniuses discovered the breakthroughs that allowed our current world to exist.
The integrated circuit. The microprocessor. The computer mouse.
It was an era when technological vision and clear-thinking business strategy combined to bring new inventions to market, and then popularize them on a global scale. And in doing so, Silicon Valley changed everything.
To be clear, I am not just talking about vision. I’m talking about hardware. The applications we will need to run in the future will require faster, better computers, and we need somebody to invent them.
Faster, better computers will allow us to reclaim ownership of the tech we use, enabling us to finally break free of the cloud. They will help undo some of the disastrous cultural changes of the past decade or so, during which people got used to the idea that they must always be at the mercy of another, larger tech company.
Hardware is hard. Change is even harder. But in this case, I think it’s worth it.
The Bright Light on the Tip of the Spear
Still, there is some cause for optimism, and it doesn’t come from giant GPUs.
Buried in the news coming out of CES was the announcement of Nvidia’s DGX Spark, a $3,000 desktop computer powered by Nvidia’s GB10 Grace Blackwell Superchip. The announcement went relatively unnoticed, but I believe it marks a significant moment in personal computing.
The DGX Spark delivers up to 1 petaflop of performance in a compact form factor, giving researchers and developers unprecedented access to cutting-edge computational power directly at their desks. It’s like having a computer that’s a thousand times faster than a regular desktop in the body of a Mac Mini, and I’m a little surprised it isn’t being taken more seriously.
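As a sanity check on that “thousand times faster” figure, here is a quick back-of-envelope comparison. The desktop number below is my own rough assumption for a modern consumer CPU, not an Nvidia spec:

```python
# Orders of magnitude only; the desktop figure is an assumption for illustration.
spark_flops = 1e15      # ~1 petaflop, as Nvidia claims for the DGX Spark
desktop_flops = 1e12    # ~1 teraflop, a generous estimate for a desktop CPU

speedup = spark_flops / desktop_flops
print(speedup)  # → 1000.0
```

Under those assumptions, the "thousand times" framing holds up as an order-of-magnitude claim, though real-world speedups depend heavily on the workload and numeric precision used.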
In more human terms, Nvidia has created an ultrapowerful Mac Mini-like machine that developers, data scientists, and AI researchers can use to run reasonably large data workloads and AI models at their desks, rather than on a fleet of massive GPU servers in the cloud.
Silicon Valley’s biggest companies have grown on the back of software, but the truth is that they need hardware to grow any further. Data-center GPUs may be the headline-grabbers during Nvidia’s earnings calls, but creating meaningful new kinds of computing is what will lead to actual innovation in software.
By putting a Blackwell chip inside a Mac Mini-size supercomputer, Nvidia allows companies to crunch through large data sets or run self-hosted generative AI models quickly and efficiently, all without relying on the cloud. This vastly lowers the barrier to entry for high-performance computing, which today requires buying or renting specialized hardware or spinning up costly infrastructure.
I’m going to dive briefly into why this matters. For years, both at Voltron Data and previously at BlazingSQL, I’ve advocated for clustering smaller, more efficient, and less-expensive GPUs together using high-performance networking. However, network limitations have always prevented full utilization of a cluster’s compute performance, because data simply couldn’t move fast enough to keep the GPUs fed.
While it hasn’t shipped yet, Nvidia has specifically called out the inclusion of ConnectX to allow users to connect two DGX Spark computers together, along with other features (NCCL, RDMA, GPUDirect Storage) built specifically for faster networking. This will enable efficient parallel processing and high-bandwidth communication, making high-performance AI and analytics workloads accessible to a broader range of researchers and enterprises. A distributed model using a cluster of DGX Spark units could offer a more cost-effective and flexible alternative to currently available GPU clusters, lowering the barrier to entry for a wide range of high-performance computing use cases.
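The networking bottleneck described above can be made concrete with a quick roofline-style calculation. All figures here are illustrative assumptions, not DGX Spark or ConnectX specs:

```python
# Illustrative numbers only; real hardware will differ.
gpu_flops = 1e15            # ~1 petaflop of on-device compute
network_bytes_per_s = 25e9  # a ~200 Gb/s link ≈ 25 GB/s of usable bandwidth

# FLOPs the GPU must perform per byte it receives just to stay busy.
required_intensity = gpu_flops / network_bytes_per_s
print(required_intensity)  # → 40000.0 FLOPs per byte
```

Under these assumptions, unless a workload performs tens of thousands of operations on every byte it pulls over the wire, the GPU sits idle waiting on the network. That is why direct data paths like RDMA and GPUDirect matter so much for clustered designs.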
My focus on the DGX Spark is meant to illustrate a larger point about what will keep Silicon Valley at the forefront of technological progress. True innovation doesn’t come from just making things “bigger” or “more powerful,” but from the interplay between software and hardware, and even between different pieces of hardware. The DGX Spark isn’t just Nvidia making a chip smaller; it’s Nvidia finding ways to add faster on-device memory, software that moves data to both the memory and the GPU more quickly, and (I imagine) some unique ways to keep it all cool.
The truly world-changing innovations and technological breakthroughs that will advance humanity will come from a deep commitment to silicon engineering, and Silicon Valley needs to remember that this is the only way that software will continue to grow.