Why Bell Labs worked so well, and could innovate so much, while today’s innovation, in spite of the huge private funding, goes in hype-and-fizzle cycles that leave relatively little behind, is a question I’ve been asking myself a lot in the past years.

And I think that the author of this article has hit the nail on the head on most of the reasons - but he didn't take the last step of identifying the root cause.

What Bell Labs achieved within a few decades is probably unprecedented in human history:

  • They employed folks like Nyquist and Shannon, who laid the foundations of modern information theory and electronic engineering while they were employees at Bell.

  • They discovered the first radio emission from the center of our galaxy in the 1930s (the region we now know hosts a supermassive black hole) while analyzing static noise on shortwave transmissions.

  • They developed the first speech codec and the first speech synthesizer in 1937.

  • They discovered the photovoltaic effect in silicon in 1940, and built the first practical solar cell in 1954.

  • They built the first transistor in 1947.

  • They built some of the first large-scale relay computers (from the Model I in 1939 to the Model VI in 1949).

  • They employed Karnaugh in the 1950s, who developed the Karnaugh maps that we still study in engineering while he was an employee at Bell.

  • They contributed in 1956 (together with AT&T and the British and Canadian telephone operators) to TAT-1, the first transatlantic telephone cable.

  • They developed the first computer music program in 1957.

  • They employed Kernighan, Thompson and Ritchie, who created UNIX and the C programming language while they were Bell employees.

And then their rate of innovation suddenly fizzled out after the 1980s.

I often hear that Bell could do what they did because they had plenty of funding. But I don’t think that’s the main reason. The author rightly points out that Google, Microsoft and Apple have already made much more profit than Bell ever saw in its entire history. Yet, despite being awash with money, none of them has been as impactful as Bell. Nowadays those companies don’t even innovate much, besides providing you with a new version of Android, Windows or the iPhone every now and then. And they jump on the next hype bandwagon (social media, AR/VR, blockchain, AI…) just to deliver half-baked products that (especially in Google’s case) are abandoned as soon as the hype bubble bursts.

Let alone singlehandedly spearheading innovation that can revolutionize an entire industry, or making groundbreaking discoveries that engineers will still study a century later.

So what was Bell’s recipe that Google and Apple, despite having much more money and talented people, can’t replicate? And what killed that magic?

Well, first of all, Bell (and Kelly in particular) had an innate talent for spotting the “geekiest” among us. They would often recruit from pools of enthusiasts who had built their own home-made radio transmitters for fun, rather than recruiting from the top business schools, or among those who can solve some very abstract and very standardized HackerRank problems.

And they knew how to manage those people. According to Kelly’s golden rule:

How do you manage genius? You don’t.

Bell specifically recruited people who had that strange urge to tinker and solve big problems; they were given their own lab and all the funding they needed, and they could work in peace. Often years passed before Kelly asked them how their work was progressing.

Compare that to a Ph.D. student today, who needs to struggle for funding, needs to churn out papers that get accepted at conferences regardless of their quality, and must spend much more time on paperwork than on actual research.

Or to an engineer in a big tech company, who has to provide daily updates on their progress, has to survive the next round of layoffs, has to go through endless loops of compliance, permissions and corporate bureaucracy to get anything done, has their performance evaluated every 3 months, and doesn’t even have control over what gets shipped - that control has been taken away from engineers and handed to PMs and MBA folks.

Compare that way of working with today’s backlogs, metrics, micromanaging and struggle for a dignified salary or a stable job.

We can’t have the new Nyquist, Shannon or Ritchie today simply because, in science and engineering, we’ve moved all the controls away from the passionate technical folks that care about the long-term impact of their work, and handed them to greedy business folks who only care about short-term returns for their investors.

So we ended up with a culture that assumes talent must be managed, even micromanaged, otherwise talented people will start slacking off and spending their days on TikTok.

But, as Kelly eloquently put it:

“What stops a gifted mind from just slacking off?” is the wrong question to ask. The right question is, “Why would you expect information theory from someone who needs a babysitter?”

Or, as Peter Higgs (the Higgs boson guy) put it:

It’s difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964… Today I wouldn’t get an academic job. It’s as simple as that. I don’t think I would be regarded as productive enough.

Or, as Shannon himself put it:

I’ve always pursued my interests without much regard for final value or value to the world. I’ve spent lots of time on totally useless things.

So basically, the most brilliant minds of the 20th century would be considered lazy slackers today and put on a PIP because they don’t deliver enough code or write enough papers.

So the article is spot on in identifying why Bell could invent, within a few decades, all it did, while Apple, despite having much more money, hasn’t really done anything new in the past decade. MBAs, deadlines, pseudo-objective metrics and short-termism killed scientific inquiry and engineering ingenuity.

But the author doesn’t go one step further and identify the root cause.

The article correctly spots the business and organizational issues in how talent is managed today, but it doesn’t dig into their economic roots.

You see, MBA graduates and CEOs didn’t destroy the spirit of scientific and engineering ingenuity spurred by the Industrial Revolution just because they’re evil. Sure, someone who has climbed the whole corporate ladder is more likely to be a sociopath than someone randomly picked off the street, but not to the point of willingly taming and screwing the most talented minds of their generation, and squeezing them into a Jira board or a commit-count metric, out of pure sadism.

They did so because the financial incentives have drastically changed since the times of Bell Labs.

Bell Labs was basically publicly funded. AT&T operated the telephone lines in the US, paid by everyone who used a telephone, and it reinvested about 1% of that revenue into R&D (Bell Labs). And nobody expected a single dime of profit to come out of Bell Labs.

And by the way, R&D was real R&D, with no strings attached, at the time. In theory my employer also does R&D today - but we just ended up treating whatever narrow, iterative feature requested by some random PM as “research and development”. It’s not like scientists have much freedom in what to research, or engineers much freedom in what to develop. R&D programs have mostly become a way for large businesses to squeeze more money out of taxpayers, put it in their pockets, and not feel any moral obligation to contribute to anything other than their shareholders’ accounts.

And at the time, the idea of people paying taxes so that talented people in their country could focus on inventing the computer or the Internet, or on putting someone on the moon - without the pressure of VCs asking for their dividends, or of PMs asking them to migrate everything to another cloud infrastructure by next week, or to the shiny new framework they just heard about at a conference - wasn’t seen as a socialist dystopia. That was before the neoliberal sociopaths of the Chicago school screwed up everything.

The America that invested into the Bell Labs and into the Apollo project was very different from today’s America. It knew that it was the government’s job to foster innovation and to create an environment where genuinely smart people could do great things without external pressure. That America hadn’t yet been infected by the perverse idea that the government should always be small, that it’s not the government’s job to make people’s lives better, and that it was the job of privately funded ventures seeking short-term returns to fund moonshots.

And, since nobody expected a dime back from Bell, nobody put deadlines on talented people, nobody hired unqualified and arrogant business specialists to micromanage them, and nobody put them on a performance improvement plan for showing up late to daily standups or not committing enough lines of code in the previous quarter. So they had time to focus on solving some of the most complex problems humans have ever faced.

So they could invent the transistor, the programming infrastructure still used to this day, and lay the foundations of what engineers study today.

The most brilliant minds of our age don’t have this luxury. So they can’t revolutionize our world like those of the 20th century did.

Somebody else sets their priorities and their deadlines.

They can’t think of moonshots because they’re forced to work on the next mobile app riding the next wave of hype that their investors want to release to market so they can get even richer.

They have to worry about companies trying to replace them with AI bots, and about business managers wanting to release products themselves by “vibe coding”, only to then ask those smart people to clean up the mess they’ve made, like babies incapable of cleaning up the food they’ve spilled on the floor.

They are seen as a cost, not as an asset. Kelly used to call himself a “patron” rather than a “manager”, and he trusted his employees, while today’s managers and investors mostly see their engineering resources as squishy blobs of flesh standing between their ambitious ideas and their money, and they can’t wait to replace them with robots that just fulfill all of their wishes.

Tech has become all about monetization nowadays and nothing about ingenuity.

As a result, there are way more brilliant minds (and way more money) in our age going towards solving the “convince people to click on this link” problem rather than solving the climate problem, for example.

Then of course they can’t invent the next transistor, or bring the next breakthrough in information theory.

Then of course all you get, after one year of the most brilliant minds of our generation working at the richest company that has ever existed, is just a new iPhone.

https://links.fabiomanganiello.com/share/683ee70d0409e6.66273547

Why Bell Labs Worked. - by areoform

@fabio this really does explain the decline of progress since I've been alive, and what is called 'progress' now is just endless grift, a grift most people are getting wise to now.

The end of Bell Labs was the end of a lot of things....

@Lazarou @fabio British Telecom also had a similar research lab, on a smaller scale, just up the road from where I live. TBH it still exists, but it's now a shadow of its former self: parts have been sold off to "startup" companies, Huawei bought a large portion of it (to the point that the govt is now paranoid and regretting this, because a whole load of telecoms infrastructure is now run by the Chinese), and it's mostly empty other than odd surveillance projects carried out for GCHQ etc., and monitoring social media traffic for the authorities as well as cyberthreats.
@vfrmedia @Lazarou @fabio Philips had its NAT Lab in the Netherlands. Also very productive, but does not exist anymore. ASML seems to try it again with another lab.
@wvlith @vfrmedia @Lazarou Philips is that company that somehow managed to make all the right investments ahead of their competitors, managed to put their money on a lot of winning horses, and somehow managed to squander it all, retreating into ever-narrower fields, until it probably implodes on whatever is left of its former self once the folks currently working there retire…

@fabio @Lazarou @wvlith

I think Philips today just sub contracts a lot of its (ready built) products to other companies in Asia although it was already doing the same in the 1980s (for instance all the small audio electronics was often made by Sanyo).

Their inspection lamps (for vehicle maintenance) look aesthetically pleasing, but I can't see how they are that much better than the cheaper Chinese ones I already own..

@vfrmedia @fabio @Lazarou @wvlith

Much stronger than that. Philips Consumer Products was sold years ago to a Chinese PE company. They lease the brand name from the original Philips company, who only make medical devices nowadays. And shavers, for some reason.

@Zamfr @vfrmedia @fabio @Lazarou @wvlith The shavers are likely very profitable. Their shavers are good, but really, really, really expensive. Most of the "innovation" in that market is hype anyway.

@Elrick_Winter @vfrmedia @fabio @Lazarou @wvlith

Then again, they could get a good price for a profitable business. For the last decades, their strategy was to sell business parts and use that money to buy medical-electronic firms.

I think the theory is that shavers, tooth brushes etc. are somehow medically-adjacent

@Zamfr @Elrick_Winter @fabio @Lazarou @wvlith

They are classed as health/personal care items (which Philips have made for decades, and they seem to be concentrating their efforts on healthcare-related items these days)

@vfrmedia @Lazarou @fabio Was that Martlesham Heath?
@geoglyphentropy @Lazarou @fabio indeed - I think it's still active, but compared to even a few years back you hear very little about what is going on there (it once used to be in the local news every few weeks)
@Lazarou @fabio I worked at HP Laboratories - and yes, you can do lots of interesting stuff; actually shipping it is really hard. I have mixed feelings about the value of central R&D: I think well-funded universities can do good work, and VC funding is a business model for bringing stuff to market.
But VCs follow fashion and their horizons are short, while university professors are eternally chasing the next grant...

@stevel @Lazarou the Bell model was actually strikingly simple: AT&T and its subsidiaries made money by operating telephone lines, and at least 1% of that money would go to Bell, and the scientists and engineers working there were in charge of what to do with that money, no strings attached.

I honestly believe that this is the only sustainable way of doing pure research - VCs will always spoil the waters with short-termism, and if you just let universities compete among themselves then pure research becomes a game where the one who produces more papers or has more connections to conferences or publishers wins. Just give geeks a flat fee for geeking out, enough for them to put food on the table and buy new stuff, without having to struggle for the next grant or for the next round of layoffs, and great things will eventually come.

The go-to-market part of course is more challenging, and I agree that in most cases those who have great skills for discovering and building things don’t have the skills to make products out of them. And what I’ve seen succeed most often is the continuous cross-pollination of people, products and ideas between academia and businesses - with academic and corporate spin-offs being the most natural touchpoints. But always keeping in mind that cross-pollination should never mean subjecting the freedom of research to the rule of the market. The Industrial Revolution wasn’t fostered only by the Edisons and the Fords who had an innate talent for turning ideas into finished products, but also by the Carnots and the Teslas who had time to focus on their moonshots without the pressure of having to ship to market.

@fabio @stevel @Lazarou

Your point about more brilliance and money being wasted getting people to click links instead of solving climate change hurts to read! It's very true.

And I like the solution "give geeks a flat fee for geeking out ..." - exactly this.

I didn't get into science for the money, I got into it because I love science. I am not alone.

I would love to work on global-scale problems without having to worry about housing, employment a year from now, and competing for funding.

@SorceryForEva @fabio @stevel @Lazarou

I think that Google had a great idea with the "20% time" thing! I.e., every person can spend 20% of their work time on whatever they want. Only 80% of your work time is "doing the job you were assigned."

Gmail came from that!


(Google has since "hammered that out of the system," from what I read. There's no "20% time" any more (for any but the most senior people). And they kill things now for not being sufficiently profitable.)

@JeffGrigg @SorceryForEva @fabio @Lazarou HP labs did have a 10% time, with Friday am being it. And I recall that being something you actually had to manage as a real piece of work-so managers could see what you were doing.
But I also recall that Friday pm is the afternoon most likely to be sacrificed if you didn't get everything done in the week as you wanted to.

Really: Monday afternoons should be the independent time – and people should talk about what they've done . Even if it's just things like playing with new data set through your own products.

@stevel @JeffGrigg @fabio @Lazarou

I like the idea of having time to explore ... but really novel ideas take more time to grow than 10% here or there that you have to report on.

Maybe in a few afternoons I could have a really brilliant idea.

But 95+% of the work remains to fully explore that idea until it is a mature result, and it probably won't look profitable until the very end.

@SorceryForEva @JeffGrigg @fabio @Lazarou I think you could point to IMDb as a spare time project.
Col Needham and others had been building up the db as text files and perl code shared over Usenet
The first web servers and browsers were coming up within HP. Without a proxy server it was all internal stuff + mirror of the few public ones (dilbert cartoon of the day)

Col, Dave Ragget and others were saying "this is different!" But local management were sure their vision was better.
cn@hplb.hpl.hp.com changed his email to @imdb.com; dsr did HTML tables then also went on his way

@fabio @stevel @Lazarou This is what I've been trying to do with my research program at my home lab, on a much smaller scale.

The day job is my "primary business" and some fraction of the profits from that is reinvested in whatever I feel like researching. Some of it dead-ends (crazy distributed build system with "makefiles" written in compiled C++), others end up being much more broadly useful (ngscopeclient)

@azonenberg @fabio @Lazarou I'm now curious about your crazy distributed build system. Did it include test running?

@stevel @fabio @Lazarou It did compiling, linking, and test running including hardware-in-loop testing.

There was caching, distributed building, multi-user support (if you run a build somebody else in the cluster has done, you get their cached artifacts instantly). It would even build FPGA bitstreams (using Xilinx ISE - to give you an idea of how long ago this was)

There was no concept of a "native" target platform, everything cross-compiled - you had to explicitly specify a list of architecture triplets for each binary. All of the libraries, headers, etc. were hashed and included in the cache key so that if you upgraded your compiler or a dependency it'd invalidate stale cache entries.
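A minimal sketch of that cache-key idea, as I understand it from the description above (the names and structure are my own illustration, not the original tool): hash every input that can affect the build output - sources, dependency artifacts, compiler identity, flags, target triplet - so that upgrading the compiler or a library automatically produces a different key and invalidates stale cache entries.

```python
import hashlib
import subprocess

def file_digest(path: str) -> str:
    """Content hash of one input file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def cache_key(sources, deps, compiler, flags, triplet) -> str:
    """Derive a content-addressed cache key for one build step.

    Any change to the compiler version, the flags, the target
    triplet, or the bytes of any source/dependency yields a new key.
    """
    h = hashlib.sha256()
    # Compiler identity: a new version string changes the key.
    version = subprocess.run([compiler, "--version"],
                             capture_output=True, text=True).stdout
    h.update(version.encode())
    h.update(triplet.encode())
    h.update(" ".join(flags).encode())
    # Hash inputs in a stable order so the key is deterministic.
    for path in sorted(sources) + sorted(deps):
        h.update(path.encode())
        h.update(file_digest(path).encode())
    return h.hexdigest()
```

The same principle is what tools like ccache and Bazel use for their action caches; the hard parts the post alludes to are discovering *all* the inputs (headers, toolchain files) so nothing escapes the key.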

It supported cross-language cross-architecture dependencies (e.g. unit test depends on PC-based driver application plus FPGA bitstream, FPGA contains a ROM with a compiled MIPS binary for a softcore) and would correctly integrate with all of that.

Actual build/test jobs were scheduled using SLURM which allowed you to make any Linux machine a builder or runner, and you could create virtual nodes for running unit tests that e.g. knew about an attached devkit on a specific JTAG server port number etc.

The idea was great, the execution not so much. I've wanted to revisit it eventually but just haven't had the time.

@azonenberg @fabio @Lazarou this is pretty awesome! One of my first bits of Hadoop work was distributed Junit test running - really makes you appreciate the challenge of test running and reporting
@stevel @fabio @Lazarou When you bring hardware into the mix it gets 20x harder

@azonenberg @fabio @Lazarou Ignoring social barriers to spare time work (all the other commitments) -anything which goes beyond software and budget FPGAs can't be done this way. Nobody will do production fusion in their spare time.

Given your day job is IoT/smart city security, our town has a "smart lamppost" platform for experimentation, which does have "individual" in the drop down box when requesting access.
There's a 2017 paper "Dataset: Container Escape Detection for Edge Devices" looking at some security aspects.

#iot #SmartCities

https://www.umbrellaiot.com/

UMBRELLA - A living lab at the heart of innovation

UMBRELLA is one of the largest, world-leading open programmable Industrial Internet of Things (IIoT) networks within South Gloucestershire.

UMBRELLA
@fabio @stevel @Lazarou Do you suppose that that’s what the Chinese are doing right now?
@ELS @fabio @stevel @Lazarou Sure feels like that, from what I see and hear.
@albertcardona @ELS @fabio @stevel @Lazarou They fund their kids to go out to Harvard & Oxford & dozens of other world class institutions round the world. That policy has, in just 35 years, taken China from a poor 3rd world mess to the peak of the world's economy.
It's not actually very hard to figure out how to fix the mess in the USA and the UK. Defunding education, charging more for it? That's the opposite of what we should be doing.
@fabio @stevel @Lazarou At a societal level you're making a beautiful argument for #UBI. Imagine the progress if people are allowed to indulge in their passions rather than being brute-forced into narrow occupations. On the flip side, imagine if people who aren't interested in doing anything other than watching movies or playing video games could just do so in peace, instead of being a drain on society. The effort to cudgel them into a bullshit job and babysit them is a terrible waste.

@brad @fabio @stevel @Lazarou

With a guaranteed #UBI (Universal Basic Income), just think of the amazing startups and business solutions that people would build -- if they were not literally risking their lives, and that of their loved ones, by starting their own businesses!

@JeffGrigg @brad @fabio @Lazarou one comment I've heard from people in the US is that even having free healthcare would make a difference to startups
@fabio @stevel @Lazarou the present situation is the result of being governed by people who know the price of everything, and the value of nothing.

@stevel @Lazarou @fabio

You'll miss out on a huge amount of talent and innovation by focussing on universities. As the infosec world already knows, you can become an expert in many areas before you're old enough to go to university, and many don't bother. The most talented and successful people I know in several different industries didn't go to university.

And I'm not just saying this as someone who had had enough of education at 17 and had absolutely no intention of going to university.

@geoffl @stevel @Lazarou indeed, Kelly wasn’t focused on degrees either. Reading his thoughts he seemed to be more inclined to hire someone who had built his own radio transmitter in the basement rather than someone with an academic pedigree but no tinkering spirit.
@fabio
That's how I got my dream job. Although I did have the quals necessary to be considered, what got me in was having built a Z80 kit computer, designed and fabricated a disk controller, and created a simple DOS in machine language. That showed my less relevant quals were matched by more important qualities and interests.
@stevel @geoffl @Lazarou
@fabio @stevel @geoffl @Lazarou Oh, how I love the phrase "tinkering spirit" :)

@fabio Yes! Yes! Yes!

I spent my working life in the BBC's engineering research department. When I started in the 80s there was a bit of that Bell labs spirit, with some freedom to investigate anything that was interesting. By the time I retired the department was "under control". I no longer fitted.

@fabio his name isn’t in any of those lists, but my grandpa worked at NJ Bell and the Labs from 1928 thru 1970, so I’m taking a little bit of pride in having a part of that 😎
@fabio Kelly's question “Why would you expect information theory from someone who needs a babysitter?” is doubly apt in both applying to "let people do what they're good at and don't bother them" and one interpretation I guess he didn't think of at the time: "there are more geniuses available if we provide free/low cost daycare"
@Mabande @fabio oh wow that second beat. Nice!

@fabio Your point about the relative brutality of academia, especially for junior researchers, is very important.

I spent my early career (the most productive, high-energy years) at Bell Labs. The most important thing, aside from great colleagues and resources, that the Labs gave me was *freedom*. I spent my time actually doing research instead of writing grant proposals, teaching classes, serving on committees, and all the other distractions that eat the majority of a junior professor's time.

@fabio I think I was in the last generation where it was possible to have a career like this, and it gave me a huge advantage. There's no job I've held since then that has been as nurturing and protective of my time and energy.
@mattblaze @fabio I am not sure that your point applies merely to academia. I worked for Pixar, which through the 1990s was actually interested in pushing the boundaries of what was possible, both in technology and in art. But increasingly as it became part of Disney, profitability became the driving force, increasingly at the expense of the people who bought into the creative vision. By the time I left, it was clear that they had ceased to value individuals and their contributions.
@mattblaze Society is facing a serious problem, where rampant capitalism swamps all other concerns in the workplace. Not just diversity and inclusion, but indeed doing things which are not just naked grift. Grift is, after all, a pretty good business model when laws and norms don't punish it. It allows people to be fundamentally lazy.
@brainwagon @mattblaze Don't forget the "busy work", of which there's still plenty.
@Dss @mattblaze There are all sorts of bullshit jobs, not just busy work. Graeber's book arose from polls which show that possibly a third of all jobs are bullshit: at best useless and many possibly harmful. Ironically, many of them pay better than jobs which are more necessary or altruistic. You are expected to get paid less for the privilege of doing something meaningful. https://en.wikipedia.org/wiki/Bullshit_Jobs
Bullshit Jobs - Wikipedia

@brainwagon @mattblaze They aren't the same thing. Bullshit jobs are different from jobs where, to keep being employed, you have to do the work that makes you look effective or "busy", which means the actual work of value suffers.
TPS reports instead of making actual sales, progress stand-ups & endless meetings that simply steal time until they finish and you can get back to *work*.
@Dss @mattblaze Graeber mentions five different kinds of bullshit jobs. Flunkies (jobs that exist to make others feel important), goons (aggressive jobs that don't create any actual value), duct tapers (jobs that patch over problems that should not exist in the first place), box tickers (appearance of doing something useful, but not actually generating value) and taskmasters (micromanagers, or those that create busy work). All might reasonably be called bullshit.
@brainwagon @mattblaze yes, but none of those 5 roles cover "people who are trying to work but are stuck in (unproductive) meetings all day", do they?
@Dss @mattblaze those are not necessarily b******* jobs, but they have b******* tasks. My old job had many b******* tasks, although the job wasn't actually b******* in and of itself.
@brainwagon @Dss A lot of the bullshit jobs aren't *inherently* bullshit, but end up being so from particular circumstances or local choices and culture. For example, there's nothing inherently bullshit about being an assistant, but a lot of those jobs become flunkies. And documenting and verifying things is often very important, but can easily become box ticking.
@mattblaze @brainwagon Having done a "desk job" before the world got chronically online, and one two years ago as a remote employee, I know things are very different now. Having employees complain about not having time to do their work due to back-to-back-to-back meetings is quite common now. That never really happened in the office 25 years ago - it wasn't a 30 or 60 minute meeting, it was 2 minutes at someone's desk. Or maybe that was just a different company culture?
Not my problem to solve though.
@brainwagon @fabio There was a golden age of corporate-sponsored research that seems to have ended with the 20th century. Xerox PARC, DEC SRC, HP Labs, IBM Research are all either gone or shells of their former selves.
@mattblaze @brainwagon @fabio mmm. I was at GE-CRD when Neutron Jack took over. Suddenly everything became “what profit did you make?” Even for corporate R&D (directly, not the profit other divisions made off of the things research provided..) This was at the end of the 1980’s
@mattblaze @brainwagon @fabio (A different kind of R&D vs the ones driven by the military industrial complex..)