4 Comments
Michael

Are you able to give an example of a military technology that requires cutting-edge microprocessors? Remember, you can also get more compute by running more older chips in parallel.

The sanctions, in this case, are ill-conceived because they aren't actually relevant to developing advanced military tech. They just aren't.

If you can't give an example to answer the above, I wonder why you wrote this article.

Ryan Nesselrodt

I’m not arguing for the controls at all, and I’m also skeptical of the military justification given for them. However, the argument I’ve heard is generally precision munitions. I think even those don’t require the most cutting-edge chips, but the satellite and radar systems they depend on would benefit from better technology. Advocates also often cite the potential for AI applications that could rely on them. But again, I am skeptical, which is why I wrote this.

Michael

I see, thanks.

When I press folks on this, I inevitably hear about precision munitions, AI drones, blah, blah, blah. Always vague. No one can name a specific chip <-> military system pair and explain why a slightly older chip won't work.

I conjecture that a lot of people have a simplistic idea that more advanced chips do more advanced stuff, and that's it. If we want China to be less advanced, take away chips.

Personally, I run about 100 servers with about 2,500 cores and spend several million a year on cloud compute and storage (something like a petabyte of data and 100k cores in limited-time bursts). My team develops and operates AI systems for high-frequency trading, and I've been doing this for more than a decade. New chips have not been meaningfully faster per core for about 10 years; they just add cores and save on power.

I believe a further fundamental misunderstanding is that people hear about OpenAI spending a ton of money to train ChatGPT on a huge cluster farm. From there they assume that anything "AI" needs crazy bleeding-edge compute.

There are many misconceptions:

* AI training is very compute-intensive, but evaluation generally is not. Training happens offline in a controlled environment where extra power or rack space can simply be thrown at the problem; there's no need for "newer chips" to do the same task. Evaluation is nowhere near as compute-intensive, and even today's heaviest AI models can run inference on a laptop (rough numbers in the sketch after this list).

* The huge AI models are built on digital data: text, photos, video. Those are huge datasets at internet scale. Missiles, drones, etc. don't have anywhere near that scale of data; they have a few physical sensors that put out a number maybe 1000x/second (see the same sketch below). The most advanced chips have never been associated with physical systems. Physical systems instead want a "ruggedized" chip that is far from the fastest but more resilient in a deployed environment. Often the "design" of such systems is compute-intensive, but again, that work is offline and can be done with a larger volume of slower chips.

* Power consumption is very meaningful for large-scale cluster farms: it's a big component of the cost, and upgrading to more efficient hardware can save money over time. That's a nice-to-have, but hardly something that will hold back China. Power consumption of course matters for a drone/missile/etc., but it's not nearly as severe a constraint as people imagine. A HIMARS rocket weighs about 1,000 pounds when loaded with a warhead. It uses a 1 GHz single-core Motorola CPU released in 2005. Your smartphone has the compute power of 10 HIMARS rockets and weighs about 7 ounces, including the screen, speaker/mic, camera, etc.
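To put rough numbers on the first two points, here's a back-of-the-envelope sketch. The model size, token count, corpus size, and sensor setup are illustrative assumptions, not figures from any real system:

```python
# Rough numbers behind the first two bullets.
# Model size, token count, corpus size, and sensor setup are illustrative assumptions.

# 1. Training vs. inference ("evaluation") compute.
# Common approximations: training ~ 6 * params * tokens FLOPs,
# inference ~ 2 * params FLOPs per generated token.
params = 7e9                 # assume a 7B-parameter model
training_tokens = 1e12       # assume ~1 trillion training tokens
train_flops = 6 * params * training_tokens     # ~4e22 FLOPs: cluster-scale, but offline
infer_flops_per_token = 2 * params             # ~1.4e10 FLOPs: laptop-scale
print(f"training:  {train_flops:.1e} FLOPs total")
print(f"inference: {infer_flops_per_token:.1e} FLOPs per token")

# 2. Physical-sensor data vs. internet-scale data.
# Assume 10 sensors, 8-byte readings, 1000 Hz, over a 10-minute flight.
flight_bytes = 10 * 8 * 1000 * 600              # = 48 MB for the whole flight
web_corpus_bytes = 1e15                         # assume a ~1 PB web-scale corpus
print(f"flight telemetry: {flight_bytes / 1e6:.0f} MB")
print(f"assumed web-scale corpus: {web_corpus_bytes / 1e15:.0f} PB "
      f"(~{web_corpus_bytes / flight_bytes:.0e}x more)")
```

Even with generous assumptions, the inference side and the sensor side are trivially within reach of hardware that's years behind the frontier.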

Why hasn't Lockheed upgraded the HIMARS chips since 2011? There's no need to. They used a six-year-old chip even at the last upgrade. There is just not much to gain from shaving a few ounces off a 1,000-pound missile.
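And a crude sanity check on the smartphone comparison. Clock times cores is a very rough proxy that ignores IPC, vector units, and GPUs, and the phone numbers are my assumptions, but the order of magnitude is the point:

```python
# Crude sanity check on the smartphone-vs-HIMARS comparison.
himars_hz = 1.0e9 * 1      # ~1 GHz, single core, per the figure above
phone_hz = 2.5e9 * 8       # assume a typical phone SoC: ~8 cores near 2.5 GHz
print(f"phone vs. HIMARS, nominal compute: ~{phone_hz / himars_hz:.0f}x")

rocket_weight_lb = 1000    # loaded rocket, per the figure above
phone_weight_lb = 7 / 16   # ~7 ounces
print(f"rocket vs. phone, weight: ~{rocket_weight_lb / phone_weight_lb:.0f}x")
```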

Senseless.

Ryan Nesselrodt

Yeah, thanks for sharing this! I don't work in this space, but I get exactly the same sense: there isn't a clear military advantage, and you can do much of the same stuff with older chips.
