sepheronx wrote:
Meh, if the processor works, stick with it. The other aspect is that the Intel processor, much like the Elbrus one, has been tested under extreme electromagnetic-pulse conditions to prevent major damage (either that or go back to vacuum tubes for everything) and has been in production a lot longer (the MCST one is actually quite new), which makes it a lot cheaper than another one that could be faster but provides no added benefit. The same processor is being used for the F-35 as well.

Mike E wrote:
It cannot be that hard to build a replacement... Like I said, it doesn't have to be a "miracle processor". All it has to do is improve the architecture and keep the optimized DSP design. An improved model could handle the same conditions, and EMP resistance isn't a major challenge. My question to you is: why compromise? I swore the F-35 used a newer chip; after all, they wanted to upgrade the F-22's chip....

I'd bet you a hundred bucks that has everything to do with production and nothing to do with the chip itself... Production runs are tens of times larger than they used to be.
It goes back to the old question: would one computer powering everything on an aircraft be better/cheaper than multiple computers for the various systems? In this case I would say the latter is better, since if one computer fails, it won't take everything with it. Radar processing is already done in milliseconds, so a new processor, however much faster, won't add any real benefit. The only thing it could add is enough headroom to run multiple tasks at once compared to the old one. But if it fails (and smaller-transistor processors actually have higher failure rates than the older 180 nm tech), it would take a lot down with it.
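Back-of-the-envelope on the failure point; the failure probability below is invented just to show the shape of the argument, not a real avionics number:

[code]
# One central computer vs. one processor per subsystem.
# p_fail is an assumed per-mission failure chance, for illustration only.

p_fail = 0.01  # assumed chance any single computer dies during a mission

# Centralized: one box runs radar, EW and flight systems.
# If it dies, everything goes with it.
p_lose_all_central = p_fail

# Federated: three independent boxes; you only lose everything
# if all three happen to fail together.
p_lose_all_federated = p_fail ** 3

print(f"central:   {p_lose_all_central:.6f}")    # 0.010000
print(f"federated: {p_lose_all_federated:.6f}")  # 0.000001
[/code]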
I asked my father this same question a few years ago, since he worked for Sperry and IBM (on processors and other ICs for military and civil tech), and his view is that even though the new part may be faster, it is in a lot of cases more prone to failure and fills little to no actual need. That's why they went with a lot of the slower, older MIPS and RISC processors over the more advanced CISC processors at the time.
Funny thing he told me: the military keeps a huge backup of analog systems, old coil-wound memory, vacuum tubes and whatnot, "just in case".
I agree with that... There should be multiple processing units, not just one; I never suggested otherwise... The ability to process information quicker and multitask better (like you said) is very much an upgrade, more so when the radar systems themselves are getting more and more complicated. AESA radars already have hundreds of independent emitters/receivers; what is going to happen when they have thousands, if not tens or even hundreds of thousands? That much information could overwhelm an older chip like that, especially while trying to resist jamming (switching bands and so on).
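To put rough numbers on "overwhelm", here is how raw data volume scales with element count. The sample rate and bit depth are assumptions picked just to show the growth, not real radar figures:

[code]
# Raw receive-side data volume vs. number of AESA elements.
# Sample rate and ADC resolution are assumed round numbers.

sample_rate_hz = 10e6   # 10 MHz per channel (assumption)
bits_per_sample = 16    # ADC resolution (assumption)

for elements in (100, 1_000, 100_000):
    bytes_per_sec = elements * sample_rate_hz * bits_per_sample / 8
    print(f"{elements:>7} elements -> {bytes_per_sec / 1e9:7.1f} GB/s raw")
# 100 elements is ~2 GB/s; 100,000 elements is ~2,000 GB/s.
[/code]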
I should ask my father a similar question... He worked in both the hardware and software divisions of defense contractors for many years, and thankfully he still retains that knowledge. The chance of failure isn't much higher, if not lower... Reliability has less to do with how advanced the chip is than with the production facilities and with stable software and architectures. A die shrink plus whatever else they'd do wouldn't destroy the chip....
For good reason....
The failure rate for newer processors is very high compared to the old.
That said, you need to find a reason to upgrade it. In the end, if it runs better but the difference is 20 ms vs 1 ms, the gap is still beyond human perception, like the lag between sunlight reaching the eye and us noticing it. The only area where it may be beneficial is energy consumption, but even then, is it that important? They could make them more resistant to EMP effects, but the major issue is not the processor itself but the transistors: the smaller they are, the harder they apparently are to shield.

Radars are actually not becoming more advanced. These AESA radars have kept a very similar design since their development in the '80s. When my father installed an AESA radar on the DEW Line to replace the PESA, he stated it was a nightmare (and not needed, as the PESA was far more accurate in its readings due to how the energy is used) and the AESA was not technically ready. So really, it is just improvements in various areas since then, but still the same technology.

Korea is using more powerful processors to power a multitude of devices, while the West's and Russia's aim is multiple processors for multiple applications. This may change, though, and the US may go that route, since it could reduce costs and logistics. But it could also increase issues in other areas.
It is all give and take, really. You could actually have all-new processors for these devices, but it comes down to need vs want. It is like the civil market: the average person won't even use the full power of a Core i3, let alone a Core i7. Heck, most programs cannot take advantage of multiple cores, let alone the architecture itself. But people purchase them anyway. The military, on the other hand, is far more interested in need than want. But there are cases of total bastardization of programs (the F-35 being the newest example), so who knows.
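The multicore point is basically Amdahl's law. A quick sketch, assuming a program that is only half parallelizable (that fraction is an assumption, not a measurement):

[code]
# Amdahl's law: overall speedup from N cores when only a
# fraction of the program can run in parallel.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assume a typical desktop app where only 50% is parallel.
for cores in (1, 2, 4, 8):
    print(f"{cores} cores -> {amdahl_speedup(0.5, cores):.2f}x")
# Tops out below 2x no matter how many cores you add,
# which is why the average buyer never uses a Core i7 fully.
[/code]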
You said it yourself: humans limit their effectiveness.... Thing is, as jets become more and more advanced, human pilots become more and more irrelevant. In a matter of decades, all military fighter aircraft will be drones anyway...
The jets themselves are already capable of maneuvering by themselves (PAK-FA), and in that case they will need (and use) all the time they can get. Those milliseconds may not seem like a long time to a person, but they are a lifetime to a computer.
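Just to quantify "a lifetime to a computer"; the clock speed and instructions-per-cycle here are assumed round numbers, nothing vendor-specific:

[code]
# How many instructions a processor retires in a few milliseconds.
# Clock and IPC are deliberately modest assumptions.

clock_hz = 1e9   # 1 GHz (assumption)
ipc = 1          # one instruction per cycle (conservative assumption)

for ms in (1, 20):
    instructions = clock_hz * ipc * (ms / 1000)
    print(f"{ms:>2} ms -> {instructions:,.0f} instructions")
# Even 1 ms is a million instructions; 20 ms is twenty million.
[/code]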
Radar systems have been getting more advanced, but more importantly, they are getting more complicated. Like I said, a '90s processing unit built on 180 nm lithography (and outdated in general) isn't going to be able to process the information from thousands of emitters and receivers consistently, never mind quickly.
Shielding them has never been a problem, even on the newest and most advanced systems to date. I doubt a lithography shrink is going to significantly limit the processor's shielding.
Like I said... I don't support a single processing unit, but rather the more common method of multiple units, one per system... Upgrading the computational performance of those independent processors shouldn't be much of a hurdle, as mentioned before.
Today's systems may not need a super-powerful chip, but tomorrow's will.