From a brief visit to the corporate website, it looks like you can run MCP on a VM in a cloud data center.
This lets businesses keep running the ancient stuff while using modern hardware and DevOps practices.
Those customers are never migrating off it.
Probably the same type of shit that COBOL is. Some absolutely ancient system that they will never properly update because a $400k-a-year programmer is cheaper than doing so.
If it were only one shitty ancient system, it would be one thing. For the company I work for, it’s about 10 big interconnected mainframe systems with hundreds of non-mainframe systems cobbled together around them. They’ve been in place since the 80s, but you can trace their business logic back to the 50s and 60s. They start with cataloging all our parts and extend to purchasing components from suppliers, describing the products we assemble, managing the supply chains for our factories, handling order management from our customers, etc.
Replacing it all will be a massive chore, but it’s becoming more and more clear that we need to. At the end of the day, capturing and understanding the data in them takes so much skill that we have entire departments dedicated to being an interface between the actual users and the mainframe. The business rules might have worked before the products we build contained electronic controls, but everything is starting to implode now that “parts” also includes software. This has resulted in manual workarounds stacked on top of manual workarounds.
Tron Legacy perhaps?
I thought the MCP was erased back in the 80s.
We do not talk about Legacy
Wasn’t it defeated by Tron, Ram and Flynn back in the 80s?
Yes. Thank you, program!
Best guess would be banking systems and older defense systems. For example, until 2019 the US nuclear command and control system still used 8" floppy disks from the 70s.
Hey, remember when we replaced only the broken stuff and let the other stuff keep working? Of course not: your entire country is barely over 200 years old.
You seem to think I’m complaining about this. I’m not.