I have seen the COBOL issue come up a number of times, and I just don't get it. Am I thinking about this wrong?
COBOL seems to be a straightforward language by modern standards. Any programmer with experience with imperative languages should be able to pick it up. Learning the language itself cannot be a real barrier.
The barrier seems to be the enormous amount of legacy code that has to be digested and understood. But wouldn't that barrier be pretty much the same regardless of the programming language? I mean, if it were 800 billion lines of Perl code, would that make the problem any easier?
So isn't the real problem that we have 800 billion lines of legacy code that needs to be understood and maintained; it just happens to be written in COBOL?
I only have minor experience with COBOL, but in my mind the issue is that those 800 billion lines of code don't have a single unit test, because testing COBOL is really hard. Which makes it dangerous to do any refactoring.
Specifically, the snippets I've seen do some in-line SQL to get their data, some business logic, and then updates. The logic is rarely in a procedure that takes parameters, because that doubles the number of lines of code you need. So the only way to test it is to bring up a database, insert some test data, assert, and clean up the database again. Which is a lot harder than it sounds, since "insert some test data" gets complicated because of the various constraints and triggers etc. Then running it regularly is harder than it should be; there's no SQLite-equivalent database, so you typically need to run it on a mainframe, etc.
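For contrast, here's roughly what that test pattern looks like on a stack that does have a throwaway database. This is a minimal Python sketch using sqlite3 as a stand-in; the table and the business rule are made up for illustration:

```python
import sqlite3

def apply_savings_credit(conn):
    # Stand-in for the COBOL job's logic: in-line SQL reads the data,
    # applies a business rule, and writes the update in one go.
    conn.execute("UPDATE accounts SET balance = balance + 5 WHERE kind = 'savings'")

def test_apply_savings_credit():
    # Bring up a throwaway database - trivial with in-memory SQLite,
    # which is exactly the piece the mainframe stack lacks.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER, kind TEXT, balance INTEGER)")
    conn.execute("INSERT INTO accounts VALUES (1, 'savings', 100), (2, 'checking', 100)")

    apply_savings_credit(conn)  # the code under test

    rows = dict(conn.execute("SELECT id, balance FROM accounts"))
    assert rows == {1: 105, 2: 100}  # rule applied only where intended
    conn.close()

test_apply_savings_credit()
```

On a mainframe, every step here - standing up the database, satisfying its constraints and triggers, cleaning up - is the hard part.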
I've worked on replacing a piece of functionality from COBOL to Java, working from business requirements that were basically a large spreadsheet of conditions and results. E.g. if age > 67 and not retired but more than 3 kids, etc., then this value. The code I wrote was gnarly because of all the combinations of different conditions, nested ifs 3-4 layers deep, etc. And then there was ambiguity about which condition took precedence if multiple hit. I asked if I could look at the equivalent COBOL, and the actual logic was about 10 lines long (not including headers, comments, etc.). Wish I had seen that before I started coding. It was quite readable and well commented too.
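For what it's worth, that kind of spreadsheet translates fairly naturally into a data-driven decision table rather than nested ifs, which also makes the precedence question explicit. A hypothetical Python sketch (the fields, rules, and result names are all invented):

```python
# A decision table kept as data: each row is (condition, result).
# Rules are checked in order, so precedence is explicit instead of
# being buried in 3-4 layers of nested ifs.
RULES = [
    (lambda p: p["age"] > 67 and not p["retired"] and p["kids"] > 3, "rate_a"),
    (lambda p: p["age"] > 67 and p["retired"],                       "rate_b"),
    (lambda p: p["kids"] > 3,                                        "rate_c"),
]

def classify(person, default="rate_default"):
    for condition, result in RULES:
        if condition(person):
            return result  # first matching rule wins
    return default

print(classify({"age": 70, "retired": False, "kids": 4}))  # rate_a
print(classify({"age": 30, "retired": False, "kids": 0}))  # rate_default
```

Ten declarative lines of COBOL probably encoded much the same table, which may be why the original read so cleanly.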
So the actual language isn't _bad_, but the tooling around it is severely lacking.
COBOL the language is pretty simple, you’re right. I learned it in a few weeks in my first job out of college in the early 1990s.
Even today, you can grab Gnu COBOL and learn the language.
What takes longer is understanding the whole architecture of mainframe software. JCL, CICS, etc. It’s quite foreign to anything you learned about programming for Unix. And there really aren’t any free or open-source resources to learn it, as far as I know.
>What takes longer is understanding the whole architecture of mainframe software. JCL, CICS, etc. It’s quite foreign to anything you learned about programming for Unix. And there really aren’t any free or open-source resources to learn it, as far as I know.
This could be what you mean, but isn't the problem here one of access? I know just enough about JCL, CICS, etc to be dangerous, and never found any of the mainframe systems to be that difficult to learn (it's all basically batching, queueing, scheduling...) but without access to a mainframe, it's kind of a non-starter to actually get hands-on with any of this stuff. IE: thanks to the multi-decade shift to open source, we've got generations that have that skill set because it was accessible.
Yeah it was over 30 years ago. I’m sure with more experience it might have been easier conceptually. I didn’t have any trouble with COBOL but found the whole mainframe environment to be rigid, crusty, and difficult to understand and use compared to the Unix systems I had used in school.
Weeeellll... yes and no. You aren't wrong, but I would just note that mainframes were remarkably reliable. They pushed a lot of boundaries on things that we still kinda struggle with today in terms of data integrity, redundancy, hot-swappability, recovery, etc. No question that today's systems are faster and perhaps in some ways more affordable. It's definitely the case that the average person wanting to learn about software has more resources and tools available that once upon a time required access to these multi-million-dollar machines. But there are definitely tradeoffs between those centralized systems and our distributed systems of today.
No they are not just beefy servers. Their raw processing power is not remarkable. They are built for throughput. I/O and peripherals are handled by separate hardware so the CPU only deals with data in memory. They also have more sophisticated virtualization and scalability features than standard servers or PCs.
You raise a good point. I can read COBOL and get an idea of what it is doing; how it is structured takes a little getting used to, but so do most languages.
However, in my company (insurance) we have 3 COBOL systems (thanks to mergers) and one is on a rapid track for decommissioning not because it is COBOL, but because it is COBOL written in an obscure framework that was never popular and also coded in a non-English language, requiring knowledge of that language to understand the business logic. We can outsource some mainframe maintenance but not that one. The other mainframes will be around for at least the next 10/15 years.
Some other good points have been raised about what makes it complex, I will add another.
I never learned COBOL, but I learned programming in the 90's on the AS/400, where COBOL and RPG were the two primary languages. I obviously used RPG. They are both easy to learn and powerful for what they were meant for, but they are also commonly used to create massive monolithic programs, where entire transaction programs are written in a single file or with a lot of "copybooks". That makes it a lot harder to reason in your head about what is going on, as well as the side effects. Things like unit tests are just non-existent, so it is difficult to make changes because of the potential consequences - especially when these monoliths are handling such important data.
We wrote our RPG in a very API driven fashion where we avoided monoliths. I think you pay a small performance penalty for this but it always made our code easier to maintain.
Of course it also takes a very long time to compile these programs so the Edit/Compile/Debug cycle is not fun or productive.
You're not thinking wrong. COBOL is, compared to modern languages, very straightforward. It's also very different, though, and so it can feel alien to programmers who only know the new stuff.
No direct experience with either, but, from what I understand, a COBOL application written to run in CICS looks a lot like a serverless app, with the difference that the company owns the multi-million dollar server.
This is sorta true. CICS is a whole different topic all by itself, and is independent of COBOL. I've written CICS applications in COBOL, C, C++ and even Java.
So isn't the real problem that we have 800 billion lines of legacy code that needs to be understood and maintained; it just happens to be written in COBOL?
No.
800 billion lines of COBOL = 1 million lines of BASIC
A gross exaggeration, of course, but after 7 years of COBOL, I discovered BASIC and instantly became 10x my former self.
I would never go back. I get enough overhead from my bosses. I don't need it from the tech too.
Your experience won't be as straightforward as moving from Perl to PHP. Sure, anyone can pick it up, but it is a journey. I remember JCL finally clicking for me and feeling like I had accomplished something.
COBOL was actually in the CS curriculum for my university, a fact that several of my friends brought up when the 'COVID is prompting a real need for COBOL programmers in light of the need to change benefits rules and unemployment!' stories were a regular feature on the nightly news.
'You should do it, man!' Yeah, sure. I'd already suffered through the liquidation of my entire department in 2018; the prospect of the heat around COBOL dying down and facing that once again, with the additional stain of 'oh, what've you been doing? Cool, popular JS frameworks? No, COBOL? Pass.' on my CV during my next round of interviews, didn't seem at all appealing.
That mentality should set the expectation for anyone looking into a COBOL job. It's not just this job, it's the next one.
If you know COBOL and know other languages and are fluent in "current" tech, then I see nothing wrong with consulting for $300 an hour doing some COBOL.
The company I work for still uses the AS/400, which was released in 1988, just after I was born. We have a dev who supports it. This isn't a small company, either.
Costco is still using it, as are a lot of other huge organizations.
I'm focused on newer tech (C#, .NET, Blazor) but I do interact with the DB2 database, which is the backbone of that old system.
At one of my interviews somewhere I mentioned that a lot of the apps I write are just extending the AS400, and the dude interviewing me was really hyped up. He had spent a lot of time with the green screen in his younger years. The job had nothing to do with the AS400, but that conversation got me an easy offer.
Most consultants cannot get $300/hour for COBOL though. It is those with specific other knowledge - like experience in the system before they retired - who can get that much.
> the additional stain of 'oh, what've you been doing? Cool, popular JS frameworks? No, COBOL? Pass.' on my CV
That's pretty much the last thing that I'd worry about. A company that evaluates potential hires that way is not a company that's worth working for, in my opinion. And most companies I've encountered wouldn't think that way.
You would be surprised. Don't get me wrong, I would love for companies not to show prejudice based on a candidate's current tech stack.
However once you are outside of the SV bubble you will notice a lot of companies dismiss candidates based on their tech stack. The only time I’ve ever seen an inexperienced tech stack hire is from personal referrals from someone already working at said company.
It's the same reason why I opted to spend only minimum time with the dying tech in my old role and find something more broadly applicable. You're super important until you're not, so general tech that is useful in many areas is a lot safer.
Cobol ain't going away anytime soon, but it certainly might limit what jobs you can get other places depending on the hiring algorithm.
Even important people aren't compensated based on their importance but on their marketability. Also, pay is usually just enough to prevent employees from leaving for competing jobs. Too much institutional knowledge can ironically leave an employee uncompetitive in the market.
Agree. Institutional knowledge is vastly underrated. A lot of jobs exist where the knowledge IS the employee value and not just some crank-turning process that can be taught to someone in a few months. It can take years to adequately understand an organization's tools, software architecture, customers, what works and what doesn't...etc.
You don't really want to work with those people anyway. Some hiring managers will see it as a cool thing. As long as you do a project in whatever you want to target next to stay fresh you shouldn't have any problems except from those who hire superficially.
This is all based on a myth though, primarily the myth that "only COBOL" developers are a common thing (people believe this for COBOL even though it's never really been a thing for other languages, and wasn't ever really a thing on mainframes either - increasingly they are majority Java).
What I'm getting at is that you aren't going to find a ton of jobs where you are working on COBOL but somehow avoiding Java when the two are connected at the hip on the platforms they are used on.
> This is all based on a myth though, primarily the myth that "only COBOL" developers are a common thing
Anyone who knows anything about COBOL knows that there’s rarely such a thing as “only COBOL”. The majority of the remaining COBOL shops are IBM mainframe, which means it isn’t just COBOL-CICS, JCL, VSAM, IMS, DB2, TSO, ISPF are all in the mix as well (maybe not all of them at the one site). And if you aren’t doing it on an IBM mainframe, it is probably deeply integrated into some other platform - e.g. PeopleSoft still uses COBOL for some of its batch jobs (especially payroll), and while I’ve never looked at its COBOL code, I’m sure it isn’t vanilla COBOL either, it has some PeopleSoft specific calls in it. Or maybe you are doing Oracle Pro*COBOL (SQL precompiler for the Oracle RDBMS). It’s really no different than Java - who does “just Java”, as opposed to J2EE or Spring or whatever?
People that have not been exposed to legacy industries (banking, insurance, etc) would simply not believe how many of these large/huge companies run the majority of their business on an IBM mainframe running COBOL code. I've seen it firsthand and I didn't believe it. For them, it just works.
I would argue that, despite the design failures, more of modern society runs on JavaScript. I know that it is hard to accept, but a crappy piece of ten year old JS with JQuery submitting a basic XHR request to a server to book a seat on an airplane or at a show matters just as much as your insurance operation that squeezes out a fraction of a point of return for an insurance package.
We live in a world where many of the software languages we use really do matter. If all Ruby interpreters disappeared overnight it would cause global havoc. Real panic. Trillions of dollars of damage.
But because we've learned to rely on these languages' continued existence, we can undervalue any one of them, including COBOL. I don't, because I've thought about it and concluded that we're essentially building on things akin to writing or mathematics. Our building blocks are vital, even if they're easy to take for granted.
One of the slides by some IBM consultants presenting at the financial firm I used to work at stated "Western Civilization runs on the mainframe." Considering how much banks still use mainframes, this is probably true.
I recently listened to a podcast episode which featured a gentleman who teaches mainframe related things at a university in the US and is deeply involved with the Open Mainframe Project. His statistic was 95% of financial transactions touch a mainframe somewhere along the way.
From what I know, considering it's not "just" banks but pretty much any long-running financial business (e.g. insurance companies), I believe it's one of the most used but least talked about technologies out there.
Can confirm. In a past life I had the pleasure of auditing the COBOL code of a major US insurer. They had a separate program for every line of service (Auto/GL/etc.) that would read every claim record and produce their actuarial tables.
It was pretty fascinating to see how it all came together. Also, despite the language's age, after analyzing it for a while it was clearly elegant for that type of financial processing.
Not just financial businesses. For instance, I worked on a product also used by one of the largest package shipping companies. A couple of their key systems consisted of mainframes running COBOL code.
If you like to listen to podcasts and are curious about mainframes, get Terminal Talk. It's a lot of fun, and the mainframe ecosystem is a completely alien biosphere compared to the Unix space we are more used to seeing.
I suspect they mean "systems that use a lot of legacy technology" rather than "banking and insurance are legacy business models", but it is HN, so who knows. Either way, pretty amusing.
In the 1980s, COBOL was taught in my computer database class. I actually used my COBOL skills at the federal government Veterans Affairs job I had for several years after college.
If you are always chasing the latest language and get high on syntactic sugar, you are likely going to be a problem for management if you work at an IT shop because you'll be writing important one-off programs in various languages that people after you will be required to support or rewrite.
John Carmack wanted to hire only C++ devs when he was at Oculus but had to relent and hire JavaScript bros because of Meta.
TCO isn't just a question on your business class exam.
My mom learned COBOL while she was getting a degree in accounting; according to her, the language maps so clearly onto the narrative jargon of accounting and bookkeeping that she was using features her class hadn't covered in her test solutions, without realizing it, until her teacher asked if she was studying outside materials.
My conclusion was, if ever I want to do business data (transaction) processing, I'll learn it.
I also programmed a mainframe, with JCL on punch cards, in assembler and PL/1, in college; I'm glad I did; without that experience it might be hard to appreciate how radical and revolutionary unix was.
My mother built her career on COBOL from the mid-70s to the late-90s. She's retired now, but still talks about what a great language it is. She tried learning Java in the late-90s, toward the end of her career, but hated it. "What's an object? Everything? Everything is an object? That's just dumb." LOL
And yeah, she still gets offers for consulting work, but she wants no part of it. She's done working.
I'm pretty sure I'm thinking about it right. How much COBOL code can run on a regular Ubuntu machine? What's the package manager for OSS packages? What test frameworks are in common use?
You know you're in trouble when "There's a syntax file for VSCode." is the height of your modernity.
> How much COBOL code can run on a regular Ubuntu machine?
All of it. GNU Cobol exists, as do proprietary solutions from companies like Micro Focus that target the JVM and .NET (which is what you're looking for for a real COBOL solution).
But why is it so important to run COBOL on Ubuntu? If you need Linux, create a Linux LPAR on your mainframe.
> What's the package manager for OSS packages?
COBOL code runs Western civilization. Not having muh NPM is a feature, not a bug.
> You know you're in trouble when "There's a syntax file for VSCode." is the height of your modernity.
Dude. COBOL has entire modern IDEs written for it.
It's the same problem z/OS has: There's no hobbyist community so it remains obscure. Yes, you can run 1980s-era System/370 software under Hercules, but IBM keeps anything remotely modern under lock and key even though you'd think it would benefit IBM to have people indoctrinate themselves into the Blue Worldview.
It might not be legal, but it isn't hard to find newer versions to download if you know where to look – z/OS 1.11 is floating around; it is from 2009, so still rather old, but a lot newer than the 1970s. Also you can find some OS/390 and MVS/ESA versions from before that (1990s vintage). If someone privately runs it for purely non-commercial purposes – I myself never have, but some people do – I think it is rather unlikely IBM will sue them.
The easiest way to get z/OS install media is to sign up for zPDT – it costs many thousands, but still a lot cheaper than what IBM charges for z/OS for an actual physical mainframe – and then you get the ADCD media with that. The problem is, starting with z/OS 1.15, IBM began encrypting the ADCD media. As part of the installation, the media is decrypted, using the key on the hardware dongle – but the decrypted copy is watermarked with the dongle ID, meaning that IBM can trace back any leak to the individual customer responsible for it. This means people are no longer willing to publicly share ADCD media, although I've heard rumours of some people passing it around privately, only to trusted individuals. Someone could reverse-engineer the watermarking and remove it, but I'm not aware that anybody has done that – I think people would still worry: what if they failed to completely understand the watermarking, and hence some of it survived?
Backward compatibility. In 1970, you buy an IBM mainframe and start writing apps to run on it. In 2023, you are still running the same code base - albeit with innumerable enhancements and fixes over the ensuing decades. On an IBM mainframe, that code base will just work. On Linux, it won’t even run - unless you buy some expensive mainframe rehosting package, which is full of gaps and limitations, and may introduce obscure bugs which the original lacked.
Reliability - most large-scale Linux systems are based on a distributed model - the app runs on a cluster containing dozens/hundreds/thousands of servers, if a single server has a hardware fault, the app just keeps on working and at worst some user might get an error which goes away if they retry. So, no point in spending $$$$ to maximise the reliability of any individual node.
By contrast, many mainframe customers have just one mainframe, and if it breaks they go down - which means the mainframe hardware has to be super-reliable, filled with redundancy, error detection/recovery, etc - and you pay $$$$ for all that redundancy.
IBM mainframes can be clustered - e.g. z/OS Parallel Sysplex - but only the largest sites do that. The maximum supported cluster size is 32 - I wonder if anyone actually runs one that big, 32 mainframes would be horrendously expensive - while Linux clusters with hundreds or thousands of nodes are quite common.
I don’t know if it is official IBM strategy, but from the outside it looks like they want all their old technologies, including z/OS, to die so IBM can focus on software consulting without the need to build/maintain anything big themselves.
There are several programming languages with package managers that work better than NPM. Not having one is not a positive no matter how you try to spin it.
What exactly would you package manage if you had a package manager for COBOL? You will never get paid to work on any COBOL code that isn't proprietary, and sealed with the blood of innocent victims. You are not going to be installing the latest hot js framework on a mainframe.
> You are not going to be installing the latest hot js framework on a mainframe.
I think IBM would disagree with you, given how much they've pushed Linux (and, therefore, Linux web servers running Node and kin) on their mainframes. You won't be installing much of anything like that on a midrange system, but Linux on Z is pretty well established.
Doesn't that depend on the client company, rather than IBM? I'm going with what I know from the one (big, financial) corp where I worked with COBOL on mainframes; there, we didn't use Linux at all.
Anyway those environments are downright sclerotic (the worst thing about working with COBOL that nobody mentions). They didn't even let me install Firefox on my work laptop. They had people doing web dev, obviously, but the mainframe teams were more, let's say, conservative.
I know you’re attacking my point but I think you’re actually agreeing here. COBOL is tied to proprietary stuff running on big iron, and its claims to modernity are window dressing.
In fact, the very first visual compile/debug environment was written in COBOL over 40 years ago: Micro Focus's Animator (Microsoft paid to use its patents in their Visual products, iirc).
Honestly I picked Ubuntu at random. My point is there’s something intrinsically old school about something being tied to expensive proprietary hardware and while that remains true, it’s still a (possibly very lucrative) cul de sac.
I’m pretty bullish about the importance of COBOL, but while I have a lot of respect for the people working with it right now, it’s not something I’d recommend to someone at the start of their career.
> GnuCOBOL implements a substantial part of the COBOL 85, COBOL 2002, COBOL 2014 and upcoming COBOL 202x standards, as well as many extensions from existing COBOL compilers.
It's also in the Ubuntu package repos, from a quick check.
The aspiring GCC COBOL front end recently gained compliance with COBOL-85.
Unless things have changed lately, it's still very annoying to get your GNU COBOL program interacting with the rest of your system. Something as simple as reading a file is a lot harder than you'd expect. And opening a TCP connection requires you to use specialised API calls to invoke the C functions directly, and the syntax for that was pretty horrible.
COBOL is designed to take well-formed input data, process it, and return well-formed output to the next step in the chain (on mainframes this is coordinated by JCL, but I'm not sure how you do it on Linux).
That’s not really the same thing is it, though? I’m sure you can write new COBOL programs using it, but the challenge is the old ones. And for a million reasons, they’re tied to big iron.
It’s part of the fabric of our society, but it’s still a pain to work with.
That would be like claiming PowerShell is irrelevant because it is not on the typical Ubuntu machine. While it is irrelevant to the typical Ubuntu user, the world is a diverse place and computing is no exception.
That being said, I feel that the article made very weak arguments so I don't blame anyone for perpetuating the stereotypes. These are things such as mentioning which industries it is used in, but not being specific as to how it is used (which is important since COBOL seems to be a domain specific language) and relying upon authority when claiming it is modern.
I mean, I take your point, but Powershell is actually pretty easy to run on Ubuntu. The catch with most real COBOL is that it’s fundamentally entangled with big iron.
As someone who learned COBOL at a British college in the late 1980s, I can confirm that the language was/is quite readable and accessible. I still have the COBOL textbook from back then: Andrew Parkin, COBOL For Students, second edition, ISBN 0-7131-3477-1.
I distinctly remember at the time really enjoying learning to code in it - it spoke to my logical-thinking (but quite young) brain a lot.
However, after college, for whatever reason I didn't use that COBOL knowledge at all. If only I had known then what I know today - that COBOL is still somewhat in demand - and had kept up with it over the years, I could probably have made quite a good living from it.
COBOL (or at least some of its dialects) has some really weird features I’ve never seen anywhere else. For example, only just the other day I learned that IBM mainframe COBOL has a “REVERSED” clause on its OPEN statement, which opens an input file for reading backwards. Apparently it only works for fixed length record format files, and only for files stored on tape. It relies on the fact that the (non-SCSI) IBM mainframe tape command set has a “READ BACKWARDS” command which reads blocks off a tape in reverse order, by playing it backwards.
Back in the day, memory was very small, and hard disks were unavailable or very limited in capacity, so many COBOL programs used tapes as input, output, and temporary storage. OPEN INPUT REVERSED meant you could write out a temporary file to a tape, close it, and then immediately re-open it to start reading it back in (albeit backwards), without having to wait for the tape to rewind in between. It was particularly used to speed up external sorting algorithms (although those were more commonly done in assembler than COBOL, to maximise efficiency)
IBM 3590 and 3592 SCSI drives support the "READ REVERSE" SCSI command[1] which (in variable-block transfer mode) is equivalent to the "READ BACKWARD" channel command.
For application details, see Knuth volume 3 [2], which even has a fold-out chart illustrating tape (and tape operator) movements involved in various sorting algorithms[3].
In brief, reading backwards allows you to use tape as a stack.
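The idea can be illustrated with an ordinary seekable file standing in for the tape. A Python sketch (record length and contents are made up; real READ BACKWARDS works on tape blocks, not byte offsets):

```python
import io

RECLEN = 8  # fixed-length records, as OPEN ... REVERSED required

def write_records(f, records):
    # Write fixed-length records, padded like a COBOL PIC X(8) field.
    for r in records:
        f.write(r.ljust(RECLEN).encode("ascii"))

def read_reversed(f):
    # Emulate READ BACKWARDS: step back one record at a time from the
    # end, so the last record written comes out first - the last-in,
    # first-out behaviour that lets a tape act as a stack.
    pos = f.seek(0, io.SEEK_END)
    while pos > 0:
        pos -= RECLEN
        f.seek(pos)
        yield f.read(RECLEN).decode("ascii").rstrip()

buf = io.BytesIO()
write_records(buf, ["first", "second", "third"])
print(list(read_reversed(buf)))  # ['third', 'second', 'first']
```

On real hardware the win was that the drive never had to rewind between the write pass and the read pass.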
> That last application garnered plenty of attention during the pandemic when state UI systems running COBOL were unable to handle the unprecedented flood of new applicants
COBOL was not the issue; it was all the "web-ized" front ends developed for people to use via a browser that had the issues. A couple of articles came out with details that, as usual, the mainstream press ignored.
I grew up hearing COBOL as a joke language, until one day, I learned that COBOL has the most accurate arbitrary precision math, surpassing modern languages. This is crucial for financial applications. When I then think about how it doesn’t seem any more verbose than SQL, COBOL seems to me like a very respectable language.
Somewhere early in my career, I worked with an accounting app that thought a double float is good enough for a ledger. It was not. We spent too much time trying to track down discrepancies when closing the book for the year, only to find out that a double float is not precise enough for bookkeeping. Lesson learned.
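The lesson is easy to demonstrate in any binary-float language; here is a minimal Python version (the ledger of ten-cent entries is made up):

```python
from decimal import Decimal

# Sum a thousand ten-cent entries, the way a ledger would.
total_float = sum([0.10] * 1000)
print(total_float == 100.0)  # False - binary floats can't represent 0.10 exactly,
                             # and the tiny errors accumulate across entries

# The same sum in decimal arithmetic stays exact, which is roughly what
# COBOL's fixed-point PIC types give you by default.
total_dec = sum([Decimal("0.10")] * 1000)
print(total_dec == Decimal("100.00"))  # True
```

Those accumulated fractions of a cent are exactly the discrepancies that show up at year-end close.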
It was not, but for the most part languages don't default to arbitrary precision (i.e., simple mathematical operators with decimal literals typically get you binary floating point, not arbitrary-precision decimal), even if they have arbitrary precision available at the language or standard library level.
Guess I was wrong. It’s fixed point, and also what you said — COBOL defaults to it. It still makes it easier to write software that need to do a lot of calculations like that.
Even after a 44% bump, the salaries for COBOL positions aren't impressive compared to other specialties. And that's before the quality-of-life question.
The quality of life may not be bad at all. I have a relative who maintains old code for a public utility (I think most of it is PL/1). While the salary is not FAANG-like it is OK and the job security and the sane 9-5 schedule with the opportunity to work remotely are pretty big benefits for someone on the wrong side of 60. My 2c.
Yeah...I know quite a few Cobol programmers. They make good, not extravagant, money. But they also have low job stress, no 'crunch mode'. Production changes are very carefully considered and controlled because of the consequences of botching one. No 'move fast and break things'. They're home with the family at a reasonable time every day. So it's a tradeoff, and a good one for many people.
COBOL is good for dealing with money because it was built with the sort of numeric types required as a natural part of the language. Popular programming languages treat them as an afterthought and provide awkward library functions, if they have anything at all.
1) Python - added to the standard library in 2003.
2) C - no.
3) Java - BigDecimal in the standard library.
4) C++ - no.
5) C# - built in.
6) VB - Currency type kinda does what you'd want, but doesn't scale down very far.
7) Javascript - no. (Use https://github.com/MikeMcl/bignumber.js)
8) SQL - Yes. But check your dialect.
9) PHP - No.
10) Go - No. (Use https://pkg.go.dev/github.com/shopspring/decimal)
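And even in the languages that do have a decimal type, you have to impose the fixed scale and rounding mode yourself, where COBOL's picture clauses do it implicitly. A rough Python sketch of what a PIC 9(7)V99-style field buys you (the helper, tax rate, and rounding choice are all hypothetical):

```python
from decimal import Decimal, ROUND_HALF_UP

CENT = Decimal("0.01")  # scale of a hypothetical PIC 9(7)V99 field

def money(x):
    # Snap a value to two decimal places - roughly what a COBOL
    # fixed-point picture clause enforces on every result.
    return Decimal(x).quantize(CENT, rounding=ROUND_HALF_UP)

price = money("19.99")
tax = money(price * Decimal("0.0825"))  # 1.649175 -> 1.65 (made-up tax rate)
print(tax)                              # 1.65
print(money("2.675"))                   # 2.68 - exact half-up, no float surprise
```

In COBOL the equivalent is just declaring the field; the library-based languages above make you remember to do this at every calculation.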
That's a fact. A couple of years ago, I had to brush off my JCL skills for a job. It reminded me of why I was so happy when I didn't have to use JCL anymore.
3) The paper pointed to is a one-pager with stats, including numbers from Stack Overflow.
Not saying that there isn't an argument to be made that the COBOL/mainframe approach has its advantages. It is a straitjacket, but some applications are really efficient compared with what is done these days.
That goal was a flop, so programmers had to take over.
Wasn't there a Cucumber/Gherkin story here really recently? That grew out of Fit/Fitnesse (from Ward Cunningham of "tech debt" and "the wiki" fame), which was intended to have business folks write test cases to define how the delivered software would work. And as many people in the Cuke threads pointed out, Cuke fails, too. (As did Fitnesse.)
Bottom line: there is a ridiculous impedance mismatch between the business side of the house and engineering. Engineering's attempts to create a programming language for MBAs are not a solution, but an expression of that mismatch.
> COBOL was designed by Grace Hopper to make it easier for business folks to write code.
Not quite; it (FLOW-MATIC, rather) was designed so that senior management could feel that they could read their organization's code.
> I suggest a reply to those who would like data processing people to use mathematical symbols that they make the first attempt to teach those symbols to vice-presidents or a colonel or admiral. I assure you that I tried it.
There is always money to be made by peddling the idea that you can bypass those pesky programmers and get Don from accounts to write his own applications instead. See also: most low-code / visual programming environments.
How would one even go about finding opportunities to program COBOL professionally? I feel like I'd actually go for it if I was able to do it remote, but mainframes conjure up this image for me of sitting next to a tiny green terminal somewhere deep underground, leashed to The Beast or Blue Giant or some other whimsically-named thing.
But I don't think about COBOL at all, and neither should you. The problems that legacy banks, airlines and government institutions have are not my problems.
The management class doesn't want to hear this. Their only goal is to lower maintenance costs on their legacy systems by bringing in new people to ensure lower wages, so they fund all this nonsense about this very dead language.
"What is it with these nerds that don't want to write COBOL code? It's all just coding, isn't it?"
You know they're desperate when they resort to astroturfing.
I have never written COBOL, but I wonder whether a COBOL transpiler to something like C#, Java, or TypeScript (or even Python) would be a viable business?
- COBOL UIs run in a customer-written environment that provides security, navigation, and session data. In the system that I wrote, there's a WPF application that provides these services.
- Convert SCREEN COBOL to C#. The Tandem uses a VT6530 that is similar to an extended IBM 3270. There are client-side protected fields and scrollable areas that need to be emulated in WPF.
- Convert COBOL servers to C#. The Tandem has a 3-tier architecture where the screen programs talk to services running in a middle tier through a network layer called PATHWAY. These can be converted to classes and created on demand in C#.
- Remote access to SQL/MP databases. Tandem has a Java Servlet environment that can be used as a data access layer. The transpiler generates these Servlets and the C# data access layer.
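One small, self-contained piece of any such transpiler is turning COBOL PICTURE clauses into precision/scale information the target language's decimal type can use. Here's a hypothetical sketch of just that step (the function name and the subset of PIC syntax handled are my own choices, not from the system described above):

```python
import re

def parse_pic(pic: str):
    """Parse a simple numeric COBOL PICTURE clause like 'S9(7)V99'
    into (signed, integer_digits, fraction_digits).
    Handles only the 9, S, and V symbols plus the (n) repeat shorthand."""
    signed = pic.startswith("S")
    if signed:
        pic = pic[1:]
    # Expand repeat counts: 9(7) -> 9999999
    pic = re.sub(r"9\((\d+)\)", lambda m: "9" * int(m.group(1)), pic)
    int_part, _, frac_part = pic.partition("V")
    return signed, int_part.count("9"), frac_part.count("9")

# PIC S9(7)V99 is a signed field with 7 integer and 2 fraction digits,
# i.e. values up to 9,999,999.99 -- a typical money field.
print(parse_pic("S9(7)V99"))  # (True, 7, 2)
```

A real transpiler has to cope with far more (editing symbols like Z and comma, COMP-3 storage, signs, REDEFINES), which is part of why this niche stays a business rather than a weekend project.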
Yes. I know of companies that run mainframe emulators purely to keep using the existing COBOL codebase that was written for them.
They go to that level of weirdness because replacing the existing codebase would not only be much more expensive, but would introduce a great deal of risk that they aren't willing to take on.
A transpiler would not really address the needs of that sort of company.
It's a mix of both: I know that some companies do make a good living writing transpilers to languages like Java or C#. But it also requires lots of business knowledge to keep track of the changes, so you're definitely right in that respect.
I keep hearing about COBOL and how there is demand for it. But I'm wondering if there are any avenues for learning to write COBOL and getting experience with the kind of code that these industries require?