Hacker News
Defusing COBOL Bombs with Smart Automation (medium.com/bellmar)
74 points by mbellotti on Sept 14, 2018 | 19 comments



I started out in the early 80s as a COBOL programmer. I learned COBOL at college, along with JSP (Jackson Structured Programming), which entailed never using GOTO and taught how to structure code accordingly, even how to handle exceptions.

But no: in the real world my first program was ripped to bits and rewritten, and I was shown the error of my ways and why I should use GOTO, because that was how everything else was written and others would need to maintain and modify the program throughout its life. Hence it was the company standard. A systems engineer even pointed out how much more efficient GOTO was for error handling, and efficiency ruled the waves.

So the spaghetti code may seem silly today, but there are so many historical reasons for the way it is. Alas, companies are not keen to refactor old code a bit at a time into something more readable. They all live the dream that one day they will replace the whole system overnight and the angels will sing their praises. But that rarely goes to plan, thanks to legacy factors and how intertwined so many aspects of the various systems are. Add to that the fact that the existing system works and the hardware is fault tolerant, and that the horror stories of others who failed at such replacements spread further than any success story.

So at best, in many cases, you see some aspects farmed out to a data warehouse running on some x86 server to placate the increasing demands from marketing and sales for statistics, giving them tools to play with without impacting or stealing from the pool of resources needed to keep the existing system just ticking along.

But the likes of CICS and MQ interfacing make peeling off some layers much easier, and that was one of the saner approaches when I was last involved in such matters over a decade ago.

I'd be interested in how people migrate such systems today, as I've encountered many a software house that will promise all the tick boxes management want to see and then fail very badly.


I started writing in COBOL in 1976. I wrote a lot of code and even started teaching COBOL at the local community college. The language is pretty simple, although wordy. I think that COBOL code would be pretty obvious to a coder today. They might not be able to code in it, but they should be able to follow the flow. The advantage of COBOL was that it was closer to machine code. A statement in COBOL translates to one or a few instructions of machine language. COBOL was a sort of assembly language. This made it run fast and efficiently on the big iron IBM mainframes. Remember that a machine with a meg or less of memory had to run hundreds of terminals serving users at the same time, at a clock speed measured in a few million cycles per second. My last job before I retired was translating COBOL to Java. The COBOL code always ran faster than the Java classes that replaced it.


My first job out of school was programming in COBOL. Never learned it in school. I knew C, Scheme, Pascal, BASIC, a little FORTRAN, and maybe a couple of others. COBOL was easy to pick up. Never did really get JCL.

The time sharing wasn't that great. Yes, there were hundreds of users, but they could wait many seconds, even minutes, for transactions to run. There was a little clock icon on the status line of the terminal, and I easily spent an hour or more per day just watching that clock, waiting on my terminal to update. Luckily there was no WWW then, or I would have been surfing during that time and been even less productive.


> Remember that a machine with a meg or less memory had to run hundreds of terminals

(Puts Monty Python voice on)....Piffle!...when I were a lad we looked after a bunch of Data General Nova 4's and Eclipse S/130's with 32k of memory and with 32-64 dasher terminals hanging off them running DG's Interactive Cobol (on top of RDOS)....and running a full-on accounting system. 1MB of memory would have been absolute luxury.

:)


I learned COBOL, Fortran and RPG II in school but only professionally did anything in RPG II. COBOL readability beat the heck out of trying to read RPG!


Most banks suck at doing software and are being outcompeted by startups that have made building good software their core business.

E.g. N26 in Germany is rapidly growing market share in Europe (and soon the US, apparently) with products that are modern, user friendly, and 100% guaranteed COBOL free. There are a few similar examples of bank startups that invest primarily in R&D and UX to grab market share from old banks that are simply not competitive. The old banks' market share is melting away rapidly.

N26 bootstrapped with far less funding than most of these dinosaurs spend per year on keeping their old crap on life support.

I had a fun incident at Commerzbank here in Germany a few years ago where a request for some older statements that I needed for our company taxes was first impossible and then turned into a "we have to send somebody into an archive to make a copy of the microfilm; this may take a few weeks". This was this century. This decade, even. No joke. A major German bank is storing customer records on microfilm. Incompetent banks like this deserve to go bankrupt. They are sitting ducks for any kind of competition willing to simply show up. We took our business elsewhere after that incident. I will never do business with that bank again.

The real problem is that zombie banks like this have no in-house competence left to do anything productive when it comes to R&D. They are running on autopilot, steered by incompetent idiots. They outsourced all their core competence to consultants who happily come in and pile on more crap as long as they get paid enough. Mostly they evolve simply by copying what other banks do via expensive consultancy projects.

They tend to not have CTOs or anything resembling a technical strategy. Kind of odd if you realize that modern banking is essentially a software driven industry.


I really hope banks get disrupted from the bottom (as you mention with N26), and yes, I've worked for a Fortune 500 financial group that was a huge mess (they had entire buildings of software developers, but their software was god-awful).

But your statement that "they're simply not competitive" is extremely misleading. Most of the big banks don't treat retail users as customers; at best they're something to be tolerated. They're not competing on user experience.

Industry and corporate clients are where it's at, and I guarantee those customers are getting very good support, personalized attention, etc. (even if the financials for a multimillion-dollar loan are cobbled together in Excel). You're probably a small fish to them.


Very interesting. I used to work on a huge Forte 4GL program (think in the million-lines-of-code range) and we thought about using transpilers (we even got some quotes).

But it was the same old bad code, transpiled.

We looked for better solutions, my opinion and my boss's diverged, and the last I heard they're manually rewriting it in Java at a multimillion-dollar cost. It's a Java engineer employment program, though.


> There are some studies that indicate that different parts of the brain are used when solving large and small problems. That small problems might involve more activity in long term memory.

I've been aware of a similar problem for a while now. When you write a complex system you're always adding one more detail to something you already know. At some point, when you've sailed far beyond all propriety, you've built a system that can only be understood once it's been memorized. Every new hire seems more useless than the last, but the real problem is you.

Breaking up the long convoluted flows into smaller, self contained ones gives the new people a way to go about memorizing the system.

Otherwise they’ll just start crossing their fingers and hoping their changes work.


Serious question: COBOL's just a language. How come no one has put together some kind of runtime interpreter for COBOL, or thought to produce an emulator facade that keeps the business logic of the legacy code but ports it to another platform, architecture, or infrastructure?

I look at some of the perfectly cromulent stuff people do with Emscripten [0], and it just blows my mind the things that a browser will run these days. So, why not that?

[0] https://emscripten.org/


> for COBOL, or thought to produce an emulator facade that keeps the business logic of legacy code, but ports it to another platform, architecture, infrastructure

There are some, but the answer is: because COBOL is opinionated.

In terms of verbosity it makes Java look like a Haiku, and it's a weird mix of lower level and higher level constructs.

Classes? What are those? Code reuse?

It also talks (in weird ways) to things like green-screen interfaces, and it saves data in "records" or builds "reports" (just try to create XML natively in COBOL).


COBOL work doesn't actually pay well compared to other languages, and it isn't that big a market. Why? If your business logic and requirements haven't changed, why change a running system? Even if you could transpile with 100% accuracy, you'd lose speed. Maybe you've gained an operational advantage, but there are other ways to do that. At the other extreme, you need to modernize badly, and so a rewrite is actually feasible because you can take advantage of new technologies, especially web and mobile. Between those use cases, middleware exists that lets you bolt on even more awfulness. Finally, businesses die or get acquired. Sometimes good stuff is lost, but equally, bad or old stuff gets purged.

Maybe ironically, that Emscripten URL won't load for me.


Oh, lol! It just redirects to a github page, and I guess they don't listen on port 443.

http://emscripten.org

So much for https everywhere.


There was a link to another article [1] about how COBOL may be better suited to certain mathematical calculations. The article in question presented this iterative function:

    x[0] = 4
    x[1] = 4.25
    x[i] = 108 - ((815 - 1500 / x[i-2]) / x[i-1])
It blew up on iteration 14 using floating point, but COBOL could be programmed to go further (by carrying more digits past the decimal point).
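
That recurrence is easy to try outside COBOL, too. Below is a minimal sketch in Java, with BigDecimal at a fixed scale standing in for COBOL-style fixed-point decimal fields next to plain doubles; the class name, the scale of 50, and the 30 iterations are my own illustrative choices, not anything from the article.

    import java.math.BigDecimal;
    import java.math.RoundingMode;

    // Sketch of the recurrence above: x[i] = 108 - ((815 - 1500/x[i-2]) / x[i-1]),
    // starting from x[0] = 4, x[1] = 4.25. Exact arithmetic converges to 5.
    public class MullerRecurrence {
        private static final int SCALE = 50; // decimal digits kept after the point

        public static void main(String[] args) {
            // 64-bit binary floating point
            double d0 = 4.0, d1 = 4.25;

            // Fixed-scale decimal arithmetic
            BigDecimal b0 = new BigDecimal("4");
            BigDecimal b1 = new BigDecimal("4.25");
            BigDecimal c108 = new BigDecimal("108");
            BigDecimal c815 = new BigDecimal("815");
            BigDecimal c1500 = new BigDecimal("1500");

            for (int i = 2; i <= 30; i++) {
                double dNext = 108 - ((815 - 1500 / d0) / d1);
                d0 = d1;
                d1 = dNext;

                BigDecimal bNext = c108.subtract(
                        c815.subtract(c1500.divide(b0, SCALE, RoundingMode.HALF_EVEN))
                            .divide(b1, SCALE, RoundingMode.HALF_EVEN));
                b0 = b1;
                b1 = bNext;

                System.out.printf("i=%2d  double=%12.6f  decimal=%s%n",
                        i, d1, b1.setScale(6, RoundingMode.HALF_EVEN));
            }
        }
    }

The double column drifts away from 5 somewhere in the low-to-mid teens and gets dragged to 100, while the 50-digit decimal column is still tracking 5 at iteration 30; raising the scale just pushes the divergence further out, which is essentially the article's point about decimal precision.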

[1] https://medium.com/@bellmar/is-cobol-holding-you-hostage-wit...


> may be better suited to certain mathematical calculations

No, it isn't. If that's the best excuse they can find for keeping COBOL alive, then throw it in the trash already (though I read the article, and its point is more subtle than that).

There are arbitrary precision libraries for multiple languages, both binary and decimal.
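
For instance, in Java (an arbitrary choice of language here), the standard java.math.BigDecimal class already gives you decimal arithmetic without any COBOL involved; a trivial sketch:

    import java.math.BigDecimal;

    // Binary doubles vs. decimal arithmetic, in miniature.
    public class DecimalVsBinary {
        public static void main(String[] args) {
            // 0.1 and 0.2 have no exact binary representation, so the sum is off.
            System.out.println(0.1 + 0.2 == 0.3);  // false
            System.out.println(0.1 + 0.2);         // 0.30000000000000004

            // Decimal arithmetic keeps the same sum exact.
            BigDecimal sum = new BigDecimal("0.1").add(new BigDecimal("0.2"));
            System.out.println(sum.compareTo(new BigDecimal("0.3")) == 0);  // true
            System.out.println(sum);               // 0.3
        }
    }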



Rewriting COBOL software in Java also looks like repeating the same mistake. Java as a language is already becoming obsolete for many things, as is the JVM itself, which is why languages like Kotlin and Scala are taking off. At the same time, those languages are also working hard on going native [1] [2]. So as a long-term strategy it makes more sense to use platforms that have already made this leap: OCaml, F#, Haskell. Those languages are already popular to some extent in fintech/govtech. Elixir might be a good choice too, but it is less mature and not statically typed, which can be a problem for mission-critical software.

[1] http://www.scala-native.org/

[2] https://kotlinlang.org/docs/reference/native-overview.html


Java isn't going anywhere anytime soon. It's a safe choice for projects that are expected to have lifespans in the decades.


Absolutely. I've heard it referred to as 21st-century COBOL :) ... it'll have the same problems in 30 years :)



