
Very true. We really should stop overloading the term Workflow in software.

As of now these are the broad categories that abuse the term workflow.

1. State Machines

For example, in Jira a bug/ticket moves through different states to reach a final stage. This type of state machine can be found in a lot of different software - most CMSes/bug trackers/CRMs, where only the entity differs (document/bug/lead).

The motive of these systems, which call themselves workflow engines, is to provide structure to an otherwise ad-hoc movement of entities, so that a lead/manager can ensure a process is followed and collect statistics.
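
To make it concrete, the whole "workflow" here is really just a table of allowed transitions that an entity is pushed through. A toy sketch, all names made up:

    # Toy sketch of a Jira-style ticket "workflow": a plain state machine.
    # States and transitions are illustrative, not any tracker's real API.
    ALLOWED = {
        "open":        {"in_progress"},
        "in_progress": {"in_review", "open"},
        "in_review":   {"done", "in_progress"},
        "done":        set(),
    }

    class Ticket:
        def __init__(self):
            self.state = "open"

        def move_to(self, new_state):
            if new_state not in ALLOWED[self.state]:
                raise ValueError(f"cannot move from {self.state} to {new_state}")
            # a real engine would also record this transition for statistics
            self.state = new_state

    t = Ticket()
    t.move_to("in_progress")
    t.move_to("in_review")
    t.move_to("done")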

2. Automations

For example "apple workflow" app or Zapier or Zoho Flow. These softwares define a sequence of steps that are triggered when an event occurs in the system.

The motive of these systems is to enable automation and integration between different software components with zero code (thus, no-code).
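
In code it boils down to roughly this (a toy sketch, the step names and services are made up): an event fires and a fixed sequence of steps runs against other systems, with the tool hiding the loop behind a UI.

    # Hypothetical Zapier-style automation: when an event occurs, run a fixed
    # sequence of integration steps. Step names and services are made up.
    def create_crm_lead(event):
        print("creating lead for", event["email"])

    def send_welcome_email(event):
        print("emailing", event["email"])

    def post_to_slack(event):
        print("notifying #sales about", event["email"])

    AUTOMATION = {
        "form_submitted": [create_crm_lead, send_welcome_email, post_to_slack],
    }

    def on_event(name, payload):
        for step in AUTOMATION.get(name, []):
            step(payload)  # the no-code tool hides this loop behind a visual editor

    on_event("form_submitted", {"email": "jane@example.com"})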

3. Process Designers (Bad term but can't think of anything better at the moment)

For example Airflow or Camunda. These systems are not necessarily low-code, but they mostly deal with arranging individual components of code so that a process can be assembled as quickly as possible. They usually come with a visual designer like Zapier's, but the intention is mostly to ease the process rather than to be a complete no-code tool for creating automations. However, their marketing tries to sell them as no-code platforms for business folks.

The motive is not yet very clear to me, but my initial intuition is that they can be used to initiate some data-processing pipeline, I guess? If anybody can shed more light, please leave a reply.
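
From what I can tell, a typical pipeline in this bucket looks something like the sketch below (assuming Airflow 2.x; the task names are mine): individual Python callables get arranged into a DAG and the scheduler runs them.

    # Minimal sketch of the "process designer" bucket, assuming Apache Airflow 2.x:
    # individual Python callables are wired into a DAG and the scheduler runs them.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pulling rows from the source system")

    def load():
        print("writing rows to the destination")

    with DAG(dag_id="example_pipeline", start_date=datetime(2021, 1, 1),
             schedule_interval="@daily", catchup=False) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # assemble the process from individual components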

Now, as you can see, much like how a "Process" can mean many things in many different contexts, the term "Workflow" can mean a lot depending on the context. Any software that calls itself the ultimate workflow solution is just lying. It's like calling something an "ultimate process engine" - it doesn't make sense.




I'd like you to listen to this file, which is your first 'headline' read by `say`: https://www.dropbox.com/s/fanpqs8lv2d9nvl/say.m4a?dl=0 Now imagine yourself in the shoes of a vision-impaired person relying on a technology like VoiceOver. Do you understand how unbelievably frustrating that is?


Thanks for pointing this out. Never knew this would break screen readers. Sorry. I'm changing the font right now.


[flagged]


What is your point? zacwest isn't asking them to completely rewrite the page—the problem they pointed out is trivial to solve and simply needed to be recognized. If you can spend 20 minutes to drastically improve the usability of your product for "dozens" of users, why wouldn't you?


I am not saying it is bad.

I was just commenting that the scale is offset. Get it?


"However, their marketing tries to sell themselves as no-code platform for business folks."

Huh? I've never seen Airflow described as no-code or trying to sell itself that way; in fact, all the pipelines are written in Python and you can do some really complex orchestration.

I get that you're not saying Airflow is no-code, but that the category you've put it in is typically low-code or marketed as low-code. But then I don't think Airflow belongs in that category, or rather, and maybe more accurately, no/low code is not really a major defining quality of the bucket you're calling "Process Designers".

I've also been calling them "Process Schedulers", because typically it involves translating a more manual but well-defined process into its automated phase.


I would agree.

No-code is an illusion in the enterprise realm - before you know it, you are waist-deep in custom code.

No-code can really work only for small businesses imo.

I come from an enterprise background, and that is one of the reasons I built Titanoboa - to make it easy to rapidly prototype new integrations on the fly.

I summed up some of my thoughts on this topic here: https://www.titanoboa.io/repl.html

The main point I am trying to test with Titanoboa, however, is this:

State Machines <-> Process Designers is a spectrum and one product could handle the entire spectrum (or part of it).

Titanoboa makes it possible to pre-define workflow steps and keep things "no code", while also allowing complex custom integrations from the same environment with the same concepts. Plus, distributed data handling is in the mix.
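
Roughly, the idea looks like this (a Python sketch of the concept only, not Titanoboa's actual API): pre-defined steps can be wired together "no code"-style, and a custom step is just a function living in the same workflow definition.

    # Rough illustration of the spectrum idea, NOT Titanoboa's actual API:
    # pre-defined steps are wired together by name, while a custom step is
    # an ordinary function sitting in the same workflow definition.
    PREDEFINED_STEPS = {
        "fetch-orders":  lambda ctx: {**ctx, "orders": ["A-1", "A-2"]},
        "send-invoices": lambda ctx: {**ctx, "invoiced": len(ctx["orders"])},
    }

    def custom_enrichment(ctx):
        # a complex custom integration a developer drops in, same environment
        return {**ctx, "orders": [o.lower() for o in ctx["orders"]]}

    flow = ["fetch-orders", custom_enrichment, "send-invoices"]

    ctx = {}
    for step in flow:
        fn = PREDEFINED_STEPS[step] if isinstance(step, str) else step
        ctx = fn(ctx)
    print(ctx)  # {'orders': ['a-1', 'a-2'], 'invoiced': 2}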

I guess now the challenge is how to market this versatility or whether it could create more confusion...


I see what you are getting at. Yes, State Machines to Process Designers is a spectrum, and there is quite a bit of overlap.

For example these are the lowest common denominators I see.

#1 Graph: All of these systems allow you to visually design/represent the process as a graph. You yourself have abstracted these into graph problems and have come up with a simpler, less verbose BPMN alternative - which is great.

#2 Computability: Since the base is a definitive graph, a graph that can execute is essentially a finite automaton. That is, all of these systems put the power back in the hands of end users to create their own machines (without actually coding), hence the association with low-code. So broadly, even the motive aligns from a computability perspective.
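
In other words, executing any of these boils down to walking a graph. A toy sketch (node names made up):

    # Tiny sketch of the common denominator: the workflow is a directed graph,
    # and executing it means walking from a start node to a terminal node,
    # i.e. a small finite automaton. Node names are illustrative.
    GRAPH = {
        "start":    ("validate", None),
        "validate": ("approve",  lambda doc: print("validating", doc)),
        "approve":  ("archive",  lambda doc: print("approving", doc)),
        "archive":  (None,       lambda doc: print("archiving", doc)),
    }

    def run(graph, doc, node="start"):
        while node is not None:
            next_node, action = graph[node]
            if action:
                action(doc)  # a node's action could be pre-built or custom code
            node = next_node

    run(GRAPH, {"id": 42})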

But I'm still not convinced these denominators justify an all-in-one, one-size-fits-all solution to this spectrum. I'm not saying one product shouldn't attack them all, but it's better to categorize them appropriately and develop unique features on top of each. At least that is what I feel at this point in time.


This is not true in many use cases. There are tons of ways to handle low code/no code. It is a very hard problem to solve, and vendors end up building basic "low-code" wrappers around API endpoints, and that's why it looks like a lost cause.

Done right (we at Syncari are living proof it can be done), one can get a lot done without any coding.


Do you have any links to some blog posts discussing it (the "doing it right" approach)? I would agree that it definitely depends on the use case.

I will definitely check out Syncari - just opened the landing page and it looks great!


Thanks! Haven't had a chance to write a lot about this (heads down building and selling), but https://syncari.com/a-brief-history-of-todays-data-woes-and-... touches on it a bit.

For us, it is about:

* implementing deep integrations that are commonplace

* not spreading ourselves too thin in the quest to support hundreds of systems

* thinking from a data model/data/eventually consistent system perspective

* completely dropping the reactive/trigger based/if-this-then-that point-to-point model.


Thanks Neelesh for the link - I like this way of thinking, with its focus on data.

It is similar to what I am seeing - i.e. lots of older integration systems are terrible with data simply because they force you into some way of data modeling (e.g. their OOP data models, WSDLs/XSDs, etc.), while the newer ones just rely on JSON, which is good but can lack the (sometimes necessary) complexity. Doing some data cleansing along the way then seems like an unachievable task (there is certainly such a thing as an overdose of XSLT ;) ).

I also like the approach you took with the centralized data dictionary - it certainly is something the industry might need. I do wonder how it impacts change management, though (especially in bigger companies).

Wishing you good luck!


Which bucket do temporal.io and Azure Durable Functions, which support `workflow as code`, fall into?
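
For context, `workflow as code` means the orchestration itself is plain code rather than a visual design. Roughly like this (a sketch modeled on Temporal's Python SDK as I understand it; the activity and names are my own illustration):

    # Sketch of the "workflow as code" style, modeled on Temporal's Python SDK;
    # the activity and workflow names here are illustrative.
    from datetime import timedelta
    from temporalio import activity, workflow

    @activity.defn
    async def charge_card(order_id: str) -> str:
        return f"charged {order_id}"

    @workflow.defn
    class OrderWorkflow:
        @workflow.run
        async def run(self, order_id: str) -> str:
            # the control flow itself is the workflow definition: code, no designer
            return await workflow.execute_activity(
                charge_card, order_id, start_to_close_timeout=timedelta(minutes=1)
            )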



