So ETL is mostly about getting data into some kind of processing system. That means there are lots of functions for dealing with things like CSV data, polling directories, transforming nested JSON into flat structures, and so on.
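To make that concrete, here's a minimal sketch of one of those typical ETL-style transforms: flattening nested JSON into a flat record. The function name and the dotted-key convention are illustrative, not from any particular ETL tool.

```python
def flatten(record, parent_key="", sep="."):
    """Recursively flatten a nested dict into dotted keys."""
    flat = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, prefixing with the parent key.
            flat.update(flatten(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat

doc = {"user": {"id": 7, "name": "Ada"}, "active": True}
print(flatten(doc))  # {'user.id': 7, 'user.name': 'Ada', 'active': True}
```

A real ETL system ships dozens of transforms like this (type coercion, deduplication, schema mapping) as built-ins.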
Dataflow is a programming model for performing actions on data. A system that implements dataflow programming will probably have functions to load external data into the structure the system needs, but it isn't primarily about moving data to another system.

For example, Google Dataflow[1] has functions for reading files and so on, but it doesn't offer the huge catalog of cleaning and processing utilities that a real ETL system has. Instead, you load the data into the system and then process it for a specific task.
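The dataflow style can be sketched in a few lines of plain Python: data is loaded once, then flows through a chain of transforms toward one specific task (word counting here). All names are illustrative; this is not Google Dataflow's API.

```python
from collections import Counter

def read(lines):
    # "Load external data into the structure needed for the system."
    return list(lines)

def tokenize(records):
    # Transform: split each line into words.
    return [word for line in records for word in line.split()]

def count(words):
    # Sink: aggregate for the specific task at hand.
    return Counter(words)

# The pipeline itself is just an ordered chain of stages.
pipeline = [read, tokenize, count]

result = ["the quick fox", "the lazy dog"]
for stage in pipeline:
    result = stage(result)
print(result["the"])  # 2
```

The point is that the loading step is incidental; the model is the chain of transforms, which a real dataflow runner would parallelize and distribute.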
[1] https://cloud.google.com/dataflow/what-is-google-cloud-dataf...