NiFi: Thinking Differently About DataFlow

Recently, a question was posed to the Apache NiFi (Incubating) Developer Mailing List about how best to use Apache NiFi to perform Extract, Transform, Load (ETL) tasks. The question was: “Is it possible to have NiFi service setup and running and allow for multiple dataflows to be designed and deployed (running) at the same time?”

The idea here was to create several disparate dataflows that run alongside one another in parallel. Data comes from Source X and it’s processed this way; that’s one dataflow. Other data comes from Source Y and it’s processed that way; that’s a second dataflow entirely. Typically, this is how we think about dataflow when we design it with an ETL tool, and it’s a pretty common question for new NiFi users. With NiFi, though, we tend to think about designing dataflows a little differently. Rather than building several disparate, “stovepiped” flows, the preferred approach is to have multiple inputs feed into the same dataflow. Data can then be easily routed (via RouteOnAttribute, for example) to “one-off subflows” where needed, as sketched below.
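To make that concrete, here is a minimal sketch, in Python rather than NiFi code, of the attribute-routing idea behind a converged flow; the attribute name "source" and the route names are hypothetical examples. In NiFi itself you would typically express each branch as a dynamic property on a RouteOnAttribute processor whose value is an Expression Language predicate such as ${source:equals('X')}.

```python
# Illustrative sketch only -- this is not the NiFi API. It mimics what a
# RouteOnAttribute processor does: evaluate named predicates against a
# FlowFile's attributes and send the data down the first matching branch.
from typing import Callable, Dict

# Route names and the "source" attribute are hypothetical examples.
ROUTES: Dict[str, Callable[[Dict[str, str]], bool]] = {
    "source-x": lambda attrs: attrs.get("source") == "X",
    "source-y": lambda attrs: attrs.get("source") == "Y",
}

def route(attributes: Dict[str, str]) -> str:
    """Return the name of the branch (relationship) this data should follow."""
    for name, predicate in ROUTES.items():
        if predicate(attributes):
            return name           # hand off to that branch's one-off subflow
    return "unmatched"            # everything else continues on the shared flow

if __name__ == "__main__":
    print(route({"source": "X", "filename": "a.csv"}))  # -> source-x
    print(route({"source": "Y", "filename": "b.csv"}))  # -> source-y
    print(route({"source": "Z"}))                       # -> unmatched
```

The point of the converged design is that both sources enter the same flow and only their attributes differ; the shared processing stays in one place, and only the truly source-specific steps live on their own branches.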

One of the benefits of having several disparate dataflows, though, is that it makes it much easier to answer when someone comes to you and says “I sent you a file last week. What did you do with it?” or “How do you process data that comes from this source?” You may not know exactly what happened to a specific file they sent you, step by step, because of the different decision points in the flow, but at least you have a good idea just by looking at the layout of the dataflow.
