I work on a very exciting BI program where we have been able to do things right. The project is part of a larger program comprising Business Consulting, Master Data Management and Business Intelligence and, who knows, maybe even Information Lifecycle Management.
My work on the project includes ETL on the Enterprise Data Warehouse, a central project in the program.
My trade is, and has for some time been, architecting developer. This gives me an advantage early in the ETL process when it comes to what I like to call "Abstraction Layering".
Abstraction Layering – customer perspective (why)
Abstraction layering helps strike the balance between "keeping things open" and "delivering as soon as possible".
For my current project we need to deliver quickly while handling a few issues (off the top of my head):
- Loosely defined long-term goal – I think
- Distributed developers, both geographically, and experience-wise
- Many source systems
- Large master data management and other projects changing the environment
- Right time
During implementation, we need to focus on simple, measurable tasks. One way to do this is to model the work on well-defined levels of abstraction. That way we can design top-down: the most abstract discussions first, then some intermediate discussions, and lastly the implementation details.
Things we do that are interesting to ETL include:
- Selecting reference architecture.
Master Data Management, hub-and-spoke EDW with 2G, full archive of source systems, data marts, custom Meta Data Repository.
- Create ETL "horizontal" layering – interfaces and documentation.
Packages take data from one architecture layer to another, grouping functionality and enabling measurability.
- Create ETL "vertical" layering – restrictions and grouping.
Jobs use "job packages", which in turn use "aggregated packages" that group functionality into measurable chunks.
- Specify update intervals and delivery.
We plan for nightly, hourly and real-time jobs; monthly and weekly reports, operational BI and more.
- Define deployment, operations, etc.
Operations implements ITIL; we should interface with it as it matures.
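As a rough sketch of how the horizontal and vertical layering above could be expressed in code — the names Job, job package and aggregated package follow the vocabulary used here, while every field and method is an illustrative assumption, not our actual implementation:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AggregatedPackage:
    """Smallest measurable chunk; moves data between two horizontal layers."""
    name: str
    source_layer: str   # e.g. "Source"
    target_layer: str   # e.g. "Dsa"

@dataclass
class JobPackage:
    """Groups aggregated packages belonging to one state of a job."""
    name: str
    packages: List[AggregatedPackage] = field(default_factory=list)

@dataclass
class Job:
    """Roughly an executable, e.g. the nightly job."""
    name: str
    schedule: str       # "nightly", "hourly" or "real-time", as planned above
    job_packages: List[JobPackage] = field(default_factory=list)

    def units_of_work(self) -> List[str]:
        # Enumerate the measurable units: one entry per aggregated package.
        return [f"{self.name}/{jp.name}/{ap.name}"
                for jp in self.job_packages for ap in jp.packages]

nightly = Job("Nightly", "nightly", [
    JobPackage("Source extract", [
        AggregatedPackage("Extract customer tables", "Source", "Dsa"),
    ]),
])
print(nightly.units_of_work())
```

The point of the sketch is measurability: walking the hierarchy yields a flat list of units that a project manager can count, assign and track.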
Abstraction Layering – architect perspective (how)
Architecting the abstraction layering is done to serve the data integration projects by empowering a few roles. These include:
- Project manager
Work breakdown structures get easier because any integration task has some nice metaphors to lean on.
- Developer
Gets assignments with a predictable scope.
- Tester
Can reuse tests because many atomic items have the same scope.
The architect gets a vocabulary of expressions to reuse when modeling all the ETL jobs and test templates. It becomes possible to create templates for frequently used kinds of functionality, or placeholders for functionality that other systems depend on.
Abstraction Layering – project manager perspective (what)
The developer lead gets measurability and some nice metaphors from the abstraction layering. In our current project they are:
- Job
Roughly equivalent to an executable, e.g.: "Nightly job".
- Package
Typically one for each of the different states a job goes through, e.g.: "Source extract".
- Function
A specific function for an entity, e.g.: "Extract <<customer tables>> from source system X".
- Task / Tsk
A task is part of a function; it moves, changes or collects data. Data in the warehouse knows where it comes from by reference to such a task id. A sample task might be "Extract customer address from source system X".
The project manager must choose when these metaphors are appropriate, of course.
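To illustrate the lineage idea above — warehouse data knowing where it comes from by reference to a task id — here is a minimal sketch; the task registry, the id format and the row shape are assumptions for illustration, not our actual implementation:

```python
# Hypothetical task registry: each atomic task gets a stable id and a description.
TASKS = {
    "T-0147": "Extract customer address from source system X",
}

def load_rows(rows, task_id):
    """Stamp every loaded row with the id of the task that produced it,
    so lineage can be traced back from the warehouse to the task."""
    if task_id not in TASKS:
        raise ValueError(f"Unknown task id: {task_id}")
    return [{**row, "task_id": task_id} for row in rows]

loaded = load_rows([{"customer": 42, "street": "Main St"}], "T-0147")
print(loaded[0]["task_id"])          # every row now carries its origin...
print(TASKS[loaded[0]["task_id"]])   # ...which resolves to a task description
```

Because every row carries a task id, answering "where did this value come from?" becomes a lookup rather than an investigation.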
Abstraction Layering – developer perspective (what)
When assigned a task, a developer can see from the name of the delivery how it fits into the wider picture along three dimensions:
- The vertical layer, which dictates the level along the low-level to high-level axis.
- The horizontal layers in the architecture touched, for instance SourceDsa or DmsaDm.
- The kind of work done, typically the 'T' in "ETL".
Most work that is repetitive by nature should have up-to-date templates controlled by the architect.
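Since delivery names like SourceDsa encode the horizontal layers touched, a small helper can be sketched for decomposing such a name; the list of layer names is an assumption based on the reference architecture mentioned earlier, not a definitive one:

```python
import re

# Assumed horizontal layers, ordered roughly from source to mart.
LAYERS = ["Source", "Dsa", "Edw", "Dmsa", "Dm"]

def split_layers(delivery_name):
    """Split a CamelCase delivery name such as 'SourceDsa' or 'DmsaDm'
    into the architecture layers it moves data between."""
    parts = re.findall(r"[A-Z][a-z]*", delivery_name)
    unknown = [p for p in parts if p not in LAYERS]
    if unknown:
        raise ValueError(f"Unknown layer(s) in {delivery_name!r}: {unknown}")
    return parts

print(split_layers("SourceDsa"))  # the from-layer and to-layer of the delivery
print(split_layers("DmsaDm"))
```

A convention like this is what makes the delivery name self-describing: a developer (or a validation script) can read the horizontal dimension straight out of it.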
Abstraction Layering – test perspective (what)
It is too early to say, but it looks like a combination of the abstraction layering, the "data haven" and the MR will make test-driven development possible beyond simple testing. Integration and performance testing also look like they will come relatively cheap.
OK, this grew pretty long, looks like I'll have to do more on parts of this later, with more concrete samples. Hope I find the time.