ETL Data Engineer
Dublin / Hybrid
Permanent
Negotiable
Ref:
Job Description
My client is currently recruiting for an ETL Data Engineer to join the team. This role will operate on a hybrid model. The ETL Data Engineer will be responsible for designing and driving data integration, data warehouse design and the delivery of BI solutions.
Job Responsibilities
- You will design and integrate new data feeds and migrate new data sets as they are identified or acquired as the business grows.
- You will determine optimal solutions for integrating data from a variety of sources into a common, normalised data warehouse structure within an ETL framework.
- You will carry out ongoing DWH modelling in a normalised star schema structure, along with de-normalised reporting data marts.
- You will implement, maintain and monitor batch and streaming data pipelines with best-practice quality controls.
- You will optimise data-processing performance (e.g. parallel processing, RAM and CPU usage), monitor storage and growth, and engage on an ongoing basis with DBAs.
- You will work closely and collaboratively in an Agile environment with the broader Data Team to analyse issues and find new insights that help drive business performance.
- You will carry out ongoing unit testing, system integration testing and data reconciliations, and coordinate and support UAT.
- Automate the application of master data management transformations.
- Day-to-day operational support of data infrastructure and services.
- Definition and implementation of data security and compliance policies and regulations, e.g. GDPR.
- Help define, implement and lead on robust data management and data quality processes across the company.
- You will support the creation & maintenance of documentation deliverables including, but not limited to architecture diagrams, data flow diagrams, metadata, data dictionary, entity-relationship diagrams, data mapping documents, and other data design artifacts that define technical data specifications and transformation rules.
- You will design and build best-in-class processes to clean and standardise data.
- You will support the KPI definition process and act as custodian of historic knowledge of the data and its meaning.
- Identify and help standardise the use and governance of data.
Experience Required
- A minimum of 5 years' development experience in ETL, SQL programming, large-scale systems data integration and data warehousing environments
- Experience in data management frameworks: data quality profiling, metadata management, master data management, cleansing/standardising and analysing
- Deep knowledge of SQL, database model design (fact and dimension tables) and other data warehousing concepts
- Expert familiarity with market-leading data integration technologies
- Strong command of relevant programming/scripting languages, e.g. Java, Scala, Python, R
- Strong understanding of Business Intelligence strategy and design as it relates to data analysis
- Financial services or Insurance experience strongly desired.
- Good project management skills; experience managing projects and requests with multiple senior stakeholders
Essential and Desirable Skills
- Proven experience with SSIS and Azure Data Factory essential
Educational Requirements
- Bachelor's degree in Computer Science or Information Systems
Working Hours & Benefits
- 37 Hour Working Week
- 24 Days Annual Leave
- Hybrid Working
- PHI Benefit
- Pension 5% / 5%
- Maternity Leave
- 10% Bonus
- Sport & Social Club