Ab-Initio Data Engineer: Poland
Bytespoke AB
Location: Poland
Department: Consulting - SE
Employment Type: Full-time
About the Role
We are seeking an experienced mid-level Ab-Initio Data Engineer / Developer to design, develop, and optimize data integration solutions within our analytics and data engineering ecosystem. The ideal candidate will have hands-on experience with Ab-Initio, strong ETL development skills, and a solid understanding of data warehousing and enterprise data architectures.
Key Responsibilities
- Design, build, and enhance ETL pipelines using Ab-Initio GDE, Co>Op, DQE, EME, and related components.
- Develop scalable data workflows to integrate large volumes of data from multiple sources.
- Perform complex transformations, data cleansing, and data quality validations.
- Optimize ETL performance, including job parallelism, resource management, and error handling.
- Review and analyze business and technical requirements to build effective data solutions.
- Collaborate with architects, analysts, and business teams to understand data needs.
- Troubleshoot production issues, perform impact analysis, and support root-cause investigations.
- Maintain documentation including data mappings, workflow diagrams, and technical specifications.
- Contribute to code reviews, best practices, and process improvements.
Required Skills & Qualifications
- 4–9 years of hands-on experience with Ab-Initio (GDE, Co>Op, EME, PSET, DQE, BRE, Express>It, etc.).
- Strong proficiency in SQL and data modeling concepts.
- Solid understanding of ETL, data warehousing, and distributed data processing.
- Experience with Unix/Linux shell scripting.
- Familiarity with version control (Git), CI/CD pipelines, and job scheduling tools (Control-M, AutoSys).
- Strong debugging, performance tuning, and analytical skills.
- Ability to work independently and handle multiple projects simultaneously.
Preferred Skills
- Experience with big-data technologies (Hadoop, Hive, Spark).
- Knowledge of cloud platforms (AWS, Azure, or GCP).
- Familiarity with Python or other programming languages.
- Exposure to DevOps automation and data pipeline orchestration tools (Airflow, Databricks).