Ab Initio Data Engineer: Poland

Bytespoke AB
Location: Poland
Department: Consulting - SE
Employment Type: Full-time

About the Role

We are seeking an experienced mid-level Ab Initio Data Engineer/Developer to design, develop, and optimize data integration solutions within our analytics and data engineering ecosystem. The ideal candidate will have hands-on experience with Ab Initio, strong ETL development skills, and a solid understanding of data warehousing and enterprise data architectures.

Key Responsibilities

  1. Design, build, and enhance ETL pipelines using Ab Initio GDE, Co>Op, DQE, EME, and related components.
  2. Develop scalable data workflows to integrate large volumes of data from multiple sources.
  3. Perform complex transformations, data cleansing, and data quality validations.
  4. Optimize ETL performance, including job parallelism, resource management, and error handling.
  5. Review and analyze business and technical requirements to build effective data solutions.
  6. Collaborate with architects, analysts, and business teams to understand data needs.
  7. Troubleshoot production issues, perform impact analysis, and support root-cause investigations.
  8. Maintain documentation including data mappings, workflow diagrams, and technical specifications.
  9. Contribute to code reviews, best practices, and process improvements.

Required Skills & Qualifications

  1. 4–9 years of hands-on experience with Ab Initio (GDE, Co>Op, EME, PSETs, DQE, BRE, Express>It, etc.).
  2. Strong proficiency in SQL and data modeling concepts.
  3. Solid understanding of ETL, data warehousing, and distributed data processing.
  4. Experience with Unix/Linux shell scripting.
  5. Familiarity with version control (Git), CI/CD pipelines, and job-scheduling tools (Control-M, AutoSys).
  6. Strong debugging, performance tuning, and analytical skills.
  7. Ability to work independently and handle multiple projects simultaneously.

Preferred Skills

  1. Experience with big-data technologies (Hadoop, Hive, Spark).
  2. Knowledge of cloud platforms (AWS, Azure, or GCP).
  3. Familiarity with Python or other programming languages.
  4. Exposure to DevOps automation, data pipeline orchestration tools (Airflow), and platforms such as Databricks.