Senior Data Engineer

Innobo (Wolow, Poland)



Job description

Client: automotive industry

Hourly Rate: up to 140 PLN

Location: Wroclaw, Poland

Work arrangement: hybrid (one day per week in the office) or fully remote, to be determined; full-time


Maintain and evolve the data flows used by the Picto application: Azure + Databricks pipelines (ADF + notebooks) that ingest data from APIs using the Ingestion Framework, transform it (PySpark/Spark SQL), and deliver trusted datasets.
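To give a feel for the ingest → transform → publish shape described above, here is a minimal plain-Python sketch. All function and field names are illustrative assumptions, not from the Picto codebase; the real pipelines use ADF triggers and Databricks notebooks rather than code like this:

```python
from typing import Callable

def ingest(fetch_page: Callable[[int], list]) -> list:
    """Pull raw records from a paginated API (stand-in for the Ingestion Framework)."""
    records, page = [], 0
    while chunk := fetch_page(page):
        records.extend(chunk)
        page += 1
    return records

def transform(records: list) -> list:
    """Normalize and filter records (stand-in for PySpark/Spark SQL notebook logic)."""
    return [
        {"id": r["id"], "value": float(r["value"])}
        for r in records
        if r.get("value") is not None
    ]

def publish(records: list, sink: list) -> int:
    """Deliver the trusted dataset to a sink (stand-in for writing a Delta table)."""
    sink.extend(records)
    return len(records)
```

In the actual stack, each stage would be a separate ADF activity or Databricks task so it can be scheduled, monitored, and retried independently.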


Responsibilities:

  • Own day-to-day operations of Picto data pipelines (ingest → transform → publish), ensuring reliability, performance and cost efficiency.
  • Develop and maintain Databricks notebooks (PySpark/Spark SQL) and ADF pipelines/Triggers; manage Jobs/Workflows and CI/CD.
  • Implement data quality checks, monitoring & alerting (SLA/SLO), troubleshoot incidents, and perform root-cause analysis.
  • Secure pipelines (Key Vault, identities, secrets) and follow platform standards (Unity Catalog, environments, branching).
  • Collaborate with BI Analysts and Architects to align data models and outputs with business needs.
  • Document datasets, flows and runbooks; contribute to continuous improvement of the Ingestion Framework.
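The data quality checks mentioned in the responsibilities often come down to rule-based assertions over a dataset. A minimal pure-Python sketch (check names, columns, and messages are illustrative assumptions):

```python
def check_not_null(rows: list, column: str) -> list:
    """Flag rows where a required column is missing or None."""
    return [f"row {i}: {column} is null"
            for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows: list, column: str) -> list:
    """Flag duplicate values in a column that should be a key."""
    seen, issues = set(), []
    for i, r in enumerate(rows):
        value = r.get(column)
        if value in seen:
            issues.append(f"row {i}: duplicate {column}={value}")
        seen.add(value)
    return issues

def run_checks(rows: list) -> list:
    """Run all checks; an empty result means the dataset passes."""
    return check_not_null(rows, "id") + check_unique(rows, "id")
```

In a Databricks pipeline the same idea would typically run as Spark SQL assertions or a library such as Great Expectations, with failures feeding the monitoring and alerting mentioned above.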


Requirements:

  • Azure Databricks (PySpark, Spark SQL; Unity Catalog; Jobs/Workflows).
  • Azure data services: Azure Data Factory, Azure Key Vault, storage (ADLS), fundamentals of networking/identities.
  • Python for data engineering (APIs, utilities, tests).
  • Azure DevOps (Repos, Pipelines, YAML) and Git-based workflows.
  • Experience operating production pipelines (monitoring, alerting, incident handling, cost control).
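Operating production pipelines against an SLA/SLO, as the last bullet asks, often reduces to comparing recent run metrics against thresholds. A hedged plain-Python sketch; the record fields and the 30-minute threshold are examples, not values from this role:

```python
from dataclasses import dataclass

@dataclass
class RunRecord:
    run_id: str
    duration_min: float
    succeeded: bool

def slo_breaches(runs: list, max_duration_min: float = 30.0) -> list:
    """Return alert messages for failed runs and runs exceeding the duration SLO."""
    alerts = []
    for run in runs:
        if not run.succeeded:
            alerts.append(f"{run.run_id}: failed")
        elif run.duration_min > max_duration_min:
            alerts.append(f"{run.run_id}: duration {run.duration_min} min exceeds SLO")
    return alerts
```

In Azure the run metrics would come from ADF/Databricks run histories, with alerts routed through Azure Monitor rather than returned as a list.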


Nice to have:

  • AI


Soft skills:

  • Proactive ownership and a “driver” mindset: able to move topics forward end-to-end.
  • Collaborative and business-oriented; comfortable working with IT and business stakeholders.
  • Open-minded, flexible, and quality-focused; clear written documentation and communication.


Tech stack:

  • Azure, Databricks (PySpark/Spark SQL, Unity Catalog, Workflows), ADF, ADLS/Delta, Key Vault, Azure DevOps (Repos/Pipelines YAML), Python, SQL


If you meet most of the requirements and are looking for your next challenge, we’d love to hear from you - feel free to apply below!


Required Skill Profession

Computer Occupations


