
Urgent! PySpark Developer Position in Warsaw - Axiom Software Solutions Limited

PySpark Developer



Job description


We are looking for a skilled Data Engineer with expertise in Python, PySpark, and Cloudera to join our team.

The ideal candidate will be responsible for developing and optimizing big data pipelines while ensuring efficiency and scalability.

Experience with Databricks is a plus.

Additionally, familiarity with Git, GitHub, Jira, and Confluence is highly valued for effective collaboration and version control.

Key Responsibilities

- Design, develop, and maintain ETL pipelines using Python and PySpark.

- Work with Cloudera Hadoop ecosystem to manage and process large-scale datasets.

- Ensure data integrity, performance, and reliability across distributed systems.

- Collaborate with data scientists, analysts, and business stakeholders to deliver data-driven solutions.

- Implement best practices for data governance, security, and performance tuning.

- Use Git and GitHub for version control and efficient code collaboration.

- Track and manage tasks using Jira, and document processes in Confluence.

- (Optional) Work with Databricks for cloud-based big data processing.

Required Skills & Experience

- Strong programming skills in Python.

- Hands-on experience with PySpark for distributed data processing.

- Expertise in Cloudera Hadoop ecosystem (HDFS, Hive, Impala).

- Experience with SQL and working with large datasets.

- Knowledge of Git and GitHub for source code management.

- Experience with Jira for task tracking and Confluence for documentation.

- Strong problem-solving and analytical skills.

Preferred Qualifications

- Basic knowledge of Databricks for cloud-based big data solutions.

- Experience with workflow orchestration tools (e.g., Airflow, Oozie).

- Understanding of cloud platforms (AWS, Azure, or GCP).

- Exposure to Kafka or other real-time streaming technologies.




