Urgent! GCP Data Engineer Job Opening In Kraków – Now Hiring Antal Sp. z o.o.
GCP Data Engineer – STAR Platform
Location: Kraków / Hybrid (2 days per week in the office)
Are you ready to build impactful solutions on a global scale?
Join a forward-thinking team that powers critical risk calculations in one of the world's leading financial institutions.
We are looking for a talented GCP Data Engineer with a strong Java background to join the STAR platform team.
STAR is HSBC’s strategic cloud-native platform designed to generate and deliver risk factor definitions, historical market data, and scenarios for Value at Risk (VaR) and Expected Shortfall (ES) calculations.
The platform leverages data pipelines and microservices, combining both real-time and batch processing to handle large-scale datasets.
You’ll be joining a global team of developers within the Global Traded Risk Technology department, working in an open, inclusive, and innovation-driven environment.
Your responsibilities:
Translate complex business requirements into secure, scalable, and high-performance data solutions
Design and implement performant data processing pipelines (batch and streaming)
Develop REST APIs and data ingestion patterns in a cloud-native architecture (a sketch follows this list)
Integrate internal systems with a focus on cost optimization and fast data processing
Modernize and enhance existing pipelines and microservices
Create and maintain solution blueprints and documentation
Conduct peer code reviews and provide constructive feedback
Promote test-centric development practices including unit and regression tests
Ensure consistent logging, monitoring, error handling, and automated recovery aligned with industry standards
Collaborate closely with engineers, analysts, and stakeholders across regions
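For illustration, here is a minimal sketch of the kind of cloud-native REST ingestion endpoint the responsibilities above describe, written in Java with Spring Boot, the stack named in the requirements. All class, endpoint, and field names are hypothetical and not taken from the job specification; a real service would persist to a database or publish to a message bus rather than buffer in memory.

```java
// Hypothetical example only: names, endpoints, and fields are illustrative,
// not taken from the job posting. Assumes Spring Boot 3 / Java 17 and an
// enclosing @SpringBootApplication.
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;

@RestController
@RequestMapping("/api/v1/market-data")
public class MarketDataController {

    // In a production service this would be a repository or a Pub/Sub publisher;
    // an in-memory queue keeps the sketch self-contained.
    private final ConcurrentLinkedQueue<MarketDataPoint> buffer = new ConcurrentLinkedQueue<>();

    // Accepts a batch of observations and acknowledges receipt.
    @PostMapping
    public ResponseEntity<Void> ingest(@RequestBody List<MarketDataPoint> batch) {
        buffer.addAll(batch);
        return ResponseEntity.status(HttpStatus.ACCEPTED).build();
    }

    // Exposes a simple count so the ingestion path can be smoke-tested.
    @GetMapping("/count")
    public ResponseEntity<Integer> count() {
        return ResponseEntity.ok(buffer.size());
    }

    // One market observation; field names are made up for the example.
    public record MarketDataPoint(String riskFactorId, String asOfDate, double value) {}
}
```

A POST to /api/v1/market-data with a JSON array body would be acknowledged with HTTP 202; validation, logging, monitoring, and automated recovery, as the responsibilities above call for, would sit on top of a skeleton like this.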
What we're looking for:
Strong proficiency in Java and Spring Boot
Understanding of key software design principles: KISS, SOLID, DRY
Hands-on experience building data processing pipelines, preferably with Apache Beam (see the sketch after this list)
Experience designing and building RESTful APIs
Familiarity with relational and NoSQL databases, especially PostgreSQL and Bigtable
Basic knowledge of DevOps and CI/CD tools, including Jenkins and Groovy scripting
Experience with integration frameworks and patterns (e.g., Saga, Lambda)
Strong problem-solving and analytical skills
Excellent communication skills and ability to thrive in a collaborative team environment
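To make the Apache Beam requirement concrete, here is a minimal batch pipeline sketch in Java. The bucket paths, file layout, and column positions are assumptions made for illustration; the pipeline runs on the direct runner by default and could be submitted to Dataflow through pipeline options.

```java
// Hypothetical example only: bucket paths, file layout, and column positions are
// assumptions, not details from the job posting.
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

public class RiskFactorCountPipeline {
    public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
        Pipeline pipeline = Pipeline.create(options);

        pipeline
            // Read raw CSV market-data files from a (placeholder) GCS bucket.
            .apply("ReadCsv", TextIO.read().from("gs://example-bucket/market-data/*.csv"))
            // Drop header rows.
            .apply("SkipHeader", Filter.by((String line) -> !line.startsWith("risk_factor_id")))
            // Assume the first column is the risk factor identifier.
            .apply("ExtractRiskFactor", MapElements
                .into(TypeDescriptors.strings())
                .via((String line) -> line.split(",")[0]))
            // Count observations per risk factor.
            .apply("CountPerRiskFactor", Count.perElement())
            // Format results back into CSV lines.
            .apply("FormatResults", MapElements
                .into(TypeDescriptors.strings())
                .via((KV<String, Long> kv) -> kv.getKey() + "," + kv.getValue()))
            // Write sharded output files to a (placeholder) output prefix.
            .apply("WriteResults", TextIO.write().to("gs://example-bucket/output/risk-factor-counts"));

        pipeline.run().waitUntilFinish();
    }
}
```

The same structure extends to streaming by swapping TextIO for an unbounded source such as Pub/Sub or Kafka and adding windowing, which is the batch-plus-streaming pattern the platform description mentions.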
Nice to have:
Experience with Google Cloud Platform (GCP) services: GKE, Cloud SQL, Dataflow, Bigtable (a Bigtable write sketch follows this list)
Familiarity with OpenTelemetry, Prometheus, Grafana
Knowledge of Kubernetes, Docker, and Terraform
Messaging/streaming experience with Kafka
UI experience with Vaadin
Exposure to Apache Beam in large-scale data environments
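As a small illustration of the Bigtable familiarity mentioned above, the sketch below writes one row with the google-cloud-bigtable Java client. The project, instance, table, column family, and row key are all placeholders, and the row-key scheme is only one possible design, not anything specified by the posting.

```java
// Hypothetical example only: project, instance, table, column family, and row key
// are placeholders. Assumes the google-cloud-bigtable Java client on the classpath
// and application-default credentials with access to the instance.
import com.google.cloud.bigtable.data.v2.BigtableDataClient;
import com.google.cloud.bigtable.data.v2.models.RowMutation;

public class BigtableWriteExample {
    public static void main(String[] args) throws Exception {
        String projectId = "my-gcp-project";        // placeholder
        String instanceId = "market-data-instance"; // placeholder

        try (BigtableDataClient client = BigtableDataClient.create(projectId, instanceId)) {
            // Row-key design drives read performance in Bigtable; grouping by
            // risk factor and date keeps related observations in adjacent rows.
            String rowKey = "EURUSD#2024-01-31";

            RowMutation mutation = RowMutation.create("risk_factors", rowKey)
                    .setCell("observations", "value", "1.0832")
                    .setCell("observations", "source", "eod-batch");

            client.mutateRow(mutation);
            System.out.println("Wrote row " + rowKey);
        }
    }
}
```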
To learn more about Antal, please visit the company's website.
Unlock Your GCP Data Potential: Insight & Career Growth Guide
Real-Time GCP Data Job Trends in Kraków, Poland (Graphical Representation)
Explore Expertini's real-time, in-depth analysis in the graph below, which shows job-market trends for GCP Data roles in Kraków, Poland: a bar chart of the number of jobs available and a trend line illustrating how that number changes over time. The graph currently shows 1,576 jobs in Poland and 139 in Kraków, highlighting the market share and opportunities available to professionals in GCP Data roles.
Great news! Antal Sp. z o.o. is currently hiring a GCP Data Engineer to join its team. Feel free to download the job details.
Wait no longer! Are you also interested in exploring similar jobs? Search now: GCP Data Engineer Jobs Kraków.
An organization's rules and standards set how people should be treated in the office and how different situations should be handled. The work culture at Antal Sp. z o.o. adheres to the cultural norms outlined by Expertini.
The average salary for GCP Data Engineer jobs in Poland varies, but the pay scale is rated "Standard" in Kraków. Salary levels depend on your industry, experience, and skills, so it's essential to research and negotiate effectively. We advise reading the full job specification before applying so you understand the salary package.
Key qualifications for a GCP Data Engineer typically fall under Computer Occupations, together with the specific skills and expertise listed in the job specification. Be sure to check the listing above for detailed requirements and qualifications.
To improve your chances of being hired as a GCP Data Engineer, consider strengthening the skills listed above and checking your CV/Résumé with our free Resume Scoring Tool. Once your CV/Résumé is uploaded, the built-in tool gives you a matching score for each job, helping you align your CV/Résumé with the job requirements and identify skills worth developing.
Here are some tips to help you prepare for and ace your job interview:
Before the interview: To prepare for your GCP Data Engineer interview at Antal Sp. z o.o., research the company, understand the job requirements, and practice common interview questions.
Highlight your leadership skills, achievements, and strategic thinking abilities. Be prepared to discuss your relevant experience, including your approach to meeting targets as a team player. Additionally, review Antal Sp. z o.o.'s products or services and be prepared to discuss how you can contribute to their success.
By following these tips, you can increase your chances of making a positive impression and landing the job!
Setting up job alerts for GCP Data Engineer is easy with Poland Jobs Expertini. Simply visit our job alerts page here, enter your preferred job title and location, and choose how often you want to receive notifications. You'll get the latest job openings sent directly to your email for FREE!