Data Engineer (Hadoop, GCP)
Capco
CAPCO POLAND
Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.
We are also experts focused on development, automation, innovation, and long-term projects in financial services. At Capco, you can code, write, create, and live at your maximum capabilities without getting dull, tired, or foggy.
We are looking for Data Engineers to support one of our best-in-class banking clients. Our client's Big Data Lake is the largest aggregation of data ever within financial services, with over 300 sources and a rapidly growing book of work. The main aim is to build new analysis approaches and use cases on top of existing data sources and to create a Big Data ecosystem. The work ranges from migrating on-premises systems to the cloud, to building on-premises solutions, to building cloud solutions from scratch.
THINGS YOU WILL DO
- Deliver an ecosystem of curated, enriched, and protected sets of data – created from global, raw, structured, and unstructured sources
- Collect, store, analyze, and leverage data
- Integrate data with the architecture used across the company
- Data Engineering and Management
- Data development process: design, build, and test complex or large-scale data products
- Promote development standards: code reviews, mentoring, testing, and scrum story writing
- Cooperate with customers/stakeholders
TECH STACK: ETL, Hadoop-based analytics (HBase, Hive, MapReduce, Kafka, Spark, BI, databases, etc.), Python, Spark, GCP (Cloud Storage, BigQuery, Pub/Sub, Dataflow), Jenkins, GitHub
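For a flavour of the day-to-day work, below is a minimal PySpark sketch of the kind of curate-and-enrich pipeline described above; the paths, schema, and column names are hypothetical, not taken from the client's environment:

# A minimal PySpark sketch (hypothetical paths and column names)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curated-transactions").getOrCreate()

# Ingest a raw, structured source from the data lake
raw = spark.read.option("header", "true").csv("hdfs:///data/raw/transactions/")

# Curate and enrich: type the columns, drop malformed rows, add a derived field
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["account_id", "amount"])
       .withColumn("ingest_date", F.current_date())
)

# Persist the enriched, protected set, partitioned for downstream analytics
curated.write.mode("overwrite").partitionBy("ingest_date").parquet("hdfs:///data/curated/transactions/")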
SKILLS & EXPERIENCES YOU NEED TO GET THE JOB DONE
- Experience working with data pipeline building technologies: Python, Spark
- Experience with Hadoop eco-system and data management frameworks
- Good knowledge of ETL and data warehouse concepts
- Good knowledge of SQL and relational database design
- Knowledge of GCP (Cloud Storage, BigQuery, Pub/Sub, Dataflow) - see the sketch after this list
- Knowledge of CI/CD, Agile, DevOps, and the Software Development Life Cycle (SDLC)
- Understanding of user requirements and functional specifications
- Understanding of tools and components of Data Architecture
- Excellent communication, interpersonal, and decision-making skills
- Good command of English
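As a minimal illustration of the GCP side of the stack, the sketch below runs an aggregation against BigQuery using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical:

# A minimal BigQuery sketch (hypothetical project, dataset, and table names)
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

# Aggregate a curated table with standard SQL
query = """
    SELECT account_id, SUM(amount) AS total_amount
    FROM `my-gcp-project.curated.transactions`
    GROUP BY account_id
"""
for row in client.query(query).result():
    print(row.account_id, row.total_amount)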
WHY JOIN CAPCO?
- Employment contract or B2B contract, whichever you prefer
- Possibility to work remotely
- Speaking English on a daily basis, mainly with foreign stakeholders and peers
- Multiple employee benefits packages (MyBenefit Cafeteria, private medical care, life insurance)
- Access to a platform with 3,000+ business courses (Udemy)
- Access to required IT equipment
- Paid Referral Program
- Participation in charity events, e.g. Szlachetna Paczka
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
- Being part of the core squad focused on the growth of the Polish business unit
- A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
- A work culture focused on innovation and creating lasting value for our clients and employees
ONLINE RECRUITMENT PROCESS STEPS
- Screening call with the Recruiter
- Technical/competency interview with a Capco Hiring Manager
- Culture-fit interview with the Head of Engineering
- Feedback/Offer