
Data Engineer (Hadoop, GCP)



Data Science
Posted on Thursday, May 11, 2023


Capco Poland is a global technology and management consultancy specializing in driving digital transformation across the financial services industry. We are passionate about helping our clients succeed in an ever-changing industry.

We are also experts focused on development, automation, innovation, and long-term projects in financial services. At Capco, you can code, write, create, and live at your maximum capabilities without getting dull, tired, or foggy.

We are looking for Data Engineers to support one of our best-in-class clients from banking. Our client's Big Data Lake is the largest aggregation of data ever assembled within financial services, with over 300 sources and a rapidly growing book of work. The main aim is to build new analysis approaches and use cases based on existing data sources and to create a Big Data ecosystem. Sometimes this means migrating from on-premises to the cloud, sometimes building on-premises solutions, and sometimes delivering standalone cloud solutions.


RESPONSIBILITIES:

  • Deliver an ecosystem of curated, enriched, and protected sets of data – created from global, raw, structured, and unstructured sources
  • Collect, store, analyze, and leverage data
  • Integrate data with the architecture used across the company
  • Data Engineering and Management
  • Data development process: design, build and test data products that are complex or large-scale
  • Promote development standards, code reviews, mentoring, testing, scrum story writing
  • Cooperate with customers/stakeholders

TECH STACK: ETL, Hadoop-based analytics (HBase, Hive, MapReduce, Kafka, Spark, BI, databases, etc.), Python, Spark, GCP (Cloud Storage, BigQuery, Pub/Sub, Dataflow), Jenkins, GitHub


REQUIREMENTS:

  • Experience with data pipeline building technologies: Python, Spark
  • Experience with Hadoop eco-system and data management frameworks
  • Good knowledge of ETL and Data warehouse concepts
  • Good knowledge of SQL and relational database design
  • Knowledge of GCP (Cloud Storage, BigQuery, Pub/Sub, Dataflow)
  • Knowledge of CI/CD, Agile, DevOps, and the Software Development Life Cycle (SDLC)
  • Understanding of user requirements and functional specifications
  • Understanding of tools and components of Data Architecture
  • Excellent communication, interpersonal, and decision-making skills
  • Good command of English


WE OFFER:

  • Employment contract or B2B (business-to-business) contract – whichever you prefer
  • Possibility to work remotely
  • Speaking English on a daily basis, mainly with foreign stakeholders and peers
  • Multiple employee benefits packages (MyBenefit Cafeteria, private medical care, life-insurance)
  • Access to a platform with 3,000+ business courses (Udemy)
  • Access to required IT equipment
  • Paid Referral Program
  • Participation in charity events e.g. Szlachetna Paczka
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • Being part of the core squad focused on the growth of the Polish business unit
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A work culture focused on innovation and creating lasting value for our clients and employees


RECRUITMENT PROCESS:

  • Screening call with the Recruiter
  • Technical/Competencies interview with Capco Hiring Manager
  • Culture-fit interview with the Head of Engineering
  • Feedback/Offer