Job Title: Big Data Engineer (GCP)
Location: Phoenix, AZ (Onsite/Hybrid)
Employment Type: Contract
Experience Level: Experienced
Duration:
Job Summary
We are seeking an experienced Big Data Engineer with strong expertise in Google Cloud Platform (GCP) to design, build, and optimize scalable data pipelines and analytics solutions. The ideal candidate has hands-on experience with BigQuery and other GCP data services, and will collaborate closely with data scientists, architects, and business stakeholders to deliver high-performance, reliable data systems.
Key Responsibilities
Data Engineering & Pipeline Development
- Design, develop, and maintain scalable data pipelines using GCP services.
- Build efficient ETL/ELT processes for structured and unstructured data.
- Ensure data quality, integrity, and availability across systems.
GCP & Big Data Technologies
- Work extensively with BigQuery, Dataflow, and Dataproc for data processing and analytics.
- Optimize BigQuery queries for performance and cost efficiency.
- Leverage GCP-native tools for scalable and resilient data architectures.
Programming & Processing
- Develop data processing solutions using Python, Java, or Scala.
- Implement batch and real-time data processing frameworks.
Workflow Orchestration & Automation
- Design and manage workflows using Airflow or Cloud Composer.
- Automate data pipelines and integrate with CI/CD processes.
Collaboration & Delivery
- Partner with data scientists, analysts, and business teams to understand requirements.
- Participate in Agile ceremonies and contribute to sprint deliverables.
- Ensure timely delivery of high-quality data solutions.
Required Qualifications
- 7+ years of experience in Big Data Engineering.
- Strong hands-on experience with GCP services (BigQuery, Dataflow, Dataproc).
- Proficiency in Python, Java, or Scala for data engineering.
- Strong SQL skills with experience in query optimization.
- Experience with workflow orchestration tools (Airflow/Composer).
- Familiarity with Agile methodologies and CI/CD practices.
- Strong problem-solving and analytical skills.
Nice to Have
- Experience with real-time streaming (Pub/Sub, Kafka).
- Knowledge of data warehousing and data lake architectures.
- Exposure to data governance and security best practices.