Data Engineer - GCP Job at A2C, Alpharetta, GA

  • A2C
  • Alpharetta, GA

Job Description

Exciting contract opportunity for a Data Engineer in Alpharetta, GA. This role is hybrid, with three days per week in the office (Monday, Tuesday, and Thursday). As a Data Engineer you will design and execute vital projects such as re-platforming data services in the cloud and on-premises and delivering real-time streaming capabilities to business applications, bringing a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management. Our client's mission is to design and implement a data and analytics platform and infrastructure that enables a future-state analytics lifecycle: data monetization opportunities, data acquisition, analysis and feature engineering, model training, impact analysis, reporting, predictive and quantitative analysis, and monitoring.

What You Get to Do:

  • An ideal candidate is intellectually curious, has a solution-oriented attitude, and enjoys learning new tools and techniques.
  • You will have the opportunity to design and execute vital projects such as re-platforming our data services on Cloud and on-prem, and delivering real-time streaming capabilities to our business applications.
  • Bring a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management.
  • Hold accountability for the quality, usability, and performance of the solutions.
  • Lead design sessions and code reviews to elevate the quality of engineering across the organization.
  • Design and develop the data foundation on a cloud data platform using GCP tools and techniques, e.g. Pub/Sub, BigQuery, Cloud SQL, Bigtable, BigLake, Dataform, Dataflow, Datastream, Google Cloud Storage, Cloud Composer/DAGs, Cloud Run, Cloud REST APIs, Azure DevOps (ADO) Git repos, CI/CD pipelines, Secret Manager, Cloud IAM, and Terraform/YAML.
  • Build scalable ETL pipelines using Python.
  • Perform multi-level data curation and modeling.
  • Own data design and architecture.
  • Build and maintain complete CI/CD pipelines using Azure DevOps and Terraform/Terragrunt.
  • Increase the efficiency and speed of complicated data processing systems.
  • Collaborate with the Architecture group to recommend and ensure optimal data architecture.
  • Analyze data gathered during tests to identify strengths and weaknesses of ML models.
  • Collaborate across all functional areas to translate complex business problems into optimal data modeling and analytical solutions that drive business value.
  • Lead the improvements and advancement of reporting and data capabilities across the company, including analytics skills, data literacy, visualization, and storytelling.
  • Develop a certified vs. self-service analytics framework for the organization.
  • Be highly skilled in RDBMS (Oracle, SQL Server), NoSQL databases, and messaging (publish/subscribe) systems.
  • Apply extensive Python knowledge and coding skills, including an understanding of data modeling and data engineering.
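The Python ETL work described above can be sketched as a minimal transform step. The record shape and field names below are hypothetical, purely for illustration; a production pipeline in this stack would typically consume messages from Pub/Sub and load the normalized rows into BigQuery via the Google Cloud client libraries.

```python
# Minimal sketch of the "T" in a Python ETL pipeline: normalize a raw
# event record into a warehouse-ready row. Field names are assumptions.
from datetime import datetime, timezone


def transform(record: dict) -> dict:
    """Normalize one raw event record before loading it downstream."""
    return {
        "event_id": record["id"],
        # Coerce string amounts to a rounded float for consistency.
        "amount_usd": round(float(record.get("amount", 0)), 2),
        # Convert a Unix timestamp to an ISO-8601 UTC string.
        "event_ts": datetime.fromtimestamp(
            record["ts"], tz=timezone.utc
        ).isoformat(),
    }


raw = {"id": "evt-1", "amount": "12.499", "ts": 1700000000}
print(transform(raw))
```

In a real pipeline this function would sit between the extract step (a Pub/Sub subscriber or a Dataflow source) and the load step (a BigQuery insert), with schema validation and dead-lettering around it.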

What You Bring to the Table:

  • Bachelor’s degree in computer science, Engineering, Mathematics, Sciences, or related field of study from an accredited college or university; will consider a combination of experience and/or education.
  • Ideally 5+ years of experience developing data and analytics solutions, and approximately 4+ years of data modeling and architecture experience.
  • Expertise in programming languages including Python and SQL.
  • Familiarity with software development methodologies such as Agile or Scrum.
  • Critical thinking skills.
  • Experience leveraging cloud-native services for data processing and storage:
  • Storage – BigQuery, GCS, Cloud SQL, BigTable, BigLake
  • Event processing – Pub/Sub, EventArc
  • Data pipeline and analytics – Dataflow, Dataform, Cloud Run, Cloud Run functions, Datastream, Cloud Scheduler, Workflows, Composer, Dataplex, ADO Git repos, CI/CD pipelines, Terraform/YAML
  • Security – Secret Manager, Cloud IAM
  • Others – Artifact Registry, Cloud Logging, Cloud Monitoring
  • Experience with distributed data processing frameworks such as Spark.
  • Strong knowledge of database systems and data modeling techniques.
  • Ability to adapt to evolving technologies and business requirements.
  • Ability to explain technical concepts to nontechnical business leaders.
  • Monitor system performance and troubleshoot issues.
  • Ensure data security.
  • Proficiency with the relevant cloud tools and technologies.

Technical Skills:

Must Have:

  • GCP (Google Cloud Platform)
  • ETL pipelines using Python
  • Expertise in programming languages, including Python and SQL

Got Extra to Bring?

  • GCP – Professional Data Engineer Certification
  • Documenting all steps in the development process
  • Manage the data collection process providing interpretation and recommendations to management

Job Tags

Contract work, 3 days per week in the office
