Data Engineer - GCP Job at A2C, Alpharetta, GA

  • A2C
  • Alpharetta, GA

Job Description

Exciting contract opportunity for a Data Engineer in Alpharetta, GA. In this role you will work 3 days/week in the office (Monday, Tuesday, Thursday). As a Data Engineer you will design and execute vital projects such as re-platforming our data services on cloud and on-prem and delivering real-time streaming capabilities to our business applications. You will bring a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management. Our client’s mission is to design and implement a data and analytics platform and infrastructure that enables a future-state analytics lifecycle: data monetization opportunities, data acquisition, analysis and feature engineering, model training, impact analysis, reporting, predictive and quantitative analysis, and monitoring.

What You Get to Do:

  • An ideal candidate is intellectually curious, has a solution-oriented attitude, and enjoys learning new tools and techniques.
  • You will have the opportunity to design and execute vital projects such as re-platforming our data services on Cloud and on-prem, and delivering real-time streaming capabilities to our business applications.
  • Bring a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management.
  • Hold accountability for the quality, usability, and performance of solutions.
  • Lead design sessions and code reviews to elevate the quality of engineering across the organization.
  • Design and develop the data foundation on a cloud data platform using GCP tools and services, e.g. Pub/Sub, BigQuery, Cloud SQL, Bigtable, BigLake, Dataform, Dataflow, Datastream, Cloud Storage, Cloud Composer (DAGs), Cloud Run, REST APIs, Azure DevOps Git repos, CI/CD pipelines, Secret Manager, Cloud IAM, and Terraform/YAML.
  • Build scalable ETL pipelines using Python.
  • Perform multi-level data curation and modeling.
  • Own data design and architecture.
  • Build and maintain complete CI/CD pipelines hands-on using Azure DevOps and Terraform/Terragrunt.
  • Increase the efficiency and speed of complex data processing systems.
  • Collaborate with our Architecture group to recommend and ensure optimal data architecture.
  • Analyze data gathered during tests to identify strengths and weaknesses of ML models.
  • Collaborate across all functional areas to translate complex business problems into optimal data modeling and analytical solutions that drive business value.
  • Lead the improvements and advancement of reporting and data capabilities across the company, including analytics skills, data literacy, visualization, and storytelling.
  • Develop a certified vs. self-service analytics framework for the organization.
  • Highly skilled in RDBMS (Oracle, SQL Server), NoSQL databases, and messaging (publish/subscribe) systems.
  • Extensive Python knowledge and coding skills, including an understanding of data modeling and data engineering.
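To make the "multi-level data curation" and "ETL pipelines using Python" responsibilities above concrete, here is a minimal sketch of a raw → cleaned → curated pipeline. It is illustrative only: the record shape, function names, and aggregation are hypothetical assumptions, not part of this posting, and a real implementation would read from and write to GCP services such as Cloud Storage and BigQuery rather than in-memory lists.

```python
import json
from dataclasses import dataclass

# Hypothetical raw input; in practice this would come from e.g. Cloud Storage.
RAW_EVENTS = [
    '{"user_id": "u1", "amount": "12.50", "ts": "2024-01-01"}',
    '{"user_id": "u2", "amount": "bad",   "ts": "2024-01-01"}',  # malformed amount
    '{"user_id": "u1", "amount": "7.25",  "ts": "2024-01-02"}',
]

@dataclass
class CleanEvent:
    user_id: str
    amount: float
    ts: str

def extract(lines):
    """Raw level: parse JSON, skipping undecodable rows."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue

def transform(records):
    """Cleaned level: enforce types, dropping rows that fail validation."""
    for rec in records:
        try:
            yield CleanEvent(rec["user_id"], float(rec["amount"]), rec["ts"])
        except (KeyError, ValueError):
            continue

def load(events):
    """Curated level: aggregate per user (stand-in for a warehouse write)."""
    totals = {}
    for ev in events:
        totals[ev.user_id] = totals.get(ev.user_id, 0.0) + ev.amount
    return totals

totals = load(transform(extract(RAW_EVENTS)))
print(totals)  # the malformed "u2" row is dropped at the cleaned level
```

Each stage is a generator, so records stream through without materializing intermediate levels; a production pipeline would typically express the same stages in Dataflow or Cloud Composer tasks.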

What You Bring to the Table:

  • Bachelor’s degree in computer science, Engineering, Mathematics, Sciences, or related field of study from an accredited college or university; will consider a combination of experience and/or education.
  • Ideally 5+ years of experience developing data and analytics solutions, and 4+ years of data modeling and architecture experience.
  • Expertise in programming languages including Python and SQL.
  • Familiarity with software development methodologies such as Agile and Scrum.
  • Critical thinking.
  • Leveraging cloud-native services for data processing and storage.
  • Storage – BigQuery, GCS, Cloud SQL, Bigtable, BigLake
  • Event processing – Pub/Sub, Eventarc
  • Data pipeline and analytics – Dataflow, Dataform, Cloud Run, Cloud Run functions, Datastream, Cloud Scheduler, Workflows, Composer, Dataplex, Azure DevOps Git repos, CI/CD pipelines, Terraform/YAML
  • Security – Secret Manager, Cloud IAM
  • Others – Artifact Registry, Cloud Logging, Cloud Monitoring
  • Work with distributed data processing frameworks like Spark.
  • Strong knowledge of database systems and data modeling techniques.
  • Ability to adapt to evolving technologies and business requirements.
  • Ability to explain technical concepts to nontechnical business leaders.
  • Monitor system performance and troubleshoot issues.
  • Ensure data security.
  • Proficiency in technical skills, cloud tools and technologies.

Technical Skills:

  • Must Have
  • GCP (Google Cloud Platform)
  • ETL pipelines using Python
  • Expertise in programming languages including Python and SQL.

Got Extra to Bring?

  • GCP – Professional Data Engineer Certification
  • Documenting all steps in the development process
  • Manage the data collection process providing interpretation and recommendations to management

Job Tags

Contract work, 3 days per week
