Data Engineer w/ Google Cloud Platform
Hybrid in Alpharetta, GA, US • Posted 60+ days ago • Updated 24 days ago

A2C Consulting
Job Details
Skills
- Google Cloud Platform
- GCP
- Data Engineer
Job Summary:
- As a Data Engineer, you will have the opportunity to design and execute vital projects such as re-platforming our data services on Cloud and on-prem and delivering real-time streaming capabilities to our business applications.
- This position brings a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management.
- The Data Engineer holds accountability for the quality, usability, and performance of the solutions.
- Our mission is to design and implement a data and analytics platform and infrastructure that enables a future-state analytics lifecycle, data monetization opportunities, data acquisition, analysis and feature engineering, model training, impact analysis, reporting, predictive and quantitative analysis, and monitoring.
What You Get to Do:
- An ideal candidate is intellectually curious, has a solution-oriented attitude, and enjoys learning new tools and techniques.
- You will have the opportunity to design and execute vital projects such as re-platforming our data services on Cloud and on-prem, and delivering real-time streaming capabilities to our business applications.
- Bring a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management.
- Hold accountability for the quality, usability, and performance of the solutions.
- Lead design sessions and code reviews to elevate the quality of engineering across the organization.
- Design and develop the data foundation on a cloud data platform using Google Cloud Platform tools and techniques, e.g. Pub/Sub, BigQuery, Cloud SQL, Bigtable, BigLake, Dataform, Dataflow, Datastream, Cloud Storage, Cloud Composer (DAGs), Cloud Run, Cloud REST APIs, Azure DevOps (ADO) Git repositories, CI/CD pipelines, Secret Manager, Cloud IAM, and Terraform/YAML.
- Build ETL pipelines and scalable solutions using Python (a minimal sketch follows this list).
- Multi-level data curation and modeling.
- Data design and architecture.
- Hands-on experience building and maintaining complete CI/CD pipelines using Azure DevOps and Terraform/Terragrunt.
- Increase the efficiency and speed of complicated data processing systems.
- Collaborate with our Architecture group to recommend and ensure the optimal data architecture.
- Analyze data gathered during tests to identify the strengths and weaknesses of ML models.
- Collaborate across all functional areas to translate complex business problems into optimal data modeling and analytical solutions that drive business value.
- Lead the improvement and advancement of reporting and data capabilities across the company, including analytics skills, data literacy, visualization, and storytelling.
- Develop a certified vs. self-service analytics framework for the organization.
- Highly skilled in RDBMS (Oracle, SQL Server), NoSQL databases, and messaging (publish/subscribe) systems.
- Extensive Python coding skills, including an understanding of data modeling and data engineering.
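As a concrete illustration of the Python ETL work described above, here is a minimal sketch that loads a CSV file from Cloud Storage into a BigQuery table using the google-cloud-bigquery client library. The project, bucket, dataset, and table names are placeholders, and schema auto-detection is assumed for brevity.

```python
from google.cloud import bigquery

# Placeholder identifiers -- replace with real project, bucket, dataset, and table names.
PROJECT_ID = "my-gcp-project"
SOURCE_URI = "gs://my-raw-bucket/orders/orders.csv"
TABLE_ID = "my_dataset.orders"

client = bigquery.Client(project=PROJECT_ID)

# Load configuration: CSV with a header row, schema auto-detected,
# destination table replaced on each run.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
load_job.result()  # block until the load job completes

table = client.get_table(TABLE_ID)
print(f"Loaded {table.num_rows} rows into {TABLE_ID}")
```

In practice a load like this would typically be wrapped in a Cloud Composer DAG or a Cloud Run job and parameterized per source, but the basic pattern stays the same.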
What You Bring to the Table:
- Bachelor's degree in Computer Science, Engineering, Mathematics, Sciences, or a related field of study from an accredited college or university; a combination of experience and/or education will be considered.
- Ideally 3+ years of experience developing data and analytics solutions and approximately 4+ years in data modeling and architecture.
- Expertise in programming languages including Python and SQL.
- Familiarity with software development methodologies such as Agile or Scrum.
- Critical thinking.
- Leveraging cloud-native services for data processing and storage (a minimal Pub/Sub subscriber sketch follows this list):
- Storage: BigQuery, GCS, Cloud SQL, Bigtable, BigLake
- Event processing: Pub/Sub, Eventarc
- Data pipeline and analytics: Dataflow, Dataform, Cloud Run, Cloud Run functions, Datastream, Cloud Scheduler, Workflows, Composer, Dataplex, Azure DevOps (ADO) Git repositories, CI/CD pipelines, Terraform/YAML
- Security: Secret Manager, Cloud IAM
- Others: Artifact Registry, Cloud Logging, Cloud Monitoring
- Work with distributed data processing frameworks like Spark.
- Strong knowledge of database systems and data modeling techniques.
- Ability to adapt to evolving technologies and business requirements.
- Ability to explain technical concepts to nontechnical business leaders.
- Monitor system performance and troubleshoot issues.
- Ensure data security.
- Proficiency in technical skills, cloud tools and technologies.
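To make the event-processing expectations above more tangible, the sketch below shows a minimal Pub/Sub streaming-pull subscriber built with the google-cloud-pubsub client library. The project and subscription names are placeholders, and the callback simply prints and acknowledges each message.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

# Placeholder identifiers -- replace with a real project and subscription.
PROJECT_ID = "my-gcp-project"
SUBSCRIPTION_ID = "orders-events-sub"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # A real pipeline would parse, validate, and route the payload here
    # (e.g., to BigQuery or Cloud Storage) before acknowledging.
    print(f"Received: {message.data.decode('utf-8')}")
    message.ack()


streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

with subscriber:
    try:
        # Block for a bounded time; a long-running service would omit the timeout.
        streaming_pull_future.result(timeout=60)
    except TimeoutError:
        streaming_pull_future.cancel()  # stop the background pull threads
        streaming_pull_future.result()  # wait for shutdown to complete
```

The same consumer logic can also be expressed as a Dataflow (Apache Beam) streaming pipeline when windowing or higher-throughput delivery into BigQuery is required.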
Got Extra to Bring?
- Google Cloud Platform Professional Data Engineer Certification
- Ideally 2+ years of experience in the energy sector.
- Document all steps in the development process.
- Manage the data collection process, providing interpretation and recommendations to management.
Dice Id: 10275036
Position Id: 8597641
Company Info
About A2C Consulting
We are technology professionals who have been in the industry for over 25 years and have built a sound reputation for delivering on our commitments. We sold our former successful staffing and consulting practice in the early 2000s when we realized the world was changing and that our ability to find and attract talent was at the core of what is needed in today's highly competitive technology landscape.
We created A2C with the belief that a new strategy would result in the best outcome for both our customers and our new organization. Our goal has been to meet our customers’ newest technology needs and provide them with the ability to quickly and effectively ramp up to deliver expertise in both the cloud and data spaces.

