Our client in Dearborn, MI is looking for a Data Engineer General. This is a contract position.
What You Will Do:
The GDIA Data Factory Platform covers all business processes and technical components involved in ingesting a wide range of enterprise data into the GDIA Data Factory (Data Lake) and transforming that data into consumable data sets in support of analytics. The Data Factory Enablement Team, as the name suggests, enables teams to build their solutions on the GCP Data Factory Platform by providing tools, guidelines, processes, and support. We are looking for candidates who have a broad set of technology skills across multiple areas and come from a DevOps background, with exposure to infrastructure and solution monitoring. This person will be expected to provide consultative services to the Software Development and Data Engineering teams.
Key responsibilities include:
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for the successful deployment of our client’s Data Platform.
- Implement methods for standardization of all parts of the pipeline to maximize data usability and consistency.
- Test and compare competing solutions and report out a point of view on the best solution.
- Design and build CI/CD pipelines for Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, dbt, and others.
- Work with stakeholders, including the Analytics, Product, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Develop IaC Tekton pipelines to execute pattern playbooks and templates.
- Design cloud performance and monitoring strategies.
- Design and implement workflows to automate the infrastructure release and upgrade process for applications in Dev, UAT, and Production environments.
- Mentor engineers across multiple sprint teams and grow their technical skills by giving high-quality feedback in design and code reviews and by providing training on new methods, tools, and patterns.
What You Will Need:
- Bachelor's degree in Computer Science, Engineering, or an equivalent field.
- In-depth understanding of GCP product technology and underlying architectures.
- Strong experience with development ecosystem tools such as Git, GCP Cloud Build, Terraform, and Tekton for CI/CD.
- Experience in working with Agile and Lean methodologies.
- Experience with GCP services such as BigQuery and GCS.
- Good communication skills, as the candidate will work with multiple Data Factory teams.
- At least 2 years of Tekton experience.
- At least 5 years of Terraform experience, or 3 years with a Terraform certification.
- At least 3 years of experience with Google Cloud, plus the Google Cloud Professional Cloud Architect certification.
- Python programming experience.
- HashiCorp Certified: Terraform Associate.
- Google Cloud Certified Associate Cloud Engineer.
How You Will Be Successful:
- Envision the Future
- Communicate Honestly and Broadly
- Seek Technology and Business “First”
- Embrace Diversity and Take Risks
What We Offer:
- Competitive Salary
- Comprehensive Benefit Package
- 401(k) with matching contributions
- Paid Time Off
- Employee Discounts
- Free training on all Altair products
To apply, please visit phh.tbe.taleo.net.