Sr. AWS Data Engineer with ETL Development || Livonia, MI (Hybrid) - Locals Only || Contract

C2C

Hi,

Hope you are doing well.

Please find the job description below and let me know if you are interested.

Position: Sr. AWS Data Engineer with ETL Development

Location: Livonia, MI (Hybrid) - Locals Only

Duration: 12+ Months

Job Description:

Responsibilities: 

  • Translate business needs into ETL and report requirements, and provide expert guidance on the usage and interpretation of data.
  • Diagram complex technical processes and translate abstract problems into discrete technology actions.
  • Design and implement data pipelines with DBT, Python, and GitHub using software and data engineering best practices to ensure that data is accurate, reliable, and timely.
  • Enhance development and implementation environments by adding new and custom DBT libraries and writing or modifying Python and shell scripts.
  • Monitor daily job execution and fix issues as they arise to ensure SLAs are met with internal stakeholders.
  • Enhance data warehouse design using data modeling best practices and identify gaps in our data collection capabilities. 
  • Collaborate with product, engineering, governance, and security teams on requirements for data capture, retention, and security.
  • Develop reports and dashboards to answer key business questions clearly, accurately, and with minimal latency.
  • Work with Cloud Engineering to configure Docker images.
  • Coordinate with the Data Platform Engineering, Enterprise Data Analyst, Data Governance, and IT teams to align with change management processes and to ensure performance, processes, and access are optimal for our reporting solutions.

Requirements: 

  • Bachelor’s degree in Engineering, Computer Science, Math, or a related technical field is required.
  • A minimum of 7 years’ experience in data engineering and analytics, with a focus on ETL development, data modeling, and data analysis.
  • Expertise in SQL and ETL optimization techniques, especially within cloud-based data warehouses like Snowflake, BigQuery, and Redshift, as well as use of command line and version control software (git).
  • Experience with cloud-based data storage solutions such as AWS S3, Azure Blob Storage, or Google Cloud Storage.
  • Proficiency in Python, including the pandas and NumPy libraries.
  • Ability to leverage tools, business intuition, and attention to detail for data validation and QA.
  • Excellent communication skills, particularly when explaining technical or complex matters to less technical co-workers.
  • A high degree of motivation to be proactive and go above and beyond the task at hand.
  • Understanding of major marketing channels, such as TV, direct mail, social media, and search.
  • Familiarity with data schemas and logging of common marketing platforms, such as Salesforce, Adobe Campaign, and Google Analytics.
  • Experience mentoring and guiding junior data engineers and analysts.

Please share your updated resume and let me know the best number and time to reach you.

Thanks & Regards,


Raveena Mourya
US IT Recruiter, DMS Visions Inc

972-452-6160/972-325-9476  |  dmsvisions.com/  |  raveena@dmsvisions.com

4645 Avon Lane, Suite 210, Frisco, TX 75033

linkedin.com/in/raveena-mourya-766314250

