Urgent requirement: Data Engineer with Epic (Remote)

Employment Type: C2C

Job Title: Data Engineer with Epic

Location: Remote

Duration: 1+ Year

Job Description:

(Should have Epic experience and have worked on Truveta feeds.)

• Extensive experience in building scalable data engineering frameworks and applications on the Azure cloud platform.

• Should have US healthcare domain experience, primarily in the provider domain.

• Should have built Truveta feed pipelines from Epic Clarity.

• Should have experience working with Epic Clarity and be Epic certified.

• Key technologies include Databricks and Airflow for orchestration, alongside proficiency in implementing CI/CD pipelines using Azure DevOps.

• Design and develop reusable, modular, and scalable frameworks in Python for data engineering and data processing tasks.

• Python application developer with a strong focus on framework development for data engineering.

• Implement and maintain frameworks that facilitate the ingestion, transformation, and processing of large datasets using Azure services.

• Develop and optimize data engineering pipelines using Azure Function Apps, Databricks, and Airflow (illustrative sketches follow this list).

• Architect and deploy serverless solutions using Azure Function Apps to support data engineering frameworks.

• Develop and manage CI/CD pipelines using Azure DevOps for continuous integration and deployment of data engineering frameworks.

• Implement automated testing, deployment, and monitoring processes to ensure high-quality code and rapid iteration.
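
To illustrate the orchestration work described above, here is a minimal sketch of an Airflow DAG that submits a Databricks notebook run. It is an assumed example, not part of this posting: the DAG id, schedule, cluster spec, notebook path, and connection id are all placeholders.

    # Minimal sketch, assumed names only: an Airflow DAG that submits a Databricks
    # notebook run. The DAG id, schedule, cluster spec, notebook path, and
    # connection id are placeholders, not details from this posting.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

    with DAG(
        dag_id="clarity_to_truveta_feed",        # placeholder DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Submit a one-time Databricks run that executes a transformation notebook.
        transform_feed = DatabricksSubmitRunOperator(
            task_id="transform_clarity_extract",
            databricks_conn_id="databricks_default",  # assumes a configured Airflow connection
            json={
                "new_cluster": {
                    "spark_version": "13.3.x-scala2.12",
                    "node_type_id": "Standard_DS3_v2",   # placeholder Azure VM size
                    "num_workers": 2,
                },
                "notebook_task": {
                    "notebook_path": "/Repos/data-eng/transform_clarity"  # placeholder path
                },
            },
        )

Running this sketch assumes the apache-airflow-providers-databricks package is installed and a Databricks connection is configured in Airflow.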
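
Similarly, for the serverless piece, a minimal sketch of an HTTP-triggered Azure Function using the Python v2 programming model, as one possible entry point into such a pipeline; the route name, query parameter, and response text are illustrative assumptions.

    # Minimal sketch, assumed names only: an HTTP-triggered Azure Function
    # (Python v2 programming model) that could serve as an entry point for a
    # data ingestion step. The route, parameter, and messages are placeholders.
    import azure.functions as func

    app = func.FunctionApp()

    @app.route(route="ingest", auth_level=func.AuthLevel.FUNCTION)
    def ingest(req: func.HttpRequest) -> func.HttpResponse:
        # Read a query parameter naming the source feed (placeholder parameter).
        source = req.params.get("source", "unknown")
        # A real pipeline would enqueue or trigger downstream processing here.
        return func.HttpResponse(
            f"Ingestion request accepted for source: {source}", status_code=202
        )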


From:
Praveen Kumar,
Magicforce
praveen@magicforce.us
Reply to: praveen@magicforce.us