Big Data Developer: AZ

C2C
  • Anywhere
Role name: Developer
Role Description:
  • 5-7 years of experience in data technologies: modeling, data warehousing, data marts, ETL pipelines, and data visualization.
  • Hands-on experience with Big Data, data lakes, Databricks, Hadoop, and Hive.
  • Cloud experience will be an added advantage, particularly with GCP.
  • Experience working with relational databases such as PostgreSQL, MySQL, SQL Server, AWS RDS, and BigQuery.
  • Experienced with Docker, Kubernetes, and containerization.
  • Extensive experience setting up CI/CD pipelines using tools such as Jenkins, Bitbucket, GitHub, Maven, SVN, and Azure DevOps.
Competencies: Digital: BigData and Hadoop Ecosystems; Digital: BigData and Hadoop Ecosystem – MapR
Experience (Years): 6-8
Essential Skills: As listed in the Role Description above.
Desirable Skills: As listed in the Role Description above.
Country: United States
Branch | City | Location: TCS – Phoenix, AZ


From:
Chandra N,
Siri Info
chandra.n@siriinfo.com
Reply to: chandra.n@siriinfo.com