Senior Data Architect PySpark

C2C

Title: Senior Data Architect – PySpark

Location: Charlotte NC (onsite)

 

JD: The ideal candidate will have 10+ years of experience in software architecture, data engineering, and large-scale data processing systems, with a strong focus on PySpark. Experience in the Finance Technology or Enterprise Functions technology domains is a significant advantage. This role requires a leader with a strategic mindset who can design, implement, and oversee high-performance, distributed data processing systems.

 

Key Responsibilities:

  • Lead the architecture, design, and implementation of large-scale distributed data systems using PySpark.
  • Collaborate with business stakeholders, technology teams, and data engineers to gather requirements, define objectives, and build scalable data pipelines.
  • Drive end-to-end solution design, including data acquisition, storage, processing, and analysis.
  • Optimize performance of big data processing systems, ensuring low-latency and high-throughput data flows.
  • Ensure alignment with industry best practices and compliance standards in data security and privacy.
  • Mentor and guide a team of developers and engineers, promoting best practices in coding, architecture, and design patterns.
  • Evaluate new tools and technologies, identifying opportunities for innovation and driving their implementation.
  • Collaborate closely with cross-functional teams, including finance and enterprise functions, to ensure solutions meet business objectives.
  • Support critical decision-making and roadmapping to enhance the organization’s data processing capabilities.

Qualifications:

  • 10+ years of experience in software architecture, with a strong focus on PySpark and big data processing systems.
  • Proficient in Apache Spark, Hadoop, and other distributed computing frameworks.
  • Deep understanding of data architecture, ETL/ELT processes, and cloud-based data platforms.
  • Proven experience in Finance Technology or Enterprise Functions is highly desirable.
  • Strong knowledge of relational databases, NoSQL databases, and data warehousing solutions.
  • Solid experience in working with cloud platforms such as AWS, Azure, or Google Cloud.
  • Hands-on experience with streaming technologies like Kafka and real-time data processing.
  • Excellent problem-solving skills, strategic thinking, and a results-oriented approach.
  • Proven leadership abilities, with experience managing technical teams in a fast-paced environment.
  • Strong communication skills, capable of presenting ideas clearly to both technical and non-technical stakeholders.

Preferred Skills:

  • Experience with financial systems or enterprise applications.
  • Familiarity with machine learning frameworks and AI-driven data insights.
  • Experience with DevOps practices and CI/CD pipelines in data engineering.

 

Thanks,

Barla Santosh

Technical Recruiter

E: sbarla@gacsol.com

www.gacsol.com


‘Experts in Digitalization and Engineering – Enterprise 4.0’

 
