Experience: 4-6 years
Job description:
Must-have skills:
- 4+ years of experience in PySpark, or in Python/Scala with Spark
- Good exposure to Google Cloud (mandatory)
- Mandatory experience in SQL
- Knowledge of GCP & BigQuery is mandatory
- Exposure to API schemas such as REST or GraphQL is an added advantage

Responsibilities:
- 4+ years of experience as Data Engineer  
 
- Cloud certification (GCP certification is an added plus)  
 
- Must-have criteria (absence of these is a disqualifier at resume level): 
 
- Apache Spark: proven experience with big data processing.  
 
- SQL: strong ability to write complex queries for data manipulation and analysis.  
 
- Programming language: expertise in at least one of Scala (highly preferred) or Python.  
 
- Excellent analytical and problem-solving skills; we expect engineers to work with minimal or no support.  
 
- We consider resumes strongest when candidates have:  
 
- Experience with Spark SQL, DataFrames, and Datasets  

- Familiarity with distributed systems concepts  
 
- Experience with big data processing tools and techniques  
 
- Cloud technologies: experience with Google Cloud Platform (GCP), particularly BigQuery (BQ) for data warehousing and analytics.  
 
- Knowledge of AI/ML is an added plus.