AWS data engineering with experience of S3, Lambda, Boto3, Serverless EMR, Serverless Redshift, PySpark, Step Functions
₹400-750 INR / hour
Closed
Posted 10 months ago
I am looking for an experienced AWS data engineer who can assist me with Serverless Redshift and PySpark. I do not need help setting up automation, but I may require assistance running analytics on the data. The ideal candidate should have experience with the following:
- Serverless Redshift
- PySpark
Skills and experience required for this project:
- Strong knowledge of AWS services, particularly Serverless Redshift and PySpark
- Experience in data engineering and analytics
- Familiarity with S3, Lambda, Boto3, and Step Functions would be a plus
- Ability to work independently and efficiently
- Excellent problem-solving and communication skills
Working time = 8:30 PM EST to 10:30 PM EST (6 AM IST to 8 AM IST)
Duration = 3 to 6 months
I understand that you are looking for an experienced AWS data engineer to assist with Serverless Redshift and PySpark. My background in data engineering and analytics is a strong match for this project.
I am available from 8:30 PM to 10:30 PM EST (6 AM to 8 AM IST) for the full 3-to-6-month duration. My skills include strong knowledge of AWS services, particularly Serverless Redshift and PySpark.
I believe I would be a good fit for this project given my experience in data engineering and analytics and my familiarity with S3, Lambda, Boto3, and Step Functions.