Transition of legacy ETLs with Java and Hive queries to Spark ETLs, MapReduce, Spark

Closed Posted 2 years ago Paid on delivery

Hi all,

Looking for support on the skill set below:

Transition of legacy ETLs with Java and Hive queries to Spark ETLs.

Design and develop data processing solutions and custom ETL pipelines for varied data formats like Parquet and Avro.

Design, develop, test, and release ETL mappings, mapplets, and workflows using StreamSets, Java MapReduce, Spark, and SQL.

Let me know if you have experience in these areas.

Java Spark Informatica Powercenter ETL Amazon Web Services Software Architecture

Project ID: #32188044

About the project

4 proposals Remote project Active 2 years ago

4 freelancers are bidding on average ₹7625 for this job

manojmo

Hi, I have 20+ years of experience working with Java, web, and database tech. I have worked a bit on Spark, creating some reports. I have also built custom ETL jobs and used tools like Talend. I have working knowledge, but …

₹7000 INR in 7 days
(4 Reviews)
4.5
hjdsolution

Hello, I have 9+ years of working experience and am a certified big data developer. Please message me here to discuss the project requirements further. I will complete your work for a reasonable fee. Best Regards, Jay

₹5000 INR in 7 days
(6 Reviews)
4.6
shahparam

Hi, I have good experience with Spark-based data processing & ETL applications on local, Hadoop, and AWS environments. I prefer to use Scala but am happy to support your requirements.

₹6000 INR in 2 days
(4 Reviews)
2.1
roshanr1993

Hi, I have 6 years of experience in the skill set you have mentioned and am working as a senior big data engineer. Let me know about the requirements and we can discuss more on a call. Thanks

₹12500 INR in 7 days
(1 Review)
0.8