
Spark Script for Hive to S3 Data Migration

$30-250 AUD

Closed
Posted 22 days ago

Paid on delivery
Create a Spark script to transfer metastore data from Hive to S3:
- Create a connection to the Hive metastore
- Fetch the [login to view URL] definition for the database
- Create a connection to the S3 bucket
- Create a new [login to view URL] within the S3 Hive metastore
- Transfer data from the Hive metastore to S3
- Configure multiple [login to view URL] creation based on config variables
- Create recursive data transfer based on differences in the data

Skills and Experience:
- Proficiency in Spark and Hive
- Extensive experience with S3 buckets
- Understanding of data backup strategies

Project Details:
- The script needs to read the schema and perform a metadata transfer for each selected schema to the S3 bucket (see the sketch below).
- Only bid if you have work experience with Spark, Hive, and S3.
- Multiple schemas need to be migrated.
- I have a local instance of NetApp S3 available, with a bucket already created.
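For bidders gauging the scope, here is a minimal PySpark sketch of the core flow: enumerate the tables of each selected schema through the metastore, then copy each table's data to a per-schema prefix in the bucket. It assumes a Hive-enabled SparkSession and the Hadoop S3A connector pointed at an S3-compatible endpoint (such as the local NetApp instance mentioned above); the schema names, bucket name, and endpoint URL below are placeholders, not values from this posting.

```python
# Sketch: copy table data for selected Hive schemas to an S3-compatible bucket.
# All names below (schemas, bucket, endpoint) are hypothetical placeholders.
from pyspark.sql import SparkSession

SCHEMAS_TO_MIGRATE = ["sales", "marketing"]   # config variable: schemas to migrate
S3_BUCKET = "s3a://hive-backup-bucket"        # placeholder bucket path

spark = (
    SparkSession.builder
    .appName("hive-to-s3-migration")
    .enableHiveSupport()                      # connect to the Hive metastore
    # For a non-AWS endpoint (e.g. a local NetApp S3), point the S3A connector at it:
    .config("spark.hadoop.fs.s3a.endpoint", "http://local-netapp-s3:9000")  # placeholder
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .getOrCreate()
)

for schema in SCHEMAS_TO_MIGRATE:
    # Fetch the table definitions for the database from the metastore.
    tables = [row.tableName for row in spark.sql(f"SHOW TABLES IN {schema}").collect()]
    for table in tables:
        df = spark.table(f"{schema}.{table}")
        # Write data under a per-schema prefix; the schema travels with the Parquet files.
        df.write.mode("overwrite").parquet(f"{S3_BUCKET}/{schema}/{table}")

spark.stop()
```

For the "recursive transfer based on differences" requirement, one common approach is partition-level comparison: query SHOW PARTITIONS for each table, list what already exists under the corresponding S3 prefix, and write only the missing partitions instead of overwriting the full table on each run.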
Project ID: 38029405

About the project

5 proposals
Remote project
Active 15 days ago

5 freelancers are bidding an average of $150 AUD for this job
Hello, I have 10 years of experience in Spark and AWS S3. I will create a Spark script that reads the schema and performs a metadata transfer for the selected schema to the S3 bucket. Regards, VishnuLal
$250 AUD in 3 days
5.0 (4 reviews)
With over 2 years of hands-on experience in Pyspark and AWS, I specialize in developing Python scripts and Pyspark code tailored for bank environments. My expertise lies in leveraging these technologies to efficiently process and analyze data, ensuring robust and scalable solutions for banking operations.
$120 AUD in 7 days
0.0 (0 reviews)
Hi, I have 9+ years of expertise in Spark, Hive, and S3. Let's connect to discuss it. Thanks
$140 AUD in 7 days
0.0 (0 reviews)
Hello, how are you? Thank you for the job posting. With a robust background in Spark, Hive, and S3, I am well-equipped to undertake your metadata transfer project. My experience includes crafting Spark scripts for seamless data migration, leveraging Hive for efficient schema management, and managing S3 buckets for optimal data storage. I understand the nuances of data backup strategies, ensuring the integrity and security of your information throughout the transfer process. I am confident in my ability to create a flexible and efficient script that accommodates multiple schemas, adhering to your project specifications. My past work in similar projects underscores my capability to deliver results promptly and reliably. I look forward to the opportunity to contribute to your project's success. Best regards, Darko Djokic
$140 AUD in 7 days
0.0 (0 reviews)
With five years of experience as a data analyst, I am excited to propose my expertise for the development of a Spark script to migrate data from Hive to an S3 bucket. My extensive experience includes working on Spark scripts for data migration from various sources like Hive, PostgreSQL, MySQL etc. My recent project also involved migrating data from Hive to an S3 bucket, where I successfully optimized storage for improved efficiency. This hands-on experience ensures that I am well-equipped to handle the challenges and requirements of this project effectively.
$100 AUD in 4 days
0.0 (0 reviews)

About the client

Middle Park, Australia
5.0 (21 reviews)
Payment method verified
Member since Sep 9, 2009
