• Responsible for building, deploying, and maintaining mission-critical analytics solutions that process data quickly at big-data scale.
• Contributes design, code, configuration, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading (ETL) across multiple game franchises.
• Owns one or more key components of the infrastructure and works to continually improve it, identifying gaps and improving the platform’s quality, robustness, maintainability, and speed.
• Cross-trains other team members on technologies being developed, while also continuously learning new technologies from other team members.
• Interacts with engineering teams across WB and ensures that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability.
• Performs development, QA, and DevOps roles as needed to ensure end-to-end ownership of solutions.
• Works directly with business analysts and data scientists to understand and support their use cases.
JOB REQUIREMENTS
• 5+ years of experience coding in Java, Python, or Scala, with solid CS fundamentals including data structure and algorithm design
• 4+ years contributing to R&D and production deployments of large backend data processing and analysis systems
• 3+ years of hands-on implementation experience with a combination of the following technologies: Hadoop, MapReduce, Pig, Hive, Impala, Spark, Kafka, Storm, SQL, and NoSQL data stores such as HBase and Cassandra.
• 3+ years of experience standing up and automating the deployment of solutions in AWS using the AWS CLI, with a focus on EMR, EC2, S3, EBS, Redshift, DynamoDB, and VPCs
• Knowledge of SQL and MPP databases (e.g., Vertica, Netezza, Greenplum, Aster Data)
• Knowledge of professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
• Experience participating in an Agile software development team (e.g., Scrum)
• Experience designing, documenting, and defending designs for key components in large distributed computing systems
• A consistent track record of delivering exceptionally high quality software on large, complex, cross-functional projects
• Demonstrated ability to learn new technologies quickly and independently
• Demonstrated ability to achieve stretch goals in an innovative, fast-paced environment
• Ability to handle multiple competing priorities in a fast-paced environment
• Excellent verbal and written communication skills, especially in technical communications
• Strong interpersonal skills and a desire to work collaboratively
• Undergraduate degree in Computer Science or Engineering from a top CS program required; Master's degree preferred.
• Experience supporting data scientists and complex statistical use cases highly desirable