
Distributed Systems Engineering Intern at Splice Machine
San Francisco, CA, US / St. Louis, MO, US
Use your distributed systems programming experience to maximum effect by working on Splice Machine’s ground-breaking Hadoop-based relational database.

Company Description

Named one of the 20 Red Hot, Pre-IPO Companies in 2015 B2B Tech by IDG, Splice Machine is disrupting the $30 billion traditional database market with the first dual-engine database on Hadoop and Spark. Leveraging in-memory technology from Spark and the scale-out capabilities of Hadoop, Splice Machine can replace Oracle® and MySQL™ databases while increasing performance by 10-20 times at one-fourth the cost. We are headquartered in the South of Market (SOMA) neighborhood of San Francisco.

Job Description

As a member of the Product Development team, you will help build out Splice Machine's Hadoop-based relational database management system (RDBMS). Splice Machine's RDBMS is ACID-compliant and supports analytical, transactional, and mixed workloads.

Responsibilities:

Design and develop key features for the RDBMS, ensuring your work is performant and scalable in a concurrent, multi-node Hadoop execution environment
Collaborate with other engineers throughout the product lifecycle, including architecture, product support issues, beta testing, and bug fixes
Ensure your work promotes product stability, reliability, and maintainability

Qualifications:

Currently pursuing an undergraduate or graduate degree in Computer Science or an equivalent field
3+ years of Java programming experience
Some experience and comfort with concurrent programming principles
Previous experience developing commercial products desirable
Hadoop ecosystem programming experience highly desirable, especially Apache Spark and Apache HBase
Database development experience highly desirable
SQL experience highly desirable