Ripple’s mission is to enable payments every way, everywhere, for everyone. We believe connecting traditional financial entities like banks, payment providers, and corporations with emerging blockchain technologies and users is the path to an open, decentralized, and more inclusive financial future. This Internet of Value gives any internet-enabled person, application, or device access to financial services that are transparent, fast, reliable, and cheap. Delivering this vision is a challenge of massive scale, spanning $155 trillion in annual cross-border fiat payments and a $1.5 trillion digital-asset market that has grown 10X in the last year.
Ripple’s distributed financial technology outperforms today’s banking infrastructure by driving down costs, increasing processing speeds, and delivering end-to-end visibility into payment fees, timing, and delivery. Ripple’s Database Engineering team manages the distributed database platform infrastructure that enables banks to instantly settle cross-border payments, creating the Internet of Value. The team partners with stakeholders across the company on cutting-edge product development, business strategy, complex interdependencies, and requirements around high availability, security, performance, scalability, cost optimization, and customer servicing.
We are looking for a passionate engineer with experience in SQL and NoSQL database technologies as well as cloud-native technologies. This position is responsible for designing, developing, managing, and scaling enterprise-level database infrastructure to power high-impact initiatives across the broader organization.
WHAT YOU’LL DO:
Design, implement, and maintain complex relational database systems following cloud-native principles and best practices to ensure highly available, secure, performant, and scalable database systems.
Plan and deploy database infrastructure build-outs, and perform upgrades and migrations.
Serve as the resident expert in database performance, scalability, complex query tuning, monitoring, and incident response.
Collaborate with multiple functional teams to coordinate database changes adhering to best practices in an agile development environment.
Contribute to infrastructure changes in AWS with deep understanding of AWS services.
Manage complex data replication environments serving batch as well as real-time use cases.
Participate in the on-call rotation for production systems and respond to incoming pages and alerts.
Independently troubleshoot incoming production and pre-production issues and provide timely resolution.
Contribute to major system upgrades, deployment automation, monitoring enhancements, and production changes.
Create operational playbooks, contribute to how-to articles, and gain domain knowledge to drive changes in the team.
Participate in developing monitoring dashboards and alerts to improve our ability to proactively analyze and detect problems impacting the stability of the application.
Tune and configure our databases, and build tools and scripts to monitor, troubleshoot, and automate our systems.
Coordinate with customer success and developer teams to triage, escalate, and ensure timely resolution of any incidents.
WHAT WE’RE LOOKING FOR:
Bachelor’s or Master’s degree in computer science or a related technical field.
8+ years of solid hands-on database design, development, and enterprise-class operational experience with one or more of the following database technologies: Aurora PostgreSQL, Aurora MySQL, PostgreSQL, or MySQL.
2+ years of deep experience with any one of the following cloud platforms: AWS, GCP, or Azure.
Experience managing a highly reliable database platform with a focus on security, performance, and scale to meet the requirements of enterprise customers.
Experience migrating mission-critical database applications from on-premises to the cloud, or managing a hybrid cloud environment for large enterprise applications.
Experience building strong cross-functional relationships with internal leadership, stakeholders, and partners to create reliable, scalable database offerings for enterprise customers.
Experience with AWS services and hands-on knowledge of cloud hosting.
Experience with data modeling for OLTP and data warehousing, including performance tuning.
Experience with AWS DMS, Oracle GoldenGate, and other replication technologies.
Extensive performance monitoring, troubleshooting, and tuning experience.
Experience with one of the following programming languages is highly desirable: Python or Ruby.
Experience with any of the following monitoring tools is highly desirable: Sumo Logic, Splunk, Wavefront, SentryOne, Nagios, Datadog, or AppDynamics.
Experience influencing product roadmap and strategy, and prioritizing existing database initiatives to drive extraordinary business outcomes.
NICE TO HAVE:
Experience operating software as a service is a plus.
Experience working in FinTech is a plus.
Knowledge of Docker & Kubernetes is a plus.
Knowledge of NoSQL & Big Data technologies is a plus.
Familiarity with blockchain technology is a plus.
Cloud certification is highly desirable.
WHAT WE OFFER:
The chance to work in a fast-paced start-up environment with experienced industry leaders
A learning environment where you can dive deep into the latest technologies and make an impact
Competitive salary and equity
Health plans for employees and dependents
Industry-leading parental leave policies
Generous wellness reimbursement program
Employee giving donation match
Training at our headquarters in San Francisco, CA, USA
Weekly company meeting – ask-me-anything-style discussions with our Leadership Team