Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide.
Our platform delivers a secure, reliable, compliant, and scalable solution that empowers organizations to create exceptional customer experiences, boost agent productivity, and achieve meaningful business results.
At Five9, we live our values every day, fostering a team-first culture that promotes innovation, growth, and collaboration. We celebrate diversity and maintain an inclusive environment that empowers our employees to bring their authentic selves to work.
Position Overview:
We are seeking a skilled and proactive Data Engineer to join our growing data team. This role focuses on designing and implementing data extraction and integration solutions from enterprise systems such as Salesforce, Jira, NetSuite, and Logisense, leveraging APIs and Python within Google Cloud Platform (GCP). The Data Engineer will build and manage scalable data pipelines that ensure seamless access to data for analytics, business intelligence, and data science initiatives. This role will collaborate closely with business stakeholders and cross-functional teams to support a wide range of strategic projects.
The ideal candidate brings strong technical expertise in data engineering, a solid understanding of data warehousing principles, and the ability to translate technical processes into clear documentation.
Key Responsibilities:
- Collaborate with stakeholders across the Finance team to understand business requirements and translate them into well-defined, actionable data sets for analysis and reporting.
- Design, develop, and maintain scalable data extraction pipelines in Google Cloud Platform (GCP) to process structured and unstructured data from diverse sources including databases, APIs, and cloud storage systems.
- Leverage a suite of GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, Cloud Composer (Airflow), and Cloud Storage to build efficient, secure, and high-performing data workflows.
- Continuously monitor and optimize data pipelines for performance, scalability, reliability, and cost-efficiency.
- Manage and monitor scheduled production jobs (daily, weekly, bi-weekly, and monthly), ensuring timely and accurate data processing across all cycles.
- Provide production support during critical month-end data load windows (Day 1 to Day 5), ensuring data availability and resolving issues swiftly to meet business reporting deadlines.
- Maintain clear and thorough technical documentation of data pipelines, workflows, and system architecture to support ongoing development and cross-functional collaboration.
- Identify opportunities for improvement in existing applications and workflows, recommending and implementing scalable solutions that enhance system functionality and user experience.
- Provide insights by collecting, analyzing, and summarizing data-related development and operational issues to support troubleshooting and continuous improvement.
- Manage multiple tasks and projects simultaneously, effectively prioritizing work across the full lifecycle of data engineering initiatives.
- Respond to and fulfill ad-hoc data requests from business stakeholders, delivering timely and accurate datasets or insights to support decision-making and operational needs.
Required Qualifications:
- Bachelor’s and/or Master’s degree in Computer Science, Computer Engineering, or a related technical discipline.
- Strong experience working with enterprise systems, with an understanding of the architectural differences between transactional systems and analytical data warehouses.
- 2+ years of hands-on experience with Google Cloud Platform (GCP) services, including BigQuery, Dataflow, Pub/Sub, and Cloud Functions.
- 2+ years of experience in professional or open-source development, with a focus on reading data from APIs and building data applications using Python, JavaScript, and Java.
- Proven expertise in building, deploying, and maintaining data pipelines using tools such as Apache Airflow, Cloud Composer, Dataproc, and Dataflow.
- Proficiency with GitLab, CI/CD tools, and related methodologies.
- Experience designing and writing efficient ETL/ELT jobs to ingest and transform data into Google Cloud Storage and BigQuery, ensuring scalability, performance, and data integrity.
- Advanced SQL proficiency, with the ability to write, optimize, and manage complex queries in high-volume data environments.
- Strong skills in data visualization, with experience creating dashboards using Google Data Studio, Looker, DOMO, or similar BI tools.
- Experience supporting scheduled production jobs during critical finance month-end data load cycles, ensuring data accuracy, timeliness, and system reliability throughout the Day 1 to Day 5 close period.
- Solid understanding of cloud architecture design principles, including performance and cost optimization.
- Exceptional attention to detail, with a commitment to delivering high-quality, reliable work.
- Self-starter with the ability to thrive in unstructured environments, manage ambiguity, and independently drive initiatives forward.
- GCP certifications (e.g., Professional Cloud Developer, Professional Cloud Database Engineer) are a strong plus.
Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer. View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal. Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.