Remote – LATAM
About Us
AccelOne provides custom software development and design services to companies across the US and Latin America. We are built on core principles of transparency, communication, and accountability, and we aim to deliver exceptional solutions for our clients.
About Our Project
Our client is a U.S.-based industry leader in digital experience personalization. Their platform enables global retail and e-commerce brands to deliver real-time, data-driven experiences across digital channels. With a strong focus on optimization and customer engagement, they serve a wide range of enterprise clients and operate at scale across international markets. Their solutions leverage advanced analytics and decisioning engines to help companies drive performance and personalization.
Role Summary
As a Data Engineer, you will design, build, and maintain scalable real-time and batch data pipelines leveraging Python, Kafka, AWS, Snowflake, and MySQL. You will collaborate with cross-functional teams to create reliable data products that support advanced analytics and personalization. This is an excellent opportunity for someone passionate about building high-quality data infrastructure in a fast-paced environment.
Responsibilities
- Design, build, and maintain scalable real-time and batch data pipelines using Kafka, AWS, and Python.
- Develop event-driven data workflows leveraging Kafka topics, Lambda functions, and downstream sinks (Snowflake, S3).
- Build and maintain robust ETL processes across Snowflake and MySQL.
- Ensure high-quality data delivery by writing modular, well-tested Python code.
- Monitor pipelines using CloudWatch, Grafana, or custom metrics.
- Contribute to technical design, architectural reviews, and infrastructure improvements.
- Maintain data integrity and scalability in high-volume environments.
- Continuously explore and implement best practices in data engineering, streaming, and analytics infrastructure.
Qualifications
Basic Requirements:
- 4+ years of experience as a Data Engineer in a production environment.
- Fluency in Python with strong experience in a Linux environment.
- Strong SQL skills, with hands-on experience in Snowflake and MySQL.
- Solid understanding of data modeling, schema design, and performance optimization.
- Proficiency with Python data libraries: NumPy, Pandas, SciPy.
- Experience building and maintaining real-time pipelines with Apache Kafka (self-managed or Amazon MSK).
- Familiarity with AWS services: Lambda, Kinesis, S3, CloudWatch.
- Experience with event-driven architectures and streaming data patterns.
- Knowledge of RESTful APIs, data integration, and Git workflows.
- Comfortable in Agile environments with strong testing and code review practices.
Preferred:
- Experience with large-scale streaming data systems.
- Exposure to performance monitoring and debugging tools.
- Familiarity with infrastructure-as-code and CI/CD pipelines.
What We Offer
- Remote Work: Flexibility to work from anywhere in LATAM.
- Professional Growth: Career development opportunities, training, and certifications.
- Inclusive Environment: We foster a people-first culture where everyone can thrive professionally and personally.
At AccelOne, we value our team and prioritize a supportive, balanced work environment. Join us in delivering top-tier solutions to our clients while advancing your career in a rewarding setting.