Calling All Upstarters!
SENIOR AI DATA ENGINEER WANTED!
We are Upstart 13: humble, hungry, and competent people who are radically changing the expectations and experience of outsourcing for all participants by challenging the barriers that create inequality and bringing down borders in technology for people everywhere. We're all about delivering value and doing big things, and we've become a game-changer for teams around the world who look to Upstart 13's services as a differentiator.
Job Description:
We are seeking a highly skilled Senior AI Data Engineer located in Latin America who bridges the worlds of data engineering, business intelligence, and artificial intelligence. The ideal candidate has deep roots in data modeling, semantic layers, and BI platforms (particularly Microsoft Fabric and Power BI) and has evolved that expertise to architect AI-powered analytics solutions. You'll design and build intelligent data platforms where semantic models become the foundation for AI agents, RAG systems, and natural language interfaces.
This role combines the precision of dimensional modeling with the innovation of generative AI, creating systems that transform how users interact with data.
Responsibilities:
Data & Semantic Modeling:
- Design and build enterprise semantic models in Microsoft Fabric using Direct Lake mode and optimized Delta Lake storage.
- Architect dimensional models and star schemas that serve both traditional BI and AI workloads.
- Build and maintain complex DAX measures and calculated columns that encapsulate business logic.
- Implement incremental refresh strategies, aggregations, and partitioning for large-scale datasets (100M+ rows).
- Create and optimize data pipelines using Fabric Data Factory and Dataflows Gen2.
- Implement row-level security and governance frameworks across semantic models.
- Build integrations connecting diverse data sources (databases, QuickBooks, Excel) to unified semantic models.
AI-Enhanced Analytics:
- Develop AI agents that understand semantic model metadata and generate contextually aware DAX queries from natural language.
- Build RAG systems that leverage your semantic models' business definitions, relationships, and metric calculations for context.
- Create conversational interfaces allowing users to explore financial data and business metrics through natural language.
- Implement AI-powered anomaly detection on financial metrics and vendor payment patterns.
- Build multi-agent systems for complex analytical workflows (forecasting, variance analysis, what-if scenarios).
- Design backend APIs that expose both semantic model data and AI agent capabilities.
- Connect AI agents with Power BI REST APIs for automated report and dashboard generation.
- Develop real-time streaming analytics combining Microsoft Fabric Real-Time Analytics with AI inference.
AI-Driven Data Platform Intelligence:
- Build AI agents that analyze semantic models and generate improvement recommendations (new measures, optimized relationships, performance enhancements, schema refinements).
- Develop AI systems that integrate with Fabric pipeline orchestration to provide intelligent monitoring, automated error resolution suggestions, and workflow optimization.
- Create AI-powered semantic model governance tools that analyze data quality and usage patterns and suggest structural improvements.
- Implement AI assistants that analyze query patterns and recommend semantic model enhancements for common business questions.
- Design AI systems that validate and propose DAX measure improvements based on usage analytics and performance metrics.
- Build AI agents that monitor data pipeline execution and proactively suggest preventive maintenance or configuration changes.
- Stay current with GenAI advancements and apply emerging techniques to data analytics challenges.
Qualifications:
Education and Experience:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
- 6+ years in data engineering, BI development, or analytics engineering.
- 3+ years of hands-on experience with Microsoft Fabric, Power BI, or Azure Synapse Analytics.
- 2+ years building AI applications with LLMs on top of data platforms.
Technical skills:
- Deep expertise in semantic/tabular modeling (star schemas, dimensional modeling, conformed dimensions).
- Expert-level DAX and M (Power Query) development.
- Strong SQL skills and data warehousing concepts.
- Extensive experience with Python for data pipelines and AI development.
- Proven track record deploying RAG implementations using vector databases.
- Experience with agentic AI frameworks (LangChain, LlamaIndex, Semantic Kernel, Pydantic AI).
- Strong understanding of RESTful API design and implementation.
- Experience with natural language to SQL/DAX translation.
- Understanding of security best practices for AI applications.
- Experience with version control (Git) and CI/CD pipelines.
- Familiarity with MCP, A2A, or similar integration protocols.
Soft skills:
- Data Modeler at Heart — thinks in dimensions, facts, and relationships; sees semantic models as the foundation for intelligent systems.
- Execution-First Mindset — delivers working functionality quickly, then iterates.
- Curious Integrator — enjoys untangling messy, niche data feeds and bringing order.
- Quality Advocate — insists on tests, logging, and clear handoffs.
- Collaborative Communicator — explains technical decisions to peers and stakeholders.
- Continuous Learner — keeps abreast of AI and data platform advancements.
Bonus skills:
- Experience migrating from traditional BI to AI-augmented analytics.
- Background in Analytics Engineering or dbt.
- Microsoft Fabric certifications (DP-600, PL-300).
- Microsoft AI certifications (AI-102).
- Experience with TMSL, XMLA endpoints, and Tabular Object Model.
- Knowledge of Lakehouse architecture and medallion patterns.
- Experience with Real-Time Intelligence in Fabric.
- Understanding of VertiPaq engine optimization.
- Familiarity with Direct Lake mode and Delta Lake format optimization.
- Experience building serverless-architected solutions, including monitoring, debugging, and deployment.
- Experience with large-scale semantic models (10GB+ or 100M+ rows).
- Prior work as a BI Developer, Data Analyst, or Analytics Engineer who expanded into AI/ML.
- Experience building AI systems that analyze and optimize data infrastructure.
Why Upstart13?
- We put people first at Upstart 13! We believe the world is filled with amazing people, and we go to great lengths to find those who share our values and want to join our cause of bringing down borders in technology for people everywhere.
- We develop leaders at Upstart 13. We focus on what matters so we can do meaningful work, we own our shit, we stay curious, and we understand that responsibility leads to giving. We do big things together!
Perks:
- Job type: project-based, full-time.
- Fully remote.
- Competitive salary in USD.
Are you ready to join our cause? Be sure to ask, “Why 13?”