Sr. Software Engineer - Connectors Framework
Nexla
About Nexla
Nexla is the leading integration platform, built with AI, for AI. Nexla takes a metadata-driven approach to converge diverse integrations across Data, Documents, Agents, Applications, and APIs into a single design pattern. We accelerate the development of solutions for GenAI, Analytics, and Inter-company data. Nexla makes data users and developers up to 10x more productive by delivering a true blend of no-code, low-code, and pro-code interfaces.
Leading companies including DoorDash, LinkedIn, Johnson & Johnson, and LiveRamp trust Nexla for mission-critical data. Named in the 2022, 2023, and 2024 Gartner Magic Quadrant™ for Data Integration Tools and top-rated by customers on Gartner Peer Insights, Nexla is a remote-first company headquartered in San Mateo, California.
At Nexla, our culture is built around our core values: Have Empathy, Be Curious, Be Intellectually Honest, Achieve Excellence, and Remember to Relax. We put our customers at the heart of everything we do, foster a data-driven mindset, take ownership of our work, and believe in the power of teamwork to achieve ambitious goals.
Role
We are seeking an experienced Senior Software Engineer to join our Connector Engineering team at Nexla. In this role, you will be instrumental in expanding and enhancing the core framework of our universal bidirectional connector ecosystem, which enables seamless data integration across 100+ SaaS applications, databases, data warehouses, APIs, and cloud platforms. You will design, develop, and maintain this framework to keep pace with the latest technologies and the service needs of current and future connectors, keeping scalability, reliability, resilience, extensibility, maintainability, and performance in mind.
Responsibilities
- Architect, design, develop & enhance production-grade connectors framework for various data sources including databases (SQL/NoSQL), SaaS applications, REST/SOAP APIs, cloud storage systems, streaming platforms, and enterprise applications
- Make the framework scalable and resilient so it can handle high-volume data transfers, prevent and detect data loss, and incorporate error recovery mechanisms
- Build intelligent schema inference and mapping systems that automatically detect and adapt to changes in data structures, APIs, and source system configurations
- Develop comprehensive authentication and authorization mechanisms supporting OAuth 2.0, API keys, SAML, certificate-based auth, and other enterprise security protocols
- Optimize connector performance through strategies such as efficient data pagination, parallel processing, connection pooling, and intelligent batching for large-scale data movement
- Contribute to connector SDK and framework development to enable rapid development of new connectors and empower partners and customers to build custom integrations
- Conduct code & design reviews and mentor junior engineers on best practices for connector development, testing strategies, and system design
- Take ownership of projects from inception to deployment, driving them to completion and out into production with minimal supervision
- Effectively communicate technical concepts to both technical and non-technical stakeholders
- Research emerging technologies and integration patterns to keep Nexla's connector ecosystem at the cutting edge of data integration capabilities
Qualifications
- 10-15 years of software engineering experience with at least 4 years focused on data integration, ETL/ELT pipelines, or framework design & development
- Strong proficiency in JVM-based programming languages with a deep understanding of concurrent programming and asynchronous patterns
- Experience building frameworks and platforms that sit at the core of larger systems
- Well-versed in distributed systems concepts and implementations
- Extensive experience with API development and integration including REST, GraphQL, SOAP, webhooks, and event-driven architectures
- Deep understanding of database technologies including relational (PostgreSQL, MySQL, Oracle, SQL Server) and NoSQL (MongoDB, DynamoDB, Cassandra) systems
- Expertise in data formats and serialization like JSON, XML, Avro, Parquet, Protocol Buffers, and handling of semi-structured and unstructured data
- Experience with streaming and message queue systems such as Kafka, Kinesis, RabbitMQ, or similar technologies
- Proficiency in testing methodologies including unit testing, integration testing, end-to-end testing, and test automation frameworks
- Strong problem-solving and debugging skills with the ability to diagnose complex integration issues across distributed systems
- Bachelor's degree in Computer Science, Engineering, or related technical field (or equivalent practical experience)
Bonus Points
- Familiarity with CDC (Change Data Capture) technologies and real-time data replication strategies
- Knowledge of data warehouse technologies including Snowflake, BigQuery, Redshift, Databricks, and their specific loading patterns and optimization techniques
- Experience with cloud platforms (AWS, Azure, GCP) and their native services for data storage, compute, and integration
- Experience with iPaaS platforms or competing solutions (MuleSoft, Informatica, Talend, Fivetran, Stitch, Airbyte)