
Data Engineer, Integrations and Implementation

Technology
Kizen
Austin, United States · $105,000–$149,500/year · 1 month ago · Open until 4/23/2026
On-site

Job description

Location: Austin (Domain), TX (in-office 4x per week)
Company: Kizen, www.kizen.com

About Us

At Kizen, we build technology that makes AI actually work for people. Our platform helps organizations automate the workflows that matter most — from helping insurance agents manage contracting and commissions, to enabling healthcare providers to deliver proactive patient care, to empowering financial institutions to classify documents and extract data automatically. And that's just the beginning.

With Kizen, developers can build new enterprise applications, while business users can customize dashboards and agents using AI, low-code, or code — all on the same platform. The possibilities are limited only by imagination. What used to take months or years to build, configure, and connect now takes just days.

Kizen combines powerful pre-built agents and use cases with an enterprise platform for automation and data connection — all built on an AI-first architecture that's easy to adopt, scale, and trust. We're not chasing hype. We're solving real operational problems and helping the world's most complex organizations work faster, smarter, and more humanely.

Join us in transforming how industries work — one workflow, one agent, and one customer success story at a time.

About the Role

Want to help create the future of work? We're putting AI and intelligent automation to real use in the hands of real people. We are looking for a Senior Data Engineer who specializes in data pipelines and a wide range of data formats as we bring our next-gen platform to the world.

This is a rare opportunity for someone who wants to be at the center of the AI movement and actually build things that matter. As a Senior Data Engineer at Kizen, you'll work directly with our Senior Director of Ecosystem Engineering and partner closely with our Implementations team to design and develop scalable data pipelines and system integrations that power the platform. You'll collaborate with product owners, solutions engineers, and software engineers to drive technical projects that require deep expertise in data transformation and real-world system connectivity.

Key Responsibilities

Design, develop, and deploy efficient data pipelines and ETL processes using modern software engineering practices

Ensure data integrity and quality by implementing robust data validation, schema checks, and error-handling mechanisms to prevent data corruption and maintain reliability

Serve as a technical steward by continuously optimizing data performance and resource utilization to ensure maximum cost-effectiveness and system speed

Write clean, maintainable code for complex data transformation workflows between disparate systems

Optimize existing pipelines for performance, security, and maintainability

Collaborate with the engineering team in code reviews and architectural decisions

Create technical documentation for data specifications

Evaluate and recommend process modifications

Participate in agile development processes, including sprint planning and retrospectives

Requirements

3+ years of cloud-based experience focused on modeling, building, and maintaining production-grade data solutions

Strong proficiency in at least one of: JavaScript/TypeScript, Python, Go, or other modern programming languages

Advanced knowledge of SQL, database design, and data transformation techniques

Strong understanding of authentication mechanisms (OAuth, API keys, JWT) and API security best practices

Experience with containerization (Docker) and container orchestration (Kubernetes)

Familiarity with CI/CD pipelines and automated deployment processes

Solid understanding of software design patterns and architectural principles

Experience with version control systems (Git) and collaborative development workflows

Preferred Qualifications

Experience with cloud platforms (AWS, GCP, Azure)

Experience with message queues and streaming platforms (Kafka, RabbitMQ)

Background in enterprise software integrations across CRM, ERP, marketing automation systems

Experience with SQL databases and data warehousing solutions

Understanding of data security, compliance, and privacy considerations in software development

Why Kizen

We're a fast-growing company that values innovation, growth, and continuous improvement. By joining Kizen, you'll play a pivotal role in shaping the future of the company while enjoying a supportive, dynamic, and collaborative workplace. You'll have opportunities for professional development, impact, and career advancement.

What We Offer

Career Growth Opportunities

Engaging Work Culture

Top-Tier Compensation

Equity Package

Healthcare Coverage

Professional Development Stipends

PTO

Kizen is proud to be an equal-opportunity employer. We are committed to building a diverse and inclusive culture that celebrates authenticity to win as one. We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity, sexual orientation, age, marital status, disability, protected veteran status, citizenship or immigration status, or any other legally protected characteristics.

At Kizen, we fully comply with the Americans with Disabilities Act (ADA). We are dedicated to embracing challenges and creating an accessible, inclusive workplace for all individuals. The base salary range for this position is $105,000–$130,000.

However, base pay offered may vary depending on job-related knowledge, skills, and experience. In addition to base salary, we also offer generous equity and benefits packages. If you're excited about creating impactful experiences and contributing to a fast-paced, people-focused team, we'd love to meet you!

OTE: $120,750–$149,500
Compensation Range: $105,000–$149,500 USD

