Senior Data Analyst
Job Description
About Slyce
At Slyce, we’re reinventing how food delivery operations work. Our globally unique SaaS platform is already trusted by some of the world’s largest restaurant brands, such as Domino’s. By combining cutting-edge technology with deep food industry expertise, we help restaurants expand their reach, boost revenue, and build lasting customer loyalty.
Our founders previously built and scaled Honest Food, the world’s largest virtual restaurant brand, which was successfully acquired by Delivery Hero. We are building the next big thing in food tech. If you want to make a real impact, this is your chance.
Role Overview
We’re looking for a Senior Data Analyst who empowers our C‑level to make data-driven decisions every day. You will report to the Chief Data Officer and be part of the data team. Day to day, you will work closely with the CEO and COO and dive deep into the performance of our algorithms for our clients.
You will help the business prepare case studies to win trials and help us understand what works and what does not. Additionally, you will work closely with Data Engineering and Data Science to improve the performance of our decision-making.
This is a unique opportunity to work closely with the founding team in a fast-growing startup.
What You'll Do
- Own performance analytics: Analyze how our algorithms impact restaurant performance (orders, revenue, margin, marketing spend) and define the KPIs that matter.
- Partner with C‑level: Work directly with the CEO, COO, and CDO to answer strategic questions, prioritize opportunities, and support key decisions with data.
- Build case studies & narratives: Create clear, compelling case studies and ROI analyses to support sales, customer success, and marketing.
- Design metrics & dashboards: Define core metrics and build dashboards that give the business a real-time view of performance across markets, products, and cohorts.
- Collaborate with Data Science & Engineering: Work with DS/DE to evaluate model performance, define experiments, and translate business needs into data and modeling requirements.
- Champion data quality & self‑service: Help improve our data models, documentation, and tooling so that stakeholders can reliably self‑serve for routine questions.
What We’re Looking For
- Experience: 3 years in data analytics, product analytics, or analytics engineering.
- SQL & warehousing: Strong proficiency in SQL and experience working with large datasets in a cloud data warehouse (BigQuery or similar).
- Analytics & statistics: Solid understanding of analytic methods and experimentation (A/B testing, cohort analysis, basic statistical inference).
- Python for analysis: Strong proficiency in Python for data processing and automation (e.g. pandas, Jupyter/Colab, simple scripts).
- Data modeling & transformations: Experience working with well-structured data models; comfortable reading and contributing to dbt-based transformations.
- Workflow literacy: Hands-on experience with modern data workflows and orchestration (e.g. Airflow or Cloud Composer), at least as a power user.
- BI & visualization: Experience building reports and dashboards in a BI tool (e.g. Looker, Tableau, Metabase, Mode, Power BI).
- Stakeholder communication: Strong communication skills, especially in explaining complex analyses to non-technical stakeholders and C‑level.
- Solution-oriented mindset: Ability to prioritize and find pragmatic solutions within the time available.
- Collaboration: Ability to work both independently and in cross-functional teams, proactively driving topics to completion.
- Work authorization: EU work authorization required.
Nice to Have Skills
- Domain knowledge: Experience with marketplaces, food delivery, or digital marketing/advertising.
- ML product analytics: Familiarity with evaluating and monitoring ML-driven products (recommendation, bidding, optimization).
- GCP experience: Previous experience with GCP (BigQuery, Cloud Composer, Pub/Sub, Vertex AI) or GCP Professional Data Engineer certification.
- Data quality: Understanding of data quality frameworks (e.g. Great Expectations, Soda) and how to implement basic checks.
- Streaming & real-time: Experience with streaming or event-based data (Pub/Sub, Kafka, Dataflow, or similar).
- APIs & integrations: Experience working with RESTful APIs and third-party platform integrations.
- Infrastructure as Code: Knowledge of tools like Terraform or Pulumi is a plus, especially if you’ve worked closely with data infrastructure.
Our Tech Stack
- Data & ML: Cloud Composer (Airflow), dbt, Python, BigQuery, Cloud Storage, Pub/Sub, Vertex AI
- Other: TypeScript, Docker, Kubernetes, Pulumi, GCP
What We Offer
- Opportunity to shape a category-defining food tech platform at an early stage
- Build scalable ML-powered infrastructure using modern data tools on GCP
- Direct collaboration with founders and senior leadership
- A culture of ownership, learning, and continuous growth
- Competitive compensation aligned with market standards
- Hybrid work culture with flexible working styles
Application Process
Ready to actively shape the future of food delivery? Send your CV, portfolio, and a few lines on why you’re a great fit for Slyce to till@slyce.io
Let’s create something extraordinary - together. 🚀