Lead Data Engineer (#374)


Shirokane, Minato-ku, Tokyo
Full-time, Permanent
Insurance

Job description

We are looking for an experienced Lead Data Engineer to join our dynamic team. In this pivotal role, you will develop and maintain our data platform, ensuring that our solutions are both technically sound and aligned with business objectives. You will design, implement, and optimize core data platform components using AWS and Databricks, collaborating closely with cross-functional teams to deliver high-quality solutions. Beyond hands-on technical development, you will drive technical discussions, manage risks, and ensure the platform's stability while meeting business timelines.

Key Responsibilities:

Platform Development:

  • Develop and maintain core data platform components using AWS and Databricks.
  • Build and optimize data pipelines and transformation workflows for large datasets.
  • Create and implement data ingestion patterns for various data sources.
  • Maintain automated testing and deployment processes for continuous integration and delivery.
  • Implement monitoring and alerting solutions to ensure the stability of platform components.

Technical Implementation:

  • Write high-quality, maintainable, and efficient code, following best practices.
  • Lead Infrastructure as Code (IaC) efforts using Terraform to provision infrastructure, and implement best practices for infrastructure testing, validation, and optimization.
  • Ensure security and compliance requirements are incorporated into platform components.
  • Optimize the performance of data processing workflows, ensuring scalability and efficiency.
  • Create and maintain technical documentation for platform components and project architecture.
  • Participate in code reviews and technical discussions to ensure high code quality.

Collaboration & Knowledge Sharing:

  • Collaborate closely with other data engineers and developers to ensure consistency and reuse of implementation patterns.
  • Share technical knowledge, best practices, and insights with the team.
  • Work with cross-functional stakeholders to understand their requirements and translate them into compliant, scalable solution designs.
  • Actively participate in technical design discussions, providing feedback and solutions.
  • Communicate effectively with project managers and business stakeholders to align technical deliverables with business goals.
  • Provide accurate technical estimates and progress updates.
  • Identify and escalate risks or blockers early in the delivery process.

Requirements:

Experience and Skills:

  • Bachelor's or Master's degree in Computer Science or a related technical field.
  • 6+ years of experience in data engineering and software development.
  • Strong expertise in cloud platforms (AWS preferred) and data processing frameworks.
  • Advanced proficiency in Python (preferred), Scala, or Java, along with strong SQL skills.
  • Experience with data processing frameworks such as Apache Spark (preferred).
  • Expertise in modern SQL warehouses such as Databricks (preferred), Snowflake, or BigQuery.
  • Strong knowledge of columnar storage formats (e.g., Parquet, Delta, or ORC).
  • Experience with dbt (data build tool) and data modeling methodologies.
  • Proficiency in CI/CD practices and DevOps tools (e.g., Jira, Jenkins, Terraform, AWS CDK for IaC).
  • Solid understanding of data modeling and data architecture principles.
  • Experience with real-time data processing and streaming technologies.

Soft Skills:

  • Strong analytical and problem-solving skills.
  • Excellent attention to detail and quality focus.
  • Good communication skills.
  • Self-motivated and proactive approach to work.
  • Ability to work effectively in an agile environment.
  • Commitment to continuous learning and improvement.

Language requirements

English (business level), Japanese (conversational level)

Working hours

9:00-18:00
