AWS Data Engineer

Temporary, full- or part-time · Dublin - Hybrid

About the job

We are seeking a highly skilled and motivated AWS Data Engineer to join our dynamic team on a fixed term contract. The ideal candidate will have extensive experience in building and maintaining data pipelines, data warehouses, and data lakes using AWS services. You will work closely with our data science and analytics teams to ensure data is readily available, secure, and efficiently processed.

Specifically, you will:
  • Design, develop, and maintain scalable data pipelines and ETL processes using AWS Glue, Glue Crawlers, and Glue Data Catalog.

  • Manage and optimise Redshift clusters, ensuring high availability and performance of data warehousing solutions.

  • Implement serverless data processing workflows using AWS Lambda.

  • Utilise Apache Airflow for orchestrating complex data workflows and ensure the smooth, timely execution of data pipelines.

  • Implement and manage data versioning and schema evolution using Apache Hudi.

  • Ensure data security and compliance through robust IAM policies and AWS Lake Formation.

  • Collaborate with data scientists, analysts, and stakeholders to understand data needs and deliver high-quality data solutions.

  • Monitor and troubleshoot data pipelines and workflows to ensure data quality and system reliability.

  • Document data workflows, architectures, and best practices to ensure knowledge sharing and continuous improvement.

What will pique our interest is a candidate who has:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.

  • Minimum of 1 year of experience using DBT.

  • 3+ years of experience as a Data Engineer with a focus on AWS technologies.

  • Strong expertise in AWS Glue, Glue Crawlers, and Glue Data Catalog.

  • Proficiency in Amazon Redshift including performance tuning, cluster management, and data warehousing best practices.

  • Experience with AWS Lambda for building serverless data processing applications.

  • Hands-on experience with Apache Airflow for orchestrating ETL workflows.

  • Familiarity with Apache Hudi for managing large-scale data lakes.

  • In-depth knowledge of IAM for managing access and permissions in AWS.

  • Experience with AWS Lake Formation for managing secure data lakes.

  • Strong programming skills in Python, SQL, and other relevant languages.

  • Excellent problem-solving skills and the ability to troubleshoot complex data issues.

  • Strong communication and collaboration skills to work effectively with cross-functional teams.

How we can support you:
  • We offer a flexible hybrid work environment that combines the best aspects of remote work and office collaboration. Our headquarters are in Dublin's vibrant Temple Bar, while our cutting-edge R&D centre resides in the historic Centro Histórico in Malaga, Spain; each providing a dynamic backdrop for your work. You can enjoy the perks of in-person collaboration, networking, and a vibrant office culture; however, you also have the flexibility to work remotely (with a home office allowance) and maintain a healthy work-life balance.
  • Unlimited leave. Spark believes in recognising and rewarding based on the expertise and value brought to our customers. We prioritise results and empower our team members to achieve greatness while maintaining a healthy work-life balance.
  • We believe in equitable compensation for all our team members. We offer market-beating packages and a pension plan that acknowledge the value and expertise everyone brings to Spark. Our dedication to providing transparent and competitive salaries reflects our appreciation for hard work and our commitment to fostering a motivated and thriving workforce.
  • We prioritise our employees' wellbeing, and we've been recognised as a finalist for an Employee Wellbeing Award. Our Employee Wellbeing initiatives include programmes for physical health (Fitbit, Bike to Work Scheme, etc.), mental resilience, work-life balance, and career development, along with a comprehensive VHI Health Insurance Plan.
  • Spark offers an annual training budget with 2 weeks of study leave to attend conferences and courses that enhance your skills, providing the dedicated time to focus on your learning and development.
We are looking forward to hearing from you!
Thank you for your interest in Spark. Please fill out the following short form. Should you have difficulties with the upload of your data, please send an email to recruitment@spark-hq.com