Data Engineer

We are looking for a Data Engineer to join our Data Lab area.

You will be responsible for the design, development and maintenance of end-to-end data pipelines for Big Data analytics, working closely with other engineers and analysts to acquire, transform, analyse and visualise large-scale data.

Responsibilities:

  • Design, develop and deploy data pipelines using Apache Spark, Hadoop and other Big Data tools.
  • Develop and maintain ETL (Extract, Transform, Load) processes that load data from different sources into the data warehouse and data lake.
  • Optimise the performance of data pipelines and data storage systems.
  • Collaborate with data scientists and analysts to understand their needs and develop data analysis solutions.
  • Develop and maintain reports and dashboards for data visualisation.
  • Ensure data quality, security and integrity.
  • Keep up to date with Big Data technologies and analysis tools.

Key requirements:

  • 2+ years of experience working as a Data Engineer.
  • Hands-on experience with Apache Spark, Hadoop, Hive, Pig and other Big Data tools.
  • Experience with GCP (Google Cloud Platform) and with managing data warehouses and data lakes in the cloud.
  • Knowledge of SQL and scripting languages such as Python or Bash.
  • Ability to design and develop end-to-end data pipelines.
  • Ability to optimise the performance of data pipelines and data warehouse systems.
  • Ability to work independently and as part of a team.

Qualifications:

  • Bachelor’s degree in Computer Science, Computer Engineering or Mathematics.
  • Professional working knowledge of English.

Desirable experience:

  • Experience with machine learning and deep learning tools.
  • Knowledge of Docker and Kubernetes.
  • Experience with CI/CD (Continuous Integration/Continuous Delivery).

Location: Naples and Milan
