Kineton is seeking a Data Engineer for our Media, IT & Telco area.
You will be responsible for designing, developing and maintaining end-to-end data pipelines for Big Data analytics. You will work closely with other engineers and analysts to acquire, transform, analyze and visualize large-scale data.
You will join our team, taking a key role in the following activities:
- Design, develop and deploy data pipelines using Apache Spark, Hadoop and other Big Data tools.
- Develop and maintain ETL (Extract, Transform, Load) processes to load data from different sources into data warehouses and data lakes.
- Optimize the performance of data pipelines and data storage systems.
- Collaborate with data scientists and analysts to understand their needs and develop data analysis solutions.
- Develop and maintain reports and dashboards for data visualization.
- Ensure data quality, security and integrity.
- Keep up-to-date on Big Data technologies and analysis tools.
Key Requirements:
- 2+ years of working experience as a Data Engineer.
- Hands-on experience with Apache Spark, Hadoop, Hive, Pig and other Big Data tools.
- Experience with GCP (Google Cloud Platform) and managing data warehouse and data lake on the cloud.
- Knowledge of SQL and scripting languages such as Python or Bash.
- Ability to design and develop end-to-end data pipelines.
- Knowledge of Docker and Kubernetes.
- Ability to optimize the performance of data pipelines and data warehouse systems.
- Ability to work independently and as part of a team.
- Bachelor’s degree in Computer Science, Computer Engineering or Mathematics.
- Professional knowledge of the English language.
What we offer:
- Competitive compensation packages;
- Ongoing training and professional growth;
- Access to discount platforms;
- Company rewards program.
Place of work: Naples, Milan.