Data Engineer (AWS)
Cluj, RO · Brasov, RO · Sibiu, RO · Iasi, RO · Timisoara, RO · Bucuresti, RO · Remote, RO
Who We Are
At the heart of our outsourcing organization, the Data & Intelligence Competence Center serves as a dedicated hub for advanced data-driven solutions. We specialize in data engineering, analytics, and AI-powered insights, helping businesses turn raw information into actionable intelligence. By combining deep technical expertise with industry best practices, we enable smarter decision-making, optimize processes, and foster innovation across diverse sectors.
Building on this foundation, we collaborate with a world-leading reinsurance and risk management company. Our client delivers comprehensive solutions in insurance, underwriting, and data-driven risk assessment. With a strong commitment to innovation and long-term stability, they empower organizations to navigate complex risks and create sustainable value in an ever-changing global landscape. To support this mission, we are seeking a highly skilled AWS Data Engineer to strengthen our data and analytics ecosystem. In this role, you will design, develop, and manage cloud-based data solutions, leveraging big data frameworks to build efficient pipelines, optimize storage, and implement robust processing workflows, ensuring high-quality data availability for analytics and business intelligence.
What you'll be doing
- Build and maintain large-scale ETL pipelines using AWS Glue, Lambda, and Step Functions
- Design and manage data lakes on Amazon S3, implementing robust schema management and lifecycle policies
- Work with Apache Iceberg and Parquet formats to support efficient and scalable data storage
- Develop distributed data processing workflows using PySpark
- Implement secure, governed data environments using AWS Lake Formation
- Build and maintain integrations using AWS API Gateway and data exchange APIs
- Automate infrastructure provisioning using Terraform or CDK for Terraform (CDKTF)
- Develop CI/CD pipelines and containerized solutions within modern DevOps practices
- Implement logging, observability, and monitoring solutions to maintain reliable data workflows
- Perform root cause analysis and optimize data processing for improved performance and quality
- Collaborate with business intelligence teams and analysts to support reporting and analytics needs
- Work in cross-functional, Agile teams and actively participate in sprint ceremonies, backlog refinement, and planning
- Provide data-driven insights and recommendations that support business decision-making
What you'll bring along
- Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience)
- 3–5 years of experience in a Data Engineering role
- Strong knowledge of AWS services: Glue, Lambda, S3, Athena, Lake Formation, Step Functions, DynamoDB
- Proficiency in Python and PySpark for data processing, optimization, and automation
- Hands-on experience with Terraform or CDKTF for Infrastructure as Code
- Solid understanding of ETL development, data lakes, schema evolution, and distributed processing
- Experience working with Apache Iceberg and Parquet formats (highly valued)
- Experience with CI/CD pipelines, automation, and containerization
- Familiarity with API Gateway and modern integration patterns
- Strong analytical and problem-solving skills
- Experience working in Agile Scrum environments
- Good understanding of data governance, security, and access control principles
- Experience with visualization/BI tools such as Power BI or AWS QuickSight is a plus
- Excellent command of both spoken and written English
Nice to Have
- Experience designing data products, implementing tag-based access control, or applying federated governance using AWS Lake Formation
- Familiarity with Amazon SageMaker for AI/ML workflows
- Hands-on experience with AWS QuickSight for building analytics dashboards
- Exposure to data mesh architectures
- Experience with container orchestration (e.g., Kubernetes, ECS, EKS)
- Knowledge of modern data architecture patterns (e.g., CDC, event-driven pipelines, near-real-time ingestion)
What’s in it for you
✔ New beginnings can be a challenge. We promise a smooth integration and a supportive mentor
✔ Pick your working style: choose from Remote, Hybrid or Office work opportunities
✔ Early bird or night owl? Our projects have different working hours to suit your needs
✔ Nobody is born an expert. Sharpen your tech skills with our sponsored certifications, trainings and top e-learning platforms
✔ We want you to stay healthy! Enjoy our Private Health Insurance – it’s custom-made for you
✔ A clear mind is a healthy mind. Attend individual coaching sessions or go one step further by joining our accredited Coaching School
✔ Make the most of our epic parties or themed events – they’re lovingly designed for our people and their families
NTT DATA Romania is an equal opportunity employer and considers all applicants regardless of race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees.
Not the job for you? Perhaps you have a friend who would be a perfect fit. Send them this link!
Third parties fraudulently posing as NTT DATA recruiters
NTT DATA recruiters will never ask job seekers and candidates for payment or banking information during the recruitment process, for any reason. Please remain vigilant of third parties that may try to impersonate NTT DATA recruiters, either in writing or by phone, in an attempt to deceptively obtain personal data or money from you. All email communications from an NTT DATA recruiter will be associated with an @nttdata.com email address. NTT DATA will not use any non-NTT DATA or personal email domains (Gmail, Yahoo, etc.) or personal communication channels (WhatsApp, Facebook, etc.) at any time during the recruitment process. If you suspect any fraudulent activity, please contact us.