Description:
Key Requirements:
- 3-10 years of hands-on experience in data engineering with a focus on ETL workflows, data pipelines, and cloud computing.
- Strong experience with AWS services for data processing and storage (e.g., S3, Glue, Athena, Lambda, Redshift).
- Proficiency in programming languages such as Python and PySpark.
- Deep understanding of microservices architecture and distributed systems.
- Familiarity with AI/ML tools and frameworks (e.g., TensorFlow, PyTorch) and their integration into data pipelines.
Feb 10, 2026
from:
dice.com