Data Engineers

Skill Set: Scala, Databricks, Data Architecture

Skills to Evaluate: Scala, Databricks, Data Architecture, Spark, Java, System Integration, Data Migration, Systems, Windows, ETL, Other, SQL

Job Description:

- 6-8 years of IT experience focusing on enterprise data architecture and management
- Experience in conceptual/logical/physical data modeling and expertise in relational and dimensional data modeling
- Experience with Databricks and on-prem environments, Structured Streaming, Delta Lake concepts, and Delta Live Tables required
- Experience with Spark programming in Scala and Java
- Data lake concepts such as time travel, schema evolution, and optimization
- Structured Streaming and Delta Live Tables with Databricks a bonus
- Experience leading and architecting enterprise-wide initiatives, specifically system integration, data migration, transformation, data warehouse builds, data mart builds, and data lake implementation/support
- Advanced understanding of streaming data pipelines and how they differ from batch systems (see the Structured Streaming sketch after this list)
- Ability to formalize how to handle late data, define windows, and reason about data freshness
- Advanced understanding of ETL and ELT and of ETL/ELT tools such as Data Migration Service
- Understanding of concepts and implementation strategies for different incremental data loads, such as tumbling window, sliding window, and high watermark (see the incremental-load sketch after this list)
- Familiarity and/or expertise with Great Expectations or other data quality/data validation frameworks a bonus
- Familiarity with concepts such as late data, window definitions, and how window definitions impact data freshness
- Advanced SQL experience (joins, aggregation, window functions, common table expressions, RDBMS schema design and performance optimization)
- Experience with indexing and partitioning strategies
- Ability to debug, troubleshoot, design, and implement solutions to complex technical issues
- Experience with large-scale, high-performance enterprise big data application deployment and solutions
- Architecture experience in an AWS environment a bonus
- Familiarity with AWS Lambda, specifically how to push and pull data and how to use AWS tools to inspect data when processing massive datasets at scale, a bonus
- Experience with GitLab and CloudWatch, and the ability to write and maintain GitLab pipelines supporting CI/CD
- Experience configuring and optimizing AWS Lambda, and experience with S3
- Familiarity with Schema Registry and message formats such as Avro, ORC, etc.
- Ability to thrive in a team-based environment
- Experience briefing the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior management

Skillset: Java, Scala, S3, Glue, Redshift
Location: Scottsdale
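For context on the streaming items above, here is a minimal Structured Streaming sketch in Scala that ties together a watermark (to bound late data), a tumbling event-time window, and an append to a Delta table. The Kafka topic, broker address, payload format, and file paths are assumptions for illustration only, and the spark-sql-kafka and Delta Lake libraries are assumed to be on the classpath (as they are on Databricks).

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object StreamingWindowSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("streaming-window-sketch")
      .getOrCreate()
    import spark.implicits._

    // Read a stream from Kafka (topic and broker are hypothetical).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Parse the payload; assume a simple "userId,eventTime" CSV value.
    val events = raw.selectExpr("CAST(value AS STRING) AS value")
      .select(
        split($"value", ",").getItem(0).as("userId"),
        split($"value", ",").getItem(1).cast("timestamp").as("eventTime")
      )

    // Watermark: accept events up to 10 minutes late, then drop them.
    // Tumbling window: non-overlapping 5-minute buckets keyed on event time.
    val counts = events
      .withWatermark("eventTime", "10 minutes")
      .groupBy(window($"eventTime", "5 minutes"), $"userId")
      .count()

    // Append finalized windows to a Delta table; the checkpoint location
    // makes the query restartable with exactly-once semantics.
    counts.writeStream
      .format("delta")
      .outputMode("append")
      .option("checkpointLocation", "/mnt/delta/_checkpoints/events_by_window")
      .start("/mnt/delta/events_by_window")
      .awaitTermination()
  }
}
```

The watermark interval is the knob that trades data freshness against tolerance for late arrivals: a longer watermark waits longer before finalizing each window.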
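Likewise, a minimal sketch of the high-watermark incremental load pattern referenced above, reading only rows newer than the last loaded timestamp and merging them into a Delta target. The JDBC source, table and column names, and connection details are hypothetical, not part of the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import io.delta.tables.DeltaTable

object HighWatermarkLoadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("high-watermark-load").getOrCreate()

    val targetPath = "/mnt/delta/orders" // hypothetical Delta target
    val epoch = java.sql.Timestamp.valueOf("1970-01-01 00:00:00")

    // 1. Determine the current high watermark from the target (epoch on first run).
    val lastWatermark: java.sql.Timestamp =
      if (DeltaTable.isDeltaTable(spark, targetPath)) {
        val row = spark.read.format("delta").load(targetPath)
          .agg(max("updated_at")).head()
        Option(row.getTimestamp(0)).getOrElse(epoch)
      } else epoch

    // 2. Pull only rows newer than the watermark from the source (hypothetical JDBC source).
    val increment = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://source-db:5432/sales")
      .option("dbtable", s"(SELECT * FROM orders WHERE updated_at > '$lastWatermark') AS inc")
      .option("user", "etl")
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
      .load()

    // 3. Upsert the increment into the Delta target on its key column.
    if (DeltaTable.isDeltaTable(spark, targetPath)) {
      DeltaTable.forPath(spark, targetPath).as("t")
        .merge(increment.as("s"), "t.order_id = s.order_id")
        .whenMatched().updateAll()
        .whenNotMatched().insertAll()
        .execute()
    } else {
      increment.write.format("delta").save(targetPath)
    }
  }
}
```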

Please submit your resume (.pdf, .doc, or .docx) to: info@bytelinksys.com

Job Category: Data Engineers
Job Type: Full Time
Job Location: SANTA CLARA

