Principal Data Engineer
At Citizens, we’re more than a bank. Here, you’ll experience new things, create new opportunities, think beyond your role, and make an impact!
In this role, you will serve as a key contributor and leader in the space of technological innovation. You will combine your technical expertise and strong leadership skills to spearhead a results-driven engineering operation, and you will be given the autonomy to lead, design, and develop innovative solutions to some of the biggest technical issues facing the banking industry. You will also serve as a peer leader tasked with pursuing cutting-edge initiatives and solutions, managing a dynamic workload that encompasses both attribute enhancements and cutting-edge innovations.
Most importantly, you’ll feel valued for who you are and supported to achieve what’s important to you, personally and professionally!
Primary responsibilities include
Serve as a key contributor to the development of data solutions; collaborate with stakeholders to determine solution requirements.
Engineer relational and non-relational databases.
Develop mechanisms, applications, and processes that continuously harvest big data, spanning systems architectures, programming, database design, database configuration, interface configuration, sensor configuration, etc.
Create comparative analyses of data stores and information flow within the organization; identify areas of friction, lag, or compromised data integrity.
Review and manage the user interfaces of data flow mechanisms.
Produce data models in alignment with business objectives.
Qualifications, Education, Certifications and/or Other Professional Credentials
Required Qualifications
Bachelor's Degree in Computer Science, Statistics, or a related discipline
8+ years of working experience in data engineering; experience managing engineers and/or technical personnel
Experience handling big data and building big data solutions using cloud platforms
Experience with the Hadoop ecosystem (HDFS, Hive, Pig, HBase, HiveQL)
Experience with ETL/ELT tools (e.g., DataStage, Informatica, Sqoop); building/enhancing ETL frameworks, etc.
Experience with the following technologies: RDBMS (e.g., Teradata, Oracle, MS-SQL, DB2, Redshift, Snowflake); Shell scripting, SQL, PL/SQL, Python, Java, Spark, Scala, etc.
SCM tools such as Bitbucket with Jira, StarTeam, or SVN
AWS native services such as S3, RDS, Glue, Athena, etc.
Preferred Qualifications
Master's Degree in Computer Science, Statistics, or a related discipline
Big Data Tech/AWS/Agile certifications
Experience working in the financial services industry; understanding of the role of consumer and commercial banking
Experience working with machine learning technologies and/or AI
Experience with Tableau, Cognos, or SAS
Hours & Work Schedule
Hours per Week: 40
Work Schedule: Monday - Friday
Location: US