Job Reference # 297450BR
Job Type Full Time

Your role

Do you like building complex, secure platforms at the touch of a button? Are you passionate about developing automated infrastructure as code that is rolled out successfully across a global implementation? Do you have what it takes to build robust solutions that aid data engineers in delivering their data pipelines?

At UBS, the Group Compliance & Regulatory Governance (GCRG) Technology team is looking for a hands-on Data Engineer on Azure, leveraging Databricks with Scala/Python, to:

• engineer reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, using Databricks and Airflow
• craft complex transformation pipelines across multiple datasets, producing valuable insights that inform business decisions, making use of our internal data platforms and educating others about best practices for big data analytics
• develop, train, and apply data engineering techniques to automate manual processes and solve challenging business problems, and ensure the quality, security, reliability, and compliance of our solutions by applying our digital principles and implementing both functional and non-functional requirements
• build observability into our solutions, monitor production health, help to resolve incidents, and remediate the root cause of risks and issues
• leverage Airflow to build complex, branching, data-driven pipelines, and Databricks to build the Spark layer of data pipelines
• leverage Python and Scala for low-level, complex data operations; codify best practices and methodology and share knowledge with other engineers at UBS

Your team

You will be working as part of the Group Compliance Regulatory Governance Technology stream that focuses on Data Analytics and Reporting.
Our crew uses the latest data platforms to further the group's data strategy and realize the true potential of data as an asset, utilizing data lakes and data virtualization for use with advanced analytics and AI/ML. The crew also ensures the data is managed with strategic sourcing and data quality tooling. Your team will be responsible for building the central GCRG data lake, developing data pipelines to strategically source data from master and authoritative sources, creating a data virtualization layer, and building connectivity for advanced analytics and Elasticsearch capabilities with the aid of cloud computing.

Diversity helps us grow, together. That's why we are committed to fostering and advancing diversity, equity, and inclusion. It strengthens our business and brings value to our clients.

Your expertise

• bachelor's or master's degree in computer science or a similar engineering discipline is highly desired
• ideally 5+ years of total IT experience in software development or engineering, and ideally 3+ years of hands-on experience designing and building scalable data pipelines for large datasets on cloud data platforms
• ideally 3+ years of hands-on experience in distributed processing using Databricks, Apache Spark (Python), and Kafka, and in leveraging the Airflow scheduler/executor framework
• ideally 2+ years of hands-on programming experience in Scala (must have), Python, and Java (preferred)
• experience with monitoring solutions such as Spark cluster logs, Azure logs, AppInsights, and Grafana to optimize pipelines, and knowledge of Azure-capable languages: Python, Scala, or Java
• proficiency working with large and complex codebase management systems such as GitHub/GitLab and Gitflow as a project committer, at both the command line and in IDEs, using tools like IntelliJ/AzureStudio
• experience working with Agile development methodologies and delivering within Azure DevOps, including automated testing and tools used to support CI and release management
• expertise in optimized dataset structures in Parquet and Delta Lake formats, with the ability to design and implement complex transformations between datasets
• expertise in optimized Airflow DAGs and branching logic for tasks to implement complex pipelines and outcomes, and expertise in both traditional SQL and NoSQL authorship

About Us

UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire

This role requires an assessment on application. Learn more about how we hire: www.ubs.com/global/en/careers/experienced-professionals.html

Join us

At UBS, we embrace flexible ways of working when the role permits. We offer different working arrangements like part-time, job-sharing, and hybrid (office and home) working. Our purpose-led culture and global infrastructure help us connect, collaborate, and work together in agile ways to meet all our business needs. From gaining new experiences in different roles to acquiring fresh knowledge and skills, we know that great work is never done alone. We know that it's our people, with their unique backgrounds, skills, experience levels, and interests, who drive our ongoing success. Together we're more than ourselves. Ready to be part of #teamUBS and make an impact?

Disclaimer / Policy Statements

UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills, and experiences within our workforce.

Your Career Comeback

We are open to applications from career returners. Find out more about our program on ubs.com/careercomeback.
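The role above repeatedly emphasizes branching, data-driven pipelines (Airflow DAGs that route work into a Spark layer on Databricks). As a purely illustrative aside, the core branching pattern can be sketched in plain Python with the standard library only; every function name here is hypothetical, and this is neither UBS code nor Airflow's actual API.

```python
# Toy, stdlib-only sketch of a data-driven branching pipeline:
# extract -> choose exactly one branch based on the data -> transform.
# All names are hypothetical illustrations of the pattern.

def extract(records):
    """Source step: normalize raw values into row dicts."""
    return [{"id": i, "value": v} for i, v in enumerate(records)]

def choose_branch(rows):
    """Data-driven branching: route small batches to a fast path."""
    return "fast_path" if len(rows) < 100 else "bulk_path"

def fast_path(rows):
    """In-process transform for small batches."""
    return [{**r, "value": r["value"] * 2} for r in rows]

def bulk_path(rows):
    """Large batches; a real pipeline might hand off to Spark here."""
    return [{**r, "value": r["value"] + 1} for r in rows]

BRANCHES = {"fast_path": fast_path, "bulk_path": bulk_path}

def run_pipeline(records):
    rows = extract(records)
    branch = choose_branch(rows)           # pick exactly one downstream task
    return branch, BRANCHES[branch](rows)  # "load" step omitted for brevity

branch, out = run_pipeline([10, 20, 30])
print(branch, out[0]["value"])  # fast_path 20
```

In Airflow itself, the analogous mechanism is a branch operator whose callable returns the ID of the single downstream task to execute, skipping the others; the sketch above only mimics that selection logic.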
Job Type
Full-time
Skills required
Azure, Python, Java, Agile
Location
Weehawken, NJ
Salary
Not specified
Date Posted
October 17, 2024
UBS is seeking a Senior Data Engineer to build and maintain scalable data pipelines on Azure using Databricks for Scala/Python. The role involves developing automated infrastructure and ensuring data quality and compliance.