
Job Details

 

Data Engineer (Contract)

Location: Frankfurt, Germany | Rate: DOE
 

Our Client is looking for a Data Engineer whose key objective is to design and build a global, stable and up-to-date data warehouse capability that can fulfil the needs of portfolio managers, data scientists and analysts. This platform requires the setup and integration of multiple internal and external data sources, building a data catalogue and ensuring that data ownership and data completeness are in place.

1. Build and continuously review the functional and solution design of a data lake.
2. Develop Databricks notebooks for data delivery and data modelling, as well as SQL scripts (a minimal illustrative sketch follows this list).
3. Translate functional requirements into technical specifications, supported by a business analyst.
4. Technical design and build of interfaces and data feeds using Informatica and/or Azure technologies.
5. Create and maintain appropriate documentation using existing tools such as Jira/Confluence and applying the Company's standards.
6. Ensure the collection of meaningful data and feedback across the whole organization, while engaging stakeholders on the importance of a data layer.
7. Ensure alignment with Enterprise Architecture and the Enterprise Data Office for all interface solutions.
8. Analyze and understand business processes and requirements, design IT solutions based on the analysis, and assist the business and IT department during the implementation phase.
9. Design test plans and manage both automated and manual test execution. Document and execute functional test plans. Work with the end-user community to support the documentation and execution of UAT test cases.
10. Apply agile project and design methodologies.
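
As an illustration of item 2, the sketch below shows roughly what a data-delivery step in a Databricks notebook could look like: reading a raw extract from the lake, applying light cleansing and writing a curated Delta table. This is a minimal sketch, not part of the role description; the storage path, table and column names are hypothetical, and the built-in spark session that Databricks provides in every notebook is assumed.

# Hypothetical Databricks notebook cell (PySpark): deliver a raw extract into a curated Delta table.
# The storage path, table name and column names below are illustrative only.
from pyspark.sql import functions as F

raw_df = (
    spark.read  # `spark` is the session Databricks provides in every notebook
    .format("parquet")
    .load("abfss://raw@examplelake.dfs.core.windows.net/positions/")  # assumed ADLS Gen2 path
)

# Light cleansing: drop duplicate keys and stamp the load time
curated_df = (
    raw_df
    .dropDuplicates(["position_id", "as_of_date"])
    .withColumn("load_ts", F.current_timestamp())
)

# Write the curated layer as a Delta table for downstream consumers
(
    curated_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("curated.positions")
)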

Onboard new datasets onto the central platform using the Microsoft Azure stack (Azure Data Factory, Azure Databricks):
1. Implement dimensional modelling based on the design provided, using the Azure stack (see the sketch after this list).
2. Conduct data demand analysis to ensure that all standards are fulfilled.
3. Document the pipelines in the Wiki.
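
To illustrate step 1, the following sketch derives a simple star schema (one dimension table and one fact table) from a curated table. It is a sketch under stated assumptions only: it reuses the hypothetical curated.positions table and column names from the earlier example, and the target schema dw and the Delta format are likewise assumptions, not requirements from the advertiser.

# Hypothetical dimensional-modelling step (PySpark on Databricks):
# derive a small star schema from the curated layer. All names are illustrative.
from pyspark.sql import functions as F

curated = spark.table("curated.positions")  # assumed curated table from the onboarding step

# Dimension: one row per instrument
dim_instrument = (
    curated
    .select("instrument_id", "instrument_name", "asset_class")
    .dropDuplicates(["instrument_id"])
)
dim_instrument.write.format("delta").mode("overwrite").saveAsTable("dw.dim_instrument")

# Fact: market value per instrument, portfolio and reporting date
fact_holdings = (
    curated
    .groupBy("instrument_id", "portfolio_id", "as_of_date")
    .agg(F.sum("market_value").alias("market_value"))
)
fact_holdings.write.format("delta").mode("overwrite").saveAsTable("dw.fact_holdings")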

Overall Project Deliverables:
1. Data model and related database structure.
2. Delivery of data migration activities and data onboarding, and design of the required plans (eg migration, testing, data collection).
3. Review test cases and test execution, and provide feedback for improvement.
4. Defect analysis and resolution based on the provided information, infrastructure and tools.

Key Requirements:
Track record of project delivery using Python and SQL.
Previous experience with, or a good understanding of, cloud services, ie Microsoft Azure (Data Factory, Databricks, Data Lake).
Knowledge and experience of the Asset Management industry is a plus.


Posted Date: 03 May 2024 | Reference: JS/542 | Employment Business: Optimus E2E | Contact: Rob