VaultSpeed
Alliance for Recruitment is the largest recruitment consultancy in Lithuania measured by capacity, number of successful placements and annual growth. We are a high-performing team of recruitment experts from various industries.
Our client, VaultSpeed, builds automation solutions that ensure organizations conducting analyses have all the data in place, in time, to make the right decisions. The company started in 2019 with the radical idea that automation should cover not only the collection of all data into a centralized location but also the reconciliation of different sources. Today, VaultSpeed provides a trusted, no-code tool that meets analytics requirements by solving complexity, accelerating delivery, increasing agility and reducing the chance of human error.
VaultSpeed, a worldwide leader in Data Integration Automation Products, is looking for a Medior Data Engineer who is willing to join our team of developers. The ideal candidate will be willing to learn about and contribute to Data Integration Projects or the development of Data Integration tools (ETL tools, automation tools, data pipelines).
The candidate will need at least 2-3 years of experience working on Data Integration Projects or developing Data Integration tools (ETL/ELT tools, automation tools, data pipelines). This background in the Data Integration space can be based on Big Data technology, the traditional relational world, or cloud-based relational database engines.
VaultSpeed is looking for a team player who can work independently from home and, later on, part-time in decentralized offices. We offer great remuneration, great benefits, openness, working on a responsibility basis, and regular local and global team events to keep you motivated.
Responsibilities:
- Working with other senior developers on various data engineering related tasks
- Learning about the Data Vault principles and developing the required skills
- Contributing to the existing code base and writing new tests
- Implementing new features
Requirements:
- For the development profile, VaultSpeed is looking for a Bachelor's degree in IT/Computer Science or higher.
- Medior SQL knowledge in a Data Integration Context.
- Medior Python programming skills.
- ETL-knowledge (any ETL-tool: Informatica, Talend, Oracle Data Integrator, Matillion, DBT or SQL-Scripts or Python).
OR
- Big Data knowledge in a Data Integration Context (Spark, Hive, …).
OR
- Used a Logical Data Warehousing tool like Denodo or Tibco to implement a Logical Data Integration solution.
- Basic knowledge of at least one of the following Data Integration architectures for the Data Factory:
- Kimball's Bus Architecture (dimensional modeling, star schemas).
- Bill Inmon's Corporate Information Factory.
- Dan Linstedt's Data Vault 1.0 and/or Data Vault 2.0.
- Java programming skills.
- A real team player.
Differentiating skills:
- Worked for at least 1 year on a Data Vault 2.0 Integration project.
- Developed code or tools to generate code to load data in an automated way.
- Understands the concept of Metadata in the context of generating code.
- Has knowledge about orchestration tools like Airflow or Azure Data Factory.
- Worked on an integration project with Snowflake or Microsoft Synapse as the Relational Database Engine.
- Knowledge of Groovy and/or ANTLR.
What we are looking for:
- Desire to investigate and try out new tools and technologies in the data engineering domain
- Good interpersonal skills and ability to work effectively within high-performing teams
- Evidence of self-motivation for continuous development
- Attention to detail & analytical thinking
Salary: 4000-4500 gross.