We’re a Data Science Company with expertise in Big Data Technology and intelligent software development, always looking for new talent to join our growing Data Science family.
We are looking for ‘Big’ Data Engineers for our Dutch office.
How can you tell a story if your facts aren’t connected and your sources aren’t validated? Do you want to take on the challenges of data integration?
Maybe you should consider joining the Intellerts team as ‘Big’ Data Engineer!
As a Data Science and AI company focused on delivering ground-breaking AI solutions for clients across many industries, we know that it all starts with good questions and knowledge of the data we work with: data that needs to be extracted, loaded, transformed, streamlined, analysed and placed into an analytical infrastructure for our clients. For this we are looking for Data Engineers who master data integration from different sources, create data pipelines for the efficient exchange of data, and always have creative ideas for solving complex data problems. Our data engineers play a crucial role in a team of data analysts, data scientists and software engineers working together on data initiatives; you will ensure an optimal data delivery architecture that stays consistent throughout ongoing projects. You must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
On the job, you will:
- Create, maintain and optimize data pipeline architecture;
- Build large, complex data sets that meet the functional and non-functional requirements set by the business;
- Identify, design, and implement internal process improvements: using your creativity to automate manual processes, optimize data delivery, and re-design infrastructure for greater scalability where needed;
- Build and maintain data pipeline infrastructure for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, PostgreSQL and cloud-based technologies such as Azure, AWS and Google Cloud;
- Work with clients and data and analytics experts to improve our data systems and deliver richer functionality to our clients.
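As a small illustration of the extract-transform-load work described above, a minimal pipeline step might look like the sketch below. It uses Python’s built-in `sqlite3` to stay self-contained; all table and column names (`orders_raw`, `orders_clean`, `amount_cents`) are hypothetical examples, not part of any real project.

```python
import sqlite3

# Minimal ETL sketch: extract rows from a source table, apply a
# transformation, and load the result into a target table.
# All table and column names are hypothetical illustrations.

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw orders, normalise amounts to euros, load a clean table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders_clean (id INTEGER, amount_eur REAL)"
    )
    rows = conn.execute("SELECT id, amount_cents FROM orders_raw").fetchall()
    # Transform: convert cents to euros, dropping negative (invalid) amounts.
    cleaned = [(oid, cents / 100.0) for oid, cents in rows if cents >= 0]
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders_raw (id INTEGER, amount_cents INTEGER)")
    conn.executemany(
        "INSERT INTO orders_raw VALUES (?, ?)", [(1, 1999), (2, -50), (3, 250)]
    )
    print(run_etl(conn))  # prints 2: one invalid row was dropped
```

In a real engagement the same extract/transform/load shape would be expressed against PostgreSQL or a cloud warehouse rather than an in-memory database.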
If the thought of this responsibility scares you, we’re not the right team for you. But if you would like to be challenged on a daily basis and have fun with lots of data, we would love to talk with you.
We are looking for an engineering jack of all trades. Above all, you want to keep developing your skills in the following areas, apply them across a wide range of projects and share them with your team:
- A high level of curiosity about Data, Analytics, and Artificial Intelligence;
- A habit of asking questions about the things you don’t understand but would like to;
- Hands-on experience with data storage: relational databases (SQL Server, PostgreSQL, MySQL, etc.), non-relational stores (HDFS, HBase, Cassandra, MongoDB) and cloud-based platforms (Azure, AWS, Google Cloud);
- Knowledge of data manipulation and transformation tools, e.g. SQL, R, Talend;
- Hands-on experience building complex data pipelines, e.g. ETL;
- Programming skills in, for example, Python or R;
- Experience with every permutation of the extract, load & transform process (ETL, ELT, and so on).
Next to awesome data engineering skills, you have full professional proficiency in Dutch and English, as you will be working with our international team for international clients.
Did you recognize yourself in this profile, and are you getting warmed up?! Send your application letter and resume to (firstname.lastname@example.org).
For questions about this position and/or Intellerts, you can call +31 (0)6 2150 4918.
We are looking for a Database Developer in Kaunas.
As a database specialist, your main activity will be the design, development, optimization and maintenance of data warehouse and data integrations. You will work with the product development team, product managers and technical experts within the company to deliver high-quality solutions.
- Collaborate with a diverse team of business analysts and BI developers to develop and build Enterprise Data Warehouse (EDW) artifacts, such as SQL code and logical/physical data models;
- Optimize the performance of BI reports by improving the warehouse data model and optimizing SQL queries;
- Maintain the ETL process for data warehouse structures that support the reporting and data analytic systems;
- Refactor the data models of the warehouse to support the long-term goals of maintainability and performance;
- Collaborate with data providers to integrate them with the warehouse and maintain the integrations as changes occur on the data providers’ side;
- Create SQL scripts to find inconsistencies and anomalies within data;
- Perform both analyst and developer responsibilities as needed within the full stack supporting the data warehouse;
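The SQL data-quality checks mentioned above (finding inconsistencies and anomalies in warehouse data) can be sketched as follows. The example runs plain SQL through Python’s built-in `sqlite3` so it is self-contained; the table names (`fact_orders`, `dim_customer`) and the two checks are hypothetical illustrations, not an actual warehouse schema.

```python
import sqlite3

# Sketch of warehouse data-quality checks: orphaned foreign keys in a
# fact table, and duplicate business keys in a dimension table.
# All table and column names are hypothetical.

ANOMALY_CHECKS = {
    # Fact rows whose customer_id has no matching dimension row.
    "orphaned_facts": """
        SELECT f.order_id FROM fact_orders f
        LEFT JOIN dim_customer c ON f.customer_id = c.customer_id
        WHERE c.customer_id IS NULL
    """,
    # Business keys that appear more than once in the dimension.
    "duplicate_customers": """
        SELECT customer_id FROM dim_customer
        GROUP BY customer_id HAVING COUNT(*) > 1
    """,
}

def find_anomalies(conn: sqlite3.Connection) -> dict:
    """Run every check and return the offending keys per check name."""
    return {
        name: [row[0] for row in conn.execute(sql)]
        for name, sql in ANOMALY_CHECKS.items()
    }

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE dim_customer (customer_id INTEGER)")
    conn.execute("CREATE TABLE fact_orders (order_id INTEGER, customer_id INTEGER)")
    conn.executemany("INSERT INTO dim_customer VALUES (?)", [(1,), (2,), (2,)])
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?)", [(10, 1), (11, 99)])
    print(find_anomalies(conn))  # order 11 is orphaned; customer 2 is duplicated
```

Keeping each check as a named query makes it easy to add new anomaly rules and to run the whole suite as part of the ETL process.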
- Bachelor’s degree in computer science, information technology or related areas;
- At least 3 years of experience with SQL query/script writing and ETL management in Microsoft SQL Server and/or PostgreSQL;
- Good SQL query optimization skills (logical/physical operators, hints, etc.);
- Proficiency in data modeling (OLTP/OLAP) and warehousing;
- Hands-on experience with the Agile process and its management tools (e.g. JIRA);
- Good English skills (both written and oral) to communicate and coordinate with functional teams;
- Excellent critical thinking, troubleshooting, and creative problem-solving abilities;
- Ability to deliver against challenging deadlines;
- Taking responsibility for tasks and challenges.
- Knowledge of advanced data technology stacks (e.g. NoSQL, Big Data);
- Basic knowledge of data visualization tools (e.g. Tableau, Qlik);
- Basic understanding of programming languages;
- Understanding of business processes;
- The opportunity to grow in your field in an international company;
- The opportunity to learn from professionals;
- Excellent environment to expand your knowledge and skills;
- Competitive salary.
We are a team with challenging goals. Join our cause and realize your personal goals in the process! Let’s talk!
Please send your CV to email@example.com
Only selected candidates will be informed about the further process.