Work history

September 2019 – Present

Data Engineer / Architect at Cognetik, Cluj-Napoca

– Developed and maintained scalable data pipelines and built out new API integrations to support continuing increases in data volume and complexity.
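
An illustrative sketch (in Python; the endpoint, field names, and response format are all hypothetical, with a stubbed fetch standing in for a real HTTP client) of the kind of paginated API extraction and batching step this bullet describes:

```python
import json

def fetch_page(page):
    """Stub for an API call (e.g. requests.get); returns one JSON page."""
    data = {1: [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}],
            2: [{"id": 3, "amount": 30}]}
    return json.dumps({"records": data.get(page, []), "has_more": page < 2})

def extract_all():
    """Walk the paginated endpoint until the API reports no more pages."""
    page, records = 1, []
    while True:
        payload = json.loads(fetch_page(page))
        records.extend(payload["records"])
        if not payload["has_more"]:
            return records
        page += 1

def to_batches(records, size):
    """Chunk records for bulk warehouse inserts."""
    return [records[i:i + size] for i in range(0, len(records), size)]

rows = extract_all()
batches = to_batches(rows, 2)
```

Batching the extracted rows keeps warehouse load steps efficient as data volume grows.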

– Collaborated with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.

– Designed and evaluated open-source and vendor tools for data lineage.

– Worked closely with all business units and engineering teams to develop a long-term data platform architecture strategy.

– Employ a range of programming languages and tools to integrate systems.

– Build high-performance algorithms, predictive models, and prototypes.

– Recommend ways to continuously improve data reliability and quality.

February 2018 – September 2019

Senior Business Intelligence Analyst at Cognetik, Cluj-Napoca

– Client Team Lead:

1. Handle new client requirements and act as the primary point of contact for the business.

2. Work with business stakeholders (analysts, engineers, developers, managers, the executive team) to find the best data solution.

3. Coordinate the daily allocation of work.

4. Estimate projects and tasks.

5. Work with the product team to improve product quality.

6. Mentor and train junior and new staff.

7. Travel to client sites to provide tool training.

– Delivery Mentor:

1. Ensure that the quality of work meets the highest standards.

2. Provide training and mentorship for new tools.

3. Guide the team on how to improve their work and how to automate their processes.

4. Identify individuals' weak areas and design appropriate training.

5. Work with key stakeholders to define requirements.

– Manipulate, transform, and analyze large quantities of data across relational and non-relational database systems (e.g. Amazon Redshift, Snowflake, Google BigQuery, PostgreSQL).

– Create and improve client dashboards: build the backend data logic and design; improve performance; create formulas and functions to transform and cleanse the data (e.g. Excel, Power BI, Tableau, MicroStrategy, Domo, Google Data Studio).

– Develop scripts to automate tasks

– Develop custom ETL scripts to enable daily processing of multiple big data sources into the data warehouse.
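
A minimal sketch of such a daily ETL step (table and column names are hypothetical; sqlite3 stands in for a warehouse like Redshift, Snowflake, or BigQuery):

```python
import sqlite3

# Hypothetical raw landing data: (day, region, amount-as-string).
raw = [("2023-01-01", "us", "12.5"), ("2023-01-01", "eu", "7.0"),
       ("2023-01-01", "us", "3.5")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_revenue (day TEXT, region TEXT, "
             "revenue REAL, PRIMARY KEY (day, region))")

# Transform: cast strings to floats and aggregate per (day, region).
agg = {}
for day, region, amount in raw:
    agg[(day, region)] = agg.get((day, region), 0.0) + float(amount)

# Load with an idempotent upsert so daily reruns don't duplicate rows.
conn.executemany(
    "INSERT INTO daily_revenue VALUES (?, ?, ?) "
    "ON CONFLICT(day, region) DO UPDATE SET revenue = excluded.revenue",
    [(d, r, v) for (d, r), v in agg.items()])

us_total = conn.execute(
    "SELECT revenue FROM daily_revenue WHERE region = 'us'").fetchone()[0]
```

The upsert keeps the job safe to re-run, which matters for daily schedules that may be retried.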

– Data architecture: build conceptual, logical, and physical data models for data integration (e.g. ER/Studio, Oracle Data Modeler).

– Use ETL tools: Talend, SSIS, IBM DataStage, Cloud Dataflow.

– Develop and implement custom data validation checks.
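
A simple illustration of custom row-level validation of this kind (the fields and rules are hypothetical, chosen only to show the pattern):

```python
# Each rule maps a field name to a predicate the value must satisfy.
RULES = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "age": lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def validate(row):
    """Return the failed field names (empty list means the row is valid)."""
    return [field for field, check in RULES.items()
            if field not in row or not check(row[field])]

rows = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": -5, "email": "not-an-email", "age": 200},
]
valid = [r for r in rows if not validate(r)]
rejected = [r for r in rows if validate(r)]
```

Rejected rows can be routed to a quarantine table for inspection instead of silently loading bad data.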

– Use APIs to build integrations.

– Build web crawlers/scrapers for online data sources with no API endpoint.
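
A small sketch of such a scraper using the standard-library HTML parser (the page markup and `product-name` class are invented, and the HTML is inline so the example runs offline; a real crawler would fetch pages over HTTP):

```python
from html.parser import HTMLParser

class ProductParser(HTMLParser):
    """Collects the text of <span class="product-name"> elements."""
    def __init__(self):
        super().__init__()
        self.in_name = False
        self.names = []

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "product-name") in attrs:
            self.in_name = True

    def handle_data(self, data):
        if self.in_name:
            self.names.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "span":
            self.in_name = False

html = """<ul>
  <li><span class="product-name">Widget A</span></li>
  <li><span class="product-name">Widget B</span></li>
</ul>"""

parser = ProductParser()
parser.feed(html)
```

In practice a library like BeautifulSoup would replace the hand-rolled parser, but the extraction logic is the same.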

– Work with data science and insights teams to build more extensive solutions.

– Inspect and improve SQL queries created by analysts.

– Create workflows in Alteryx to prepare, blend and analyze data from all database systems

– Analyze data from e-marketing systems like Google Analytics and Adobe Analytics

– Use a range of BI tools.

– Analyze data using various data science methods

– Automate Excel reports using different SQL connectors
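
The report-automation pattern can be sketched as follows (table names and the query are hypothetical; sqlite3 stands in for the production SQL connector, and the csv module for an Excel writer such as openpyxl):

```python
import csv
import io
import sqlite3

# Hypothetical source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("us", 100.0), ("eu", 80.0), ("us", 50.0)])

# Run the report query over the live connection.
cursor = conn.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY region")

# Write the result set as a tabular report.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["region", "total_revenue"])  # header row
writer.writerows(cursor.fetchall())
report = buffer.getvalue()
```

Scheduling this script replaces the manual copy-paste refresh of a spreadsheet report.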

October 2017 – February 2018

Data Analyst at Cognetik, Cluj-Napoca

August 2016 – October 2017

Data Analyst at Endava, Cluj-Napoca

– Develop reports and dashboards in PowerBI

– Develop reports in Excel

– Create SQL databases to store data

– Create live connections between different systems and the reporting software

– Apply advanced statistics and machine learning methods

– Administer the Performance Analytics module in ServiceNow

November 2015 – August 2018

Research Assistant at Babes-Bolyai University, Cluj-Napoca

– Analyze sediments using various geophysical and geochemical methods.

– Analyze data using various statistical methods and software (Origin, R, SPSS, Excel plugins: XLSTAT, NumXL).

– Write scientific papers based on the analyzed data.