
Sirojiddin S.

Data Engineer

Sirojiddin is a Data Engineer with over six years of experience developing Business Intelligence and Big Data solutions. His expertise spans multiple data stacks, including the SQL Server stack (T-SQL, SSIS, SSRS, SSAS). He is proficient in programming languages such as Python and C#, and is adept with orchestration tools such as Apache Airflow.

Sirojiddin has hands-on experience in report development using MS Power BI and SSRS. He also has a strong background in building solutions with both on-premises tools and cloud technologies; his cloud expertise includes the MS Azure stack (ADF, ADLS Gen2, Azure Synapse, Azure Databricks), as well as Snowflake and dbt (data build tool).

Main expertise
  • Apache Spark (2 years)
  • Azure Data Factory (4 years)
  • CSV (5 years)
Other skills
  • Data Modeling (3 years)
  • dbt (3 years)
  • Apache Hive (2 years)

Selected experience


  • Senior Data Engineer

    Data Integrity Services Inc - 4 years 10 months

    • Solid understanding of and extensive hands-on experience with the main subsets of SQL commands: DML, DDL, TCL, and DCL.
    • Created and worked with various database objects, such as complex stored procedures, multi-statement user-defined functions, temporary tables, CTEs, derived tables, table types, views, and triggers.
    • Wrote complex queries using subqueries, CROSS APPLY, and OUTER APPLY.
    • Implemented error handling in T-SQL transactions using TRY...CATCH blocks.
    • Tuned the performance of Power BI reports using Performance Analyzer and DAX Studio.
    • Wrote complex DAX queries to create calculated columns and measures.
    • Created DAGs for ETL and ELT processes and monitored jobs using Apache Airflow.
    • Tuned Airflow performance at the DAG and task-instance level.
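    The transactional error-handling pattern described (T-SQL TRY...CATCH with a ROLLBACK) has a direct analogue in any DB-API client. A minimal illustrative sketch using Python's stdlib sqlite3; the `accounts` table, names, and business rule are hypothetical, invented for the example:

    ```python
    import sqlite3

    def transfer(conn, src, dst, amount):
        """Move `amount` between accounts atomically; roll back on any failure."""
        try:
            cur = conn.cursor()
            cur.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                        (amount, src))
            cur.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                        (amount, dst))
            # Enforce a business rule mid-transaction, as a CHECK constraint would.
            cur.execute("SELECT balance FROM accounts WHERE name = ?", (src,))
            if cur.fetchone()[0] < 0:
                raise ValueError("insufficient funds")
            conn.commit()
            return True
        except Exception:
            conn.rollback()  # analogous to ROLLBACK TRANSACTION in a CATCH block
            return False

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("a", 100), ("b", 0)])
    conn.commit()

    ok = transfer(conn, "a", "b", 60)       # succeeds: 100 covers 60
    failed = transfer(conn, "a", "b", 60)   # fails: would leave "a" negative
    ```

    Either every statement in the transaction takes effect or none does, which is exactly what the TRY...CATCH-plus-ROLLBACK idiom guarantees in T-SQL.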


    • Technologies: Apache Spark, Azure Data Factory, CSV, Data Engineering, Databricks, Microsoft Power BI, Azure Synapse, Azure Blob Storage, Snowflake

    MaxData Solutions LLC - 1 year 6 months

    • Implemented complex joins and subqueries, and loaded data from Oracle, IBM DB2, and PostgreSQL databases into SQL Server by creating linked servers and using OPENQUERY.
    • Tuned slow-running queries by adjusting SQL Server database properties, creating appropriate indexes, rewriting poorly performing queries, and inspecting execution plans.
    • Used the Execute Package Task to run child packages from a master package.
    • Implemented slowly changing dimension type 2 using a staging table and a T-SQL MERGE statement in SQL Server Integration Services.
    • Created and modified clustered and non-clustered indexes to optimize queries, using the Index Tuning Wizard.
    • Normalized tables and maintained referential integrity using primary and foreign keys.
    • Developed measures with complex dynamic DAX queries to calculate all required KPIs.
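    The slowly changing dimension type 2 merge described above can be sketched in plain Python to show the core logic. The actual implementation used a T-SQL MERGE with a staging table in SSIS; the keys, the `city` attribute, and the dates here are hypothetical:

    ```python
    from datetime import date

    def scd2_merge(dim, staging, load_date):
        """Type-2 merge: expire changed current rows and append new versions.
        Dimension rows carry start_date/end_date; end_date None means current."""
        current = {r["key"]: r for r in dim if r["end_date"] is None}
        for s in staging:
            row = current.get(s["key"])
            if row is None:
                # Brand-new business key: insert as the current version.
                dim.append({**s, "start_date": load_date, "end_date": None})
            elif row["city"] != s["city"]:
                # Tracked attribute changed: close the old version, open a new one.
                row["end_date"] = load_date
                dim.append({**s, "start_date": load_date, "end_date": None})
            # Unchanged keys are left untouched.
        return dim

    dim = [{"key": 1, "city": "NY", "start_date": date(2020, 1, 1), "end_date": None}]
    staging = [{"key": 1, "city": "CA"}, {"key": 2, "city": "TX"}]
    scd2_merge(dim, staging, date(2024, 6, 1))
    ```

    The same three branches (insert, expire-and-insert, ignore) map onto the WHEN NOT MATCHED / WHEN MATCHED clauses of the T-SQL MERGE statement.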


    • Technologies: Azure Data Factory, CSV, Databricks, SQL, Microsoft Power BI, Azure Synapse

    DataSite Technology - 1 year 11 months

    • Created and scheduled SQL jobs with SQL Server Agent to perform various administrative tasks.
    • Created a stored procedure using the nodes() method to load XML file data into SQL Server tables.
    • Created and used database constraints, such as PRIMARY KEY, FOREIGN KEY, UNIQUE, CHECK, and NOT NULL, to enforce referential integrity and apply business logic to tables in SQL Server.
    • Reverse-engineered databases and T-SQL script files for analysis.
    • Created complex ETL packages to load data incrementally from OLTP systems into the data warehouse using SQL Server Integration Services.
    • Loaded data from various sources, such as flat files, CSV, Excel, SQL Server, and Oracle Database, into the data warehouse using SSIS.
    • Used the Script Task to load data from multiple flat files into SQL Server.
    • Worked in Analysis Services, building both tabular and multidimensional models (OLAP cubes) on top of DW/DM/DB layers and writing complex DAX and MDX queries against them.
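    The incremental-load pattern mentioned above (copy only rows changed since the last run, tracked by a high-water mark) can be sketched in plain Python. The real packages used SSIS; the row shapes, `modified` watermark column, and sample values here are hypothetical:

    ```python
    def incremental_load(source_rows, target, watermark):
        """Upsert into `target` (keyed by business key) only the source rows
        modified after the last watermark, and return the new watermark."""
        changed = [r for r in source_rows if r["modified"] > watermark]
        for r in changed:
            target[r["id"]] = r  # upsert: insert new keys, overwrite existing ones
        return max((r["modified"] for r in changed), default=watermark)

    source = [
        {"id": 1, "value": "a", "modified": 10},
        {"id": 2, "value": "b", "modified": 25},
        {"id": 1, "value": "a2", "modified": 30},  # later revision of row 1
    ]
    target = {}
    wm = incremental_load(source, target, watermark=0)    # initial full load
    wm = incremental_load(source, target, watermark=wm)   # nothing new: no-op
    ```

    Persisting the returned watermark between runs is what lets each execution touch only the delta rather than rescanning the whole OLTP source.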


    • Technologies: CSV, SQL, Microsoft Power BI


  • BSc, World Economy and International Economic Relations

    UWED · 2014 - 2019
