vinod.tiwari@tekgence.com
- Understand requirements and perform data analysis.
- Set up Microsoft Fabric and its components.
- Build secure, scalable solutions across the Microsoft Fabric platform.
- Create and manage Lakehouses.
- Implement Data Factory processes for data ingestion, scalable ETL, and data integration.
- Design, implement, and manage comprehensive warehousing solutions for analytics using Fabric.
- Create and schedule data pipelines using Azure Data Factory.
- Build robust data solutions using Microsoft Fabric data engineering tools such as Notebooks, Lakehouses, and Spark applications.
- Build and automate CI/CD deployment pipelines to promote Fabric content from lower to higher environments.
- Set up and use Git for source control and versioning of Fabric components.
- Create and manage Power BI reports and semantic models.
- Write and optimize complex SQL queries to extract and analyze data, ensuring accurate reporting.
- Work closely with customers, business analysts, and project teams to understand business requirements and align with architecture standards.
- Follow change management procedures to implement project deliverables.
- Coordinate with support teams to resolve issues in a timely manner.
Job Qualifications
Mandatory
- Bachelor’s degree in Computer Science or related field, or equivalent experience.
- 3+ years of experience working with Microsoft Fabric.
- Expertise in OneLake, Lakehouse, Warehouse, and Notebooks.
- Strong understanding of Power BI reports and semantic models in Fabric.
- Proven experience building ETL and data solutions using Azure Data Factory.
- Strong understanding of data warehousing concepts and ETL processes.
- Hands-on experience building data warehouses in Fabric.
- Strong skills in Python and PySpark.
- Practical experience implementing Spark in Fabric, scheduling Spark jobs, and writing Spark SQL queries.
- Experience using Data Activator to monitor data and trigger automated actions on analytics events.
- Ability to adapt to different tools and technologies.
- Strong learning attitude.
- Good written and verbal communication skills.
- Experience working in distributed teams across multiple locations.
Preferred
- Knowledge of AWS services.
- Knowledge of Snowflake.
- Knowledge of real-time analytics in Fabric.
