Step-by-Step Guide for Connecting Snowflake to Microsoft Fabric for Enhanced Data Analytics

Blog | November 4, 2024 | By Deeksha Gupta, Jaikishan Pandey

Integrating Snowflake with Microsoft Fabric involves a systematic approach to ensure that data flows efficiently from Snowflake’s cloud-based data warehouse into Microsoft Fabric’s comprehensive data platform. By connecting Snowflake’s robust data storage and querying capabilities with Microsoft Fabric’s advanced analytics and data management features, organizations can create a powerful data ecosystem. In this blog, we will explore how to achieve this integration effectively and share best practices for maximizing the strengths of both platforms.

Prerequisites 

  • Snowflake Account and Access
    Ensure that you have an active Snowflake account with the necessary permissions to access databases, warehouses, and schemas. You should also have your credentials ready, including the Server URL, Warehouse name, Username, Password, and Role (a quick connectivity check is sketched after this list).
  • Microsoft Fabric Subscription
    You need an active Microsoft Fabric subscription with appropriate permissions to create and manage workspaces, warehouses, lakehouses, and dataflows.
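
Before configuring anything in Fabric, it can help to confirm that these Snowflake credentials actually work. The snippet below is a minimal sketch of such a check, assuming the snowflake-connector-python package; all angle-bracketed values are placeholders to replace with your own.

```python
# Minimal sketch of a credential check, assuming the snowflake-connector-python
# package is installed (pip install snowflake-connector-python).
# All angle-bracketed values are placeholders to replace with your own.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",   # e.g. the part of your Server URL before .snowflakecomputing.com
    user="<username>",
    password="<password>",
    warehouse="<warehouse_name>",
    role="<role_name>",               # the role must have access to the target database
)

cur = conn.cursor()
cur.execute("SELECT CURRENT_WAREHOUSE(), CURRENT_ROLE(), CURRENT_VERSION()")
print(cur.fetchone())                 # confirms the warehouse and role that were applied
cur.close()
conn.close()
```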

Data Flow Overview

[Diagram: How data flows from Snowflake to Power BI]

To analyze and manage data from Snowflake within Microsoft Fabric, follow the steps below to set up your workspace, create the necessary data structures, and load data from Snowflake. This guide will prepare your data for insightful analysis within Power BI.

Steps to load the data:

Step 1

Creating a new workspace in Power BI Service

  • Click on the workspace tab, then add a new workspace, and name it accordingly
[Screenshot: New workspace in Power BI]

Step 2

Creating a Warehouse: Here, we are setting up a destination for our dataflow.

  • Open the newly created workspace >> Click on New >> More options
  • Now, under the Data Warehouse pane, select Warehouse >> Name it accordingly
[Screenshot: Creating a warehouse in Power BI]

Step 3

Creating a Lakehouse: For this use case, we are setting up destination paths for the dataflow in both the Lakehouse and the Warehouse.

  • Open the newly created workspace >> Click on New >> More options
  • Now, under the Data Engineering pane, select Lakehouse >> Name it accordingly
[Screenshot: Creating a lakehouse in Power BI]
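
As an optional alternative to the clicks in Steps 2 and 3, warehouses and lakehouses can also be created programmatically. The sketch below assumes the Microsoft Fabric REST API item endpoint and the azure-identity and requests Python packages; the workspace ID and item names are placeholders, and none of the click-through steps above depend on it.

```python
# Optional sketch: create a Warehouse and a Lakehouse through the Microsoft Fabric
# REST API instead of the UI. Assumes the azure-identity and requests packages and
# that your account can sign in interactively; the workspace ID is a placeholder.
import requests
from azure.identity import InteractiveBrowserCredential

token = InteractiveBrowserCredential().get_token(
    "https://api.fabric.microsoft.com/.default"
).token
headers = {"Authorization": f"Bearer {token}"}

workspace_id = "<workspace-id>"  # visible in the workspace URL in Fabric
items_url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items"

for name, item_type in [("SnowflakeWarehouse", "Warehouse"),
                        ("SnowflakeLakehouse", "Lakehouse")]:
    resp = requests.post(items_url, headers=headers,
                         json={"displayName": name, "type": item_type})
    print(item_type, resp.status_code)   # 201 or 202 indicates the item was accepted
```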

Step 4

Creating a Dataflow

  • Open the newly created workspace >> Click on New >> More options
[Screenshot: Snowflake DB data to Fabric]

Now under the Data Factory pane, select Dataflow Gen2

[Screenshot: Selecting Dataflow Gen2]

Step 5

Now click on the newly created Dataflow, and a Power Query window will open. Here, we can load the data from Snowflake and perform the desired transformations or customizations (removing duplicates, custom columns, calculations).

Loading the data

  • Click on the Get data tab >> More... >> Search “Snowflake” in the search box
[Screenshot: Loading data from Snowflake]
  • Click on Snowflake Database, and a new prompt will appear asking for the credentials
  • Fill in the mandatory fields – Server, Warehouse, Username, and Password

Note: You also need to specify the Role to see the data from the desired database.

[Screenshot: Connecting to the Snowflake data source]

After clicking Next, we will see a prompt listing the Snowflake databases and schemas. Select the desired tables from the respective schemas.

[Screenshot: Snowflake databases and schemas]
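
If a database or table you expect does not show up in this prompt, it is usually because the Role in the credentials has not been granted access to it. The sketch below, again assuming the snowflake-connector-python package, lists what a given role can actually see; the connection values and the database/schema names are placeholders.

```python
# Minimal sketch: list the databases and tables a given Snowflake role can see,
# using snowflake-connector-python. All angle-bracketed values are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<username>",
    password="<password>",
    warehouse="<warehouse_name>",
    role="<role_name>",   # only objects granted to this role will be listed
)

cur = conn.cursor()
cur.execute("SHOW DATABASES")
for row in cur.fetchall():
    print("database:", row[1])        # column 1 of SHOW output is the object name

cur.execute("SHOW TABLES IN SCHEMA <database>.<schema>")
for row in cur.fetchall():
    print("table:", row[1])

cur.close()
conn.close()
```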

Step 6

After loading all the tables, we will see this view in Power Query:

[Screenshot: Power Query after loading all the tables]

Now we can perform transformations on the tables as needed (a pandas equivalent of such transformations is sketched below). After finishing, we need to set the destination of each table; in our case, we point the destination to the Warehouse we created earlier.
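
The transformations themselves happen in the Power Query editor, but if you want to prototype the logic outside Fabric first, the sketch below shows a rough pandas equivalent of the examples mentioned earlier (removing duplicates, adding a calculated column) on data pulled directly from Snowflake. The table and column names (ORDERS, AMOUNT, TAX) are invented for illustration.

```python
# Illustrative only: a rough pandas equivalent of removing duplicates and adding
# a calculated column, on data fetched from Snowflake. ORDERS, AMOUNT, and TAX
# are invented names; requires snowflake-connector-python[pandas].
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<username>",
    password="<password>",
    warehouse="<warehouse_name>",
    role="<role_name>",
    database="<database>",
    schema="<schema>",
)

cur = conn.cursor()
cur.execute("SELECT * FROM ORDERS")
df = cur.fetch_pandas_all()                          # returns a pandas DataFrame

df = df.drop_duplicates()                            # remove duplicate rows
df["TOTAL_WITH_TAX"] = df["AMOUNT"] + df["TAX"]      # custom calculated column

print(df.head())
cur.close()
conn.close()
```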

Step 7

Pointing the Destination of tables to Warehouse

  • Select the table, navigate to the Data destination setting icon >> Select Warehouse >> Click on Next
[Screenshot: Setting the destination of tables in Power BI]

Select the warehouse, give the table a name, and hit Next.

[Screenshot: Destination settings in Power BI]
  • A new prompt will appear. Here, we can change the data types of the columns. However, if we don’t want to change anything, just click on Save settings, and our table will be loaded into the warehouse (a quick check of the loaded tables is sketched below)
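
Once the dataflow has refreshed, you can verify that the tables actually landed in the warehouse without opening the Fabric UI. The sketch below assumes pyodbc with the Microsoft ODBC Driver 18 for SQL Server and an interactive Microsoft Entra ID sign-in; the server value is the SQL connection string shown in the warehouse settings, and both values shown are placeholders.

```python
# Minimal sketch, assuming pyodbc with the Microsoft ODBC Driver 18 for SQL Server
# and an interactive Microsoft Entra ID sign-in. Server and database are placeholders;
# the server value is the SQL connection string from the warehouse settings page.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<warehouse-sql-connection-string>,1433;"
    "Database=<warehouse_name>;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

cur = conn.cursor()
cur.execute("SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES ORDER BY TABLE_SCHEMA, TABLE_NAME")
for schema, table in cur.fetchall():
    print(f"{schema}.{table}")          # every table the dataflow wrote should appear here

conn.close()
```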

Step 8

Creating a Data Model

  • After setting the Warehouse as the destination for all the tables, go to the workspace and select the desired Warehouse. There we can see all the tables.
[Screenshot: Data model in Power BI]

Now, to create a data model, click on the Model tab at the bottom. Here we can build the model used to develop the report.

[Screenshot: Creating a data model in Power BI]

Step 9

Creating a Power BI Report
 
After creating the appropriate relationships among the tables, we can create a Power BI report.

  • Click on the Data tab next to the query tab. There we need to navigate to the “New Report” option
[Screenshot: Creating a Power BI report]

A new Power BI report window will appear. Here we can create a report based on the data we have loaded.

[Screenshot: Power BI dashboard report]

Conclusion

Integrating Snowflake with Microsoft Fabric involves creating and configuring workspaces, warehouses, and lakehouses, setting up dataflows, and loading data from Snowflake into Microsoft Fabric. This integration provides a robust platform for managing and analyzing data, combining Snowflake’s scalable data warehousing capabilities with Microsoft Fabric’s advanced analytics and data management features. This streamlined approach ensures efficient data handling and insightful analysis, empowering organizations to make informed business decisions.

Deeksha Gupta
About the Author
A BI Professional with 6+ years of experience in Power BI and Tableau, skilled at turning complex data into actionable insights. Passionate about storytelling through data to drive business decisions and build data-driven cultures.
Deeksha Gupta, BI Analyst, Data Value | USEReady
Jaikishan Pandey
About the Author
Business Intelligence professional with 2.5 years of experience in data analysis and visualization. Skilled in data visualization, data cloud solutions, and data migration. Adept at creating impactful visualizations that support informed decision-making and enhance productivity across diverse environments.
Jaikishan Pandey, BI Analyst, Data Value | USEReady